CN107003759B - Method for selecting text
- Publication number
- CN107003759B (application CN201580030742.1A)
- Authority
- CN
- China
- Prior art keywords
- touch
- joint
- text
- preset
- joint touch
- Prior art date
- Legal status
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
Abstract
A method of selecting text, the method comprising: when a joint touch gesture acting on the touch-sensitive surface is detected, if no application function associated with the joint touch gesture exists, the user interface displayed on the display is a text application interface, and the trajectory of the joint touch gesture matches a preset trajectory, a text selection area is displayed on the text application interface in response to the joint touch gesture. The method streamlines the flow of selecting text and reduces the operation steps required, thereby improving the user experience.
Description
Technical Field
Embodiments of the present invention relate to a method of selecting text, and more particularly, to a method of selecting text on a display having a touch-sensitive surface using a joint touch gesture.
Background
With the rapid popularization and development of touch-screen portable electronic devices, more and more people handle multimedia and text applications on portable electronic devices. For example, people can browse web pages, send and receive e-mails, and exchange instant messages on a portable electronic device. When users want to share text with a friend or copy it, they must first select the target text. The operation of selecting text is generally cumbersome. For example: a finger contacts the text area to be selected on the touch screen; after the finger maintains contact for a preset time interval, left and right marker posts pop up in the text area; the user can then adjust the text selection area by touching and dragging the marker posts. With this kind of operation, the user must perform many interaction steps with the touch screen, and the user experience leaves room for improvement.
Disclosure of Invention
In order to improve the user experience of the text selection operation in the prior art, embodiments of the present invention provide a technical solution for selecting text. The technical solution includes the following aspects:
in a first aspect, an embodiment of the present invention provides a method for selecting text, which is applied to a portable electronic device including a display having a touch-sensitive surface, and includes:
detecting a joint touch gesture acting on the touch-sensitive surface;
if no application function associated with the trajectory of the joint touch gesture exists, further identifying whether the user interface displayed on the display is a text application interface;
if the user interface displayed on the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying, in response to the joint touch gesture, a text selection area on the text application interface, wherein the text selection area is located between a first endpoint and a second endpoint;
the first endpoint is located at a first position in the text application interface;
the second endpoint is located at a second position in the text application interface.
In a first possible implementation manner of the first aspect, the method further includes: if a first application function associated with the trajectory of the joint touch gesture exists, executing the first application function.
With reference to the first aspect or the first possible implementation manner of the first aspect, in a second possible implementation manner, the joint touch gesture is composed of joint touch actions. When the grid capacitance value generated on the touch-sensitive surface by a touch action falls within a first preset capacitance range, the number of grids with non-zero capacitance values is smaller than a preset value, and the acceleration signal in the Z-axis direction is within a first preset acceleration range, the touch action is a joint touch action, and a gesture composed of such joint touch actions is the joint touch gesture.
In a second aspect, an embodiment of the present invention provides a portable electronic device, including:
a display having a touch-sensitive surface;
an acceleration sensor for acquiring acceleration in the Z-axis direction;
a memory to store instructions;
a processor that invokes instructions stored in the memory to implement:
when a joint touch gesture acting on the touch-sensitive surface is detected:
if no application function associated with the trajectory of the joint touch gesture exists, further identifying whether the user interface displayed on the display is a text application interface;
if the user interface displayed on the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying, in response to the joint touch gesture, a text selection area on the text application interface, wherein the text selection area is located between a first endpoint and a second endpoint;
the first endpoint is located at a first position in the text application interface;
the second endpoint is located at a second position in the text application interface.
In a first possible implementation manner of the second aspect, the instructions are further used to execute a first application function if a first application function associated with the trajectory of the joint touch gesture exists.
With reference to the second aspect or the first possible implementation manner of the second aspect, in a second possible implementation manner, the joint touch gesture is composed of joint touch actions. When the grid capacitance value generated on the touch-sensitive surface by a touch action falls within a first preset capacitance range, the number of grids with non-zero capacitance values is smaller than a preset value, and the acceleration signal in the Z-axis direction is within a first preset acceleration range, the touch action is a joint touch action, and a gesture composed of such joint touch actions is the joint touch gesture.
In a third aspect, an embodiment of the present invention provides an apparatus, including: a detection unit, a first judging unit, an identification unit, a second judging unit, and a text selection unit;
the detection unit is used for detecting joint touch gestures acting on the touch-sensitive surface;
the first judging unit is used for judging whether an application function associated with the trajectory of the joint touch gesture exists;
the identification unit is used for identifying whether the user interface displayed by the display is a text application interface;
the second judging unit is used for judging whether the trajectory of the joint touch gesture matches a preset trajectory;
the text selection unit is used for, when a joint touch gesture acting on the touch-sensitive surface is detected, if no application function associated with the trajectory of the joint touch gesture exists, the user interface displayed on the display is a text application interface, and the trajectory of the joint touch gesture matches the preset trajectory, displaying a text selection area on the text application interface in response to the joint touch gesture, wherein the text selection area is located between a first endpoint and a second endpoint;
the first endpoint is located at a first position in the text application interface;
the second endpoint is located at a second position in the text application interface.
In a first possible implementation manner of the third aspect, the apparatus further includes an execution unit, configured to execute a first application function if a first application function associated with the trajectory of the joint touch gesture exists.
With reference to the third aspect or the first possible implementation manner of the third aspect, in a second possible implementation manner, the joint touch gesture is composed of joint touch actions. When the grid capacitance value generated on the touch-sensitive surface by a touch action falls within a first preset capacitance range, the number of grids with non-zero capacitance values is smaller than a preset value, and the acceleration signal in the Z-axis direction is within a first preset acceleration range, the touch action is a joint touch action, and a gesture composed of such joint touch actions is the joint touch gesture.
In a fourth aspect, embodiments of the present invention provide a user interface on a portable electronic device, the portable electronic device including a display, a memory, and a processor for executing instructions stored in the memory, wherein the display has a touch-sensitive surface, the user interface comprising:
an interface for displaying a text application;
an interface in which, when a joint touch gesture acting on the touch-sensitive surface is detected, if no application function associated with the trajectory of the joint touch gesture exists, the user interface displayed on the display is a text application interface, and the trajectory of the joint touch gesture matches a preset trajectory, a text selection area is displayed on the text application interface in response to the joint touch gesture, wherein the text selection area is located between a first endpoint and a second endpoint;
the first endpoint is located at a first position in the text application interface;
the second endpoint is located at a second position in the text application interface.
In a fifth aspect, embodiments of the invention provide a non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by a portable electronic device comprising a display with a touch-sensitive surface, cause the portable electronic device to perform the following:
when a joint touch gesture acting on the touch-sensitive surface is detected:
if no application function associated with the trajectory of the joint touch gesture exists, further identifying whether the user interface displayed on the display is a text application interface;
if the user interface displayed on the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying, in response to the joint touch gesture, a text selection area on the text application interface, wherein the text selection area is located between a first endpoint and a second endpoint;
the first endpoint is located at a first position in the text application interface;
the second endpoint is located at a second position in the text application interface.
The technical solution of the embodiments of the present invention discloses that, when a joint touch gesture acting on a touch-sensitive surface is detected, if no application function associated with the joint touch gesture exists, the user interface displayed on the display is a text application interface, and the trajectory of the joint touch gesture matches a preset trajectory, a text selection area is displayed on the text application interface. The solution of the embodiments of the present invention simplifies the operation steps of selecting text, thereby improving the user experience.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description are obviously only some embodiments of the present invention, and those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is a schematic diagram of an internal structure of a portable electronic device 100 according to an embodiment of the present invention;
fig. 2 is a schematic external structural diagram of a portable electronic device 100 according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating a text selection area according to an embodiment of the present invention;
FIG. 4 is a flowchart of a method for selecting a text according to an embodiment of the present invention;
FIG. 5 is an exemplary user interface with a horizontal line (i.e., "-") trajectory for a joint touch gesture according to embodiments of the present invention;
FIG. 6 is an exemplary user interface of the text selection area resulting from the trajectory of the joint touch gesture shown in FIG. 5, according to an embodiment of the present invention;
FIG. 7 is an exemplary user interface with a vertical line (i.e., "|") trajectory for a joint touch gesture according to embodiments of the present invention;
FIG. 8 is an exemplary user interface of the text selection area resulting from the trajectory of the joint touch gesture shown in FIG. 7 in accordance with embodiments of the present invention;
FIG. 9 is an exemplary user interface with a diagonal trajectory (i.e., "/") for a joint touch gesture according to embodiments of the present invention;
FIG. 10 is an exemplary user interface of the text selection area resulting from the trajectory of the joint touch gesture shown in FIG. 9 according to an embodiment of the present invention;
FIG. 11 is an exemplary user interface with a diagonal trajectory (i.e., "\") for a joint touch gesture in accordance with an embodiment of the present invention;
FIG. 12 is an exemplary user interface of the text selection area resulting from the trajectory of the joint touch gesture shown in FIG. 11 in accordance with embodiments of the present invention;
FIG. 13 is an exemplary user interface for performing word processing functions on a text selection area provided by embodiments of the present invention;
FIG. 14 is a simplified schematic diagram of an internal structure of an electronic device with a touch-sensitive display unit provided in accordance with an embodiment of the invention;
fig. 15 is a functional structure diagram of an apparatus according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments of the present invention. The described embodiments are obviously only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by a person skilled in the art based on the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
For convenience of illustration, the portable multifunction device 100 including a touch screen is used as an example of the embodiments of the present invention, and those skilled in the art will appreciate that the embodiments of the present invention are also applicable to other devices, such as handheld devices, vehicle-mounted devices, wearable devices, computing devices, and various forms of User Equipment (UE), Mobile Stations (MS), terminals (Terminal), Terminal Equipment (Terminal Equipment), and the like.
The electronic device 100 may support a variety of applications, such as text applications (e-mail applications, blogging applications, web browsing applications, etc.); the touch-sensitive display unit of the electronic device 100 may visually present a user interface of the application, and a user may execute various applications through the touch-sensitive display unit of the electronic device 100.
Fig. 1 is a schematic diagram of an internal structure of a portable electronic device 100 according to an embodiment of the present invention. The electronic device 100 may comprise components such as a touch sensitive display unit 130, an acceleration sensor 151, a proximity light sensor 152, an ambient light sensor 153, a memory 120, a processor 190, a radio frequency unit 110, an audio circuit 160, a speaker 161, a microphone 162, a WiFi (wireless fidelity) module 170, a bluetooth module 180, a power supply 193, an external interface 197, etc.
Those skilled in the art will appreciate that fig. 1 is merely an example of a portable electronic device and is not intended to be limiting; the device may include more or fewer components than shown, some components may be combined, or a different arrangement of components may be used.
The touch-sensitive display unit 130 is sometimes referred to as a "touch screen" for convenience, and may also be called a touch-sensitive display system or a display having a touch-sensitive surface (touch-sensitive surface). The display with the touch-sensitive surface comprises a touch-sensitive surface and a display screen; it can display a screen interface and receive touch actions.
The touch-sensitive display unit 130 provides an input interface and an output interface between the device and the user. The touch-sensitive display unit 130 may collect touch operations by a user on or near the touch-sensitive display unit, such as user operations on or near the touch-sensitive display unit using a finger 202, joint, stylus, or any suitable object. The touch-sensitive display unit may detect a touch action on the touch-sensitive display unit, a grid capacitance value of the touch-sensitive display unit, and a contact coordinate, send the touch action, the grid capacitance value of the touch-sensitive display unit, and the contact coordinate information to the processor 190, and receive and execute a command sent by the processor 190. The touch-sensitive display unit 130 displays visual output. The visual output may include graphics, text, icons, video, and any combination thereof (collectively "graphics"). In some embodiments, some or all of the visual output may correspond to a user interface object.
The touch-sensitive display unit 130 may use LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies may be used in other embodiments. Touch-sensitive display unit 130 may detect contact and any movement or break thereof using any of a variety of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch-sensitive display unit 130. In an exemplary embodiment, projected mutual capacitance sensing technology is used.
The user may make contact with touch-sensitive display unit 130 using any suitable object or appendage, such as a stylus, a finger, a joint, and so forth. In some embodiments, the user interface is designed to work primarily with joint-based contacts and gestures. In some embodiments, the device translates the coarse joint-based input into a precise pointer/cursor position or command to perform the action desired by the user.
In some embodiments, device 100 may include a touch pad (not shown) for activating or deactivating particular functions in addition to the touch-sensitive display unit. In some embodiments, the trackpad is a touch-sensitive area of the device that, unlike a touch-sensitive display unit, does not display visual output. The trackpad may be a touch-sensitive surface separate from the touch-sensitive display unit 130 or an extension of the touch-sensitive surface formed by the touch-sensitive display unit.
The acceleration sensor 151 can detect the magnitude of acceleration in various directions (typically three axes). The acceleration sensor 151 may also be used to detect the magnitude and direction of gravity when the terminal is stationary, and may be used in applications that recognize the posture of a mobile phone (e.g., switching between landscape and portrait orientation, related games, magnetometer pose calibration), vibration-recognition related functions (e.g., pedometer, tapping), and the like. In the embodiment of the present invention, the acceleration sensor 151 is configured to obtain the acceleration in the Z-axis direction generated when the user's touch action contacts the touch-sensitive display unit.
The memory 120 may be used to store instructions and data. The memory 120 may mainly include an instruction storage area and a data storage area. The data storage area can store the associations between joint touch gestures and application functions, and can also store preset trajectory information. The instruction storage area may store an operating system, the instructions required for at least one function, and the like. The instructions may cause the processor 190 to perform the following method: when a joint touch gesture acting on the touch-sensitive surface is detected, if no application function associated with the trajectory of the joint touch gesture exists, further identify whether the user interface displayed on the display is a text application interface; and if the user interface displayed on the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, display a text selection area on the text application interface in response to the joint touch gesture. The text selection area is located between a first endpoint and a second endpoint; the first endpoint is located at a first position in the text application interface; the second endpoint is located at a second position in the text application interface. If an application function associated with the trajectory of the joint touch gesture exists, the application function is executed.
The processor 190 is the control center of the electronic device 100. It connects the various parts of the entire device using various interfaces and lines, and performs the various functions of the electronic device 100 and processes data by running or executing instructions stored in the memory 120 and calling data stored in the memory 120, thereby monitoring the device as a whole. Optionally, the processor 190 may include one or more processing units. Preferably, the processor 190 may integrate an application processor, which mainly handles the operating system, user interface, application programs, and the like, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor need not be integrated into the processor 190. In some embodiments, the processor and memory may be implemented on a single chip; in other embodiments, they may be implemented on separate chips. In the embodiment of the present invention, the processor 190 is further configured to invoke the instructions in the memory to implement the following: when a joint touch gesture acting on the touch-sensitive surface is detected, if no application function associated with the trajectory of the joint touch gesture exists, further identify whether the user interface displayed on the display is a text application interface; and if the user interface displayed on the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, display a text selection area on the text application interface in response to the joint touch gesture. The text selection area is located between a first endpoint and a second endpoint; the first endpoint is located at a first position in the text application interface; the second endpoint is located at a second position in the text application interface. If an application function associated with the trajectory of the joint touch gesture exists, the application function is executed.
The radio frequency unit 110 may be configured to send and receive information or to send and receive signals during a call; in particular, it receives downlink information from a base station and passes it to the processor 190 for processing, and it transmits uplink data to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the radio frequency unit 110 may also communicate with network devices and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and the like.
The audio circuit 160, speaker 161, and microphone 162 may provide an audio interface between the user and the electronic device 100. The audio circuit 160 may convert received audio data into an electrical signal and transmit it to the speaker 161, which converts it into a sound signal for output. Conversely, the microphone 162 converts collected sound signals into electrical signals, which the audio circuit 160 receives and converts into audio data; the audio data is then processed by the processor 190 and either transmitted to another terminal via the radio frequency unit 110 or output to the memory 120 for further processing.
WiFi is a short-range wireless transmission technology. Through the WiFi module 170, the electronic device 100 can help the user send and receive e-mails, browse web pages, access streaming media, and so on, providing wireless broadband Internet access. Although fig. 1 shows the WiFi module 170, it is not an essential component of the electronic device 100 and may be omitted as needed without changing the essence of the invention.
Bluetooth is a short-range wireless communication technology. Bluetooth technology can effectively simplify communication between mobile terminal devices such as palmtop computers, notebook computers, and mobile phones, as well as between these devices and the Internet. Through the bluetooth module 180, data transmission between the electronic device 100 and the Internet becomes faster and more efficient. Bluetooth technology is an open solution that enables wireless transmission of voice and data. Although fig. 1 shows the bluetooth module 180, it is not an essential component of the electronic device 100 and may be omitted as needed without changing the essence of the invention.
The electronic device 100 further includes a power source 193 (e.g., a battery) for supplying power to various components, which may preferably be logically connected to the processor 190 via a power management system 194, such that functions of managing charging, discharging, and power consumption are performed via the power management system 194.
The electronic device 100 further includes an external interface 197, which may be a standard Micro USB interface, or a multi-pin connector, and may be used to connect the electronic device 100 to communicate with other devices, or to connect a charger to charge the electronic device 100.
Although not shown, the electronic device 100 may further include a camera, a flash, and the like, which are not described in detail herein.
The method for selecting text is described below by taking the electronic device 100 as an example.
Fig. 2 is a schematic external structural diagram of the portable electronic device 100 according to the embodiment of the present invention. In the present embodiment, the electronic device 100 may include a touch-sensitive display unit 130, an acceleration sensor 151, a volume control key 132, a switch key 133, a microphone 162, a speaker 161, an external interface 197, and an earphone jack 163. The touch-sensitive display unit 130 may display one or more graphics 300 in the user interface 200 and receive touch input from the user; using the touch-sensitive display unit 130 as the primary input or control for operating the electronic device 100 can reduce the number of physical inputs or controls on the device. In this embodiment, a "menu button" may be implemented on the touch-sensitive display unit; in some other embodiments, the "menu button" may be a physical button or other physical input or control device. The acceleration sensor 151 is used for acquiring the acceleration in the Z-axis direction generated by the user's touch action on the touch-sensitive display unit. The power of the electronic device 100 can be turned on or off by pressing the switch key and holding it down for a predetermined time interval; the electronic device 100 can be locked by pressing the switch key and releasing it before the predetermined time interval elapses. In other embodiments, voice input for activating some functions may also be received through the microphone 162.
Fig. 3 is a schematic diagram illustrating a text selection area according to an embodiment of the present invention. The text selection area 301 may be represented on the touch-sensitive display unit 130 as the text bounded by a first endpoint 302a and a second endpoint 302b, located between the two endpoints. Those skilled in the art will appreciate that the text selection area 301 may contain any portion of the text shown in FIG. 3; the text selected in FIG. 3 is merely one example. Further, the first endpoint 302a may be associated with a first marker post 303a, and the second endpoint 302b with a second marker post 303b. The first marker post 303a and the second marker post 303b indicate the locations of the first endpoint 302a and the second endpoint 302b, respectively. Because a marker post is easier to manipulate than an endpoint, when a user wishes to move the first endpoint 302a or the second endpoint 302b (or both) to a new location, this can be accomplished by dragging the marker post associated with the endpoint to the new location. In some embodiments, the marker posts 303a and 303b may have other shapes, sizes, and colors; this embodiment is merely an example.
Fig. 4 is a flowchart of a method for selecting a text according to an embodiment of the present invention.
Method 400 may be performed on a portable electronic device (e.g., electronic device 100 of fig. 1 or 2) having a touch-sensitive display unit and a plurality of applications, including a text application. In some embodiments, some operations in method 400 may be combined, and/or the order of some operations may be changed.
As described below, the method 400 provides a more efficient way to quickly select text. The method lets the user select text with fewer operation steps, simplifying the interaction and thereby improving the user experience.
The portable electronic device detects a joint touch gesture acting on the touch-sensitive surface (401).
For example: the step 401 can specifically include steps 4011 to 4013.
detecting a touch action acting on the touch-sensitive surface (4011); determining whether the touch action is a joint touch action (4012); and detecting a joint touch gesture composed of joint touch actions (4013).
Upon detecting a joint touch gesture acting on the touch-sensitive surface, it is determined whether an application function associated with the trajectory of the joint touch gesture exists (402).
When it is determined that there is an application function associated with the trajectory of the joint touch gesture, the application function is executed (403).
When it is determined that there is no application function associated with the trajectory of the joint touch gesture, identifying whether a user interface displayed by the display is a text application interface (404).
Illustratively, the text application interface may be a web browsing, e-mail, notepad, instant messaging, or blog application interface, among others.
If the user interface displayed on the display is a text application interface, it is determined whether the trajectory of the joint touch gesture matches a preset trajectory (405).
If the trajectory of the joint touch gesture matches the preset trajectory, a text selection area is displayed on the text application interface in response to the joint touch gesture (406).
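Illustratively, the overall flow of steps 401 to 406 can be sketched in code as follows. This is a minimal sketch for illustration only: the gesture object and the device helper functions are assumed names standing in for the detection, lookup, and display logic described below, and are not part of the embodiment itself.

```python
# A condensed sketch of method 400. The device helpers are hypothetical
# stand-ins for the detection, lookup, and display logic described below.

def handle_joint_gesture(gesture, device):
    trajectory = gesture.trajectory                       # 401: gesture detected
    func = device.lookup_associated_function(trajectory)  # 402: association lookup
    if func is not None:
        func()                                            # 403: run the function
    elif device.current_interface_is_text():              # 404: text interface?
        if device.matches_preset_trajectory(trajectory):  # 405: trajectory match?
            device.show_text_selection(trajectory)        # 406: display selection
```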
When the touch-sensitive display unit receives a touch action acting on the touch-sensitive surface, it passes the touch information to the processor. The touch information may include one or more of the touch point coordinates, the grid capacitance values of the touch-sensitive display unit, and the touch action. Touch actions may include press, move, and lift actions.
In some embodiments, whether the touch action is a joint touch action may be determined based on the grid capacitance information and the Z-axis direction acceleration signal generated by the touch action (4012). The grid capacitance information of the touch-sensitive surface includes the grid capacitance values and the number of grids with non-zero capacitance values.
When the grid capacitance value falls within a first preset capacitance range, the number of grids with non-zero capacitance values is smaller than a preset value, and the Z-axis direction acceleration signal is within a first preset acceleration range, the touch action can be judged to be a joint touch action. When the grid capacitance value falls within a second preset capacitance range, the number of grids with non-zero capacitance values is greater than or equal to the preset value, and the Z-axis direction acceleration signal is within a second preset acceleration range, the touch action can be judged to be a finger touch action.
For example, when the grid capacitance values of the touch-sensitive display unit indicate that the maximum capacitance value falls within the first preset capacitance range (for example, less than or equal to 0.42 pF), the number of grids with non-zero capacitance values is less than 7, and the Z-axis direction acceleration signal is within the first preset acceleration range within a preset time (for example, greater than 3g within 5 ms, where g is the gravitational acceleration), the touch action may be determined to be a joint touch action. When the maximum capacitance value falls within the second preset capacitance range (for example, greater than 0.42 pF and less than or equal to 0.46 pF), the number of grids with non-zero capacitance values is greater than or equal to 7, and the Z-axis direction acceleration signal is within the second preset acceleration range within the preset time (for example, less than 2g within 5 ms), the touch action may be determined to be a finger touch action. It should be understood that the joint touch action in the embodiments of the present invention is not necessarily triggered by a finger joint; another object striking the touch-sensitive display unit 130 at high speed may also be regarded as a joint touch action, as long as the determination conditions for a joint touch action are satisfied.
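Illustratively, the joint/finger determination described above can be sketched as follows. The threshold constants are the example values given in this embodiment (0.42 pF, 0.46 pF, 7 grids, 3g and 2g within 5 ms); the function name and data layout are assumptions for illustration.

```python
# A minimal sketch of the joint-vs-finger classification described above.
# The thresholds come from the example values in this embodiment; the
# function name and input representation are illustrative assumptions.

PF_JOINT_MAX = 0.42      # first preset capacitance range: max value <= 0.42 pF
PF_FINGER_MAX = 0.46     # second preset range: 0.42 pF < max value <= 0.46 pF
GRID_COUNT_LIMIT = 7     # preset value for the number of non-zero grids
JOINT_ACCEL_MIN_G = 3.0  # first preset acceleration range: > 3g within 5 ms
FINGER_ACCEL_MAX_G = 2.0 # second preset acceleration range: < 2g within 5 ms

def classify_touch(grid_capacitance_pf, z_accel_peak_g):
    """Classify a touch action from a non-empty frame of grid capacitance
    values (pF) and the peak Z-axis acceleration (in g) observed within
    the preset 5 ms window."""
    max_cap = max(grid_capacitance_pf)
    nonzero_grids = sum(1 for c in grid_capacitance_pf if c > 0)

    if (max_cap <= PF_JOINT_MAX
            and nonzero_grids < GRID_COUNT_LIMIT
            and z_accel_peak_g > JOINT_ACCEL_MIN_G):
        return "joint"   # small, stiff contact area plus a sharp knock
    if (PF_JOINT_MAX < max_cap <= PF_FINGER_MAX
            and nonzero_grids >= GRID_COUNT_LIMIT
            and z_accel_peak_g < FINGER_ACCEL_MAX_G):
        return "finger"  # larger, softer contact with a gentle landing
    return "unknown"
```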
A touch gesture may be composed of touch actions. For example, a tap gesture consists of two touch actions, press and lift; a slide gesture consists of three touch actions: press, move, and lift. When the touch action is determined to be a joint touch action, a joint touch gesture composed of joint touch actions can be detected (4013). For example: a joint tap gesture consists of the press and lift joint touch actions; a joint slide gesture consists of the press, move, and lift joint touch actions.
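Illustratively, the assembly of press, move, and lift joint touch actions into a joint tap or joint slide gesture can be sketched as follows; the event names and class structure are assumptions for illustration.

```python
# A hypothetical sketch of assembling joint touch actions into a gesture,
# following the press/move/lift decomposition above. Event names are assumed.

class JointGestureDetector:
    def __init__(self):
        self.points = []          # trajectory sampled between press and lift

    def on_joint_action(self, action, x, y):
        """Feed touch actions already classified as joint touches.
        Returns a (gesture_name, trajectory) pair on lift, else None."""
        if action == "press":
            self.points = [(x, y)]
        elif action == "move" and self.points:
            self.points.append((x, y))
        elif action == "lift" and self.points:
            trajectory, self.points = self.points, []
            # press + lift only -> joint tap; press + move(s) + lift -> joint slide
            return ("joint_tap", trajectory) if len(trajectory) == 1 \
                else ("joint_slide", trajectory)
        return None
```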
In some embodiments, whether an application function associated with the trajectory of the joint touch gesture exists is determined by looking up the associations, stored in the memory 120, between gesture trajectories and application functions (402). A joint touch gesture may be composed of different joint touch actions; for example, a joint tap gesture consists of the press and lift joint touch actions, and a joint slide gesture consists of the press, move, and lift joint touch actions. The movement trajectory between the press and the lift is the trajectory of the joint touch gesture. The user may preset associations between the trajectories of various joint touch gestures and application functions and store these associations in the memory 120.
For example, the user may associate the trajectory "C" of a joint touch gesture with the camera application function in advance and save this association in the memory 120. When a joint touch gesture with a "C" trajectory acting on the touch-sensitive surface is detected, the trajectory "C" can be determined to be associated with the camera application function by looking up the stored associations.
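Illustratively, the association lookup of step 402 can be sketched as a simple table keyed by the recognized trajectory shape. Only the trajectory "C" to camera association comes from the example above; the callback and function names are placeholders.

```python
# An illustrative registry for step 402. Only the "C" -> camera association
# comes from the example above; launch_camera is a hypothetical callback.

def launch_camera():
    print("launching camera application")  # placeholder for the real function

gesture_actions = {"C": launch_camera}     # user-preset associations (memory 120)

def dispatch_by_trajectory(shape):
    """Run the associated application function, if any (steps 402-403).
    Returns False when no association exists, so the caller can fall
    through to the text-selection branch (steps 404-406)."""
    action = gesture_actions.get(shape)
    if action is None:
        return False
    action()
    return True
```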
In some embodiments, when a pressing joint touch action is detected in region A of the touch-sensitive display unit, the contact then moves across the touch-sensitive display unit to region B, where a lifting joint touch action is detected. The joint touch event of pressing in region A, moving to region B, and lifting is a joint touch gesture. For example, the position of the touch action (e.g., region A or region B) can be determined from the contact coordinate information. Region A is the starting contact region of the joint touch gesture with the touch-sensitive display unit, and region B is the ending contact region. The movement trajectory from region A to region B is the trajectory of the gesture. The electronic device compares the detected trajectory of the joint touch gesture with a preset trajectory and judges whether they match (405). The preset trajectory information may be preset in the electronic device at the factory or preset by the user, and may be stored in a memory (e.g., the memory 120 in fig. 1).
Illustratively, taking a straight line as the preset trajectory, the straight line may be a horizontal line, a vertical line, or a diagonal line (e.g., "-", "|", "/", or "\"). It should be noted that the preset trajectory may take other forms and may be adapted to specific design requirements; the straight line used in the embodiments of the present invention is only an example and does not limit the solution of the present invention.
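Illustratively, the matching of step 405 against a straight-line preset trajectory can be sketched as follows. The check verifies that every sampled point stays close to the chord from region A to region B; the pixel tolerance is an assumption, not a value from this embodiment.

```python
# A hypothetical straight-line matcher for step 405: it checks that the
# sampled trajectory stays close to the segment joining its endpoints.
# The tolerance value is an assumption, not taken from this embodiment.

import math

def matches_straight_line(points, tolerance=20.0):
    """Return True if the trajectory approximates a straight line
    (horizontal, vertical, or diagonal) within the given pixel tolerance."""
    if len(points) < 2:
        return False
    (x0, y0), (x1, y1) = points[0], points[-1]
    length = math.hypot(x1 - x0, y1 - y0)
    if length == 0:
        return False
    # maximum perpendicular distance from any sample to the chord A -> B
    deviation = max(
        abs((x - x0) * (y1 - y0) - (y - y0) * (x1 - x0)) / length
        for (x, y) in points
    )
    return deviation <= tolerance
```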
FIG. 5 is an exemplary user interface with a horizontal line (e.g., "-") traced by a joint touch gesture according to an embodiment of the present invention. For example: the joint touch position is first detected in region A (the starting contact region) and is then detected to move to region B (the ending contact region); the trajectory of the joint touch gesture is the movement from region A to region B, such as the horizontal line trajectory shown by the dotted line, with the arrow indicating the direction of movement from region A to region B. The electronic device compares the detected horizontal line trajectory with the preset straight-line trajectory and judges that the horizontal line trajectory matches the preset trajectory.
In some embodiments, if the trajectory of the joint touch gesture is judged to match the preset trajectory, a text selection area is displayed on the text application interface in response to the joint touch gesture (406). The text selection area is located between a first endpoint and a second endpoint; the first endpoint is located at a first position in the text application interface, and the second endpoint is located at a second position in the text application interface. For example: after the trajectory of the joint touch gesture is judged to match the preset trajectory, a first endpoint is inserted at the position of the starting contact region A of the joint touch gesture with the touch-sensitive display unit, and a second endpoint is inserted at the position of the ending contact region B. The insertion position of the first endpoint is the first position, which may be the beginning or end of the character or word, in the text application interface displayed on the display, closest to the centroid of region A; the insertion position of the second endpoint is the second position, which may be the beginning or end of the character or word closest to the centroid of region B. The text region located between the first endpoint and the second endpoint in the text application interface is the text selection area.
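Illustratively, the endpoint insertion of step 406 can be sketched as snapping each contact-region centroid to the nearest word boundary. The word-boundary representation below is a simplified assumption; a real text view would supply the laid-out boundary positions.

```python
# A sketch of inserting the selection endpoints (step 406): each endpoint
# snaps to the word boundary nearest the centroid of its contact region.
# The word-boundary model below is a simplified assumption for illustration.

def nearest_word_boundary(boundaries, point):
    """boundaries: list of (x, y) positions of word/character starts and
    ends already laid out by the text view; returns the one closest to point."""
    px, py = point
    return min(boundaries, key=lambda b: (b[0] - px) ** 2 + (b[1] - py) ** 2)

def make_selection(boundaries, centroid_a, centroid_b):
    first_endpoint = nearest_word_boundary(boundaries, centroid_a)
    second_endpoint = nearest_word_boundary(boundaries, centroid_b)
    return first_endpoint, second_endpoint  # text between them is selected
```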
FIG. 6 is an exemplary user interface of the text selection area resulting from the trajectory of the joint touch gesture shown in FIG. 5 according to an embodiment of the present invention. The insertion position of the first endpoint 302a is the beginning or end of the character or word closest to the centroid of the starting contact region A of the joint touch gesture in the example of FIG. 5; the insertion position of the second endpoint 302b is the beginning or end of the character or word closest to the centroid of the ending contact region B in the example of FIG. 5; the text selection area 301 is located between the first endpoint 302a and the second endpoint 302b.
FIG. 7 is an exemplary user interface with a vertical line (e.g., "|") trajectory for a joint touch gesture according to embodiments of the present invention. For example: the joint touch position is first detected in region A (the starting contact region) and is then detected to move to region B (the ending contact region); the trajectory of the joint touch gesture is the movement from region A to region B, such as the vertical line trajectory shown by the dotted line, with the arrow indicating the direction of movement from region A to region B. The electronic device compares the detected vertical line trajectory with the preset straight-line trajectory and judges that the vertical line trajectory matches the preset trajectory.
FIG. 8 is an exemplary user interface of the text selection area resulting from the trajectory of the joint touch gesture shown in FIG. 7 according to embodiments of the present invention. The insertion position of the first endpoint 302a is the beginning or end of the character or word closest to the centroid of the starting contact region A of the joint touch gesture in the example of FIG. 7; the insertion position of the second endpoint 302b is the beginning or end of the character or word closest to the centroid of the ending contact region B in the example of FIG. 7; the text selection area 301 is located between the first endpoint 302a and the second endpoint 302b.
FIG. 9 is an exemplary user interface with a diagonal line (e.g., "/") trajectory for a joint touch gesture according to embodiments of the present invention. For example: the joint touch position is first detected in region A (the starting contact region) and is then detected to move to region B (the ending contact region); the trajectory of the joint touch gesture is the movement from region A to region B, such as the diagonal line trajectory shown by the dotted line, with the arrow indicating the direction of movement from region A to region B. The electronic device compares the detected diagonal line trajectory with the preset straight-line trajectory and judges that the diagonal line trajectory matches the preset trajectory.
FIG. 10 is an exemplary user interface of the text selection area resulting from the trajectory of the joint touch gesture shown in FIG. 9 according to embodiments of the present invention. The insertion position of the first endpoint 302a is the beginning or end of the character or word closest to the centroid of the starting contact region A of the joint touch gesture in the example of FIG. 9; the insertion position of the second endpoint 302b is the beginning or end of the character or word closest to the centroid of the ending contact region B in the example of FIG. 9; the text selection area 301 is located between the first endpoint 302a and the second endpoint 302b.
FIG. 11 is an exemplary user interface with a diagonal line (e.g., "\") trajectory for a joint touch gesture according to an embodiment of the present invention. For example: the joint touch position is first detected in region A (the starting contact region) and is then detected to move to region B (the ending contact region); the trajectory of the joint touch gesture is the movement from region A to region B, such as the diagonal line trajectory shown by the dotted line, with the arrow indicating the direction of movement from region A to region B. The electronic device compares the detected diagonal line trajectory with the preset straight-line trajectory and judges that the diagonal line trajectory matches the preset trajectory.
FIG. 12 is an exemplary user interface of the text selection area resulting from the trajectory of the joint touch gesture shown in FIG. 11 according to embodiments of the present invention. The insertion position of the first endpoint 302a is the beginning or end of the character or word closest to the centroid of the starting contact region A of the joint touch gesture in the example of FIG. 11; the insertion position of the second endpoint 302b is the beginning or end of the character or word closest to the centroid of the ending contact region B in the example of FIG. 11; the text selection area 301 is located between the first endpoint 302a and the second endpoint 302b.
Optionally, in each of the above embodiments, a word processing function may further be performed on the text selection area.
Fig. 13 is an exemplary user interface for performing word processing functions on a text selection area provided by an embodiment of the present invention. The word processing functions may include copy, cut, paste, translate, and the like. Other word processing functions may also be invoked by selecting "more", such as underlining the selected text, bolding the selected text, or changing the font, font size, or font color of the selected text. The arrangement order and presentation form of the word processing functions illustrated in fig. 13 can be adjusted as appropriate according to design requirements.
The technical solution of the embodiments of the present invention discloses that, when a joint touch gesture acting on a touch-sensitive surface is detected, if no application function associated with the joint touch gesture exists, the user interface displayed on the display is a text application interface, and the trajectory of the joint touch gesture matches a preset trajectory, a text selection area is displayed on the text application interface. The solution of the embodiments of the present invention simplifies the operation steps of selecting text, thereby improving the user experience.
Fig. 14 is a simplified schematic diagram of an internal structure of an electronic device with a touch-sensitive display unit according to an embodiment of the present invention.
The functional blocks of the electronic device can be implemented by hardware, software, or a combination of hardware and software to carry out the principles of the present invention. Those skilled in the art will appreciate that the functional blocks described in fig. 14 can be combined or separated into sub-functional blocks to implement the principles of the present invention as described above. Thus, the description herein may support any possible combination or separation or further definition of the functional modules described herein.
Based on the same inventive concept, and because the principle by which the electronic device and the apparatus solve the problem is similar to that of the method for selecting text in the embodiments of the present invention, the implementation of the electronic device and the apparatus may refer to the implementation of the method; repeated details are not described again.
As shown in fig. 14, the electronic device 1400 includes: touch-sensitive display unit 130, acceleration sensor 151, memory 120, processor 190.
The touch-sensitive display unit 130 may be a display having a touch-sensitive surface; it includes a touch-sensitive surface and a display screen. The touch-sensitive display unit 130 is used for displaying a screen interface, receiving touch actions acting on the touch-sensitive surface, and passing the touch information to the processor 190. The touch information may include one or more of the touch point coordinates, the grid capacitance values of the touch-sensitive display unit, and the touch action; touch actions may include press, move, and lift actions.
The acceleration sensor 151 is configured to detect an acceleration signal in the Z-axis direction and transmit the detected acceleration signal in the Z-axis direction to the processor 190.
The memory 120 stores instructions.
The processor 190 is coupled to the touch-sensitive display unit 130, the acceleration sensor 151 and the memory 120.
The processor 190 invokes the instructions stored in the memory 120 to implement the following: when a joint touch gesture acting on the touch-sensitive surface is detected, if no application function associated with the trajectory of the joint touch gesture exists, further identify whether the user interface displayed on the display is a text application interface.
If the user interface displayed on the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, a text selection area is displayed on the text application interface in response to the joint touch gesture. The text selection area is located between a first endpoint and a second endpoint;
the first endpoint is located at a first position in the text application interface;
the second endpoint is located at a second position in the text application interface.
Optionally, if an application function associated with the trajectory of the joint touch gesture exists, the application function is executed.
When the touch-sensitive display unit receives a touch action acting on the touch-sensitive surface, it passes the touch information to the processor. The touch information may include one or more of the touch point coordinates, the grid capacitance values of the touch-sensitive display unit, and the touch action. Touch actions may include press, move, and lift actions.
In some embodiments, whether the touch action is a joint touch action may be determined based on the grid capacitance information and the Z-axis direction acceleration signal generated by the touch action; the grid capacitance information of the touch-sensitive surface includes the grid capacitance values and the number of grids with non-zero capacitance values.
When the grid capacitance value falls within a first preset capacitance range, the number of grids with non-zero capacitance values is smaller than a preset value, and the Z-axis direction acceleration signal is within a first preset acceleration range, the touch action can be judged to be a joint touch action. When the grid capacitance value falls within a second preset capacitance range, the number of grids with non-zero capacitance values is greater than or equal to the preset value, and the Z-axis direction acceleration signal is within a second preset acceleration range, the touch action can be judged to be a finger touch action.
For example, when the grid capacitance values of the touch-sensitive display unit indicate that the maximum capacitance value falls within the first preset capacitance range (for example, less than or equal to 0.42 pF), the number of grids with non-zero capacitance values is less than 7, and the Z-axis direction acceleration signal is within the first preset acceleration range within a preset time (for example, greater than 3g within 5 ms, where g is the gravitational acceleration), the touch action may be determined to be a joint touch action. When the maximum capacitance value falls within the second preset capacitance range (for example, greater than 0.42 pF and less than or equal to 0.46 pF), the number of grids with non-zero capacitance values is greater than or equal to 7, and the Z-axis direction acceleration signal is within the second preset acceleration range within the preset time (for example, less than 2g within 5 ms), the touch action may be determined to be a finger touch action. It should be understood that the joint touch action in the embodiments of the present invention is not necessarily triggered by a finger joint; another object striking the touch-sensitive display unit 130 at high speed may also be regarded as a joint touch action, as long as the determination conditions for a joint touch action are satisfied.
A touch gesture consists of touch actions. For example, a tap gesture consists of two touch actions, press and lift; a slide gesture consists of three touch actions: press, move, and lift. After a touch action has been determined to be a joint touch action, a joint touch gesture composed of joint touch actions can be detected. For example, a joint tap gesture consists of two joint touch actions, press and lift; a joint slide gesture consists of three joint touch actions: press, move, and lift.
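As a sketch of how a gesture might be assembled from these primitive actions, the following Python fragment folds a sequence of joint touch actions into a gesture name and its trajectory. The event representation, a list of (kind, x, y) tuples, is an assumption made for this example only.

```python
def assemble_gesture(actions):
    """Fold joint touch actions into a gesture name and its trajectory.

    `actions` is a list of (kind, x, y) tuples, where kind is
    'press', 'move', or 'lift'.
    """
    kinds = [kind for kind, _, _ in actions]
    trajectory = [(x, y) for _, x, y in actions]
    if kinds == ["press", "lift"]:
        return "joint_tap", trajectory
    if (len(kinds) >= 3 and kinds[0] == "press" and kinds[-1] == "lift"
            and all(k == "move" for k in kinds[1:-1])):
        # The move actions between press and lift form the gesture trajectory.
        return "joint_slide", trajectory
    return "unknown", trajectory
```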
In some embodiments, whether an application function associated with the trajectory of the joint touch gesture exists is determined by looking up the associations between gesture trajectories and application functions stored in the memory 120. A joint touch gesture may be composed of different joint touch actions: a joint tap gesture consists of the two joint touch actions of pressing and lifting, while a joint slide gesture consists of the three joint touch actions of pressing, moving, and lifting, and the movement between the press and the lift forms the trajectory of the gesture. The user may preset associations between the trajectories of various joint touch gestures and application functions, and store those associations in the memory 120.
For example, the user may associate the trajectory "C" of a joint touch gesture with the camera application function and save that association in the memory 120. When a joint touch gesture whose trajectory is "C" is detected on the touch-sensitive surface, the associations stored in the memory 120 can be looked up to determine that the trajectory "C" is associated with the camera application function.
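A minimal sketch of such an association table and its lookup is shown below; `launch_camera` and the other names are hypothetical stand-ins for whatever functions the user has bound, not an API defined by the embodiment.

```python
def launch_camera():
    print("camera launched")  # placeholder for the real camera application entry point

# Trajectory shape -> application function, as preset by the user and held in memory 120.
GESTURE_ASSOCIATIONS = {"C": launch_camera}

def dispatch_joint_gesture(shape: str) -> bool:
    """Execute the application function associated with the trajectory shape.

    Returns False when no association exists, so the caller can fall through
    to the text selection branch.
    """
    func = GESTURE_ASSOCIATIONS.get(shape)
    if func is None:
        return False
    func()
    return True
```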
In some embodiments, a press joint touch action is detected in region A of the touch-sensitive display unit, the contact moves across the touch-sensitive display unit to region B, and a lift joint touch action is detected in region B. The joint touch event of pressing in region A and lifting after moving to region B is a joint touch gesture. The position of a touch action (e.g., region A or region B) can be determined from the contact coordinate information. Region A is the starting contact area of the joint touch gesture with the touch-sensitive display unit, region B is the ending contact area, and the path of movement from region A to region B is the trajectory of the gesture. The electronic device compares the detected trajectory of the joint touch gesture with a preset trajectory to judge whether they match. The preset trajectory may be set when the electronic device leaves the factory or may be preset by the user, and the preset trajectory information may be stored in a memory (e.g., memory 120 in Fig. 1).
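The embodiment does not prescribe a particular matching algorithm. One plausible sketch, assuming trajectories are point sequences in screen coordinates, resamples both paths to a fixed number of points and compares them point by point; the tolerance value is an arbitrary assumption, and a production recognizer would typically also normalize for position and scale.

```python
import math

def _resample(points, n=32):
    """Resample a polyline to n points evenly spaced along its arc length."""
    if len(points) == 1:
        return points * n
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1] or 1.0
    out, j = [], 0
    for i in range(n):
        target = total * i / (n - 1)
        while j < len(dists) - 2 and dists[j + 1] < target:
            j += 1  # advance to the segment containing the target arc length
        seg = dists[j + 1] - dists[j] or 1.0
        t = (target - dists[j]) / seg
        x0, y0 = points[j]
        x1, y1 = points[j + 1]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

def matches_preset(trajectory, preset, tolerance=40.0):
    """True if the mean pointwise distance between the resampled paths
    is within tolerance (in pixels)."""
    a, b = _resample(trajectory), _resample(preset)
    mean_dist = sum(math.hypot(ax - bx, ay - by)
                    for (ax, ay), (bx, by) in zip(a, b)) / len(a)
    return mean_dist <= tolerance
```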
Displaying the text selection area on the text application interface specifically means that, after the trajectory of the joint touch gesture is judged to match the preset trajectory, the text selection area is displayed on the text application interface. The text selection area is located between a first endpoint and a second endpoint; the first endpoint is located at a first position in the text application interface, and the second endpoint is located at a second position. For example, after the trajectory is judged to match the preset trajectory, the first endpoint is inserted at the position of the starting contact region A of the gesture on the touch-sensitive display unit, and the second endpoint is inserted at the position of the ending contact region B. The insertion position of the first endpoint is the first position, which may be the beginning or end of the character or word in the displayed text application interface that is closest to the centroid of region A. The insertion position of the second endpoint is the second position, which may be the beginning or end of the character or word that is closest to the centroid of region B. The text region between the first endpoint and the second endpoint in the text application interface is the text selection area.
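Endpoint placement can be pictured as mapping each contact centroid to the nearest character or word boundary reported by the text layout. In the sketch below, the `boundaries` structure is a hypothetical stand-in for whatever the layout engine exposes; it is not an interface defined by the embodiment.

```python
def nearest_boundary(boundaries, point):
    """Return the text index of the boundary closest to `point`.

    `boundaries` is a list of ((x, y), text_index) pairs, one per character
    or word boundary as laid out on screen.
    """
    px, py = point
    return min(boundaries,
               key=lambda b: (b[0][0] - px) ** 2 + (b[0][1] - py) ** 2)[1]

def select_text(text, boundaries, centroid_a, centroid_b):
    """Return the text between the endpoints nearest regions A and B."""
    first = nearest_boundary(boundaries, centroid_a)   # first endpoint, region A
    second = nearest_boundary(boundaries, centroid_b)  # second endpoint, region B
    start, end = sorted((first, second))
    return text[start:end]
```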
Fig. 15 is a functional structure diagram of an apparatus according to an embodiment of the present invention.
In some embodiments, the apparatus includes a detection unit 1501, a first judgment unit 1502, a recognition unit 1503, a second judgment unit 1504, and a text selection unit 1505.
The detection unit 1501 is configured to detect a joint touch gesture acting on the touch-sensitive surface.
The first judgment unit 1502 is configured to judge whether an application function associated with the trajectory of the joint touch gesture exists.
The recognition unit 1503 is configured to recognize whether the user interface displayed by the display is a text application interface.
The second judgment unit 1504 is configured to judge whether the trajectory of the joint touch gesture matches a preset trajectory.
The text selection unit 1505 is configured to, when a joint touch gesture acting on the touch-sensitive surface is detected, display a text selection area on the text application interface in response to the joint touch gesture if no application function associated with the trajectory of the joint touch gesture exists, the user interface displayed by the display is a text application interface, and the trajectory of the joint touch gesture matches a preset trajectory. The text selection area is located between the first endpoint and the second endpoint; the first endpoint is located at a first position in the text application interface, and the second endpoint is located at a second position in the text application interface.
Optionally, the apparatus further includes an execution unit 1506.
The execution unit 1506 is configured to execute the application function if an application function associated with the trajectory of the joint touch gesture exists.
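Taken together, units 1501 to 1506 implement the dispatch flow described above. The following schematic sketch shows one way the units might be wired; the class, method, and attribute names are assumptions for illustration, not the claimed structure.

```python
class TextSelectionApparatus:
    """Schematic wiring of units 1501-1506; each collaborator stands in for one unit."""

    def __init__(self, detector, associations, display, matcher, selector):
        self.detector = detector          # detection unit 1501
        self.associations = associations  # table consulted by the first judgment unit 1502
        self.display = display            # queried by the recognition unit 1503
        self.matcher = matcher            # second judgment unit 1504
        self.selector = selector          # text selection unit 1505

    def on_touch(self, event):
        gesture = self.detector.detect_joint_gesture(event)
        if gesture is None:
            return  # not a joint touch gesture; ordinary finger handling applies
        func = self.associations.get(gesture.shape)          # first judgment unit 1502
        if func is not None:
            func()                                           # execution unit 1506
            return
        if (self.display.is_text_application()               # recognition unit 1503
                and self.matcher.matches_preset(gesture.trajectory)):  # second judgment unit 1504
            self.selector.show_selection(gesture.trajectory)  # text selection unit 1505
```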
Based on the same inventive concept, and because the electronic device and the apparatus solve the problem on a principle similar to that of the method for selecting text in the embodiments of the present invention, the implementations of the electronic device and the apparatus can refer to the implementation of the method; repeated details are not described again.
The technical solution of the embodiments of the present invention discloses that, when a joint touch gesture acting on a touch-sensitive surface is detected, if no application function associated with the joint touch gesture exists, the user interface displayed by the display is a text application interface, and the trajectory of the joint touch gesture matches a preset trajectory, a text selection area is displayed on the text application interface. The solution of the embodiments of the present invention simplifies the steps of selecting text and thereby improves the user experience.
For the embodiments provided in the present application, those skilled in the art will understand that the described steps are only illustrative, and that all or part of the steps for implementing the above embodiments can be implemented by hardware, or by program instructions together with related hardware. When implemented by program instructions, the program can be stored in a non-volatile (non-transitory) computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or any other medium capable of storing program code.
The above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced, and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.
Claims (5)
1. A method of selecting text for use on a portable electronic device having a display with a touch-sensitive surface, the method comprising:
when a touch-sensitive surface grid capacitance value generated by a touch action on the touch-sensitive surface meets a first preset capacitance value range, the number of grids with non-zero capacitance values is smaller than a preset value, and a Z-axis direction acceleration signal is within a first preset acceleration range, determining that the touch action is a joint touch action, wherein a gesture formed by joint touch actions is a joint touch gesture, the touch-sensitive surface grid capacitance value generated by a joint touch action is smaller than that generated by a finger touch action, the number of grids with non-zero capacitance values generated by a joint touch action is smaller than that generated by a finger touch action, and the Z-axis direction acceleration generated by a joint touch action is greater than that generated by a finger touch action;
determining whether an application function associated with a trajectory of the joint touch gesture exists;
if an application function associated with the trajectory of the joint touch gesture exists, executing the application function;
if no application function associated with the trajectory of the joint touch gesture exists, further identifying whether the user interface displayed by the display is a text application interface;
if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying, in response to the joint touch gesture, a text selection area on the text application interface, wherein the text selection area is located between a first endpoint and a second endpoint, and the preset trajectory is preset when the electronic device leaves the factory or is preset by the user;
the first endpoint is located at a first position in the text application interface, the first position being the beginning or end of the character or word in the text application interface that is closest to the starting position of the trajectory of the joint touch gesture; and
the second endpoint is located at a second position in the text application interface, the second position being the beginning or end of the character or word in the text application interface that is closest to the ending position of the trajectory of the joint touch gesture.
2. A portable electronic device, characterized in that the portable electronic device comprises:
a display having a touch-sensitive surface;
an acceleration sensor for acquiring the acceleration in the Z-axis direction;
a memory to store instructions;
a processor that invokes instructions stored in the memory to implement:
when a touch-sensitive surface grid capacitance value generated by a touch action on the touch-sensitive surface meets a first preset capacitance value range, the number of grids with non-zero capacitance values is smaller than a preset value, and a Z-axis direction acceleration signal is within a first preset acceleration range, determining that the touch action is a joint touch action, wherein a gesture formed by joint touch actions is a joint touch gesture, the touch-sensitive surface grid capacitance value generated by a joint touch action is smaller than that generated by a finger touch action, the number of grids with non-zero capacitance values generated by a joint touch action is smaller than that generated by a finger touch action, and the Z-axis direction acceleration generated by a joint touch action is greater than that generated by a finger touch action;
determining whether an application function associated with a trajectory of the joint touch gesture exists;
if an application function associated with the trajectory of the joint touch gesture exists, executing the application function;
if no application function associated with the trajectory of the joint touch gesture exists, further identifying whether the user interface displayed by the display is a text application interface;
if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying, in response to the joint touch gesture, a text selection area on the text application interface, wherein the text selection area is located between a first endpoint and a second endpoint, and the preset trajectory is preset when the electronic device leaves the factory or is preset by the user;
the first endpoint is located at a first position in the text application interface, the first position being the beginning or end of the character or word in the text application interface that is closest to the starting position of the trajectory of the joint touch gesture; and
the second endpoint is located at a second position in the text application interface, the second position being the beginning or end of the character or word in the text application interface that is closest to the ending position of the trajectory of the joint touch gesture.
3. An apparatus for selecting text, the apparatus comprising: a detection unit, a first judgment unit, a recognition unit, a second judgment unit, and a text selection unit;
the detection unit is configured to detect a joint touch gesture acting on the touch-sensitive surface, the joint touch gesture consisting of joint touch actions, wherein, when a touch-sensitive surface grid capacitance value generated by a touch action on the touch-sensitive surface meets a first preset capacitance value range, the number of grids with non-zero capacitance values is smaller than a preset value, and a Z-axis direction acceleration signal is within a first preset acceleration range, the touch action is determined to be a joint touch action; the touch-sensitive surface grid capacitance value generated by a joint touch action is smaller than that generated by a finger touch action, the number of grids with non-zero capacitance values generated by a joint touch action is smaller than that generated by a finger touch action, and the Z-axis direction acceleration generated by a joint touch action is greater than that generated by a finger touch action;
the first judgment unit is configured to judge whether an application function associated with the trajectory of the joint touch gesture exists;
an execution unit is configured to execute a first application function associated with the trajectory of the joint touch gesture if the first application function exists;
the recognition unit is configured to recognize, if no application function associated with the trajectory of the joint touch gesture exists, whether the user interface displayed by the display is a text application interface;
the second judgment unit is configured to judge whether the trajectory of the joint touch gesture matches a preset trajectory;
the text selection unit is configured to, when the joint touch gesture acting on the touch-sensitive surface is detected, display a text selection area on the text application interface in response to the joint touch gesture if no application function associated with the trajectory of the joint touch gesture exists, the user interface displayed by the display is a text application interface, and the trajectory of the joint touch gesture matches the preset trajectory, wherein the text selection area is located between a first endpoint and a second endpoint, and the preset trajectory is preset when the apparatus leaves the factory or is preset by the user;
the first endpoint is located at a first position in the text application interface, the first position being the beginning or end of the character or word in the text application interface that is closest to the starting position of the trajectory of the joint touch gesture; and
the second endpoint is located at a second position in the text application interface, the second position being the beginning or end of the character or word in the text application interface that is closest to the ending position of the trajectory of the joint touch gesture.
4. A user interface on a portable electronic device, the portable electronic device comprising a display, a memory, and a processor for executing instructions stored in the memory, wherein the display has a touch-sensitive surface, the user interface comprising:
an interface for displaying a text application;
when a joint touch gesture acting on the touch-sensitive surface is detected, the joint touch gesture consisting of joint touch actions, a touch action is determined to be a joint touch action when the touch-sensitive surface grid capacitance value generated by the touch action meets a first preset capacitance value range, the number of grids with non-zero capacitance values is smaller than a preset value, and the Z-axis direction acceleration signal is within a first preset acceleration range; the touch-sensitive surface grid capacitance value generated by a joint touch action is smaller than that generated by a finger touch action, the number of grids with non-zero capacitance values generated by a joint touch action is smaller than that generated by a finger touch action, and the Z-axis direction acceleration generated by a joint touch action is greater than that generated by a finger touch action; and if no application function associated with the trajectory of the joint touch gesture exists, the user interface displayed by the display is a text application interface, and the trajectory of the joint touch gesture matches a preset trajectory, an interface of a text selection area is displayed on the text application interface in response to the joint touch gesture, wherein the text selection area is located between a first endpoint and a second endpoint, and the preset trajectory is preset when the electronic device leaves the factory or is preset by the user;
the first endpoint is located at a first position in the text application interface, the first position being the beginning or end of the character or word in the text application interface that is closest to the starting position of the trajectory of the joint touch gesture; and
the second endpoint is located at a second position in the text application interface, the second position being the beginning or end of the character or word in the text application interface that is closest to the ending position of the trajectory of the joint touch gesture.
5. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by a portable electronic device comprising a display with a touch-sensitive surface, cause the portable electronic device to perform the following:
when a touch-sensitive surface grid capacitance value generated by a touch action on the touch-sensitive surface meets a first preset capacitance value range, the number of grids with non-zero capacitance values is smaller than a preset value, and a Z-axis direction acceleration signal is within a first preset acceleration range, determining that the touch action is a joint touch action, wherein a gesture formed by joint touch actions is a joint touch gesture, the touch-sensitive surface grid capacitance value generated by a joint touch action is smaller than that generated by a finger touch action, the number of grids with non-zero capacitance values generated by a joint touch action is smaller than that generated by a finger touch action, and the Z-axis direction acceleration generated by a joint touch action is greater than that generated by a finger touch action;
determining whether an application function associated with a trajectory of the joint touch gesture exists;
if an application function associated with the trajectory of the joint touch gesture exists, executing the application function;
if no application function associated with the trajectory of the joint touch gesture exists, further identifying whether the user interface displayed by the display is a text application interface;
if the user interface displayed by the display is a text application interface and the trajectory of the joint touch gesture matches a preset trajectory, displaying, in response to the joint touch gesture, a text selection area on the text application interface, wherein the text selection area is located between a first endpoint and a second endpoint, and the preset trajectory is preset when the electronic device leaves the factory or is preset by the user;
the first endpoint is located at a first position in the text application interface, the first position being the beginning or end of the character or word in the text application interface that is closest to the starting position of the trajectory of the joint touch gesture; and
the second endpoint is located at a second position in the text application interface, the second position being the beginning or end of the character or word in the text application interface that is closest to the ending position of the trajectory of the joint touch gesture.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2015/088618 WO2017035740A1 (en) | 2015-08-31 | 2015-08-31 | Method for selecting text |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107003759A CN107003759A (en) | 2017-08-01 |
CN107003759B true CN107003759B (en) | 2020-10-16 |
Family
ID=58186482
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201580030742.1A Active CN107003759B (en) | 2015-08-31 | 2015-08-31 | Method for selecting text |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN107003759B (en) |
WO (1) | WO2017035740A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113805487B (en) * | 2020-07-23 | 2022-09-23 | 荣耀终端有限公司 | Control instruction generation method and device, terminal equipment and readable storage medium |
CN113760167B (en) * | 2021-07-21 | 2024-07-19 | 广州朗国电子科技股份有限公司 | Method for copying object by using gesture, electronic equipment and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013044450A1 (en) * | 2011-09-27 | 2013-04-04 | Motorola Mobility, Inc. | Gesture text selection |
CN103605433A (en) * | 2013-11-28 | 2014-02-26 | 吴泽彬 | Multifunctional somatological input device |
CN104049728A (en) * | 2013-03-15 | 2014-09-17 | Lg电子株式会社 | Electronic device and control method thereof |
CN104769533A (en) * | 2013-05-06 | 2015-07-08 | 齐科斯欧公司 | Using finger touch types to interact with electronic devices |
CN104765487A (en) * | 2014-01-06 | 2015-07-08 | 船井电机株式会社 | Input apparatus |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8650507B2 (en) * | 2008-03-04 | 2014-02-11 | Apple Inc. | Selecting of text using gestures |
US8294680B2 (en) * | 2009-03-27 | 2012-10-23 | Sony Mobile Communications Ab | System and method for touch-based text entry |
KR20120097944A (en) * | 2011-02-28 | 2012-09-05 | 삼성전자주식회사 | Terminal having touch screen and method for outputting data thereof |
DE112011105305T5 (en) * | 2011-06-03 | 2014-03-13 | Google, Inc. | Gestures for text selection |
Also Published As
Publication number | Publication date |
---|---|
WO2017035740A1 (en) | 2017-03-09 |
WO2017035740A9 (en) | 2017-08-17 |
CN107003759A (en) | 2017-08-01 |
Similar Documents
Publication | Title |
---|---|
AU2020201096B2 | Quick screen splitting method, apparatus, and electronic device, display UI, and storage medium |
CN106775420B | Application switching method and device and graphical user interface |
CN108549519B | Split screen processing method and device, storage medium and electronic equipment |
US10725646B2 | Method and apparatus for switching screen interface and terminal |
CN109426410B | Method for controlling cursor movement, content selection method, method for controlling page scrolling and electronic equipment |
EP2851779A1 | Method, device, storage medium and terminal for displaying a virtual keyboard |
CN106933682A | The method and terminal of Message Processing |
US20150212693A1 | Interaction method and apparatus for listing data on mobile terminal |
CN106371749A | Method and device for terminal control |
CN108475161A | Display methods and terminal |
CN104793879B | Object selection method and terminal device on terminal device |
CN107066090B | Method for controlling fingerprint identification module and mobile terminal |
WO2018039914A1 | Method for copying data, and user terminal |
CN107003759B (en) | Method for selecting text |
CN108920086B | Split screen quitting method and device, storage medium and electronic equipment |
CN108304709B | Face unlocking method and related product |
CN107924261B | Method for selecting text |
WO2017035794A1 | Method and device for operating display, user interface, and storage medium |
EP3674867B1 | Human-computer interaction method and electronic device |
CN114398016A | Interface display method and device |
CN105700762B | Method and device for displaying option information |
CN110109582B | Display method and device of mobile terminal and storage medium |
US20240019942A1 | Input method and terminal |
CN106648425B | Method and device for preventing mistaken touch of terminal |
Legal Events
Code | Title |
---|---|
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
GR01 | Patent grant |