US20090183930A1 - Touch pad operable with multi-objects and method of operating same - Google Patents
- Publication number
- US20090183930A1 (U.S. application Ser. No. 12/057,883)
- Authority
- US
- United States
- Prior art keywords
- movement amount
- position coordinate
- software object
- zoom
- coordinate
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
- G06F3/04855—Interaction with scrollbars
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Abstract
The present invention provides a touch pad operable with multi-objects and a method of operating such a touch pad. The touch pad includes a touch structure for sensing touch points of a first and a second object and a controller for generating corresponding touching signals and related position coordinates. Moreover, the controller calculates at least two movement amount indexes according to coordinate differences between these position coordinates, thereby generating a movement amount control signal to control behaviors of a software object.
Description
- This application claims priority under 35 U.S.C. § 119 to Taiwan Patent Application No. 097102211, filed on Jan. 21, 2008, in the Taiwan Intellectual Property Office, the entire contents of which are hereby incorporated by reference.
- The present invention relates to a touch pad, and more particularly to a touch pad operable with multi-objects. The present invention also relates to a method of operating such a touch pad.
- Nowadays, consumer electronic products with touch pads or touch panels are becoming increasingly popular because of their ease and versatility of operation. A representative electronic product with a touch panel is the iPhone, a mobile phone designed and marketed by Apple Inc. To help users operate such electronic products well, their touch sensing interfaces are designed with humanization and user-friendliness in mind.
- Conventionally, by simply touching the surface of the touch sensing interface with a finger, the user can make selections and move a cursor. Nowadays, with the increasing demand for using the touch sensing interface as a control unit, operating touch pads or touch panels with only one finger is no longer satisfactory. As a consequence, touch sensing interfaces operated with two fingers have been developed. Take the iPhone for example: it is possible to zoom in and out of web pages or photos by placing two fingers on the touch sensing interface and spreading them farther apart or pinching them closer together, as if stretching or squeezing the image. The iPhone interface, however, enables the user to move the content up/down or leftward/rightward, or to rotate the content, only by a touch-drag motion of a single finger.
- Although the iPhone interface makes it easy to zoom in or out of images by spreading two fingers farther apart or closer together, there are still some drawbacks. For example, since the software for reading out the user's gestures relies on complicated moving control means, there is a need for a simplified method of quickly reading out the user's gestures. The present invention is concerned with capacitive or resistive touch pads.
- Moreover, since the software object can only be moved up/down or leftward/rightward or rotated by moving a single finger on the touch sensing interface, there is also a need for rotating the software object at a specified angle, or moving the software object along multiple directions, with two fingers.
- The present invention provides a method of operating a touch pad with at least two fingers to move the software object up/down or leftward/rightward, rotate the software object at a specified angle, and zoom in/out of the software object.
- The present invention further provides a touch pad operable with at least two fingers to move the software object up/down or leftward/rightward, rotate the software object at a specified angle, and zoom in/out of the software object.
- In accordance with an aspect of the present invention, there is provided a method of operating a touch pad with multi-objects. First of all, touch points of first and second objects on the touch pad are sensed to assert a first position coordinate (X1, Y1) and a second position coordinate (X2, Y2), respectively. Then, the second object is moved on the touch pad to a further touch point, and the further touch point is sensed to assert a third position coordinate (X3, Y3). According to coordinate differences between the first, second and third position coordinates, at least two movement amount indexes are calculated, wherein a first movement amount index is obtained according to a coordinate difference between the first and second position coordinates. Afterwards, a movement amount control signal is generated according to the at least two movement amount indexes.
- In an embodiment, the first object is a first finger, the second object is a second finger, and the first, second and third position coordinates are obtained in an absolute two-dimensional coordinate system or a relative two-dimensional coordinate system.
- In an embodiment, the method further includes the following steps. A first angle of the line through the first position coordinate and the second position coordinate with respect to the x-axis is measured and defined as the first movement amount index. Then, a second angle of the line through the first position coordinate and the third position coordinate with respect to the x-axis, is measured and defined as a second movement amount index. Then, an angle difference between the first angle and the second angle is calculated. According to the positive or negative sign of the angle difference, the movement amount control signal is generated to control behaviors of a software object. For example, the software object is a volume control key and the behaviors of the software object include displacement amount and displacement direction of the volume control key. Alternatively, the software object is a digital image and the behaviors of the software object include rotational amount and rotational direction of the digital image.
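The angle-difference test described above lends itself to a compact implementation. The following Python sketch is illustrative only and not part of the claimed embodiment; the function name, the use of `atan2`, and the convention that a positive angle difference in a y-up coordinate system denotes a clockwise rotation are our assumptions:

```python
import math

def rotation_sign(p1, p2, p3):
    """Classify a two-finger rotation by the angle-difference method.

    p1 is the stationary reference touch (X1, Y1); p2 and p3 are the
    initial and final touches (X2, Y2) and (X3, Y3) of the moving finger.
    Returns 'clockwise' when theta1 - theta2 > 0, 'counterclockwise'
    when theta1 - theta2 < 0, and None when the angle is unchanged
    (assumed sign convention for y-up coordinates).
    """
    theta1 = math.atan2(p2[1] - p1[1], p2[0] - p1[0])  # first angle
    theta2 = math.atan2(p3[1] - p1[1], p3[0] - p1[0])  # second angle
    diff = theta1 - theta2  # positive/negative sign drives the control signal
    if diff > 0:
        return "clockwise"
    if diff < 0:
        return "counterclockwise"
    return None
```

Using `atan2` rather than a bare `arctan((Y-Y1)/(X-X1))` avoids a division by zero when the two touch points share an x coordinate.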
- In an embodiment, the method further includes the following steps. A first slope S112 of the line through the first position coordinate and the second position coordinate is measured as the first movement amount index. A second slope S113 of the line through the first position coordinate and the third position coordinate is measured as a second movement amount index. A third slope S123 of the line through the second position coordinate and the third position coordinate is measured as a third movement amount index. If S112≧0, S113≧0, S123<0, (Y2-Y3)>0 and (X2-X3)<0, or if S112≦0, S113≦0, S123>0, (Y2-Y3)<0 and (X2-X3)<0, the movement amount control signal is generated to control a first rotational action of the software object. Whereas, if S112≧0, S113≧0, S123<0, (Y2-Y3)<0 and (X2-X3)>0, or if S112≦0, S113≦0, S123>0, (Y2-Y3)>0 and (X2-X3)>0, the movement amount control signal is generated to control a second rotational action of the software object. For example, the first rotational action and the second rotational action are respectively a clockwise rotational action and a counterclockwise rotational action. The software object is a volume control key and the behaviors of the software object include displacement amount and displacement direction of the volume control key. Alternatively, the software object is a digital image and the behaviors of the software object include rotational amount and rotational direction of the digital image.
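The slope-based rotation conditions above can be sketched as follows. This is an illustrative Python rendering of the stated inequalities; the helper names are ours, and assigning an infinite slope to a vertical line is an assumption the disclosure does not address:

```python
def rotation_by_slopes(p1, p2, p3):
    """Slope-based rotation classifier: p1 is the stationary finger,
    p2 and p3 the moving finger's start and end points.
    Returns 'clockwise', 'counterclockwise', or None.
    """
    def slope(a, b):
        dx = b[0] - a[0]
        return float("inf") if dx == 0 else (b[1] - a[1]) / dx

    s112 = slope(p1, p2)   # first movement amount index
    s113 = slope(p1, p3)   # second movement amount index
    s123 = slope(p3, p2)   # third movement amount index, (Y2-Y3)/(X2-X3)
    dy = p2[1] - p3[1]     # (Y2-Y3)
    dx = p2[0] - p3[0]     # (X2-X3)
    if (s112 >= 0 and s113 >= 0 and s123 < 0 and dy > 0 and dx < 0) or \
       (s112 <= 0 and s113 <= 0 and s123 > 0 and dy < 0 and dx < 0):
        return "clockwise"          # first rotational action
    if (s112 >= 0 and s113 >= 0 and s123 < 0 and dy < 0 and dx > 0) or \
       (s112 <= 0 and s113 <= 0 and s123 > 0 and dy > 0 and dx > 0):
        return "counterclockwise"   # second rotational action
    return None
```

Comparing slope signs in this way trades the arctangent evaluations of the first embodiment for a few divisions and comparisons, which is the computing-speed advantage the description attributes to this variant.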
- In an embodiment, the method further includes the following steps. A first slope S212 of the line through the first position coordinate and the second position coordinate is measured as the first movement amount index. A second slope S213 of the line through the first position coordinate and the third position coordinate is measured as a second movement amount index. A third slope S232 of the line through the third position coordinate and the second position coordinate is measured as a third movement amount index. If S212≧0, S213≧0, S232≧0, (X2-X1)>(X3-X1), and (Y2-Y1)>(Y3-Y1), or if S212<0, S213<0, S232<0, (X2-X1)>(X3-X1), and (Y2-Y1)>(Y3-Y1), the movement amount control signal is generated to control a first zoom in/out action of the software object. Whereas, if S212≧0, S213≧0, S232≧0, (X2-X1)<(X3-X1), and (Y2-Y1)<(Y3-Y1), or if S212<0, S213<0, S232<0, (X2-X1)<(X3-X1), and (Y2-Y1)<(Y3-Y1), the movement amount control signal is generated to control a second zoom in/out action of the software object. For example, the first zoom in/out action and the second zoom in/out action are respectively a zoom out action and a zoom in action. The software object is a digital image, and the behaviors of the software object include zoom in/out amount and zoom in/out direction of the digital image.
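The zoom classification under these slope conditions can be sketched in Python. Illustrative only; the names and the infinite-slope handling of vertical lines are our assumptions:

```python
def zoom_action(p1, p2, p3):
    """Classify a pinch along the line through the reference touch p1,
    from the moving finger's start p2 to its end p3.
    Returns 'zoom_out' when the moving finger approaches the reference,
    'zoom_in' when it recedes, None when the pattern does not match.
    """
    def slope(a, b):
        dx = b[0] - a[0]
        return float("inf") if dx == 0 else (b[1] - a[1]) / dx

    s212 = slope(p1, p2)   # first movement amount index
    s213 = slope(p1, p3)   # second movement amount index
    s232 = slope(p3, p2)   # third movement amount index
    same_sign = (s212 >= 0 and s213 >= 0 and s232 >= 0) or \
                (s212 < 0 and s213 < 0 and s232 < 0)
    if not same_sign:       # the three points are not collinear enough
        return None
    dx2, dy2 = p2[0] - p1[0], p2[1] - p1[1]   # (X2-X1), (Y2-Y1)
    dx3, dy3 = p3[0] - p1[0], p3[1] - p1[1]   # (X3-X1), (Y3-Y1)
    if dx2 > dx3 and dy2 > dy3:
        return "zoom_out"   # first zoom in/out action
    if dx2 < dx3 and dy2 < dy3:
        return "zoom_in"    # second zoom in/out action
    return None
```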
- In an embodiment, the method further includes the following steps. The first object is moved on the touch pad to a further touch point, and the further touch point is sensed to assert a fourth position coordinate (X4, Y4). Then, a third movement amount index is obtained according to a coordinate difference between the second and third position coordinates, a fourth movement amount index is obtained according to a coordinate difference between the first and fourth position coordinates, and a fifth movement amount index is obtained according to a coordinate difference between the fourth and third position coordinates. Afterwards, the movement amount control signal is generated according to the first, third, fourth and fifth movement amount indexes.
- In an embodiment, the method further includes the following steps. A first slope S312 of the line through the first position coordinate and the second position coordinate is measured as the first movement amount index. A third slope S332 of the line through the second position coordinate and the third position coordinate is measured as a third movement amount index. A fourth slope S314 of the line through the first position coordinate and the fourth position coordinate is measured as a fourth movement amount index. A fifth slope S343 of the line through the fourth position coordinate and the third position coordinate is measured as a fifth movement amount index. If S312≧0, S332≧0, S314≧0, S343≧0, (X2-X1)>(X3-X4), and (Y2-Y1)>(Y3-Y4), or if S312<0, S332<0, S314<0, S343<0, (X2-X1)>(X3-X4), and (Y2-Y1)>(Y3-Y4), the movement amount control signal is generated to control a first zoom in/out action of the software object. Whereas, if S312≧0, S332≧0, S314≧0, S343≧0, (X2-X1)<(X3-X4), and (Y2-Y1)<(Y3-Y4), or if S312<0, S332<0, S314<0, S343<0, (X2-X1)<(X3-X4), and (Y2-Y1)<(Y3-Y4), the movement amount control signal is generated to control a second zoom in/out action of the software object. For example, the first zoom in/out action and the second zoom in/out action are respectively a zoom out action and a zoom in action. The software object is a digital image, and the behaviors of the software object include zoom in/out amount and zoom in/out direction of the digital image.
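The four-coordinate variant, in which both fingers move, can be sketched in the same fashion. Illustrative only; the argument order and function name are our assumptions:

```python
def zoom_action_two_moving(p1, p2, p3, p4):
    """Four-point pinch: finger 1 moves from p1 to p4, finger 2 from
    p2 to p3. Returns 'zoom_out' when the finger separation shrinks,
    'zoom_in' when it grows, None otherwise.
    """
    def slope(a, b):
        dx = b[0] - a[0]
        return float("inf") if dx == 0 else (b[1] - a[1]) / dx

    # S312, S332, S314, S343 of the described embodiment
    slopes = (slope(p1, p2), slope(p3, p2), slope(p1, p4), slope(p4, p3))
    if not (all(s >= 0 for s in slopes) or all(s < 0 for s in slopes)):
        return None
    sep0 = (p2[0] - p1[0], p2[1] - p1[1])  # initial separation (X2-X1, Y2-Y1)
    sep1 = (p3[0] - p4[0], p3[1] - p4[1])  # final separation (X3-X4, Y3-Y4)
    if sep0[0] > sep1[0] and sep0[1] > sep1[1]:
        return "zoom_out"   # fingers moved toward each other
    if sep0[0] < sep1[0] and sep0[1] < sep1[1]:
        return "zoom_in"    # fingers moved apart
    return None
```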
- In accordance with another aspect of the present invention, there is provided a touch pad operable with multi-objects. The touch pad is communicated with a host and a display body, and includes a touch structure and a controller. The touch structure has a lower surface communicated with the display body and an upper surface for sensing touch points. When touch points of first and second objects on the touch pad are sensed, first and second touching signals are respectively generated. When the second object is moved on the touch pad to a further touch point and the further touch point is sensed, a third touching signal is generated. The controller is electrically connected to the touch structure and the host for receiving the first, second and third touching signals and generating a first position coordinate (X1, Y1), a second position coordinate (X2, Y2) and a third position coordinate (X3, Y3), respectively. The controller calculates at least two movement amount indexes according to coordinate differences between the first, second and third position coordinates, thereby generating a movement amount control signal. A first movement amount index is obtained according to a coordinate difference between the first and second position coordinates.
- The above objects and advantages of the present invention will become more readily apparent to those ordinarily skilled in the art after reviewing the following detailed description and accompanying drawings, in which:
- FIG. 1 is a flowchart illustrating a method of operating a touch pad according to a first preferred embodiment of the present invention;
- FIGS. 2A˜2D are schematic two-dimensional coordinate diagrams illustrating the operating principles of the first preferred embodiment;
- FIGS. 3A and 3B are schematic diagrams illustrating an implementation example of controlling displacement amount and displacement direction of a volume control key according to the angle difference;
- FIGS. 4A and 4B are schematic diagrams illustrating another implementation example of controlling rotational amount and rotational direction of an image according to the angle difference;
- FIG. 5 is a schematic block diagram illustrating an interpreting system of the touch pad according to the present invention;
- FIG. 6 is a flowchart illustrating a method of operating a touch pad according to a second preferred embodiment of the present invention;
- FIG. 7 is a schematic two-dimensional coordinate diagram illustrating the operating principles of the second preferred embodiment;
- FIG. 8 is a flowchart illustrating a method of operating a touch pad according to a third preferred embodiment of the present invention;
- FIG. 9 is a schematic two-dimensional coordinate diagram illustrating the operating principles of the third preferred embodiment;
- FIGS. 10A and 10B are schematic diagrams illustrating another implementation example of controlling zoom in/out amount and zoom in/out direction of the digital image;
- FIG. 11 is a flowchart illustrating a method of operating a touch pad according to a fourth preferred embodiment of the present invention; and
- FIG. 12 is a schematic two-dimensional coordinate diagram illustrating the operating principles of the fourth preferred embodiment.
- The present invention will now be described more specifically with reference to the following embodiments. It is to be noted that the following descriptions of preferred embodiments of this invention are presented herein for purpose of illustration and description only. It is not intended to be exhaustive or to be limited to the precise form disclosed.
- Hereinafter, a method of operating a touch pad according to a first preferred embodiment of the present invention will be illustrated with reference to the flowchart of FIG. 1 and the two-dimensional coordinate diagrams of FIGS. 2A˜2D.
- When a first object (e.g. a first finger F1) is placed on a touch position of the touch pad 10 (Step A1), the coordinate of the touch point is detected so as to assert a first position coordinate (X1, Y1), as is shown in FIG. 2A and Step A2 of FIG. 1.
- Next, as shown in FIG. 2B and Step A3 of FIG. 1, when a second object (e.g. a second finger F2) is placed on another touch point of the touch pad 10, the coordinate of the touch point is detected so as to assert a second position coordinate (X2, Y2). With the first position coordinate serving as a reference point, a first movement amount index indicating a relation between the first position coordinate (X1, Y1) and the second position coordinate (X2, Y2) is measured. In this embodiment, the first movement amount index is for example a first angle θ1, i.e. θ1=arctan((Y2-Y1)/(X2-X1)).
- Next, as shown in FIG. 2C and Step A4 of FIG. 1, when the second finger F2 is moved to and stays at a further touch point of the touch pad 10, the coordinate of the touch point is detected so as to assert a third position coordinate (X3, Y3). In this embodiment, the second finger F2 is moved from the initial position (i.e. the second position coordinate (X2, Y2)) to a destination position (i.e. the third position coordinate (X3, Y3)) in a clockwise direction M11. With the first position coordinate serving as a reference point, a second movement amount index indicating a relation between the first position coordinate (X1, Y1) and the third position coordinate (X3, Y3) is measured. In this embodiment, the second movement amount index is for example a second angle θ2, i.e. θ2=arctan((Y3-Y1)/(X3-X1)).
- As shown in FIG. 2D and Step A5 of FIG. 1, an angle difference θ between the first angle θ1 and the second angle θ2 is calculated. According to the positive or negative sign of the angle difference θ, a movement amount control signal C is generated to control behaviors of a software object 301. Some exemplary behaviors of the software object 301 to be controlled in response to the movement amount control signal C are shown in FIGS. 4A, 4B and 5, which will be described later. In a case that θ=θ1−θ2<0, the rotational movement amount has a negative sign. Whereas, the rotational movement amount has a positive sign if θ=θ1−θ2>0.
- An implementation example of controlling the behaviors of the software object 301 according to the angle difference θ will be illustrated with reference to FIG. 3A and FIG. 3B. In this embodiment, the software object 301 is a volume control key. The behaviors of the software object 301 to be controlled include displacement amount and displacement direction of the volume control key.
- As shown in FIG. 3A, the first finger F1 stays at a touch position of the touch pad 10 as a reference point, and the second finger F2 is moved from an initial position to a destination position in a clockwise direction M11. As previously described in FIGS. 2A˜2B, a movement amount control signal C is generated. In response to the movement amount control signal C, the volume control indicator of the volume control key 301 moves downwardly (i.e. in a clockwise direction M12). On the contrary, as shown in FIG. 3B, if the second finger F2 is moved from an initial position to a destination position in a counterclockwise direction M21, the volume control indicator of the volume control key 301 moves upwardly (i.e. in a counterclockwise direction M22).
- Another implementation example of controlling the behaviors of the software object 301 according to the angle difference θ will be illustrated with reference to FIG. 4A and FIG. 4B. In this embodiment, the software object 301 is for example a digital image. The behaviors of the software object 301 to be controlled include rotational amount and rotational direction of the digital image.
- As shown in FIG. 4A, the first finger F1 stays at a touch position of the touch pad 10 as a reference point, and the second finger F2 is moved from an initial position to a destination position in a clockwise direction M31. As is also described in FIGS. 2A˜2B, a movement amount control signal C is generated. In response to the movement amount control signal C, the image 301 is rotated in the clockwise direction M32. On the contrary, as shown in FIG. 4B, if the second finger F2 is moved from an initial position to a destination position in a counterclockwise direction M41, the image 301 is rotated in the counterclockwise direction M42. -
FIG. 5 is a schematic block diagram illustrating an interpreting system of the touch pad according to the present invention. The interpreting system of FIG. 5 includes the touch pad 10, a display body 20 and a host 30.
- The touch pad 10 is communicated with the host 30, and includes a touch structure 101 and a controller 102. The controller 102 is electrically communicated with the touch structure 101 and the host 30. The touch structure 101 is communicated with the display body 20. For example, the lower surface of the touch structure 101 can be combined with the display body 20 by a mechanical assembling action M, as is shown in FIG. 5. Alternatively, the touch structure 101 can be electrically connected with the display body 20 (not shown). When the first finger F1 and the second finger F2 are respectively placed on first and second touch points on the upper surface of the touch pad 10, a first touching signal S1 and a second touching signal S2 are asserted to the controller 102. When the second finger F2 is moved to and stays at a third touch point of the touch pad 10, a third touching signal S3 is asserted to the controller 102.
- When the touching signals S1, S2 and S3 are received by the controller 102, a first position coordinate (X1, Y1), a second position coordinate (X2, Y2) and a third position coordinate (X3, Y3) are respectively generated. With the first position coordinate (X1, Y1) serving as a reference point, a first angle θ1 of the second position coordinate (X2, Y2) and a second angle θ2 of the third position coordinate (X3, Y3) are calculated. According to the positive or negative sign of the angle difference θ, a movement amount control signal C is asserted to the host 30. In response to the movement amount control signal C, the host 30 can control behaviors of the display information (i.e. the software object 301) shown on the display body 20.
- In the first preferred embodiment as described in FIGS. 1, 2, 3 and 4, the software object 301 is rotated in either a clockwise or counterclockwise direction according to the angle difference. Nevertheless, the software object 301 can also be controlled according to the slope of the line through different touch points, thereby increasing the computing speed.
- Hereinafter, another embodiment of operating a touch pad according to the present invention will be illustrated with reference to the flowchart of
FIG. 6 and the two-dimensional coordinate diagram of FIG. 7.
- When a first object (e.g. a first finger F1) is placed on a touch position of the touch pad 10 (Step B1), the coordinate of the touch point is detected so as to assert a first position coordinate (X1, Y1) (Step B2).
- In Step B3, when a second object (e.g. a second finger F2) is placed on another touch point of the touch pad 10, the coordinate of the touch point is detected so as to assert a second position coordinate (X2, Y2).
- In Step B4, when the second finger F2 is moved to and stays at a further touch point of the
touch pad 10, the coordinate of the touch point is detected so as to assert a third position coordinate (X3, Y3). In this embodiment, the second finger F2 is moved from the initial position (i.e. the second position coordinate (X2, Y2)) to a destination position (i.e. the third position coordinate (X3, Y3)) in a clockwise direction M11.
- In Step B5, a first slope S112 of the line through the first position coordinate (X1, Y1) and the second position coordinate (X2, Y2) is measured and defined as a first movement amount index, i.e. S112=(Y2-Y1)/(X2-X1). Likewise, a second slope S113 of the line through the first position coordinate (X1, Y1) and the third position coordinate (X3, Y3) is measured and defined as a second movement amount index, i.e. S113=(Y3-Y1)/(X3-X1). Likewise, a third slope S123 of the line through the second position coordinate (X2, Y2) and the third position coordinate (X3, Y3) is measured and defined as a third movement amount index, i.e. S123=(Y2-Y3)/(X2-X3).
- In Step B6, if the first slope S112≧0, the second slope S113≧0, the third slope S123<0, (Y2-Y3)>0 and (X2-X3)<0, a movement amount control signal C is generated to control a first rotational action (e.g. a clockwise rotational action) of the software object 301. Alternatively, if the first slope S112≦0, the second slope S113≦0, the third slope S123>0, (Y2-Y3)<0 and (X2-X3)<0, the movement amount control signal C is also generated to control the first rotational action (e.g. a clockwise rotational action) of the software object 301.
- In Step B7, if the first slope S112≧0, the second slope S113≧0, the third slope S123<0, (Y2-Y3)<0 and (X2-X3)>0, a movement amount control signal C is generated to control a second rotational action (e.g. a counterclockwise rotational action) of the software object 301. Alternatively, if the first slope S112≦0, the second slope S113≦0, the third slope S123>0, (Y2-Y3)>0 and (X2-X3)>0, the movement amount control signal C is also generated to control the second rotational action (e.g. a counterclockwise rotational action) of the software object 301.
- Hereinafter, another embodiment of operating a touch pad according to the present invention will be illustrated with reference to the flowchart of
FIG. 8 and the two-dimensional coordinate diagram ofFIG. 9 . In this embodiment, two fingers are employed to zoom in or out of a digital image. - When a first object (e.g. a first finger F1) is placed on a touch position of the touch pad 10 (Step C1), the coordinate of the touch point is detected so as to assert a first position coordinate (X1, Y1) (Step C2).
- In Step C3, when a second object (e.g. a second finger F2) is placed on another touch point of the
touch pad 10, the coordinate of the touch point is detected so as to assert a second position coordinate (X2, Y2). - In Step C4, when the second finger F2 is moved to and stayed at a further touch point of the
touch pad 10, the coordinate of the touch point is detected so as to assert a third position coordinate (X3, Y3). In this embodiment, the second finger F2 is moved from the initial position (i.e. the second position coordinate (X2, Y2)) to a destination position (i.e. the third position coordinate (X3, Y3)) in a zoom-out direction M61. - In Step C5, a first slope S212 of the line through the first position coordinate (X1, Y1) and the second position coordinate (X2, Y2) is measured and defined as a first movement amount index, i.e. S212=(Y2-Y1)/(X2-X1). Likewise, a second slope S213 of the line through the first position coordinate (X1, Y1) and the third position coordinate (X3, Y3) is measured and defined as a second movement amount index, i.e. S213=(Y3-Y1)/(X3-X1). Likewise, a third slope S232 of the line through the third position coordinate (X3, Y3) and the second position coordinate (X2, Y2) is measured and defined as a third movement amount index, i.e. S232=(Y2-Y3)/(X2-X3).
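Step C5 and the zoom conditions recited in Steps C6 and C7 can be sketched as follows. Again this is an illustrative sketch under assumptions not in the patent: the function name, the "zoom-out"/"zoom-in" return values, and the treatment of vertical lines are hypothetical.

```python
def classify_pinch(p1, p2, p3):
    """Steps C5-C7: zoom decision with a stationary first finger.

    p1 is the first finger (X1, Y1); p2 and p3 are the second finger's
    start (X2, Y2) and destination (X3, Y3) positions.
    Returns "zoom-out", "zoom-in", or None.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    try:
        s212 = (y2 - y1) / (x2 - x1)  # first movement amount index
        s213 = (y3 - y1) / (x3 - x1)  # second movement amount index
        s232 = (y2 - y3) / (x2 - x3)  # third movement amount index
    except ZeroDivisionError:
        return None  # vertical lines are outside the stated conditions

    same_sign = (s212 >= 0 and s213 >= 0 and s232 >= 0) or \
                (s212 < 0 and s213 < 0 and s232 < 0)
    if not same_sign:
        return None
    # Step C6: second finger moved toward the first finger -> squeeze
    if (x2 - x1) > (x3 - x1) and (y2 - y1) > (y3 - y1):
        return "zoom-out"
    # Step C7: second finger moved away from the first finger -> stretch
    if (x2 - x1) < (x3 - x1) and (y2 - y1) < (y3 - y1):
        return "zoom-in"
    return None
```

With the first finger at (0, 0), a second finger moving from (4, 4) in to (2, 2) matches Step C6 (zoom out); the reverse motion matches Step C7 (zoom in).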
- In Step C6, if the first slope S212≧0, the second slope S213≧0, the third slope S232≧0, (X2-X1)>(X3-X1), and (Y2-Y1)>(Y3-Y1), a movement amount control signal C is generated to control a first zoom in/out action (e.g. a zoom-out action in the direction M61 as shown in
FIG. 10A) of the software object 301. Alternatively, if the first slope S212<0, the second slope S213<0, the third slope S232<0, (X2-X1)>(X3-X1), and (Y2-Y1)>(Y3-Y1), the movement amount control signal C is also generated to control the first zoom in/out action (e.g. a zoom-out action in the direction M61 as shown in FIG. 10A) of the software object 301. - In Step C7, if the first slope S212≧0, the second slope S213≧0, the third slope S232≧0, (X2-X1)<(X3-X1), and (Y2-Y1)<(Y3-Y1), a movement amount control signal C is generated to control a second zoom in/out action (e.g. a zoom-in action in the direction M71 as shown in
FIG. 10B) of the software object 301. Alternatively, if the first slope S212<0, the second slope S213<0, the third slope S232<0, (X2-X1)<(X3-X1), and (Y2-Y1)<(Y3-Y1), the movement amount control signal C is also generated to control the second zoom in/out action (e.g. a zoom-in action in the direction M71 as shown in FIG. 10B) of the software object 301. - Another implementation example of controlling the behaviors of the
software object 301 will be illustrated with reference to FIG. 10A and FIG. 10B. In this embodiment, the software object 301 is a digital image. The behaviors of the software object 301 to be controlled include the zoom in/out amount and zoom in/out direction of the digital image. As shown in FIG. 10A, the first finger F1 stays at a touch position of the touch pad 10 as a reference point while the second finger F2 comes closer to the first finger F1 in the direction M61, so that the image 301 is squeezed in the zoom-out direction M62. On the contrary, as shown in FIG. 10B, the first finger F1 stays at a touch position of the touch pad 10 as a reference point while the second finger F2 is spread apart from the first finger F1 in the direction M71, so that the image 301 is stretched in the zoom-in direction M72. - Hereinafter, a further embodiment of operating a touch pad according to the present invention will be illustrated with reference to the flowchart of
FIG. 11 and the two-dimensional coordinate diagram of FIG. 12. In this embodiment, two fingers are simultaneously moved to zoom in or out of an image. - When a first object (e.g. a first finger F1) is placed on a touch position of the touch pad 10 (Step D1), the coordinate of the touch point is detected so as to assert a first position coordinate (X1, Y1) (Step D2).
- In Step D3, when a second object (e.g. a second finger F2) is placed on another touch point of the
touch pad 10, the coordinate of the touch point is detected so as to assert a second position coordinate (X2, Y2). - In Step D4, the first finger F1 and the second finger F2 are simultaneously moved. When the second finger F2 and the first finger F1 are moved to and stay at specified touch points of the
touch pad 10, the coordinates of the touch points are detected so as to respectively assert a third position coordinate (X3, Y3) and a fourth position coordinate (X4, Y4). In this embodiment, the second finger F2 is moved from the initial position (i.e. the second position coordinate (X2, Y2)) to the destination position (i.e. the third position coordinate (X3, Y3)) in a first zoom-out direction M81. In addition, the first finger F1 is moved from the initial position (i.e. the first position coordinate (X1, Y1)) to the destination position (i.e. the fourth position coordinate (X4, Y4)) in a second zoom-out direction M82. - In Step D5, a first slope S312 of the line through the first position coordinate (X1, Y1) and the second position coordinate (X2, Y2) is measured and defined as a first movement amount index, i.e. S312=(Y2-Y1)/(X2-X1). Likewise, a third slope S332 of the line through the third position coordinate (X3, Y3) and the second position coordinate (X2, Y2) is measured and defined as a third movement amount index, i.e. S332=(Y2-Y3)/(X2-X3). Likewise, a fourth slope S314 of the line through the first position coordinate (X1, Y1) and the fourth position coordinate (X4, Y4) is measured and defined as a fourth movement amount index, i.e. S314=(Y4-Y1)/(X4-X1). Likewise, a fifth slope S343 of the line through the fourth position coordinate (X4, Y4) and the third position coordinate (X3, Y3) is measured and defined as a fifth movement amount index, i.e. S343=(Y3-Y4)/(X3-X4).
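Step D5 and the conditions recited in Steps D6 and D7 extend the same slope test to the case where both fingers move. The sketch below follows those conditions; the function name, argument order, and return values are assumptions made for illustration.

```python
def classify_two_finger_zoom(p1, p2, p3, p4):
    """Steps D5-D7: zoom decision when both fingers move simultaneously.

    p1 and p2 are the fingers' start positions (X1, Y1) and (X2, Y2);
    p4 and p3 are their respective destinations (X4, Y4) and (X3, Y3).
    Returns "zoom-out", "zoom-in", or None.
    """
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    try:
        s312 = (y2 - y1) / (x2 - x1)  # first movement amount index
        s332 = (y2 - y3) / (x2 - x3)  # third movement amount index
        s314 = (y4 - y1) / (x4 - x1)  # fourth movement amount index
        s343 = (y3 - y4) / (x3 - x4)  # fifth movement amount index
    except ZeroDivisionError:
        return None  # vertical lines are outside the stated conditions

    same_sign = all(s >= 0 for s in (s312, s332, s314, s343)) or \
                all(s < 0 for s in (s312, s332, s314, s343))
    if not same_sign:
        return None
    # Step D6: finger separation decreased -> zoom out
    if (x2 - x1) > (x3 - x4) and (y2 - y1) > (y3 - y4):
        return "zoom-out"
    # Step D7: finger separation increased -> zoom in
    if (x2 - x1) < (x3 - x4) and (y2 - y1) < (y3 - y4):
        return "zoom-in"
    return None
```

For instance, fingers starting at (0, 0) and (6, 6) that end at (2, 2) and (4, 4) have closed toward each other along the same line, matching Step D6 (zoom out).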
- In Step D6, if the first slope S312≧0, the third slope S332≧0, the fourth slope S314≧0, the fifth slope S343≧0, (X2-X1)>(X3-X4), and (Y2-Y1)>(Y3-Y4), a movement amount control signal C is generated to control a zoom-out action of the
software object 301 in the directions M81 and M82 (as shown in FIG. 12). Alternatively, if the first slope S312<0, the third slope S332<0, the fourth slope S314<0, the fifth slope S343<0, (X2-X1)>(X3-X4), and (Y2-Y1)>(Y3-Y4), the movement amount control signal C is also generated to control the zoom-out action of the software object 301 in the directions M81 and M82 (as shown in FIG. 12). - In Step D7, if the first slope S312≧0, the third slope S332≧0, the fourth slope S314≧0, the fifth slope S343≧0, (X2-X1)<(X3-X4), and (Y2-Y1)<(Y3-Y4), a movement amount control signal C is generated to control a zoom-in action (not shown) of the
software object 301. Alternatively, if the first slope S312<0, the third slope S332<0, the fourth slope S314<0, the fifth slope S343<0, (X2-X1)<(X3-X4), and (Y2-Y1)<(Y3-Y4), the movement amount control signal C is also generated to control the zoom-in action (not shown) of the software object 301. - From the above embodiments, the method of the present invention allows two fingers on the touch pad to rotate the software object by a specified angle, to move the software object in multiple directions, and to zoom in/out the software object.
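Besides the slope conditions, the claims also recite an angle-based variant in which the angles of the two lines with respect to the x-axis are compared and the sign of their difference selects the behavior (claims 3 and 10). A minimal sketch, assuming the use of atan2 and a sign convention (positive for counterclockwise) that the patent does not specify:

```python
import math

def angle_difference_sign(p1, p2, p3):
    """Sign of the angle change of the line from p1 to the moving finger.

    Returns +1, -1, or 0; only the positive/negative sign is used by the
    claimed method to select the behavior of the software object.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1 = math.atan2(y2 - y1, x2 - x1)  # first angle w.r.t. the x-axis
    a2 = math.atan2(y3 - y1, x3 - x1)  # second angle w.r.t. the x-axis
    diff = a2 - a1
    # Wrap into (-pi, pi] so crossing the +/-180 degree line behaves.
    diff = (diff + math.pi) % (2 * math.pi) - math.pi
    return (diff > 0) - (diff < 0)
```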
- While the invention has been described in terms of what is presently considered to be the most practical and preferred embodiments, it is to be understood that the invention need not be limited to the disclosed embodiments. On the contrary, it is intended to cover various modifications and similar arrangements included within the spirit and scope of the appended claims, which are to be accorded the broadest interpretation so as to encompass all such modifications and similar structures.
Claims (14)
1. A method of operating a touch pad with multi-objects, comprising steps of:
sensing touch points of first and second objects on said touch pad to assert a first position coordinate (X1, Y1) and a second position coordinate (X2, Y2), respectively;
moving said second object on said touch pad to a further touch point, and sensing said further touch point to assert a third position coordinate (X3, Y3);
calculating at least two movement amount indexes according to coordinate differences between said first, second and third position coordinates, wherein a first movement amount index is obtained according to a coordinate difference between said first and second position coordinates; and
generating a movement amount control signal according to said at least two movement amount indexes.
2. The method according to claim 1 wherein said first object is a first finger, said second object is a second finger, and said first, second and third position coordinates are obtained in an absolute two-dimensional coordinate system or a relative two-dimensional coordinate system.
3. The method according to claim 2 further comprising steps of:
measuring a first angle of the line through said first position coordinate and said second position coordinate with respect to the x-axis, and defining said first angle as said first movement amount index;
measuring a second angle of the line through said first position coordinate and said third position coordinate with respect to the x-axis, and defining said second angle as a second movement amount index; and
calculating an angle difference between said first angle and said second angle, and generating said movement amount control signal to control behaviors of a software object according to the positive or negative sign of said angle difference, wherein said software object is a volume control key and said behaviors of said software object include displacement amount and displacement direction of said volume control key, or said software object is a digital image and said behaviors of said software object include rotational amount and rotational direction of the digital image.
4. The method according to claim 2 further comprising steps of:
measuring a first slope S112 of the line through said first position coordinate and said second position coordinate as said first movement amount index, measuring a second slope S113 of the line through said first position coordinate and said third position coordinate as a second movement amount index, and measuring a third slope S123 of the line through said second position coordinate and said third position coordinate as a third movement amount index;
generating said movement amount control signal to control a first rotational action of said software object if S112≧0, S113≧0, S123<0, (Y2-Y3)>0 and (X2-X3)<0, or if S112≦0, S113≦0, S123>0, (Y2-Y3)<0 and (X2-X3)<0; and
generating said movement amount control signal to control a second rotational action of said software object if S112≧0, S113≧0, S123<0, (Y2-Y3)<0 and (X2-X3)>0, or if S112≦0, S113≦0, S123>0, (Y2-Y3)>0 and (X2-X3)>0,
wherein said first rotational action and said second rotational action are respectively a clockwise rotational action and a counterclockwise rotational action, said software object is a volume control key and said behaviors of said software object include displacement amount and displacement direction of said volume control key, or said software object is a digital image and said behaviors of said software object include rotational amount and rotational direction of the digital image.
5. The method according to claim 2 further comprising steps of:
measuring a first slope S212 of the line through said first position coordinate and said second position coordinate as said first movement amount index, measuring a second slope S213 of the line through said first position coordinate and said third position coordinate as a second movement amount index, and measuring a third slope S232 of the line through said second position coordinate and said third position coordinate as a third movement amount index;
generating said movement amount control signal to control a first zoom in/out action of said software object if S212≧0, S213≧0, S232≧0, (X2-X1)>(X3-X1), and (Y2-Y1)>(Y3-Y1), or if S212<0, S213<0, S232<0, (X2-X1)>(X3-X1), and (Y2-Y1)>(Y3-Y1); and
generating said movement amount control signal to control a second zoom in/out action of said software object if S212≧0, S213≧0, S232≧0, (X2-X1)<(X3-X1), and (Y2-Y1)<(Y3-Y1), or if S212<0, S213<0, S232<0, (X2-X1)<(X3-X1), and (Y2-Y1)<(Y3-Y1),
wherein said first zoom in/out action and said second zoom in/out action are respectively a zoom out action and a zoom in action, said software object is a digital image, and said behaviors of said software object include zoom in/out amount and zoom in/out direction of said digital image.
6. The method according to claim 2 further comprising steps of:
moving said first object on said touch pad to a further touch point, and sensing said further touch point to assert a fourth position coordinate (X4, Y4);
obtaining a third movement amount index according to a coordinate difference between said second and third position coordinates;
obtaining a fourth movement amount index according to a coordinate difference between said first and fourth position coordinates;
obtaining a fifth movement amount index according to a coordinate difference between said fourth and third position coordinates; and
generating said movement amount control signal according to said first, third, fourth and fifth movement amount indexes.
7. The method according to claim 6 further comprising steps of:
measuring a first slope S312 of the line through said first position coordinate and said second position coordinate as said first movement amount index, measuring a third slope S332 of the line through said second position coordinate and said third position coordinate as a third movement amount index, measuring a fourth slope S314 of the line through said first position coordinate and said fourth position coordinate as a fourth movement amount index, and measuring a fifth slope S343 of the line through said fourth position coordinate and said third position coordinate as a fifth movement amount index;
generating said movement amount control signal to control a first zoom in/out action of said software object if S312≧0, S332≧0, S314≧0, S343≧0, (X2-X1)>(X3-X4), and (Y2-Y1)>(Y3-Y4), or if S312<0, S332<0, S314<0, S343<0, (X2-X1)>(X3-X4), and (Y2-Y1)>(Y3-Y4); and
generating said movement amount control signal to control a second zoom in/out action of said software object if S312≧0, S332≧0, S314≧0, S343≧0, (X2-X1)<(X3-X4), and (Y2-Y1)<(Y3-Y4), or if S312<0, S332<0, S314<0, S343<0, (X2-X1)<(X3-X4), and (Y2-Y1)<(Y3-Y4),
wherein said first zoom in/out action and said second zoom in/out action are respectively a zoom out action and a zoom in action, said software object is a digital image, and said behaviors of said software object include zoom in/out amount and zoom in/out direction of said digital image.
8. A touch pad operable with multi-objects, said touch pad being communicated with a host and a display body and comprising:
a touch structure having a lower surface communicated with said display body and an upper surface for sensing touch points, wherein first and second touching signals are respectively generated when touch points of first and second objects on said touch pad are sensed, and a third touching signal is generated when said second object is moved on said touch pad to a further touch point and said further touch point is sensed; and
a controller electrically connected to said touch structure and said host for receiving said first, second and third touching signals and generating a first position coordinate (X1, Y1), a second position coordinate (X2, Y2) and a third position coordinate (X3, Y3), respectively, wherein said controller calculates at least two movement amount indexes according to coordinate differences between said first, second and third position coordinates, thereby generating a movement amount control signal,
wherein a first movement amount index is obtained according to a coordinate difference between said first and second position coordinates.
9. The touch pad according to claim 8 wherein said first object is a first finger, said second object is a second finger, and said first, second and third position coordinates are obtained in an absolute two-dimensional coordinate system or a relative two-dimensional coordinate system.
10. The touch pad according to claim 9 wherein said touch pad is operated by the following steps of:
measuring a first angle of the line through said first position coordinate and said second position coordinate with respect to the x-axis, and defining said first angle as said first movement amount index;
measuring a second angle of the line through said first position coordinate and said third position coordinate with respect to the x-axis, and defining said second angle as a second movement amount index; and
calculating an angle difference between said first angle and said second angle, and generating said movement amount control signal to control behaviors of a software object according to the positive or negative sign of said angle difference, wherein said software object is a volume control key and said behaviors of said software object include displacement amount and displacement direction of said volume control key, or said software object is a digital image and said behaviors of said software object include rotational amount and rotational direction of the digital image.
11. The touch pad according to claim 9 wherein said touch pad is operated by the following steps of:
measuring a first slope S112 of the line through said first position coordinate and said second position coordinate as said first movement amount index, measuring a second slope S113 of the line through said first position coordinate and said third position coordinate as a second movement amount index, and measuring a third slope S123 of the line through said second position coordinate and said third position coordinate as a third movement amount index;
generating said movement amount control signal to control a first rotational action of said software object if S112≧0, S113≧0, S123<0, (Y2-Y3)>0 and (X2-X3)<0, or if S112≦0, S113≦0, S123>0, (Y2-Y3)<0 and (X2-X3)<0; and
generating said movement amount control signal to control a second rotational action of said software object if S112≧0, S113≧0, S123<0, (Y2-Y3)<0 and (X2-X3)>0, or if S112≦0, S113≦0, S123>0, (Y2-Y3)>0 and (X2-X3)>0,
wherein said first rotational action and said second rotational action are respectively a clockwise rotational action and a counterclockwise rotational action, said software object is a volume control key and said behaviors of said software object include displacement amount and displacement direction of said volume control key, or said software object is a digital image and said behaviors of said software object include rotational amount and rotational direction of the digital image.
12. The touch pad according to claim 9 wherein said touch pad is operated by the following steps of:
measuring a first slope S212 of the line through said first position coordinate and said second position coordinate as said first movement amount index, measuring a second slope S213 of the line through said first position coordinate and said third position coordinate as a second movement amount index, and measuring a third slope S232 of the line through said second position coordinate and said third position coordinate as a third movement amount index;
generating said movement amount control signal to control a first zoom in/out action of said software object if S212≧0, S213≧0, S232≧0, (X2-X1)>(X3-X1), and (Y2-Y1)>(Y3-Y1), or if S212<0, S213<0, S232<0, (X2-X1)>(X3-X1), and (Y2-Y1)>(Y3-Y1); and
generating said movement amount control signal to control a second zoom in/out action of said software object if S212≧0, S213≧0, S232≧0, (X2-X1)<(X3-X1), and (Y2-Y1)<(Y3-Y1), or if S212<0, S213<0, S232<0, (X2-X1)<(X3-X1), and (Y2-Y1)<(Y3-Y1),
wherein said first zoom in/out action and said second zoom in/out action are respectively a zoom out action and a zoom in action, said software object is a digital image, and said behaviors of said software object include zoom in/out amount and zoom in/out direction of said digital image.
13. The touch pad according to claim 9 wherein said touch pad is operated by the following steps of:
moving said first object on said touch pad to a further touch point, and sensing said further touch point to assert a fourth position coordinate (X4, Y4);
obtaining a third movement amount index according to a coordinate difference between said second and third position coordinates;
obtaining a fourth movement amount index according to a coordinate difference between said first and fourth position coordinates;
obtaining a fifth movement amount index according to a coordinate difference between said fourth and third position coordinates; and
generating said movement amount control signal according to said first, third, fourth and fifth movement amount indexes.
14. The touch pad according to claim 9 wherein said touch pad is operated by the following steps of:
measuring a first slope S312 of the line through said first position coordinate and said second position coordinate as said first movement amount index, measuring a third slope S332 of the line through said second position coordinate and said third position coordinate as a third movement amount index, measuring a fourth slope S314 of the line through said first position coordinate and said fourth position coordinate as a fourth movement amount index, and measuring a fifth slope S343 of the line through said fourth position coordinate and said third position coordinate as a fifth movement amount index;
generating said movement amount control signal to control a first zoom in/out action of said software object if S312≧0, S332≧0, S314≧0, S343≧0, (X2-X1)>(X3-X4), and (Y2-Y1)>(Y3-Y4), or if S312<0, S332<0, S314<0, S343<0, (X2-X1)>(X3-X4), and (Y2-Y1)>(Y3-Y4); and
generating said movement amount control signal to control a second zoom in/out action of said software object if S312≧0, S332≧0, S314≧0, S343≧0, (X2-X1)<(X3-X4), and (Y2-Y1)<(Y3-Y4), or if S312<0, S332<0, S314<0, S343<0, (X2-X1)<(X3-X4), and (Y2-Y1)<(Y3-Y4),
wherein said first zoom in/out action and said second zoom in/out action are respectively a zoom out action and a zoom in action, said software object is a digital image, and said behaviors of said software object include zoom in/out amount and zoom in/out direction of said digital image.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/345,726 US8350822B2 (en) | 2008-01-21 | 2012-01-08 | Touch pad operable with multi-objects and method of operating same |
US13/613,439 US9024895B2 (en) | 2008-01-21 | 2012-09-13 | Touch pad operable with multi-objects and method of operating same |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW097102211A TWI460621B (en) | 2008-01-21 | 2008-01-21 | Touch pad for processing a multi-object operation and method using in the same |
TW097102211 | 2008-01-21 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/345,726 Division US8350822B2 (en) | 2008-01-21 | 2012-01-08 | Touch pad operable with multi-objects and method of operating same |
Publications (1)
Publication Number | Publication Date |
---|---|
US20090183930A1 true US20090183930A1 (en) | 2009-07-23 |
Family
ID=40875547
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/057,883 Abandoned US20090183930A1 (en) | 2008-01-21 | 2008-03-28 | Touch pad operable with multi-objects and method of operating same |
US13/345,726 Expired - Fee Related US8350822B2 (en) | 2008-01-21 | 2012-01-08 | Touch pad operable with multi-objects and method of operating same |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/345,726 Expired - Fee Related US8350822B2 (en) | 2008-01-21 | 2012-01-08 | Touch pad operable with multi-objects and method of operating same |
Country Status (2)
Country | Link |
---|---|
US (2) | US20090183930A1 (en) |
TW (1) | TWI460621B (en) |
Cited By (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080270886A1 (en) * | 2007-04-30 | 2008-10-30 | Google Inc. | Hiding Portions of Display Content |
US20100064262A1 (en) * | 2008-09-05 | 2010-03-11 | Kye Systems Corp. | Optical multi-touch method of window interface |
US20100208057A1 (en) * | 2009-02-13 | 2010-08-19 | Peter Meier | Methods and systems for determining the pose of a camera with respect to at least one object of a real environment |
US20110043462A1 (en) * | 2009-08-24 | 2011-02-24 | Ayman Shabra | Touchscreen apparatus, integrated circuit device, electronic device and method therefor |
US20110078624A1 (en) * | 2009-09-25 | 2011-03-31 | Julian Missig | Device, Method, and Graphical User Interface for Manipulating Workspace Views |
CN102073405A (en) * | 2010-11-30 | 2011-05-25 | 广东威创视讯科技股份有限公司 | Image zooming and rotating judgment method |
US20110181528A1 (en) * | 2010-01-26 | 2011-07-28 | Jay Christopher Capela | Device, Method, and Graphical User Interface for Resizing Objects |
US20110187748A1 (en) * | 2010-01-29 | 2011-08-04 | Samsung Electronics Co. Ltd. | Apparatus and method for rotating output image in mobile terminal |
WO2011094276A1 (en) * | 2010-01-26 | 2011-08-04 | Apple Inc. | Device, method, and graphical user interface for precise positioning of objects |
WO2011094281A1 (en) * | 2010-01-26 | 2011-08-04 | Apple Inc. | Device, method, and graphical user interface for selecting and moving objects |
US20110216095A1 (en) * | 2010-03-04 | 2011-09-08 | Tobias Rydenhag | Methods, Devices, and Computer Program Products Providing Multi-Touch Drag and Drop Operations for Touch-Sensitive User Interfaces |
CN102279667A (en) * | 2011-08-25 | 2011-12-14 | 中兴通讯股份有限公司 | Method and device for responding screen touch event and communication terminal |
US20110304613A1 (en) * | 2010-06-11 | 2011-12-15 | Sony Ericsson Mobile Communications Ab | Autospectroscopic display device and method for operating an auto-stereoscopic display device |
ITMI20102210A1 (en) * | 2010-11-29 | 2012-05-30 | Matteo Paolo Bogana | METHOD FOR INTERPRETING GESTURES ON A RESISTIVE TOUCH SCREEN. |
US20120249440A1 (en) * | 2011-03-31 | 2012-10-04 | Byd Company Limited | method of identifying a multi-touch rotation gesture and device using the same |
EP2530569A1 (en) * | 2011-05-30 | 2012-12-05 | ExB Asset Management GmbH | Convenient extraction of an entity out of a spatial arrangement |
US20130152024A1 (en) * | 2011-12-13 | 2013-06-13 | Hai-Sen Liang | Electronic device and page zooming method thereof |
US20130167084A1 (en) * | 2011-12-27 | 2013-06-27 | Panasonic Corporation | Information terminal, method of controlling information terminal, and program for controlling information terminal |
US20130271416A1 (en) * | 2010-12-09 | 2013-10-17 | Beijing Lenovo Software Ltd. | Touch Control Method And Electronic Device |
US20130283206A1 (en) * | 2012-04-23 | 2013-10-24 | Samsung Electronics Co., Ltd. | Method of adjusting size of window and electronic device therefor |
CN103412718A (en) * | 2013-08-21 | 2013-11-27 | 广州九游信息技术有限公司 | Card moving method and system based on double-finger control |
US20140139446A1 (en) * | 2012-11-22 | 2014-05-22 | Giovan Giuseppe Boccanfuso | System and method for manipulating an image |
US8766928B2 (en) | 2009-09-25 | 2014-07-01 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
CN103902151A (en) * | 2012-12-27 | 2014-07-02 | 安捷伦科技有限公司 | Method for Controlling the Magnification Level on a Display |
US8780069B2 (en) | 2009-09-25 | 2014-07-15 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US8799826B2 (en) | 2009-09-25 | 2014-08-05 | Apple Inc. | Device, method, and graphical user interface for moving a calendar entry in a calendar application |
US8863016B2 (en) | 2009-09-22 | 2014-10-14 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US8972879B2 (en) | 2010-07-30 | 2015-03-03 | Apple Inc. | Device, method, and graphical user interface for reordering the front-to-back positions of objects |
EP2778880A3 (en) * | 2013-03-15 | 2015-03-11 | Samsung Electronics Co., Ltd. | Method for controlling display function and an electronic device thereof |
US9063649B2 (en) * | 2010-08-31 | 2015-06-23 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US9081494B2 (en) | 2010-07-30 | 2015-07-14 | Apple Inc. | Device, method, and graphical user interface for copying formatting attributes |
US9098182B2 (en) | 2010-07-30 | 2015-08-04 | Apple Inc. | Device, method, and graphical user interface for copying user interface objects between content regions |
US20160026321A1 (en) * | 2014-07-22 | 2016-01-28 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
US20170214899A1 (en) * | 2014-07-23 | 2017-07-27 | Metaio Gmbh | Method and system for presenting at least part of an image of a real object in a view of a real environment, and method and system for selecting a subset of a plurality of images |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101199970B1 (en) * | 2010-10-29 | 2012-11-12 | 전남대학교산학협력단 | Acquisition method of multi-touch feature and multi-touch gesture recognition using the multi-touch feature |
TW201222344A (en) * | 2010-11-16 | 2012-06-01 | Elan Microelectronics Corp | Method for continuing a multi-finger gesture on a touch panel |
US20120169776A1 (en) * | 2010-12-29 | 2012-07-05 | Nokia Corporation | Method and apparatus for controlling a zoom function |
TWI472967B (en) * | 2011-05-19 | 2015-02-11 | Elan Microelectronics Corp | The method of transmitting the coordinates of the touch device, the method of transmitting the resist vector by the touch device, and the computer readable medium |
CN103412720B (en) * | 2013-06-28 | 2016-12-28 | 贵阳朗玛信息技术股份有限公司 | Process method and the device thereof of touch control type input signal |
USD749117S1 (en) * | 2013-11-25 | 2016-02-09 | Tencent Technology (Shenzhen) Company Limited | Graphical user interface for a portion of a display screen |
USD733745S1 (en) * | 2013-11-25 | 2015-07-07 | Tencent Technology (Shenzhen) Company Limited | Portion of a display screen with graphical user interface |
TWI529583B (en) * | 2014-12-02 | 2016-04-11 | 友達光電股份有限公司 | Touch system and touch detection method |
US10698601B2 (en) | 2016-11-02 | 2020-06-30 | Ptc Inc. | Second touch zoom control |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6958749B1 (en) * | 1999-11-04 | 2005-10-25 | Sony Corporation | Apparatus and method for manipulating a touch-sensitive display panel |
US7138983B2 (en) * | 2000-01-31 | 2006-11-21 | Canon Kabushiki Kaisha | Method and apparatus for detecting and interpreting path of designated position |
US20080062139A1 (en) * | 2006-06-09 | 2008-03-13 | Apple Inc. | Touch screen liquid crystal display |
US20090184939A1 (en) * | 2008-01-23 | 2009-07-23 | N-Trig Ltd. | Graphical object manipulation with a touch sensitive screen |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5825352A (en) * | 1996-01-04 | 1998-10-20 | Logitech, Inc. | Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad |
US7750891B2 (en) * | 2003-04-09 | 2010-07-06 | Tegic Communications, Inc. | Selective input system based on tracking of motion parameters of an input device |
US6901167B2 (en) * | 2001-04-04 | 2005-05-31 | Microsoft Corporation | Detecting multiple objects in digital image data |
US7920129B2 (en) * | 2007-01-03 | 2011-04-05 | Apple Inc. | Double-sided touch-sensitive panel with shield and drive combined layer |
- 2008
  - 2008-01-21 TW TW097102211A patent/TWI460621B/en not_active IP Right Cessation
  - 2008-03-28 US US12/057,883 patent/US20090183930A1/en not_active Abandoned
- 2012
  - 2012-01-08 US US13/345,726 patent/US8350822B2/en not_active Expired - Fee Related
Cited By (66)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10254946B2 (en) | 2007-04-30 | 2019-04-09 | Google Llc | Hiding portions of display content |
US8065603B2 (en) * | 2007-04-30 | 2011-11-22 | Google Inc. | Hiding portions of display content |
US20080270886A1 (en) * | 2007-04-30 | 2008-10-30 | Google Inc. | Hiding Portions of Display Content |
US11036385B2 (en) | 2007-04-30 | 2021-06-15 | Google Llc | Hiding portions of display content |
US20100064262A1 (en) * | 2008-09-05 | 2010-03-11 | Kye Systems Corp. | Optical multi-touch method of window interface |
US9934612B2 (en) | 2009-02-13 | 2018-04-03 | Apple Inc. | Methods and systems for determining the pose of a camera with respect to at least one object of a real environment |
US20100208057A1 (en) * | 2009-02-13 | 2010-08-19 | Peter Meier | Methods and systems for determining the pose of a camera with respect to at least one object of a real environment |
US8970690B2 (en) * | 2009-02-13 | 2015-03-03 | Metaio Gmbh | Methods and systems for determining the pose of a camera with respect to at least one object of a real environment |
US20110043462A1 (en) * | 2009-08-24 | 2011-02-24 | Ayman Shabra | Touchscreen apparatus, integrated circuit device, electronic device and method therefor |
US10564826B2 (en) | 2009-09-22 | 2020-02-18 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US11334229B2 (en) | 2009-09-22 | 2022-05-17 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US10788965B2 (en) | 2009-09-22 | 2020-09-29 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US8863016B2 (en) | 2009-09-22 | 2014-10-14 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US11972104B2 (en) | 2009-09-22 | 2024-04-30 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US10282070B2 (en) | 2009-09-22 | 2019-05-07 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US10928993B2 (en) | 2009-09-25 | 2021-02-23 | Apple Inc. | Device, method, and graphical user interface for manipulating workspace views |
US8780069B2 (en) | 2009-09-25 | 2014-07-15 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US11947782B2 (en) | 2009-09-25 | 2024-04-02 | Apple Inc. | Device, method, and graphical user interface for manipulating workspace views |
US11366576B2 (en) | 2009-09-25 | 2022-06-21 | Apple Inc. | Device, method, and graphical user interface for manipulating workspace views |
US20110078624A1 (en) * | 2009-09-25 | 2011-03-31 | Julian Missig | Device, Method, and Graphical User Interface for Manipulating Workspace Views |
US10254927B2 (en) | 2009-09-25 | 2019-04-09 | Apple Inc. | Device, method, and graphical user interface for manipulating workspace views |
US9310907B2 (en) | 2009-09-25 | 2016-04-12 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US8799826B2 (en) | 2009-09-25 | 2014-08-05 | Apple Inc. | Device, method, and graphical user interface for moving a calendar entry in a calendar application |
US8766928B2 (en) | 2009-09-25 | 2014-07-01 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US8539385B2 (en) | 2010-01-26 | 2013-09-17 | Apple Inc. | Device, method, and graphical user interface for precise positioning of objects |
US8612884B2 (en) | 2010-01-26 | 2013-12-17 | Apple Inc. | Device, method, and graphical user interface for resizing objects |
US8677268B2 (en) | 2010-01-26 | 2014-03-18 | Apple Inc. | Device, method, and graphical user interface for resizing objects |
US20110181528A1 (en) * | 2010-01-26 | 2011-07-28 | Jay Christopher Capela | Device, Method, and Graphical User Interface for Resizing Objects |
WO2011094276A1 (en) * | 2010-01-26 | 2011-08-04 | Apple Inc. | Device, method, and graphical user interface for precise positioning of objects |
US8539386B2 (en) | 2010-01-26 | 2013-09-17 | Apple Inc. | Device, method, and graphical user interface for selecting and moving objects |
WO2011094281A1 (en) * | 2010-01-26 | 2011-08-04 | Apple Inc. | Device, method, and graphical user interface for selecting and moving objects |
US20110187748A1 (en) * | 2010-01-29 | 2011-08-04 | Samsung Electronics Co. Ltd. | Apparatus and method for rotating output image in mobile terminal |
WO2011107839A1 (en) * | 2010-03-04 | 2011-09-09 | Sony Ericsson Mobile Communications Ab | Methods, devices, and computer program products providing multi-touch drag and drop operations for touch-sensitive user interfaces |
US20110216095A1 (en) * | 2010-03-04 | 2011-09-08 | Tobias Rydenhag | Methods, Devices, and Computer Program Products Providing Multi-Touch Drag and Drop Operations for Touch-Sensitive User Interfaces |
US20110304613A1 (en) * | 2010-06-11 | 2011-12-15 | Sony Ericsson Mobile Communications Ab | Autospectroscopic display device and method for operating an auto-stereoscopic display device |
US8972879B2 (en) | 2010-07-30 | 2015-03-03 | Apple Inc. | Device, method, and graphical user interface for reordering the front-to-back positions of objects |
US9626098B2 (en) | 2010-07-30 | 2017-04-18 | Apple Inc. | Device, method, and graphical user interface for copying formatting attributes |
US9081494B2 (en) | 2010-07-30 | 2015-07-14 | Apple Inc. | Device, method, and graphical user interface for copying formatting attributes |
US9098182B2 (en) | 2010-07-30 | 2015-08-04 | Apple Inc. | Device, method, and graphical user interface for copying user interface objects between content regions |
US9063649B2 (en) * | 2010-08-31 | 2015-06-23 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
ITMI20102210A1 (en) * | 2010-11-29 | 2012-05-30 | Matteo Paolo Bogana | Method for interpreting gestures on a resistive touch screen |
CN102073405A (en) * | 2010-11-30 | 2011-05-25 | 广东威创视讯科技股份有限公司 | Image zooming and rotating judgment method |
US20130271416A1 (en) * | 2010-12-09 | 2013-10-17 | Beijing Lenovo Software Ltd. | Touch Control Method And Electronic Device |
US9857896B2 (en) * | 2010-12-09 | 2018-01-02 | Lenovo (Beijing) Co., Ltd. | Touch control method and electronic device |
US20120249440A1 (en) * | 2011-03-31 | 2012-10-04 | Byd Company Limited | Method of identifying a multi-touch rotation gesture and device using the same |
US8743065B2 (en) * | 2011-03-31 | 2014-06-03 | Byd Company Limited | Method of identifying a multi-touch rotation gesture and device using the same |
EP2530569A1 (en) * | 2011-05-30 | 2012-12-05 | ExB Asset Management GmbH | Convenient extraction of an entity out of a spatial arrangement |
CN102279667A (en) * | 2011-08-25 | 2011-12-14 | 中兴通讯股份有限公司 | Method and device for responding screen touch event and communication terminal |
US20130152024A1 (en) * | 2011-12-13 | 2013-06-13 | Hai-Sen Liang | Electronic device and page zooming method thereof |
US9354780B2 (en) * | 2011-12-27 | 2016-05-31 | Panasonic Intellectual Property Management Co., Ltd. | Gesture-based selection and movement of objects |
US20130167084A1 (en) * | 2011-12-27 | 2013-06-27 | Panasonic Corporation | Information terminal, method of controlling information terminal, and program for controlling information terminal |
US20130283206A1 (en) * | 2012-04-23 | 2013-10-24 | Samsung Electronics Co., Ltd. | Method of adjusting size of window and electronic device therefor |
US9081487B2 (en) * | 2012-11-22 | 2015-07-14 | Agfa Healthcare Inc. | System and method for manipulating an image |
US20140139446A1 (en) * | 2012-11-22 | 2014-05-22 | Giovan Giuseppe Boccanfuso | System and method for manipulating an image |
US10042544B2 (en) * | 2012-12-27 | 2018-08-07 | Keysight Technologies, Inc. | Method for controlling the magnification level on a display |
CN103902151A (en) * | 2012-12-27 | 2014-07-02 | 安捷伦科技有限公司 | Method for Controlling the Magnification Level on a Display |
US20140189605A1 (en) * | 2012-12-27 | 2014-07-03 | Agilent Technologies, Inc. | Method for Controlling the Magnification Level on a Display |
US10877659B2 (en) | 2012-12-27 | 2020-12-29 | Keysight Technologies, Inc. | Method for controlling the magnification level on a display |
AU2014201249B2 (en) * | 2013-03-15 | 2019-05-23 | Samsung Electronics Co., Ltd. | Method for controlling display function and an electronic device thereof |
US9489069B2 (en) | 2013-03-15 | 2016-11-08 | Samsung Electronics Co., Ltd. | Method for controlling display scrolling and zooming and an electronic device thereof |
EP2778880A3 (en) * | 2013-03-15 | 2015-03-11 | Samsung Electronics Co., Ltd. | Method for controlling display function and an electronic device thereof |
CN103412718A (en) * | 2013-08-21 | 2013-11-27 | 广州九游信息技术有限公司 | Card moving method and system based on double-finger control |
US20160026321A1 (en) * | 2014-07-22 | 2016-01-28 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
US9645668B2 (en) * | 2014-07-22 | 2017-05-09 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
US10659750B2 (en) * | 2014-07-23 | 2020-05-19 | Apple Inc. | Method and system for presenting at least part of an image of a real object in a view of a real environment, and method and system for selecting a subset of a plurality of images |
US20170214899A1 (en) * | 2014-07-23 | 2017-07-27 | Metaio Gmbh | Method and system for presenting at least part of an image of a real object in a view of a real environment, and method and system for selecting a subset of a plurality of images |
Also Published As
Publication number | Publication date |
---|---|
US20120105351A1 (en) | 2012-05-03 |
US8350822B2 (en) | 2013-01-08 |
TW200933456A (en) | 2009-08-01 |
TWI460621B (en) | 2014-11-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20090183930A1 (en) | Touch pad operable with multi-objects and method of operating same | |
US9024895B2 (en) | Touch pad operable with multi-objects and method of operating same | |
US8570283B2 (en) | Information processing apparatus, information processing method, and program | |
US11775076B2 (en) | Motion detecting system having multiple sensors | |
CN105992991B (en) | Low-profile TrackPoint | |
US8370772B2 (en) | Touchpad controlling method and touch device using such method | |
US9575562B2 (en) | User interface systems and methods for managing multiple regions | |
US9092125B2 (en) | Multi-mode touchscreen user interface for a multi-state touchscreen device | |
EP2717120B1 (en) | Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications | |
US20110060986A1 (en) | Method for Controlling the Display of a Touch Screen, User Interface of the Touch Screen, and an Electronic Device using The Same | |
CN105992992B (en) | Low-profile TrackPoint | |
US20110012927A1 (en) | Touch control method | |
WO2011002414A2 (en) | A user interface | |
US20100064262A1 (en) | Optical multi-touch method of window interface | |
US8462113B2 (en) | Method for executing mouse function of electronic device and electronic device thereof | |
CN102981743A (en) | Method for controlling operation object and electronic device | |
TWI284274B (en) | Method for controlling intelligent movement of touch pad | |
JP5384706B2 (en) | Multi-touch operation method and system | |
US9389780B2 (en) | Touch-control system | |
US20150009136A1 (en) | Operation input device and input operation processing method | |
JP2016133978A (en) | Information processor, information processing method and program | |
KR20110066545A (en) | Method and terminal for displaying of image using touchscreen | |
KR101165388B1 (en) | Method for controlling screen using different kind of input devices and terminal unit thereof | |
CN104111781A (en) | Image display control method and terminal | |
US20140092042A1 (en) | Electronic device and display change method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: ELANTECH DEVICES CORPORATION, TAIWAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: YANG, WEI-WEN; LIU, CHUH-MIN; CHENG, CHIEN-WEI. REEL/FRAME: 020745/0433. Effective date: 20080214 |
| AS | Assignment | Owner name: ELAN MICROELECTRONICS CORP., TAIWAN. Free format text: MERGER; ASSIGNOR: ELANTECH DEVICES CORPORATION. REEL/FRAME: 021867/0870. Effective date: 20081021 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |