US20140168141A1 - Method and system for discriminating stylus and touch interactions - Google Patents
- Publication number
- US20140168141A1 (application US 14/014,282)
- Authority
- US
- United States
- Prior art keywords
- touch
- computing device
- stylus pen
- time
- data point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/044—Digitisers characterised by the transducing means by capacitive means
- G06F3/0442—Digitisers characterised by capacitive means using active external devices, e.g. active pens, for transmitting changes in electrical potential to be received by the digitiser
- G06F3/0446—Digitisers characterised by capacitive means using a grid-like structure of electrodes in at least two directions, e.g. using row and column electrodes
Definitions
- The present invention relates to a method and a system able to discriminate between the interactions of an electronic stylus pen, a user's finger(s), or a user's appendage and a touch-screen-containing device.
- Touch-screen tablet computers allow a user to interact directly with content displayed on the touch-screen of the tablet computer. These interactions can be conducted through various means, but are typically done through touch, by way of the user's fingers directly interacting with the screen, or through the use of a stylus pen or other type of input control device that contacts the screen based on movements made by the user.
- Touch-screens typically distinguish touch inputs from stylus pen inputs by using various sensing technologies, or by using input modes that the user has to select based on the operations the user wants to conduct on the touch-screen of the tablet computer.
- Other typical solutions require stylus pen inputs to originate from a stylus pen that is physically tethered to the tablet computer.
- Collecting touch information from these types of interface mechanisms also introduces a number of challenges. Moreover, the process of reliably collecting touch information becomes increasingly complicated when the computing device allows a user to input information using both a touch input mechanism and a stylus pen input mechanism. In the course of interfacing with the touch sensitive surface of the computing device with a stylus pen, the user may inadvertently rest his or her palm on the touch sensitive surface. The computing device may then incorrectly interpret this inadvertent contact as a legitimate input activity. A similar challenge may confront a user who is intentionally using a touch input mechanism to control or input data to the computing device.
- The user may attempt to apply a focused touch to the surface of the computing device, yet may accidentally brush or bump his or her hand against other parts of the display surface, causing accidental input events.
- These problems may understandably frustrate the user if they become a frequent occurrence or, even if uncommon, if they cause significant disruption in the task that the user is performing.
- The simplified data set includes the coordinates of a touch point and the time when the touch point was sensed by the touch sensing components.
- The simplified data set is generally a small fraction of the amount of data that is commonly available from the touch-sensitive hardware in a conventional touch-sensitive display type computing device today.
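The simplified data set described above reduces to a coordinate pair plus a timestamp. A minimal sketch in Python follows; the field names and types are illustrative assumptions, since the patent specifies only that the set carries a touch point's coordinates and sensing time.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TouchDataPoint:
    """A minimal touch record: screen coordinates plus the sensing time.

    Field names are illustrative assumptions; the patent only requires
    that the simplified data set carry coordinates and a timestamp.
    """
    x: float     # horizontal coordinate on the touch surface
    y: float     # vertical coordinate on the touch surface
    t_ms: float  # time the touch point was sensed, in milliseconds

point = TouchDataPoint(x=120.5, y=88.0, t_ms=1530.2)
print(point.x, point.y, point.t_ms)
```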
- Embodiments relate generally to control devices, such as human interface devices, configured for use with a touch-screen tablet computer. More specifically, the present invention relates to methods and systems for discriminating between the interactions of a handheld device, the touch of one or more of the user's fingers, and the user's appendages on a touch-screen tablet computer.
- The methods described herein may include discriminating between the interaction of the handheld device, such as an electronic stylus pen, the user's finger(s) and a user's appendage, so that the collected information can be used to control some aspect of the hardware or software running on the touch-screen tablet computer.
- The methods disclosed herein may also be used to separate the interaction of the user's appendage from the interactions of the handheld device and/or the user's finger(s) with the touch-screen tablet computer.
- The information received from the appendage of the user is distinguished from the information received from the interaction of a stylus pen or the user's finger with the touch-screen tablet computer, and is purposely not used to control the hardware and/or software running on the touch-screen tablet computer.
- Embodiments provide a method of operating a host device, comprising receiving, at the host device, information related to a touch-down event, receiving, at the host device, information related to a touch event from a controlling engine, correlating the information related to the touch-down event with the information related to the touch event, and determining that the touch-down event is associated with a handheld device.
- Embodiments further provide a method of characterizing user input data received by a host device, comprising receiving, at the host device, information related to a first touch event from a touch sensing unit coupled to the host device, wherein the information from the first touch event comprises a first touch data point, comparing the first touch event information with a first rule and a second rule, wherein the first rule and the second rule each form a vote as to the type of user input that created the first touch event, and attributing the first touch data point to a type of user input by analyzing the votes received from the first and second rules.
- More than two rules may also be used to determine the type of user input.
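The rule-voting scheme above can be sketched as a set of independent classifiers whose votes are tallied. The three rules, their thresholds, and the event fields below are illustrative assumptions, not the patent's specific rules.

```python
from collections import Counter

def time_rule(event, pen_down_ms, threshold_ms=50.0):
    # Vote "stylus" when the touch closely follows a reported pen touch-down.
    if pen_down_ms is not None and abs(event["t_ms"] - pen_down_ms) < threshold_ms:
        return "stylus"
    return "finger"

def size_rule(event, palm_area=400.0):
    # Large contact areas suggest a resting palm rather than a fingertip.
    return "palm" if event["area"] > palm_area else "finger"

def pressure_rule(event):
    # A pressure report from the pen itself votes for the stylus.
    return "stylus" if event.get("pen_pressure") else "finger"

def attribute(event, pen_down_ms):
    # Tally the votes and attribute the touch data point to the
    # input type with the most votes.
    votes = Counter([
        time_rule(event, pen_down_ms),
        size_rule(event),
        pressure_rule(event),
    ])
    return votes.most_common(1)[0][0]

event = {"t_ms": 1000.0, "area": 35.0, "pen_pressure": 0.7}
print(attribute(event, pen_down_ms=990.0))  # prints "stylus" (two of three votes)
```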
- Embodiments may further provide a method of characterizing user input data received by a host device, comprising receiving, at the host device, information related to a touch-down event from a handheld device, wherein the information related to the touch-down event comprises information relating to a first time when the touch-down event occurred, receiving, at the host device, information related to a first touch event and a second touch event from a touch sensing unit coupled to the host device, wherein the information provided for the first touch event comprises a first touch data point and information relating to a second time, and the information provided for the second touch event comprises a second touch data point and information relating to a third time, analyzing the information received by the host device, comprising comparing a predetermined threshold time and the information relating to the first time and the second time, and then assigning a first user input type vote to the first touch data point based on the comparison, and comparing a first position of the first touch data point on a user interface of the host device and a second position of the second touch data point on the user interface of the host device.
- Embodiments further provide a method of characterizing user input data received by a host device, comprising receiving, at the host device, information related to a touch-down event from a handheld device, wherein the information comprises information relating to a first time when the touch-down event occurred, receiving, at the host device, information related to a first touch event from a touch sensing unit coupled to the host device, wherein the information comprises information relating to a second time when the touch event occurred on a touch sensitive unit of the host device, correlating the information related to the touch-down event with the information related to the first touch event, wherein correlating the information comprises comparing the first time, the second time and a predetermined threshold, and determining that the touch-down event is associated with the handheld device when the difference in time between the first and second time is less than the predetermined threshold.
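The correlation described above comes down to comparing the pen-reported touch-down time with the screen-sensed touch time against a predetermined threshold. A minimal sketch, assuming a 50 ms threshold (the patent leaves the value unspecified):

```python
def correlate_touch_down(pen_time_ms: float, touch_time_ms: float,
                         threshold_ms: float = 50.0) -> bool:
    """Return True when the touch event is close enough in time to the
    stylus pen's reported touch-down event to attribute it to the pen.

    The 50 ms default is an illustrative assumption; the patent only
    requires comparison against a predetermined threshold.
    """
    return abs(touch_time_ms - pen_time_ms) < threshold_ms

print(correlate_touch_down(1000.0, 1012.0))  # True: within the threshold
print(correlate_touch_down(1000.0, 1100.0))  # False: too far apart in time
```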
- Embodiments further provide a method of characterizing user input data received by a host device, comprising receiving, at the host device, information related to a touch-down event from a handheld device, receiving, at the host device, information related to a plurality of touch events from a touch sensing unit coupled to the host device, defining a portion of the plurality of touch events as being part of a first cluster of touch events, correlating the information related to the touch-down event with the information related to the first cluster of touch events, determining that the first cluster of touch events is associated with a user's appendage, and determining that at least one touch event of the plurality of touch events is associated with a handheld device, wherein the at least one touch event is not within the first cluster.
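The cluster-based discrimination above can be sketched as a proximity grouping: nearby contacts merge into clusters, the large cluster is attributed to the palm, and an outlying contact remains a stylus candidate. The greedy single-link grouping and the 60-pixel linking radius below are illustrative assumptions, not the patent's specific method.

```python
import math

def clusters(points, radius=60.0):
    """Greedy single-link grouping of touch points by proximity.

    A simple illustrative stand-in for the patent's cluster definition;
    the 60-pixel linking radius is an assumption.
    """
    groups = []
    for p in points:
        placed = False
        for g in groups:
            # Join the first group containing a point within the radius.
            if any(math.dist(p, q) <= radius for q in g):
                g.append(p)
                placed = True
                break
        if not placed:
            groups.append([p])
    return groups

# Several nearby contacts (a resting palm) plus one distant contact (the pen tip).
touches = [(100, 100), (110, 95), (105, 120), (400, 300)]
groups = clusters(touches)
palm = max(groups, key=len)  # the largest cluster is treated as the palm
pen_candidates = [p for g in groups if g is not palm for p in g]
print(len(groups), palm, pen_candidates)
```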
- Embodiments further provide a computer readable medium configured to store instructions executable by a processor of a host device to characterize user input data received by the host device, the instructions when executed by the processor causing the processor to receive information related to a first touch event from a touch sensing unit coupled to the host device, wherein the information from the first touch event comprises a first touch data point, compare the first touch event information with a first rule and a second rule, wherein the first rule and the second rule each form a vote as to the type of user input that created the first touch event, and attribute the first touch data point to a type of user input by analyzing the votes received from the first and second rules.
- Embodiments further provide a method of operating a host device, comprising receiving, at the host device, information related to a touch-down event, receiving, at the host device, information related to a plurality of touch events from a controller, determining one or more clusters of touch events from the plurality of touch events, correlating the information related to the touch-down event with the information related to the one or more clusters of touch events, determining that one of the one or more clusters of touch events is associated with a palm, and determining that the touch-down event is associated with a handheld device.
- The handheld device may include at least one of an accelerometer, a magnetometer, a gyroscope, or the like for detecting the orientation of the handheld device and detecting a triggering event, both of which can be used to help control some aspect of the hardware or software running on the touch-screen tablet computer.
- FIG. 1 illustrates an exemplary touch-screen tablet computer and a capacitive stylus pen according to an embodiment of the invention.
- FIG. 2 is a simplified block diagram of the components of a host device and stylus pen according to an embodiment of the invention.
- FIG. 3A is a simplified block diagram of a user input discrimination processing architecture used to distinguish between the different types of user inputs received by the touch-screen tablet computer according to an embodiment of the invention.
- FIG. 3B is a flowchart illustrating a method of discriminating touch interactions from stylus pen interactions on a touch-screen according to an embodiment of the invention.
- FIG. 3C is a simplified signal diagram illustrating aspects of the process of discriminating stylus pen interactions from touch interactions on a touch-screen, according to an embodiment of the invention.
- FIG. 3D is a simplified signal diagram illustrating aspects of the process of discriminating stylus pen interactions from touch interactions on a touch-screen, according to an embodiment of the invention.
- FIG. 4 is a simplified flowchart illustrating a method of discriminating touch interactions from stylus pen interactions on a touch-screen according to an embodiment of the invention.
- FIG. 5A illustrates a plurality of related touch points on a touch-screen tablet computer that have been analyzed by a controlling engine according to an embodiment of the invention.
- FIG. 5B illustrates a plurality of related touch points on a touch-screen tablet computer that have been analyzed by a controlling engine according to an embodiment of the invention.
- FIG. 5C is a simplified flowchart illustrating a method of discriminating interactions caused by the palm of a user on a touch-screen according to an embodiment of the invention.
- FIG. 6A is a simplified flowchart illustrating a method of discriminating touch interactions from stylus pen interactions on a touch-screen according to an embodiment of the invention.
- FIG. 6B is a table listing examples of voting results contained in the decision matrix data generated during the method of discriminating between various touch interactions illustrated in FIG. 6A , according to one or more of the embodiments described herein.
- FIG. 7 is a simplified signal diagram illustrating aspects of the process of discriminating stylus pen interactions from touch interactions on a touch-screen, according to an embodiment of the invention.
- FIG. 8 is a simplified signal diagram illustrating aspects of the process of discriminating stylus pen interactions from touch interactions on a touch-screen, where stylus pen and touch interactions overlap, according to an embodiment of the invention.
- FIG. 9A is an isometric cross-sectional view of a portion of a mutual capacitance sensing type host device that is interacting with an active stylus pen, according to an embodiment of the invention.
- FIG. 9B is a schematic signal diagram illustrating aspects of the process of detecting a touch-sensing device output signal and synchronizing an active stylus pen thereto, according to an embodiment of the invention.
- FIG. 9C illustrates the components of an active stylus pen 206 capable of interacting with a host device 100 that is configured for mutual capacitance sensing, according to an embodiment of the invention.
- FIG. 10 illustrates simplified signature pulse diagrams that may be generated by two pens, according to an embodiment of the invention.
- Embodiments of the present invention generally provide a system and methods of distinguishing between the different types of user inputs provided from the interaction of a user's finger, a user's appendage and/or a handheld device with a touch sensitive device.
- The handheld device may be an electronic stylus pen, also referred to herein simply as a “stylus pen,” that a user uses to provide input to control some aspect of the touch sensitive device.
- Computing devices that provide software applications that allow a user to input information via a touch input mechanism and a stylus pen input mechanism are often complex due to the need to distinguish between the interaction of a user's finger, user's appendage and stylus pen with the touch sensitive device to properly control some aspect of the hardware or software applications running on the computing device.
- Embodiments of the invention described herein may also include a system and methods that employ a controlling engine running on a touch sensitive computing device, generally referred to herein as a host device, to discern between the user input received from a stylus pen, fingers or user's appendage.
- The data generated from the controlling engine's analysis of the user input data received from the various components that are coupled to or in communication with the touch sensitive computing device can then be used to control some aspects of the hardware or software running on the touch sensitive computing device.
- The controlling engine generally includes software instructions that implement one or more input discrimination techniques used to analyze the various types of user input data received from one or more components in the touch sensitive device to determine the likely source of the user input.
- The one or more input discrimination techniques may include time based synchronization techniques, geometric shape discrimination techniques and inference based discrimination techniques, which can be used separately or in combination to discern between the different types of inputs received by the touch sensitive computing device.
- Touch sensitive computing devices may include a touch-screen tablet computer, which may use a resistive, capacitive, acoustic or other similar sensing technique to sense the input received from a user.
- A system and method may be used to distinguish between different types of user inputs using a simplified data set that is created by the touch sensitive computing device from the interaction of a user's finger, user's appendage and/or a handheld device.
- The simplified data set includes only the coordinates of the touch point and the time that the interaction occurred with the touch sensing components, which is generally a small fraction of the amount of data typically collected by conventional handheld or touch sensitive computing devices.
- As shown in FIG. 1 , a system includes a touch sensitive computing device, or host device 102, that includes a user interface 104 capable of user interaction through a touch-screen sensing component.
- The host device 102 may be, for example, a general computing device, phone, media player, e-reader, kiosk, notebook, netbook, tablet computer, or any other device having one or more touch-sensitive inputs.
- The user interface 104 can include components that are used to display applications being executed by the host device 102.
- The host device 102 may be an electronic device such as an iPad® device from Apple Inc.
- Exemplary embodiments of computing devices include, without limitation, the iPhone®, iPad® and iPod Touch® devices from Apple Inc., the Galaxy Note® 10.1 from Samsung, the SurfaceTM from Microsoft, other mobile devices, tablet computers, desktop computers, kiosks, and the like.
- FIG. 1 also depicts a user input device, or a handheld device, in the form of a stylus pen 106 that is capable of touch interactions with the user interface 104 of the host device 102 .
- While the stylus pen 106 is a typical embodiment of the control device described herein, embodiments of the control device are not limited to a stylus pen 106, and may include control devices in other forms, including stamps and other devices that can be used to conduct touch interactions with the user interface 104, such as other fixed or detachable devices.
- Touch interactions between the stylus pen 106 and the user interface 104 do not require physical contact between a portion of the stylus pen 106 and the surface of the user interface 104, and may also include interactions where the stylus pen 106 is moved over the surface of the user interface 104 without touching the surface (e.g., the active stylus pen discussed below).
- FIG. 2 schematically illustrates a system diagram showing a simplified view of the control elements of a host device 102 , and a simplified system diagram of the control elements of a stylus pen 106 .
- the host device 102 typically has at least some minimum computational capability, touch sensing capability and/or visual display capability.
- The host device 102 includes components 201 that may include, but are not limited to, one or more processing units 210, a memory unit 211, a touch sensing unit 212, a display unit 213 and a communications unit 214.
- The touch sensing unit 212 may utilize resistive, capacitive (e.g., absolute sensing or mutual capacitance sensing), acoustic or other similar sensing and signal processing components, which are known in the art, to sense the input received from a user at the user interface 104.
- The touch sensing unit 212 may be disposed within and/or coupled to the user interface 104 in the host device 102.
- The display unit 213 may include various components that are able to display and/or visually render information provided to it by the one or more processing units 210 and memory 211.
- The display unit 213 may include any type of visual interface, including light emitting diode (LED), organic LED (OLED), liquid crystal display (LCD), plasma, electroluminescence (EL), or other similar conventional display technology.
- The communications unit 214 will generally include one or more components that are configured to transmit and receive information via a communication link 205 between the host device 102, the stylus pen 106 and other possible peripheral devices using a desirable communication method.
- A desirable communication method may include a wired or wireless communication method, such as Bluetooth low energy (BTLE), Bluetooth classic, WiFi, WiFi direct, near-field communication (NFC) or another similar communication method.
- The memory unit 211 generally contains computer readable media that can be accessed by the host device 102 and may include both volatile and nonvolatile media for storage of information, such as computer-readable or computer-executable instructions, data, programs and/or other data.
- Memory 211 may include computer or machine readable media or storage devices such as DVDs, CDs, floppy disks, tape drives, hard drives, optical drives, solid state memory devices, RAM, ROM, flash memory or any other device which can be used to store the desired information.
- The device should have sufficient computational capability and system memory to enable basic computational operations.
- The computational capability is provided by one or more processing unit(s) 210 that are in communication with system memory 211.
- The processing unit(s) 210 may include conventional central processing units (CPUs), graphical processing units (GPUs) and other useful elements to control the various display, touch, communication and other units in the host device 102.
- The processing unit(s) 210 may also include or be in communication with a host clock 215, which may be a simple IC or similar component that aids in the analysis and synchronization of data transferred between components in the host device and/or data transferred between the host device 102 and other connected wired and wireless network components (e.g., stylus pen 106).
- The stylus pen 106 may have one or more active regions that are able to collect additional information about the user's interaction with the host device 102.
- The one or more active regions may include an active tip of the stylus pen 106 that is positioned so that the user will cause this region of the stylus pen 106 to interact with the host device 102.
- The active tip of the stylus pen 106 may contain sensors that are able to measure some aspect of the interaction of the active tip and the host device 102. As schematically depicted in FIG. 2 , the stylus pen 106 may include a pen tip 106a, a pressure sensing unit 106b, a processor 106c, a communications unit 106d, a memory unit 106e, a power source 106f and a pen clock 106g.
- The stylus pen 106 may further comprise one or more additional sensors (not shown in FIG. 2 ), such as one or both of a gyroscope and an accelerometer.
- The pen tip 106a is configured to make contact with the user interface 104 of the host device 102.
- The pressure exerted at the pen tip 106a depends on the user's interaction with the stylus pen 106.
- The pressure sensing unit 106b is capable of detecting the amount of pressure applied to the pen tip 106a of the stylus pen 106 by the user. Pressure data corresponding to the amount of pressure exerted by the user on the user interface 104 of the host device 102 is measured by the pressure sensing unit 106b.
- The pressure data can include data from a binary switch, or another device that is able to discern between 8, 16, 32, 64, or any other desirable number of pressure levels, so that the generated pressure data is useful for the control of the host device 102.
- Different pressure levels can be used for different host devices 102, such that a stylus pen interaction will only be registered by the host device 102 when a threshold pressure level is detected.
- The pressure data sensed by the pressure sensing unit 106b may also include an analog measurement of the pressure applied, and thus the generated pressure data supplied to the host device 102 may vary continuously across a desired range.
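A sketch of how an analog pressure reading might be mapped onto one of the discrete level counts mentioned above; the [0, raw_max] input range and the linear mapping are illustrative assumptions.

```python
def quantize_pressure(raw: float, levels: int = 64, raw_max: float = 1.0) -> int:
    """Map an analog pressure reading in [0, raw_max] to one of `levels`
    discrete values (64 shown; the patent contemplates 8, 16, 32, 64,
    or any other desirable count)."""
    raw = min(max(raw, 0.0), raw_max)  # clamp to the valid range
    return min(int(raw / raw_max * levels), levels - 1)

print(quantize_pressure(0.0))   # lightest contact -> level 0
print(quantize_pressure(0.5))   # mid-range contact -> level 32
print(quantize_pressure(1.0))   # full pressure clamps to the top level, 63
```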
- The processor 106c can be configured to control the operation of the stylus pen 106.
- The stylus pen 106 may comprise one or more processors to control various aspects of the operation of the stylus pen 106.
- The processor 106c may also include or be in communication with a stylus pen clock 106g, which may be a simple IC or similar component that aids in the analysis and synchronization of data transferred between components in the stylus pen 106 and/or data transferred between the stylus pen 106 and other wired and wireless network components (e.g., host device 102).
- The stylus pen clock 106g is set at a speed at least as fast as the speed at which a clock (e.g., host clock 215) in the host device 102 is running, to facilitate the timing of the delivery of communication signals from the communications unit 214.
- The stylus pen clock 106g has a frequency error of less than about 50 parts per million (ppm), for example an accuracy of about 30 to 50 ppm.
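For context on why such a tolerance matters for time-based correlation: the worst-case divergence between the pen clock and the host clock grows linearly with the interval between synchronizations. A small illustrative calculation (the two-clock error model is an assumption, not from the patent):

```python
def max_drift_ms(interval_ms: float, ppm: float = 50.0) -> float:
    """Worst-case drift between two clocks that each err by `ppm` parts
    per million over a given interval; the combined error is the sum of
    both clocks' tolerances."""
    return interval_ms * (2 * ppm) / 1_000_000

# Over one second, two 50 ppm clocks can diverge by at most 0.1 ms,
# comfortably below any millisecond-scale correlation threshold.
print(max_drift_ms(1000.0))
```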
- The communications unit 106d is capable of transmitting the pressure data from the stylus pen 106 to the communications unit 214 of the host device 102 when stylus pen interactions are made against the user interface 104 of the host device 102.
- The communications unit 106d transmits the interaction data via a desirable wireless communication method, such as a Bluetooth low energy (BTLE) communication method.
- Other embodiments include other appropriate communications device components for transmitting interaction data between the stylus pen 106 and the host device 102 .
- Interaction data supplied by the stylus pen 106 can comprise the pressure data, timing data, and/or orientation data generated from gyroscopes and/or accelerometers or the like in the stylus pen 106 .
- The communications unit 106d may only transmit the pressure data once a threshold pressure level has been detected by the pressure sensing unit 106b. In other embodiments, the communications unit 106d may transmit the pressure data from the stylus pen 106 once any pressure is detected, regardless of the pressure level detected by the pressure sensing unit 106b.
- The memory unit 106e is capable of storing data related to the stylus pen 106 and data related to the host device 102, such as device settings and host clock 215 and stylus pen clock 106g information.
- The memory unit 106e may store data related to the linking association between the stylus pen 106 and the host device 102.
- The power source 106f is capable of providing power to the stylus pen 106.
- The power source 106f may be a built-in battery inside the stylus pen 106.
- The power source 106f can be electrically coupled to one or more of the components within the stylus pen 106 in order to supply electrical power to the stylus pen 106.
- the stylus pen 106 may include one or more of a gyroscope, an accelerometer, or the like.
- a gyroscope is a device configured to measure the orientation of the stylus pen 106 and operates based on the principles of the conservation of angular momentum.
- one or more gyroscopes are micro-electromechanical (MEMS) devices configured to detect a certain rotation of the stylus pen 106 .
- the stylus pen 106 can be configured to send orientation data from a gyroscope contained within the stylus pen 106 . This orientation data can be used in conjunction with the timing and pressure data communicated from the stylus pen 106 to the host device 102 .
- the accelerometers are electromechanical devices (e.g., micro-electromechanical systems (MEMS) devices) configured to measure acceleration forces (e.g., static and dynamic forces).
- One or more accelerometers can be used to detect three-dimensional (3D) positioning.
- 3D tracking can utilize a three-axis accelerometer or two two-axis accelerometers.
- the stylus pen 106 may utilize a 3-axis accelerometer to detect the movement of the stylus pen 106 in relation to the user interface 104 of the host device 102 .
- FIG. 3A illustrates a simplified block diagram of a user input discrimination architecture 300 that comprises computer executable instructions and supporting hardware and software elements that are used to distinguish between the different types of user inputs received by the host device 102 .
- the system and methods described herein may provide a user input discrimination architecture 300 that includes a controlling engine 340 that receives various user input information and uses the received user input information to distinguish between the different types of user touch inputs received by the user interface 104 of the host device 102 .
- the controlling engine 340 comprises computer executable instructions that are stored in the memory 211 of the host device 102 , and are run in the background of the host device 102 by use of the processing units 201 of the host device 102 .
- the user inputs received by the controlling engine 340 may include a user touch related input 331 , a stylus pen input 335 and/or a host input 333 .
- the host input 333 which is delivered from the host signal processing unit 332 of the processing unit 210 to the controlling engine 340 , may include the user touch related input 331 received from the user's physical touch input 330 received by the user interface 104 , the stylus pen input 335 and other useful information relating to the control of the host device 102 collected by the processing unit 210 .
- the host signal processing unit 332 is not a separate component within the host device 102 , and may be formed within, and controlled by, the components used to provide the user's physical touch input 330 or even the controlling engine 340 .
- the controlling engine 340 may then deliver output data 350 that is used in the control of various software and hardware running on the host device 102 .
- the output data 350 may be used by one or more third party applications and/or components in the host device 102 to perform some useful display or data output functions.
- the output data 350 may be used by the host device 102 to control some aspect of a software program running on the host device 102 , to generate an image on a display in the host device 102 and/or process some data that is stored in the host device 102 .
- the user touch related input 331 includes the user's physical touch input 330 , which may include the interaction of a finger, an appendage and the physical interaction of the stylus pen 106 with the touch sensitive portion of the user interface 104 .
- the touch related input 331 is processed by the host signal processing unit 332 , such as capacitive sensing signal processing, before it is delivered to and then used by the controlling engine 340 .
- the user's input delivered to the host device 102 may also include configurational inputs 339 that are delivered from the user and/or stylus pen 106 to the controlling engine 340 .
- the configurational inputs 339 may include information about the user or stylus pen 106 that will help the user input discrimination architecture 300 distinguish between the different types of user touch input 331 information (e.g., information relating to touch input from a stylus pen, finger or appendage) received by the host device 102 .
- the configurational inputs 339 may include whether the user is right-handed, left-handed, information about the host device 102 , Bluetooth pairing information or other useful information about the user, stylus pen or controlling engine configuration.
- the stylus pen input 335 generally includes user input information received by components in the stylus pen 106 that can be transferred via wired or wireless communication methods to the host device 102 .
- the stylus pen input 335 may comprise the pressure data, timing data, and/or orientation data generated by the pressure sensing unit 106 b or other sensors found in the stylus pen 106 (e.g., gyroscopes, accelerometers, etc.), such as touch signal generating device 106 h which is discussed further below.
- the stylus pen input 335 may be transmitted via a wireless communication link to the communications unit 214 of the host device 102 using a desirable wired or wireless communication technique, such as a Bluetooth low energy (BTLE) communication protocol, and then is delivered to the controlling engine 340 .
- the stylus pen input 335 is processed by the host signal processing unit 332 in the host device 102 using wired or wireless communication protocols (e.g., BTLE protocols) before it is delivered to the controlling engine 340 via the host signal processing unit 332 .
- the host input 333 generally includes various sets of synchronous and/or asynchronous data that are received by the host device 102 from the stylus pen 106 and/or created by the user's physical touch input 330 received from the user.
- the host input 333 which is provided to the controlling engine 340 , may include user touch input 331 generated by the touch sensing unit 212 and the stylus pen input 335 data provided by the stylus pen 106 to the communications unit 214 and host signal processing unit 332 .
- the touch related input 331 data is delivered to the controlling engine 340 separately (i.e., input 333 A) from the stylus pen input 335 data (e.g., input 333 B).
- the separate host inputs 333 A and 333 B may not be transferred on separate physical elements to the controlling engine 340 , but are shown herein separately to schematically illustrate the different types of data being delivered between the host device 102 and the controlling engine 340 .
- the communications unit 214 processes the transmitted stylus pen input 335 received from the stylus pen 106 via the communication link 205 before it is delivered to the controlling engine 340 .
- the controlling engine 340 generally includes one or more executable programs or program related tasks that are used to create the output data 350 which is used by the controlling engine 340 , software running on the host device 102 and/or one or more hardware components of the host device 102 to perform some useful function.
- the controlling engine 340 may comprise one or more input discrimination techniques 345 that are used separately or in combination to generate useful and reliable output data 350 .
- the one or more input discrimination techniques 345 take in the various different types of inputs (e.g., inputs 331 , 333 A, 333 B, 335 , 339 ) received by the host device 102 and distinguish the different types of user inputs from one another, so that errors in the proper selection of an inputting element, such as a finger, stylus pen and/or appendage, are eliminated or less likely to occur.
- the one or more input discrimination techniques 345 are thus used to determine the different types of user inputs from one another and provide a desired “input label” or “element label” for each type of user input so that they can be correctly used by the one or more third party applications and/or components used in host device 102 .
- the one or more input discrimination techniques 345 include a time based discrimination technique 341 , a geometric shape discrimination technique 342 and/or an inference based discrimination technique 343 that are used separately or in combination to generate useful and reliable output data 350 that can be used by the software and/or hardware running on the host device 102 .
- the one or more input discrimination techniques 345 include a plurality of time based discrimination techniques, geometric shape discrimination techniques and/or inference based discrimination techniques.
- FIG. 3B is a flowchart illustrating a method 390 of discriminating between finger and appendage touch interactions and the physical stylus pen interactions with the host device 102 using one or more input discrimination techniques 345 .
- the method 390 optionally starts with the delivery, storage in memory and/or recall of configurational inputs 339 by the controlling engine 340 , as shown at step 391 .
- the configurational inputs 339 may include information about the user and/or stylus pen 106 that is useful for the discrimination of a finger or appendage touch interaction from the physical stylus pen interaction.
- a stylus pen input 335 which is created when the stylus pen 106 is brought into contact with the user interface 104 , is transferred via a wired or wireless communication technique to the host device 102 and controlling engine 340 .
- the receipt of the stylus pen input 335 is also referred to herein as a “touch-down event.”
- a “touch-down event” may be created from a single interaction or each time a user reengages the stylus pen 106 with the user interface 104 during a writing, drawing or other similar stylus pen 106 user input interaction with the user interface 104 .
- the controlling engine 340 will ignore the received user touch related input 331 data until it has received touch-down event information.
- touch-down events do not require the physical contact of a portion of the handheld device and the surface of the user interface 104 , but may also include sensed interactions where the stylus pen is moved over the surface of the user interface 104 without touching the surface, for example, by use of an active pen tip, which is discussed below.
- a timing window of a desired length is created around the receipt of a stylus pen input 335 (e.g., touch-down event) in time, so that all of the user touch related inputs 331 can be collected for analysis by the controlling engine 340 to determine which of the touch inputs were received from the stylus pen, finger(s) or user's appendage.
- the timing window includes a time period of about 30 ms on either side of a received touch-down event.
- the timing window will include all user data received by and stored in memory 211 of the host device 102 in a first time period prior to the receipt of a stylus pen input 335 and second time period after the receipt of a stylus pen input 335 .
- the length of the timing window (e.g., first time period plus the second time period) may be adjusted so that any external noise received by the host device 102 does not adversely affect the discrimination process performed by the controlling engine 340 , while also assuring that all of the user touch related input 331 data that is associated with the stylus pen input 335 are captured.
- the length of the timing window will depend on the sampling frequency of the touch sensitive portion of the host device 102 , the communication speed between the stylus pen 106 and host device 102 and the processing speed of the controlling engine 340 .
- in one example, the sampling frequency of the stylus pen's generated data (e.g., pressure data generated by the pressure sensing unit 106 b ) and/or the communication between the stylus pen 106 and host device 102 is set at about a 30 ms rate.
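The timing-window collection described in the steps above can be sketched as a simple filter over timestamped touch data points; this is an illustrative sketch (the event record shape and the `events_in_window` name are assumptions, and the 30 ms figure is the example value from the text):

```python
WINDOW_MS = 30  # ~30 ms on either side of a received touch-down event (per the text)

def events_in_window(touch_events, touchdown_time_ms, window_ms=WINDOW_MS):
    """Return the touch data points whose timestamps fall inside the timing
    window [touchdown - window, touchdown + window]; these are the candidate
    points that may be associated with the stylus pen."""
    lo, hi = touchdown_time_ms - window_ms, touchdown_time_ms + window_ms
    return [e for e in touch_events if lo <= e["t_ms"] <= hi]
```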
- the controlling engine 340 may continue to track and provide user input discrimination results via the generation and delivery of the output data 350 .
- the accuracy of the stylus pen clock 106 g is at least as accurate as the host clock 215 to assure that the time stamps applied to the touch data information generated by the stylus pen 106 and host device 102 do not appreciably drift relative to one another over time. Clock speeds in the stylus pen 106 and host device 102 that appreciably vary from one another will affect the relative accuracy of the time stamp information that is compared by the controlling engine to determine whether a user input can be attributed to a stylus pen, finger or user appendage. As discussed herein, the time stamp information may be used in some embodiments described herein to help differentiate the type of user input based on its timing relative to other touch events.
- the stylus pen clock 106 g has a frequency error of less than about 50 parts per million (ppm), such as an accuracy of at least 30 to 50 ppm. Therefore, the use of a stylus pen clock 106 g that has an accuracy that is at least as good as the host clock 215 can help reduce the error in the detection and analysis of the user input.
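To see why the ~50 ppm figure matters, the worst-case divergence between the two clocks can be worked out with simple arithmetic; the sketch below assumes both clocks err by the full tolerance in opposite directions (an assumption for illustration, not a claim from the patent):

```python
def drift_ms(ppm, elapsed_ms):
    """Worst-case relative drift between two clocks that each err by `ppm`
    parts per million, assuming they drift in opposite directions."""
    return 2 * ppm * 1e-6 * elapsed_ms

# At 50 ppm per clock, the pen and host timestamps can diverge by about
# 6 ms over one minute -- on the order of the ~30 ms timing window itself
# over a long session, which is why drift must be bounded or corrected.
```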
- the controlling engine 340 creates a timing window of a desired length around the receipt of a first stylus pen input 335 (e.g., touch-down event) based on a first report received at a first time via the communication link 205 created between the stylus pen 106 and the host device 102 .
- the controlling engine 340 determines which touch data events fall within the first timing window and then notes that these touch data events are likely to be from a stylus pen 106 .
- the number of touch data events that fall within a timing window can be larger than the number of actual touch data event(s) that are associated with the stylus pen 106 .
- the controlling engine will compare the touch data events found in this second timing window, created around the last report, with the touch data events found in the first timing window to determine which touch data events also stopped (touch take-off, e.g., the pen being removed from the interface) in this window.
- touch data events that do not fit within these requirements are likely not related to the stylus pen and touch data event(s) that are in both windows are more likely to have originated from the stylus pen 106 .
- the first report is generated when the stylus pen lands on the user interface 104 , a few reports are then generated as long as the pen is pressed against the host device, and the last report is generated when the stylus pen 106 is removed from the user interface 104 ; the controlling engine 340 thus uses these reports to determine which of the touch events was associated with the stylus pen.
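The comparison of the two timing windows can be sketched as a set intersection over touch tracks; this is an illustrative sketch under the assumption that each touch data event carries a `track_id` identifying its touch track (a name introduced here, not from the patent):

```python
def likely_stylus_ids(down_window_events, up_window_events):
    """Touch tracks that both started inside the touch-down window and
    stopped inside the touch-up window are the most likely stylus pen
    candidates; tracks appearing in only one window (e.g., a resting palm
    that persists past pen lift-off) are likely not the stylus."""
    down_ids = {e["track_id"] for e in down_window_events}
    up_ids = {e["track_id"] for e in up_window_events}
    return down_ids & up_ids
```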
- the controlling engine 340 utilizes one or more of the input discrimination techniques 345 to discriminate between the touch interactions supplied by the stylus pen, a finger or user's appendage.
- One or more of the input discrimination techniques such as time based discrimination techniques, geometric shape discrimination techniques or inference based discrimination techniques, which are discussed further below, perform an analysis of the touch-down event information and touch event information received in steps 392 - 393 to help distinguish between the source of the different touch event interactions received by the user interface 104 .
- the controlling engine 340 may also utilize the configurational input 339 data received at step 391 during this step to help classify and further analyze the other received data.
- the analyses performed by the different input discrimination techniques 345 utilize various different rules that are found within the software instructions that form at least part of the controlling engine 340 .
- a discussion of some of the types of rules for each of the different types of input discrimination techniques 345 can be found below.
- each of the one or more input discrimination techniques 345 is used to create and apply a “user input type” label, also referred to herein as an “element label,” to each of the touch data points for each touch event.
- the process of providing a “user input type” label generally includes the process of attributing each of the touch data points to a particular user's touch input, such as the physical input from the stylus pen, finger or appendage to the user interface 104 .
- the element labels may be further analyzed by the controlling engine 340 .
- each of the element labels for each of the touch data points are either further analyzed by the controlling engine 340 to reconcile differences between the element labels created by each of the input discrimination techniques or each of the different element labels are transferred within the output data 350 , so that they can be used by the software and/or hardware running on the host device 102 .
- the output data 350 may include the positional information (e.g., touch points) and timing information for only the relevant interacting components, such as a stylus pen 106 and a finger, and not the interaction of a user's appendage, by use of one or more input discrimination techniques 345 .
- steps 392 - 396 can then be repeated continually, while the stylus pen 106 is interacting with the user interface 104 or each time a touch-down event occurs to provide user input discrimination results via the generation and delivery of the output data 350 .
- FIGS. 3C-3D illustrate an example of the various user input information that may be received by the controlling engine 340 and the output data 350 results that may be generated by the controlling engine 340 using the steps provided in method 390 , according to an embodiment of the invention described herein.
- FIG. 3C illustrates an example of data 370 that is received by the controlling engine 340 , due to the interaction of a stylus pen 106 , finger or user's appendage (e.g., palm) with the host device 102 as a function of time.
- FIG. 3D graphically illustrates at least a portion of the output data 350 generated by the controlling engine 340 (e.g., data 380 ), due to the interaction of a stylus pen 106 , finger or user's appendage (e.g., palm) with the host device 102 as a function of time.
- the touch sensing component of the host device 102 receives interaction data 371 created by the interaction of an appendage (e.g., palm) with the host device 102 .
- the touch sensing component of the host device 102 also receives interaction data 372 created by the interaction of a stylus pen with the host device 102 (e.g., touch event).
- the host device 102 also receives stylus pen input 335 data, or interaction data 373 (e.g., touch-down event).
- the touch sensing component of the host device 102 also receives interaction data 374 created by the interaction of a finger with the host device 102 , and then at time T 4 the interaction of a finger with the host device 102 ends, thus causing the interaction data 374 to end.
- a timing window having a desired length is created so that the stored user input received between a time before T 0 and time T 2 and the user input received between times T 2 and a time after T 4 can be characterized and useful output data 350 can be created.
- the interaction data 371 - 374 received by the controlling engine 340 at any instant in time includes the coordinates of a touch data point and its timing information.
- the interaction data 371 - 374 includes the input data received over a period of time for each specific interacting element, and thus may contain many different touch data points that are in different coordinate positions on the user interface at different times.
- FIG. 3C illustrates an example of various different types of interacting elements (e.g., stylus pen, finger, appendage) and a specific example of the timing of the interaction of these interacting elements with the host device 102 , this example is not intended to be limiting, and is only added herein as a way to describe one or more aspects of the invention described herein.
- FIG. 3D illustrates at least a portion of the output data 350 created by the controlling engine 340 using the one or more input discrimination techniques 345 , based on the received interaction data 371 , 372 , 373 , and 374 illustrated in FIG. 3C .
- at time T A , which is at a time between time T 0 and time T 1 , the controlling engine 340 has received a small amount of the interaction data 371 created by the user.
- at least one of the one or more input discrimination techniques 345 used in step 394 by the controlling engine 340 are used to create and apply a user input type label to the interaction data 371 , based on the input data received by the controlling engine 340 by time T A .
- the input data may include the user touch related input 331 , stylus pen input 335 , host input 333 and configurational inputs 339 .
- when the controlling engine 340 does not have enough data to decide what type of user input is being applied to the host device 102 , it may be desirable to make an initial guess (e.g., finger, stylus pen and/or appendage) and then later correct the user input label as more data is acquired about the received user input.
- the user input type label for the interaction data 371 is defined to be an “appendage” versus a “finger” or “stylus pen.” Therefore, the output data 350 created at time T A includes the current positional information, current timing information and “appendage” element label for the interaction data 371 . In some embodiments, any interaction data that is not given a stylus pen or finger type of element label is excluded from the output data 350 provided from the controlling engine 340 and thus no output data 350 is transferred for the interaction data 371 at time T A , as illustrated in FIG. 3D as a dashed line.
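The exclusion of appendage-labeled data from the output, described above, can be sketched as a simple filter over labeled touch data points; the record shape and the `filter_output` name are assumptions for illustration:

```python
def filter_output(labeled_points):
    """Pass through only points labeled 'stylus pen' or 'finger'; points
    carrying an 'appendage' element label (e.g., a resting palm) are
    excluded from the output data delivered to applications."""
    return [p for p in labeled_points if p["label"] in ("stylus pen", "finger")]
```

Because labels may be corrected as more data arrives (e.g., a "finger" guess later revised to "stylus pen"), this filter would be re-applied each time the controlling engine reevaluates its labels.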
- the controlling engine 340 has received data regarding a user input that is creating the interaction data 371 and a new user input that is creating the interaction data 372 .
- the one or more input discrimination techniques 345 of the controlling engine 340 are used to create and apply a user input type label to the interaction data 372 (e.g., step 394 ), based on the input data received by the controlling engine 340 by time T B .
- the user input type label for the interaction data 372 is initially defined as a “finger” based on the one or more input discrimination techniques 345 .
- the controlling engine 340 is continually collecting the interaction data 371 and 372 information and thus can continually reevaluate the user input type label for the received interaction data 371 and 372 .
- the controlling engine 340 has received data regarding the user inputs that are creating the interaction data 371 and 372 , and a new user input that is creating the interaction data 373 .
- the interaction data 373 comprises stylus pen input 335 data created by one or more sensors found in the stylus pen 106 .
- the interaction data 373 is generated due to a user initiated pen tip 106 a touch event that actually occurred at time T 1 .
- the delivery of the interaction data 373 to the controlling engine 340 has been delayed from the interaction data 372 received by the stylus pen's interaction with the touch sensing unit of the host device 102 by a signal delay time 375 ( FIG. 3C ).
- the signal delay time may be created by communication processing timing delays, differences in the clocks of the stylus pen 106 and host device 102 and/or communication/timing errors created within the stylus pen 106 or the host device 102 .
- the data delivered in the transferred interaction data 373 may be generated by the pressure sensing unit 106 b and then transferred to the communications unit 214 of the host device 102 through the communications unit 106 d of the stylus pen 106 .
- the one or more input discrimination techniques 345 of the controlling engine 340 are used to create and apply a user input type label to the interaction data 373 , based on the input data received by the controlling engine 340 by time T C .
- the interaction data 372 and 373 are associated with each other and are given a “stylus pen” user input type label due to the information received and processed by the one or more input discrimination techniques 345 , which is an adjustment from the initial element label given to the interaction data 372 .
- the output data 350 provided to the hardware or other software running on the host device 102 at time T C will thus contain the stylus pen 106 's positional and timing data associated with the interaction data 372 and the stylus pen's pressure data, stylus pen related timing data, and/or stylus pen orientation data associated with the interaction data 373 , while the “appendage” related data found in the interaction data 371 is still being excluded. It should be noted that the controlling engine 340 will still collect the interaction data 371 , 372 and 373 information, and thus can continually reevaluate the user input type labels as needed.
- signal delay time 375 can be created by mechanical and electrical delays that are created during the collection and transmission of the information between the stylus pen 106 and the controlling engine 340 running in the host device 102 , and also created by the controlling engine, which may not be synchronized with the wired or wireless communication arrival (e.g., BTLE information). Delays may also be generated due to higher priority tasks being completed by the processing unit 210 and/or controlling engine 340 , which may cause a delay in the analysis of the received touch data.
- the mechanical delays may include delays created by inertia and/or friction in the pen tip 106 a and/or pressure sensing components in a pressure sensing unit 106 b of the stylus pen 106 .
- the electrical delays may result from the propagation delays created by one or more electrical components in the host device or stylus pen (e.g., low-pass filters (LPFs) and ADCs) and processing delays created due to the need to transmit and/or convert the data for transmission via a wireless transfer technique (e.g., BTLE) or use by one or more processing components in the stylus pen 106 or host device 102 .
- the sampling rate of the sensing components of the user interface 104 may be running at a speed of about 16 milliseconds (ms) and the sampling rate of the components in the stylus pen is less than about 16 ms. In one example, the sampling rate of the data sampling components in the stylus pen is less than about 10 ms, such as between about 1 ms and about 10 ms.
- the provided timestamp information is thus used to help better correlate the multiple sets of data that are received by the controlling engine 340 , so that a reliable characterization of the user inputs can be made.
- because the host clock 215 and the stylus pen clock 106 g , which are used to generate and/or facilitate the transfer of data between the stylus pen 106 and host device 102 , are generally not synchronized, and in some cases may be running at different speeds, errors in the characterization and processing of the received user input data are not reliably eliminated by use of a single timestamp. In one example, these errors may include errors in the proper selection of a user's input and can cause jitter in the display, which will ultimately annoy the user or cause significant disruption in the tasks that the user is performing on the computing device.
- the controlling engine 340 and/or user input sensing program(s) being executed in the stylus pen 106 use both sets of timestamp information received in the transferred data to continually update the processes running in each device to account for any drift or difference in the timing found between the stylus pen clock 106 g and host clock 215 .
- the difference in the timing found between the stylus pen clock 106 g and host clock 215 will generally affect the analysis of the user input received by the controlling engine 340 .
- all communications provided between the host device 102 and the stylus pen 106 will include the latest time information received from the stylus pen clock 106 g and the time information received from the host clock 215 , so that the controlling engine 340 can receive and can continually correct for errors found between the stylus pen clock 106 g and the host clock 215 .
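The continual correction using both sets of timestamp information can be sketched as a running offset estimate; this is a minimal illustration under the assumption that each message carries a (pen timestamp, host timestamp) pair, and the function names are introduced here, not from the patent:

```python
def update_offset(pairs):
    """Estimate the pen-to-host clock offset from the (pen_ts, host_ts)
    pairs carried in each communication; averaging over recent pairs
    smooths out transport jitter (e.g., variable BTLE delivery delay)."""
    offsets = [host - pen for pen, host in pairs]
    return sum(offsets) / len(offsets)

def pen_to_host(pen_ts, offset):
    """Map a pen timestamp onto the host clock using the running offset,
    so pen events can be compared against host touch event timestamps."""
    return pen_ts + offset
```

Re-estimating the offset as each new pair arrives is one way the engine could "continually correct" for drift between the two clocks.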
- the controlling engine 340 has received data regarding a user input that is creating the interaction data 371 , 372 and 373 , and the new user input that is creating the interaction data 374 .
- the one or more input discrimination techniques 345 of the controlling engine 340 are used to create and apply a user input type label to the interaction data 374 , based on the input data received by the controlling engine 340 by time T D .
- the user input type label for the interaction data 374 is initially defined as a “finger” based on the one or more input discrimination techniques 345 . It should be noted that the controlling engine 340 will still collect the interaction data 371 , 372 , 373 and 374 information, and thus can continually reevaluate the user input type labels as needed.
- at time T E , which is at a time between time T 4 and time T 5 , the controlling engine 340 has received interaction data 371 , 372 and 373 , while the interaction data 374 has not been received after the time T 4 was reached.
- the one or more input discrimination techniques 345 continually reevaluate the user input type labels for the interaction data 371 - 373 , based on the input data received by the controlling engine 340 ; however, the tracking and characterization of the user input relating to the interaction data 374 will generally be halted due to the removal of this user input.
- all of the user's inputs will be continually tracked and characterized while they are interacting with the host device 102 , and be dropped from output data 350 when their interaction with the host device 102 ends.
- the controlling engine 340 may use the one or more input discrimination techniques 345 to track and provide user labels for user interactions that are suspended for times shorter than a specified period of time, such as when a stylus pen 106 is lifted from the touch sensitive surface of the host device 102 for only a short time to write, draw or input some different pieces of information on the host device 102 .
- Embodiments of the invention described herein may provide a system and method that analyzes the timing of the received user's input data to determine the source of the user input delivered in the user's physical touch input 330 (e.g., physical stylus pen, finger(s) and user appendage touch input) to the controlling engine 340 .
- the controlling engine 340 analyzes the timing of the user input data to determine the different types of received user's physical touch input 330 , which is often referred to herein as the time based user input discrimination technique.
- the time based user input discrimination techniques can be used to determine if the received user input was created by a stylus pen, finger(s) or user's appendage by comparing the relative timing of the different user's physical touch input 330 events and stylus pen input 335 .
- the time based discrimination techniques used by the controlling engine 340 will generally compare the various received user input data as a function of time to help the controlling engine 340 discriminate between the interaction of a stylus pen, fingers or an appendage.
- the time based user input discrimination techniques discussed herein may be used alone or in combination with one or more of the other types of user input discrimination techniques discussed herein.
- FIG. 4 is a simplified flowchart illustrating a time based user input discrimination technique for discriminating touch interactions from the physical stylus pen interactions on a touch-screen according to an embodiment of the invention.
- the method 400 can be performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general purpose computing system or a dedicated machine), firmware (embedded software), or any combination thereof that are contained in the host device 102 and/or the stylus pen 106 .
- the method 400 is performed by the controlling engine 340 that is running in the background of the host device 102 and/or the data collection and transmission processes running on components in the stylus pen 106 .
- the method may include step 402 , in which the controlling engine 340 receives user input (e.g., user touch input 331 or stylus pen input 335 ) information related to a touch-down event on the host device 102 .
- information related to a touch-down event is received from a handheld device.
- the handheld device may be an electronic stylus pen, such as a stylus pen 106 , comprising a pressure sensor (e.g., pressure sensing unit 106 b ) and a touch signal generating device 106 h that is configured to deliver stylus pen input 335 information to the host device 102 .
- the electronic pen may also comprise at least one of an accelerometer or a gyroscope.
- the information related to the touch-down event that is transferred to the controlling engine 340 may comprise timing information, pressure data, and other data (e.g., accelerometer and/or gyroscope data) that is sent from the stylus pen 106 via the communication link 335 to the host device 102 .
- the information related to the touch-down event may include a first timestamp for the touch-down event, and may include information related to when the touch-down event occurred, how much pressure was applied in the touch-down event, and the length of time of the touch-down event.
- the information related to the touch-down event may include a pen clock timestamp derived from a pen clock signal received from the pen clock 106 g and a host clock timestamp derived from a host clock signal received from the host clock 215 for the touch-down event, and may include information related to when the touch-down event occurred, how much pressure was applied in the touch-down event, and the length of time of the touch-down event.
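Since the touch-down information can carry both a pen clock timestamp and a host clock timestamp, the two clock domains must be aligned before time delays can be compared. The following is a minimal, hypothetical sketch of one way to do this; the function names and the constant-offset assumption (latency ignored) are illustrative and not taken from the patent.

```python
# Hypothetical sketch: aligning a pen-clock timestamp with the host-clock
# timeline so touch-down and touch events can be compared on one time base.
# The constant-offset assumption and all names are illustrative.

def estimate_clock_offset(pen_ts: float, host_ts: float) -> float:
    """Offset to add to pen-clock times to express them in host-clock time.

    Assumes both timestamps were captured for the same touch-down event,
    so their difference approximates the clock offset (ignoring latency).
    """
    return host_ts - pen_ts

def pen_time_to_host_time(pen_ts: float, offset: float) -> float:
    """Convert a pen-clock timestamp onto the host-clock timeline."""
    return pen_ts + offset

# Example: pen clock reads 12.50 s when the host clock reads 1012.53 s.
offset = estimate_clock_offset(12.50, 1012.53)
host_time = pen_time_to_host_time(13.00, offset)  # a later pen event
```

A real implementation would likely average the offset over several paired timestamps to smooth out transmission latency.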
- the method includes the controlling engine 340 receiving information related to a touch event sensed by the host device 102 .
- the touch event may be from the stylus pen 106 physically interacting with the user interface 104 of the host device 102 , or by a touch interaction from the direct contact with the user interface 104 by a finger and/or appendage of the user.
- the information related to the touch event may comprise a second timestamp for the touch event, which may include timing information related to when the touch event occurred.
- the second timestamp comprises a pen clock timestamp derived from a pen clock signal received from the pen clock 106 g and a host clock timestamp derived from a host clock signal received from the host clock 215 for the touch event.
- the information related to the touch event and the information related to the touch-down event may be received by the device simultaneously or at different times.
- the method also includes correlating the information related to the touch-down event with the information related to the touch event.
- the controlling engine 340 correlates the information related to the touch-down event with the information related to the touch event.
- first time information for the touch-down event is correlated with second time information for the touch event.
- the method also includes determining whether the time delay between the first timestamp for the touch-down event and the second timestamp for the touch event is less than an empirically predetermined threshold.
- the controlling engine 340 determines whether the time delay is within a predetermined threshold. For example, the time delay between the touch-down event and the touch event may be greater than the predetermined threshold, which may indicate that the touch event is separate from the touch-down event, as illustrated in step 410 . In that event, the controlling engine 340 would distinguish that touch event as not being associated with the stylus pen 106 . The controlling engine 340 may distinguish the touch event as being associated with the user's finger.
- if the time delay is less than the predetermined threshold, the host device 102 would register that touch event as being associated with the stylus pen 106 and not the user's finger or user's appendage.
- the controlling engine 340 continues to monitor incoming touch-down event(s) and touch events, correlates the received data, and makes a determination as to whether each touch event is associated with the stylus pen 106 , with touch interactions by the user's fingers, and/or with an appendage of the user.
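The time based discrimination of method 400 can be sketched as a simple comparison of the pen touch-down timestamp against the sensed touch timestamp. This is only an illustrative sketch: the 50 ms threshold and the function and label names are assumptions, not values from the patent.

```python
# Illustrative sketch of the time based discrimination in steps 408-410:
# a touch event is attributed to the stylus pen only if it arrives within a
# predetermined threshold of the pen's touch-down timestamp.

STYLUS_PEN = "stylus_pen"
FINGER = "finger"

def classify_touch(touchdown_ts: float, touch_ts: float,
                   threshold_s: float = 0.05) -> str:
    """Label a sensed touch event by its delay from the pen touch-down."""
    delay = abs(touch_ts - touchdown_ts)
    if delay <= threshold_s:
        # Within the threshold: treat the touch as the stylus pen contact.
        return STYLUS_PEN
    # Otherwise the touch is a separate event, e.g. a finger touch.
    return FINGER

print(classify_touch(10.000, 10.020))  # within 50 ms of the touch-down
print(classify_touch(10.000, 10.400))  # 400 ms later: a separate touch
```

In practice the threshold would be determined empirically, as the text notes, from the typical latency between the pen's report and the host's touch sensing.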
- FIG. 4 provides a particular method 400 according to an embodiment of the present invention. Other sequences of steps may also be performed according to alternative embodiments. For example, alternative embodiments of the present invention may perform the steps outlined above in a different order. Moreover, the individual steps illustrated in FIG. 4 may include multiple sub-steps that may be performed in various sequences as appropriate to the individual step. Furthermore, additional steps may be added or removed depending on the particular applications.
- One of ordinary skill in the art would recognize many variations, modifications, and alternatives.
- Embodiments of the invention described herein may also provide a system and method that uses one or more geometric based user input discrimination techniques to distinguish between the different types of user's physical touch input 330 information received from a stylus pen, finger or user's appendage by a touch sensitive device.
- the one or more input discrimination techniques may include a geometric shape discrimination technique that uses information relating to the relative position of multiple touch points supplied by the user to help discriminate between the interaction of a stylus pen, fingers or an appendage.
- the geometric based user input discrimination techniques discussed herein may be used separately or in combination with one or more of the other user input discrimination techniques to distinguish between the various different types of user inputs received by the computing device.
- the use of one or more of the geometric based input discrimination techniques with one or more of the time based input discrimination techniques will help improve the accuracy of the user input discrimination achieved by either technique on its own.
- the host input 333 ( FIG. 3A ), which is provided to the controlling engine 340 via portions of the host device 102 , may be configured to include only a simplified data set that contains just the coordinates (e.g., X and Y-direction coordinates) of each of the touch data points 556 and the time that the interaction occurred with the user interface 104 .
- this simplified data set is a small fraction of the amount of the data that is commonly collected by conventional touch sensitive handheld devices or touch sensitive display type computing devices.
- the creation and use of the simplified data to discriminate between the interaction of a stylus pen, fingers or an appendage can reduce the required computing power of the host device and/or increase the speed of the computing device by reducing the computational power required to collect and transfer the touch interaction data.
- the controlling engine 340 does not have access to the actual user interaction data collected from the user interface 104 and is only fed a simplified data set from the host device 102 . In this case, the controlling engine 340 must discriminate between the interaction of a stylus pen, fingers or an appendage based on the limited nature of the data supplied to it by the host device 102 .
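The simplified data set described above reduces each touch data point to a coordinate pair plus a timestamp. A minimal sketch of such a record might look as follows; the class name and field names are illustrative assumptions.

```python
# A minimal sketch of the "simplified data set" described above: each touch
# data point reduced to screen coordinates plus a timestamp.
from dataclasses import dataclass

@dataclass(frozen=True)
class TouchDataPoint:
    x: float          # X-direction coordinate on the user interface
    y: float          # Y-direction coordinate on the user interface
    t: float          # time the interaction occurred, in seconds

points = [TouchDataPoint(120.0, 340.5, 0.016),
          TouchDataPoint(121.3, 339.8, 0.032)]
```

Transferring only these three values per point, rather than full sensor frames, is what reduces the computational load described in the text.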
- FIG. 5A schematically illustrates a plurality of touch data points 556 that have been detected by the user interface 104 of the host device 102 and delivered to the controlling engine 340 for the discrimination of the various different types of user inputs.
- the geometric shape discrimination technique 342 used by the controlling engine 340 includes a method of sorting and grouping of the received touch data points 556 by their geometric location based on geometric rules coded into the controlling engine 340 .
- Each of the touch data points 556 will be separately analyzed by the controlling engine 340 to determine if they can be associated with the stylus pen, finger or part of the user's appendage.
- the geometric shape discrimination technique 342 uses geometric rules to provide element labels to one or more clusters of touch points, since it is often likely that these clusters of touch points are related to a specific type of user input, such as a user's palm or finger. For example, if the controlling engine 340 knows that the user is right handed, based on information received from a configurational input 339 or by prior analysis of received touch data points 556 , the controlling engine 340 can apply a rule that specifies that a stylus pen related touch point will be above and to the left of a group of touch points that are associated with a palm of the user.
- the controlling engine will generally use the current touch point data received by the user interface 104 and stored old touch data points that had been previously received by the controlling engine 340 (e.g., also referred to herein as “aging” touch data points).
- older touch data points, which had each been previously analyzed and characterized by the controlling engine 340 , are used to help determine the type of user input that is associated with the currently received touch data point 556 .
- Use of the older touch data points and their relationship to the new touch data points can improve the speed and accuracy with which the current touch data points can be associated with a type of user input.
- the older touch data points are retained, analyzed and/or used by the controlling engine 340 for only a short period of time before they are deemed not useful and are excluded from use.
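The "aging" behavior above can be sketched as a retention window applied to the stored history of touch data points. The 0.5 s window and the `(x, y, t)` tuple layout are illustrative assumptions, not values from the patent.

```python
# Sketch of "aging" touch data points: older points are retained only for a
# short window before being deemed not useful and excluded from analysis.

def prune_aged_points(history, now_s, max_age_s=0.5):
    """Keep only (x, y, t) points younger than max_age_s at time now_s."""
    return [p for p in history if now_s - p[2] <= max_age_s]

history = [(10.0, 20.0, 0.10), (11.0, 21.0, 0.90), (12.0, 22.0, 1.15)]
recent = prune_aged_points(history, now_s=1.2)  # drops the 0.10 s point
```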
- the controlling engine 340 is configured to group the touch data points 556 into at least one of a pen region 571 , an appendage region 561 or a finger region 581 . Therefore, based on the position of each touch data point 556 in relation to the other touch data points 556 , the geometric shape discrimination technique 342 can determine that one or more touch points, at any instant in time, are likely to be associated with a stylus pen, finger or part of the user's appendage.
- the various regions defined by the controlling engine 340 such as pen region 571 , an appendage region 561 or a finger region 581 , can be formed around clusters of touch data points that have been associated with the same type of user input. For example, the appendage region 561 illustrated in FIG. 5A contains a cluster of the touch data points 556 that have been associated with the user's appendage.
- the controlling engine 340 may also create and use a geometric boundary region 551 to help prioritize the analysis of the received touch data contained therein as being likely to contain useful user input data.
- the geometric boundary region 551 may include a region that includes a touch point that is associated with a stylus pen and one or more touch points that are associated with an appendage, since it is likely that a pen touch point will be near touch points that are associated with a palm of the user.
- the geometric boundary includes all of the touch points supplied to the user interface 104 that have been received at an instant in time. The controlling engine 340 may use the position of the touch points within the geometric boundary to help decide what type of user input has been received. In one example, touch data points that are near the boundaries of the geometric boundary may be more likely to be from a stylus pen.
- the controlling engine 340 has applied the various geometric based rules and determined that a group of touch points are associated with an appendage region 561 , a touch point is associated with a stylus pen (e.g., within the defined pen region 571 ) and a touch point is associated with a finger (e.g., within the defined finger region 581 ).
- the controlling engine 340 compares each of the currently received touch data points 556 with older touch data points 557 to determine the likely type of user input that has been received by the controlling engine 340 .
- the controlling engine 340 thus may determine that the user's appendage has shifted down and to the left based on the comparison of the touch data points 556 that are found in the appendage region 561 with the older touch data points 557 .
- This created geometric analysis data can then be used by other components in the host device 102 , or likely in this case be excluded from the output data 350 set.
- the geometric shape discrimination technique 342 creates and uses a predicted data point region 559 to help determine the element label for a newly received touch data point 556 .
- a first touch data point 557 1 and a second touch data point 557 2 are used to form a predicted direction 558 for the next touch data point and the predicted data point region 559 .
- the touch data point 556 in this example happens to fall within the predicted data point region 559 , and thus would have a higher likelihood of being a continuation of this specific input received from the user, such as a touch data point input received from a stylus pen 106 .
- the controlling engine 340 therefore takes into account the higher likelihood that a touch data point is of a certain type when it is assigning an element label to that touch data point.
- the controlling engine may adjust the size and shape of the predicted data point region 559 based on the speed of the user input, which is determined from the movement of the older touch data points 557 .
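The predicted data point region can be sketched as follows: two older points define a predicted direction, and a candidate point falling near the extrapolated position is treated as a likely continuation of the same input. The circular region shape and the speed-scaled radius are illustrative assumptions.

```python
# Sketch of the predicted data point region: extrapolate from two prior
# points and test whether a new point falls inside the predicted region.
import math

def predict_next(p1, p2):
    """Extrapolate the next position from two prior points (x, y)."""
    return (2 * p2[0] - p1[0], 2 * p2[1] - p1[1])

def in_predicted_region(p1, p2, candidate, base_radius=5.0):
    """True if candidate falls within a region around the predicted point.

    The region radius grows with input speed (distance between p1 and p2),
    mirroring the idea that the engine adjusts the region's size and shape.
    """
    px, py = predict_next(p1, p2)
    speed = math.dist(p1, p2)
    radius = base_radius + 0.5 * speed
    return math.dist((px, py), candidate) <= radius

# Two older points moving right; the candidate continues the stroke.
print(in_predicted_region((0, 0), (10, 0), (19, 1)))
```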
- the one or more geometric shape discrimination techniques 342 compare the movement of a touch data point, or cluster of touch data points, with the movement of a touch data point that is associated with a stylus pen, to determine if this cluster of points may be associated with an appendage (e.g., palm) following the stylus pen.
- a touch point or cluster of touch data points that move parallel to the direction of the movement of a touch data point that is associated with a stylus pen is labeled as being a palm.
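The parallel-motion rule above can be sketched by comparing movement vectors: a cluster whose motion is nearly parallel to the stylus pen's motion is labeled as a palm following the pen. The 15-degree tolerance and the names are illustrative assumptions.

```python
# Sketch of the parallel-motion rule: a cluster moving parallel to the
# stylus pen's movement is labeled as a palm following the pen.
import math

def is_parallel(v1, v2, tol_deg=15.0):
    """True if 2-D vectors v1 and v2 point in nearly the same direction."""
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    if n1 == 0 or n2 == 0:
        return False
    cos_angle = (v1[0] * v2[0] + v1[1] * v2[1]) / (n1 * n2)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_angle)))) <= tol_deg

pen_motion = (10.0, 2.0)       # stylus pen moved right and slightly down
cluster_motion = (9.0, 1.5)    # nearby cluster moved the same way
label = "palm" if is_parallel(pen_motion, cluster_motion) else "other"
```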
- FIG. 5C is a simplified flowchart illustrating a method of discriminating interactions caused by an appendage, such as a palm of a user on a touch-screen according to an embodiment of the invention.
- the method 520 is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general purpose computing system or a dedicated machine), firmware (embedded software), or any combination thereof that are contained in the host device 102 and/or the stylus pen 106 .
- the method 520 is performed by the controlling engine 340 that is running in the background of the host device 102 and/or the data collection and transmission processes running on components in the stylus pen 106 .
- the method includes receiving information related to a touch-down event on the host device 102 .
- the information related to the touch-down event that is transferred to the controlling engine 340 may comprise timing information, pressure data, and other data that is sent from the stylus pen 106 via the communication link 335 to the host device 102 .
- the information related to the touch-down event may comprise a first timestamp for the touch-down event, and may include information related to when the touch-down event occurred, how much pressure was applied in the touch-down event, and the length of time of the touch-down event.
- the information related to the touch-down event may comprise a pen clock timestamp derived from a pen clock signal received from the pen clock 106 g and a host clock timestamp derived from a host clock signal received from the host clock 215 for the touch-down event, and may include information related to when the touch-down event occurred, how much pressure was applied in the touch-down event, and the length of time of the touch-down event.
- the information related to the touch-down event is received from a handheld device, such as a stylus pen 106 .
- the method also includes receiving information related to a plurality of touch events on the host device 102 .
- the plurality of touch events may be from an appendage of the user, such as a palm of the user resting on the surface of the user interface 104 of the host device 102 .
- the plurality of touch events may also be from the stylus pen 106 interacting with the user interface 104 of the host device 102 , or by a touch interaction from the direct contact with the user interface 104 by a finger or appendage of the user.
- the information related to the touch event may comprise a second timestamp for the touch event, which may include timing information related to when the touch event occurred.
- the second timestamp comprises a pen clock timestamp derived from a pen clock signal received from the pen clock 106 g and a host clock timestamp derived from a host clock signal received from the host clock 215 for the touch event.
- the method also includes determining or defining one or more clusters of touch events from the plurality of touch events provided to the controlling engine 340 from the stylus pen 106 and host device 102 .
- the method also includes determining the movements of each one of the clusters of touch events.
- the controlling engine 340 in the host device 102 can determine the movements of each one of the clusters of touch events based on the interactions between each one of the clusters of touch events and the user interface 104 of the host device 102 .
- the controller in the host device 102 can determine the movement of each one of the clusters of touch events across the user interface 104 of the host device 102 , or can determine that one or more of the clusters of touch events are stationary.
- determining one or more clusters of touch events from the plurality of touch events may comprise detecting the location of each touch event in the plurality of touch events and associating each touch event into one or more clusters of touch events.
- associating each touch event into one or more clusters of touch events is based on relative distances between each touch event in the plurality of touch events. For example, touch events may be considered associated in the same cluster of touch events when they are within a predetermined distance from each other.
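The distance-based association above can be sketched as a single-link grouping pass: an event joins a cluster if it lies within the predetermined distance of any event already in that cluster. The 30-unit threshold and the merging strategy are illustrative assumptions.

```python
# Sketch of grouping touch events into clusters by relative distance:
# events within a predetermined distance of any member of a cluster join
# that cluster; a point bridging two clusters merges them.
import math

def cluster_touch_events(points, max_dist=30.0):
    """Group (x, y) points into clusters by the predetermined distance."""
    clusters = []
    for p in points:
        joined = None
        for cluster in clusters:
            if any(math.dist(p, q) <= max_dist for q in cluster):
                if joined is None:
                    cluster.append(p)
                    joined = cluster
                else:
                    joined.extend(cluster)   # merge clusters bridged by p
                    cluster.clear()
        clusters = [c for c in clusters if c]
        if joined is None:
            clusters.append([p])
    return clusters

events = [(0, 0), (10, 5), (200, 200), (205, 198)]
groups = cluster_touch_events(events)
print(len(groups))  # two clusters: one near the origin, one near (200, 200)
```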
- the method may also include correlating the information related to the touch-down event with the information related to the plurality of touch events.
- the touch-down event and plurality of touch events may be correlated based on information received by the controlling engine 340 relating to the touch-down event and the plurality of touch events.
- the information includes, but is not limited to, timing information received from the stylus pen 106 or movement of the plurality of touch events detected by the controlling engine 340 of the host device 102 .
- the method may also include determining whether the movement is below a threshold distance.
- the controlling engine 340 may determine whether the movement of each cluster of touch events is less than a predetermined threshold distance.
- the predetermined threshold may be set to a numerical value specifying a particular distance of movement. If the controlling engine 340 determines that one of the clusters of touch events moves less than the predetermined threshold, the cluster of touch events may be determined to be associated with an appendage of a user, as illustrated in box 534 , such as a palm, since a palm resting on the user interface 104 of the host device 102 is more likely to exhibit small movement or a lack of movement.
- otherwise, the cluster of touch events may be determined as being from a stylus pen or a finger, and thus not associated with the appendage of a user, as illustrated in box 536 .
- timing information may be used to determine whether the cluster of touch events is from a stylus pen 106 or from a finger of the user interacting with the user interface 104 of the host device 102 .
- one or more of the input discrimination techniques may first try to determine whether an interaction is from a stylus pen and then try to decide whether the interaction is from a non-stylus pen source, or vice versa.
- the controlling engine 340 continues to monitor incoming touch-down event and touch events, correlates the received data, and makes a determination as to whether the touch event is associated with the stylus pen 106 , touch interactions by the user's fingers, or a touch interaction caused by the user's appendage resting on the user interface 104 of the host device 102 , and then generates the output data 350 that is provided to software and hardware components running on the host device 102 .
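The movement-threshold decision of method 520 can be sketched as follows, using a cluster's tracked centroid positions over successive samples. The threshold value and the labels are illustrative assumptions.

```python
# Sketch of the decision in boxes 532-536: a cluster whose total movement
# stays below a predetermined threshold distance is labeled as an appendage
# (e.g., a resting palm); otherwise it is a stylus pen or finger candidate.
import math

def label_cluster(positions, threshold_px=8.0):
    """Label a cluster from its centroid track [(x, y), ...] over time."""
    movement = sum(math.dist(positions[i], positions[i + 1])
                   for i in range(len(positions) - 1))
    if movement < threshold_px:
        return "appendage"          # resting palm: little or no movement
    return "pen_or_finger"          # moving contact: resolve with timing data

resting_palm = [(100, 100), (101, 100), (101, 101)]
moving_tip = [(10, 10), (30, 12), (55, 15)]
print(label_cluster(resting_palm), label_cluster(moving_tip))
```

As the text notes, the clusters that exceed the threshold would then be disambiguated between pen and finger using the timing information from the stylus pen 106 .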
- FIG. 5C provides a particular method 520 according to an embodiment of the present invention. Other sequences of steps may also be performed according to alternative embodiments. For example, alternative embodiments of the present invention may perform the steps outlined above in a different order. Moreover, the individual steps illustrated in FIG. 5C may include multiple sub-steps that may be performed in various sequences as appropriate to the individual step. Furthermore, additional steps may be added or removed depending on the particular applications.
- One of ordinary skill in the art would recognize many variations, modifications, and alternatives.
- the various geometric based rules applied by the controlling engine 340 may include: increasing the likelihood that a touch data point 556 that is positioned within a threshold distance of a previous touch data point that was characterized as a stylus pen is also a pen touch data point; determining that a touch data point 556 that is positioned outside a defined appendage region 561 is likely a stylus pen or a finger input; determining that a touch data point 556 that is positioned in the same general direction that a "stylus pen" associated touch data point had moved is also likely a pen touch data point; and determining that a touch data point that has not moved over one or more touch sampling intervals is likely an appendage touch point.
- These geometric rule examples, and types of user input used with these rule examples are not intended to be limiting as to scope of the invention described herein.
- Embodiments of the invention described herein may also include a system and method that utilizes one or more inference based user input discrimination techniques to determine whether it is likely that a user input was received from a stylus pen, finger or appendage by a touch sensitive device.
- the inference based discrimination techniques used by the controlling engine will generally compare the received user input data with predefined and/or relevance-weighted rules to help discriminate between the interaction of a stylus pen, fingers or an appendage.
- the inference based user input discrimination techniques discussed herein may be used by themselves or in combination with one or more of the other types of user input discrimination techniques discussed herein to discern between the various different types of inputs received by the computing device.
- FIG. 6A illustrates a simplified flowchart of a method 600 of discriminating between different types of user inputs using an inference based discrimination technique.
- the inference based discrimination technique 343 ( FIG. 3A ) of the controlling engine 340 takes in the various different types of inputs 601 received by the host device 102 at an instant in time, such as inputs 331 , 333 , 335 , 339 discussed above, and tries to determine what type of user input has been received by comparing the outcome of various analyses performed on the received user input data.
- the various analyses may be performed by use of two or more decision modules, such as decision modules 602 1 - 602 N , where N is a whole number greater than or equal to two, that each provide an input 604 1 - 604 N to a decision matrix 605 .
- the decision matrix 605 then analyzes and compares the received inputs 604 1 - 604 N for each touch data point and other event data 603 to determine the most likely element label for each touch data point.
- the inputs 604 1 - 604 N may comprise a “vote” that includes an alphanumeric, numerical or other distinct label that signifies one type of user input from another.
- the decision matrix 605 compares the received inputs 604 1 - 604 N for each touch data point by tabulating or summing the different votes for the user input type created by each of the decision modules 602 1 - 602 N .
- the decision matrix 605 of the controlling engine 340 then generates the output data 350 that is then used by the components and software found in the host device 102 .
- the decision matrix 605 uses threshold values, which are stored in memory, to assign the user input a desired element label. In one example, a touch data point is not assigned a stylus pen input element label unless it receives a certain minimum number of votes.
- the decision matrix 605 determines an element label for a touch data point by first applying a weighting factor to each of the received inputs 604 1 - 604 N , and then compares the adjusted, or weighted, inputs to determine the element label for a given touch data point.
- the weighting factor may mean that a vote provided by a given decision module (e.g., decision module 602 1 ) may carry more weight than another vote provided by another given decision module (e.g., decision module 602 2 ) based on its ability to correctly characterize the type of user input.
- the event type 603 includes other known information relating to the received touch point data, such as whether the data is being delivered from the user interface 104 or stylus pen 106 .
- each decision module 602 1 , 602 2 , . . . 602 N includes a time based user input discrimination technique, a geometric user input discrimination technique or other applicable user input discriminating rules that are available to the controlling engine 340 .
- each of the decision modules 602 1 - 602 N comprises one or more coded software instructions that apply one or more defined rules that are used to characterize the type of user input from which a touch data point was derived.
- a decision module 602 1 characterizes a touch data point based on its relationship to a cluster of other touch data points as discussed in relation to FIG. 5A
- a decision module 602 2 characterizes the touch data point based on its predicted position as discussed in relation to FIG.
- a decision module 602 3 characterizes the touch data point based on its movement being within a threshold value as discussed in relation to FIG. 5C
- a decision module 602 4 characterizes a touch data point based on the knowledge of an attribute of the user (e.g., right-handed). Then, each decision module 602 1 - 602 4 delivers its input 604 1 - 604 4 , also referred to herein as its "vote" as to what type of user input the touch data point was created from, to the decision matrix 605 .
- the inputs 604 1 - 604 4 each indicate whether the decision module believes that the touch data point is associated with a stylus pen, finger or user's appendage.
- the decision matrix 605 of the controlling engine 340 compares the inputs 604 1 - 604 4 and generates the output data 350 for that touch data point, which may include its position, timestamp information and whether the touch data point is associated with a stylus pen, finger or user's appendage.
- FIG. 6B illustrates a table that contains examples of voting results contained in the generated decision matrix data based on the received inputs 604 1 - 604 N , where in this example N is equal to 10. Therefore, the inputs 604 1 - 604 10 (not shown) have been created by use of inputs received from the decision modules 602 1 - 602 10 (not shown), and have been tabulated by the decision matrix 605 to form the illustrated results.
- a first touch data point has received eight votes that it is related to a stylus pen, one vote that it is related to a finger and one vote that it is related to a user's appendage, while a second touch data point has received two votes that it is related to a stylus pen, seven votes that it is related to a finger and one vote that it is related to a user's appendage. Therefore, based on the tabulated data the controlling engine 340 would attribute the first touch data point to a stylus pen and the second touch data point to a finger, which would then be delivered in the output data 350 .
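The tabulation performed by the decision matrix 605 can be sketched as a weighted tally over the module votes, with the highest total determining the element label. The optional per-module weights and the label strings are illustrative assumptions.

```python
# Sketch of the decision matrix 605: each decision module casts a "vote"
# for the input type of a touch data point, optionally scaled by a
# per-module weighting factor, and the label with the highest total wins.
from collections import defaultdict

def tabulate_votes(votes, weights=None):
    """votes: one label per decision module; weights: per-module factors."""
    totals = defaultdict(float)
    for i, label in enumerate(votes):
        w = weights[i] if weights else 1.0
        totals[label] += w
    return max(totals, key=totals.get)

# The two touch data points from the FIG. 6B example:
first = ["pen"] * 8 + ["finger"] + ["appendage"]       # 8 / 1 / 1 votes
second = ["pen"] * 2 + ["finger"] * 7 + ["appendage"]  # 2 / 7 / 1 votes
print(tabulate_votes(first), tabulate_votes(second))
```

A minimum-vote threshold, as described for the stylus pen label, could be layered on top by checking the winning total before assigning the label.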
- the controlling engine 340 may then deliver output data 350 to one or more software and hardware components running on the host device 102 .
- the output data 350 may be used by one or more third party applications and/or components in the host device 102 to perform some useful display or data output function.
- the output data 350 may be used by the host device 102 to generate an image or line on a display in the host device 102 , due to the determination that the received touch data is related to a stylus pen 106 .
- the decision matrix 605 may be unable to determine what type of input a certain touch data point is; such points are referred to herein as "unknown" touch data points. Therefore, to resolve this issue, the software running on the host device may take a few different paths to decide what to do with these unknown touch data points.
- the software may decide not to use the "unknown" touch data points in any of the tasks that it is currently performing. For example, in the case of a drawing program, the controlling software may decide not to render the touch point on the screen of the user interface 104 . In this case, the controlling software has decided that each data point must have an element label to be used.
- the software running on the host device may decide to use the “unknown” touch data points in the tasks that it is currently performing. For example, in the case of a drawing program, the controlling software may decide to render the touch point on the screen of the user interface 104 , and at some later time undo and/or remove the rendered data when it is clear that the input data was not received from a desired component, such as stylus pen or finger. In this case, the controlling software may decide to give each input an initial element label and then correct the label when it has more data.
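The second path above, provisionally rendering an "unknown" point and correcting or undoing it later, can be sketched as follows. The class structure and label strings are illustrative assumptions.

```python
# Sketch of provisional labeling: an "unknown" touch data point is rendered
# under an initial label and later corrected or undone once the engine has
# more data about the input type.

class ProvisionalLabeler:
    def __init__(self):
        self.rendered = {}            # point id -> current element label

    def ingest(self, point_id, label):
        """Render immediately under an initial (possibly 'unknown') label."""
        self.rendered[point_id] = label

    def revise(self, point_id, final_label):
        """Correct the label once more data resolves the input type."""
        if final_label == "unwanted":
            self.rendered.pop(point_id, None)   # undo the rendered data
        else:
            self.rendered[point_id] = final_label

labeler = ProvisionalLabeler()
labeler.ingest(1, "unknown")
labeler.revise(1, "stylus_pen")       # resolved as pen input: keep it
labeler.ingest(2, "unknown")
labeler.revise(2, "unwanted")         # resolved as palm noise: remove it
```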
- FIG. 7 is a simplified signal diagram 700 illustrating aspects of a method for discriminating stylus pen interactions from touch interactions on a touch-screen by an inference technique, according to an embodiment of the invention.
- the diagram includes a host device signal 710 , a signal representing touch events detected by a controlling engine 720 , a signal representing a touch-down event by a stylus pen 106 on the user interface of the host device 730 , and a signal representing a touch interaction (e.g., from a finger of a user) on the user interface of the host device 740 .
- the host device signal 710 includes an active period 711 that indicates a period where the host device is active and able to receive inputs.
- the signal representing touch events detected by a controlling engine 720 includes a first period indicating a touch event 721 , a first period indicating no touch event 722 , a second period indicating a touch event 723 , a second period indicating no touch event 724 , a third period indicating a touch event 725 , a third period indicating no touch event 726 , a fourth period indicating a touch event 727 , and a fourth period indicating no touch event 728 .
- the signal representing touch-down event by a stylus pen 106 on the user interface of the host device 730 includes a first period indicating a touch-down event 731 , a first period indicating no touch-down interaction 732 , a second period indicating a touch-down event 733 , and a second period indicating no touch-down interaction 734 .
- the signal representing a touch interaction on the user interface of the host device 740 includes a first period indicating a touch interaction 741 , a first period indicating no touch interaction 742 , a second period indicating a touch interaction 743 , and a second period indicating no touch interaction 744 .
- the touch-down events attributable to the stylus pen 106 can be parsed out from amongst all the signal periods where touch events were detected.
- touch events 723 and 727 can be distinguished from touch events 721 and 725 as being touch events conducted using the stylus pen 106 against the user interface 104 of the host device 102 , rather than being touch events conducted by the user making contact with the user interface 104 of the host device 102 with a finger.
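The parsing described above amounts to time-correlating the detected touch-event intervals with the reported stylus touch-down intervals. The following is a minimal Python sketch, assuming each event is reported as a (start, end) time interval; the function names, interval representation, and times are invented for illustration.

```python
def overlaps(a, b):
    """True if the time intervals a=(start, end) and b=(start, end) overlap."""
    return a[0] < b[1] and b[0] < a[1]

def classify_touch_events(touch_events, stylus_downs):
    """Label each detected touch-event interval 'stylus' if it coincides
    in time with a reported stylus touch-down interval, else 'finger'."""
    labels = []
    for ev in touch_events:
        if any(overlaps(ev, sd) for sd in stylus_downs):
            labels.append("stylus")
        else:
            labels.append("finger")
    return labels

# Mirroring FIG. 7: four touch events, of which the second and fourth
# coincide with stylus touch-down reports (times in arbitrary units).
events = [(0, 2), (4, 6), (8, 10), (12, 14)]
downs = [(4, 6), (12, 14)]
print(classify_touch_events(events, downs))
# → ['finger', 'stylus', 'finger', 'stylus']
```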
- FIG. 8 is a simplified signal diagram 800 illustrating aspects of a method for discriminating stylus pen interactions from touch interactions on a touch-screen, where stylus pen and touch interactions overlap, according to an embodiment of the invention.
- a user may conduct stylus pen and touch interactions simultaneously with the user interface 104 of the host device 102 .
- the system can parse out those signals from the stylus pen 106 from the touch interactions conducted using the user's fingers.
- the diagram includes a host device signal 810 , a signal representing touch events detected by a controlling engine 820 , a signal representing a touch-down event by a stylus pen 106 on the user interface of the host device 830 , and a signal representing a touch interaction (e.g., from a finger of a user) on the user interface of the host device 840 .
- the host device signal 810 includes an active period 811 that indicates a period where the host device 102 is active and able to receive inputs.
- the signal representing touch events detected by a controlling engine 820 includes a first period indicating a touch event 821 , a first period indicating no touch event 822 , a second period indicating a touch event 823 , a second period indicating no touch event 824 , a third period indicating a touch event 825 , and a third period indicating no touch event 826 .
- the signal representing touch-down events by a stylus pen 106 on the user interface of the host device 830 includes a first period indicating a touch-down event 831 , a first period indicating no touch-down interaction 832 , a second period indicating a touch-down event 833 , and a second period indicating no touch-down interaction 834 .
- the signal representing touch interactions on the user interface of the host device 840 includes a first period indicating a touch interaction 841 , a first period indicating no touch interaction 842 , a second period indicating a touch interaction 843 , and a second period indicating no touch interaction 844 .
- the signal representing a touch interaction on the user interface of the host device 840 and the signal representing touch-down events by a capacitive stylus pen on the user interface of the host device 830 partially overlap during an overlap period 850 .
- the overlap period 850 coincides with the first period indicating a touch-down event 831 and the second period indicating a touch interaction 843 .
- the signal strength from the touch-down events by the stylus pen 106 and the signal strength from the touch interactions on the user interface of the host device 840 are roughly equal.
- the relative signal strengths of the touch-down events by the stylus pen 106 and the touch interactions on the user interface of the host device 840 may vary in strength.
- the relative change in signal strengths may be an additional factor in further discriminating between stylus pen and touch interactions.
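Using relative signal strength as an additional discriminating factor, as described above, could look like the following Python sketch. This is illustrative only; the contact representation, the `tolerance` parameter, and the idea of comparing against a known or estimated stylus strength are assumptions, not details from the patent.

```python
def split_overlapping_contacts(contacts, stylus_strength, tolerance=0.2):
    """During an overlap period, attribute each contact to the stylus or
    to a finger by comparing its signal strength against the stylus's
    known (or previously estimated) strength.

    contacts: list of (contact_id, signal_strength) pairs.
    Returns (stylus_ids, finger_ids).
    """
    stylus_ids, finger_ids = [], []
    for cid, strength in contacts:
        # Relative deviation from the expected stylus signal strength.
        rel = abs(strength - stylus_strength) / stylus_strength
        (stylus_ids if rel <= tolerance else finger_ids).append(cid)
    return stylus_ids, finger_ids
```

When the two strengths are roughly equal (as in diagram 800), this factor alone is inconclusive, which is why the text treats it as an *additional* factor rather than a primary one.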
- the stylus pen 106 may include a touch signal generating device (TSGD) 106 h that is used to cause the stylus pen 106 to be selectively sensed by the capacitive sensing elements found within the touch sensing unit 212 of the user interface 104 of the host device 102 .
- touch signal generating device 106 h includes one or more components that are able to selectively form a virtual capacitance between a portion of the pen tip 106 a and the capacitive sensing elements found in the user interface 104 when a TSGD switch, such as a mechanical sensor/switch 221 is activated by the user.
- the TSGD switch is part of the pen tip 106 a or pressure sensing unit 106 b .
- the formed virtual capacitance between the pen tip 106 a and the host device 102 creates a touch event that is sensed by the user interface 104 with or without the physical act of touching the pen tip 106 a to the user interface.
- FIG. 9A is an electrical schematic that illustrates the operation of an active stylus pen 106 with the host device 102 that is configured for mutual capacitance sensing, according to an embodiment of the invention.
- the active stylus pen 106 is configured with an active stylus control element 907 , which may receive signals from host device 102 as well as generate signals to be transmitted to the host device 102 .
- the active stylus pen 106 may be held by a user's fingers 915 and is coupled to the user interface 104 through pen tip 106 a .
- the active stylus pen 106 may be physically coupled to the user interface 104 , or the active stylus pen 106 may be located in proximity to the user interface 104 such that signals generated within the active stylus control element 907 and transmitted to the pen tip 106 a are able to change the sensed capacitance at sensing assembly 117 within the host device 102 to a desired level at a desired time.
- the host device 102 generally includes a user interface 104 , a driver assembly 113 and a sensing assembly 117 .
- the host device 102 may include, for example, drive regions and sense regions, such as drive electrodes 114 and sense electrodes 116 . Further, the drive electrodes 114 a - 114 c (x-direction) may be formed in columns while sense electrodes 116 a - 116 b (y-direction) may be formed in rows. Touch sensing areas, or touch pixels, may be formed at the overlapping regions of the drive electrodes and sense electrodes.
- column driver 113 may transmit a capacitive sensing waveform on one or more drive electrodes 114 at a time, thereby creating a mutual capacitance C M between the row of sense electrodes 116 and the driven drive electrode(s) 114 (i.e., column(s)) at each touch pixel.
- Active stylus pen 106 when coupled to the user interface 104 , may be configured to detect the transmitted capacitive sensing waveform.
- when the active stylus pen 106 is coupled to the user interface 104 , some of the charge coupled between the drive electrodes 114 and sense electrodes 116 corresponding to one or more touch pixels may instead be coupled onto the active stylus pen 106 , thus forming a pen capacitance C P corresponding to each of the coupled touch pixels.
- More charge may generally be coupled from a particular touch pixel to the active stylus pen 106 where the active stylus pen 106 is a shorter distance from that touch pixel; therefore, detecting that more charge has been coupled away from a particular touch pixel may indicate a shorter distance to active stylus pen 106 .
- This reduction in charge coupling across the touch pixels can result in a net decrease in the measured mutual capacitance CM between the drive electrode 114 and the sense electrode 116 , and a reduction in the capacitive sensing waveform being coupled across the touch pixel.
- This reduction in the charge-coupled sensing waveform can be detected and measured by analyzing the change in the sensed capacitance C s in the sensing assembly 117 to determine the positions of multiple objects when they touch the user interface 104 .
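One way to picture this position determination: the touch pixel showing the largest drop in measured mutual capacitance relative to its baseline is the pixel nearest the pen. The following Python sketch illustrates that idea only; the dictionary representation of pixels and the function name are invented, and real controllers interpolate across several pixels rather than picking a single maximum.

```python
def locate_pen(baseline_cm, measured_cm):
    """Estimate the pen position as the touch pixel with the largest
    decrease in measured mutual capacitance relative to baseline
    (more charge coupled away onto the pen => the pen is nearer).

    baseline_cm, measured_cm: dicts mapping (row, col) -> capacitance.
    Returns the (row, col) of the strongest drop, or None if no drop.
    """
    best, best_drop = None, 0.0
    for pixel, cm0 in baseline_cm.items():
        drop = cm0 - measured_cm[pixel]
        if drop > best_drop:
            best, best_drop = pixel, drop
    return best
```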
- the active stylus pen 106 may send a controlling signal to the user interface 104 by injecting a charge at the appropriate time to the pen tip 106 a , which alters the mutual capacitance C M and thus the value of sensed capacitance C s detected by the sensing assembly 117 . Therefore, by controlling the amount of charge to a desired level, or voltage formed between the pen tip 106 a and a sensing electrode 116 to a desired level, the pen tip 106 a of the active stylus pen 106 can be detected by the capacitive sensing element in the touch-screen containing device as being a touch event.
- the active stylus pen 106 may detect a signal produced at one or more drive electrodes 114 of the touch-screen containing device by the column driver 113 . Based on the detected signal, the active stylus pen 106 may alter the sensed capacitance C s to a level at a desired time, so as to cause the touch-screen containing device to correctly determine the location of input provided by the active stylus pen 106 .
- since the size of the pen tip 106 a is generally too small to be sensed by the user interface 104 , the active stylus pen 106 may be used to selectively provide a touch sensing input to the user interface 104 .
- the software running on the touch-screen containing device can analyze and use the provided input to control some aspect of a software program running on the touch-screen containing device and/or display some aspect of the input received on the display portion of the touch-screen device.
- the active stylus pen 106 is adapted to deliver input from the active stylus pen 106 to any type of touch-screen containing device, despite differences in the particular configurations and sensing methods performed by the touch-screen containing devices.
- FIG. 9B generally illustrates a driven touch-sensing detected signal 951 provided by the touch sensing components in the host device 102 and a controlling signal 915 that is generated and provided to the pen tip 106 a by the active stylus controlling element 907 , according to an embodiment described herein.
- the active stylus control element 907 may generally operate in a synchronization mode 913 or in a transmit mode 914 .
- active stylus pen 106 is coupled to a particular touch-screen containing host device 102 .
- the location of the pen tip 106 a on the touch screen may be directly at a drive pixel that contains a portion of the drive electrode 114 (i.e., a column) and the sense electrode 116 (i.e., a row), but may also be located on the touch screen between drive pixels.
- Detected signal 951 represents the voltage measured by the pen tip 106 a over time. Detected signal 951 reflects a signal that is generated by the column driver 113 and then sequentially applied to each column as the user interface 104 is sequentially scanned.
- the active stylus controlling element 907 may operate by default in synchronization mode 913 , essentially listening for signal activity in this mode, then may transition to transmit mode 914 based on signal activity received and processed by the processor 106 c.
- detected signal 901 has a signal magnitude 901 a , which indicates that the column driver 113 signal is being applied to a column that is a distance away from the pen tip 106 a , such as a neighboring column, and thus has not yet reached the column nearest to the pen tip 106 a .
- the active stylus control element 907 may remain in a synchronization mode 913 for a period of time or until the signal magnitude changes.
- detected signal 901 has an amplitude of 901 b , indicating that the column driver 113 is currently applying a portion of the detected signal 901 to a column (e.g., drive electrode 114 ) that is closer to the pen tip 106 a than the column that delivered the signal during the time period 902 .
- synchronization of the active stylus control element 907 with the touch-screen containing host device 102 is important to ensure that accurate input is detected by the host device 102 .
- the active stylus control element 907 transmits a signal to pen tip 106 a when column driver 113 is driving a column at which the pen tip 106 a is not located.
- the signal transmitted to pen tip 106 a will change the sensed capacitance most strongly at a sensing assembly 117 closest to the location of pen tip 106 a , but may also affect nearby sensing assemblies 117 to a lesser degree.
- because the host device 102 may measure the values of sensed capacitance across all rows simultaneously while the columns are driven in a particular sequence, the host device 102 will detect the changes in sensed capacitance but may misinterpret the location of the input.
- the effect of the misinterpretation may be erratic or erroneous input into host device 102 , which may cause the input position on the screen to jump around and/or lead to other undesirable effects in programs being executed on host device 102 , and may further significantly degrade the user's experience.
- the frequency of detected signal 901 received from the host device 102 may be changed, and may be a higher or lower frequency than the portion of the detected signal 901 in time period 903 .
- the change in frequency may be caused by the particular scanning process of host device 102 .
- the frequency of detected signal 901 increases at time period 904 while the amplitude of detected signal 901 remains at a signal magnitude 901 b , indicating that the detected signal 901 is still being applied to the same column, or a similarly positioned column, relative to pen tip 106 a .
- the active stylus control element 907 may adapt to such a change in frequency and adjust the output signal delivered from the pen tip 106 a .
- the active stylus control element 907 may stop transmitting and transition from transmit mode 914 to synchronization mode 913 .
- the active stylus control element 907 may then return to transmit mode 914 and resume transmitting an output signal 912 to the pen tip 106 a.
- the magnitude of detected signal 901 decreases from 901 b to 901 c , indicating that the column driver 113 is applying the detected signal 901 to a column (i.e., the column driver 113 is transmitting on the next column) that is a further distance away from the column(s) that delivered the signal during the time periods 903 and 904 .
- this indicates that the nearest column is no longer delivering the detected signal 901 from column driver 113 , which causes the active stylus control element 907 to transition into synchronization mode 913 , irrespective of the frequency or phase of detected signal 901 that is detected by the active stylus pen 106 .
- although signal 901 is depicted as having the same frequency and phase during time period 905 as during time period 904 , the example is meant to demonstrate that the signal magnitude falling below a particular threshold may trigger a transition into synchronization mode 913 , regardless of signal frequency or phase. Further, the examples disclosed herein are not meant to limit the claimed subject matter to only those embodiments interacting with host devices 102 that generate such signal patterns, frequencies, phases, or changes in frequencies and/or phases.
- the maximum signal magnitude value that corresponds to column driver 113 driving the nearest column may be learned during one scan cycle.
- the maximum signal magnitude value may then be used to determine a threshold value that can effectively distinguish the maximum magnitude value from the remainder of detected signal magnitude values (i.e., distinguish magnitude 901 b from magnitudes 901 a and 901 c ).
- the threshold value may be compared with the detected signal magnitude to indicate whether column driver 113 is currently driving the nearest column to pen tip 106 a.
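The learn-then-compare scheme described above can be sketched as follows. This Python sketch is illustrative; placing the threshold midway between the largest and second-largest observed magnitudes is one plausible choice, not a method specified by the patent.

```python
def learn_threshold(scan_magnitudes):
    """Learn, from one scan cycle, a threshold separating the maximum
    signal magnitude (the nearest column being driven) from the other
    observed magnitudes. Here the threshold is placed midway between
    the largest and second-largest distinct values."""
    ordered = sorted(set(scan_magnitudes), reverse=True)
    if len(ordered) < 2:
        return ordered[0] * 0.5
    return (ordered[0] + ordered[1]) / 2.0

def nearest_column_driven(magnitude, threshold):
    """True when the column driver is currently driving the column
    nearest the pen tip (magnitude at or above the learned threshold)."""
    return magnitude >= threshold

mags = [0.2, 0.2, 0.9, 0.3, 0.2]   # one scan; 0.9 = nearest column
th = learn_threshold(mags)
print([nearest_column_driven(m, th) for m in mags])
# → [False, False, True, False, False]
```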
- the sensing component (e.g., the communications unit 906 d , the processor 906 c and the memory 906 e ) may analyze the detected signal 901 and generate an output signal based on the detected signal 901 .
- the active stylus control element 907 may remain in synchronization mode 913 for a time period 918 until analysis of the detected signal 901 is complete and the active stylus control element 907 has synchronized to the detected signal 901 .
- the active stylus control element 907 may then transition into transmit mode 914 and begin transmitting an output signal, such as the output signal found in transmit modes 914 of the controlling signal 915 to the pen tip 106 a . Transmission may continue until synchronization with the detected signal 901 is lost (e.g., if the frequency or phase of detected signal 901 changes).
- although the active stylus control element 907 may be capable of on-the-fly adaptation to a frequency change in a detected signal 901 , this adaptive capability may have a significant computational expense. This expense may have secondary effects of increasing the power consumption of active stylus pen 106 as the active stylus control element 907 more frequently processes the detected signal 901 and attempts to synchronize, as well as decreasing the percentage of time during scan cycles that the active stylus control element 907 is able to transmit to host device 102 .
- active stylus control element 907 is depicted as being in synchronization mode 913 for a longer period 918 than the period 919 , during which active stylus control element 907 is in transmit mode 914 . Such a decreased percentage may result in a less responsive input to the host device 102 , which may ultimately cause computing errors in host device 102 .
- the active stylus pen 106 may accommodate longer transmit mode periods 919 by storing host device identification information that relates to one or more host devices 102 .
- the information may include data relating to physical characteristics or capacitive sensing techniques of each of the different types of host devices, and the information may be stored in memory 106 e .
- the host device identification information may further include frequency, timing and phase information of detected signal 901 , number of rows and/or columns in the user interface 104 and other useful information.
- the host device identification information may be pre-programmed and/or stored in memory based on vendor specifications or may be learned (through use of the active stylus pen 106 with particular host devices 102 ) and then stored in memory by the sensing component of active stylus pen 106 .
- the active stylus pen 106 may advantageously bypass synchronization mode 913 when column driver 113 is driving detected signal 901 on the nearest column. In other words, active stylus pen 106 may transmit an output signal to the pen tip 106 a during the entirety of the time period. Further, frequency and phase changes to detected signal 901 may not disrupt the transmission by the active stylus pen 106 if the target frequency and phase values are also included in the host device identification information.
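A stored host-device identification table of the kind described above might be organized as in the following Python sketch. The profile fields mirror the kinds of data the text lists (scan frequency, phase, row/column counts), but the field names and all values are invented for illustration.

```python
# Hypothetical pre-programmed profiles (e.g., from vendor specifications).
HOST_PROFILES = {
    "tablet_model_a": {"scan_hz": 120, "drive_freq_khz": 200,
                       "phase_deg": 0, "columns": 40, "rows": 30},
}

def get_sync_parameters(device_id, learned_profiles=None):
    """Return stored synchronization parameters for a recognized host
    device, preferring profiles learned through use over pre-programmed
    ones. Returns None when the device is unknown, in which case the
    pen falls back to synchronization mode."""
    if learned_profiles and device_id in learned_profiles:
        return learned_profiles[device_id]
    return HOST_PROFILES.get(device_id)
```

With a matching profile in hand, the pen can transmit for the entire scan cycle instead of spending part of each cycle re-synchronizing.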
- the stylus pen 106 is able to use the knowledge of the physical characteristics of the host device 102 to determine one of the coordinates of a touch event created by a stylus pen 106 's interaction with the user interface 104 .
- since the stylus pen is able to sense the transmitted signals provided by the driven columns in the host device 102 , and is able to determine that it is nearer to one column versus another, the stylus pen 106 can use the knowledge of the physical layout of the columns (i.e., drive electrodes) in the host device 102 to ascertain its x-direction coordinate.
- the stylus pen 106 can determine which column number is being driven at a certain time, either by knowledge of the scanning technique used by the host device and/or by analysis of the touch sensing scanning process. For example, it is common for touch sensing devices to drive all of the columns at the end of a touch sensing cycle to reduce any charge built up in different areas of the user interface. The stylus pen 106 is then able to detect and use this information to know when the first column in a new touch sensing scan is about to start. The stylus pen can then analyze the number of sensing signals of different amplitude created by the column driver 113 that are sent before the column nearest the pen tip 106 a is reached.
- the stylus pen 106 can then determine which column number that it is nearest to in the user interface, and thus its relative x-coordinate position.
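The column-counting step described above can be sketched in Python as follows. This is illustrative only: the amplitude list, threshold, and column-pitch parameter are invented, and it assumes the start of a new scan has already been detected (e.g., via the all-columns drive at the end of the previous cycle).

```python
def estimate_x_coordinate(pulse_amplitudes, threshold, column_pitch_mm):
    """After the start of a new touch-sensing scan, count how many column
    drive pulses occur before the sensed amplitude first reaches the
    nearest-column threshold. That count is the column index nearest the
    pen tip; the known column pitch converts it to a relative x position.

    Returns (column_index, x_mm), or (None, None) if no column qualifies.
    """
    for index, amp in enumerate(pulse_amplitudes):
        if amp >= threshold:
            return index, index * column_pitch_mm
    return None, None
```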
- the x-coordinate position can then be transmitted to the host device via the communication link 205 , so that this information can be used by and/or compared with the touch sensing coordinate information received from the host device to help more easily determine which touch data points are related to the stylus pen 106 .
- Knowledge of at least one of the coordinates of a stylus pen 106 interaction with the user interface 104 can help reduce misidentification error rate and help with palm and finger detection using the techniques described above.
- FIG. 9C illustrates the components of an active stylus pen 106 capable of interacting with a host device 102 that is configured for mutual capacitance sensing, according to an embodiment of the invention.
- the active stylus pen 106 may couple to the host device 102 through pen tip 106 a , as discussed above.
- the active stylus pen 106 is further configured with an active stylus control element 910 , which comprises a low-noise amplifier (LNA) 931 , a phase discriminator 932 , a peak detector 933 , a timing state machine (TSM) 934 , a waveform generator (WG) 935 , a power amplifier (PA) 936 , and a clock source 937 .
- the LNA 931 generally provides linear signal amplification, and in one or more configurations, LNA 931 may operate across the 10 kilohertz (kHz) to 1 megahertz (MHz) frequency range and may have an input impedance greater than 1 megohm (MΩ).
- the phase discriminator 932 is generally a zero-crossing detector, which generates a pulse having a width of one cycle of clock source 937 upon detecting a transition of potential at pen tip 106 a .
- the peak detector 933 generally comprises rectifier, integrator, and high-pass filter components.
- the TSM 934 is comprised of a state machine that controls mode selection, a phase and frequency estimator, a calibration state machine, and a timing sequencer through use of the processor 106 c , clock 106 g and memory unit 106 e .
- Output generated by TSM 934 provides control to the WG 935 , which may generate an appropriate sequence of square pulses having a particular frequency, amplitude and duty cycle that are specified by TSM 934 .
- the PA 936 drives the pen tip 106 a so that a desired signal can be detected by the host device 102 , and is capable of tri-state operation based on control signals received from TSM 934 and WG 935 .
- the tri-state operation, which may be controlled by the TSM 934 , may include the delivery of a high voltage signal (V H ) (e.g., positive voltage signal) and a low voltage signal (V L ) (e.g., negative voltage signal) to provide a desired signal from the pen tip 106 a that can be sensed (e.g., V H or V L ) at desired times by any type of host device 102 using any type of sensing technique.
- the PA 936 may also deliver no signal at all to pen tip 106 a such as during idle periods or while the PA 936 is in a high-impedance mode (e.g., when active stylus control element 910 is synchronizing to a detected signal 901 ).
- the clock source 937 may be a crystal oscillator or a comparably precise source, and is typically the same clock as clock 106 g discussed above.
- the clock source 937 is generally required to be as precise as the clock source that drives the user interface 104 .
- the host device 102 generally includes a user interface 104 , a driver assembly 113 and a sensing assembly 117 . Touch sensing areas, or touch pixels, may be formed at the overlapping regions of the one or more drive electrodes 114 and one or more sense electrodes 116 . As shown, pen tip 106 a is located within an electric field E of the mutual capacitance created by the drive electrode 114 and sense electrode 116 .
- the pen tip 106 a is coupled to the user interface 104 , and the signals generated within the active stylus control element 910 and transmitted to the pen tip 106 a may alter the electric field E, which in turn may change the sensed capacitance at sensing assembly 117 to a desired level at a desired time.
- active stylus control element 910 may generally operate in a synchronization mode and/or in a transmit mode.
- the active stylus control element 910 may operate by default in synchronization mode, essentially listening for signal activity of the touch sensing component in the host device 102 in this mode, then may transition to transmit mode based on received signal activity.
- the TSM 934 may transmit an output to the enable (ENB) input of PA 936 , which causes the PA 936 to operate in a high impedance mode and deliver the signal to the pen tip 106 a at a desired time to coincide with the capacitive sensing signal delivered by the host device 102 .
- the high impedance at PA 936 relative to LNA 931 causes most of the detected signal at pen tip 106 a to be transmitted to the LNA 931 .
- the TSM 934 also may transmit an output to the WG 935 to disable the WG 935 , which may be advantageously used to conserve power in the active stylus pen 106 .
- the pen tip 106 a when coupled to a host device 102 may detect a signal from the host device 102 , by monitoring the signal received by the LNA 931 as PA 936 is operating in high impedance mode. After being amplified at LNA 931 , the detected signal is provided to both the phase discriminator 932 and the peak detector 933 . The respective outputs from the phase discriminator 932 and peak detector 933 are then transmitted to TSM 934 , which uses the estimated phase and frequency to control the output of the WG 935 .
- the TSM 934 may cause the active stylus control element 910 to operate in transmit mode by enabling the PA 936 and causing the WG 935 to begin generating an output signal according to the phase, amplitude and frequency information provided by the TSM 934 .
- the output signal generated by the WG 935 may next be amplified by the PA 936 .
- LNA 931 may have a relatively large input impedance compared to the pen tip 106 a , so that the amplified signal will be transmitted to the pen tip 106 a , in order to affect the sensed capacitance due to the capacitive coupling of the pen tip 106 a to the touch sensing components in the user interface 104 .
- the touch signal generating device 106 h includes signal control electronics 106 i , a conductive coating 222 formed on a surface of the stylus pen 106 , which the user is in contact with when they are holding the stylus pen 106 , and the mechanical sensor/switch 221 (e.g., simple mechanical switch).
- the signal control electronics 106 i generally includes a signal generating device and other supporting components that are able to inject a current through the pen tip 106 a to the capacitive sensing elements in the user interface 104 at an interval that is synchronized with the capacitive sensing signals delivered between the capacitive sensing elements in the user interface 104 .
- the signal control electronics 106 i is also adapted to detect the capacitive sensing signal(s) delivered between the transmitter and receiver electrodes in the touch sensing unit 212 at any instant in time, and a phase shifting device (not shown) that is able to synchronize the timing of the injection of current through the pen tip 106 a with the delivery of the capacitive sensing signal(s) delivered between the transmitter and receiver electrodes.
- the mechanical sensor/switch 221 , when activated, electrically couples the conductive coating 222 , signal control electronics 106 i and other useful electrical components in the stylus pen 106 to the pen tip 106 a to create a virtual capacitance signal that is delivered between the pen tip 106 a and the capacitive sensing elements in the user interface 104 .
- the virtual capacitance created by the activation of the mechanical sensor/switch 221 can at least be intermittently formed between the pen tip 106 a and a portion of the user interface 104 , so that a desirable touch signal is received by the user interface 104 with or without the physical act of touching the pen tip 106 a to the user interface.
- the initial activation of the mechanical sensor/switch 221 causes a specific set of sensing signal pulses, or signature pulses, that allow the one or more input discrimination techniques 345 used by the controlling engine 340 to more easily characterize the touch data input created by the activation of the touch signal generating device 106 h as an input from the stylus pen 106 .
- the capacitive sensing elements in the user interface 104 of the host device 102 are sampled at a set frequency (e.g., sampled every 16 ms).
- the set of sensing signal pulses created by portions of the stylus pen 106 may require two or more sensing signal pulses that each have a distinguishing preset length and/or a fixed time between them that is equal to or greater than the sampling period of the device, so that the signature of the activation of the touch signal generating device 106 h can be more easily determined by the user input discriminating techniques performed by the controlling engine 340 that is running on the host device 102 .
- the touch signal generating device 106 h is useful, since it allows the user to initiate the interaction of the stylus pen 106 with the user interface 104 , rather than wait for the sensed contact of the pen tip 106 a and the user interface 104 to be characterized by the controlling engine 340 as an input received from a stylus pen.
- FIG. 10 illustrates two sets of signature pulses 1001 and 1002 that may each be delivered from two different stylus pens 106 , so that the controlling engine 340 can more easily associate the user input created by each stylus pen 106 with that particular stylus pen.
- the signature pulses 1001 and 1002 may be generated at the start of the interaction of the stylus pen with the user interface 104 to let the controlling engine know that the subsequent touch interactions that are associated with that initiating touch event will be made by a particular stylus pen.
- the signature pulse 1001 may comprise two pulses 1005 and 1006 that each have a desired duration 1021 and 1023 , respectively, and an off-period 1007 that has a duration 1022 .
- the signature pulse 1002 may comprise two pulses 1010 and 1011 that each have a desired duration 1041 and 1043 , respectively, and an off-period 1012 that has a duration 1042 .
- due to at least one difference between signature pulses 1001 and 1002 , such as the number of pulses (e.g., 2, 4 or 8 pulses), pulse shape (e.g., square-wave shape, sinusoidal wave shape), pulse duration, or the off-period between pulses, the controlling engine 340 will be able to more easily determine that a particular input is received from one stylus pen versus another.
- a signature pulse 1001 or 1002 can also be used to determine that an interaction sensed by the user interface 104 is related to a stylus pen and not a finger or user's appendage.
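- The signature-matching step described above can be sketched in code. The following is a minimal, hypothetical illustration rather than the patent's implementation: each known pen is described by a list of (pulse duration, off-period) pairs, and an observed pulse train is attributed to a pen only if every duration matches within a tolerance. The pen names, durations and tolerance values are all assumptions for illustration.

```python
# Hypothetical per-pen signatures: lists of (pulse_ms, off_period_ms) pairs,
# loosely modeled on pulses 1005/1006 (with off-period 1007) and 1010/1011
# (with off-period 1012). All numeric values are illustrative assumptions.
KNOWN_SIGNATURES = {
    "pen_1001": [(5.0, 3.0), (5.0, 0.0)],
    "pen_1002": [(2.0, 6.0), (8.0, 0.0)],
}

def match_signature(observed, tolerance_ms=0.5):
    """Return the pen whose signature matches the observed pulse train, or None."""
    for pen, signature in KNOWN_SIGNATURES.items():
        if len(observed) != len(signature):
            continue
        # Every pulse duration and off-period must match within the tolerance.
        if all(abs(on - s_on) <= tolerance_ms and abs(off - s_off) <= tolerance_ms
               for (on, off), (s_on, s_off) in zip(observed, signature)):
            return pen
    return None
```

A pulse train that matches no stored signature (returning None here) could then be treated as a finger or appendage touch rather than a stylus pen input.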
- the present invention can be implemented in the form of control logic in software or hardware or a combination of both.
- the control logic may be stored in an information storage medium as a plurality of instructions adapted to direct an information-processing device to perform a set of steps disclosed in embodiments of the present invention. Based on the disclosure and teaching provided herein, a person of ordinary skill in the art will appreciate other ways and/or methods to implement the present invention.
- any of the entities described herein may be embodied by a computer that performs any or all of the functions and steps disclosed.
Abstract
Embodiments of the invention are directed to control devices, such as human interface devices, configured for use with a tablet computer. More specifically, the present invention relates to methods and systems for discriminating between the interactions of a handheld device, the touch of one or more of the user's finger(s) and the interaction of appendages of the user on a touch-screen tablet computer. The methods described herein may include discriminating between the interaction of the handheld device, the user's finger(s) and an appendage of the user so that the collected information can be used to control some aspect of the hardware or software running on the touch-screen tablet computer.
Description
- This application claims benefit of U.S. Provisional Patent Application Ser. No. 61/755,881, filed Jan. 23, 2013, entitled “Method and System For Discriminating Pen and Touch Interactions”, U.S. Provisional Patent Application Ser. No. 61/791,577, filed Mar. 15, 2013, entitled “Method and System for Discriminating Stylus and Touch Interactions” (Atty Dkt No. LOGI/0005L), U.S. Provisional Patent Application Ser. No. 61/738,797, filed Dec. 18, 2012 entitled “Electronically Augmented Pen Tip For A Touch Pad Digitizer” (Atty Dkt No. LOGI/0003L), U.S. Provisional Patent Application Ser. No. 61/762,222, filed Feb. 7, 2013, entitled “Electronically Augmented Pen Tip For A Touch Pad Digitizer” (Atty Dkt No. LOGI/0003L02) and U.S. Provisional Patent Application Ser. No. 61/790,310, filed Mar. 15, 2013, entitled “Active Stylus For Touch Sensing Applications” (Atty Dkt No. LOGI/0003L03), which are all herein incorporated by reference.
- 1. Field of the Invention
- The present invention relates to a method and a system that is able to discriminate between the interaction of an electronic stylus pen, finger(s) or user's appendage and a touch screen containing device.
- 2. Description of the Related Art
- Touch-screen tablet computers allow a user the ability to interact directly with content displayed on the touch-screen of the tablet computer. These interactions can be conducted through various means, but are typically done through touch, by way of the user's fingers directly interacting with the screen, or through the use of a stylus pen or other type of input control device that contacts the screen based on movements made by the user. Typically, touch-screens distinguish touch inputs from stylus pen inputs by using various sensing technologies or input modes that the user has to select based on the operations the user wants to conduct on the touch-screen of the tablet computer. Other typical solutions require stylus pen inputs to originate from a stylus pen that is physically tethered to the tablet computer.
- Collecting touch information from these types of interface mechanisms also introduces a number of challenges. Moreover, the process of reliably collecting touch information becomes increasingly more complicated where the computing device allows a user to input information using both a touch input mechanism and a stylus pen input mechanism. In the course of interfacing with the touch sensitive surface of the computing device with a stylus pen device, the user may inadvertently rest his or her palm on the touch sensitive surface. The computing device may then incorrectly interpret this inadvertent contact as a legitimate input activity. A similar challenge may confront a user who is intentionally using a touch input mechanism to control or input data to the computing device. In some cases, the user may attempt to apply a focused touch to the surface of the computing device, yet the user may accidentally brush or bump his or her hand against other parts of the display surface, causing accidental input events. These problems may understandably frustrate the user if they become a frequent occurrence, or even if uncommon, if they cause significant disruption in the task that the user is performing.
- Moreover, due to limitations in the computing power of the computing device, a desire to increase the speed of the computing device by reducing the computational power required to collect and transfer the touch interaction data, and/or the often limited nature of the data received from the touch sensing components of a third party's computing device on which a hardware and software application (e.g., "app") maker's software is running, there is a need for a method that can distinguish between the different user inputs by use of a simplified data set that is created by the computing device from the interaction of the user's fingers, appendage and/or stylus pen. In some cases, the simplified data set includes the coordinates of a touch point and the time when the touch point was sensed by the touch sensing components. The simplified data set is generally a small fraction of the amount of data that is commonly available from the touch sensitive hardware in conventional touch sensitive display type computing devices today.
- Despite the progress made with respect to operating touch screen tablet computers, there is a need in the art for improved methods and systems related to distinguishing different inputs provided to tablet computers in spite of the problems discussed above.
- Embodiments relate generally to control devices, such as human interface devices, configured for use with a touch screen tablet computer. More specifically, the present invention relates to methods and systems for discriminating between the interactions of a handheld device, touch of one or more of the user's finger(s) and interaction with appendages of the user on a touch-screen tablet computer. The methods described herein may include discriminating between the interaction of the handheld device, such as an electronic stylus pen, the user's finger(s) and a user's appendage so that the collected information can be used to control some aspect of the hardware or software running on the touch-screen tablet computer. The methods disclosed herein may also be used to separate the interaction of the user's appendage from the interactions of the handheld device and/or user's finger(s) with the touch-screen tablet computer. In one example, the information received from the appendage of the user is distinguished from the information received from the interaction of a stylus pen and the user's finger and the touch-screen tablet computer, and is purposely not used to control the hardware and/or software running on the touch-screen tablet computer.
- Embodiments provide a method of operating a host device, comprising receiving, at the host device, information related to a touch-down event, receiving, at the host device, information related to a touch event from a controlling engine, correlating the information related to the touch-down event with the information related to the touch event, and determining that the touch-down event is associated with a handheld device.
- Embodiments further provide a method of characterizing user input data received by a host device, comprising receiving, at the host device, information related to a first touch event from a touch sensing unit coupled to the host device, wherein the information from the first touch event comprises a first touch data point, comparing the first touch event information with a first rule and a second rule, wherein the first rule and the second rule each form a vote as to the type of user input that created the first touch event, and attributing the first touch data point to a type of user input by analyzing the votes received from the first and second rule. However, in some embodiments, more than two rules may be used to determine the type of user input.
- Embodiments may further provide a method of characterizing user input data received by a host device, comprising receiving, at the host device, information related to a touch-down event from a handheld device, wherein the information related to the touch-down event comprises information relating to a first time when the touch-down event occurred, receiving, at the host device, information related to a first touch event and a second touch event from a touch sensing unit coupled to the host device, wherein the information provided for the first touch event comprises a first touch data point and information relating to a second time, and the information provided for the second touch event comprises a second touch data point and information relating to a third time, analyzing the information received by the host device, comprising comparing a predetermined threshold time and the information relating to the first time and the second time, and then assigning a first user input type vote to the first touch data point based on the comparison, and comparing a first position of the first touch data point on a user interface of the host device and a second position of the second touch data point on the user interface of the host device, and then assigning a second user input type vote to the first touch data point based on the comparison of the first position relative to the second position, and attributing a type of user input to the first touch data point using the first user input type vote and second user input type vote.
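- As a rough, non-authoritative sketch of the voting scheme described above (function names and thresholds are assumptions, not values from the disclosure): a time-based rule votes on whether a touch data point closely follows the handheld device's reported touch-down time, a position-based rule votes on whether the first touch data point lies close to a second one (as a palm contact typically would), and the votes are then tallied to attribute a user input type.

```python
import math

def time_rule_vote(touch_down_time_s, touch_time_s, threshold_s=0.030):
    """Vote 'stylus' if the touch was sensed within a threshold of the touch-down event."""
    return "stylus" if abs(touch_time_s - touch_down_time_s) <= threshold_s else "finger"

def position_rule_vote(first_pos, second_pos, palm_radius_px=150.0):
    """Vote 'finger' if the two touch data points are close enough to be one palm."""
    dx, dy = first_pos[0] - second_pos[0], first_pos[1] - second_pos[1]
    return "finger" if math.hypot(dx, dy) <= palm_radius_px else "stylus"

def attribute_input(votes):
    """Attribute the touch data point to the input type with the most votes."""
    return max(set(votes), key=votes.count)
```

For example, `votes = [time_rule_vote(1.000, 1.010), position_rule_vote((100, 100), (600, 400))]` yields two "stylus" votes, so the first touch data point would be attributed to the stylus pen; additional rules would simply contribute additional votes to the tally.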
- Embodiments further provide a method of characterizing user input data received by a host device, comprising receiving, at the host device, information related to a touch-down event from a handheld device, wherein the information comprises information relating to a first time when the touch-down event occurred, receiving, at the host device, information related to a first touch event from a touch sensing unit coupled to the host device, wherein the information comprises information relating to a second time when the touch event occurred on a touch sensitive unit of the host device, correlating the information related to the touch-down event with the information related to the first touch event, wherein correlating the information comprises comparing the first time, the second time and a predetermined threshold, and determining that the touch-down event is associated with the handheld device when the difference in time between the first and second time is less than the predetermined threshold.
- Embodiments further provide a method of characterizing user input data received by a host device, comprising receiving, at the host device, information related to a touch-down event from a handheld device, receiving, at the host device, information related to a plurality of touch events from a touch sensing unit coupled to the host device, defining a portion of the plurality of touch events as being part of a first cluster of touch events, correlating the information related to the touch-down event with the information related to the first cluster of touch events, determining that the first cluster of touch events is associated with a user's appendage, and determining that at least one touch event of the plurality of touch events is associated with a handheld device, wherein the at least one touch event is not within the first cluster.
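- The cluster-based discrimination above can be illustrated with a small sketch (an assumption for illustration, not the patent's specific algorithm): touch points are greedily grouped by proximity, clusters containing several points are treated as a user's appendage such as a palm, and a point falling outside any such cluster remains a stylus pen candidate. The gap and cluster-size thresholds are made-up values.

```python
import math

def cluster_points(points, max_gap_px=80.0):
    """Greedily add each point to the first cluster containing a near neighbor."""
    clusters = []
    for p in points:
        for cluster in clusters:
            if any(math.dist(p, q) <= max_gap_px for q in cluster):
                cluster.append(p)
                break
        else:
            # No existing cluster is close enough; start a new one.
            clusters.append([p])
    return clusters

def split_palm_and_pen(points, palm_min_points=3):
    """Return (palm_points, pen_candidates), using cluster size as the palm cue."""
    palm, pen = [], []
    for cluster in cluster_points(points):
        (palm if len(cluster) >= palm_min_points else pen).extend(cluster)
    return palm, pen
```

With points [(0, 0), (30, 10), (50, 40), (500, 500)], the first three fall within the gap threshold of one another and form one cluster (labeled as the palm), while (500, 500) is left as the stylus pen candidate.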
- Embodiments further provide a computer readable medium configured to store instructions executable by a processor of a host device to characterize user input data received by the host device, the instructions when executed by the processor causing the processor to receive information related to a first touch event from a touch sensing unit coupled to the host device, wherein the information from the first touch event comprises a first touch data point, compare the first touch event information with a first rule and a second rule, wherein the first rule and the second rule each form a vote as to the type of user input that created the first touch event; and attribute the first touch data point to a type of user input by analyzing the votes received from the first and second rule.
- Embodiments further provide a method of operating a host device, comprising receiving, at the host device, information related to a touch-down event, receiving, at the host device, information related to a plurality of touch events from a controller, determining one or more clusters of touch events from the plurality of touch events, correlating the information related to the touch-down event with the information related to the one or more clusters of touch events, determining that one of the one or more clusters of touch events is associated with a palm, and determining that the touch-down event is associated with a handheld device.
- In another embodiment, the handheld device includes at least one of an accelerometer, a magnetometer, a gyroscope, or the like for detecting the orientation of the handheld device and detecting a triggering event, which both can be used to help control some aspect of the hardware or software running on the touch-screen tablet computer.
- So that the manner in which the above recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments.
-
FIG. 1 illustrates an exemplary touch-screen tablet computer and a capacitive stylus pen according to an embodiment of the invention. -
FIG. 2 is a simplified block diagram of the components of a host device and stylus pen according to an embodiment of the invention. -
FIG. 3A is a simplified block diagram of a user input discrimination processing architecture used to distinguish between the different types of user inputs received by the touch-screen tablet computer according to an embodiment of the invention. -
FIG. 3B is a flowchart illustrating a method of discriminating touch interactions from stylus pen interactions on a touch-screen according to an embodiment of the invention. -
FIG. 3C is a simplified signal diagram illustrating aspects of the process of discriminating stylus pen interactions from touch interactions on a touch-screen, according to an embodiment of the invention. -
FIG. 3D is a simplified signal diagram illustrating aspects of the process of discriminating stylus pen interactions from touch interactions on a touch-screen, according to an embodiment of the invention. -
FIG. 4 is a simplified flowchart illustrating a method of discriminating touch interactions from stylus pen interactions on a touch-screen according to an embodiment of the invention. -
FIG. 5A illustrates a plurality of related touch points on a touch-screen tablet computer that have been analyzed by a controlling engine according to an embodiment of the invention. -
FIG. 5B illustrates a plurality of related touch points on a touch-screen tablet computer that have been analyzed by a controlling engine according to an embodiment of the invention. -
FIG. 5C is a simplified flowchart illustrating a method of discriminating interactions caused by the palm of a user on a touch-screen according to an embodiment of the invention. -
FIG. 6A is a simplified flowchart illustrating a method of discriminating touch interactions from stylus pen interactions on a touch-screen according to an embodiment of the invention. -
FIG. 6B is a table listing some examples of voting results contained in the decision matrix data generated during the method of discriminating between various touch interactions illustrated in FIG. 6A , according to one or more of the embodiments described herein. -
FIG. 7 is a simplified signal diagram illustrating aspects of the process of discriminating stylus pen interactions from touch interactions on a touch-screen, according to an embodiment of the invention. -
FIG. 8 is a simplified signal diagram illustrating aspects of the process of discriminating stylus pen interactions from touch interactions on a touch-screen, where stylus pen and touch interactions overlap, according to an embodiment of the invention. -
FIG. 9A is an isometric cross-sectional view of a portion of a mutual capacitance sensing type host device that is interacting with an active stylus pen, according to an embodiment of the invention. -
FIG. 9B is a schematic signal diagram illustrating aspects of the process of detecting a touch-sensing device output signal and synchronizing an active stylus pen thereto, according to an embodiment of the invention. -
FIG. 9C illustrates the components of an active stylus pen 206 capable of interacting with a host device 100 that is configured for mutual capacitance sensing, according to an embodiment of the invention. -
FIG. 10 illustrates simplified signature pulse diagrams that may be generated by two pens, according to an embodiment of the invention. - To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures. It is contemplated that elements disclosed in one embodiment may be beneficially utilized on other embodiments without specific recitation. The drawings referred to here should not be understood as being drawn to scale unless specifically noted. Also, the drawings are often simplified and details or components omitted for clarity of presentation and explanation. The drawings and discussion serve to explain principles discussed below, where like designations denote like elements.
- Embodiments of the present invention generally provide a system and methods of distinguishing between the different types of user inputs provided from the interaction of a user's finger, a user's appendage and/or a handheld device with a touch sensitive device. In some configurations the handheld device is an electronic stylus pen, also referred to herein simply as a "stylus pen," that a user uses to provide input to control some aspect of the touch sensitive device. Computing devices that provide software applications that allow a user to input information via a touch input mechanism and a stylus pen input mechanism are often complex due to the need to distinguish between the interaction of a user's finger, user's appendage and stylus pen with the touch sensitive device to properly control some aspect of the hardware or software applications running on the computing device. It is common for the software applications running on the computing device to assign different tasks or cause different computing device controlling events to happen based on the input received from either a stylus pen, a finger or an appendage. It is often desirable to single out the unwanted interactions with the touch sensitive device, such as interactions created by an appendage of a user (e.g., palm, shirt cuff, or other similar element), so that they can be purposely excluded from the input provided to and/or analyzed by one or more software applications running on the computing device. Errors in the proper selection of an inputting element will create errors in the output generated by the software running on the host device, which will understandably frustrate the user even if they are an uncommon occurrence. Moreover, improper selection errors can also cause significant disruption to the task that the user is performing on the computing device.
- Embodiments of the invention described herein may also include a system and methods that employ a controlling engine running on a touch sensitive computing device, generally referred to herein as a host device, to discern between the user input received from a stylus pen, fingers or user's appendage. The data generated from the controlling engine's analysis of the user input data received from the various components that are coupled to or in communication with the touch sensitive computing device can then be used to control some aspects of the hardware or software running on the touch sensitive computing device. The controlling engine generally includes software instructions that include one or more input discrimination techniques that are used to analyze the various types of user input data received from one or more components in the touch sensitive device to determine the likely source of the user input. The one or more input discrimination techniques may include time based synchronization techniques, geometric shape discrimination techniques and inference based discrimination techniques that can be used separately or in combination to discern between different types of inputs received by the touch sensitive computing device. Touch sensitive computing devices may include a touch-screen tablet computer, which may use a resistive, capacitive, acoustic or other similar sensing technique to sense the input received from a user.
- In some embodiments, a system and method are used to distinguish between different types of user inputs using a simplified data set that is created by the touch sensitive computing device from the interaction of a user's finger, user's appendage and/or a handheld device. In some cases, the simplified data only includes the coordinates of the touch point and the time that the interaction occurred with the touch sensing components, which is generally a small fraction of the amount of the data that is typically collected by conventional handheld or touch sensitive computing devices.
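- For concreteness, the simplified data set described above can be modeled as a minimal per-touch record holding only a coordinate pair and a timestamp; the type and field names below are illustrative assumptions, not an API defined by the disclosure.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TouchDataPoint:
    x: float  # horizontal coordinate of the touch point on the user interface
    y: float  # vertical coordinate of the touch point on the user interface
    t: float  # time the touch point was sensed by the touch sensing components
```

The discrimination techniques discussed in this disclosure would then operate on sequences of such records, which is a small fraction of the raw data a conventional touch controller typically produces.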
- In FIG. 1 , a system is depicted that includes a touch sensitive computing device, or host device 102 , that includes a user interface 104 capable of user interaction through a touch-screen sensing component. The host device 102 may be, for example, a general computing device, phone, media player, e-reader, kiosk, notebook, netbook, tablet type of computer, or any other device having one or more touch-sensitive inputs. In some devices, the user interface 104 can include components that are used to display applications being executed by the host device 102 . In the example shown in FIG. 1 , the host device 102 is an electronic device such as an iPad® device from Apple Inc. Exemplary embodiments of computing devices include, without limitation, the iPhone®, iPad® and iPod Touch® devices from Apple Inc., the Galaxy Note® 10.1 from Samsung, the Surface™ from Microsoft, other mobile devices, tablet computers, desktop computers, kiosks, and the like. -
FIG. 1 also depicts a user input device, or a handheld device, in the form of a stylus pen 106 that is capable of touch interactions with the user interface 104 of the host device 102 . While the stylus pen 106 is a typical embodiment of the control device described herein, embodiments of the control device are not limited to a stylus pen 106 , and may include control devices in other forms, including stamps and other devices that can be used to conduct touch interactions with the user interface 104 , such as other fixed or detachable devices. One skilled in the art will appreciate that the touch interactions between the stylus pen 106 and the user interface 104 do not require the physical interaction of a portion of the stylus pen 106 and the surface of the user interface 104 , and may also include interactions where the stylus pen 106 is moved over the surface of the user interface 104 without touching the surface (e.g., active stylus pen discussed below). -
FIG. 2 schematically illustrates a system diagram showing a simplified view of the control elements of a host device 102 , and a simplified system diagram of the control elements of a stylus pen 106 . The host device 102 typically has at least some minimum computational capability, touch sensing capability and/or visual display capability. The host device 102 includes processing units 201 that may include, but are not limited to, one or more processing units 210 , a memory unit 211 , a touch sensing unit 212 , a display unit 213 and a communications unit 214 . The touch sensing unit 212 may utilize resistive, capacitive (e.g., absolute sensing or mutual capacitance sensing), acoustic or other similar sensing and signal processing components, which are known in the art, to sense the input received from a user at the user interface 104 . The touch sensing unit 212 may be disposed within and/or coupled to the user interface 104 in the host device 102 . The display unit 213 may include various components that are able to display and/or visually render information provided to it by the one or more processing units 210 and memory 211 . The display unit 213 may include any type of visual interface that includes light emitting diode (LED), organic LED (OLED), liquid crystal display (LCD), plasma, electroluminescence (EL), or other similar conventional display technology. The communications unit 214 will generally include one or more components that are configured to transmit and receive information via a communication link 205 between the host device 102 , the stylus pen 106 and other possible peripheral devices via a desirable communication method. A desirable communication method may include a wired or wireless communication method, such as a Bluetooth low energy (BTLE) communication method, Bluetooth classic, WiFi, WiFi direct, near-field communication (NFC) or other similar communication method. 
The memory unit 211 generally contains computer readable media that can be accessed by the host device 102 and may include both volatile and nonvolatile media for storage of information, such as computer-readable or computer-executable instructions, data, programs and/or other data. Memory 211 may include computer or machine readable media or storage devices such as DVDs, CDs, floppy disks, tape drives, hard drives, optical drives, solid state memory devices, RAM, ROM, flash memory or any other device which can be used to store the desired information. - To allow the
host device 102 to discriminate between the various inputs received from the user, the device should have sufficient computational capability and system memory to enable basic computational operations. As illustrated by FIG. 2 , the computational capability can be provided by one or more processing unit(s) 210 that are in communication with system memory 211 . The processing unit(s) 210 may include conventional central processing units (CPUs), which include graphical processing units (GPUs) and other useful elements to control the various display, touch, communication and other units in the host device 102 . The processing unit(s) 210 may also include or be in communication with a host clock 215 , which may be a simple IC or similar component that aids in the analysis and synchronization of data transferred between components in the host device and/or data transferred between the host device 102 and other connected wired and wireless network components (e.g., stylus pen 106 ). - In some embodiments, the
stylus pen 106 may have one or more active regions that are able to collect additional information about the user's interaction with the host device 102 . In one example, the one or more active regions may include an active tip of the stylus pen 106 that is positioned so that the user will cause this region of the stylus pen 106 to interact with the host device 102 . The active tip of the stylus pen 106 may contain sensors that are able to measure some aspect of the interaction of the active tip and the host device 102 . As schematically depicted in FIG. 2 , the stylus pen 106 may include a pen tip 106 a, a pressure sensing unit 106 b, a processor 106 c, a communications unit 106 d, a memory unit 106 e, a power source 106 f and a pen clock 106 g. In some embodiments, the stylus pen 106 may further comprise one or more additional sensors (not shown in FIG. 2 ), such as one or both of a gyroscope and an accelerometer. - Referring back to
FIG. 2 , the pen tip 106 a is configured to make contact with the user interface 104 of the host device 102 . The pressure exerted at the pen tip 106 a is dependent on the user's interaction with the stylus pen 106 . - The
pressure sensing unit 106 b is capable of detecting the amount of pressure applied to the pen tip 106 a of the stylus pen 106 by the user. Pressure data corresponding to the amount of pressure exerted by the user against the user interface 104 of the host device 102 is measured by the pressure sensing unit 106 b . The pressure data can include data from a binary switch, or another device that is able to discern between 8, 16, 32, 64, or any other desirable number of pressure levels, so that the generated pressure data is useful for the control of the host device 102 . In embodiments of the invention, different pressure levels can be used for different host devices 102 , such that a stylus pen interaction will only be registered by the host device 102 when a threshold pressure level is detected. In some embodiments, the pressure data sensed by the pressure sensing unit 106 b may also include an analog measurement of the pressure applied, and thus the generated pressure data supplied to the host device 102 may vary continuously across a desired range. - The
processor 106 c can be configured to control the operation of the stylus pen 106 . The stylus pen 106 may be comprised of one or more processors to control various aspects of the operation of the stylus pen 106 . The processor 106 c may also include or be in communication with a stylus pen clock 106 g , which may be a simple IC or similar component that aids in the analysis and synchronization of data transferred between components in the stylus pen 106 and/or data transferred between the stylus pen 106 and other wired and wireless network components (e.g., host device 102 ). In one embodiment, the stylus pen clock 106 g is set at a speed that is at least as fast as the speed that a clock (e.g., host clock 215 ) in the host device 102 is running at, to facilitate the timing of the delivery of communication signals from the communications unit 214 . In general, it is desirable for the stylus pen clock 106 g to be at least as accurate as the host clock 215 to assure that the time stamps applied to the touch data information generated by the stylus pen 106 and the host device 102 do not appreciably drift relative to one another. Clocks that have appreciably different accuracies (e.g., frequency error rates) from one another will affect the accuracy and usefulness of the time stamp information that is transferred between the stylus pen 106 and the host device 102 . As discussed herein, the time stamp information provided by both the stylus pen 106 and the host device 102 can be used together to help differentiate the type of user input based on its timing relative to other touch events. In one example, the stylus pen clock 106 g has a frequency error of less than about 50 parts per million (ppm), such as an accuracy of 30 to 50 ppm. - The
communications unit 106 d is capable of transmitting the pressure data from the stylus pen 106 to the communications unit 214 of the host device 102 when stylus pen interactions are made against the user interface 104 of the host device 102. In some embodiments of the invention, the communications unit 106 d transmits the interaction data via a desirable wireless communication method, such as a Bluetooth low energy (BTLE) communication method. Other embodiments include other appropriate communications device components for transmitting interaction data between the stylus pen 106 and the host device 102. Interaction data supplied by the stylus pen 106 can comprise the pressure data, timing data, and/or orientation data generated from gyroscopes and/or accelerometers or the like in the stylus pen 106. In some embodiments, the communications unit 106 d may only transmit the pressure data once a threshold pressure level has been detected by the pressure sensing unit 106 b. In other embodiments, the communications unit 106 d may transmit the pressure data from the stylus pen 106 once any pressure is detected, regardless of the pressure level detected by the pressure sensing unit 106 b. - The
memory unit 106 e is capable of storing data related to the stylus pen 106 and data related to the host device 102, such as device settings and host clock 215 and stylus pen clock 106 g information. For example, the memory unit 106 e may store data related to the linking association between the stylus pen 106 and the host device 102. - The
power source 106 f is capable of providing power to the stylus pen 106. The power source 106 f may be a built-in battery inside the stylus pen 106. The power source 106 f can be electrically coupled to one or more of the components within the stylus pen 106 in order to supply electrical power to the stylus pen 106. - As noted above, some embodiments of the
stylus pen 106 may include one or more of a gyroscope, an accelerometer, or the like. A gyroscope is a device configured to measure the orientation of the stylus pen 106 and operates based on the principle of the conservation of angular momentum. In certain embodiments, one or more gyroscopes are micro-electromechanical (MEMS) devices configured to detect a certain rotation of the stylus pen 106. To illustrate, the stylus pen 106 can be configured to send orientation data from a gyroscope contained within the stylus pen 106. This orientation data can be used in conjunction with the timing and pressure data communicated from the stylus pen 106 to the host device 102. In certain embodiments, the accelerometers are electromechanical devices (e.g., micro-electromechanical systems (MEMS) devices) configured to measure acceleration forces (e.g., static and dynamic forces). One or more accelerometers can be used to detect three-dimensional (3D) positioning. For example, 3D tracking can utilize a three-axis accelerometer or two two-axis accelerometers. According to some embodiments, the stylus pen 106 may utilize a 3-axis accelerometer to detect the movement of the stylus pen 106 in relation to the user interface 104 of the host device 102. -
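The pressure-level behavior described earlier (a binary switch, or 8 to 64 quantized levels, with a per-device registration threshold) can be illustrated with a short sketch. This is not code from the disclosure; the function names, the normalized input range and the threshold value are assumptions for illustration only.

```python
# Hypothetical sketch of quantized pressure levels and threshold gating.
# Assumes the pressure sensing unit reports a normalized reading in [0.0, 1.0].

def quantize_pressure(raw, levels=64):
    """Map a normalized analog reading to one of `levels` discrete steps."""
    raw = min(max(raw, 0.0), 1.0)
    return min(int(raw * levels), levels - 1)

def registers_interaction(raw, threshold=0.05):
    """Only register a stylus pen interaction once a threshold pressure is seen."""
    return raw >= threshold

print(quantize_pressure(0.5))        # mid-range reading -> level 32 of 0..63
print(registers_interaction(0.01))   # light graze below threshold -> False
```

A binary-switch tip corresponds to the degenerate case `levels=2`, while an analog sensor supplying continuously varying pressure data would skip the quantization entirely.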
FIG. 3A illustrates a simplified block diagram of a user input discrimination architecture 300 that comprises computer executable instructions and supporting hardware and software elements that are used to distinguish between the different types of user inputs received by the host device 102. In some embodiments, the system and methods described herein may provide a user input discrimination architecture 300 that includes a controlling engine 340 that receives various user input information and uses the received user input information to distinguish between the different types of user touch inputs received by the user interface 104 of the host device 102. In some configurations, the controlling engine 340 comprises computer executable instructions that are stored in the memory 211 of the host device 102, and are run in the background of the host device 102 by use of the processing units 201 of the host device 102. - The user inputs received by the controlling
engine 340 may include a user touch related input 331, a stylus pen input 335 and/or a host input 333. While not intending to be limiting as to the scope of the invention described herein, the host input 333, which is delivered from the host signal processing unit 332 of the processing unit 210 to the controlling engine 340, may include the user touch related input 331 received from the user's physical touch input 330 received by the user interface 104, the stylus pen input 335 and other useful information relating to the control of the host device 102 collected by the processing unit 210. However, in some configurations of the host device 102, the host signal processing unit 332 is not a separate component within the host device 102, and may be formed within, and controlled by, the components used to provide the user's physical touch input 330 or even the controlling engine 340. - After receiving and analyzing the received user information, the controlling
engine 340 may then deliver output data 350 that is used in the control of various software and hardware running on the host device 102. In one example, the output data 350 may be used by one or more third party applications and/or components in the host device 102 to perform some useful display or data output functions. In another example, the output data 350 may be used by the host device 102 to control some aspect of a software program running on the host device 102, to generate an image on a display in the host device 102 and/or process some data that is stored in the host device 102. - In general, the user touch
related input 331 includes the user's physical touch input 330, which may include the interaction of a finger, an appendage and the physical interaction of the stylus pen 106 with the touch sensitive portion of the user interface 104. Typically, the touch related input 331 is processed by the host signal processing unit 332, such as by capacitive sensing signal processing, before it is delivered to and then used by the controlling engine 340. - The user's input delivered to the
host device 102, as illustrated in FIG. 2 , may also include configurational inputs 339 that are delivered from the user and/or stylus pen 106 to the controlling engine 340. The configurational inputs 339 may include information about the user or stylus pen 106 that will help the user input discrimination architecture 300 distinguish between the different types of user touch input 331 information (e.g., information relating to touch input from a stylus pen, finger or appendage) received by the host device 102. The configurational inputs 339 may include whether the user is right-handed or left-handed, information about the host device 102, Bluetooth pairing information or other useful information about the user, stylus pen or controlling engine configuration. - The
stylus pen input 335 generally includes user input information received by components in the stylus pen 106 that can be transferred via wired or wireless communication methods to the host device 102. The stylus pen input 335 may comprise the pressure data, timing data, and/or orientation data generated by the pressure sensing unit 106 b or other sensors found in the stylus pen 106 (e.g., gyroscopes, accelerometers, etc.), such as the touch signal generating device 106 h which is discussed further below. In one embodiment, the stylus pen input 335 may be transmitted via a wireless communication link to the communications unit 214 of the host device 102 using a desirable wired or wireless communication technique, such as a Bluetooth low energy (BTLE) communication protocol, and then is delivered to the controlling engine 340. Typically, the stylus pen input 335 is processed by the host signal processing unit 332 in the host device 102 using wired or wireless communication protocols (e.g., BTLE protocols) before it is delivered to the controlling engine 340 via the host signal processing unit 332. - The
host input 333 generally includes various sets of synchronous and/or asynchronous data that are received by the host device 102 from the stylus pen 106 and/or created by the user's physical touch input 330 received from the user. The host input 333, which is provided to the controlling engine 340, may include user touch input 331 generated by the touch sensing unit 212 and the stylus pen input 335 data provided by the stylus pen 106 to the communications unit 214 and host signal processing unit 332. In one example, the touch related input 331 data is delivered to the controlling engine 340 separately (i.e., input 333A) from the stylus pen input 335 data (e.g., input 333B). The separate host inputs 333A and 333B may not be transferred on separate physical elements to the controlling engine 340, but are shown herein separately to schematically illustrate the different types of data being delivered between the host device 102 and the controlling engine 340. In some embodiments, the communications unit 214 processes the transmitted stylus pen input 335 received from the stylus pen 106 via the communication link 205 before it is delivered to the controlling engine 340. - As briefly discussed above, the controlling
engine 340 generally includes one or more executable programs or program related tasks that are used to create the output data 350, which is used by the controlling engine 340, software running on the host device 102 and/or one or more hardware components of the host device 102 to perform some useful function. The controlling engine 340 may comprise one or more input discrimination techniques 345 that are used separately or in combination to generate useful and reliable output data 350. The one or more input discrimination techniques 345 take in the various different types of inputs (e.g., inputs 333A and 333B) received by the host device 102 and try to distinguish the different types of user inputs from one another, so that errors in the proper selection of an inputting element, such as a finger, stylus pen and/or appendage, will be eliminated or made less likely to occur. The one or more input discrimination techniques 345 are thus used to distinguish the different types of user inputs from one another and provide a desired "input label" or "element label" for each type of user input so that they can be correctly used by the one or more third party applications and/or components used in the host device 102. In one embodiment, the one or more input discrimination techniques 345 include a time based discrimination technique 341, a geometric shape discrimination technique 342 and/or an inference based discrimination technique 343 that are used separately or in combination to generate useful and reliable output data 350 that can be used by the software and/or hardware running on the host device 102. In some configurations, the one or more input discrimination techniques 345 include a plurality of time based discrimination techniques, geometric shape discrimination techniques and/or inference based discrimination techniques. -
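As a rough sketch of the "element label" idea, each discrimination technique can be thought of as voting a label for each touch data point, with the controlling engine reconciling disagreements. The enum values and the majority-vote reconciliation below are illustrative assumptions only; the disclosure assigns reconciliation to an inference based discrimination technique rather than specifying any particular rule.

```python
# Illustrative sketch only: each technique votes an "element label" per
# touch data point and disagreements are reconciled (here, by majority).

from enum import Enum

class ElementLabel(Enum):
    STYLUS_PEN = "stylus pen"
    FINGER = "finger"
    APPENDAGE = "appendage"

def reconcile(votes):
    """Resolve disagreement between discrimination techniques by majority vote."""
    return max(set(votes), key=votes.count)

votes = [ElementLabel.STYLUS_PEN, ElementLabel.STYLUS_PEN, ElementLabel.FINGER]
print(reconcile(votes).value)  # -> stylus pen
```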
FIG. 3B is a flowchart illustrating a method 390 of discriminating between finger and appendage touch interactions and the physical stylus pen interactions with the host device 102 using one or more input discrimination techniques 345. The method 390 optionally starts with the delivery, storage in memory and/or recall of configurational inputs 339 by the controlling engine 340, as shown as step 391. As noted above, the configurational inputs 339 may include information about the user and/or stylus pen 106 that is useful for the discrimination of a finger or appendage touch interaction from the physical stylus pen interaction. - Next, at
step 392, a stylus pen input 335, which is created when the stylus pen 106 is brought into contact with the user interface 104, is transferred via a wired or wireless communication technique to the host device 102 and controlling engine 340. The receipt of the stylus pen input 335 is also referred to herein as a "touch-down event." A "touch-down event" may be created from a single interaction or each time a user reengages the stylus pen 106 with the user interface 104 during a writing, drawing or other similar stylus pen 106 user input interaction with the user interface 104. In some embodiments, the controlling engine 340 will ignore the received user touch related input 331 data until it has received touch-down event information. In some embodiments, touch-down events do not require the physical contact of a portion of the handheld device with the surface of the user interface 104, but may also include sensed interactions where the stylus pen is moved over the surface of the user interface 104 without touching the surface, for example, by use of an active pen tip, which is discussed below. - Next, at
step 393, once the stylus pen input 335 is received, a timing window of a desired length is created around the receipt of the stylus pen input 335 (e.g., touch-down event) in time, so that all of the user touch related inputs 331 can be collected for analysis by the controlling engine 340 to determine which of the touch inputs were received from the stylus pen, finger(s) or user's appendage. In one example, the timing window includes a time period of about 30 ms on either side of a received touch-down event. The timing window will include all user data received by and stored in memory 211 of the host device 102 in a first time period prior to the receipt of a stylus pen input 335 and a second time period after the receipt of a stylus pen input 335. The length of the timing window (e.g., first time period plus the second time period) may be adjusted so that any external noise received by the host device 102 does not adversely affect the discrimination process performed by the controlling engine 340, while also assuring that all of the user touch related input 331 data that is associated with the stylus pen input 335 is captured. The length of the timing window will depend on the sampling frequency of the touch sensitive portion of the host device 102, the communication speed between the stylus pen 106 and host device 102 and the processing speed of the controlling engine 340. In one example, the stylus pen's generated data (e.g., pressure data generated by the pressure sensing unit 106 b) is sampled at about a 1 millisecond (ms) rate and data is communicated between the stylus pen 106 and host device 102 at about a 30 ms rate.
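The timing-window collection in step 393 can be sketched as follows. The 30 ms half-width follows the example above; the event representation and the function name are assumptions made for illustration.

```python
# Sketch of step 393: gather every stored touch data point whose timestamp
# falls within +/- half_width_ms of the received touch-down event time.

def events_in_window(touch_events, touchdown_ms, half_width_ms=30):
    """Return the touch events inside the timing window around the touch-down."""
    return [e for e in touch_events
            if abs(e["t_ms"] - touchdown_ms) <= half_width_ms]

stored = [
    {"source": "palm",   "t_ms": 940},   # 60 ms early: outside the window
    {"source": "stylus", "t_ms": 1005},  # 5 ms late: inside the window
    {"source": "finger", "t_ms": 1090},  # 90 ms late: outside the window
]
print(events_in_window(stored, touchdown_ms=1000))
# -> [{'source': 'stylus', 't_ms': 1005}]
```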
Once a touch-down event has occurred and at least one of the touch data points received by the user interface 104 has been associated by the one or more input discrimination techniques as being a physical stylus pen touch point, the controlling engine 340 may continue to track and provide user input discrimination results via the generation and delivery of the output data 350. - In general, it is desirable for the accuracy of the
stylus pen clock 106 g to be at least as accurate as the host clock 215 to assure that the time stamps applied to the touch data information generated by the stylus pen 106 and host device 102 do not appreciably drift relative to one another over time. Clock speeds in the stylus pen 106 and host device 102 that appreciably vary from one another will affect the relative accuracy of the time stamp information that is compared by the controlling engine to determine whether a user input can be attributed to a stylus pen, finger or user appendage. As discussed herein, the time stamp information may be used in some embodiments described herein to help differentiate the type of user input based on its timing relative to other touch events. In one example, the stylus pen clock 106 g has a frequency error of less than about 50 parts per million (ppm), such as an accuracy of 30 to 50 ppm. Therefore, the use of a stylus pen clock 106 g that has an accuracy that is at least as good as the host clock 215 can help reduce the error in the detection and analysis of the user input. While the data transfer rate between the stylus pen 106 and the host device 102 is much slower than the touch data collection rate used by the components in the stylus pen 106 and host device 102, this will not affect the ability of the controlling engine 340 to determine the type of user input, since the use of accurate time stamp information in the data transferred between devices will prevent the slow data transfer rate from affecting the usefulness of the created touch data analyzed by the controlling engine. - In one embodiment of
step 393, the controlling engine 340 creates a timing window of a desired length around the receipt of a first stylus pen input 335 (e.g., touch-down event) based on a first report received at a first time via the communication link 205 created between the stylus pen 106 and the host device 102. The controlling engine 340 then determines which touch data events fall within the first timing window and then notes that these touch data events are likely to be from a stylus pen 106. However, the number of touch data events that fall within a timing window can be larger than the number of actual touch data event(s) that are associated with the stylus pen 106. Therefore, to confirm or refute which touch data events are associated with the stylus pen 106, when the last report of this sequence sent by the pen is received by the controlling engine 340, the controlling engine will compare the touch data events found in this timing window with the touch data events found in the first timing window to determine which touch data events also stopped (touch take-off (e.g., pen removed from interface)) in this window. Thus, touch data events that do not fit within these requirements are likely not related to the stylus pen, and touch data event(s) that are in both windows are more likely to have originated from the stylus pen 106. In one example, the first report is generated when the stylus pen lands on the user interface 104, a few reports are then generated as long as the pen is pressed on the host device, and the last report is generated when the stylus pen 106 is removed from the user interface 104, and thus the controlling engine 340 is used to determine which of the touch events was associated with the stylus pen. - At
step 394, the controlling engine 340 utilizes one or more of the input discrimination techniques 345 to discriminate between the touch interactions supplied by the stylus pen, a finger or user's appendage. One or more of the input discrimination techniques, such as time based discrimination techniques, geometric shape discrimination techniques or inference based discrimination techniques, which are discussed further below, perform an analysis of the touch-down event information and touch event information received in steps 392-393 to help distinguish between the sources of the different touch event interactions received by the user interface 104. The controlling engine 340 may also utilize the configurational input 339 data received at step 391 during this step to help classify and further analyze the other received data. The analyses performed by the different input discrimination techniques 345, such as the analysis steps 394A-394C, utilize various different rules that are found within the software instructions that form at least part of the controlling engine 340. A discussion of some of the types of rules for each of the different types of input discrimination techniques 345 can be found below. - Next, at step 395, after performing the various analyses of the received and collected data, each of the one or more
input discrimination techniques 345 is used to create and apply a "user input type" label, also referred to herein as an "element label," to each of the touch data points for each touch event. The process of providing a "user input type" label generally includes the process of attributing each of the touch data points to a particular user's touch input, such as the physical input from the stylus pen, finger or appendage to the user interface 104. To reconcile any differences in the element labels given to each of the touch data points by the different input discrimination techniques, the element labels may be further analyzed by the controlling engine 340. In one embodiment of step 394 or 395, an inference based discrimination technique 343 ( FIG. 3A ) may be used to reconcile the differences between the element labels created by each of the input discrimination techniques used in step 394, which is further described below. - At
step 396, each of the element labels for each of the touch data points is either further analyzed by the controlling engine 340 to reconcile differences between the element labels created by each of the input discrimination techniques, or each of the different element labels is transferred within the output data 350, so that they can be used by the software and/or hardware running on the host device 102. The output data 350 may include the positional information (e.g., touch points) and timing information for only the relevant interacting components, such as a stylus pen 106 and a finger, and not the interaction of a user's appendage, by use of one or more input discrimination techniques 345. - After
step 396 has been performed, steps 392-396 can then be repeated continually, while the stylus pen 106 is interacting with the user interface 104 or each time a touch-down event occurs, to provide user input discrimination results via the generation and delivery of the output data 350. -
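The two-window confirmation described in the method above (comparing the touch events aligned with the pen's first report, the touch-down, against those aligned with its last report, the touch take-off) can be sketched as follows. The track representation, names and numeric values are assumptions for illustration, not taken from the disclosure.

```python
# Sketch: keep only the touch tracks that both started near the pen's
# first report (touch-down) and stopped near its last report (take-off).

def likely_stylus_tracks(tracks, down_ms, up_ms, half_width_ms=30):
    """Tracks whose start and end both align with the pen's reported events."""
    return [t for t in tracks
            if abs(t["start_ms"] - down_ms) <= half_width_ms
            and abs(t["end_ms"] - up_ms) <= half_width_ms]

tracks = [
    {"source": "palm",   "start_ms": 900,  "end_ms": 2500},  # fails both windows
    {"source": "stylus", "start_ms": 1010, "end_ms": 2010},  # fits both windows
    {"source": "finger", "start_ms": 1015, "end_ms": 1500},  # fails take-off window
]
print([t["source"] for t in likely_stylus_tracks(tracks, down_ms=1000, up_ms=2000)])
# -> ['stylus']
```

Note how the finger track falls inside the touch-down window but is rejected by the take-off window, which is exactly the refinement the last pen report provides.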
FIGS. 3C-3D illustrate an example of the various user input information that may be received by the controlling engine 340 and the output data 350 results that may be generated by the controlling engine 340 using the steps provided in method 390, according to an embodiment of the invention described herein. FIG. 3C illustrates an example of data 370 that is received by the controlling engine 340, due to the interaction of a stylus pen 106, finger or user's appendage (e.g., palm) with the host device 102 as a function of time. FIG. 3D graphically illustrates at least a portion of the output data 350 generated by the controlling engine 340 (e.g., data 380), due to the interaction of a stylus pen 106, finger or user's appendage (e.g., palm) with the host device 102 as a function of time. - Referring to the example of
FIG. 3C , at time T0 the touch sensing component of the host device 102 ( FIG. 2 ) receives interaction data 371 created by the interaction of an appendage (e.g., palm) with the host device 102. Next, at time T1 the touch sensing component of the host device 102 also receives interaction data 372 created by the interaction of a stylus pen with the host device 102 (e.g., touch event). Next, at time T2 the host device 102 also receives stylus pen input 335 data, or interaction data 373 (e.g., touch-down event). At time T3 the touch sensing component of the host device 102 also receives interaction data 374 created by the interaction of a finger with the host device 102, and then at time T4 the interaction of a finger with the host device 102 ends, thus causing the interaction data 374 to end. In this example, once the stylus pen input 335 is received by the controlling engine 340 at time T2, a timing window having a desired length is created so that the stored user input received between a time before T0 and time T2 and the user input received between times T2 and a time after T4 can be characterized and useful output data 350 can be created. In some embodiments, the interaction data 371-374 received by the controlling engine 340 at any instant in time includes the coordinates of a touch data point and its timing information. One will note that the interaction data 371-374 includes the input data received over a period of time for each specific interacting element, and thus may contain many different touch data points that are in different coordinate positions on the user interface at different times. While FIG. 3C illustrates an example of various different types of interacting elements (e.g., stylus pen, finger, appendage) and a specific example of the timing of the interaction of these interacting elements with the host device 102, this example is not intended to be limiting, and is only added herein as a way to describe one or more aspects of the invention described herein. -
FIG. 3D illustrates at least a portion of the output data 350 created by the controlling engine 340 using the one or more input discrimination techniques 345, based on the received interaction data illustrated in FIG. 3C . At time TA, which is at a time between time T0 and time T1, the controlling engine 340 has received a small amount of the received interaction data 371 created by the user. At time TA, at least one of the one or more input discrimination techniques 345 used in step 394 by the controlling engine 340 is used to create and apply a user input type label to the interaction data 371, based on the input data received by the controlling engine 340 by time TA. In general, as noted above, the input data may include the user touch related input 331, stylus pen input 335, host input 333 and configurational inputs 339. In some configurations, where the controlling engine 340 does not have enough data to decide what type of user input is being applied to the host device 102, it may be desirable to make an initial guess (e.g., finger, stylus pen and/or appendage) and then later correct the user input label as more data is acquired about the received user input. Therefore, in one example, the user input type label for the interaction data 371 is defined to be an "appendage" versus a "finger" or "stylus pen." Therefore, the output data 350 created at time TA includes the current positional information, current timing information and "appendage" element label for the interaction data 371. In some embodiments, any interaction data that is not given a stylus pen or finger type of element label is excluded from the output data 350 provided from the controlling engine 340, and thus no output data 350 is transferred for the interaction data 371 at time TA, as illustrated in FIG. 3D as a dashed line. - Next, at time TB, which is at a time between time T1 and time T2, the controlling
engine 340 has received data regarding a user input that is creating the interaction data 371 and a new user input that is creating the interaction data 372. At time TB, the one or more input discrimination techniques 345 of the controlling engine 340 are used to create and apply a user input type label to the interaction data 372 (e.g., step 394), based on the input data received by the controlling engine 340 by time TB. In this example, the user input type label for the interaction data 372 is initially defined as a "finger" based on the one or more input discrimination techniques 345. Typically, the controlling engine 340 continually collects the interaction data and reevaluates the applied user input type labels as additional input data is received. - Next, at time TC, which is at a time between time T2 and time T3, the controlling
engine 340 has received data regarding the user inputs that are creating the interaction data 371 and 372, as well as a new user input that is creating the interaction data 373. As noted above, the interaction data 373 comprises stylus pen input 335 data created by one or more sensors found in the stylus pen 106. In this example, the interaction data 373 is generated due to a user initiated pen tip 106 a touch event that actually occurred at time T1. However, the delivery of the interaction data 373 to the controlling engine 340 has been delayed from the interaction data 372 received by the stylus pen's interaction with the touch sensing unit of the host device 102 by a signal delay time 375 ( FIG. 3C ). The signal delay time may be created by communication processing timing delays, differences in the clocks of the stylus pen 106 and host device 102 and/or communication/timing errors created within the stylus pen 106 or the host device 102. The data delivered in the transferred interaction data 373 may be generated by the pressure sensing unit 106 b and then transferred to the communications unit 214 of the host device 102 through the communications unit 106 d of the stylus pen 106. At time TC, the one or more input discrimination techniques 345 of the controlling engine 340 are used to create and apply a user input type label to the interaction data 373, based on the input data received by the controlling engine 340 by time TC. In this example, the interaction data 372 is given a "stylus pen" element label based on the one or more input discrimination techniques 345, which is an adjustment from the initial element label given to the interaction data 372. The output data 350 provided to the hardware or other software running on the host device 102 at time TC will thus contain the stylus pen 106's positional and timing data associated with the interaction data 372 and the stylus pen's pressure data, stylus pen related timing data, and/or stylus pen orientation data associated with the interaction data 373, while the "appendage" related data found in the interaction data 371 is still being excluded.
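A back-of-the-envelope calculation shows why the clock accuracies discussed earlier matter when interpreting a delay such as the signal delay time 375. The function and values below are illustrative assumptions, not figures from the disclosure.

```python
# Worst-case divergence between two clocks whose frequency errors point
# in opposite directions, expressed in milliseconds.

def worst_case_drift_ms(elapsed_s, pen_ppm, host_ppm):
    """Maximum timestamp divergence after elapsed_s seconds of free running."""
    return elapsed_s * (pen_ppm + host_ppm) * 1e-6 * 1000.0

# Two 50 ppm clocks over a 10-minute session can diverge by roughly 60 ms,
# i.e., twice the 30 ms timing-window half-width used in the example above,
# which is why the drift must be corrected rather than ignored.
print(worst_case_drift_ms(600, 50, 50))
```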
It should be noted that the controlling engine 340 will still collect the interaction data for each interacting element and continually reevaluate the applied user input type labels as additional input data is received. - In general,
signal delay time 375 can be created by mechanical and electrical delays that are created during the collection and transmission of the information between the stylus pen 106 and the controlling engine 340 running in the host device 102, and also created by the controlling engine, which may not be synchronized with the wired or wireless communication arrival (e.g., BTLE information). Delays may also be generated due to higher priority tasks being completed by the processing unit 210 and/or controlling engine 340, which may cause a delay in the analysis of the received touch data. In some examples, the mechanical delays may include delays created by inertia and/or friction in the pen tip 106 a and/or pressure sensing components in a pressure sensing unit 106 b of the stylus pen 106. In some examples, the electrical delays may result from the propagation delays created by one or more electrical components in the host device or stylus pen (e.g., low-pass filters (LPFs) and ADCs) and processing delays created due to the need to transmit and/or convert the data for transmission via a wireless transfer technique (e.g., BTLE) or use by one or more processing components in the stylus pen 106 or host device 102. In some embodiments, to prevent the signal delay time 375 ( FIG. 3C ) from causing mischaracterization of the user input data, it is desirable to encode any transferred data with a timestamp that is at least based on the clock of the device transmitting the desired information. In one example, the sampling rate of the sensing components in the user interface 104 may be running at a speed of about 16 milliseconds (ms) and the sampling rate of the components in the stylus pen is less than about 16 ms. In one example, the sampling rate of the data sampling components in the stylus pen is less than about 10 ms, such as between about 1 ms and about 10 ms.
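One way to use timestamps from both clocks, sketched below, is to maintain a running offset between the pen and host timelines and translate pen-side timestamps into host time before comparing them with touch data. The class and method names are assumptions made for illustration; the disclosure does not specify this implementation.

```python
# Illustrative sketch: track the host-minus-pen clock offset from each
# message carrying both timestamps, then map pen times onto the host
# timeline so stylus reports and touch data can be compared directly.

class ClockAligner:
    def __init__(self):
        self.offset_ms = 0.0  # host_time - pen_time, refreshed per message

    def update(self, pen_ms, host_ms):
        """Refresh the offset from the latest pair of transmitted timestamps."""
        self.offset_ms = host_ms - pen_ms

    def to_host_time(self, pen_ms):
        """Translate a pen-clock timestamp into the host-clock timeline."""
        return pen_ms + self.offset_ms

aligner = ClockAligner()
aligner.update(pen_ms=500.0, host_ms=10500.0)  # pen clock lags host by 10 s
print(aligner.to_host_time(650.0))             # -> 10650.0
```

Refreshing the offset on every message, rather than computing it once, is what absorbs the gradual drift between the two free-running clocks.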
The provided timestamp information is thus used to help better correlate the multiple sets of data that are received by the controlling engine 340, so that a reliable characterization of the user inputs can be made. - However, since the
host clock 215 and the stylus pen clock 106 g, which are used to generate and/or facilitate the transfer of data between the stylus pen 106 and host device 102, are generally not synchronized, and in some cases may be running at different speeds, errors in the characterization and processing of the received user input data are not reliably eliminated by use of a single timestamp. In one example, these errors may include errors in the proper selection of a user's input and can cause jitter in the display, which will ultimately annoy the user or cause significant disruption in the tasks that the user is performing on the computing device. It has been found that providing the timing data from both the stylus pen clock 106 g and the host clock 215 in the data transferred between the devices in either direction helps significantly reduce any error in the mischaracterization of the user input. In general, the controlling engine 340 and/or user input sensing program(s) being executed in the stylus pen 106 use both sets of timestamp information received in the transferred data to continually update the processes running in each device to account for any drift or difference in the timing found between the stylus pen clock 106 g and host clock 215. As noted above, the difference in the timing found between the stylus pen clock 106 g and host clock 215 will generally affect the analysis of the user input received by the controlling engine 340. Therefore, in one embodiment, all communications provided between the host device 102 and the stylus pen 106 will include the latest time information received from the stylus pen clock 106 g and the time information received from the host clock 215, so that the controlling engine 340 can continually correct for errors found between the stylus pen clock 106 g and the host clock 215. - Next, at time TD, which is at a time between time T3 and time T4, the controlling
engine 340 has received data regarding a user input that is creating the interaction data 374. At time TD, the one or more input discrimination techniques 345 of the controlling engine 340 are used to create and apply a user input type label to the interaction data 374, based on the input data received by the controlling engine 340 by time TD. In this example, the user input type label for the interaction data 374 is initially defined as a "finger" based on the one or more input discrimination techniques 345. It should be noted that the controlling engine 340 will still collect the interaction data. - Next, at time TE, which is at a time between time T4 and time T5, the controlling
engine 340 has received interaction data 371-373, while additional interaction data relating to the interaction data 374 has not been received after the time T4 was reached. At time TE, the one or more input discrimination techniques 345 continually reevaluate the user input type labels for the interaction data 371-373, based on the input data received by the controlling engine 340; however, the tracking and characterization of the user input relating to the interaction data 374 will generally be halted due to the removal of this user input. In some embodiments, all of the user's inputs will be continually tracked and characterized while they are interacting with the host device 102, and will be dropped from the output data 350 when their interaction with the host device 102 ends. In some cases, the controlling engine 340 may use the one or more input discrimination techniques 345 to track and provide user labels for user interactions that are suspended for less than a specified period of time, such as when a stylus pen 106 is lifted from the touch sensitive surface of the host device 102 for only a short time while the user writes, draws or inputs some different pieces of information on the host device 102. - Embodiments of the invention described herein may provide a system and method that analyzes the timing of the received user's input data to determine the source of the user input delivered in the user's physical touch input 330 (e.g., physical stylus pen, finger(s) and user appendage touch input) to the controlling
engine 340. During operation, the controlling engine 340 analyzes the timing of the user input data to determine the different types of received user's physical touch input 330, which is often referred to herein as the time based user input discrimination technique. In one example, the time based user input discrimination techniques can be used to determine if the received user input was created by a stylus pen, finger(s) or user's appendage by comparing the relative timing of the different user's physical touch input 330 events and the stylus pen input 335. The time based discrimination techniques used by the controlling engine 340 will generally compare the various received user input data as a function of time to help the controlling engine 340 discriminate between the interaction of a stylus pen, fingers or an appendage. The time based user input discrimination techniques discussed herein may be used alone or in combination with one or more of the other types of user input discrimination techniques discussed herein. -
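The dual-timestamp drift correction described above can be illustrated with a short sketch. This is a hedged, illustrative realization, not the patent's implementation: it assumes each message carries a (pen clock, host clock) timestamp pair and fits a least-squares line to recover the relative drift and offset between the two clocks.

```python
# Illustrative sketch: estimating pen-clock/host-clock skew from the
# dual timestamps carried in each transferred message. The least-squares
# approach and all names are assumptions, not taken from the patent.

def estimate_offset(samples):
    """samples: list of (pen_time, host_time) pairs from recent messages.
    Returns (drift, offset) of the best-fit line host = drift*pen + offset."""
    n = len(samples)
    sum_p = sum(p for p, _ in samples)
    sum_h = sum(h for _, h in samples)
    sum_pp = sum(p * p for p, _ in samples)
    sum_ph = sum(p * h for p, h in samples)
    drift = (n * sum_ph - sum_p * sum_h) / (n * sum_pp - sum_p * sum_p)
    offset = (sum_h - drift * sum_p) / n
    return drift, offset

def pen_to_host(pen_time, drift, offset):
    """Map a pen-clock timestamp onto the host clock's timeline."""
    return drift * pen_time + offset
```

With the mapping continually refreshed from the latest messages, pen-side events can be placed on the host's timeline before the timing comparison is made.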
FIG. 4 is a simplified flowchart illustrating a time based user input discrimination technique for discriminating touch interactions from physical stylus pen interactions on a touch-screen according to an embodiment of the invention. The method 400 can be performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general purpose computing system or a dedicated machine), firmware (embedded software), or any combination thereof that is contained in the host device 102 and/or the stylus pen 106. In one embodiment, the method 400 is performed by the controlling engine 340 that is running in the background of the host device 102 and/or by the data collection and transmission processes running on components in the stylus pen 106. - The method may include
step 402, in which the controlling engine 340 receives user input (e.g., user touch input 331 or stylus pen input 335) information related to a touch-down event on the host device 102. According to embodiments of the present invention, information related to a touch-down event is received from a handheld device. The handheld device may be an electronic stylus pen, such as a stylus pen 106, comprising a pressure sensor (e.g., pressure sensing unit 106 b) and a touch signal generating device 106 h that is configured to deliver stylus pen input 335 information to the host device 102. In some embodiments of the present invention, the electronic pen may also comprise at least one of an accelerometer or a gyroscope. - The information related to the touch-down event that is transferred to the controlling
engine 340 may comprise timing information, pressure data, and other data (e.g., accelerometer and/or gyroscope data) that is sent from the stylus pen 106 via the communication link 335 to the host device 102. For example, the information related to the touch-down event may include a first timestamp for the touch-down event, and may include information related to when the touch-down event occurred, how much pressure was applied in the touch-down event, and the length of time of the touch-down event. In another example, the information related to the touch-down event may include a pen clock timestamp derived from a pen clock signal received from the pen clock 106 g and a host clock timestamp derived from a host clock signal received from the host clock 215 for the touch-down event, and may include information related to when the touch-down event occurred, how much pressure was applied in the touch-down event, and the length of time of the touch-down event. - Next, at box 404, the method includes the controlling
engine 340 receiving information related to a touch event sensed by the host device 102. The touch event may be from the stylus pen 106 physically interacting with the user interface 104 of the host device 102, or from a touch interaction from the direct contact with the user interface 104 by a finger and/or appendage of the user. According to embodiments of the present invention, the information related to the touch event may comprise a second timestamp for the touch event, which may include timing information related to when the touch event occurred. In one embodiment, the second timestamp comprises a pen clock timestamp derived from a pen clock signal received from the pen clock 106 g and a host clock timestamp derived from a host clock signal received from the host clock 215 for the touch event. - According to embodiments of the present invention, the information related to the touch event and the information related to the touch-down event may be received by the device simultaneously or at different times.
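Once both timestamps are available, the timing comparison at the heart of the technique can be sketched in a few lines. This is a hedged illustration, not the patent's code; the 20 ms window is an assumed placeholder for the empirically predetermined threshold.

```python
# Illustrative sketch of the time-based discrimination step: a screen touch
# whose timestamp falls within a predetermined window of the pen's reported
# touch-down timestamp is attributed to the stylus; otherwise to a finger.
# The 0.020 s threshold is an assumption, not a value from the patent.

PEN_WINDOW_S = 0.020  # empirically predetermined threshold (illustrative)

def classify_touch(touch_down_ts, touch_ts, threshold=PEN_WINDOW_S):
    """Return 'stylus' when the touch event is close enough in time to the
    stylus pen's touch-down event, else 'finger'."""
    return "stylus" if abs(touch_ts - touch_down_ts) <= threshold else "finger"
```

The same comparison works whether the two pieces of information arrive simultaneously or at different times, provided both timestamps have been mapped onto a common clock.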
- Next, at
box 406, the method also includes correlating the information related to the touch-down event with the information related to the touch event. In one example, the controlling engine 340 correlates the information related to the touch-down event with the information related to the touch event. According to embodiments of the present invention, first time information for the touch-down event is correlated with second time information for the touch event. - Next, at
box 408, the method also includes determining whether the time delay between the first timestamp for the touch-down event and the second timestamp for the touch event is less than an empirically predetermined threshold. According to embodiments of the present invention, the controlling engine 340 determines whether the time delay is within a predetermined threshold. For example, if the time delay between the touch-down event and the touch event is greater than the predetermined threshold, this may indicate that the touch event is separate from the touch-down event, as illustrated in step 410. In that event, the controlling engine 340 would distinguish that touch event as not being associated with the stylus pen 106. The controlling engine 340 may distinguish the touch event as being associated with the user's finger. - Alternatively, if the time delay between the touch-down event and the touch event is equal to or less than the predetermined threshold, that may indicate that the touch event is associated with the touch-down event, as illustrated in
box 412. In that event, the host device 102 would register that touch event as being associated with the stylus pen 106 and not the user's finger or user's appendage. - Once the determination has been made, the controlling
engine 340 continues to monitor incoming touch-down event(s) and touch events, correlates the received data, and makes a determination as to whether the touch event is associated with the stylus pen 106, touch interactions by the user's fingers, or an appendage of the user. - It should be appreciated that the specific steps illustrated in
FIG. 4 provide a particular method 400 according to an embodiment of the present invention. Other sequences of steps may also be performed according to alternative embodiments. For example, alternative embodiments of the present invention may perform the steps outlined above in a different order. Moreover, the individual steps illustrated in FIG. 4 may include multiple sub-steps that may be performed in various sequences as appropriate to the individual step. Furthermore, additional steps may be added or removed depending on the particular application. One of ordinary skill in the art would recognize many variations, modifications, and alternatives. - Embodiments of the invention described herein may also provide a system and method that uses one or more geometric based user input discrimination techniques to distinguish between the different types of user's
physical touch input 330 information received from a stylus pen, finger or user's appendage by a touch sensitive device. The one or more input discrimination techniques may include a geometric shape discrimination technique that uses information relating to the relative position of multiple touch points supplied by the user to help discriminate between the interaction of a stylus pen, fingers or an appendage. The geometric based user input discrimination techniques discussed herein may be used separately or in combination with one or more of the other user input discrimination techniques to distinguish between the various different types of user inputs received by the computing device. In some embodiments, the use of one or more of the geometric based input discrimination techniques with one or more of the time based input discrimination techniques will help improve the accuracy of the user input discrimination beyond what either technique achieves on its own. - In some embodiments, as noted above, it is desirable for the host input 333 (
FIG. 3A ), which is provided to the controlling engine 340 via portions of the host device 102, to only include a simplified data set that just includes the coordinates (e.g., X and Y-direction coordinates) of each of the touch data points 556 and the time that the interaction occurred with the user interface 104. Typically, this simplified data set is a small fraction of the amount of data that is commonly collected by conventional touch sensitive handheld devices or touch sensitive display type computing devices. The creation and use of the simplified data set to discriminate between the interaction of a stylus pen, fingers or an appendage can reduce the required computing power of the host device and/or increase the speed of the computing device by reducing the computational power required to collect and transfer the touch interaction data. Alternatively, in some configurations, the controlling engine 340 does not have access to the actual user interaction data collected from the user interface 104 and is only fed a simplified data set from the host device 102. In this case, the controlling engine 340 must discriminate between the interaction of a stylus pen, fingers or an appendage based on the limited nature of the data supplied to it by the host device 102. -
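One way the simplified (x, y, time) data set could be grouped into candidate clusters is a greedy proximity clustering. This is a hedged sketch of the grouping idea only; the point format, the merging rule and the gap distance are assumptions, not details from the patent.

```python
# Illustrative sketch: grouping simplified touch data points into clusters
# by proximity (single-link, greedy). A palm typically produces a tight
# cluster of points, while a pen tip or finger yields isolated points.

def cluster_points(points, max_gap):
    """points: list of (x, y). Returns a list of clusters (lists of points);
    a point joins a cluster if it lies within max_gap of any member, and
    merges any clusters it bridges."""
    clusters = []
    for p in points:
        joined = None
        for c in clusters:
            if any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= max_gap ** 2
                   for q in c):
                if joined is None:
                    c.append(p)
                    joined = c
                else:
                    joined.extend(c)  # p bridges two clusters: merge them
                    c.clear()
        clusters = [c for c in clusters if c]  # drop emptied clusters
        if joined is None:
            clusters.append([p])
    return clusters
```

Because the input is only coordinate pairs, this kind of grouping stays within the limited simplified data set the controlling engine may be fed.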
FIG. 5A schematically illustrates a plurality of touch data points 556 that have been detected by the user interface 104 of the host device 102 and delivered to the controlling engine 340 for the discrimination of the various different types of user inputs. In general, the geometric shape discrimination technique 342 used by the controlling engine 340 includes a method of sorting and grouping the received touch data points 556 by their geometric location based on geometric rules coded into the controlling engine 340. Each of the touch data points 556 will be separately analyzed by the controlling engine 340 to determine if it can be associated with the stylus pen, finger or part of the user's appendage. The geometric shape discrimination technique 342 uses geometric rules to provide element labels to one or more clusters of touch points, since it is often likely that these clusters of touch points are related to a specific type of user input, such as a user's palm or finger. For example, if the controlling engine 340 knows that the user is right handed, based on information received from a configurational input 339 or by prior analysis of received touch data points 556, the controlling engine 340 can apply a rule that specifies that a stylus pen related touch point will be above and to the left of a group of touch points that are associated with a palm of the user. - To determine the likely type of user input a
touch data point 556 may be associated with, the controlling engine will generally use the current touch point data received by the user interface 104 and stored older touch data points that had been previously received by the controlling engine 340 (also referred to herein as "aging" touch data points). In one example, older touch data points, which had each been previously analyzed and characterized by the controlling engine 340, are used to help determine the type of user input that is associated with the currently received touch data point 556. Use of the older touch data points and their relationship to the new touch data points can improve the speed and accuracy with which the current touch data points can be associated with a type of user input. In some configurations, the older touch data points are retained, analyzed and/or used by the controlling engine 340 for only a short period of time before they are deemed not useful and are excluded from use. - In one embodiment, the controlling
engine 340 is configured to group the touch data points 556 into at least one of a pen region 571, an appendage region 561 or a finger region 581. Therefore, based on the position of each touch data point 556 in relation to other touch data points 556, the geometric shape discrimination technique 342 can determine that one or more touch points, at any instant in time, are likely to be associated with a stylus pen, finger or part of the user's appendage. The various regions defined by the controlling engine 340, such as the pen region 571, the appendage region 561 or the finger region 581, can be formed around clusters of touch data points that have been associated with the same type of user input. For example, the appendage region 561 illustrated in FIG. 5A contains a cluster of the touch data points 556 that have been associated with the user's appendage. - The controlling
engine 340 may also create and use a geometric boundary region 551 to help prioritize the analysis of the received touch data contained therein as being likely to contain useful user input data. In one example, the geometric boundary region 551 may include a region that includes a touch point that is associated with a stylus pen and one or more touch points that are associated with an appendage, since it is likely that a pen touch point will be near touch points that are associated with a palm of the user. In one embodiment, the geometric boundary includes all of the touch points supplied to the user interface 104 that have been received at an instant in time. The controlling engine 340 may use the position of the touch points within the geometric boundary to help decide what type of user input has been received. In one example, touch data points that are near the edges of the geometric boundary may be more likely to be from a stylus pen. - Referring to
FIG. 5A , in one example, the controlling engine 340 has applied the various geometric based rules and determined that a group of touch points is associated with an appendage region 561, a touch point is associated with a stylus pen (e.g., within the defined pen region 571) and a touch point is associated with a finger (e.g., within the defined finger region 581). In this example, the controlling engine 340 compares each of the currently received touch data points 556 with older touch data points 557 to determine the likely type of user input that has been received by the controlling engine 340. In this example, the controlling engine 340 thus may determine that the user's appendage has shifted down and to the left based on the comparison of the touch data points 556 that are found in the appendage region 561 with the older touch data points 557. This created geometric analysis data can then be used by other components in the host device 102, or, as is likely in this case, be excluded from the output data 350 set. - In an effort to provide a more accurate determination of the type of user input, it is often desirable to use the change in position of two or more older
touch data points 557 to predict the likely position of the next touch point (e.g., touch data point 556). Use of this predictive technique to help characterize the type of user input can reduce the errors created by incorrectly assigning an element label to a touch data point. In one embodiment, as illustrated in FIG. 5B , the geometric shape discrimination technique 342 creates and uses a predicted data point region 559 to help determine the element label for a newly received touch data point 556. In this example, a first touch data point 557 1 and a second touch data point 557 2 are used to form a predicted direction 558 for the next touch data point and the predicted data point region 559. The touch data point 556 in this example happens to fall within the predicted data point region 559, and thus would have a higher likelihood of being a continuation of this specific input received from the user, such as a touch data point input received from a stylus pen 106. The controlling engine 340 therefore takes into account the higher likelihood that a touch data point is of a certain type when it is assigning an element label to that touch data point. The controlling engine may adjust the size and shape of the predicted data point region 559 based on the speed of the user input, which is determined from the movement of the older touch data points 557. - In some configurations, the one or more geometric
shape discrimination techniques 342 compare the movement of a touch data point, or cluster of touch data points, with the movement of a touch data point that is associated with a stylus pen, to determine if this cluster of points may be associated with an appendage (e.g., palm) following the stylus pen. In one example, a touch point or cluster of touch data points that moves parallel to the direction of the movement of a touch data point that is associated with a stylus pen is labeled as being a palm. -
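The predicted-region check and the parallel-motion ("palm follows pen") check described above can be sketched together. This is a hedged illustration under assumed parameters: the base radius, the speed-dependent growth of the region and the cosine tolerance are all placeholders, not values from the patent.

```python
# Illustrative sketch of two geometric rules: (1) extrapolating two older
# touch data points to a predicted region for the next point, with the
# region growing with input speed, and (2) testing whether a cluster's
# motion is roughly parallel to the pen's motion, suggesting a palm.
import math

def predicted_region(p1, p2, base_radius=1.0, speed_gain=0.5):
    """p1, p2: consecutive older points (oldest first).
    Returns (center, radius) of the predicted data point region."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    speed = math.hypot(dx, dy)                  # displacement per sample
    center = (p2[0] + dx, p2[1] + dy)           # linear extrapolation
    return center, base_radius + speed_gain * speed

def in_region(point, center, radius):
    return math.hypot(point[0] - center[0], point[1] - center[1]) <= radius

def moves_parallel(v_pen, v_cluster, cos_tol=0.95):
    """True when a cluster's motion vector is nearly parallel to the pen's
    motion vector (cosine similarity above an assumed tolerance)."""
    n1, n2 = math.hypot(*v_pen), math.hypot(*v_cluster)
    if n1 == 0 or n2 == 0:
        return False
    dot = v_pen[0] * v_cluster[0] + v_pen[1] * v_cluster[1]
    return dot / (n1 * n2) >= cos_tol
```

A new point falling inside the predicted region would raise the likelihood that it continues the same input track, and a cluster moving parallel to the pen would raise the likelihood of a palm label.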
FIG. 5C is a simplified flowchart illustrating a method of discriminating interactions caused by an appendage, such as a palm of a user, on a touch-screen according to an embodiment of the invention. The method 520 is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general purpose computing system or a dedicated machine), firmware (embedded software), or any combination thereof that is contained in the host device 102 and/or the stylus pen 106. In one embodiment, the method 520 is performed by the controlling engine 340 that is running in the background of the host device 102 and/or by the data collection and transmission processes running on components in the stylus pen 106. - At
step 522, the method includes receiving information related to a touch-down event on the host device 102. The information related to the touch-down event that is transferred to the controlling engine 340 may comprise timing information, pressure data, and other data that is sent from the stylus pen 106 via the communication link 335 to the host device 102. For example, the information related to the touch-down event may comprise a first timestamp for the touch-down event, and may include information related to when the touch-down event occurred, how much pressure was applied in the touch-down event, and the length of time of the touch-down event. In another example, the information related to the touch-down event may comprise a pen clock timestamp derived from a pen clock signal received from the pen clock 106 g and a host clock timestamp derived from a host clock signal received from the host clock 215 for the touch-down event, and may include information related to when the touch-down event occurred, how much pressure was applied in the touch-down event, and the length of time of the touch-down event. According to embodiments of the present invention, the information related to the touch-down event is received from a handheld device, such as a stylus pen 106. - Next, at
box 524, the method also includes receiving information related to a plurality of touch events on the host device 102. The plurality of touch events may be from an appendage of the user, such as a palm of the user resting on the surface of the user interface 104 of the host device 102. The plurality of touch events may also be from the stylus pen 106 interacting with the user interface 104 of the host device 102, or from a touch interaction from the direct contact with the user interface 104 by a finger or appendage of the user. According to embodiments of the present invention, the information related to the touch event may comprise a second timestamp for the touch event, which may include timing information related to when the touch event occurred. In one embodiment, the second timestamp comprises a pen clock timestamp derived from a pen clock signal received from the pen clock 106 g and a host clock timestamp derived from a host clock signal received from the host clock 215 for the touch event. - Next, at
box 526, the method also includes determining or defining one or more clusters of touch events from the plurality of touch events provided to the controlling engine 340 from the stylus pen 106 and the host device 102. According to embodiments of the present invention, there may be touch events occurring from the finger of the user, from the stylus pen 106 interacting with the user interface 104 of the host device 102, or from an appendage of the user, such as a palm resting on the user interface 104 of the host device 102. - Next, at
box 528, the method also includes determining the movements of each one of the clusters of touch events. According to embodiments of the present invention, the controlling engine 340 in the host device 102 can determine the movements of each one of the clusters of touch events based on the interactions between each one of the clusters of touch events and the user interface 104 of the host device 102. For example, the controller in the host device 102 can determine the movement of each one of the clusters of touch events across the user interface 104 of the host device 102, or can determine that one or more of the clusters of touch events are stationary. According to embodiments of the present invention, determining one or more clusters of touch events from the plurality of touch events may comprise detecting the location of each touch event in the plurality of touch events and associating each touch event with one or more clusters of touch events. In some embodiments, associating each touch event with one or more clusters of touch events is based on relative distances between the touch events in the plurality of touch events. For example, touch events may be considered to belong to the same cluster of touch events when they are within a predetermined distance of each other. - Next, at
box 530, the method may also include correlating the information related to the touch-down event with the information related to the plurality of touch events. According to embodiments of the present invention, the touch-down event and the plurality of touch events may be correlated based on information received by the controlling engine 340 relating to the touch-down event and the plurality of touch events. The information includes, but is not limited to, timing information received from the stylus pen 106 or movement of the plurality of touch events detected by the controlling engine 340 of the host device 102. - Next, at
box 532, the method may also include determining whether the movement is below a threshold distance. In this step, the controlling engine 340 may determine whether the movement of each cluster of touch events is less than a predetermined threshold distance. For example, the predetermined threshold may be set to a numerical value specifying a particular distance of movement. If the controlling engine 340 determines that one of the clusters of touch events moves less than the predetermined threshold, the cluster of touch events may be determined to be associated with an appendage of a user, as illustrated in box 534, such as a palm, as it is more likely to have a smaller movement or lack of movement, since the palm of the user is resting on the user interface 104 of the host device 102. On the other hand, if the controlling engine 340 determines that one of the clusters of touch events moves more than the predetermined threshold, the cluster of touch events may be determined as being from a stylus pen or a finger, and thus not associated with the appendage of a user, as illustrated in box 536. In such cases, timing information may be used to determine whether the cluster of touch events is from a stylus pen 106 or from a finger of the user interacting with the user interface 104 of the host device 102. In one example, the one or more input discrimination techniques first try to determine whether an interaction is from a stylus pen and then try to decide whether the interaction is from a non-stylus pen source, or vice versa. - Once the determination has been made, the controlling
engine 340 continues to monitor incoming touch-down events and touch events, correlates the received data, makes a determination as to whether the touch event is associated with the stylus pen 106, touch interactions by the user's fingers, or a touch interaction caused by the user's appendage resting on the user interface 104 of the host device 102, and then generates the output data 350 that is provided to software and hardware components running on the host device 102. - It should be appreciated that the specific steps illustrated in
FIG. 5C provide a particular method 520 according to an embodiment of the present invention. Other sequences of steps may also be performed according to alternative embodiments. For example, alternative embodiments of the present invention may perform the steps outlined above in a different order. Moreover, the individual steps illustrated in FIG. 5C may include multiple sub-steps that may be performed in various sequences as appropriate to the individual step. Furthermore, additional steps may be added or removed depending on the particular application. One of ordinary skill in the art would recognize many variations, modifications, and alternatives. - Therefore, in some embodiments, the various geometric based rules applied by the controlling
engine 340 may include increasing the likelihood that a touch data point 556 that is positioned within a threshold distance of a previous touch data point that was characterized as a stylus pen is also a pen touch data point, that a touch data point 556 that is positioned outside a defined appendage region 561 is likely a stylus pen or a finger input, that a touch data point 556 that is positioned in the same general direction in which a "stylus pen" associated touch data point had moved is also likely a pen touch data point, and that a touch data point that has not moved over one or more touch sampling intervals is likely an appendage touch point. These geometric rule examples, and the types of user input used with these rule examples, are not intended to be limiting as to the scope of the invention described herein. - Embodiments of the invention described herein may also include a system and method that utilizes one or more inference based user input discrimination techniques to determine whether it is likely that a user input was received from a stylus pen, finger or appendage by a touch sensitive device. The inference based discrimination techniques used by the controlling engine will generally compare the received user input data with predefined and/or relevance-weighted rules to help discriminate between the interaction of a stylus pen, fingers or an appendage. The inference based user input discrimination techniques discussed herein may be used by themselves or in combination with one or more of the other types of user input discrimination techniques discussed herein to discern between the various different types of inputs received by the computing device.
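The movement test of boxes 532-536, together with the rule above that a point which has not moved over one or more sampling intervals is likely an appendage touch point, can be sketched as follows. This is a hedged illustration; the threshold value and the track representation are assumptions, not details from the patent.

```python
# Illustrative sketch of the movement-threshold rule: a cluster whose total
# travel across sampling intervals stays under a predetermined distance is
# treated as a resting appendage (palm); otherwise it remains a candidate
# pen or finger track. The 3.0 threshold is an assumed placeholder.
import math

def label_cluster(track, move_threshold=3.0):
    """track: chronological list of (x, y) centroids of one cluster of
    touch events over successive sampling intervals."""
    total = sum(math.hypot(track[i + 1][0] - track[i][0],
                           track[i + 1][1] - track[i][1])
                for i in range(len(track) - 1))
    return "appendage" if total < move_threshold else "pen_or_finger"
```

For clusters labeled "pen_or_finger", the timing information described earlier would then decide between the stylus pen and a finger.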
-
FIG. 6A illustrates a simplified flowchart of a method 600 of discriminating between different types of user inputs using an inference based discrimination technique. The inference based discrimination technique 343 (FIG. 3A ) of the controlling engine 340 takes in the various different types of inputs 601 received by the host device 102 at an instant in time, which are processed by the decision modules 602 1-602 N and delivered as inputs 604 1-604 N to the decision matrix 605. The decision matrix 605 then analyzes and compares the received inputs 604 1-604 N for each touch data point and other event data 603 to determine the most likely element label for each touch data point. The inputs 604 1-604 N may comprise a "vote" that includes an alphanumeric, numerical or other distinct label that distinguishes one type of user input from another. In one embodiment, the decision matrix 605 compares the received inputs 604 1-604 N for each touch data point by tabulating or summing the different votes for the user input type created by each of the decision modules 602 1-602 N. The decision matrix 605 of the controlling engine 340 then generates the output data 350 that is then used by the components and software found in the host device 102. In some embodiments, the decision matrix 605 uses threshold values, which are stored in memory, to assign the user input a desired element label. In one example, a touch data point is not assigned a stylus pen input element label unless it receives a certain minimum number of votes. - In some embodiments, the
decision matrix 605 determines an element label for a touch data point by first applying a weighting factor to each of the received inputs 604 1-604 N, and then compares the adjusted, or weighted, inputs to determine the element label for a given touch data point. The weighting factor may mean that a vote provided by a given decision module (e.g., decision module 602 1) carries more weight than a vote provided by another decision module (e.g., decision module 602 2), based on its ability to correctly characterize the type of user input. In one embodiment, the event type 603 includes other known information relating to the received touch point data, such as whether the data is being delivered from the user interface 104 or the stylus pen 106. - In one embodiment, each
decision module 602 1-602 N forms part of the controlling engine 340. In one embodiment, each of the decision modules 602 1-602 N comprises one or more coded software instructions that apply one or more defined rules that are used to characterize the type of user input from which a touch data point was derived. In one example, a decision module 602 1 characterizes a touch data point based on its relationship to a cluster of other touch data points as discussed in relation to FIG. 5A , a decision module 602 2 characterizes the touch data point based on its predicted position as discussed in relation to FIG. 5B , a decision module 602 3 characterizes the touch data point based on its movement being within a threshold value as discussed in relation to FIG. 5C , and a decision module 602 4 characterizes a touch data point based on the knowledge of an attribute of the user (e.g., right-handed). Then, each decision module 602 1-602 4 delivers its input 604 1-604 4 to the decision matrix 605, also referred to herein as its "vote" as to what type of user input the touch data point was created from. In one example, the inputs 604 1-604 4 each indicate whether the decision module believes that the touch data point is associated with a stylus pen, finger or user's appendage. The decision matrix 605 of the controlling engine 340 then compares the inputs 604 1-604 4 and generates the output data 350 for that touch data point, which may include its position, timestamp information and whether the touch data point is associated with a stylus pen, finger or user's appendage. -
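The vote tabulation performed by the decision matrix 605 might be sketched as below. This is a hedged illustration: the per-module weights, the minimum-score rule and all names are assumed, following the weighting and threshold behavior described above rather than reproducing the patent's implementation.

```python
# Illustrative sketch of the decision-matrix tabulation: each decision
# module casts one vote ("pen", "finger", "appendage") per touch data
# point; the matrix sums the (optionally weighted) votes and assigns the
# winning label, or "unknown" if no label reaches a minimum score.
from collections import Counter

def tabulate_votes(votes, weights=None, min_score=2.0):
    """votes: one label per decision module for a single touch data point.
    weights: optional per-module weighting factors (defaults to 1.0 each)."""
    weights = weights or [1.0] * len(votes)
    tally = Counter()
    for label, weight in zip(votes, weights):
        tally[label] += weight
    label, score = tally.most_common(1)[0]
    return label if score >= min_score else "unknown"
```

With equal weights this reduces to a simple majority count; unequal weights let a historically more reliable module dominate a less reliable one.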
FIG. 6B illustrates a table that contains some examples of voting results contained in the generated decision matrix data based on the received inputs 604 1-604 N, where in this example N is equal to 10. The inputs 604 1-604 10 (not shown) have been created from inputs received from the decision modules 602 1-602 10 (not shown), and have been tabulated by the decision matrix 605 to form the illustrated results. In this example, a first touch data point has received eight votes that it is related to a stylus pen, one vote that it is related to a finger and one vote that it is related to a user's appendage, while a second touch data point has received two votes that it is related to a stylus pen, seven votes that it is related to a finger and one vote that it is related to a user's appendage. Therefore, based on the tabulated data, the controlling engine 340 would attribute the first touch data point to a stylus pen and the second touch data point to a finger, which would then be delivered in the output data 350. - As noted above, after receiving and analyzing the received user information, the controlling
engine 340 may then deliver output data 350 to one or more software and hardware components running on the host device 102. In one example, the output data 350 may be used by one or more third party applications and/or components in the host device 102 to perform some useful display or data output function. In another example, the output data 350 may be used by the host device 102 to generate an image or line on a display in the host device 102, due to the determination that the received touch data is related to a stylus pen 106. - In some cases, after receiving all of the inputs, the
decision matrix 605 is unable to determine what type of input a certain touch data point is; such a point is referred to herein as an "unknown" touch data point. To resolve this issue, the software running on the host device may take a few different paths to decide what to do with these unknown touch data points. First, the software may decide not to use the "unknown" touch data points in any of the tasks that it is currently performing. For example, in the case of a drawing program, the controlling software may decide not to render the touch point on the screen of the user interface 104. In this case, the controlling software has decided that each data point must have an element label to be used. In a second approach, the software running on the host device may decide to use the "unknown" touch data points in the tasks that it is currently performing. For example, in the case of a drawing program, the controlling software may decide to render the touch point on the screen of the user interface 104, and at some later time undo and/or remove the rendered data when it is clear that the input data was not received from a desired component, such as a stylus pen or finger. In this case, the controlling software may decide to give each input an initial element label and then correct the label when it has more data. -
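The second approach above, in which an "unknown" point receives a provisional label that is later confirmed or undone, can be sketched as follows; the class and method names are illustrative assumptions, not part of the disclosure.

```python
# Sketch of provisional labeling for "unknown" touch data points: draw them
# optimistically as stylus input, then undo the rendering if the point is
# later resolved to a finger or an appendage.
class ProvisionalRenderer:
    def __init__(self):
        self.provisional = {}  # point_id -> provisional label

    def on_touch_point(self, point_id, label):
        """Return the action for a newly received, labeled touch point."""
        if label == "unknown":
            self.provisional[point_id] = "stylus"  # optimistic initial label
            return "draw"
        return "draw" if label == "stylus" else "ignore"

    def on_label_resolved(self, point_id, final_label):
        """Return the action once the controlling engine has more data."""
        if self.provisional.pop(point_id, None) and final_label != "stylus":
            return "undo"  # remove the rendered stroke
        return "keep"
```

This mirrors the trade-off described above: the first approach drops unlabeled points entirely, while this one favors responsiveness at the cost of occasionally undoing rendered data.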
FIG. 7 is a simplified signal diagram 700 illustrating aspects of a method for discriminating stylus pen interactions from touch interactions on a touch-screen by an inference technique, according to an embodiment of the invention. The diagram includes a host device signal 710, a signal representing touch events detected by a controlling engine 720, a signal representing a touch-down event by a stylus pen 106 on the user interface of the host device 730, and a signal representing a touch interaction (e.g., from a finger of a user) on the user interface of the host device 740. - The
host device signal 710 includes an active period 711 that indicates a period where the host device is active and able to receive inputs. The signal representing touch events detected by a controlling engine 720 includes a first period indicating a touch event 721, a first period indicating no touch event 722, a second period indicating a touch event 723, a second period indicating no touch event 724, a third period indicating a touch event 725, a third period indicating no touch event 726, a fourth period indicating a touch event 727, and a fourth period indicating no touch event 728. - The signal representing a touch-down event by a
stylus pen 106 on the user interface of the host device 730 includes a first period indicating a touch-down event 731, a first period indicating no touch-down interaction 732, a second period indicating a touch-down event 733, and a second period indicating no touch-down interaction 734. - The signal representing a touch interaction on the user interface of the
host device 740 includes a first period indicating a touch interaction 741, a first period indicating no touch interaction 742, a second period indicating a touch interaction 743, and a second period indicating no touch interaction 744. - In embodiments of the invention, by correlating the times of the signals from the
stylus pen 106 with the times that touch events were detected (e.g., comparing the signals of FIG. 7), the touch-down events attributable to the stylus pen 106 can be parsed out from amongst all the signal periods where touch events were detected. For example, the touch events whose periods coincide with the touch-down events reported by the stylus pen 106 may be determined to have been conducted by the stylus pen 106 against the user interface 104 of the host device 102, rather than being touch events conducted by the user making contact with their finger with the user interface 104 of the host device 102. -
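The time-correlation idea illustrated in FIG. 7 can be sketched as a simple interval-overlap test; the interval representation, tolerance value, and function names are illustrative assumptions.

```python
# Sketch of attributing touch events to the stylus pen by correlating their
# time periods with the touch-down periods reported by the pen. Intervals are
# (start, end) tuples in a shared time base (seconds here, arbitrarily).
def overlaps(a, b, tolerance=0.0):
    """True if intervals a and b overlap, within a timing tolerance."""
    return a[0] <= b[1] + tolerance and b[0] <= a[1] + tolerance

def attribute_touch_events(touch_events, pen_touch_downs, tolerance=0.01):
    labels = []
    for event in touch_events:
        if any(overlaps(event, down, tolerance) for down in pen_touch_downs):
            labels.append("stylus")  # coincides with a pen touch-down period
        else:
            labels.append("touch")   # no pen activity at that time: a finger
    return labels

touch_events = [(0.0, 0.2), (0.5, 0.7), (1.0, 1.3), (1.6, 1.8)]
pen_touch_downs = [(0.01, 0.19), (1.02, 1.28)]
print(attribute_touch_events(touch_events, pen_touch_downs))
# ['stylus', 'touch', 'stylus', 'touch']
```

The tolerance term accommodates small clock offsets between the pen's reported touch-down times and the host device's touch-event timestamps.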
FIG. 8 is a simplified signal diagram 800 illustrating aspects of a method for discriminating stylus pen interactions from touch interactions on a touch-screen, where stylus pen and touch interactions overlap, according to an embodiment of the invention. For example, a user may conduct stylus pen and touch interactions simultaneously with the user interface 104 of the host device 102. As there may be periods where the signals overlap, the system can parse out those signals from the stylus pen 106 from the touch interactions conducted using the user's fingers. The diagram includes a host device signal 810, a signal representing touch events detected by a controlling engine 820, a signal representing a touch-down event by a stylus pen 106 on the user interface of the host device 830, and a signal representing a touch interaction (e.g., from a finger of a user) on the user interface of the host device 840. - The
host device signal 810 includes an active period 811 that indicates a period where the host device 102 is active and able to receive inputs. The signal representing touch events detected by a controlling engine 820 includes a first period indicating a touch event 821, a first period indicating no touch event 822, a second period indicating a touch event 823, a second period indicating no touch event 824, a third period indicating a touch event 825, and a third period indicating no touch event 826. - The signal representing touch-down events by a
stylus pen 106 on the user interface of the host device 830 includes a first period indicating a touch-down event 831, a first period indicating no touch-down interaction 832, a second period indicating a touch-down event 833, and a second period indicating no touch-down interaction 834. - The signal representing a touch interaction on the user interface of the
host device 840 includes a first period indicating a touch interaction 841, a first period indicating no touch interaction 842, a second period indicating a touch interaction 843, and a second period indicating no touch interaction 844. - As shown in
FIG. 8, the signal representing a touch interaction on the user interface of the host device 840 and the signal representing touch-down events by a capacitive stylus pen on the user interface of the host device 830 partially overlap during an overlap period 850. The overlap period 850 coincides with the first period indicating a touch-down event 831 and the second period indicating a touch interaction 843. - As depicted in
FIGS. 7 and 8 , the signal strength from the touch-down events by thestylus pen 106 and the signal strength from the touch interactions on the user interface of thehost device 840 are roughly equal. However, in other embodiments, the relative signal strengths of the touch-down events by thestylus pen 106 and the touch interactions on the user interface of thehost device 840 may vary in strength. In these other embodiments, the relative change in signal strengths may be an additional factor in further discriminating between stylus pen and touch interactions. - Referring back to
FIGS. 2 and 3, in one embodiment, the stylus pen 106 may include a touch signal generating device (TSGD) 106 h that is used to cause the stylus pen 106 to be selectively sensed by the capacitive sensing elements found within the touch sensing unit 212 of the user interface 104 of the host device 102. In this configuration, the touch signal generating device 106 h includes one or more components that are able to selectively form a virtual capacitance between a portion of the pen tip 106 a and the capacitive sensing elements found in the user interface 104 when a TSGD switch, such as a mechanical sensor/switch 221, is activated by the user. In one example, the TSGD switch is part of the pen tip 106 a or pressure sensing unit 106 b. The formed virtual capacitance between the pen tip 106 a and the host device 102 creates a touch event that is sensed by the user interface 104 with or without the physical act of touching the pen tip 106 a to the user interface. -
FIG. 9A is an electrical schematic that illustrates the operation of an active stylus pen 106 with the host device 102 that is configured for mutual capacitance sensing, according to an embodiment of the invention. The active stylus pen 106 is configured with an active stylus control element 907, which may receive signals from the host device 102 as well as generate signals to be transmitted to the host device 102. As shown, the active stylus pen 106 may be held by a user's fingers 915 and is coupled to the user interface 104 through the pen tip 106 a. The active stylus pen 106 may be physically coupled to the user interface 104, or the active stylus pen 106 may be located in proximity to the user interface 104 such that signals generated within the active stylus control element 907 and transmitted to the pen tip 106 a are able to change the sensed capacitance at the sensing assembly 117 within the host device 102 to a desired level at a desired time. - The
host device 102, of which a portion is depicted in FIG. 9A, generally includes a user interface 104, a driver assembly 113 and a sensing assembly 117. The host device 102 may include, for example, drive regions and sense regions, such as drive electrodes 114 and sense electrodes 116. Further, the drive electrodes 114 a-114 c (x-direction) may be formed in columns while sense electrodes 116 a-116 b (y-direction) may be formed in rows. Touch sensing areas, or touch pixels, may be formed at the overlapping regions of the drive electrodes and sense electrodes. - During operation, the
column driver 113 may transmit a capacitive sensing waveform on one or more drive electrodes 114 at a time, thereby creating a mutual capacitance CM between the row of sense electrodes 116 and the driven drive electrode(s) 114 (i.e., column(s)) at each touch pixel. The active stylus pen 106, when coupled to the user interface 104, may be configured to detect the transmitted capacitive sensing waveform. When the active stylus pen 106 is coupled to the user interface 104, some of the charge coupled between the drive electrodes 114 and sense electrodes 116 corresponding to one or more touch pixels may instead be coupled onto the active stylus pen 106, thus forming a pen capacitance CP corresponding to each of the coupled touch pixels. More charge may generally be coupled from a particular touch pixel to the active stylus pen 106 where the active stylus pen 106 is a shorter distance from that touch pixel; therefore, detecting that more charge has been coupled away from a particular touch pixel may indicate a shorter distance to the active stylus pen 106. This reduction in charge coupling across the touch pixels can result in a net decrease in the measured mutual capacitance CM between the drive electrode 114 and the sense electrode 116, and a reduction in the capacitive sensing waveform being coupled across the touch pixel. This reduction in the charge-coupled sensing waveform can be detected and measured by analyzing the change in the sensed capacitance Cs in the sensing assembly 117 to determine the positions of multiple objects when they touch the user interface 104. - In some embodiments, the
active stylus pen 106 may send a controlling signal to the user interface 104 by injecting a charge at the appropriate time to the pen tip 106 a, which alters the mutual capacitance CM and thus the value of sensed capacitance Cs detected by the sensing assembly 117. Therefore, by controlling the amount of charge, or the voltage formed between the pen tip 106 a and a sensing electrode 116, to a desired level, the pen tip 106 a of the active stylus pen 106 can be detected by the capacitive sensing element in the touch-screen containing device as a touch event. - Further, in some embodiments the
active stylus pen 106 may detect a signal produced at one or more drive electrodes 114 of the touch-screen containing device by the column driver 113. Based on the detected signal, the active stylus pen 106 may alter the sensed capacitance Cs to a desired level at a desired time, so as to cause the touch-screen containing device to correctly determine the location of input provided by the active stylus pen 106. Advantageously, since the size of the pen tip 106 a is generally too small to be sensed by the user interface 104, the active stylus pen 106 may therefore be used to selectively provide a touch sensing input to the user interface 104. Therefore, by timing when a user input is provided by the active stylus pen 106 to the user interface 104, the software running on the touch-screen containing device can analyze and use the provided input to control some aspect of a software program running on the touch-screen containing device and/or display some aspect of the input received on the display portion of the touch-screen device. In some embodiments, the active stylus pen 106 is adapted to deliver input from the active stylus pen 106 to any type of touch-screen containing device, despite differences in the particular configurations and sensing methods performed by the touch-screen containing devices. -
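The timing behavior described above and detailed in the following paragraphs, in which the pen alternates between listening for the drive waveform and transmitting while synchronized, can be sketched as a small mode-transition function; the threshold comparison and parameter names are illustrative assumptions.

```python
# Minimal sketch of the synchronization/transmit mode logic described for the
# active stylus: transmit only while the signal indicates the nearest column
# is being driven, and fall back to synchronization when the signal weakens
# or its frequency changes.
SYNC, TRANSMIT = "synchronization", "transmit"

def next_mode(mode, magnitude, peak_threshold, frequency_changed):
    if mode == SYNC:
        # A magnitude at or above the learned peak threshold indicates the
        # column nearest the pen tip is currently being driven.
        return TRANSMIT if magnitude >= peak_threshold else SYNC
    # In transmit mode, an amplitude drop or a frequency change means
    # synchronization is lost and must be re-acquired.
    if magnitude < peak_threshold or frequency_changed:
        return SYNC
    return TRANSMIT
```

Driving this function once per detected drive pulse reproduces the pattern shown in FIG. 9B: a synchronization period, a transmit period, and a return to synchronization when the scan moves to another column.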
FIG. 9B generally illustrates a driven touch-sensing detected signal 951 provided by the touch sensing components in the host device 102 and a controlling signal 915 that is generated and provided to the pen tip 106 a by the active stylus controlling element 907, according to an embodiment described herein. To provide desirable user input, the active stylus control element 907 may generally operate in a synchronization mode 913 or in a transmit mode 914. - For example, assume that
the active stylus pen 106 is coupled to a particular touch-screen containing host device 102. The location of the pen tip 106 a on the touch screen may be directly at a drive pixel that contains a portion of the drive electrode 114 (i.e., a column) and the sense electrode 116 (i.e., a row), but may also be located on the touch screen between drive pixels. Detected signal 951 represents the voltage measured by the pen tip 106 a over time. Detected signal 951 reflects a signal that is generated by the column driver 113 and then sequentially applied to each column as the user interface 104 is sequentially scanned. The active stylus controlling element 907 may operate by default in synchronization mode 913, essentially listening for signal activity in this mode, and then may transition to transmit mode 914 based on signal activity received and processed by the processor 106 c. - During
time period 902, the detected signal 901 has a signal magnitude 901 a, which indicates that the column driver 113 signal is being applied to a column that is a distance away from the pen tip 106 a, such as a neighboring column, and thus has not yet reached the column nearest to the pen tip 106 a. The active stylus control element 907 may remain in a synchronization mode 913 for a period of time or until the signal magnitude changes. During the next time periods 903 and 904, the detected signal 901 has an amplitude of 901 b, indicating that the column driver 113 is currently applying a portion of the detected signal 901 to a column (e.g., drive electrode 114) that is closer to the pen tip 106 a than the column that delivered the signal during the time period 902. - Generally, synchronization of the active
stylus control element 907 with the touch-screen containing host device 102 is important to ensuring accurate input is detected by the host device 102. For example, suppose the active stylus control element 907 transmits a signal to the pen tip 106 a when the column driver 113 is driving a column at which the pen tip 106 a is not located. The signal transmitted to the pen tip 106 a will change the sensed capacitance most strongly at a sensing assembly 117 closest to the location of the pen tip 106 a, but may also affect nearby sensing assemblies 117 to a lesser degree. Because the host device 102 may measure the values of sensed capacitance across all rows simultaneously, but the columns are driven in a particular sequence, the host device 102 will detect the changes in sensed capacitance but may misinterpret the location of the input. The effect of the misinterpretation may be erratic or erroneous input into the host device 102, which may cause the input position on the screen to jump around and/or lead to other undesirable effects in programs being executed on the host device 102, and may further significantly degrade the user's experience. - At the
next time period 904, the frequency of the detected signal 901 received from the host device 102 may be changed, and may be a higher or lower frequency than the portion of the detected signal 901 in time period 903. The change in frequency may be caused by the particular scanning process of the host device 102. In this example, the frequency of the detected signal 901 increases at time period 904 while the amplitude of the detected signal 901 remains at a signal magnitude 901 b, indicating that the detected signal 901 is still being applied to the same or a similarly positioned column relative to the pen tip 106 a. In one or more configurations, the active stylus control element 907 may adapt to such a change in frequency and adjust the output signal delivered from the pen tip 106 a. To accomplish this, the active stylus control element 907 may stop transmitting and transition from transmit mode 914 to synchronization mode 913. When the active stylus control element 907 regains synchronization with the detected signal 901, the active stylus control element 907 may then return to transmit mode 914 and resume transmitting an output signal 912 to the pen tip 106 a. - At
the subsequent time period 905, the magnitude of the detected signal 901 decreases from 901 b to 901 c, indicating that the column driver 113 is applying the detected signal 901 to a column (i.e., the column driver 113 is transmitting on the next column) that is a further distance away from the column(s) that delivered the signal during the time periods 903 and 904. The decrease in magnitude causes the active stylus control element 907 to lose synchronization with the detected signal 901 from the column driver 113, which then causes the active stylus control element 907 to transition into synchronization mode 913, irrespective of the frequency or phase of the detected signal 901 that is detected by the active stylus pen 106. Although the signal 901 is depicted as having the same frequency and phase during time period 905 as during time period 904, the example is meant to demonstrate that the signal magnitude falling below a particular threshold may trigger a transition into synchronization mode 913, regardless of signal frequency or phase. Further, the examples disclosed herein are not meant to limit the claimed subject matter to only those embodiments interacting with host devices 102 that generate such signal patterns, frequencies, phases, or changes in frequencies and/or phases. - In one or more embodiments, the maximum signal magnitude value that corresponds to
column driver 113 driving the nearest column (i.e., magnitude 901 b) may be learned during one scan cycle. The maximum signal magnitude value may then be used to determine a threshold value that can effectively distinguish the maximum magnitude value from the remainder of the detected signal magnitude values (i.e., distinguish magnitude 901 b from magnitudes 901 a and 901 c), and thereby determine when the column driver 113 is currently driving the nearest column to the pen tip 106 a. - In one embodiment, when the sensing component (e.g., communications unit 106 d, the processor 106 c and the memory 106 e) of the
active stylus pen 106 determines that the nearest column(s) are delivering the column driver 113 signal, the sensing component may analyze the detected signal 901 and generate an output signal based on the detected signal 901. The active stylus control element 907 may remain in synchronization mode 913 for a time period 918, until analysis of the detected signal 901 is complete and the active stylus control element 907 has synchronized to the detected signal 901. The active stylus control element 907 may then transition into transmit mode 914 and begin transmitting an output signal, such as the output signal found in the transmit modes 914 of the controlling signal 915, to the pen tip 106 a. Transmission may continue until synchronization with the detected signal 901 is lost (e.g., if the frequency or phase of the detected signal 901 changes). - Though the active
stylus control element 907 may be capable of on-the-fly adaptation to a frequency change in a detected signal 901, this adaptive capability may have a significant computational expense. This expense may have the secondary effects of increasing the power consumption of the active stylus pen 106, as the active stylus control element 907 more frequently processes the detected signal 901 and attempts to synchronize, as well as decreasing the percentage of time during scan cycles that the active stylus control element 907 is able to transmit to the host device 102. For example, the active stylus control element 907 is depicted as being in synchronization mode 913 for a longer period 918 than the period 919, during which the active stylus control element 907 is in transmit mode 914. Such a decreased percentage may result in a less responsive input to the host device 102, which may ultimately cause computing errors in the host device 102. - In another embodiment, however, the
active stylus pen 106 may accommodate longer transmit mode periods 919 by storing host device identification information that relates to one or more host devices 102. The information may include data relating to physical characteristics or capacitive sensing techniques of each of the different types of host devices, and the information may be stored in memory 106 e. The host device identification information may further include frequency, timing and phase information of the detected signal 901, the number of rows and/or columns in the user interface 104 and other useful information. The host device identification information may be pre-programmed and/or stored in memory based on vendor specifications, or may be learned (through use of the active stylus pen 106 with particular host devices 102) and then stored in memory by the sensing component of the active stylus pen 106. If the active stylus pen 106 already contains host device identification information corresponding to the particular host device 102, the active stylus pen 106 may advantageously bypass synchronization mode 913 when the column driver 113 is driving the detected signal 901 on the nearest column. In other words, the active stylus pen 106 may transmit an output signal to the pen tip 106 a during the entirety of the time period. Further, frequency and phase changes to the detected signal 901 may not disrupt the transmission by the active stylus pen 106 if the target frequency and phase values are also included in the host device identification information. - In one embodiment, the
stylus pen 106 is able to use the knowledge of the physical characteristics of the host device 102 to determine one of the coordinates of a touch event created by the stylus pen 106's interaction with the user interface 104. In one example, since the stylus pen is able to sense the transmitted signals provided by the driven columns in the host device 102, and is able to determine that it is nearer to one column versus another, then, using the knowledge of the physical layout of the columns (i.e., driven electrodes) in the host device 102, the stylus pen 106 can ascertain its x-direction coordinate. By monitoring the full touch sensing scan cycle, or monitoring characteristics of the touch sensing scanning process performed by the host device 102, the stylus pen 106 can determine which column number is being driven at a certain time, either by knowledge of the scanning technique used by the host device and/or by analysis of the touch sensing scanning process. For example, it is common for touch sensing devices to drive all of the columns at the end of a touch sensing cycle to reduce any charge built up in different areas of the user interface. The stylus pen 106 is then able to detect and use this information to know when the first column in a new touch sensing scan is about to start. The stylus pen can then analyze the number of sensing signals of different amplitude created by the column driver 113 that are sent before the column nearest the pen tip 106 a is reached. The stylus pen 106 can then determine which column number it is nearest to in the user interface, and thus its relative x-coordinate position. The x-coordinate position can then be transmitted to the host device via the communication link 205, so that this information can be used by and/or compared with the touch sensing coordinate information received from the host device to help more easily determine which touch data points are related to the stylus pen 106. 
Knowledge of at least one of the coordinates of a stylus pen 106 interaction with the user interface 104 can help reduce the misidentification error rate and help with palm and finger detection using the techniques described above. -
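The column-counting idea above can be sketched as follows: once the pen has identified the start of a scan, it records the coupling magnitude of each driven column, takes the strongest one as the nearest column, and maps that column index to an x position. The column pitch, array layout, and function names are illustrative assumptions.

```python
# Sketch of x-coordinate estimation by counting drive pulses within one scan.
# pulse_magnitudes holds the per-column amplitudes seen at the pen tip, in
# drive order, starting from the first column of a new scan cycle.
def x_coordinate(pulse_magnitudes, column_pitch_mm, first_column_x_mm=0.0):
    """Return the estimated x position (mm) of the pen tip."""
    # The nearest column couples the most charge, so it shows the peak.
    nearest_column = max(range(len(pulse_magnitudes)),
                         key=lambda i: pulse_magnitudes[i])
    return first_column_x_mm + nearest_column * column_pitch_mm

# The pen saw the strongest coupling on the third driven column (index 2),
# on a hypothetical panel with a 5 mm column pitch.
print(x_coordinate([0.2, 0.6, 1.4, 0.7, 0.2], column_pitch_mm=5.0))  # 10.0
```

The resulting x coordinate can then be sent over the communication link and compared against the host's own touch coordinates, as described above.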
FIG. 9C illustrates the components of an active stylus pen 106 capable of interacting with a host device 102 that is configured for mutual capacitance sensing, according to an embodiment of the invention. The active stylus pen 106 may couple to the host device 102 through the pen tip 106 a, as discussed above. The active stylus pen 106 is further configured with an active stylus control element 910, which comprises a low-noise amplifier (LNA) 931, a phase discriminator 932, a peak detector 933, a timing state machine (TSM) 934, a waveform generator (WG) 935, a power amplifier (PA) 936, and a clock source 937. The LNA 931 generally provides linear signal amplification, and in one or more configurations, the LNA 931 may operate across the 10 kilohertz (kHz) to 1 megahertz (MHz) frequency range and may have an input impedance greater than 1 megaohm (MΩ). The phase discriminator 932 is generally a zero-crossing detector, which generates a pulse having a width of one cycle of the clock source 937 upon detecting a transition of potential at the pen tip 106 a. The peak detector 933 is generally comprised of rectifier, integrator, and high-pass filter components. The TSM 934 is comprised of a state machine that controls mode selection, a phase and frequency estimator, a calibration state machine, and a timing sequencer, through use of the processor 106 c, clock 106 g and memory unit 106 e. Output generated by the TSM 934 provides control to the WG 935, which may generate an appropriate sequence of square pulses having a particular frequency, amplitude and duty cycle that are specified by the TSM 934. The PA 936 drives the pen tip 106 a so that a desired signal can be detected by the host device 102, and is capable of tri-state operation based on control signals received from the TSM 934 and WG 935. 
In one example, the tri-state operation, which may be controlled by the TSM 934, may include the delivery of a high voltage signal (VH) (e.g., positive voltage signal) and a low voltage signal (VL) (e.g., negative voltage signal) to provide a desired signal from the pen tip 106 a that can be sensed (e.g., VH or VL) at desired times by any type of host device 102 using any type of sensing technique. The PA 936 may also deliver no signal at all to the pen tip 106 a, such as during idle periods or while the PA 936 is in a high-impedance mode (e.g., when the active stylus control element 910 is synchronizing to a detected signal 901). The clock source 937 may be a crystal oscillator or a comparably precise source, and is typically the same clock as clock 206 g discussed above. The clock source 937 is generally required to be as precise as the clock source that drives the user interface 104. The host device 102 generally includes a user interface 104, a driver assembly 113 and a sensing assembly 117. Touch sensing areas, or touch pixels, may be formed at the overlapping regions of the one or more drive electrodes 114 and one or more sense electrodes 116. As shown, the pen tip 106 a is located within an electric field E of the mutual capacitance created by the drive electrode 114 and sense electrode 116. In this configuration, the pen tip 106 a is coupled to the user interface 104, and the signals generated within the active stylus control element 910 and transmitted to the pen tip 106 a may alter the electric field E, which in turn may change the sensed capacitance at the sensing assembly 117 to a desired level at a desired time. - According to an embodiment of the invention, the active
stylus control element 910 may generally operate in a synchronization mode and/or in a transmit mode. The active stylus control element 910 may operate by default in synchronization mode, essentially listening for signal activity of the touch sensing component in the host device 102 in this mode, and then may transition to transmit mode based on received signal activity. To operate in synchronization mode, the TSM 934 may transmit an output to the enable (ENB) input of the PA 936, which causes the PA 936 to operate in a high impedance mode and deliver the signal to the pen tip 106 a at a desired time to coincide with the capacitive sensing signal delivered by the host device 102. The high impedance at the PA 936 relative to the LNA 931 causes most of the detected signal at the pen tip 106 a to be transmitted to the LNA 931. The TSM 934 also may transmit an output to the WG 935 to disable the WG 935, which may be advantageously used to conserve power in the active stylus pen 106. In some configurations, the pen tip 106 a, when coupled to a host device 102, may detect a signal from the host device 102 by monitoring the signal received by the LNA 931 as the PA 936 is operating in high impedance mode. After being amplified at the LNA 931, the detected signal is provided to both the phase discriminator 932 and the peak detector 933. The respective outputs from the phase discriminator 932 and peak detector 933 are then transmitted to the TSM 934, which uses the estimated phase and frequency to control the output of the WG 935. - Upon determining the estimated phase and frequency of the signal received from the
host device 102, the TSM 934 may cause the active stylus control element 910 to operate in transmit mode by enabling the PA 936 and causing the WG 935 to begin generating an output signal according to the phase, amplitude and frequency information provided by the TSM 934. The output signal generated by the WG 935 may next be amplified by the PA 936. In one or more embodiments, the LNA 931 may have a relatively large input impedance compared to the pen tip 106 a, so that the amplified signal will be transmitted to the pen tip 106 a, in order to affect the sensed capacitance due to the capacitive coupling of the pen tip 106 a to the touch sensing components in the user interface 104. - In one embodiment, the touch
signal generating device 106 h includes signal control electronics 106 i, a conductive coating 222 formed on a surface of the stylus pen 106, which the user is in contact with when they are holding the stylus pen 106, and the mechanical sensor/switch 221 (e.g., a simple mechanical switch). In one embodiment, the signal control electronics 106 i generally include a signal generating device and other supporting components that are able to inject a current through the pen tip 106 a to the capacitive sensing elements in the user interface 104 at an interval that is synchronized with the capacitive sensing signals delivered between the capacitive sensing elements in the user interface 104. The signal control electronics 106 i are also adapted to detect the capacitive sensing signal(s) delivered between the transmitter and receiver electrodes in the touch sensing unit 212 at any instant in time, and include a phase shifting device (not shown) that is able to synchronize the timing of the injection of current through the pen tip 106 a with the delivery of the capacitive sensing signal(s) delivered between the transmitter and receiver electrodes. The mechanical sensor/switch 221, when activated, electrically couples the conductive coating 222, signal control electronics 106 i and other useful electrical components in the stylus pen 106 to the pen tip 106 a to create a virtual capacitance signal that is delivered between the pen tip 106 a and the capacitive sensing elements in the user interface 104. The virtual capacitance created by the activation of the mechanical sensor/switch 221 can at least be intermittently formed between the pen tip 106 a and a portion of the user interface 104, so that a desirable touch signal is received by the user interface 104 with or without the physical act of touching the pen tip 106 a to the user interface. - In one embodiment, the initial activation of the mechanical sensor/
switch 221 causes a specific set of sensing signal pulses, or signature pulses, that allow the one or more input discrimination techniques 345 used by the controlling engine 340 to more easily characterize the touch data input created by the activation of the touch signal generating device 106 h as an input from the stylus pen 106. One will note that the capacitive sensing elements in the user interface 104 of the host device 102 are sampled at a set frequency (e.g., sampled every 16 ms). Therefore, the set of sensing signal pulses created by portions of the stylus pen 106 (e.g., touch signal generating device 106 h, processor 106 c and memory 106 e) when the touch signal generating device 106 h is activated may require two or more sensing signal pulses that each have a distinguishing preset length and/or fixed time between them that is equal to or greater than the sampling interval of the device, so that the signature of the activation of the touch signal generating device 106 h can be more easily determined by the user input discriminating techniques performed by the controlling engine 340 that is running on the host device 102. Thus, the touch signal generating device 106 h is useful, since it allows the user to initiate the interaction of the stylus pen 106 with the user interface 104, rather than wait for the sensed contact between the pen tip 106 a and the user interface 104 to be characterized by the controlling engine 340 as an input received from a stylus pen. - Use of the touch
signal generating device 106 h can also allow two or more pens 106 to be used with a host device 102, since each stylus pen can provide a different initial signature pulse configuration that allows the controlling engine 340 to determine which of the pens 106 is being used at any instant in time. FIG. 10 illustrates two sets of signature pulses 1001, 1002 for different pens 106, so that the controlling engine 340 can more easily associate the user input created by each stylus pen 106 with that particular stylus pen. The signature pulses 1001, 1002 are delivered to the user interface 104 to let the controlling engine know that the subsequent touch interactions that are associated with that initiating touch event will be made by a particular stylus pen. As illustrated in FIG. 10, the signature pulse 1001 may comprise two pulses 1005 and 1006 that each have a desired duration 1021, 1022. Also, the signature pulse 1002 may comprise two pulses 1010 and 1011 that each have a desired duration 1041, 1042. Therefore, due to at least one difference between the signature pulses 1001 and 1002, the controlling engine 340 will be able to more easily determine that a particular input is received from one stylus pen versus another. Moreover, a signature pulse 1001, 1002 can be used to confirm that an input received by the user interface 104 is related to a stylus pen and not a finger or user's appendage. - While the techniques disclosed herein primarily discuss a process of determining the type of user input to create output data that is used within a host device on which the controlling engine is running, this configuration is not intended to be limiting as to the scope of the invention described herein, since the output data can also be delivered to or shared with other peripheral devices without deviating from the basic scope of the invention described herein.
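The signature-pulse scheme described above can be sketched in code. This is a minimal illustration under stated assumptions: the pulse durations, the 16 ms sampling interval, the matching tolerance, and all function and variable names are hypothetical, and are not taken from the patent.

```python
# Illustrative sketch of matching sensed signature pulses to a registered
# stylus pen.  The values and names here are assumptions for illustration;
# the description above only requires that each pulse length and inter-pulse
# gap be at least as long as the host's sampling interval (e.g., 16 ms).

SAMPLE_PERIOD_MS = 16  # example capacitive sensing sampling interval

# Hypothetical registry: each pen's signature is a list of
# (pulse_duration_ms, gap_after_pulse_ms) pairs, loosely mirroring the
# two two-pulse signatures 1001 and 1002 of FIG. 10.
PEN_SIGNATURES = {
    "pen_1001": [(32, 16), (32, 0)],
    "pen_1002": [(48, 32), (16, 0)],
}

def identify_pen(observed, tolerance_ms=4):
    """Return the pen whose signature matches the observed pulse train,
    or None when no pen matches (e.g., a finger touch)."""
    for pen_id, signature in PEN_SIGNATURES.items():
        if len(observed) != len(signature):
            continue
        ok = True
        for (dur, gap), (sig_dur, sig_gap) in zip(observed, signature):
            # Pulses shorter than the sampling interval cannot be resolved
            # by the host's touch controller.
            if dur < SAMPLE_PERIOD_MS:
                ok = False
                break
            if abs(dur - sig_dur) > tolerance_ms or abs(gap - sig_gap) > tolerance_ms:
                ok = False
                break
        if ok:
            return pen_id
    return None
```

Because each registered signature differs in at least one duration or gap, a single lookup of this kind suffices to both confirm that the input came from a stylus pen and to tell multiple pens apart.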
- The present invention can be implemented in the form of control logic in software or hardware or a combination of both. The control logic may be stored in an information storage medium as a plurality of instructions adapted to direct an information-processing device to perform a set of steps disclosed in embodiments of the present invention. Based on the disclosure and teaching provided herein, a person of ordinary skill in the art will appreciate other ways and/or methods to implement the present invention.
- In embodiments, any of the entities described herein may be embodied by a computer that performs any or all of the functions and steps disclosed.
- It should be noted that any recitation of “a”, “an” or “the” is intended to mean “one or more” unless specifically indicated to the contrary.
- It is also understood that the examples and embodiments described herein are for illustrative purposes only and that various modifications or changes in light thereof will be suggested to persons skilled in the art and are to be included within the spirit and purview of this application and scope of the appended claims. Therefore, the above description should not be understood as limiting the scope of the invention as defined by the claims.
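As a concrete sketch of the timing-window correlation that runs through these embodiments, a host can attribute a touch data point to the stylus pen when the touch sensing unit's timestamp falls within a window positioned in time relative to the pen-reported touch-down time. The 50 ms window length and all names below are illustrative assumptions, not the patent's implementation:

```python
# Hedged sketch of timing-window attribution; the window length and the
# function/variable names are illustrative assumptions.

TIMING_WINDOW_MS = 50  # example timing window length

def attribute_touch(touchdown_time_ms, touch_event_time_ms,
                    window_ms=TIMING_WINDOW_MS):
    """Attribute a touch data point to the stylus pen when the touch
    event's timestamp falls inside a window positioned relative to the
    pen's reported touch-down time; otherwise treat it as a finger or
    appendage touch."""
    if abs(touch_event_time_ms - touchdown_time_ms) <= window_ms:
        return "stylus"
    return "touch"
```

In practice the two timestamps come from different clocks (one in the pen, one in the host), so an implementation would first translate the pen's timestamp into the host's time base, for example using the host-side timestamp recorded when the touch-down message arrived.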
Claims (22)
1. A method of using a touch sensitive computing device, comprising:
receiving, at the computing device, information related to a touch-down event from a handheld device, wherein the information related to the touch-down event comprises information relating to a first time when the touch-down event occurred;
defining a timing window having a timing window length, wherein the timing window is positioned in time relative to the first time;
receiving, at the computing device, information related to a first touch event from a touch sensing unit coupled to the computing device, wherein the information provided for the first touch event comprises information relating to a first touch data point and a second time; and
generating output data that comprises information relating to a type of user input that is attributed to the first touch data point when the second time falls within the timing window.
2. The method of claim 1 , further comprising:
altering information rendered on a display in the computing device at a position relating to the first touch data point based on an analysis of the output data performed by a processor in the computing device.
3. The method of claim 2 , wherein altering information rendered on the display in the computing device further comprises:
generating an image on a display in the computing device relating to the first touch data point when the attributed type of user input to the first touch data point is a stylus pen.
4. The method of claim 1 , wherein the information related to the touch-down event comprises pressure data generated from a pressure sensor disposed in a tip of the handheld device when the tip is brought into contact with a surface of the computing device.
5. The method of claim 1 , wherein the information related to the touch-down event includes information related to a position of a tip of an active stylus pen relative to an electrode disposed in a touch sensing unit coupled to the computing device.
6. The method of claim 5 , wherein the position of the active stylus pen is determined at least partially from a capacitive sensing signal delivered from the electrode.
7. The method of claim 1 , wherein the attributed type of user input includes an interaction of a stylus pen with the touch sensing unit.
8. The method of claim 1 , wherein the information provided for the first touch event further comprises information relating to coordinates of the first touch data point, and the method further comprises:
receiving, at the computing device, information related to a second touch event from a touch sensing unit coupled to the computing device, wherein the information provided for the second touch event comprises a second touch data point;
defining a region of the touch sensing unit of the computing device that includes the first touch data point and the second touch data point;
determining whether a position of a third touch data point on the user interface of the computing device is within the defined region; and
attributing the third touch data point to a type of user input when the third touch data point is disposed within the region of the touch sensing unit,
wherein the output data further comprises information relating to the attributed type of user input of the third touch data point.
9. The method of claim 1 , further comprising:
associating the received information related to a touch-down event with a second timestamp that is derived from a second clock disposed within the computing device, and wherein:
the information relating to the first time comprises a first timestamp that is derived from a first clock disposed within the handheld device,
the information relating to the second time comprises a third timestamp that is derived from the second clock, and
wherein the generating output data further comprises using the first, second and third timestamps to determine when the second time falls within the timing window.
10. The method of claim 1 , wherein the handheld device comprises an electronic stylus pen.
11. The method of claim 1 , wherein
the information relating to the first time comprises a first timestamp that is derived from a first clock disposed within the handheld device,
the information relating to the second time comprises a second timestamp that is derived from a second clock disposed within the computing device, and
wherein the accuracy of the first clock is equal to or better than that of the second clock.
12. A method using a touch sensitive computing device, comprising:
receiving, at the computing device, information related to a touch-down event from a handheld device, wherein the information related to the touch-down event comprises information relating to a first time when the touch-down event occurred;
receiving, at the computing device, information related to a first touch event from a touch sensing unit coupled to the computing device, wherein the information provided for the first touch event comprises information relating to a second time and coordinates of a first touch data point;
generating a first user input type vote for the first touch data point based on a comparison of a predetermined threshold time with the information relating to the first time and the second time;
generating a second user input type vote for the first touch data point based on a comparison of a first position of the first touch data point on a user interface of the computing device and a second position of a second touch data point on the user interface of the computing device; and
generating output data that comprises information relating to a type of user input assigned to the first touch data point, wherein assigning the type of user input to the first touch data point includes analyzing the first user input type vote and the second user input type vote.
13. The method of claim 12 , further comprising:
altering information rendered on a display in the computing device at a position relating to the first touch data point based on an analysis of the output data performed by a processor in the computing device.
14. The method of claim 13 , wherein altering information rendered on the display in the computing device further comprises:
generating an image on a display in the computing device relating to the first touch data point when the attributed type of user input to the first touch data point is a stylus pen.
15. The method of claim 12 , further comprising:
associating the received information related to a touch-down event with a second timestamp that is derived from a second clock disposed within the computing device, and wherein:
the information relating to the first time comprises a first timestamp that is derived from a first clock disposed within the handheld device,
the information relating to the second time comprises a third timestamp that is derived from the second clock, and
wherein the generating output data further comprises comparing the first, second and third timestamps.
16. The method of claim 12 , wherein the handheld device comprises an active stylus pen.
17. The method of claim 12 , wherein the comparison of the first position of the first touch data point and the second position of the second touch data point further comprises determining that the second touch data point is part of a first cluster of touch data points and that the first touch data point is not part of the first cluster of touch data points.
18. A method using a touch sensitive computing device, comprising:
receiving, at the computing device, information related to a touch-down event from a handheld device, wherein the information comprises information relating to a first time when the touch-down event occurred;
receiving, at the computing device, information related to a first touch event from a touch sensing unit coupled to the computing device, wherein the information comprises information relating to a second time when the touch event occurred on a touch sensing unit of the computing device;
correlating the information related to the touch-down event with the information related to the first touch event, wherein correlating the information comprises comparing the first time, the second time and a predetermined threshold; and
determining that the touch-down event is associated with the handheld device when the difference in time between the first and second time is less than the predetermined threshold.
19. The method of claim 18 , wherein
the information related to the touch-down event comprises a first timestamp derived from a clock in the handheld device; and
the information related to the touch event comprises a second timestamp derived from a clock in the computing device.
20. The method of claim 18 , wherein the handheld device comprises an active stylus pen.
21. A computer readable medium disposed in a computing device, containing a set of instructions that causes a processor to perform a process comprising:
receive, at the computing device, information related to a touch-down event from a handheld device, wherein the information related to the touch-down event comprises information relating to a first time when the touch-down event occurred;
receive, at the computing device, information related to a first touch event and a second touch event from a touch sensing unit coupled to the computing device, wherein the information provided for the first touch event comprises a first touch data point and information relating to a second time, and the information provided for the second touch event comprises a second touch data point and information relating to a third time;
analyze the information received by the computing device, comprising:
comparing a predetermined threshold time and the information relating to the first time and the second time, and then assigning a first user input type vote to the first touch data point based on the comparison; and
comparing a first position of the first touch data point on a user interface of the computing device and a second position of the second touch data point on the user interface of the computing device, and then assigning a second user input type vote to the first touch data point based on the comparison of the first position relative to the second position; and
generating output data that comprises information relating to a type of user input assigned to the first touch event, wherein assigning the type of user input to the first touch event includes analyzing the first user input type vote and the second user input type vote.
22. The computer-readable medium as recited in claim 21 , wherein the process further comprises:
altering information rendered on a display in the computing device at a position relating to the first touch data point based on the assigned first user input type.
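The two-vote scheme recited in claims 12 and 21 can be sketched as follows. The threshold time, the cluster radius, the unanimity combination rule, and all names are illustrative assumptions rather than the claimed implementation:

```python
# Hedged sketch of the vote-based input-type assignment of claims 12/21.
# Thresholds, the cluster test, and the combination rule are assumed values
# chosen only for illustration.

import math

THRESHOLD_MS = 50         # example "predetermined threshold time"
CLUSTER_RADIUS_PX = 40.0  # example distance for cluster membership

def time_vote(touchdown_ms, touch_ms, threshold_ms=THRESHOLD_MS):
    """First vote: compare the pen touch-down time with the touch time."""
    return "stylus" if abs(touch_ms - touchdown_ms) <= threshold_ms else "touch"

def position_vote(first_pt, second_pt, radius=CLUSTER_RADIUS_PX):
    """Second vote: a point outside the cluster formed around the second
    touch data point (e.g., a resting palm) is more likely the pen tip."""
    dist = math.hypot(first_pt[0] - second_pt[0], first_pt[1] - second_pt[1])
    return "stylus" if dist > radius else "touch"

def assign_input_type(touchdown_ms, touch_ms, first_pt, second_pt):
    """Combine the votes; a simple unanimity rule is assumed here."""
    votes = (time_vote(touchdown_ms, touch_ms),
             position_vote(first_pt, second_pt))
    return "stylus" if all(v == "stylus" for v in votes) else "touch"
```

Other combination rules (weighted votes, majority with additional voters such as the cluster test of claim 17 or the region test of claim 8) fit the same structure; unanimity is simply the most conservative choice.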
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/014,282 US20140168141A1 (en) | 2012-12-18 | 2013-08-29 | Method and system for discriminating stylus and touch interactions |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261738797P | 2012-12-18 | 2012-12-18 | |
US201361755881P | 2013-01-23 | 2013-01-23 | |
US201361762222P | 2013-02-07 | 2013-02-07 | |
US201361790310P | 2013-03-15 | 2013-03-15 | |
US201361791577P | 2013-03-15 | 2013-03-15 | |
US14/014,282 US20140168141A1 (en) | 2012-12-18 | 2013-08-29 | Method and system for discriminating stylus and touch interactions |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140168141A1 true US20140168141A1 (en) | 2014-06-19 |
Family
ID=50930298
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/014,282 Abandoned US20140168141A1 (en) | 2012-12-18 | 2013-08-29 | Method and system for discriminating stylus and touch interactions |
US14/014,274 Active 2034-03-17 US9367185B2 (en) | 2012-12-18 | 2013-08-29 | Method and system for discriminating stylus and touch interactions |
US14/014,283 Active 2034-03-24 US9367186B2 (en) | 2012-12-18 | 2013-08-29 | Method and system for discriminating stylus and touch interactions |
US14/014,277 Abandoned US20140168140A1 (en) | 2012-12-18 | 2013-08-29 | Method and system for discriminating stylus and touch interactions |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/014,274 Active 2034-03-17 US9367185B2 (en) | 2012-12-18 | 2013-08-29 | Method and system for discriminating stylus and touch interactions |
US14/014,283 Active 2034-03-24 US9367186B2 (en) | 2012-12-18 | 2013-08-29 | Method and system for discriminating stylus and touch interactions |
US14/014,277 Abandoned US20140168140A1 (en) | 2012-12-18 | 2013-08-29 | Method and system for discriminating stylus and touch interactions |
Country Status (1)
Country | Link |
---|---|
US (4) | US20140168141A1 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170097720A1 (en) * | 2015-10-05 | 2017-04-06 | Samsung Electronics Co., Ltd | Electronic device and method for identifying input made by external device of electronic device |
CN114237414A (en) * | 2020-09-09 | 2022-03-25 | 川奇光电科技(扬州)有限公司 | Touch display device and sensing method thereof |
US11340780B2 (en) * | 2018-07-11 | 2022-05-24 | Samsung Electronics Co., Ltd. | Electronic device and method for performing function of electronic device |
EP4068058A4 (en) * | 2019-12-25 | 2024-01-03 | Shenzhen Hitevision Technology Co., Ltd. | Handwriting generation method and apparatus, storage medium, electronic device, and system |
US12050750B2 (en) | 2021-12-28 | 2024-07-30 | Lx Semicon Co., Ltd. | Touch sensing device |
Families Citing this family (71)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9310923B2 (en) | 2010-12-03 | 2016-04-12 | Apple Inc. | Input device for touch sensitive devices |
US9329703B2 (en) | 2011-06-22 | 2016-05-03 | Apple Inc. | Intelligent stylus |
US8928635B2 (en) | 2011-06-22 | 2015-01-06 | Apple Inc. | Active stylus |
CN104160364A (en) | 2011-10-18 | 2014-11-19 | 卡内基梅隆大学 | Method and apparatus for classifying touch events on a touch sensitive surface |
US9201521B2 (en) * | 2012-06-08 | 2015-12-01 | Qualcomm Incorporated | Storing trace information |
US9652090B2 (en) | 2012-07-27 | 2017-05-16 | Apple Inc. | Device for digital communication through capacitive coupling |
US9557845B2 (en) | 2012-07-27 | 2017-01-31 | Apple Inc. | Input device for and method of communication with capacitive devices through frequency variation |
RO128874B1 (en) * | 2012-12-19 | 2017-08-30 | Softwin S.R.L. | System, electronic pen and method for acquisition of dynamic holograph signature by using mobile devices with capacitive screens |
US10048775B2 (en) | 2013-03-14 | 2018-08-14 | Apple Inc. | Stylus detection and demodulation |
KR20140114766A (en) | 2013-03-19 | 2014-09-29 | 퀵소 코 | Method and device for sensing touch inputs |
US9612689B2 (en) | 2015-02-02 | 2017-04-04 | Qeexo, Co. | Method and apparatus for classifying a touch event on a touchscreen as related to one of multiple function generating interaction layers and activating a function in the selected interaction layer |
US9013452B2 (en) | 2013-03-25 | 2015-04-21 | Qeexo, Co. | Method and system for activating different interactive functions using different types of finger contacts |
US20170285855A1 (en) * | 2013-04-10 | 2017-10-05 | Nvidia Corporation | Method and system hybrid stylus |
US9823758B2 (en) * | 2013-04-10 | 2017-11-21 | Nvidia Corporation | Automatic performance of touch screen related functionality in response to detected stylus position |
DE102013214020A1 (en) * | 2013-07-17 | 2015-02-19 | Stabilo International Gmbh | Digital pen |
US10067580B2 (en) * | 2013-07-31 | 2018-09-04 | Apple Inc. | Active stylus for use with touch controller architecture |
US9529525B2 (en) * | 2013-08-30 | 2016-12-27 | Nvidia Corporation | Methods and apparatus for reducing perceived pen-to-ink latency on touchpad devices |
US9477330B2 (en) * | 2013-11-05 | 2016-10-25 | Microsoft Technology Licensing, Llc | Stylus tilt tracking with a digitizer |
CN104636010B (en) * | 2013-11-08 | 2018-07-10 | 禾瑞亚科技股份有限公司 | transmitter and transmitting method thereof |
US9152254B2 (en) * | 2013-11-21 | 2015-10-06 | Atmel Corporation | Electrical connection for active-stylus electrode |
WO2015111159A1 (en) * | 2014-01-22 | 2015-07-30 | 株式会社ワコム | Position indicator, position detection device, position detection circuit, and position detection method |
US9671877B2 (en) | 2014-01-27 | 2017-06-06 | Nvidia Corporation | Stylus tool with deformable tip |
US9632619B2 (en) * | 2014-06-25 | 2017-04-25 | Egalax_Empia Technology Inc. | Recording method, apparatus, system, and computer-readable media of touch information timing |
TWI556154B (en) * | 2014-06-25 | 2016-11-01 | 禾瑞亞科技股份有限公司 | Recording method, apparatus, system, and computer readable media of touch information timing |
US10108301B2 (en) * | 2014-09-02 | 2018-10-23 | Rapt Ip Limited | Instrument detection with an optical touch sensitive device, with associating contacts with active instruments |
US9329715B2 (en) | 2014-09-11 | 2016-05-03 | Qeexo, Co. | Method and apparatus for differentiating touch screen users based on touch event analysis |
US11619983B2 (en) | 2014-09-15 | 2023-04-04 | Qeexo, Co. | Method and apparatus for resolving touch screen ambiguities |
KR20160033951A (en) * | 2014-09-19 | 2016-03-29 | 삼성전자주식회사 | Display apparatus and Method for controlling display apparatus thereof |
US10606417B2 (en) | 2014-09-24 | 2020-03-31 | Qeexo, Co. | Method for improving accuracy of touch screen event analysis by use of spatiotemporal touch patterns |
US10282024B2 (en) | 2014-09-25 | 2019-05-07 | Qeexo, Co. | Classifying contacts or associations with a touch sensitive device |
US10712858B2 (en) * | 2014-09-25 | 2020-07-14 | Qeexo, Co. | Method and apparatus for classifying contacts with a touch sensitive device |
US10067618B2 (en) | 2014-12-04 | 2018-09-04 | Apple Inc. | Coarse scan and targeted active mode scan for touch |
US9830000B2 (en) * | 2014-12-12 | 2017-11-28 | Microsoft Technology Licensing, Llc | Active stylus synchronization |
KR102260600B1 (en) * | 2014-12-31 | 2021-06-04 | 엘지디스플레이 주식회사 | Touch screen device |
WO2016108216A1 (en) * | 2015-01-04 | 2016-07-07 | Microsoft Technology Licensing, Llc | Active stylus communication with a digitizer |
US9772697B2 (en) | 2015-01-04 | 2017-09-26 | Microsoft Technology Licensing, Llc | Touch down detection with a stylus |
US10706818B2 (en) * | 2015-02-16 | 2020-07-07 | Invensense, Inc. | System and method for aligning sensor data to screen refresh rate |
US10095361B2 (en) | 2015-03-18 | 2018-10-09 | Microsoft Technology Licensing, Llc | Stylus detection with capacitive based digitizer sensor |
KR102512318B1 (en) * | 2015-06-29 | 2023-03-22 | 가부시키가이샤 와코무 | Position detection device and pointing device |
US10642404B2 (en) | 2015-08-24 | 2020-05-05 | Qeexo, Co. | Touch sensitive device with multi-sensor stream synchronized data |
US10423268B2 (en) | 2015-12-22 | 2019-09-24 | Microsoft Technology Licensing, Llc | System and method for detecting grounding state of a touch enabled computing device |
US10296146B2 (en) | 2015-12-22 | 2019-05-21 | Microsoft Technology Licensing, Llc | System and method for detecting grip of a touch enabled device |
KR102513147B1 (en) * | 2016-01-19 | 2023-03-23 | 삼성전자 주식회사 | Electronic device and method for recognizing touch input using the same |
US9823774B2 (en) | 2016-02-23 | 2017-11-21 | Microsoft Technology Licensing, Llc | Noise reduction in a digitizer system |
KR102523154B1 (en) * | 2016-04-22 | 2023-04-21 | 삼성전자주식회사 | Display apparatus, input device and control method thereof |
JP5971608B1 (en) * | 2016-04-25 | 2016-08-17 | パナソニックIpマネジメント株式会社 | Electronic device and coordinate detection method |
CN105929985B (en) * | 2016-04-28 | 2020-04-14 | 深圳市华鼎星科技有限公司 | True handwriting touch control pen with radio frequency transceiving transmission function and touch control device |
US10474277B2 (en) | 2016-05-31 | 2019-11-12 | Apple Inc. | Position-based stylus communication |
US10671186B2 (en) * | 2016-06-15 | 2020-06-02 | Microsoft Technology Licensing, Llc | Autonomous haptic stylus |
TWI606376B (en) * | 2016-08-08 | 2017-11-21 | 意象無限股份有限公司 | Touch Sensor Device And Touch-Sensing Method With Error-Touch Rejection |
CN114721531A (en) * | 2016-08-12 | 2022-07-08 | 株式会社和冠 | Stylus, sensor controller, system and method |
US10852936B2 (en) * | 2016-09-23 | 2020-12-01 | Apple Inc. | Devices, methods, and graphical user interfaces for a unified annotation layer for annotating content displayed on a device |
WO2018225204A1 (en) | 2017-06-08 | 2018-12-13 | 株式会社ワコム | Pointer position detection method |
US11169641B2 (en) | 2018-01-23 | 2021-11-09 | Beechrock Limited | Compliant stylus interaction with touch sensitive surface |
CN112041799A (en) | 2018-02-19 | 2020-12-04 | 拉普特知识产权公司 | Unwanted touch management in touch sensitive devices |
US10678348B2 (en) | 2018-03-12 | 2020-06-09 | Microsoft Technology Licensing, Llc | Touch detection on an ungrounded pen enabled device |
US11036338B2 (en) | 2018-04-20 | 2021-06-15 | Beechrock Limited | Touch object discrimination by characterizing and classifying touch events |
US10616349B2 (en) | 2018-05-01 | 2020-04-07 | Microsoft Technology Licensing, Llc | Hybrid sensor centric recommendation engine |
US10983611B2 (en) | 2018-06-06 | 2021-04-20 | Beechrock Limited | Stylus with a control |
US11009989B2 (en) | 2018-08-21 | 2021-05-18 | Qeexo, Co. | Recognizing and rejecting unintentional touch events associated with a touch sensitive device |
US11054935B2 (en) | 2018-11-19 | 2021-07-06 | Beechrock Limited | Stylus with contact sensor |
US10831290B2 (en) * | 2019-02-22 | 2020-11-10 | Qualcomm Incorporated | Stylus-tracking piezoelectric sensor |
US10942603B2 (en) | 2019-05-06 | 2021-03-09 | Qeexo, Co. | Managing activity states of an application processor in relation to touch or hover interactions with a touch sensitive device |
US11231815B2 (en) | 2019-06-28 | 2022-01-25 | Qeexo, Co. | Detecting object proximity using touch sensitive surface sensing and ultrasonic sensing |
KR102149105B1 (en) * | 2019-09-18 | 2020-08-27 | 세종대학교산학협력단 | Mixed reality based 3D sketching device and method |
US11592423B2 (en) | 2020-01-29 | 2023-02-28 | Qeexo, Co. | Adaptive ultrasonic sensing techniques and systems to mitigate interference |
US11209916B1 (en) * | 2020-07-30 | 2021-12-28 | Logitech Europe S.A. | Dominant hand usage for an augmented/virtual reality device |
US11281298B1 (en) * | 2020-12-22 | 2022-03-22 | Dell Products L.P. | System and method for dynamically disabling haptic feedback with recognition of pen on touch sensitive area of an information handling system |
US11775100B2 (en) * | 2021-02-04 | 2023-10-03 | 1004335 Ontario Inc. | Touch sensor system configuration |
US11829556B2 (en) | 2021-03-12 | 2023-11-28 | 1004335 Ontario Inc. | Methods for configuring touch sensor system |
US12111994B2 (en) * | 2022-04-21 | 2024-10-08 | Clement KOH | Input system |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5115107A (en) * | 1991-01-11 | 1992-05-19 | Ncr Corporation | Method of correcting skew between a digitizer and a digital display |
US20080236902A1 (en) * | 2007-03-28 | 2008-10-02 | Oki Data Corporation | Handwriting input system |
US20100141589A1 (en) * | 2008-12-09 | 2010-06-10 | Microsoft Corporation | Touch input interpretation |
US20110291998A1 (en) * | 2010-05-27 | 2011-12-01 | Guy Adams | Calibrating a Digital Stylus |
US20110304577A1 (en) * | 2010-06-11 | 2011-12-15 | Sp Controls, Inc. | Capacitive touch screen stylus |
US20120262407A1 (en) * | 2010-12-17 | 2012-10-18 | Microsoft Corporation | Touch and stylus discrimination and rejection for contact sensitive computing devices |
US20130106722A1 (en) * | 2011-10-28 | 2013-05-02 | Shahrooz Shahparnia | Pulse- Or Frame-Based Communication Using Active Stylus |
US20130300696A1 (en) * | 2012-05-14 | 2013-11-14 | N-Trig Ltd. | Method for identifying palm input to a digitizer |
Family Cites Families (89)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6259438B1 (en) | 1998-06-04 | 2001-07-10 | Wacom Co., Ltd. | Coordinate input stylus |
US4384201A (en) * | 1978-04-24 | 1983-05-17 | Carroll Manufacturing Corporation | Three-dimensional protective interlock apparatus |
US4476463A (en) | 1981-08-24 | 1984-10-09 | Interaction Systems, Inc. | Display device having unpatterned touch detection |
US4785564A (en) | 1982-12-20 | 1988-11-22 | Motorola Inc. | Electronic notepad |
US4677428A (en) | 1985-06-07 | 1987-06-30 | Hei, Inc. | Cordless light pen |
US4794634A (en) | 1985-12-24 | 1988-12-27 | Kabushiki Kaisha Komatsu Seisakusho | Position-sensitive photodetector and light transmissive tablet and light-emitting pen |
US4705942A (en) | 1985-12-26 | 1987-11-10 | American Telephone And Telegraph Company, At&T Bell Laboratories | Pressure-sensitive light pen |
GB2245708A (en) | 1990-06-29 | 1992-01-08 | Philips Electronic Associated | Touch sensor array systems |
US5117071A (en) | 1990-10-31 | 1992-05-26 | International Business Machines Corporation | Stylus sensing system |
JPH0736142B2 (en) * | 1991-10-10 | 1995-04-19 | インターナショナル・ビジネス・マシーンズ・コーポレイション | Method and information processing apparatus for recognizing movement stop of movement instruction means |
US5610629A (en) | 1991-12-06 | 1997-03-11 | Ncr Corporation | Pen input to liquid crystal display |
JP3140837B2 (en) | 1992-05-29 | 2001-03-05 | シャープ株式会社 | Input integrated display |
US5488204A (en) | 1992-06-08 | 1996-01-30 | Synaptics, Incorporated | Paintbrush stylus for capacitive touch sensor pad |
US5880411A (en) | 1992-06-08 | 1999-03-09 | Synaptics, Incorporated | Object position detector with edge motion feature and gesture recognition |
KR950004378B1 (en) | 1992-09-09 | 1995-04-28 | 주식회사금성사 | Lcd cell and manufacturing method of situation sensing |
US6133906A (en) | 1993-03-15 | 2000-10-17 | Microtouch Systems, Inc. | Display-integrated stylus detection system |
EP0618527B1 (en) | 1993-03-29 | 1999-09-29 | NCR International, Inc. | Input means for liquid crystal display |
JP3358744B2 (en) | 1993-05-06 | 2002-12-24 | シャープ株式会社 | Liquid crystal display |
GB2295017A (en) | 1994-11-08 | 1996-05-15 | Ibm | Touch sensor input system for a computer display |
FI103837B1 (en) | 1994-12-22 | 1999-09-30 | Nokia Mobile Phones Ltd | Method of transmission and processing |
US6363164B1 (en) * | 1996-05-13 | 2002-03-26 | Cummins-Allison Corp. | Automated document processing system using full image scanning |
GB9516441D0 (en) | 1995-08-10 | 1995-10-11 | Philips Electronics Uk Ltd | Light pen input systems |
JPH09106320A (en) | 1995-08-24 | 1997-04-22 | Symbios Logic Inc | Apparatus and method for input of graphic |
US5914708A (en) | 1996-04-04 | 1999-06-22 | Cirque Corporation | Computer input stylus method and apparatus |
JP3876942B2 (en) | 1997-06-13 | 2007-02-07 | 株式会社ワコム | Optical digitizer |
GB9722766D0 (en) | 1997-10-28 | 1997-12-24 | British Telecomm | Portable computers |
CA2318815C (en) | 1998-01-26 | 2004-08-10 | Wayne Westerman | Method and apparatus for integrating manual input |
US8479122B2 (en) | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
US7663607B2 (en) | 2004-05-06 | 2010-02-16 | Apple Inc. | Multipoint touchscreen |
US6188391B1 (en) | 1998-07-09 | 2001-02-13 | Synaptics, Inc. | Two-layer capacitive touchpad and method of making same |
US7268774B2 (en) | 1998-08-18 | 2007-09-11 | Candledragon, Inc. | Tracking motion of a writing instrument |
JP4016526B2 (en) | 1998-09-08 | 2007-12-05 | 富士ゼロックス株式会社 | 3D object identification device |
JP4542637B2 (en) | 1998-11-25 | 2010-09-15 | セイコーエプソン株式会社 | Portable information device and information storage medium |
ATE250784T1 (en) | 1998-11-27 | 2003-10-15 | Synaptics Uk Ltd | POSITION SENSOR |
US6597348B1 (en) | 1998-12-28 | 2003-07-22 | Semiconductor Energy Laboratory Co., Ltd. | Information-processing device |
US6681034B1 (en) | 1999-07-15 | 2004-01-20 | Precise Biometrics | Method and system for fingerprint template matching |
US6504530B1 (en) | 1999-09-07 | 2003-01-07 | Elo Touchsystems, Inc. | Touch confirming touchscreen utilizing plural touch sensors |
US6529189B1 (en) | 2000-02-08 | 2003-03-04 | International Business Machines Corporation | Touch screen stylus with IR-coupled selection buttons |
US6803906B1 (en) | 2000-07-05 | 2004-10-12 | Smart Technologies, Inc. | Passive touch system and method of detecting user input |
AU2002217577A1 (en) | 2000-12-15 | 2002-06-24 | Finger System Inc. | Pen type optical mouse device and method of controlling the same |
WO2002093467A1 (en) | 2001-05-11 | 2002-11-21 | Anoto Ab | Electronic pen with actuation through removal of cap |
JP2003173237A (en) | 2001-09-28 | 2003-06-20 | Ricoh Co Ltd | Information input-output system, program and storage medium |
US7009663B2 (en) | 2003-12-17 | 2006-03-07 | Planar Systems, Inc. | Integrated optical light sensitive active matrix liquid crystal display |
US20030214481A1 (en) * | 2002-05-14 | 2003-11-20 | Yongming Xiong | Finger worn and operated input device and method of use |
US6894683B2 (en) | 2002-07-10 | 2005-05-17 | Intel Corporation | Multi-mouse actions stylus |
US7292229B2 (en) | 2002-08-29 | 2007-11-06 | N-Trig Ltd. | Transparent digitiser |
KR100459230B1 (en) | 2002-11-14 | 2004-12-03 | 엘지.필립스 엘시디 주식회사 | touch panel for display device |
CN102156571B (en) | 2003-02-10 | 2013-07-24 | N-特莱格有限公司 | Touch detection for a digitizer |
US7649527B2 (en) | 2003-09-08 | 2010-01-19 | Samsung Electronics Co., Ltd. | Image display system with light pen |
JP2005149140A (en) | 2003-11-14 | 2005-06-09 | Wacom Co Ltd | Position detector and position indicator |
JP2006039686A (en) * | 2004-07-22 | 2006-02-09 | Pioneer Electronic Corp | Touch panel device, touch region detecting method, and touch region detecting program |
JP2008522183A (en) | 2004-12-01 | 2008-06-26 | エヌ−トリグ リミテッド | Position detection system and apparatus and method for its use and control |
US7646377B2 (en) | 2005-05-06 | 2010-01-12 | 3M Innovative Properties Company | Position digitizing using an optical stylus to image a display |
US7612767B1 (en) | 2005-08-24 | 2009-11-03 | Griffin Technology, Inc. | Trackpad pen for use with computer touchpad |
US8997015B2 (en) * | 2006-09-28 | 2015-03-31 | Kyocera Corporation | Portable terminal and control method therefor |
US8134542B2 (en) | 2006-12-20 | 2012-03-13 | 3M Innovative Properties Company | Untethered stylus employing separate communication and power channels |
US8040329B2 (en) | 2006-12-20 | 2011-10-18 | 3M Innovative Properties Company | Frequency control circuit for tuning a resonant circuit of an untethered device |
US8243049B2 (en) | 2006-12-20 | 2012-08-14 | 3M Innovative Properties Company | Untethered stylus employing low current power converter |
US8130203B2 (en) | 2007-01-03 | 2012-03-06 | Apple Inc. | Multi-touch input discrimination |
US20080297487A1 (en) | 2007-01-03 | 2008-12-04 | Apple Inc. | Display integrated photodiode matrix |
US8493331B2 (en) | 2007-06-13 | 2013-07-23 | Apple Inc. | Touch detection using multiple simultaneous frequencies |
US20090095540A1 (en) * | 2007-10-11 | 2009-04-16 | N-Trig Ltd. | Method for palm touch identification in multi-touch digitizing systems |
US8125469B2 (en) | 2008-04-18 | 2012-02-28 | Synaptics, Inc. | Passive stylus for capacitive sensors |
EP2282254A1 (en) | 2008-05-12 | 2011-02-09 | Sharp Kabushiki Kaisha | Display device and control method |
JP4609543B2 (en) | 2008-07-25 | 2011-01-12 | ソニー株式会社 | Information processing apparatus and information processing method |
US8536471B2 (en) | 2008-08-25 | 2013-09-17 | N-Trig Ltd. | Pressure sensitive stylus for a digitizer |
JP2010067117A (en) | 2008-09-12 | 2010-03-25 | Mitsubishi Electric Corp | Touch panel device |
US8278571B2 (en) | 2009-04-03 | 2012-10-02 | Pixart Imaging Inc. | Capacitive touchscreen or touchpad for finger and active stylus |
US8154529B2 (en) | 2009-05-14 | 2012-04-10 | Atmel Corporation | Two-dimensional touch sensors |
US9417738B2 (en) | 2009-06-12 | 2016-08-16 | Synaptics Incorporated | Untethered active pen and a method for communicating with a capacitive sensing device using the untethered active pen |
EP2460542A4 (en) | 2009-07-31 | 2015-02-18 | Riken | Pancreatic endocrine cell indicator and utilization of same |
US8535133B2 (en) * | 2009-11-16 | 2013-09-17 | Broadcom Corporation | Video game with controller sensing player inappropriate activity |
JP2012022543A (en) * | 2010-07-15 | 2012-02-02 | Panasonic Corp | Touch panel system |
US20120013565A1 (en) | 2010-07-16 | 2012-01-19 | Perceptive Pixel Inc. | Techniques for Locally Improving Signal to Noise in a Capacitive Touch Sensor |
US20120177567A1 (en) | 2010-08-11 | 2012-07-12 | National Institutes Of Health (Nih), U.S. Dept. Of Health And Human Services (Dhhs), U.S. Government | Methods of Treating Pediatric Acute Lymphoblastic Leukemia with an Anti-CD22 Immunotoxin |
US9176630B2 (en) * | 2010-08-30 | 2015-11-03 | Perceptive Pixel, Inc. | Localizing an electrostatic stylus within a capacitive touch sensor |
US9239637B2 (en) | 2010-08-30 | 2016-01-19 | Perceptive Pixel, Inc. | Systems for an electrostatic stylus within a capacitive touch sensor |
EP2619644A1 (en) | 2010-09-22 | 2013-07-31 | Cypress Semiconductor Corporation | Capacitive stylus for a touch screen |
US8988398B2 (en) | 2011-02-11 | 2015-03-24 | Microsoft Corporation | Multi-touch input device with orientation sensing |
TWI464402B (en) | 2011-01-12 | 2014-12-11 | Univ Kaohsiung Medical | Method for conjugating nucleic acid and small molecular |
JP5350437B2 (en) * | 2011-06-27 | 2013-11-27 | シャープ株式会社 | Touch sensor system |
US9329703B2 (en) | 2011-06-22 | 2016-05-03 | Apple Inc. | Intelligent stylus |
US8928635B2 (en) | 2011-06-22 | 2015-01-06 | Apple Inc. | Active stylus |
US8638320B2 (en) | 2011-06-22 | 2014-01-28 | Apple Inc. | Stylus orientation detection |
US9110523B2 (en) | 2011-09-08 | 2015-08-18 | JCM Electronics Stylus, LLC | Stylus and stylus circuitry for capacitive touch screens |
US9690431B2 (en) * | 2011-10-28 | 2017-06-27 | Atmel Corporation | Locking active stylus and touch-sensor device |
JP5296185B2 (en) * | 2011-12-21 | 2013-09-25 | シャープ株式会社 | Touch sensor system |
EP2845083A4 (en) | 2012-04-29 | 2016-01-13 | Jcm Electronic Stylus Llc | Stylus and stylus circuitry for capacitive touch screens |
KR101989408B1 (en) * | 2012-10-25 | 2019-06-14 | 삼성전자 주식회사 | Touch input apparatus using delay component and touch input method thereof and touch input system and method |
2013
- 2013-08-29 US US14/014,282 patent/US20140168141A1/en not_active Abandoned
- 2013-08-29 US US14/014,274 patent/US9367185B2/en active Active
- 2013-08-29 US US14/014,283 patent/US9367186B2/en active Active
- 2013-08-29 US US14/014,277 patent/US20140168140A1/en not_active Abandoned
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5115107A (en) * | 1991-01-11 | 1992-05-19 | Ncr Corporation | Method of correcting skew between a digitizer and a digital display |
US20080236902A1 (en) * | 2007-03-28 | 2008-10-02 | Oki Data Corporation | Handwriting input system |
US20100141589A1 (en) * | 2008-12-09 | 2010-06-10 | Microsoft Corporation | Touch input interpretation |
US20110291998A1 (en) * | 2010-05-27 | 2011-12-01 | Guy Adams | Calibrating a Digital Stylus |
US20110304577A1 (en) * | 2010-06-11 | 2011-12-15 | Sp Controls, Inc. | Capacitive touch screen stylus |
US20120262407A1 (en) * | 2010-12-17 | 2012-10-18 | Microsoft Corporation | Touch and stylus discrimination and rejection for contact sensitive computing devices |
US20130106722A1 (en) * | 2011-10-28 | 2013-05-02 | Shahrooz Shahparnia | Pulse- Or Frame-Based Communication Using Active Stylus |
US20130300696A1 (en) * | 2012-05-14 | 2013-11-14 | N-Trig Ltd. | Method for identifying palm input to a digitizer |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170097720A1 (en) * | 2015-10-05 | 2017-04-06 | Samsung Electronics Co., Ltd | Electronic device and method for identifying input made by external device of electronic device |
US10386954B2 (en) * | 2015-10-05 | 2019-08-20 | Samsung Electronics Co., Ltd. | Electronic device and method for identifying input made by external device of electronic device |
US11340780B2 (en) * | 2018-07-11 | 2022-05-24 | Samsung Electronics Co., Ltd. | Electronic device and method for performing function of electronic device |
EP4068058A4 (en) * | 2019-12-25 | 2024-01-03 | Shenzhen Hitevision Technology Co., Ltd. | Handwriting generation method and apparatus, storage medium, electronic device, and system |
CN114237414A (en) * | 2020-09-09 | 2022-03-25 | 川奇光电科技(扬州)有限公司 | Touch display device and sensing method thereof |
US11698698B2 (en) | 2020-09-09 | 2023-07-11 | E Ink Holdings Inc. | Touch display apparatus and sensing method of the same for identifying different touch sources and reducing power consumption |
US12050750B2 (en) | 2021-12-28 | 2024-07-30 | Lx Semicon Co., Ltd. | Touch sensing device |
Also Published As
Publication number | Publication date |
---|---|
US20140168142A1 (en) | 2014-06-19 |
US9367185B2 (en) | 2016-06-14 |
US20140168140A1 (en) | 2014-06-19 |
US9367186B2 (en) | 2016-06-14 |
US20140168116A1 (en) | 2014-06-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9367186B2 (en) | Method and system for discriminating stylus and touch interactions | |
US9158393B2 (en) | Active stylus for touch sensing applications | |
AU2018282404B2 (en) | Touch-sensitive button | |
EP3274793B1 (en) | Stylus having a plurality of operating portions configured to transmit synchronized signals | |
EP3241099B1 (en) | Touch down detection with a stylus | |
TWI726357B (en) | Storage medium, timing synchronization method and stylus | |
US9116571B2 (en) | Method and system of data input for an electronic device equipped with a touch screen | |
CN108780369B (en) | Method and apparatus for soft touch detection of stylus | |
TWI531934B (en) | Stylus pen, synchronization system and method thereof for touch detection | |
WO2018193711A1 (en) | Touch sensor-type electronic device and sensor control method | |
CN114721557A (en) | Stylus hover and location communication protocol | |
WO2015131675A1 (en) | Compensation method for broken slide paths, electronic device and computer storage medium | |
US20170315631A1 (en) | System and method for multimode stylus | |
EP3676692B1 (en) | Selective scanning for touch-sensitive display device | |
US10353493B2 (en) | Apparatus and method of pen detection at a digitizer | |
KR20170095285A (en) | Stylus with a dynamic transmission protocol | |
CN104079024A (en) | Active type touch pen and charging method thereof | |
WO2015096007A1 (en) | Active capacitive pen, and touch detection and feedback driving methods therefor | |
WO2014061020A1 (en) | Digitizer system with stylus housing station | |
CN111813257A (en) | Touch processor, touch device, touch system and touch method | |
US9069430B2 (en) | Fail safe design for a stylus that is used with a touch sensor | |
KR20230069131A (en) | Fingerprint detection guide based on user feedback | |
JP7317730B2 (en) | Information processing device, information processing method and information processing program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: LOGITECH EUROPE S.A., SWITZERLAND
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SASSELLI, NICOLAS;SALAMIN, PATRICK;VLASOV, MAXIM;AND OTHERS;SIGNING DATES FROM 20130819 TO 20130826;REEL/FRAME:031121/0525
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |