KR20160123200A - Touch input processing method and electronic device supporting the same - Google Patents
- Publication number
- KR20160123200A (application KR1020150065395A)
- Authority
- KR
- South Korea
- Prior art keywords
- touch
- state
- electronic device
- sensor
- information
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
An electronic device is disclosed that includes a first touch sensor for sensing a touch object to collect sensor information, a processor for determining a state of the touch object corresponding to the sensor information, and a second touch sensor whose touch sensitivity is adjusted according to the state of the touch object. Various other embodiments identified in the specification are also possible.
Description
Various embodiments of the invention relate to a method of processing touch input.
Electronic devices support touch-based input devices such as touch screens, touch pads, or touch keys as part of a user interface (UI). Such touch-type input devices are implemented in various ways, for example as a capacitive type, a resistive (pressure-sensitive) type, an infrared type, or an ultrasonic type. For example, a capacitive touch input method can recognize a touch by detecting a change in capacitance caused by a conductive object such as a user's finger or a stylus pen.
However, the capacitive touch input method described above can malfunction when a factor is present that changes the capacitance of the conductive object (the touch object). For example, when a finger on which water or sweat is present, or a finger wearing a glove, contacts the device, the capacitive touch input device may not correctly recognize the touch of the touch object, or may recognize an unintended point as the contact point. Likewise, touch input methods such as the resistive, infrared, or ultrasonic types may malfunction when water or sweat is present on a touch object such as a finger, or when gloves are worn.
Various embodiments of the present invention provide a touch input processing method that determines the state of a touch object from sensor information collected as the touch object approaches or contacts a sensor and processes the touch input according to that state, and an electronic device supporting the same.
An electronic device according to various embodiments of the present invention includes a first touch sensor for sensing a touch object to collect sensor information, a processor for determining a state of the touch object corresponding to the sensor information, and a second touch sensor whose touch sensitivity is adjusted according to the state of the touch object.
According to various embodiments of the present invention, malfunction of the touch input device can be prevented by processing the touch input according to the state of the touch object.
FIG. 1 schematically shows a configuration of an electronic device related to touch input processing according to various embodiments.
FIG. 2 shows a block diagram of an electronic device associated with touch input processing according to various embodiments.
FIG. 3 illustrates a method of operating an electronic device associated with a method for processing a touch input in response to sensor information according to various embodiments.
FIG. 4 illustrates a method of operating an electronic device related to a method of setting a touch function using sensor information according to various embodiments.
FIG. 5 illustrates a method of operating an electronic device related to a method of setting a touch function using a touch function selection object according to various embodiments.
FIG. 6 illustrates an embodiment of determining the state of a touch object based on a fingerprint sensor according to various embodiments.
FIG. 7 shows a touch function selection object according to various embodiments.
FIG. 8 shows an embodiment for adjusting the touch sensitivity according to the state of the touch object according to various embodiments.
FIG. 9 shows an embodiment for adjusting the output state of a display object according to the state of a touch object according to various embodiments.
FIG. 10 illustrates a method of operating an electronic device associated with a method for processing a touch input based on a fingerprint sensor according to various embodiments.
FIG. 11 shows a finger status event table according to various embodiments.
FIG. 12 shows a diagram for explaining a finger status event corresponding to a fingerprint recognition event according to various embodiments.
FIG. 13 illustrates a method of operating an electronic device associated with touch input processing according to various embodiments.
FIG. 14 illustrates an electronic device in a network environment in accordance with various embodiments.
FIG. 15 shows a block diagram of an electronic device according to various embodiments.
FIG. 16 shows a block diagram of a program module according to various embodiments.
Hereinafter, various embodiments of the present document will be described with reference to the accompanying drawings. It should be understood, however, that this invention is not intended to be limited to the particular embodiments described herein but includes various modifications, equivalents, and/or alternatives of the embodiments of this document. In connection with the description of the drawings, like reference numerals may be used for similar components.
In this document, the expressions "have," "may," "include," or "include" may be used to denote the presence of a feature (eg, a numerical value, a function, Quot ;, and does not exclude the presence of additional features.
In this document, the expressions "A or B," "at least one of A and / or B," or "one or more of A and / or B," etc. may include all possible combinations of the listed items . For example, "A or B," "at least one of A and B," or "at least one of A or B" includes (1) at least one A, (2) Or (3) at least one A and at least one B all together.
The expressions "first," " second, "" first, " or "second ", etc. used in this document may describe various components, It is used to distinguish the components and does not limit the components. For example, the first user equipment and the second user equipment may represent different user equipment, regardless of order or importance. For example, without departing from the scope of the rights described in this document, the first component can be named as the second component, and similarly the second component can also be named as the first component.
When a component (e.g., a first component) is referred to as being "(operatively or communicatively) coupled with/to" or "connected to" another component (e.g., a second component), it may be directly connected to the other component or connected through yet another component (e.g., a third component). In contrast, when a component (e.g., a first component) is referred to as being "directly coupled" or "directly connected" to another component (e.g., a second component), it can be understood that no other component (e.g., a third component) exists between them.
As used herein, the phrase "configured to (or set to)" may be used interchangeably with, for example, "suitable for," "having the capacity to," "designed to," "adapted to," "made to," or "capable of," depending on the situation. The term "configured to (or set to)" does not necessarily mean "specifically designed to" in hardware. Instead, in some situations, the expression "a device configured to" may mean that the device "can" perform an operation together with other devices or components. For example, "a processor configured (or set) to perform A, B, and C" may mean a dedicated processor (e.g., an embedded processor) for performing the corresponding operations, or a generic-purpose processor (e.g., a CPU or an application processor) capable of performing the corresponding operations by executing one or more software programs stored in a memory device.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the scope of other embodiments. Singular expressions may include plural expressions unless the context clearly indicates otherwise. Terms used herein, including technical or scientific terms, may have the same meanings as commonly understood by one of ordinary skill in the art. Generally used terms defined in dictionaries may be interpreted as having the same or similar meanings as their contextual meanings in the related art and, unless expressly defined in this document, are not to be interpreted in an ideal or excessively formal sense. In some cases, even terms defined in this document cannot be interpreted to exclude the embodiments of this document.
An electronic device in accordance with various embodiments of this document may be, for example, a smartphone, a tablet personal computer (PC), a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop PC, a netbook computer, a workstation, a server, a personal digital assistant (PDA), a portable multimedia player (PMP), a mobile medical device, a camera, or a wearable device. According to various embodiments, the wearable device may be of an accessory type (e.g., a watch, a ring, a bracelet, an anklet, a necklace, glasses, a contact lens, or a head-mounted device (HMD)), a fabric- or clothing-integrated type (e.g., electronic apparel), a body-attached type (e.g., a skin pad or a tattoo), or a bio-implantable type (e.g., an implantable circuit).
In some embodiments, the electronic device may be a home appliance. Home appliances may include, for example, a television, a digital video disc (DVD) player, an audio system, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washing machine, an air cleaner, a set-top box, a home automation control panel, a security control panel, a TV box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a game console (e.g., Xbox™ or PlayStation™), an electronic dictionary, an electronic key, a camcorder, or an electronic picture frame.
In another embodiment, the electronic device may be any of a variety of medical devices (e.g., portable medical measurement devices such as a blood glucose meter, a heart rate meter, a blood pressure meter, or a body temperature meter, or a magnetic resonance angiography (MRA) device), a navigation system, a global navigation satellite system (GNSS), an event data recorder (EDR), a flight data recorder (FDR), a vehicle infotainment system, marine electronic equipment (e.g., a marine navigation system or a gyro compass), avionics, a security device, a head unit for a vehicle, an industrial or home robot, an automatic teller machine (ATM) of a financial institution, a point of sale (POS) terminal of a store, or an Internet of Things device (e.g., a light bulb, various sensors, an electricity or gas meter, a sprinkler device, a fire alarm, a thermostat, a street light, a toaster, fitness equipment, a hot water tank, a heater, or a boiler).
According to some embodiments, the electronic device may be a part of furniture or a building/structure, an electronic board, an electronic signature receiving device, a projector, or any of various measuring instruments (e.g., instruments for measuring water, electricity, gas, or radio waves). In various embodiments, the electronic device may be a combination of one or more of the various devices described above. An electronic device according to some embodiments may be a flexible electronic device. Further, the electronic device according to the embodiments of this document is not limited to the above-described devices and may include new electronic devices according to technological advancement.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS An electronic apparatus according to various embodiments will now be described with reference to the accompanying drawings. In this document, the term user may refer to a person using an electronic device or a device using an electronic device (e.g., an artificial intelligence electronic device).
FIG. 1 schematically shows a configuration of an electronic device related to touch input processing according to various embodiments.
In order to perform the above-described functions, the electronic device may include components such as a first touch sensor, a processor, and a second touch sensor.
The first touch sensor 110 may collect sensor information corresponding to the approach or contact of a touch object. For example, the first touch sensor 110 may collect touch information calculated from a change in capacitance or a change in pressure corresponding to the approach or contact of the touch object. According to one embodiment, the first touch sensor 110 may include a fingerprint recognition sensor (fingerprint sensor), a tactile sensor, or a pH concentration sensor. For example, when the first touch sensor 110 includes the fingerprint recognition sensor, it may collect fingerprint information such as the length, direction, or specific points of the ridges included in the user's fingerprint (e.g., points where ridges split or join). In addition, when the first touch sensor 110 includes the tactile sensor, the first touch sensor 110 may collect contact state information such as the intensity, direction, or pressure distribution of the contact force corresponding to the touch of the touch object. According to various embodiments, when the first touch sensor 110 includes the pH concentration sensor, the first touch sensor 110 may collect information such as a pH concentration distribution on the contact surface of the touch object. In addition, the first touch sensor 110 may transmit the collected sensor information to the processor.
One or more sensors included in the first touch sensor 110 may be disposed in a certain area of the electronic device.
The second touch sensor 150 may include the same or similar components as the first touch sensor 110. For example, the second touch sensor 150 may collect sensor information corresponding to approach or contact of a touch object. According to various embodiments, the second touch sensor 150 may be formed in the form of a panel and may be included in a touch screen panel (TSP).
The steady state information may be sensor information collected, based on the first touch sensor 110, in a state (normal state) in which no foreign matter (e.g., water, sweat, dust, sand, soil, or gloves) is present between the first touch sensor 110 and the touch object. For example, the steady state information may be fingerprint information of the user collected based on the fingerprint recognition sensor in a state in which the finger is not exposed to water, sweat, dust, sand, or soil, and in which the user does not wear a glove. The steady state information may also be contact state information, pH concentration distribution information, or touch information collected in the steady state. According to various embodiments, the steady state information may be information collected at an initial operation (e.g., a cold start) of the electronic device.
When the state of the touch object is abnormal, the setting of the touch function may be changed, for example by adjusting the touch sensitivity of the second touch sensor 150 according to the determined state.
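Purely as an illustration of the kind of comparison described above (not the patent's actual implementation), the following Kotlin sketch classifies a touch-object state by comparing collected sensor information against stored steady state information and stored foreign-matter patterns; the class names, fields, distance metric, and tolerance are all assumptions.

```kotlin
import kotlin.math.sqrt

// Illustrative sketch only; names, fields, and thresholds are assumptions.
enum class TouchObjectState { NORMAL, WATER, GLOVE, UNKNOWN }

data class SensorReading(val conductivity: Double, val phLevel: Double, val contactPressure: Double)

class TouchStateClassifier(
    private val steadyState: SensorReading,                            // collected with no foreign matter present
    private val foreignPatterns: Map<TouchObjectState, SensorReading>  // e.g. stored water / glove patterns
) {
    // Simple Euclidean distance between a measurement and a stored pattern (an assumed metric).
    private fun distance(a: SensorReading, b: SensorReading): Double {
        val dc = a.conductivity - b.conductivity
        val dp = a.phLevel - b.phLevel
        val df = a.contactPressure - b.contactPressure
        return sqrt(dc * dc + dp * dp + df * df)
    }

    fun classify(measured: SensorReading, normalTolerance: Double = 0.1): TouchObjectState {
        // Close to the steady state: treat the touch object as normal.
        if (distance(measured, steadyState) <= normalTolerance) return TouchObjectState.NORMAL
        // Otherwise report the closest stored foreign-matter pattern, if any.
        return foreignPatterns.minByOrNull { distance(measured, it.value) }?.key ?: TouchObjectState.UNKNOWN
    }
}
```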
According to various embodiments, the state of the touch object may also be determined by comparing the sensor information with pattern information of a foreign object.
In this regard, the pattern information of the foreign object may be information obtained by sensing the foreign object with one or more sensors included in the first touch sensor 110. For example, the pattern information of the foreign object may be sensor information of the foreign object collected while the fingerprint recognition function is performed by the fingerprint recognition sensor. In addition, the pattern information of the foreign matter may include information such as pH concentration information of the foreign matter, surface state information, or electric conductivity.
According to various embodiments, the pattern information of the foreign matter may be frequency spectrum information of the foreign matter. In this case, the state of the touch object may be determined by comparing frequency spectrum information included in the sensor information with the frequency spectrum information of the foreign matter.
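As a hedged illustration of such a spectrum comparison (the metric and threshold are assumptions, not taken from the patent), one could score the similarity between a measured spectrum and a stored foreign-matter spectrum and treat a high score as a match:

```kotlin
import kotlin.math.sqrt

// Hypothetical sketch: normalized correlation between a measured frequency spectrum
// and a stored foreign-matter spectrum; the metric and threshold are assumptions.
fun spectrumSimilarity(measured: DoubleArray, stored: DoubleArray): Double {
    require(measured.size == stored.size) { "spectra must have the same number of bins" }
    val mMean = measured.average()
    val sMean = stored.average()
    var num = 0.0; var mVar = 0.0; var sVar = 0.0
    for (i in measured.indices) {
        val dm = measured[i] - mMean
        val ds = stored[i] - sMean
        num += dm * ds
        mVar += dm * dm
        sVar += ds * ds
    }
    return if (mVar == 0.0 || sVar == 0.0) 0.0 else num / sqrt(mVar * sVar)
}

// Treat the touch object as contaminated when the spectra are strongly correlated.
fun matchesForeignMatter(measured: DoubleArray, stored: DoubleArray, threshold: Double = 0.9): Boolean =
    spectrumSimilarity(measured, stored) >= threshold
```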
FIG. 2 shows a block diagram of an electronic device associated with touch input processing according to various embodiments.
Referring to FIG. 2, the electronic device may include a sensor (e.g., a first touch sensor), a processor, a touch input device (e.g., a second touch sensor), and a display.
According to various embodiments, the function of determining and updating the state of the touch object described above may be performed based on a plurality of sensors. For example, when the state of the touch object is first determined, the touch object state determination function may be performed using fingerprint information collected based on the fingerprint recognition sensor; thereafter, when the state of the touch object is updated, the touch object state update function may be performed using touch information collected as the touch object touches the touch sensor. Accordingly, the user can omit the separate operation of bringing the finger close to or into contact with the fingerprint recognition sensor in order to update the state of the touch object.
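The two-stage flow above (an initial determination from fingerprint data, later updates from ordinary touch data) could be organized roughly as in the sketch below; the state names, thresholds, and update interval are assumptions made only for illustration.

```kotlin
// Hypothetical sketch of a two-stage determination/update flow; values are assumptions.
class TouchStateTracker(private val updateIntervalMs: Long = 5_000L) {
    var state: String = "UNKNOWN"
        private set
    private var lastDeterminedAt: Long = 0L

    // First determination, driven by fingerprint-sensor data.
    fun determineFromFingerprint(fingerprintQuality: Double, nowMs: Long) {
        state = if (fingerprintQuality > 0.8) "NORMAL" else "ABNORMAL"          // assumed quality threshold
        lastDeterminedAt = nowMs
    }

    // Later updates, driven by touch-sensor data, so the user does not need to
    // touch the fingerprint sensor again just to refresh the state.
    fun maybeUpdateFromTouch(capacitanceDelta: Double, nowMs: Long) {
        if (nowMs - lastDeterminedAt < updateIntervalMs) return                 // keep the earlier result
        state = if (capacitanceDelta in 30.0..120.0) "NORMAL" else "ABNORMAL"   // assumed range
        lastDeterminedAt = nowMs
    }
}
```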
In this regard, the touch function may include a function of determining the validity of the touch input by changing the touch sensitivity of the touch input device (e.g., the second touch sensor).
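For illustration only, a per-state touch-function configuration of the kind described above might bundle a sensitivity threshold together with a display-object scale factor; the field names and numbers below are assumptions.

```kotlin
// Hypothetical per-state touch-function settings; all values are assumptions.
data class TouchFunction(
    val sensitivityThreshold: Int,   // minimum signal change accepted as a valid touch
    val displayObjectScale: Float    // scale factor applied to touch-selectable display objects
)

val touchFunctions: Map<String, TouchFunction> = mapOf(
    "NORMAL" to TouchFunction(sensitivityThreshold = 50,  displayObjectScale = 1.0f),
    "WATER"  to TouchFunction(sensitivityThreshold = 120, displayObjectScale = 1.3f),  // e.g. a water film function
    "GLOVE"  to TouchFunction(sensitivityThreshold = 20,  displayObjectScale = 1.5f)   // e.g. a glove state
)
```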
The display control device 260 can perform control and data processing of the display.
According to various embodiments, the display control device 260 may receive a touch object state event (or a corresponding command) from the processor and adjust the output state of the display objects accordingly.
According to various embodiments, the display control device 260 may receive information of a touch function selection object including items corresponding to various states of the touch object from the processor and control the display to output the touch function selection object.
According to various embodiments, the display control device 260 may control the
According to various embodiments, instead of adjusting the touch settings by communicating a touch object state event (or a corresponding command) to the
As described above, according to various embodiments, an electronic device (e.g., electronic device 200) may include a first touch sensor (e.g., sensor 210) that senses a touch object and collects sensor information, a processor (e.g., processor 220) that determines the state of the touch object corresponding to the sensor information, and a second touch sensor (e.g., touch input device 280) whose touch sensitivity is adjusted according to the state of the touch object.
According to various embodiments, the first touch sensor may include at least one of a fingerprint recognition sensor, a tactile sensor, a pH concentration sensor, or a touch sensor.
According to various embodiments, the processor may determine the state of the touch object by comparing the sensor information with at least one of steady state information corresponding to a state in which no foreign object is detected between the touch object and the first touch sensor, or pattern information of the foreign object including at least one of information obtained by sensing the foreign object based on the first touch sensor, pH concentration information of the foreign object, surface state information, electric conductivity information, or frequency spectrum information.
According to various embodiments, the processor may control to adjust the touch region of display objects output to the display (e.g., display 290) according to the state of the touch object.
According to various embodiments, the processor may control to change at least one of the size or position of the display objects to correspond to the touch region.
According to various embodiments, the processor can control to display on the display a touch function selection object that includes at least one item corresponding to the state of the touch object.
According to various embodiments, the processor may control the touch function selection object to be displayed at at least one of: a time when the screen of the electronic device switches from a turned-off state to a turned-on state, a time when the touch input processing function switches from a turned-off state to a turned-on state, a time when a specific physical button included in the electronic device is selected, a time when a specific application program included in the electronic device is executed or requests it, a time when the touch object moves by a predetermined distance in a certain direction while pressing a certain area of the screen, or a time when the electronic device moves or rotates in a predetermined direction a predetermined number of times at a predetermined interval.
According to various embodiments, the processor may update the state of the touch object based on sensor information collected by sensing the touch object at a point of time when a predetermined time has elapsed from the point of time at which the state of the touch object was determined, and the touch sensitivity of the second touch sensor may be adjusted according to the updated state of the touch object.
According to various embodiments, the processor may control an icon whose shape, color, or size is set to a different image according to the state of the touch object to be displayed in a certain area of the display.
According to various embodiments, when the state of the touch object is changed, the processor may control the output of an object including at least one of a text, an image, or an icon related to the change of the touch object state, and/or the output of voice information related to the change.
FIG. 3 illustrates a method of operating an electronic device associated with a method for processing a touch input in response to sensor information according to various embodiments.
Referring to FIG. 3, the electronic device may receive sensor information collected by sensing the approach or contact of a touch object based on a sensor (e.g., the first touch sensor).
With respect to the operation of receiving the sensor information, the electronic device may receive the sensor information at a specific time point that satisfies a specific condition. According to one embodiment, the electronic device can receive fingerprint information collected at the time of recognizing the user's fingerprint with respect to the fingerprint recognition function from the fingerprint recognition sensor. According to various embodiments, the electronic device may receive the sensor information corresponding to the first touch operation of the sensed touch object based on the touch sensor. In addition, the electronic device may receive the sensor information corresponding to the sensed touch operation at a point of time longer than a specified time after the first touch operation.
Upon receipt of the sensor information, the electronic device may analyze the sensor information to confirm the status of the touch object.
According to various embodiments, the fingerprint recognition sensor can internally analyze the fingerprint information of the user to determine the state of the user's finger. Further, the fingerprint recognition sensor may transmit a finger status event (or information corresponding thereto) corresponding to the finger state of the user to the electronic device, and the electronic device may confirm the finger state based on the received finger status event (or the corresponding information). Alternatively, the fingerprint recognition sensor may transmit a fingerprint recognition event generated in the process of collecting the user's fingerprint information to the electronic device. In this case, the electronic device can manage the fingerprint recognition event by mapping it to a finger state event (or information corresponding thereto), and in this process the electronic device can confirm the finger state based on the finger state event (or the corresponding information).
According to various embodiments, if the state of the touch object is a normal state, the electronic device may process the touch input according to the currently set (e.g., default) touch function.
The electronic device may then determine whether the state of the touch object has changed.
If the state of the touch object is changed, the electronic device may change the setting of the touch function according to the changed state, for example by adjusting the touch sensitivity of the touch input device or the output state of the display objects.
If the state of the touch object is not changed, the electronic device can maintain the setting of the previously set touch function. For example, the electronic device can maintain the touch sensitivity of the touch input device as it is, and can maintain the output state of the display object as it is.
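A self-contained sketch of this branch (change the settings only when the determined state differs from the previous one, otherwise keep them) might look as follows; the states and the actions taken per state are assumptions used only to make the flow concrete.

```kotlin
// Hypothetical sketch of the FIG. 3 branch; states and actions are assumptions.
enum class FingerState { NORMAL, WET, GLOVED }

class TouchInputController(private var currentState: FingerState = FingerState.NORMAL) {
    fun onStateDetermined(determinedState: FingerState) {
        if (determinedState == currentState) return   // state unchanged: keep existing settings
        currentState = determinedState                // state changed: reconfigure the touch function
        applyTouchFunction(determinedState)
    }

    private fun applyTouchFunction(state: FingerState) {
        when (state) {
            FingerState.NORMAL -> println("restore default touch sensitivity and layout")
            FingerState.WET    -> println("raise the touch threshold and enlarge display objects")
            FingerState.GLOVED -> println("lower the touch threshold (high-sensitivity mode)")
        }
    }
}
```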
According to various embodiments, the electronic device may control the above-described operations to be performed again at a later point in time (e.g., when new sensor information is received).
According to various embodiments, methods of finely categorizing the state of a touch object may include a method of using sensor information collected in response to the approach or contact of the touch object based on a sensor (e.g., the first touch sensor), and a method of receiving the state of the touch object directly from the user.
FIG. 4 illustrates a method of operating an electronic device related to a method of setting a touch function using sensor information according to various embodiments.
Referring to FIG. 4, the electronic device may collect sensor information by sensing the approach or contact of a touch object based on a sensor.
The electronic device may then analyze the collected sensor information and classify the state of the touch object.
Once the state of the touch object is classified, the electronic device may designate a touch function corresponding to the classified state.
If a touch function is designated, the electronic device can adjust the touch input processing method according to the designated touch function.
According to various embodiments, the electronic device can adjust the touch region of the display objects according to the designated touch function. In addition, the electronic device can adjust the output state, such as the size or the position, of the display objects to correspond to the touch region. For example, if the touch function is designated as a water film function, the electronic device can display the display objects with their size adjusted at a predetermined ratio. As a result, the electronic device can prevent an unintended region from being selected due to water or sweat spread on the touch object.
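A minimal sketch of such a layout adjustment, assuming hypothetical object and function names and an arbitrary scale factor:

```kotlin
// Hypothetical sketch: enlarge display objects when a water-film style touch function is active.
data class DisplayObject(val id: String, var x: Int, var y: Int, var width: Int, var height: Int)

fun applyWaterFilmLayout(objects: List<DisplayObject>, scale: Double = 1.25) {
    for (obj in objects) {
        obj.width = (obj.width * scale).toInt()    // enlarge the touch-selectable area
        obj.height = (obj.height * scale).toInt()
    }
}
```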
As described above, instead of the method of finely classifying the state of the touch object using the sensor information collected based on the sensor, the electronic device can receive the state of the touch object from the user. For example, the electronic device can guide the user to select the state of the touch object by displaying a touch function selection object including items corresponding to various states of the touch object on the screen.
FIG. 5 illustrates a method of operating an electronic device related to a method of setting a touch function using a touch function selection object according to various embodiments.
Referring to FIG. 5, the electronic device may display a touch function selection object including items corresponding to various states of the touch object on the screen.
According to various embodiments, the act of displaying the touch function selection object on the screen may be performed at a specific time point that satisfies a specified condition, for example when the screen is turned on, when a specific physical button is selected, or when a specific application program is executed or requests it.
The electronic device may then receive a user input selecting one of the items included in the touch function selection object and confirm the selected touch function.
Once the selected touch function is confirmed, the electronic device can adjust the touch input processing method (e.g., the touch sensitivity or the output state of the display objects) according to the selected touch function.
As described above, according to various embodiments, a touch input processing method of an electronic device includes an operation of sensing a touch object based on a first touch sensor and collecting sensor information, an operation of determining a state of the touch object corresponding to the sensor information, and an operation of adjusting the touch sensitivity of a second touch sensor according to the state of the touch object.
According to various embodiments, the act of collecting the sensor information may include an operation of collecting the sensor information corresponding to the approach or contact of the touch object based on at least one of a fingerprint recognition sensor, a tactile sensor, a pH concentration sensor, or a touch sensor.
According to various embodiments, the operation of determining the state of the touch object may include an operation of comparing the sensor information with at least one of steady state information corresponding to a state in which no foreign object is detected between the touch object and the first touch sensor, or pattern information of the foreign object including at least one of information obtained by sensing the foreign object based on the first touch sensor, pH concentration information of the foreign object, surface state information, electric conductivity information, or frequency spectrum information.
According to various embodiments, the act of determining the state of the touch object may include controlling to display on the display a touch function selection object including at least one item corresponding to the state of the touch object.
According to various embodiments, the operation of controlling the touch function selection object to be displayed on the display may include an operation of controlling the object to be displayed at at least one of: a time when the screen of the electronic device changes from a turned-off state to a turned-on state, a time when the touch input processing function changes from a turned-off state to a turned-on state, a time when a specific physical button included in the electronic device is selected, a time when a specific application program included in the electronic device is executed or requests it, a time when the touch object moves by a predetermined distance in a certain direction while pressing a certain area of the screen, or a time when the electronic device moves or rotates in a predetermined direction a predetermined number of times at a predetermined interval.
According to various embodiments, the touch input processing method may further include an operation of controlling the touch area of the display objects output to the display according to the state of the touch object.
According to various embodiments, the operation of controlling the touch area may further include controlling to change at least one of the size or the position of the display objects to correspond to the touch area.
According to various embodiments, the touch input processing method may further include an operation of updating the state of the touch object based on sensor information obtained by sensing the touch object at a point of time when a predetermined time has elapsed from the point of time at which the state of the touch object was determined, and an operation of adjusting the touch sensitivity of the second touch sensor according to the updated state of the touch object.
According to various embodiments, the touch input processing method may further include an operation of displaying, in a predetermined region of the display, an icon whose shape, color, or size is set differently according to the state of the touch object.
According to various embodiments, the touch input processing method may further include an operation of, when the state of the touch object changes, outputting an object including at least one of a text, an image, or an icon related to the change of the touch object state, and/or outputting voice information related to the change.
FIG. 6 illustrates an embodiment of determining the state of a touch object based on a fingerprint sensor according to various embodiments.
Referring to FIG. 6, the electronic device may collect sensor information (e.g., fingerprint information) by sensing a user's finger approaching or contacting the fingerprint recognition sensor.
When the sensor information is collected based on the fingerprint recognition sensor, the electronic device may analyze the collected fingerprint information to determine the state of the user's finger (the touch object).
According to various embodiments, a processor (e.g., the processor described with reference to FIG. 1 or FIG. 2) may determine the finger state based on the analysis of the collected fingerprint information and may adjust the touch function according to the determined state.
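Assuming, purely for illustration, that the fingerprint recognition step reports a match or image-quality score, a finger state could be inferred from that score roughly as follows; the bands and condition names are assumptions, not values from the patent.

```kotlin
// Hypothetical sketch: infer a finger condition from an assumed fingerprint match/quality score.
enum class FingerCondition { NORMAL, WET_OR_CONTAMINATED, UNRECOGNIZABLE }

fun fingerConditionFromScore(score: Double): FingerCondition = when {
    score >= 0.85 -> FingerCondition.NORMAL               // clear ridges, high-quality capture
    score >= 0.40 -> FingerCondition.WET_OR_CONTAMINATED  // ridges blurred by water, sweat, etc.
    else          -> FingerCondition.UNRECOGNIZABLE       // e.g. glove or heavy contamination
}
```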
FIG. 7 shows a touch function selection object according to various embodiments.
Referring to FIG. 7, the touch function selection object may include one or more items corresponding to various states of the touch object (e.g., a normal state, a state in which water or sweat is present, or a state in which gloves are worn), and the user can select one of the items to designate the touch function.
FIG. 8 shows an embodiment for adjusting the touch sensitivity according to the state of the touch object according to various embodiments.
Referring to FIG. 8, the electronic device may adjust the touch sensitivity of the touch input device (e.g., the second touch sensor) according to the state of the touch object.
According to various embodiments, when a touch operation is performed in a state where water or sweat is applied to the touch object or the screen, the change in capacitance caused by the water or sweat may cause unintended touch coordinates to be recognized.
As described above, in order to prevent touch coordinates unintended by the user from being recognized, the electronic device may adjust the touch sensitivity of the touch input device according to the state of the touch object.
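One way to picture such an adjustment, with an assumed threshold pair that is not taken from the patent, is to validate raw touch samples against a state-dependent threshold:

```kotlin
// Hypothetical sketch: state-dependent validity check for raw touch samples; thresholds are assumptions.
fun isValidTouch(capacitanceDelta: Int, touchObjectIsWet: Boolean): Boolean {
    val threshold = if (touchObjectIsWet) 150 else 60   // require a larger change when water/sweat is present
    return capacitanceDelta >= threshold
}

fun filterTouchSamples(deltas: List<Int>, touchObjectIsWet: Boolean): List<Int> =
    deltas.filter { isValidTouch(it, touchObjectIsWet) }
```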
FIG. 9 shows an embodiment for adjusting the output state of a display object according to the state of a touch object according to various embodiments.
Referring to FIG. 9, the electronic device may adjust the output state (e.g., the size or position) of the display objects displayed on the display according to the state of the touch object, thereby adjusting the touch region of the display objects.
FIG. 10 illustrates a method of operating an electronic device associated with a method for processing a touch input based on a fingerprint sensor according to various embodiments.
Referring to FIG. 10, the electronic device may sense a finger touch based on the fingerprint recognition sensor.
In the case where the sensed finger touch corresponds to the fingerprint recognition function, the electronic device may perform the fingerprint recognition function.
If the fingerprint recognition function is not to be performed, the electronic device can determine whether the finger state determination function is to be performed.
According to various embodiments, the electronic device may perform the function of determining the finger state, for example, at a time when a specific physical button (e.g., a home button or a power button) is selected, at a time when a specific object (e.g., an icon or image) configured on the screen is selected, or at a time when a specific application program is executed or requests it. For example, when a specific physical button is selected (e.g., pressed) a predetermined number of times within a specified time, or when a specific physical button is selected at the time the electronic device completes performing fingerprint recognition, the electronic device may perform the finger state determination function. In addition, the electronic device can perform the finger state determination function at a time when a specific object (e.g., the touch function selection object) displayed on the screen is selected.
According to various embodiments, the electronic device may perform the fingerprint recognition function and, in that process, collect the user's fingerprint information.
The electronic device may then determine the state of the finger based on the collected fingerprint information, and may process the touch input (e.g., adjust the touch sensitivity or the output state of the display objects) according to the determined finger state.
FIG. 11 shows a finger status event table according to various embodiments. According to various embodiments, an electronic device may manage a finger status event table corresponding to the state of the user's finger.
Referring to FIG. 11, the finger status event table 1110 may include event information corresponding to the finger status. For example, the finger status event table 1110 may include identifier information of an event specified according to the finger status. In addition, the finger status event table 1110 may include the operation status information of the sensor (e.g., the fingerprint recognition sensor) corresponding to each finger status.
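For illustration, such a table could be represented as a mapping from finger-status event identifiers to a sensor operation mode; only the identifiers quoted elsewhere in this description are real, and the modes (and the third identifier) are assumptions.

```kotlin
// Hypothetical sketch of a finger-status event table; the sensor modes and the "WET"
// identifier are assumptions, the other identifiers are quoted in the description.
enum class SensorMode { NORMAL, HIGH_SENSITIVITY, REJECT_MOISTURE }

val fingerStatusEventTable: Map<String, SensorMode> = mapOf(
    "STATUS_FINGERCONDITION_GOOD" to SensorMode.NORMAL,
    "STATUS_FINGERCONDITION_DRY"  to SensorMode.HIGH_SENSITIVITY,
    "STATUS_FINGERCONDITION_WET"  to SensorMode.REJECT_MOISTURE   // assumed identifier
)
```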
FIG. 12 shows a diagram for explaining a finger status event corresponding to a fingerprint recognition event according to various embodiments.
Referring to FIG. 12, an electronic device may receive a fingerprint recognition event 1210 generated in the process of collecting the user's fingerprint information from the fingerprint recognition sensor.
According to various embodiments, the electronic device may manage the fingerprint recognition event 1210 by mapping it to a finger state event 1230 (or information corresponding thereto). For example, in response to the occurrence of a fingerprint recognition success event (e.g., an event identified as "STATUS_GOOD" in the fingerprint recognition event 1210), the electronic device may map the event to a finger state event in a normal state (e.g., an event identified as "STATUS_FINGERCONDITION_GOOD" in the finger state event 1230) (or information corresponding thereto) or to a finger state event in a dry state (e.g., an event identified as "STATUS_FINGERCONDITION_DRY" in the finger state event 1230) (or information corresponding thereto) and manage it accordingly. According to various embodiments, the electronic device may provide the finger state event (or corresponding information) to at least one of a touch input device (e.g., the second touch sensor) or a display.
According to various embodiments, the electronic device can control the touch input to be processed based on the fingerprint recognition event 1210 and the event occurrence time information stored in the memory. For example, the electronic device can identify the most recently stored fingerprint recognition event 1210 based on the event occurrence time information. If the storage time of the fingerprint recognition event 1210 does not exceed a designated time, the electronic device sends a finger state event 1230 (or corresponding information) corresponding to the fingerprint recognition event 1210 to at least one of the touch input device or the display so that the touch input is processed accordingly. According to one embodiment, when the storage time of the fingerprint recognition event 1210 exceeds the designated time, the electronic device can control to output a display object or voice information that induces the user to bring the finger close to or into contact with the fingerprint recognition sensor.
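The mapping and staleness check just described could be sketched as below; the fallback identifier and the timeout value are assumptions, while "STATUS_GOOD" and "STATUS_FINGERCONDITION_GOOD" are the identifiers quoted above.

```kotlin
// Hypothetical sketch: map a stored fingerprint recognition event to a finger-state event,
// but only while the stored event is still recent enough.
data class StoredEvent(val fingerprintEvent: String, val occurredAtMs: Long)

fun toFingerStateEvent(fingerprintEvent: String): String = when (fingerprintEvent) {
    "STATUS_GOOD" -> "STATUS_FINGERCONDITION_GOOD"
    else          -> "STATUS_FINGERCONDITION_UNKNOWN"   // assumed fallback identifier
}

fun fingerStateFor(stored: StoredEvent?, nowMs: Long, maxAgeMs: Long = 60_000L): String? {
    if (stored == null || nowMs - stored.occurredAtMs > maxAgeMs) {
        return null   // too old: prompt the user to touch the fingerprint sensor again
    }
    return toFingerStateEvent(stored.fingerprintEvent)
}
```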
According to various embodiments, events (e.g., fingerprint recognition events 1210) and touch object state events (e.g., finger status events 1230) that occur during sensor information collection may be stored in the memory together with their occurrence time information.
FIG. 13 illustrates a method of operating an electronic device associated with touch input processing according to various embodiments.
Referring to FIG. 13, a sensor (e.g., the first touch sensor) may sense the approach or contact of a touch object and collect sensor information.
According to various embodiments, the sensor may communicate the collected sensor information to the electronic device. According to one embodiment, the sensor may communicate to an electronic device an event (e.g., a fingerprint recognition event 1210 in Figure 12) that occurs in association with the collection of the sensor information. In this case, the electronic device can manage the event by mapping the event to a touch object state event (e.g., finger state event 1230 in FIG. 12) (or information corresponding thereto).
The electronic device may then determine the state of the touch object based on the received sensor information (or the mapped touch object state event). The electronic device may then adjust the touch sensitivity of the touch input device (e.g., the second touch sensor) according to the determined state of the touch object and process the touch input accordingly.
FIG. 14 shows an electronic device 1401 in a network environment according to various embodiments.
Referring to FIG. 14, the electronic device 1401 may include a bus, a processor 1420, a memory 1430, an input/output interface 1450, a display 1460, and a communication interface.
Processor 1420 may include one or more of a central processing unit (CPU), an application processor (AP), or a communication processor (CP). Processor 1420 may perform computations or data processing related to, for example, control and / or communication of at least one other component of electronic device 1401. [
The API 1445 is an interface for the application 1447 to control the functions provided by the kernel or the middleware.
The input/output interface 1450 may serve, for example, as an interface that transfers commands or data input from a user or another external device to the other component(s) of the electronic device 1401.
Wireless communication may include, for example, a cellular communication protocol such as long-term evolution (LTE), LTE-Advanced (LTE-A), code division multiple access (CDMA), wideband CDMA (WCDMA), universal mobile telecommunications system (UMTS), Wireless Broadband (WiBro), or Global System for Mobile Communications (GSM). The wireless communication may also include, for example, short-range communication.
Each of the first and second external electronic devices 1402 and 1404 may be a device of the same or a different type from the electronic device 1401.
FIG. 15 is a block diagram of an electronic device 1501 according to various embodiments. The electronic device 1501 may include all or part of the electronic device 1401 shown in FIG. 14, for example. The electronic device 1501 may include, for example, one or more processors (e.g., APs) 1510, a memory 1530, an input device 1550, a display 1560, and other components.
Memory 1530 (e.g., memory 1430) may include, for example, an internal memory or an external memory.
The input device 1550 may include, for example, a touch panel, a (digital) pen sensor 1554, a key, or an ultrasonic input device.
(Digital) pen sensor 1554 may be, for example, part of a touch panel or may include a separate recognition sheet.
Display 1560 (e.g., display 1460) may include a panel, a hologram device, or a projector.
Each of the components described in this document may be composed of one or more components, and the name of the component may be changed according to the type of the electronic device. In various embodiments, the electronic device may comprise at least one of the components described herein, some components may be omitted, or may further include additional other components. In addition, some of the components of the electronic device according to various embodiments may be combined into one entity, so that the functions of the components before being combined can be performed in the same manner.
FIG. 16 is a block diagram of a program module in accordance with various embodiments. According to one embodiment, the program module 1610 (e.g., program 1440) may include an operating system (OS) that controls resources associated with an electronic device (e.g., electronic device 1401) and/or various applications (e.g., an application program 1447) running on the operating system. The operating system may be, for example, Android, iOS, Windows, Symbian, Tizen, or Bada.
The program module 1610 may include a kernel 1620, a middleware 1630, an API 1660, and/or an application 1670.
The kernel 1620 (e.g., kernel 1441) may include, for example, a system resource manager 1621 and / or a device driver 1623. The system resource manager 1621 can perform control, allocation, or recovery of system resources. According to one embodiment, the system resource manager 1621 may include a process manager, a memory manager, or a file system manager. The device driver 1623 may include, for example, a display driver, a camera driver, a Bluetooth driver, a shared memory driver, a USB driver, a keypad driver, a WiFi driver, an audio driver, or an inter-process communication (IPC) driver.
The middleware 1630 may provide various functions to the application 1670 through the API 1660, for example, so that the application 1670 can efficiently use the limited system resources inside the electronic device.
The runtime library 1635 may include, for example, a library module used by the compiler to add new functionality via a programming language while the application 1670 is running. The runtime library 1635 may perform input / output management, memory management, or functions for arithmetic functions.
The power manager 1645 operates in conjunction with a basic input / output system (BIOS), for example, to manage a battery or a power source, and to provide power information necessary for the operation of the electronic device. The database manager 1646 may create, retrieve, or modify a database to be used in at least one of the applications 1670. The package manager 1647 can manage installation or update of an application distributed in the form of a package file.
The connection manager 1648 can manage wireless connections, such as, for example, WiFi or Bluetooth.
Middleware 1630 may include a middleware module that forms a combination of various functions of the above-described components. The middleware 1630 can provide a module specialized for each type of operating system to provide differentiated functions. In addition, the middleware 1630 may dynamically delete some existing components or add new ones.
The API 1660 (e.g., API 1445) is, for example, a set of API programming functions and may be provided in different configurations depending on the operating system. For example, for Android or iOS, one API set may be provided per platform, and for Tizen, two or more API sets may be provided per platform.
An application 1670 (e.g., an application program 1447) may include, for example, one or more applications providing functions such as a home 1671, a dialer, SMS/MMS, instant messaging (IM), a browser, a camera, an alarm, contacts, voice dial, email, a calendar, a media player, an album, a clock, health care (e.g., measuring the amount of exercise or blood sugar), or environmental information provision (e.g., providing air pressure, humidity, or temperature information).
According to one embodiment, the application 1670 may include an application that supports the exchange of information between the electronic device (e.g., electronic device 1401) and an external electronic device (e.g., electronic devices 1402 and 1404) (hereinafter, an "information exchange application" for convenience of description). The information exchange application may include, for example, a notification relay application for communicating specific information to an external electronic device, or a device management application for managing an external electronic device.
For example, the notification relay application may send notification information generated in other applications (e.g., SMS/MMS applications, email applications, healthcare applications, or environmental information applications) of the electronic device to the external electronic devices (e.g., electronic devices 1402 and 1404).
The device management application may, for example, manage (e.g., install, delete, or update) at least one function of an external electronic device communicating with the electronic device (e.g., turning the external electronic device itself, or some components thereof, on or off, or adjusting the brightness or resolution of its display), an application operating in the external electronic device, or a service provided by the external electronic device (e.g., a call service or a message service).
According to one embodiment, the application 1670 may include an application (e.g., a healthcare application of a mobile medical device) designated according to attributes of an external electronic device (e.g., electronic devices 1402 and 1404). According to one embodiment, the application 1670 may include an application received from an external electronic device (e.g., a server or the electronic devices 1402 and 1404).
According to various embodiments, at least some of the program module 1610 may be implemented in software, firmware, hardware, or a combination of at least two of them, and may be implemented (e.g., executed) by, for example, a processor (e.g., the processor 1510).
As used in this document, the term "module" may refer to a unit comprising, for example, one of, or a combination of two or more of, hardware, software, or firmware. A "module" may be interchangeably used with terms such as, for example, unit, logic, logical block, component, or circuit. A "module" may be a minimum unit or a portion of an integrally constructed component. A "module" may be a minimum unit, or a portion thereof, that performs one or more functions. "Modules" may be implemented either mechanically or electronically. For example, a "module" may include at least one of an application-specific integrated circuit (ASIC) chip, field-programmable gate arrays (FPGAs), or a programmable-logic device that performs certain operations.
At least a portion of a device (e.g., modules or functions thereof) or a method (e.g., operations) according to various embodiments may be implemented, for example, as instructions stored in a computer-readable storage medium in the form of a program module. When the instructions are executed by a processor (e.g., processor 1420), the one or more processors may perform the functions corresponding to the instructions. The computer-readable storage medium may be, for example, the memory 1430.
The computer-readable recording medium may include a hard disk, a floppy disk, magnetic media (e.g., a magnetic tape), optical media (e.g., a compact disc read-only memory (CD-ROM) or a digital versatile disc (DVD)), magneto-optical media (e.g., a floptical disk), and hardware devices (e.g., a read-only memory (ROM) or a random access memory (RAM)). The program instructions may include machine language code such as that produced by a compiler, as well as high-level language code that may be executed by a computer using an interpreter. The above-described hardware devices may be configured to operate as one or more software modules to perform the operations of the various embodiments, and vice versa.
Modules or program modules according to various embodiments may include at least one or more of the elements described above, some of which may be omitted, or may further include additional other elements. Operations performed by modules, program modules, or other components in accordance with various embodiments may be performed in a sequential, parallel, iterative, or heuristic manner. Also, some operations may be performed in a different order, omitted, or other operations may be added. And the embodiments disclosed in this document are presented for the purpose of explanation and understanding of the disclosed technology and do not limit the scope of the technology described in this document. Accordingly, the scope of this document should be interpreted to include all modifications based on the technical idea of this document or various other embodiments.
Claims (20)
A first touch sensor that senses a touch object and collects sensor information,
A processor for determining a state of the touch object corresponding to the sensor information,
And a second touch sensor whose touch sensitivity is adjusted according to the state of the touch object.
Wherein the first touch sensor includes at least one of a fingerprint recognition sensor, a tactile sensor, a pH concentration sensor, or a touch sensor.
The processor
Compares the sensor information with at least one of steady state information corresponding to a state in which no foreign matter is detected between the touch object and the first touch sensor, or pattern information of the foreign matter including at least one of information obtained by sensing the foreign matter based on the first touch sensor, pH concentration information of the foreign matter, surface state information, electric conductivity information, or frequency spectrum information, to determine the state of the touch object.
The processor
And controls the touch area of the display objects output to the display according to the state of the touch object.
The processor
And changes at least one of a size or a position of the display objects to correspond to the touch area.
The processor
Controls a touch function selection object including at least one item corresponding to the state of the touch object to be displayed on a display.
The processor
Controls the touch function selection object to be displayed at at least one of: a time when the screen of the electronic device changes from a turned-off state to a turned-on state, a time when the touch input processing function changes from a turned-off state to a turned-on state, a time when a specific physical button included in the electronic device is selected, a time when a specific application program included in the electronic device is executed or requests it, a time when the touch object moves by a predetermined distance in a certain direction while pressing a certain area of the screen, or a time when the electronic device moves or rotates in a predetermined direction a predetermined number of times at a predetermined interval.
The processor
Updates the state of the touch object based on sensor information obtained by sensing the touch object at a point of time when a predetermined time has elapsed from the point of time at which the state of the touch object was determined,
The second touch sensor
And the touch sensitivity is adjusted according to the updated state of the touch object.
The processor
And controls the display unit to display an icon having at least one of a shape, a color, and a size set to another image according to the state of the touch object in a predetermined area of the display.
The processor
Controls, when the state of the touch object changes, an output of an object including at least one of a text, an image, or an icon related to the change of the touch object state, or an output of voice information related to the change of the touch object state.
An operation of sensing the touch object based on the first touch sensor to collect sensor information,
Determining the state of the touch object corresponding to the sensor information, and
And adjusting a touch sensitivity of the second touch sensor according to a state of the touch object.
The operation of collecting the sensor information
And collecting the sensor information corresponding to approach or contact of the touch object based on at least one of a fingerprint recognition sensor, a tactile sensor, a pH concentration sensor, or a touch sensor.
The operation of determining the state of the touch object
Includes an operation of comparing the sensor information with at least one of steady state information corresponding to a state in which no foreign matter is detected between the touch object and the first touch sensor, or pattern information of the foreign matter including at least one of information obtained by sensing the foreign matter based on the first touch sensor, pH concentration information of the foreign matter, surface state information, electric conductivity information, or frequency spectrum information.
The operation of determining the state of the touch object
And controlling to display on the display a touch function selection object including at least one item corresponding to the state of the touch object.
The operation of controlling to display the touch function selection object on the display
An operation of controlling the touch function selection object to be displayed at at least one of: a time when the screen of the electronic device changes from a turned-off state to a turned-on state, a time when the touch input processing function changes from a turned-off state to a turned-on state, a time when a specific physical button included in the electronic device is selected, a time when a specific application program included in the electronic device is executed or requests it, a time when the touch object moves by a predetermined distance in a certain direction while pressing a certain area of the screen, or a time when the electronic device moves or rotates in a predetermined direction a predetermined number of times at a predetermined interval.
The touch input processing method
And controlling the touch region of the display objects to be displayed on the display according to the state of the touch object.
The operation of controlling to adjust the touch area
And controlling to change at least one of a size and a position of the display objects to correspond to the touch area.
The touch input processing method
Updating the state of the touch object based on sensor information obtained by sensing the touch object at a point of time when a predetermined time has elapsed based on a time at which the state of the touch object is determined;
And adjusting the touch sensitivity of the second touch sensor according to the updated state of the touch object.
The touch input processing method
And controlling the display unit to display an icon having at least one of a shape, a color, and a size set to a different image according to a state of the touch object in a predetermined area of the display.
The touch input processing method
An operation of controlling, when the state of the touch object changes, an output of an object including at least one of a text, an image, or an icon related to the change of the touch object state, or an output of voice information related to the change of the touch object state.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/095,413 US9904409B2 (en) | 2015-04-15 | 2016-04-11 | Touch input processing method that adjusts touch sensitivity based on the state of a touch object and electronic device for supporting the same |
EP16165404.1A EP3082025A1 (en) | 2015-04-15 | 2016-04-14 | Touch input processing method and electronic device for supporting the same |
CN201610236856.8A CN106055138A (en) | 2015-04-15 | 2016-04-15 | Touch input processing method and electronic device for supporting the same |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150052965 | 2015-04-15 | ||
KR20150052965 | 2015-04-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20160123200A true KR20160123200A (en) | 2016-10-25 |
Family
ID=57446615
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150065395A KR20160123200A (en) | 2015-04-15 | 2015-05-11 | Touch input processing method and electronic device supporting the same |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20160123200A (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018128422A1 (en) * | 2017-01-06 | 2018-07-12 | 삼성전자 주식회사 | Method and apparatus for processing distortion of fingerprint image |
WO2019160173A1 (en) * | 2018-02-14 | 2019-08-22 | 엘지전자 주식회사 | Mobile terminal and control method thereof |
CN112639805A (en) * | 2018-09-04 | 2021-04-09 | 三星电子株式会社 | Electronic device including fingerprint sensor in ultrasonic display and method of operating the same |
CN113591666A (en) * | 2021-07-26 | 2021-11-02 | 深圳创联时代电子商务有限公司 | Control method and device applied to mobile phone, computer readable medium and mobile phone |
WO2022050627A1 (en) * | 2020-09-01 | 2022-03-10 | 삼성전자 주식회사 | Electronic device comprising flexible display, and operation method thereof |
US11614829B2 (en) | 2020-12-04 | 2023-03-28 | Samsung Display Co., Ltd. | Display device and driving method thereof |
US11687167B2 (en) | 2019-08-30 | 2023-06-27 | Google Llc | Visual indicator for paused radar gestures |
US11790693B2 (en) | 2019-07-26 | 2023-10-17 | Google Llc | Authentication management through IMU and radar |
US11841933B2 (en) | 2019-06-26 | 2023-12-12 | Google Llc | Radar-based authentication status feedback |
US11868537B2 (en) | 2019-07-26 | 2024-01-09 | Google Llc | Robust radar-based gesture-recognition by user equipment |
US12008169B2 (en) | 2019-08-30 | 2024-06-11 | Google Llc | Radar gesture input methods for mobile devices |
WO2024177199A1 (en) * | 2023-02-20 | 2024-08-29 | 삼성메디슨 주식회사 | Ultrasound imaging device and operation method thereof |
US12093463B2 (en) | 2019-07-26 | 2024-09-17 | Google Llc | Context-sensitive control of radar-based gesture-recognition |
WO2024214912A1 (en) * | 2023-04-12 | 2024-10-17 | 삼성전자주식회사 | Electronic device for sensing contact with fluid via touch sensor |
-
2015
- 2015-05-11 KR KR1020150065395A patent/KR20160123200A/en unknown
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018128422A1 (en) * | 2017-01-06 | 2018-07-12 | 삼성전자 주식회사 | Method and apparatus for processing distortion of fingerprint image |
US11093776B2 (en) | 2017-01-06 | 2021-08-17 | Samsung Electronics Co., Ltd. | Method and apparatus for processing distortion of fingerprint image |
WO2019160173A1 (en) * | 2018-02-14 | 2019-08-22 | 엘지전자 주식회사 | Mobile terminal and control method thereof |
CN112639805A (en) * | 2018-09-04 | 2021-04-09 | 三星电子株式会社 | Electronic device including fingerprint sensor in ultrasonic display and method of operating the same |
US11841933B2 (en) | 2019-06-26 | 2023-12-12 | Google Llc | Radar-based authentication status feedback |
US11790693B2 (en) | 2019-07-26 | 2023-10-17 | Google Llc | Authentication management through IMU and radar |
US11868537B2 (en) | 2019-07-26 | 2024-01-09 | Google Llc | Robust radar-based gesture-recognition by user equipment |
US12093463B2 (en) | 2019-07-26 | 2024-09-17 | Google Llc | Context-sensitive control of radar-based gesture-recognition |
US11687167B2 (en) | 2019-08-30 | 2023-06-27 | Google Llc | Visual indicator for paused radar gestures |
US12008169B2 (en) | 2019-08-30 | 2024-06-11 | Google Llc | Radar gesture input methods for mobile devices |
WO2022050627A1 (en) * | 2020-09-01 | 2022-03-10 | 삼성전자 주식회사 | Electronic device comprising flexible display, and operation method thereof |
US11614829B2 (en) | 2020-12-04 | 2023-03-28 | Samsung Display Co., Ltd. | Display device and driving method thereof |
CN113591666A (en) * | 2021-07-26 | 2021-11-02 | 深圳创联时代电子商务有限公司 | Control method and device applied to mobile phone, computer readable medium and mobile phone |
WO2024177199A1 (en) * | 2023-02-20 | 2024-08-29 | 삼성메디슨 주식회사 | Ultrasound imaging device and operation method thereof |
WO2024214912A1 (en) * | 2023-04-12 | 2024-10-17 | 삼성전자주식회사 | Electronic device for sensing contact with fluid via touch sensor |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9904409B2 (en) | Touch input processing method that adjusts touch sensitivity based on the state of a touch object and electronic device for supporting the same | |
KR20160123200A (en) | Touch input processing method and electronic device supporting the same | |
KR102319803B1 (en) | Electronic device, operating method thereof and recording medium | |
KR102399557B1 (en) | Apparatus and method for obtaining coordinate through touch panel thereof | |
US10282019B2 (en) | Electronic device and method for processing gesture input | |
KR102413108B1 (en) | Method and electronic device for recognizing touch | |
KR20160104976A (en) | Touch module and electronic device and operating method thereof | |
KR102340480B1 (en) | Electronic device and method for controlling thereof | |
KR20180106527A (en) | Electronic device and method for identifying falsification of biometric information | |
US10551980B2 (en) | Electronic device and method for determining touch coordinate thereof | |
KR20160128872A (en) | Fingerprint information processing method and electronic device supporting the same | |
KR102398503B1 (en) | Electronic device for detecting pressure of input and operating method thereof | |
KR102386480B1 (en) | Apparatus and method for distinguishing input by external device thereof | |
KR102388590B1 (en) | Electronic device and method for inputting in electronic device | |
US20190324640A1 (en) | Electronic device for providing user interface according to electronic device usage environment and method therefor | |
KR20170046410A (en) | Method and eletronic device for providing user interface | |
KR20160128606A (en) | Device For Providing Shortcut User Interface and Method Thereof | |
KR20160036927A (en) | Method for reducing ghost touch and electronic device thereof | |
KR20160124536A (en) | Method and electronic apparatus for providing user interface | |
KR20170066050A (en) | Object notification method and electronic device supporting the same | |
EP3079046A1 (en) | Method and apparatus for operating sensor of electronic device | |
KR20160147432A (en) | Device For Controlling Respectively Multiple Areas of Display and Method thereof | |
KR102692984B1 (en) | Touch input processing method and electronic device supporting the same | |
KR20180014446A (en) | Electronic device and method for controlling touch screen display | |
KR20170086806A (en) | Electronic device and method for recognizing touch input using the same |