
KR20160123200A - Touch input processing method and electronic device supporting the same - Google Patents


Info

Publication number
KR20160123200A
Authority
KR
South Korea
Prior art keywords
touch
state
electronic device
sensor
information
Prior art date
Application number
KR1020150065395A
Other languages
Korean (ko)
Inventor
이성준
김민정
허원
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd.
Priority to US 15/095,413, granted as US9904409B2
Priority to EP 16165404.1, published as EP3082025A1
Priority to CN 201610236856.8, published as CN106055138A
Published as KR20160123200A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F3/0418: Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment


Abstract

Disclosed is an electronic device comprising: a first touch sensor that senses a touch object to collect sensor information; a processor that determines the state of the touch object from the sensor information; and a second touch sensor whose touch sensitivity is adjusted according to the state of the touch object. Various other embodiments derived from the specification are also possible.

Description

TECHNICAL FIELD The present invention relates to a touch input processing method and an electronic device supporting the same.

Various embodiments of the invention relate to a method of processing touch input.

Electronic devices support touch-based input devices such as touch screens, touch pads, or touch keys as part of the user interface (UI). Touch input devices are implemented in various ways, such as capacitive, resistive (pressure-sensitive), infrared, or ultrasonic types. For example, a capacitive touch input device recognizes a touch by detecting the change in capacitance caused by a conductive object such as a user's finger or a stylus pen.

However, the capacitive touch input method described above can malfunction when a factor changes the capacitance of the conductive object (the touch object). For example, when a finger wet with water or sweat, or a gloved finger, makes contact, a capacitive touch input device may fail to recognize the touch correctly, or may recognize an unintended point as the contact point. Likewise, resistive, infrared, or ultrasonic touch input methods may malfunction when water or sweat adheres to the touch object, such as a finger, or when gloves are worn.

Various embodiments of the present invention provide a method of determining the state of a touch object from sensor information collected on its approach or contact, processing the touch input according to that state, and an electronic device supporting the same.

An electronic device according to various embodiments of the present invention includes a first touch sensor for sensing a touch object to collect sensor information, a processor for determining the state of the touch object corresponding to the sensor information, and a second touch sensor whose touch sensitivity is adjusted according to the determined state.

According to various embodiments of the present invention, malfunction of the touch input device can be prevented by processing the touch input according to the state of the touch object.

FIG. 1 schematically shows a configuration of an electronic device related to touch input processing according to various embodiments.
FIG. 2 shows a block diagram of an electronic device associated with touch input processing according to various embodiments.
FIG. 3 illustrates a method of operating an electronic device associated with a method for processing a touch input in response to sensor information according to various embodiments.
FIG. 4 illustrates a method of operating an electronic device related to a method of setting a touch function using sensor information according to various embodiments.
FIG. 5 illustrates a method of operating an electronic device related to a method of setting a touch function using a touch function selection object according to various embodiments.
FIG. 6 illustrates an embodiment of determining the state of a touch object based on a fingerprint sensor according to various embodiments.
FIG. 7 shows a touch function selection object according to various embodiments.
FIG. 8 shows an embodiment of adjusting the touch sensitivity according to the state of the touch object according to various embodiments.
FIG. 9 shows an embodiment of adjusting the output state of a display object according to the state of a touch object according to various embodiments.
FIG. 10 illustrates a method of operating an electronic device associated with a method for processing a touch input based on a fingerprint sensor according to various embodiments.
FIG. 11 shows a finger status event table according to various embodiments.
FIG. 12 is a diagram for explaining a finger status event corresponding to a fingerprint recognition event according to various embodiments.
FIG. 13 illustrates a method of operating an electronic device associated with touch input processing according to various embodiments.
FIG. 14 illustrates an electronic device in a network environment in accordance with various embodiments.
FIG. 15 shows a block diagram of an electronic device according to various embodiments.
FIG. 16 shows a block diagram of a program module according to various embodiments.

Hereinafter, various embodiments of the present document will be described with reference to the accompanying drawings. It should be understood, however, that this invention is not limited to the particular embodiments described herein but includes various modifications, equivalents, and/or alternatives of the embodiments of this document. In connection with the description of the drawings, like reference numerals may be used for similar components.

In this document, the expressions "have," "may have," "include," or "may include" denote the presence of a feature (e.g., a numerical value, function, operation, or component) and do not exclude the presence of additional features.

In this document, the expressions "A or B," "at least one of A and/or B," or "one or more of A and/or B" may include all possible combinations of the listed items. For example, "A or B," "at least one of A and B," or "at least one of A or B" includes (1) at least one A, (2) at least one B, or (3) both at least one A and at least one B.

The expressions "first," "second," "1st," or "2nd" used in this document may describe various components regardless of order or importance; they are used only to distinguish one component from another and do not limit the components. For example, a first user device and a second user device may represent different user devices, regardless of order or importance. Likewise, without departing from the scope of the rights described in this document, a first component can be named a second component, and similarly a second component can be named a first component.

When a component (e.g., a first component) is said to be "(operatively or communicatively) coupled with/to" or "connected to" another component (e.g., a second component), it may be directly connected to the other component or connected through yet another component (e.g., a third component). In contrast, when a component (e.g., a first component) is said to be "directly connected" or "directly coupled" to another component (e.g., a second component), it can be understood that no other component (e.g., a third component) exists between them.

As used herein, the phrase "configured to (or set to)" may be used interchangeably with, for example, "suitable for," "having the capacity to," "designed to," "adapted to," "made to," or "capable of," depending on the situation. The term "configured to (or set to)" does not necessarily mean "specifically designed to" in hardware. Instead, in some situations, the expression "a device configured to" may mean that the device can do something together with other devices or components. For example, "a processor configured (or set) to perform A, B, and C" may mean a dedicated processor (e.g., an embedded processor) for performing the corresponding operations, or a generic-purpose processor (e.g., a CPU or an application processor) that can perform the corresponding operations by executing one or more software programs stored in a memory device.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the scope of other embodiments. Singular expressions may include plural expressions unless the context clearly dictates otherwise. Terms used herein, including technical or scientific terms, may have the same meanings as commonly understood by one of ordinary skill in the art. Commonly used, predefined terms may be interpreted in the same or a similar sense as their contextual meanings in the related art and, unless expressly defined in this document, are not to be interpreted in an ideal or excessively formal sense. In some cases, even terms defined in this document cannot be construed as excluding the embodiments of this document.

An electronic device in accordance with various embodiments of this document may include, for example, at least one of a smartphone, a tablet personal computer (PC), a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop PC, a netbook computer, a workstation, a server, a personal digital assistant (PDA), a portable multimedia player (PMP), an MP3 player, a mobile medical device, a camera, or a wearable device. According to various embodiments, the wearable device may be at least one of an accessory type (e.g., a watch, a ring, a bracelet, an anklet, a necklace, a pair of glasses, a contact lens, or a head-mounted device (HMD)), a fabric- or clothing-integrated type (e.g., electronic apparel), a body-attached type (e.g., a skin pad or tattoo), or a bio-implantable type (e.g., an implantable circuit).

In some embodiments, the electronic device may be a home appliance. Home appliances may include, for example, at least one of a television, a digital video disc (DVD) player, an audio system, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave, a washing machine, an air cleaner, a set-top box, a home automation control panel, a security control panel, a TV box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a game console (e.g., Xbox™ or PlayStation™), an electronic dictionary, an electronic key, a camcorder, or an electronic picture frame.

In an alternative embodiment, the electronic device may include at least one of various medical devices (e.g., portable medical measurement devices such as a blood glucose meter, heart rate meter, blood pressure meter, or body temperature meter, magnetic resonance angiography (MRA), magnetic resonance imaging (MRI), computed tomography (CT), an imaging device, or an ultrasound device), a navigation system, a global navigation satellite system (GNSS), an event data recorder (EDR), a flight data recorder (FDR), a vehicle infotainment system, marine electronic equipment (e.g., a marine navigation system or gyro compass), avionics, a security device, a head unit for a vehicle, an industrial or home robot, an automatic teller machine (ATM), a point-of-sale (POS) terminal, or an Internet-of-Things device (e.g., a light bulb, various sensors, an electricity or gas meter, a sprinkler device, a fire alarm, a thermostat, a street light, a toaster, fitness equipment, a hot water tank, a heater, or a boiler).

According to some embodiments, the electronic device may include at least one of a piece of furniture or part of a building/structure, an electronic board, an electronic signature receiving device, a projector, or various measuring instruments (e.g., for water, electricity, gas, or radio waves). In various embodiments, the electronic device may be a combination of one or more of the various devices described above. An electronic device according to some embodiments may be a flexible electronic device. Further, the electronic device according to the embodiments of this document is not limited to the above-described devices and may include new electronic devices according to technological advancement.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS An electronic apparatus according to various embodiments will now be described with reference to the accompanying drawings. In this document, the term user may refer to a person using an electronic device or a device using an electronic device (e.g., an artificial intelligence electronic device).

FIG. 1 schematically shows a configuration of an electronic device related to touch input processing according to various embodiments. The electronic device 100 may adjust the touch input processing method according to the state of the touch object in order to prevent a malfunction of the touch input that may occur due to an abnormal state of the touch object. For example, the electronic device 100 adjusts the touch input processing method to prevent a ghost touch phenomenon, that is, a touch being recognized in an area the user did not intend to touch, which can occur when water or sweat is on the touch object.

In order to perform the above-described functions, the electronic device 100 collects sensor information according to approach or contact of the touch object, and analyzes the collected sensor information to determine the state of the touch object. In this regard, referring to FIG. 1, the electronic device 100 may include a first touch sensor 110, a second touch sensor 150, and a processor 130.

The first touch sensor 110 may collect sensor information corresponding to the approach or contact of a touch object. For example, the first touch sensor 110 may collect touch information derived from a change in capacitance or a change in pressure corresponding to the approach or contact of the touch object. According to one embodiment, the first touch sensor 110 may include a fingerprint recognition sensor, a tactile sensor, or a pH concentration sensor. For example, when the first touch sensor 110 includes the fingerprint recognition sensor, it can collect fingerprint information such as the length, direction, or feature points of the ridges included in the user's fingerprint (e.g., points where a ridge splits or where ridges merge). When the first touch sensor 110 includes the tactile sensor, it may collect contact state information such as the intensity, direction, or pressure distribution of the contact force corresponding to the touch of the touch object. According to various embodiments, when the first touch sensor 110 includes the pH concentration sensor, it may collect information such as the pH concentration distribution on the contact surface of the touch object. In addition, the first touch sensor 110 may transmit the collected sensor information to the processor 130.

One or more sensors included in the first touch sensor 110 may be disposed in a certain area of the electronic device 100. For example, the fingerprint recognition sensor may be disposed in a lower front region or an upper rear region of the electronic device 100. According to various embodiments, the fingerprint recognition sensor may be included in a specific physical button of the electronic device 100, such as a home button or a side button. The tactile sensor or the pH concentration sensor may be disposed in a predetermined area of the electronic device 100, similarly to the fingerprint recognition sensor. According to one embodiment, the tactile sensor or the pH concentration sensor may be disposed adjacent to the fingerprint recognition sensor.

The second touch sensor 150 may include the same or similar components as the first touch sensor 110. For example, the second touch sensor 150 may collect sensor information corresponding to approach or contact of a touch object. According to various embodiments, the second touch sensor 150 may be formed in the form of a panel and may be included in a touch screen panel (TSP).

The processor 130 may analyze the collected sensor information to determine the state of the touch object. According to one embodiment, the processor 130 may determine the state of the touch object by comparing steady-state information of the touch object with the sensor information. For example, the processor 130 compares the steady-state information of the touch object with the collected sensor information, and can determine that the touch object is in an abnormal state when the similarity between the two is below a predetermined ratio.
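The specification does not fix a particular similarity measure, so the comparison above can only be sketched. One minimal illustration, assuming sensor information is represented as a numeric feature vector and using cosine similarity with an assumed 0.9 threshold (the function names and threshold are not from the patent):

```python
import math

def cosine_similarity(a, b):
    # Similarity between two equal-length sensor feature vectors;
    # 1.0 means the vectors point in the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    if na == 0.0 or nb == 0.0:
        return 0.0
    return dot / (na * nb)

def touch_object_state(steady_state_info, sensor_info, ratio=0.9):
    # Judged abnormal when similarity to the stored baseline falls
    # below the predetermined ratio (0.9 is an assumed value).
    if cosine_similarity(steady_state_info, sensor_info) >= ratio:
        return "normal"
    return "abnormal"
```

Any other distance metric over the raw sensor data would serve the same role; the point is only the threshold comparison against a stored baseline.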

The steady-state information is sensor information collected by the first touch sensor 110 in a state (normal state) in which no foreign matter (e.g., water, sweat, dust, sand, soil, or gloves) is present between the first touch sensor 110 and the touch object. For example, the steady-state information may be fingerprint information collected by the fingerprint recognition sensor while the user's finger is not exposed to water, sweat, dust, sand, or soil and the user is not wearing gloves. The steady-state information may also be contact state information, pH concentration distribution information, or touch information collected in the normal state. According to various embodiments, the steady-state information may be collected at the initial start-up of the electronic device 100 or at a particular point in time that meets certain conditions. For example, the steady-state information may be the information stored when the user registers a fingerprint for the fingerprint recognition function.
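A hypothetical sketch of how such baseline information might be captured once (e.g., at fingerprint registration) and looked up later; the class and method names here are invented for illustration, as the patent only states that the baseline is stored:

```python
class SteadyStateStore:
    # Per-user store of baseline sensor information captured while the
    # touch object is known to be clean (e.g., at fingerprint registration).
    def __init__(self):
        self._baselines = {}

    def register(self, user_id, sensor_info):
        # Capture once, at a point in time meeting the "clean finger" condition.
        self._baselines[user_id] = list(sensor_info)

    def get(self, user_id):
        # Returns None when no baseline has been registered for this user.
        return self._baselines.get(user_id)
```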

When the state of the touch object is abnormal, the processor 130 can further classify the state of the touch object. For example, the processor 130 may classify the state of the touch object into at least one of a water film state, a contamination film state, or a glove-wearing state. The water film state indicates that foreign matter containing moisture, such as water or sweat, adheres to the touch object; the contamination film state indicates that foreign matter not containing moisture, such as dust, sand, or soil, adheres to it. If the user performs a touch operation while wearing gloves, the state can be classified as the glove-wearing state.

According to various embodiments, the processor 130 can finely classify the state of the touch object using pattern information of the foreign matter. According to one embodiment, the processor 130 compares the sensor information with the pattern information of the foreign matter, and classifies the state of the touch object according to the kind of foreign matter when the similarity between the two is equal to or greater than a certain ratio. For example, when the foreign matter is a moisture-bearing substance such as water or sweat, the processor 130 may determine the state of the touch object to be the water film state. Likewise, the processor 130 may determine that the touch object is in the glove-wearing state when the foreign matter is a material that can be used to make gloves and the similarity is at or above a predetermined ratio (e.g., 90%).
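The per-kind classification above can be sketched as a best-match search over stored foreign-matter patterns. This is an assumption about the mechanism, not the patented implementation; the state names, the pattern database shape, and the 0.9 ratio are illustrative:

```python
import math

def _similarity(a, b):
    # Cosine similarity between a stored reference pattern and
    # freshly collected sensor information.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def classify_touch_state(sensor_info, pattern_db, ratio=0.9):
    # pattern_db maps a state name ("water_film", "contamination_film",
    # "glove") to a stored foreign-matter reference pattern. The state
    # whose pattern matches at or above the ratio wins; otherwise "normal".
    best_state, best_sim = "normal", 0.0
    for state, pattern in pattern_db.items():
        sim = _similarity(pattern, sensor_info)
        if sim >= ratio and sim > best_sim:
            best_state, best_sim = state, sim
    return best_state
```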

In this regard, the pattern information of the foreign matter may be information obtained by sensing with one or more sensors included in the first touch sensor 110. For example, the pattern information of the foreign matter may be sensor information about the foreign matter collected by the fingerprint recognition sensor during the fingerprint recognition function. In addition, the pattern information of the foreign matter may include information such as its pH concentration, surface state, or electrical conductivity.

According to various embodiments, the pattern information of the foreign matter may be the frequency spectrum information of the foreign matter. In this case, the processor 130 may convert the sensor information into frequency spectrum information and compare the frequency spectrum information with the foreign substance frequency spectrum information. If the degree of similarity between the frequency spectrum information is greater than a predetermined ratio, the processor 130 may classify the state of the touch object into a water film state, a contaminated film state, or a glove wearing state according to the type of the foreign matter.
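The frequency-domain comparison described above can be illustrated as follows. The patent does not specify the transform or the matching criterion; this sketch assumes a raw sensor trace, a naive DFT (a real implementation would use an FFT), and normalized correlation between magnitude spectra with an assumed 0.9 ratio:

```python
import cmath

def magnitude_spectrum(samples):
    # Naive DFT magnitude spectrum of a raw sensor trace, up to the
    # Nyquist bin (illustrative only; O(n^2)).
    n = len(samples)
    return [abs(sum(s * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t, s in enumerate(samples)))
            for k in range(n // 2 + 1)]

def spectra_match(sensor_samples, foreign_spectrum, ratio=0.9):
    # Convert sensor information to a spectrum and compare it against the
    # stored foreign-matter spectrum via normalized correlation.
    spec = magnitude_spectrum(sensor_samples)
    dot = sum(a * b for a, b in zip(spec, foreign_spectrum))
    na = sum(a * a for a in spec) ** 0.5
    nb = sum(b * b for b in foreign_spectrum) ** 0.5
    return bool(na and nb) and dot / (na * nb) >= ratio
```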

According to various embodiments, the processor 130 can distinguish a touch input in the glove-wearing state from a hovering operation by determining the state of the touch object using the pattern information of the foreign matter. For example, the processor 130 may convert sensor information resulting from a hovering operation into frequency spectrum information and compare it with the frequency spectrum information of gloves. In this case, the processor 130 can discriminate between the glove-wearing state and a touch input by a hovering operation by comparing the magnitudes of specific frequency components, or the frequency ranges whose magnitude exceeds a predetermined level, in the two spectra.

According to various embodiments, the processor 130 may fine-tune the state of the touch object through user input. According to one embodiment, the processor 130 may control the display of a screen for selecting the state of the touch object. For example, the processor 130 may output a touch function selection object including items such as a general function corresponding to the normal state of the touch object, a water film function corresponding to the water film state, a contamination film function corresponding to the contamination film state, and a glove function corresponding to the glove-wearing state. The processor 130 may then determine the state corresponding to the selected item as the state of the touch object.

The processor 130 may adjust the touch input processing method according to the classified state of the touch object. According to one embodiment, the processor 130 may adjust the touch sensitivity according to the state of the touch object. For example, when the touch object is judged to be in the water film state, the processor 130 may reduce the touch sensitivity to prevent a moisture-bearing substance such as water or sweat from being recognized as a touch input. In addition, the processor 130 may adjust the touch area of a display object (the region in which the display object is recognized as being touched) according to the state of the touch object, and correspondingly adjust the size or position of the display object. For example, if the touch object is judged to be in the water film state, the processor 130 may adjust the touch area of the display object by a predetermined ratio and adjust the size of the display object by a predetermined ratio.
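A minimal sketch of the per-state adjustment, assuming a capacitive design where reducing sensitivity raises the capacitance-change threshold for a valid touch and the display object's touch area scales alongside. All numeric factors in the table are made up for illustration; the patent states the direction of the adjustment, not its magnitude:

```python
# Hypothetical per-state tuning table; the sensitivity levels and scaling
# ratios are device-specific and not given in the specification.
STATE_TUNING = {
    "normal":             {"sensitivity": 1.0, "area_scale": 1.0},
    "water_film":         {"sensitivity": 0.5, "area_scale": 1.3},
    "contamination_film": {"sensitivity": 0.8, "area_scale": 1.1},
    "glove":              {"sensitivity": 1.5, "area_scale": 1.2},
}

def adjust_for_state(state, base_threshold, width, height):
    # Lower sensitivity -> higher threshold, so weak moisture-induced
    # capacitance changes are no longer accepted as touches.
    t = STATE_TUNING.get(state, STATE_TUNING["normal"])
    threshold = base_threshold / t["sensitivity"]
    return threshold, (width * t["area_scale"], height * t["area_scale"])
```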

FIG. 2 shows a block diagram of an electronic device 200 in connection with touch input processing in accordance with various embodiments. The electronic device 200 may be an expanded form of the electronic device 100 of FIG. 1.

Referring to FIG. 2, the electronic device 200 may include a sensor 210, a processor 220, a memory 230, a touch input device 280, and a display 290. According to one embodiment, the electronic device 200 may omit at least one of the components or additionally include other components. For example, the electronic device 200 may further include a voice output device.

The sensor 210 may have the same or a similar configuration as the first touch sensor 110 of FIG. 1. For example, the sensor 210 may be a biometric sensor, and may include at least one of a fingerprint recognition sensor, a tactile sensor, or a pH concentration sensor. The sensor 210 senses a touch object at a specific point in time that satisfies a specific condition, and can transmit the collected sensor information to the processor 220. According to one embodiment, when the sensor 210 includes the fingerprint recognition sensor, it may transmit information collected during the fingerprint recognition function to the processor 220 to determine the finger state. According to various embodiments, the sensor 210 may include a touch sensor (e.g., the touch input device 280) configured in the form of a panel. In this case, the sensor 210 can transmit the collected touch information to the processor 220 at a specific time that satisfies a specific condition. For example, the sensor 210 may transmit the sensor information corresponding to the first touch operation of the sensed touch object to the processor 220, based on the touch sensor, to determine the state of the touch object. Touch information sensed after that time is then transmitted to the touch input device 280 and can be processed according to the state of the touch object.

According to various embodiments, the sensor 210 can update the state of the touch object by transmitting newly collected sensor information to the processor 220 after a specified time has elapsed since the state of the touch object was determined. For example, the fingerprint recognition sensor may collect fingerprint information again and pass it to the processor 220 after a predetermined time from the determination of the finger state. In this regard, the processor 220 may control the output of a display object or voice information guiding the user to bring a finger near or into contact with the fingerprint recognition sensor. Likewise, the touch sensor may transmit sensed touch information to the processor 220 after a predetermined time has elapsed from the first touch operation.

According to various embodiments, the function of determining and updating the state of the touch object described above may be performed based on a plurality of sensors. For example, the state of the touch object may first be determined using fingerprint information collected by the fingerprint recognition sensor, and thereafter updated using touch information collected by the touch sensor. Accordingly, the user can omit the separate operation of bringing a finger near or into contact with the fingerprint recognition sensor just to update the state of the touch object.

According to various embodiments, the sensor 210 may communicate to the processor 220 an event (e.g., a fingerprint recognition event) that occurs during the collection of the sensor information. For example, the sensor 210 may generate an event (e.g., a fingerprint recognition success event) in response to the successful collection of the sensor information and transmit it to the processor 220. According to various embodiments, the processor 220 may map and manage the event (e.g., a fingerprint recognition event) to a touch object state event (e.g., a finger state event) or information corresponding thereto. The processor 220 may also pass the mapped event (or a corresponding command) to the associated components (e.g., the touch input device 280 or the display 290).
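The mapping-and-dispatch flow can be sketched as a lookup table plus a subscriber list. The event names below are invented for illustration (the specification names a finger status event table in FIG. 11 but this text does not enumerate its entries):

```python
# Hypothetical mapping from fingerprint recognition events to finger
# state events; these names are not taken from the specification.
FINGER_STATE_BY_FP_EVENT = {
    "fp_recognition_success": "finger_normal_state",
    "fp_moisture_pattern":    "finger_water_film_state",
    "fp_glove_pattern":       "finger_glove_state",
}

class EventDispatcher:
    # Passes the mapped finger state event to subscribed components,
    # e.g. the touch input device 280 or the display 290.
    def __init__(self):
        self._handlers = []

    def subscribe(self, handler):
        self._handlers.append(handler)

    def on_fingerprint_event(self, fp_event):
        state_event = FINGER_STATE_BY_FP_EVENT.get(
            fp_event, "finger_unknown_state")
        for handler in self._handlers:
            handler(state_event)
        return state_event
```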

The processor 220 may perform computations or data processing relating to control and / or communication of at least one other component included in the electronic device 200. According to one embodiment, the processor 220 may drive one or more application programs stored in the memory 230 to control a plurality of hardware or software components coupled to the processor 220, and may perform various data processing and operations. For example, the processor 220 may drive the touch input control program 231 stored in the memory 230 to perform the same or similar operations and functions as the processor 130 of FIG.

According to various embodiments, the processor 220 may activate the sensor 210. According to one embodiment, the processor 220 may activate a fingerprint recognition sensor in conjunction with performing a fingerprint recognition function. In addition, the processor 220 may activate the sensor 210 at a point in time when the electronic device 200 is started to be used, for example, when the screen of the electronic device 200 is changed to a turn-on state. In addition, the electronic device 200 can activate the sensor 210 at a time when a specific application program is executed or a specific application program requests.

According to various embodiments, the processor 220 can determine the state of the touch object by analyzing sensor information received from the sensor 210. According to one embodiment, the processor 220 can receive sensor information from the sensor 210 in connection with an operation or function implemented in the touch input control program 231 and analyze the received sensor information to determine the state of the touch object.

According to various embodiments, the processor 220 may communicate a designated touch object state event (or a corresponding command) to a corresponding component included in the electronic device 200 according to the determined state of the touch object. For example, the processor 220 may pass a water film state event (or a corresponding command) to the touch input device 280 or the display 290 when the touch object is in the water film state. In this regard, the touch object state event may be designated as a normal state event when the state of the touch object is the normal state, as a water film state event when it is the water film state, as a contamination film state event when it is the contamination film state, or as a glove-wearing state event when it is the glove-wearing state.

Memory 230 may store instructions or data related to at least one other component of electronic device 200. According to one embodiment, the memory 230 may store the touch input control program 231. The touch input control program 231 may include a module, a program, a routine, a set of instructions or a process related to touch input processing.

According to various embodiments, the memory 230 may store steady state information of the touch object. For example, the memory 230 may store sensor information collected based on the sensor 210 when the touch object is in a normal state. According to various embodiments, the memory 230 may store pattern information of various foreign objects. For example, the memory 230 may store sensor information obtained by sensing the foreign object (e.g., water, sweat, dust, sand, soil, gloves, etc.) based on the sensor 210. Also, the memory 230 may store the pH concentration, surface condition, electrical conductivity, or frequency spectrum information of the foreign matter.

According to various embodiments, the memory 230 may store information related to the state of the touch object in connection with performing an operation or function implemented in the touch input control program 231. For example, the memory 230 may store the determined state of the touch object, the state determination point (the point in time at which the state of the touch object was determined), the adjusted touch sensitivity level, display setting information, and so on.
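Purely as an illustration of the state-related record described above, such stored information might be modeled as follows; all field names and values are assumptions, not part of the disclosed embodiments.

```python
from dataclasses import dataclass, field

@dataclass
class TouchStateRecord:
    """Hypothetical record of touch-object state information kept in memory."""
    state: str                # e.g. "normal", "water_film", "glove" (assumed names)
    determined_at: float      # state determination point (timestamp)
    sensitivity_level: int    # adjusted touch sensitivity level
    display_settings: dict = field(default_factory=dict)

# Example record after a water-film determination (values are illustrative)
record = TouchStateRecord(state="water_film",
                          determined_at=1_000.0,
                          sensitivity_level=2,
                          display_settings={"object_scale": 1.2})
```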

The touch input device 280 may collect sensor information corresponding to approach or contact of a touch object. The touch input device 280 may have the same or similar configuration as the second touch sensor 150 of FIG. The touch input device 280 may include a touch control device 240, a touch panel 250, and the like.

The touch control device 240 can perform control and data processing of the touch panel 250 in association with the touch recognition function. According to one embodiment, the touch control device 240 can set the touch sensitivity of the touch panel 250. For example, in the case of the capacitive touch input method, the touch control device 240 can set the range of capacitance change magnitudes that are processed as a valid touch input.

According to various embodiments, the touch control device 240 may receive touch information (e.g., touch coordinates, touch time, or touch intensity) from the touch panel 250. Also, the touch control device 240 can determine whether the touch input corresponding to the received touch information is a valid touch input. For example, the touch control device 240 may compare the touch intensity (e.g., the magnitude of capacitance change) with the range of touch intensity corresponding to the touch sensitivity to determine whether the touch input is a valid touch input. In addition, the touch control device 240 can transmit the touch information corresponding to a touch input determined to be a valid touch input to the processor 220.

According to various embodiments, the touch control device 240 may receive a touch object state event (or a corresponding command) from the processor 220 and set the touch function according to the touch object state event (or the corresponding command). For example, when the touch object state event is a normal state event, the touch control device 240 may set the touch function as a general function; when the touch object state event is a water film state event, a contamination film state event, or a glove wearing state event, it may set the touch function as a water film function, a contamination film function, or a glove function, respectively.

In this regard, the touch function may include a function of determining the validity of a touch input by changing the touch sensitivity of the touch panel 250 according to the state of the touch object. According to one embodiment, when the touch function is set to the water film function, the touch control device 240 may reduce the touch sensitivity of the touch panel 250, thereby invalidating or noise-processing touch inputs caused by water, sweat, or the like. For example, the touch control device 240 can increase the magnitude of capacitance change that is processed as a valid touch input. Accordingly, even if a substance such as water or sweat adhering to the touch object, which is capable of causing a change in capacitance on the touch panel 250, drops to a point adjacent to the touch object and contacts the touch panel 250, the touch control device 240 can invalidate or noise-process the resulting change in capacitance. Also, when the touch function is set to the contamination film function or the glove function, the touch control device 240 can increase the touch sensitivity of the touch panel 250. Accordingly, the touch control device 240 can process a touch input as valid even if the magnitude of capacitance change corresponding to approach or contact of the touch object is low due to dust, sand, soil, gloves, or the like.
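A minimal sketch of the capacitive sensitivity logic described above, assuming invented threshold values: the water film function raises the capacitance-change threshold so that changes caused by water or sweat are treated as noise, while the contamination film and glove functions lower it so that weakened touches remain valid. Neither the function names nor the numeric thresholds come from the disclosed embodiments.

```python
# Minimum capacitance change accepted as a valid touch per touch function.
# All values are invented for illustration.
THRESHOLDS = {
    "general": 50,             # baseline
    "water_film": 120,         # reduced sensitivity: ignore small changes
    "contamination_film": 20,  # increased sensitivity
    "glove": 15,               # increased sensitivity
}

def is_valid_touch(capacitance_change, touch_function="general"):
    """Accept a capacitance change as a valid touch only if it meets
    the threshold set for the current touch function."""
    return capacitance_change >= THRESHOLDS[touch_function]
```

Under this sketch, a moderate change caused by a water droplet (say 80 units) would be accepted under the general function but rejected, i.e. noise-processed, under the water film function; a weak gloved touch (say 18 units) would be rejected under the general function but accepted under the glove function.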

The touch panel 250 can detect approach or contact of a touch object by a capacitive (electrostatic), pressure-sensitive, infrared, or ultrasonic method. The touch panel 250 may transmit touch information, for example, touch coordinates, touch time, or touch intensity corresponding to approach or contact of the sensed touch object, to the touch control device 240. According to various embodiments, the touch panel 250 may be configured to include the touch control device 240.

According to various embodiments, the touch input device 280 may communicate the collected sensor information to the processor 220 in response to an approach or touch of a touch object. In this case, the processor 220 can determine the state of the touch object using the sensor information. Further, the processor 220 may transmit a touch object state event (or a corresponding command) corresponding to the state of the touch object to at least one of the sensor 210 and the touch input device 280. Upon receiving the touch object state event (or the corresponding command) from the processor 220, the sensor 210 or the touch input device 280 may perform a calibration operation (e.g., a touch point adjustment or a touch sensitivity setting) related to the performance of its functions according to the touch object state event (or the corresponding command).

The display 290 can visually output data to the screen. For example, display 290 may display various content (e.g., text, images, video, icons, or symbols). The display 290 may include a display controller 260, a display panel 270, and the like.

The display control device 260 can perform control and data processing of the display panel 270 with respect to the display function. According to one embodiment, the display control device 260 can receive display object information from the processor 220 and output the display object information to the display panel 270.

According to various embodiments, the display control device 260 may receive a touch object state event (or a corresponding command) from the processor 220 and control the display panel 270 according to the touch object state event (or the corresponding command). According to one embodiment, the display control device 260 may control the display panel 270 to adjust the size or position of a display object according to the touch object state event (or the corresponding command). For example, when the touch object state event is a water film state event, the display control device 260 may control the size of the display object to be adjusted by a predetermined ratio.

According to various embodiments, the display control device 260 may receive, from the processor 220, information of a touch function selection object including items corresponding to the various states of the touch object, and display the information on the display panel 270. For example, the display control device 260 may control the display panel 270 to display a touch function selection object that includes items such as a general function corresponding to the normal state of the touch object, a water film function corresponding to the water film state, a contamination film function corresponding to the contaminated film state, or a glove function corresponding to the glove wearing state.

According to various embodiments, the display control device 260 may control the display panel 270 to display a touch object state icon (or image) indicating the state of the touch object. The display control device 260 may control the image of the touch object state icon to be displayed differently according to the state of the touch object. For example, the display control device 260 may control the shape, color, or size of the image representing the touch object state icon to be displayed differently according to the state of the touch object. According to one embodiment, the display control device 260 may adjust the display position, transparency, background color, or the like of the touch object state icon according to the state of the touch input processing function (turn-on state or turn-off state). For example, when the touch input processing function is not used (turn-off state), the display control device 260 may not output the touch object state icon on the screen. In addition, when the touch input processing function is used (turn-on state), the display control device 260 may display the touch object state icon, set to an image of a different shape, color, or size according to the state of the touch object, while adjusting its display position, background color, and the like.

Display panel 270 may display a variety of content to the user, such as text, images, video, icons, or symbols. The display panel 270 may be embodied as being flexible, transparent, or wearable. According to various embodiments, the display panel 270 may be configured to include the display control device 260. According to one embodiment, the display panel 270 may be combined with the touch panel 250 into a single module. In some embodiments, the touch input device 280 and the display 290 may be configured as a single module.

According to various embodiments, instead of adjusting the touch settings by communicating a touch object state event (or a corresponding command) to the touch control device 240 or the display control device 260, the processor 220 may directly adjust the touch sensitivity of the touch panel 250, or directly adjust the size or position of a display object on the display panel 270 and display it. In addition, the processor 220 may control the touch function selection object or the touch object state icon (or image) to be displayed on the display panel 270.

According to various embodiments, the electronic device 200 may further include a voice output device. In this case, the voice output device can output voice information related to the state of the touch object. For example, the voice output device may output voice information related to the state of the touch object at the time when the state of the touch object is determined or when the touch input processing method is adjusted according to the state of the touch object.

As described above, according to various embodiments, an electronic device (e.g., electronic device 200) may include a first touch sensor (e.g., sensor 210) that senses a touch object and collects sensor information, a processor (e.g., processor 220) that determines the state of the touch object based on the sensor information, and a second touch sensor (e.g., touch input device 280) whose touch sensitivity is adjusted according to the state of the touch object.

According to various embodiments, the first touch sensor may include at least one of a fingerprint recognition sensor, a tactile sensor, a pH concentration sensor, or a touch sensor.

According to various embodiments, the processor can determine the state of the touch object by comparing the sensor information with steady state information corresponding to a state in which no foreign object is detected between the touch object and the first touch sensor, and with pattern information of the foreign object including at least one of information obtained by sensing the foreign object based on the first touch sensor, pH concentration information of the foreign object, surface state information, electrical conductivity information, and frequency spectrum information.

According to various embodiments, the processor may control to adjust the touch region of display objects output to the display (e.g., display 290) according to the state of the touch object.

According to various embodiments, the processor may control to change at least one of the size or position of the display objects to correspond to the touch region.

According to various embodiments, the processor can control to display on the display a touch function selection object that includes at least one item corresponding to the state of the touch object.

According to various embodiments, the processor may control the touch function selection object to be displayed at at least one of: a time when the screen of the electronic device changes from a turn-off state to a turn-on state, a time when the touch input processing function changes from a turn-off state to a turn-on state, a time when a specific physical button included in the electronic device is selected, a time when a specific application program included in the electronic device is executed or a time when the specific application program makes a request, a time when the touch object moves by a predetermined distance in a predetermined direction while a certain area of the screen is pressed by the touch object, or a time when the electronic device moves or swivels in a predetermined direction a predetermined number of times.

According to various embodiments, the processor may update the state of the touch object based on sensor information collected by sensing the touch object at a point in time after a predetermined time has elapsed from the time at which the state of the touch object was determined, and the touch sensitivity of the second touch sensor may be adjusted according to the updated state of the touch object.

According to various embodiments, the processor may control an icon, set to an image that differs in at least one of shape, color, or size according to the state of the touch object, to be displayed in a certain area of the display.

According to various embodiments, when the state of the touch object is changed, the processor may control the output of an object including at least one of text, an image, or an icon associated with the change of the touch object state, and/or the output of voice information related to the change.

FIG. 3 illustrates a method of operating an electronic device associated with a method for processing a touch input in response to sensor information according to various embodiments. According to various embodiments, the electronic device (e.g., electronic device 200 of FIG. 2) may activate the sensor (e.g., sensor 210 of FIG. 2) at a time when the screen changes from a turn-off state to a turn-on state, when a specific application program is executed, or when the application program makes a request.

Referring to FIG. 3, as in operation 310, the electronic device may receive sensor information from the sensor corresponding to an approach or touch of a touch object. For example, the electronic device may receive, from the fingerprint recognition sensor, fingerprint information including the length, direction, or feature points of the ridges in the user's fingerprint; from the tactile sensor, information such as a distribution of contact or pressure; from the pH concentration sensor, information such as a pH concentration distribution over the contact surface of the touch object; or, from the touch sensor, touch information from which approach or contact of the touch object can be determined by a change in capacitance or a change in pressure.

With respect to the operation of receiving the sensor information, the electronic device may receive the sensor information at a specific time point that satisfies a specific condition. According to one embodiment, the electronic device can receive, from the fingerprint recognition sensor, fingerprint information collected at the time of recognizing the user's fingerprint in connection with the fingerprint recognition function. According to various embodiments, the electronic device may receive the sensor information corresponding to the first touch operation of the touch object sensed based on the touch sensor. In addition, the electronic device may receive sensor information corresponding to a touch operation sensed at a point in time after a specified time has elapsed since the first touch operation.

Upon receipt of the sensor information, the electronic device may analyze the sensor information to confirm the state of the touch object, as in operation 320. According to one embodiment, the electronic device can check the state of the touch object by comparing the sensor information with the steady state information of the touch object stored in the storage medium (e.g., the memory 230 of FIG. 2). For example, the electronic device may compare fingerprint information received from the fingerprint recognition sensor with normal-state fingerprint information stored in the storage medium (for example, fingerprint information collected in a state where no foreign matter is detected between the finger and the fingerprint recognition sensor) to check the state of the user's finger.

According to various embodiments, the fingerprint recognition sensor can internally analyze the fingerprint information of the user to determine the state of the user's finger. Further, the fingerprint recognition sensor may transmit a finger state event (or corresponding information) corresponding to the state of the user's finger to the electronic device, and the electronic device may confirm the finger state based on the received finger state event (or the corresponding information). Alternatively, the fingerprint recognition sensor may transmit a fingerprint recognition event generated in the process of collecting the user's fingerprint information to the electronic device. In this case, the electronic device can manage the fingerprint recognition event by mapping it to a finger state event (or corresponding information). In this process, the electronic device can confirm the finger state based on the finger state event (or the corresponding information).

According to various embodiments, if the state of the touch object is the normal state, then operation 330 may be skipped. If the state of the touch object is not the normal state, as in operation 330, the electronic device can finely classify the state of the touch object. For example, the electronic device can classify the state of the touch object into a water film state, a contaminated film state, or a glove wearing state. In this regard, a method of finely classifying the state of the touch object may include a method of using the sensor information or a method of using the touch function selection object. The methods of operating the electronic device related to finely classifying the state of the touch object will be described with reference to the embodiments below.

At operation 340, the electronic device may determine whether the state of the touch object has changed. For example, the electronic device can check the previously determined state of the touch object stored in the storage medium. The electronic device may compare the state of the touch object stored in the storage medium with the newly determined state of the touch object to determine whether or not the state has changed. If no information related to the state of the touch object is stored in the storage medium, the electronic device can store the newly determined state of the touch object, the time of the state determination, and the like in the storage medium. In addition, when the state of the touch object has changed, the electronic device can likewise store the newly determined state of the touch object, the time of the state determination, and the like in the storage medium.

If the state of the touch object has changed, as in operation 350, the electronic device can set the touch function according to the state of the touch object. For example, the electronic device can set the touch function as a general function, a water film function, a contamination film function, or a glove function according to the state of the touch object. According to various embodiments, the electronic device may adjust the touch sensitivity of the touch input device (e.g., the touch input device 280 of FIG. 2) according to the set touch function. According to various embodiments, the electronic device may communicate information related to the set touch function to the touch input device. For example, the electronic device can transmit information related to the set touch function to the touch input device in the form of an instruction. The touch input device may perform a calibration operation (e.g., a touch point adjustment or a touch sensitivity setting) related to its performance according to the received information. Also, the electronic device can adjust the touch area of the display object according to the state of the touch object, and correspondingly control the output state, such as the size or position, of the display object.

If the state of the touch object is not changed, the electronic device can maintain the setting of the previously set touch function. For example, the electronic device can maintain the touch sensitivity of the touch input device as it is, and can maintain the output state of the display object as it is. As in operation 360, the electronic device may process the touch input in response to the approach or touch of the touch object detected later including the time when the sensor information is received according to the set touch function.

According to various embodiments, the electronic device may perform operation 310 and the subsequent operations again at a point in time after a predetermined time has elapsed from the state determination time stored in the storage medium. For example, the electronic device may receive sensor information corresponding to approach or contact of a new touch object, and may update the state of the touch object based on the received sensor information. Accordingly, the electronic device can more accurately determine a change in the state of the touch object, and can adjust the touch input processing method according to the change in the state of the touch object.
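The periodic re-check described above may be sketched, purely for illustration, as follows; the function names and the re-check interval are assumptions, not part of the disclosed embodiments.

```python
RECHECK_INTERVAL = 60.0  # seconds; illustrative value only

def maybe_update_state(stored, now, read_sensor_state):
    """Return the (state, determined_at) pair, re-sensing and updating it
    only once the predetermined interval has elapsed since the stored
    state-determination time."""
    state, determined_at = stored
    if now - determined_at < RECHECK_INTERVAL:
        return stored                    # too soon: keep the current state
    return (read_sensor_state(), now)    # re-sense and store the new state

# After the interval elapses, the state is updated from the sensor;
# before it elapses, the stored state is kept as-is.
updated = maybe_update_state(("normal", 0.0), 90.0, lambda: "water_film")
unchanged = maybe_update_state(("normal", 0.0), 30.0, lambda: "water_film")
```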

According to various embodiments, a method of finely classifying the state of a touch object may include a method of using sensor information collected in response to approach or contact of the touch object based on a sensor (e.g., sensor 210 of FIG. 2), or a method of using a touch function selection object that includes items corresponding to the various states of the touch object.

FIG. 4 illustrates a method of operating an electronic device related to a method of setting a touch function using sensor information according to various embodiments.

Referring first to FIG. 4, as in operation 410, the electronic device can identify sensor information collected in response to an approach or touch of a touch object based on the sensor. For example, the sensor information may include fingerprint information collected based on the fingerprint recognition sensor, contact state information collected based on the tactile sensor, pH concentration distribution information of the contact surface collected based on the pH concentration sensor, touch information collected based on the touch sensor, and the like.

In operation 430, the electronic device may analyze the sensor information to finely classify the state of the touch object. According to one embodiment, the electronic device can classify the state of the touch object by comparing the pattern information of various foreign objects stored in the storage medium (e.g., the memory 230 of FIG. 2) with the sensor information. For example, the electronic device can determine the degree of similarity between the pattern information of various foreign objects and the sensor information. Thus, the electronic device can classify the state of the touch object as a normal state, a water film state, a contaminated film state, or a glove wearing state.
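As a hedged sketch of the similarity comparison in operation 430: the received sensor information is compared against stored foreign-object pattern information and the most similar pattern determines the state. The feature vectors and the simple squared-distance similarity measure are illustrative assumptions, not the actual matching method of the embodiments.

```python
# Hypothetical stored pattern information per touch-object state; the
# three-component feature vectors are invented for illustration.
PATTERNS = {
    "normal":            [0.9, 0.1, 0.0],
    "water_film":        [0.4, 0.8, 0.1],
    "contaminated_film": [0.3, 0.2, 0.7],
    "glove":             [0.1, 0.1, 0.2],
}

def classify_state(sensor_vector):
    """Return the stored state whose pattern is most similar to
    (here: closest in squared distance to) the sensor information."""
    def distance(pattern):
        return sum((a - b) ** 2 for a, b in zip(sensor_vector, pattern))
    return min(PATTERNS, key=lambda s: distance(PATTERNS[s]))

state = classify_state([0.42, 0.79, 0.12])  # near the water-film pattern
```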

Once the state of the touch object is classified, as in operation 450, the electronic device can designate a touch function according to the state of the touch object. For example, the electronic device can designate the touch function as a general function when the state of the touch object is the normal state, and as the water film function in the case of the water film state. In addition, the electronic device can designate the touch function as the contamination film function when the state of the touch object is the contaminated film state, or as the glove function when the touch object is wearing a glove.

If a touch function is designated, the electronic device can adjust the touch input processing method according to the designated touch function, as in operation 470. For example, the electronic device may change the touch settings. According to one embodiment, the electronic device can adjust the touch sensitivity of the touch input device (e.g., the touch input device 280 of FIG. 2) according to the touch function. For example, if the touch function is designated as the water film function, the electronic device can adjust the touch sensitivity of the touch input device to a low level to invalidate or noise-process touch inputs generated by water, sweat, or the like. Further, when the touch function is designated as the contamination film function or the glove function, the electronic device can adjust the touch sensitivity of the touch input device to a high level so that a touch intensity weakened by dust, sand, soil, gloves, or the like can still be recognized as a valid touch input. According to various embodiments, the electronic device may transmit information related to the touch function to the touch input device (e.g., in the form of an instruction), and the touch input device may perform a calibration operation related to the performance of its functions according to the received information. In some embodiments, the touch input device may change the algorithm and threshold for ghost touch recognition that it operates internally according to the received information.

According to various embodiments, the electronic device can adjust the touch region of the display objects according to the designated touch function. In addition, the electronic device can adjust the output state, such as the size or position of the display objects, to correspond to the touch region. For example, if the touch function is designated as the water film function, the electronic device can display the display objects with their size adjusted by a predetermined ratio. As a result, the electronic device can prevent an unintended region from being selected due to water or sweat spreading on the touch object.
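As a hypothetical illustration of adjusting a display object's touch region under the water film function: the object's rectangle is scaled by a predetermined ratio so that spreading water or sweat is less likely to select an unintended region. The rectangle model and the ratio value are assumptions for illustration.

```python
def scale_touch_region(rect, ratio):
    """Scale a (x, y, width, height) rectangle about its top-left corner."""
    x, y, w, h = rect
    return (x, y, w * ratio, h * ratio)

WATER_FILM_RATIO = 1.5  # illustrative enlargement ratio, not from the disclosure

# Enlarge a display object's touch region when the water film function is set
enlarged = scale_touch_region((10, 20, 100, 40), WATER_FILM_RATIO)
```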

As described above, instead of the method of finely classifying the state of the touch object using the sensor information collected based on the sensor, the electronic device can receive the state of the touch object from the user. For example, the electronic device can guide the user to select the state of the touch object by displaying a touch function selection object including items corresponding to various states of the touch object on the screen.

FIG. 5 illustrates a method of operating an electronic device related to a method of setting a touch function using a touch function selection object according to various embodiments.

Referring to FIG. 5, as in operation 510, the electronic device may display the touch function selection object on the screen. In this case, the electronic device can receive the state of the touch object from the user through the touch function selection object. If no user input is received for a certain period of time, or if the user input is not a valid selection (e.g., a selection of one of the items of the touch function selection object), the electronic device may maintain the output state of the touch function selection object. Alternatively, the electronic device may terminate the output of the touch function selection object and maintain the touch function according to the previously set information. According to various embodiments, the electronic device may display an object including text, an image, or an icon that prompts the user to select the state of the touch object, or may output voice information that induces the user to select the state of the touch object through the voice output device.

According to various embodiments, the act of displaying the touch function selection object on the screen, as in operation 510, may be performed at a specific time point that satisfies a specific condition. According to one embodiment, the electronic device may display the touch function selection object at a time when the screen changes from a turn-off state to a turn-on state, a time when the touch input processing function changes from a turn-off state to a turn-on state, a time when a specific application program is executed or when a specific application program makes a request, a time when the touch object is flicked within a predetermined distance in a predetermined direction while a certain area of the screen is pressed by the touch object, or a time when the electronic device moves or rotates in a predetermined direction a predetermined number of times at a predetermined interval.

As in operation 530, the electronic device may receive a touch function selection event generated upon a valid selection. For example, the electronic device may receive a steady state event, a water film state event, a contamination film state event, or a glove wearing state event according to the selected item. When the touch function selection event is received, as in operation 550, the electronic device can confirm the touch function corresponding to the touch function selection event. For example, the electronic device can confirm the touch function as a general function if the touch function selection event is the steady state event, as a water film function in the case of the water film state event, as a contamination film function in the case of the contamination film state event, or as a glove function in the case of the glove wearing state event.
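The mapping of operation 550 from a received selection event to the corresponding touch function may be sketched as follows; the event names, function names, and the fallback choice are illustrative stand-ins, not the disclosed implementation.

```python
# Hypothetical mapping from touch function selection events to functions
EVENT_TO_FUNCTION = {
    "steady_state_event": "general",
    "water_film_state_event": "water_film",
    "contamination_film_state_event": "contamination_film",
    "glove_wearing_state_event": "glove",
}

def confirm_touch_function(event):
    # Default to the general function for unrecognized events
    # (an illustrative choice, not specified by the disclosure).
    return EVENT_TO_FUNCTION.get(event, "general")

selected = confirm_touch_function("water_film_state_event")
```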

Once the selected touch function is confirmed, as in operation 570, the electronic device can adjust the touch input processing method according to the selected touch function. For example, the electronic device may change the touch settings. According to one embodiment, the electronic device can adjust the touch sensitivity of the touch input device (e.g., the touch input device 280 of FIG. 2) according to the selected touch function. Also, the electronic device can adjust the touch area of the display objects according to the selected touch function, and adjust the output state of the display objects correspondingly.

As described above, according to various embodiments, a touch input processing method of an electronic device includes an operation of sensing a touch object based on a first touch sensor and collecting sensor information, an operation of determining the state of the touch object corresponding to the sensor information, and an operation of adjusting the touch sensitivity of a second touch sensor according to the state of the touch object.

According to various embodiments, the operation of collecting the sensor information may include an operation of collecting the sensor information corresponding to approach or contact of the touch object based on at least one of a fingerprint recognition sensor, a tactile sensor, a pH concentration sensor, or a touch sensor.

According to various embodiments, the operation of determining the state of the touch object may include an operation of comparing the sensor information with steady state information corresponding to a state in which no foreign substance is detected between the touch object and the first touch sensor, and with pattern information of a foreign substance including at least one of sensing information of the foreign substance, pH concentration information of the foreign substance, surface state information, electric conductivity information, or frequency spectrum information.
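The comparison step can be sketched as follows; this is an illustrative sketch only, and the feature names (conductivity, pH) and the tolerance value are assumptions, not values from the disclosure.

```python
# Hypothetical sketch of the comparison step: sensor information is matched
# against steady state information and foreign-substance pattern information.
# Feature names and the tolerance are illustrative assumptions.

def determine_state(sensor_info, steady_state, patterns, tolerance=0.1):
    """Return the name of the best-matching touch object state.

    sensor_info, steady_state, and each pattern entry are dicts of
    numeric features (e.g. electric conductivity, pH concentration)."""
    def distance(reference):
        return sum(abs(sensor_info[k] - reference[k]) for k in reference)

    if distance(steady_state) <= tolerance:
        return "steady"
    # Otherwise pick the foreign-substance pattern closest to the measurement.
    return min(patterns, key=lambda name: distance(patterns[name]))
```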

According to various embodiments, the operation of determining the state of the touch object may include an operation of controlling to display, on the display, a touch function selection object including at least one item corresponding to the state of the touch object.

According to various embodiments, the operation of controlling the display of the touch function selection object on the display may include an operation of controlling the display at at least one of the following time points: a time point at which the screen of the electronic device changes from a turned-off state to a turn-on state, a time point at which a specific physical button included in the electronic device is selected, a time point at which a specific application program included in the electronic device is executed or at which the specific application program requests it, a time point at which the touch object is moved a predetermined distance in a certain direction a predetermined number of times while pressing a certain area of the screen, or a time point at which the electronic device moves or rotates a predetermined number of times.

According to various embodiments, the touch input processing method may further include an operation of controlling the touch area of the display objects output to the display according to the state of the touch object.

According to various embodiments, the operation of controlling the touch area may further include controlling to change at least one of the size or the position of the display objects to correspond to the touch area.

According to various embodiments, the touch input processing method may further include an operation of updating the state of the touch object based on sensor information obtained by sensing the touch object at a time point when a predetermined time has elapsed from the time point at which the state of the touch object was determined, and an operation of adjusting the touch sensitivity of the second touch sensor according to the updated state of the touch object.

According to various embodiments, the touch input processing method may further include an operation of displaying, in a predetermined region of the display, an icon whose shape, color, or size is set according to the state of the touch object.

According to various embodiments, the touch input processing method may further include an operation of, when the state of the touch object changes, outputting an object including at least one of a text, an image, or an icon related to the change of the touch object state, or outputting voice information related to the change.

FIG. 6 illustrates an embodiment of determining the state of a touch object based on a fingerprint sensor according to various embodiments.

Referring to FIG. 6, the electronic device 600 may include a physical button (or a physical panel) 610 having a fingerprint recognition sensor built therein. According to one embodiment, the electronic device 600 may support a user fingerprint recognition function in connection with user authentication. When an input signal for a turn-on state change occurs while the screen of the electronic device 600 is turned off, the electronic device 600 can activate the fingerprint recognition sensor. In addition, the electronic device 600 may guide the user's finger 630 to approach or contact the physical button 610 in which the fingerprint recognition sensor is built. For example, the electronic device 600 may display on the screen an object such as a text, an image, or an icon that induces the use of the fingerprint recognition function, or may output voice information that induces the use of the fingerprint recognition function through the voice output device.

When the sensor information is collected based on the fingerprint recognition sensor, the electronic device 600 may analyze the sensor information to determine the state of the touch object. For example, the electronic device 600 can determine the state of the touch object as a normal state, a water film state, a contaminated film state, a glove wearing state, or the like. Also, the electronic device 600 can designate the touch function according to the determined state of the touch object. When the touch function is designated, the electronic device 600 can adjust the touch input processing method according to the designated touch function. For example, the electronic device 600 can adjust the touch sensitivity of the touch input device (e.g., the touch input device 280 of FIG. 2). Also, the electronic device 600 can adjust the touch area of the display object according to the designated touch function, thereby adjusting the output state of the display object.

According to various embodiments, the electronic device 600 may output information corresponding to the designated touch function according to the determined state of the touch object. For example, the electronic device 600 may output to the screen a notification object 670 including a text, an image, or an icon indicating that the designated touch function is set. Also, the electronic device 600 can output voice information indicating that the designated touch function is set through the voice output device. According to various embodiments, the electronic device 600 may display the notification object 670 in a pop-up or screen switching manner. In the drawing, the notification object 670 is displayed in a pop-up form. The electronic device 600 can control the notification object 670 to remain displayed for a predetermined period of time, or to terminate the screen output of the notification object 670 and output the previous screen when a user input occurs. According to one embodiment, when the notification object 670 is displayed as a pop-up, the electronic device 600 may display the color, transparency, size, or position of the notification object 670 differently at predetermined time intervals for a predetermined time.

According to various embodiments, the electronic device 600 may display the touch object status icon 650 in a certain area of the screen, e.g., an indicator bar. The electronic device 600 may display an image of the touch object state icon 650 differently according to the state of the touch object. For example, when the state of the touch object is the water film state, the electronic device 600 can display the touch object state icon 650 in a droplet shape image. In addition, the electronic device 600 may display the touch object status icon 650 in a glove-like image when the state of the touch object is in a glove wearing state.

According to various embodiments, the electronic device 600 may display the touch object state icon 650 with different display states, transparencies, or background colors depending on the state (turn-on state or turn-off state) of the touch input processing function. For example, when the touch input processing function is turned off and an input signal for changing to the turn-on state is generated, the electronic device 600 can display the touch object state icon 650 in a certain area of the screen. In addition, when an input signal for changing the touch input processing function from the turn-on state to the turn-off state occurs, the electronic device 600 may not output the touch object state icon 650 to the screen, or may display the transparency, the background color, or the like of the touch object state icon 650 differently.

According to various embodiments, a processor (e.g., the processor 220 of FIG. 2) included in the electronic device 600 may transmit the information collected based on the fingerprint recognition sensor, or information obtained by converting the collected information, to a touch control device (e.g., the touch control device 240 of FIG. 2). The touch control device can adjust the touch sensitivity of the touch panel (e.g., the touch panel 250 of FIG. 2) using the information (the collected information or the converted information). In some embodiments, the processor may control the touch input device to adjust the touch sensitivity by transmitting an event (e.g., a touch object state event) corresponding to the information (or a corresponding command) to the touch input device. For example, the processor can determine, based on the fingerprint recognition sensor, whether the state of the touch object is the water film state. In addition, the processor may control the touch input device to adjust its touch sensitivity by transmitting a water film state event corresponding to the water film state (or a corresponding command) to the touch input device.

FIG. 7 shows a touch function selection object according to various embodiments. The electronic device 700 may display a touch function selection object that includes, as items, touch functions corresponding to various states of the touch object. For example, the electronic device 700 may display a touch function selection object that includes a general function, a water film function, a contamination film function, or a glove function as items.

Referring to FIG. 7, the electronic device 700 may display a touch function selection object in a submenu format in an upper menu (a menu displayed by dragging the upper area of the screen downward). For example, the electronic device 700 may display, on the upper menu, a submenu object 710 including at least one of text, images, or icons corresponding to the currently set touch function. The electronic device 700 may change the text, image, icon, or the like of the submenu object 710 according to the touch function. According to various embodiments, when the submenu object 710 is selected, the electronic device 700 can change the currently set touch function. For example, when the submenu object 710 is selected, the electronic device 700 can change the currently set touch function to any one of a general function, a water film function, a contamination film function, or a glove function. In this case, the electronic device 700 may change the text, image, icon, or the like of the submenu object 710 to correspond to the newly set touch function. The electronic device 700 may also display the transparency, color, or background color of the submenu object 710 differently according to the state of the touch input processing function.

According to various embodiments, the electronic device 700 may display a touch function selection object in a pop-up format in a certain area of the screen. For example, the electronic device 700 may display a pop-up object 730 that includes, as items, text, images, icons, or the like corresponding to the touch functions (the general function, the water film function, the contamination film function, or the glove function). According to one embodiment, the electronic device 700 may configure the items as button objects, or further include a button object, so that the user can select any one of the items. In addition, when any one of the items is selected, the electronic device 700 can change the currently set touch function to the touch function corresponding to the selected item.

According to various embodiments, the electronic device 700 may terminate the output of the pop-up object 730 when the pop-up object 730 has remained displayed for a certain period of time or when a user input is received. In this case, the electronic device 700 may release the resources associated with the pop-up object 730. According to one embodiment, when the pop-up object 730 has remained displayed for a certain period of time and its screen output ends, or when the user input is not a valid selection (e.g., not a selection of an item included in the pop-up object 730), the electronic device 700 can maintain the currently set touch function and return to the screen displayed before the pop-up object 730 appeared. According to various embodiments, the electronic device 700 may change the currently set touch function to the selected touch function if the user input is a valid selection. In this case, the electronic device 700 may terminate the screen output of the pop-up object 730 and output to the screen an object including a text, an image, or an icon indicating that the touch function has been changed to the selected one. Also, the electronic device 700 may output voice information indicating the change through the voice output device.

According to various embodiments, the electronic device 700 may display the touch function selection object in a screen switching manner. For example, the electronic device 700 can display, on the entire screen, an object such as a list whose items include text, images, icons, or the like corresponding to the touch functions.

FIG. 8 shows an embodiment for adjusting the touch sensitivity according to the state of the touch object according to various embodiments.

Referring to FIG. 8, the electronic device 800 may include a touch panel 810. The touch panel 810 may be constituted by one or more lattice-shaped cells into which the screen display area of the electronic device 800 is divided in the vertical and horizontal directions. For example, the touch panel 810 may include a plurality of cells, each occupying a predetermined area centered on a point where one of a plurality of vertical direction lines 811 and one of a plurality of horizontal direction lines 813 meet. In addition, the touch panel 810 can designate the point at which each vertical direction line 811 and each horizontal direction line 813 meet as the touch coordinates corresponding to each cell.
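As an illustrative aid, the cell-indexing scheme above can be sketched as follows; the millimeter units and the 4 mm line pitch are assumptions, not values from the disclosure.

```python
# Hypothetical sketch: map a contact point on the panel to the (column, row)
# index of the lattice cell that contains it. The 4.0 mm line pitch is an
# illustrative assumption.

def touch_coordinates(x_mm, y_mm, pitch_mm=4.0):
    """Return the (column, row) cell index containing the contact point."""
    return (int(x_mm // pitch_mm), int(y_mm // pitch_mm))
```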

According to various embodiments, when a touch operation is performed with water or sweat on the finger 830, a region 850 into which the water or sweat spreads may be generated around the region 831 that the finger 830 contacts. The electronic device 800 can recognize the touch coordinates 833 included in the region 831 as touch coordinates corresponding to a valid touch input. In addition, the electronic device 800 may recognize the touch coordinates 851 included in the region 850 as touch coordinates corresponding to valid touch inputs. However, the touch coordinates 851 may be touch coordinates not intended by the user. Further, when the touch operation is performed with water or sweat adhering to the finger, a region 870 onto which a drop of water or sweat falls may occur. In this case, the electronic device 800 can recognize the touch coordinates 871 included in the region 870 as touch coordinates corresponding to a valid touch input.

As described above, in order to prevent touch coordinates unintended by the user from being recognized, the electronic device 800 can adjust the touch input processing method according to the state of the touch object. For example, the electronic device 800 can adjust the touch sensitivity of the touch panel 810 according to the state of the touch object. According to one embodiment, the electronic device 800 can lower the touch sensitivity of the touch panel 810 when the state of the touch object is determined to be the water film state. For example, the electronic device 800 may adjust the touch sensitivity so that an input is processed as a valid touch input only when the magnitude of the capacitance change is greater than or equal to a specified magnitude. Accordingly, the electronic device 800 can process the capacitance change corresponding to the contact of the finger 830 as an effective touch input, and can invalidate, or process as noise, the capacitance changes due to water, sweat, or the like. As a result, the electronic device 800 can process only the touch coordinates 833 as touch coordinates corresponding to effective touch inputs.
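The threshold-based filtering just described can be sketched as follows; the threshold values and the reading format are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical sketch: treat a capacitance change as a valid touch only when
# its magnitude meets the threshold for the current touch-object state.
# The threshold values (arbitrary ADC counts) are illustrative assumptions.
THRESHOLDS = {"normal": 30, "water_film": 70}

def valid_touches(readings, state="normal"):
    """Filter (x, y, delta_capacitance) readings, keeping only coordinates
    whose capacitance change reaches the state-specific threshold."""
    limit = THRESHOLDS.get(state, THRESHOLDS["normal"])
    return [(x, y) for x, y, delta in readings if delta >= limit]
```

With the water film threshold in effect, a firm finger contact passes while the weaker changes from spread or dropped water are discarded as noise.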

FIG. 9 shows an embodiment for adjusting the output state of a display object according to the state of a touch object according to various embodiments. The electronic device 900 can perform a method of adjusting the touch area of the display object in addition to the method of adjusting the touch sensitivity according to the state of the touch object. Also, the electronic device 900 may adjust the output state of the display object to correspond to the touch area of the display object.

Referring to FIG. 9, the electronic device 900 can change the displayed screen from the screen 910 to the screen 930 according to the state of the touch object. The electronic device 900 can determine the state of the touch object and adjust the touch area of the display object when a state change is confirmed. Also, the electronic device 900 may change the output state of the display object to correspond to the touch area of the display object. For example, the electronic device 900 can display the screen 910 when the state of the touch object is the normal state. When the state of the touch object changes to the water film state, the electronic device 900 can enlarge the touch area of the display object displayed on the screen 910 by a predetermined ratio. In this case, the electronic device 900 may output the screen 930, in which the size of the display object is adjusted by the predetermined ratio to correspond to the touch area. Through the above-described method, the electronic device 900 can recognize the touch input more accurately.
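The object resizing step can be sketched as follows; the 1.2 ratio and the object fields (`w`, `h`) are illustrative assumptions.

```python
# Hypothetical sketch: enlarge display objects (and hence their touch areas)
# by a fixed ratio when the touch-object state changes (e.g., to the water
# film state). The ratio and field names are illustrative assumptions.

def scale_objects(objects, ratio):
    """Return new display objects with width and height scaled by `ratio`."""
    return [
        {**obj, "w": round(obj["w"] * ratio), "h": round(obj["h"] * ratio)}
        for obj in objects
    ]
```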

FIG. 10 illustrates a method of operating an electronic device associated with a method for processing a touch input based on a fingerprint recognition sensor according to various embodiments. According to various embodiments, an electronic device (e.g., the electronic device 200 of FIG. 2) may process a touch input (e.g., a finger touch) sensed through the fingerprint recognition sensor differently depending on which function the electronic device is performing. For example, the electronic device may process the touch input differently depending on whether it is performing the fingerprint recognition function or a function of determining the state of a touch object (e.g., a finger).

Referring to FIG. 10, the electronic device may sense a finger touch based on the fingerprint recognition sensor, as in operation 1010. In addition, the electronic device may determine whether the fingerprint recognition function is in an execution state, as in operation 1020. According to one embodiment, at a time point when the electronic device is turned on from the power-off state, the electronic device can perform the fingerprint recognition function. Further, the electronic device can perform the fingerprint recognition function at a time point when the screen of the electronic device changes from a turned-off state to a turn-on state, when a specific application program included in the electronic device is executed, or when the specific application program requests it.

If the fingerprint recognition function is in the execution state, as in operation 1030, the electronic device may perform the fingerprint recognition function based on the sensor information corresponding to the finger touch. For example, the electronic device may collect user fingerprint information corresponding to the sensor information (e.g., information about the length, direction, or specific points of the ridges included in the user fingerprint). Further, the electronic device can perform functions such as user authentication using the fingerprint information.

If the fingerprint recognition function is not in the execution state, the electronic device can determine whether the finger state determination function is being performed, as in operation 1040. In some embodiments, the electronic device may skip operation 1040 and perform the operations from operation 1050 onward if the fingerprint recognition function is not in an execution state. According to various embodiments, even when the fingerprint recognition function is in the execution state, the electronic device may perform the operations from operation 1050 onward along with the execution of operation 1030.

According to various embodiments, the electronic device may perform the function of determining the finger state at a time point when a specific physical button (e.g., a home button or a power button) is selected, when a specific object (e.g., an icon (or image)) configured on the screen is selected, or when a specific application program is executed or requests it. For example, when a specific physical button is selected (e.g., pressed) a predetermined number of times within a specified time, or when a specific physical button is selected at the time the electronic device completes the fingerprint recognition, the electronic device can perform the finger state determination function. In addition, the electronic device can perform the finger state determination function at a time point when a specific object (e.g., the touch object state icon 650 of FIG. 6) that functions to change the state of the touch input processing function is selected. The electronic device may also perform the finger state determination function at the time when a specific application program, for example, a health care application program, is executed.

According to various embodiments, the electronic device may perform the operations of operation 1050 and below, using the information collected through the fingerprint recognition sensor, while performing the fingerprint recognition function as in operation 1030 or at predetermined time intervals. For example, when a specific fingerprint recognition event occurs during execution of the fingerprint recognition function, the electronic device can transmit the fingerprint recognition event, a finger state event corresponding to the specific fingerprint recognition event, or the information collected based on the fingerprint recognition sensor to a processor (e.g., the processor 220 of FIG. 2) or a touch input device (e.g., the touch input device 280 of FIG. 2) to process the touch input. In some embodiments, the electronic device may store the information collected based on the fingerprint recognition sensor in a memory (e.g., the memory 230 of FIG. 2). In this case, the electronic device may determine the finger state based on the information stored in the memory at a specific point in time.

At operation 1050, the electronic device may verify finger status information corresponding to the sensor information. According to one embodiment, the electronic device may analyze the sensor information to determine a finger condition. Alternatively, the electronic device may check the finger status through a fingerprint recognition event occurring during the fingerprint recognition function or a finger status event corresponding to the fingerprint recognition event.

In operation 1060, the electronic device can set the touch function according to the finger state. For example, when the finger state is the water film state, the contaminated film state, or the glove wearing state, the electronic device can set the touch function to the water film function, the contamination film function, or the glove function, respectively. According to various embodiments, the electronic device may communicate the finger state event (or information corresponding thereto) to the touch input device. In this case, the touch input device can set the touch function within the touch input device based on the finger state event (or the information corresponding thereto) (e.g., perform a touch sensitivity adjustment or a ghost touch input removal function). At operation 1070, the electronic device may process the touch input according to the set touch function.
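The state-to-function selection of operation 1060 can be sketched as follows; the state and function labels are taken from the description above, while the fallback behavior is an illustrative assumption.

```python
# Hypothetical sketch of operation 1060: choose a touch function from the
# determined finger state. The fallback to the general function for an
# unrecognized state is an illustrative assumption.
STATE_TO_FUNCTION = {
    "normal": "general_function",
    "water_film": "water_film_function",
    "contamination_film": "contamination_film_function",
    "glove": "glove_function",
}

def set_touch_function(finger_state):
    """Return the touch function to apply for the given finger state."""
    return STATE_TO_FUNCTION.get(finger_state, "general_function")
```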

FIG. 11 shows a finger status event table according to various embodiments. According to various embodiments, an electronic device (e.g., the electronic device 200 of FIG. 2) can verify the state of a touch object (e.g., a finger) through a touch object state event (e.g., a finger state event). For example, the electronic device may include a program (e.g., the touch input control program 231 of FIG. 2) implemented to control touch input processing. The electronic device can confirm the state of the touch object through the touch object state event according to the processing routine implemented in the touch input control program. In this case, the touch input control program may include a touch object state event table (e.g., a finger status event table 1110) specifying the state of a touch object corresponding to each touch object state event. According to one embodiment, the electronic device may store the touch object state event table in a memory (e.g., the memory 230 of FIG. 2).

Referring to FIG. 11, the finger status event table 1110 may include event information corresponding to each finger state. For example, the finger status event table 1110 may include identifier information of the event specified according to the finger state. In addition, the finger status event table 1110 may include operation status information of the sensor (e.g., the sensor 210 of FIG. 2). According to one embodiment, the finger status event table 1110 may include information related to hardware or software of the sensor, interrupt information generated while sensor information corresponding to the finger state is collected through the sensor, and the like. For example, the finger status event table 1110 may include identifier information of an event corresponding to error information or interrupt information of the sensor occurring during sensor information collection.
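A table of this kind can be sketched as a simple lookup; the first three identifiers appear in the description of FIG. 12, while `STATUS_SENSOR_ERROR` and the fallback state are illustrative assumptions.

```python
# Hypothetical sketch of a finger status event table (cf. table 1110):
# event identifiers mapped to the finger state they indicate, plus an entry
# for a sensor error condition. STATUS_SENSOR_ERROR and the "unknown"
# fallback are illustrative assumptions.
FINGER_STATUS_EVENT_TABLE = {
    "STATUS_FINGERCONDITION_GOOD": "normal",
    "STATUS_FINGERCONDITION_DRY": "dry",
    "STATUS_FINGERCONDITION_WET": "wet",
    "STATUS_SENSOR_ERROR": "unknown",  # error during sensor info collection
}

def lookup_finger_state(event_id):
    """Return the finger state specified for the event identifier."""
    return FINGER_STATUS_EVENT_TABLE.get(event_id, "unknown")
```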

FIG. 12 shows a diagram for explaining a finger state event corresponding to a fingerprint recognition event according to various embodiments.

Referring to FIG. 12, an electronic device (e.g., the electronic device 200 of FIG. 2) may associate a fingerprint recognition event 1210 with a finger state event 1230 (or information corresponding thereto). According to one embodiment, a fingerprint recognition sensor (e.g., the sensor 210 of FIG. 2) may communicate a fingerprint recognition event 1210 that occurs during the collection of sensor information to the electronic device. For example, if the fingerprint recognition sensor succeeds in the fingerprint recognition, the fingerprint recognition sensor can transmit at least one of the sensor information or a fingerprint recognition success event corresponding to the success of the fingerprint recognition (e.g., the event whose identifier is designated as "STATUS_GOOD") to a processor (e.g., the processor 220 of FIG. 2). According to various embodiments, the electronic device may store the fingerprint recognition event 1210 in a memory (e.g., the memory 230 of FIG. 2) along with the event occurrence time information.

According to various embodiments, the electronic device may manage the fingerprint recognition event 1210 by mapping it to a finger state event 1230 (or information corresponding thereto). For example, the electronic device can manage the occurrence of a fingerprint recognition success event (e.g., the event whose identifier is designated as "STATUS_GOOD" among the fingerprint recognition events 1210) by mapping it to a finger state event in a normal state (e.g., the event whose identifier is designated as "STATUS_FINGERCONDITION_GOOD" among the finger state events 1230) (or information corresponding thereto), or to a finger state event in a dry state (e.g., the event whose identifier is designated as "STATUS_FINGERCONDITION_DRY" among the finger state events 1230) (or information corresponding thereto). According to various embodiments, the electronic device may provide the finger state event (or the corresponding information) to at least one of a touch input device (e.g., the touch input device 280 of FIG. 2) or a display (e.g., the display 290 of FIG. 2). For example, the electronic device can manage the occurrence of a wet state event (e.g., the event whose identifier is designated as "IMAGE_QUALITY_WET_FINGER" among the fingerprint recognition events 1210), which occurs when the finger or the fingerprint recognition sensor is wet, by mapping it to the wet finger state event (e.g., the event whose identifier is designated as "STATUS_FINGERCONDITION_WET" among the finger state events 1230) (or information corresponding thereto). In addition, the electronic device can control to process the touch input by transmitting the wet finger state event (or the information corresponding thereto) to at least one of the touch input device or the display.
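The mapping just described can be sketched using the identifiers shown in FIG. 12; the choice to map "STATUS_GOOD" to the normal-state event (rather than the dry-state event) is an illustrative assumption, since the description permits either.

```python
# Sketch of the fingerprint-event-to-finger-state-event mapping described
# above, using identifiers from FIG. 12. Mapping STATUS_GOOD to the
# normal-state event (rather than the dry-state event) is an illustrative
# choice; the description allows either.
FP_TO_FINGER_STATE = {
    "STATUS_GOOD": "STATUS_FINGERCONDITION_GOOD",
    "IMAGE_QUALITY_WET_FINGER": "STATUS_FINGERCONDITION_WET",
}

def map_fingerprint_event(fp_event):
    """Translate a fingerprint recognition event into a finger state event.

    Returns None when no mapping is defined for the event."""
    return FP_TO_FINGER_STATE.get(fp_event)
```

The resulting finger state event (or corresponding information) would then be delivered to the touch input device or the display.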

According to various embodiments, the electronic device can control to process the touch input based on the fingerprint recognition event 1210 and the event occurrence time information stored in the memory. For example, the electronic device can identify the most recently stored fingerprint recognition event 1210 based on the event occurrence time information. If the storage time of the fingerprint recognition event 1210 does not exceed a designated time, the electronic device can control to process the touch input by transmitting the finger state event 1230 (or the corresponding information) corresponding to the fingerprint recognition event 1210 to at least one of the touch input device or the display. According to one embodiment, when the storage time of the fingerprint recognition event 1210 exceeds the designated time, the electronic device can control to output a display object or voice information that induces the finger to approach or contact the fingerprint recognition sensor.
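The staleness check above can be sketched as follows; the 30-second window and the (timestamp, event) storage format are illustrative assumptions.

```python
# Hypothetical sketch of the staleness check: reuse the most recently stored
# fingerprint recognition event only while it is younger than a designated
# time; otherwise return None so the caller can prompt the user to touch the
# fingerprint recognition sensor again. The 30 s window is an assumption.

def resolve_event(stored_events, now, max_age=30.0):
    """stored_events: list of (timestamp, event_id) pairs, unordered.

    Returns the latest event id if fresh enough, else None (re-prompt)."""
    if not stored_events:
        return None
    timestamp, event_id = max(stored_events)  # latest by timestamp
    return event_id if now - timestamp <= max_age else None
```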

According to various embodiments, the mapping between the events (e.g., the fingerprint recognition events 1210) that occur while a sensor (e.g., the sensor 210 of FIG. 2) collects sensor information and the touch object state events (e.g., the finger state events 1230) (or the corresponding information) can be performed in accordance with the processing routine implemented in the touch input control program (e.g., the touch input control program 231 of FIG. 2). Although the figure shows only the case where the electronic device maps the fingerprint recognition events 1210 of the fingerprint recognition sensor to the finger state events 1230 (or the corresponding information), events of other sensors can also be managed by mapping them to touch object state events (e.g., the finger state events 1230).

FIG. 13 illustrates a method of operating an electronic device associated with touch input processing according to various embodiments.

Referring to FIG. 13, at operation 1310, an electronic device (e.g., electronic device 200 of FIG. 2) may acquire sensor information by sensing a touch object based on a sensor (e.g., sensor 210 of FIG. 2). For example, the sensor may collect sensor information corresponding to approach or contact of the touch object.

According to various embodiments, the sensor may communicate the collected sensor information to the electronic device. According to one embodiment, the sensor may communicate to an electronic device an event (e.g., a fingerprint recognition event 1210 in Figure 12) that occurs in association with the collection of the sensor information. In this case, the electronic device can manage the event by mapping the event to a touch object state event (e.g., finger state event 1230 in FIG. 12) (or information corresponding thereto).

In operation 1330, the electronic device may determine the state of the touch object based on the sensor information. For example, the electronic device may analyze the sensor information, and determine the state of the touch object by comparing the analyzed information with the steady-state information of the touch object and the pattern information of the foreign object. According to one embodiment, the electronic device may determine the state of the touch object based on the touch object state event (or information corresponding thereto).

At operation 1350, the electronic device may adjust the touch settings according to the determined state of the touch object. According to one embodiment, the electronic device can adjust the touch sensitivity of the touch input device (e.g., the touch input device 280 of FIG. 2) according to the state of the touch object. In addition, the electronic device can adjust the touch area of the display objects displayed on the display (e.g., the display 290 in FIG. 2) according to the state of the touch object, thereby adjusting the output state of the display object.

FIG. 14 shows an electronic device 1401 in a network environment 1400 according to various embodiments.

Referring to FIG. 14, an electronic device 1401 may include a bus 1410, a processor 1420, a memory 1430, an input/output interface 1450, a display 1460, and a communication interface 1470. In some embodiments, the electronic device 1401 may omit at least one of the components, or may additionally include other components.

The bus 1410 may include circuitry, for example, for connecting the components 1410-1470 to each other and for communicating (e.g., control messages and / or data) between the components.

Processor 1420 may include one or more of a central processing unit (CPU), an application processor (AP), or a communication processor (CP). Processor 1420 may perform, for example, computations or data processing related to control and/or communication of at least one other component of electronic device 1401.

Memory 1430 may include volatile and/or non-volatile memory. Memory 1430 may, for example, store instructions or data related to at least one other component of electronic device 1401. According to one embodiment, the memory 1430 may store software and/or a program 1440. The program 1440 may include, for example, a kernel 1441, middleware 1443, an application programming interface (API) 1445, and/or an application program (or "application") 1447. At least a portion of the kernel 1441, middleware 1443, or API 1445 may be referred to as an operating system (OS).

The kernel 1441 may control or manage, for example, the system resources (e.g., the bus 1410, the processor 1420, the memory 1430, etc.) used to execute operations or functions implemented in the other programs (e.g., the middleware 1443, the API 1445, or the application program 1447). In addition, the kernel 1441 may provide an interface through which the middleware 1443, the API 1445, or the application program 1447 can access the individual components of the electronic device 1401 to control or manage the system resources.

The middleware 1443 may act as an intermediary so that, for example, the API 1445 or the application program 1447 can communicate with the kernel 1441 to exchange data.

In addition, the middleware 1443 may process one or more task requests received from the application program 1447 according to priority. For example, the middleware 1443 may assign to at least one of the application programs 1447 a priority for using the system resources (e.g., the bus 1410, the processor 1420, or the memory 1430) of the electronic device 1401. The middleware 1443 may then perform scheduling or load balancing of the one or more task requests by processing them according to the assigned priority.
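The priority-based task processing described for the middleware can be illustrated with a small priority queue. This Python sketch is a hypothetical model, not the patent's middleware: the class name, task names, and numeric priorities are all assumptions.

```python
# Hypothetical sketch of priority-based task scheduling: task requests are
# processed in order of the priority assigned to each requesting application.
import heapq

class Middleware:
    def __init__(self):
        self._queue = []   # min-heap: lower number = higher priority
        self._seq = 0      # tie-breaker keeps FIFO order within a priority

    def submit(self, priority, task_name):
        heapq.heappush(self._queue, (priority, self._seq, task_name))
        self._seq += 1

    def process_all(self):
        order = []
        while self._queue:
            _, _, task = heapq.heappop(self._queue)
            order.append(task)  # e.g., hand the task to the kernel/resources
        return order

mw = Middleware()
mw.submit(2, "background sync")
mw.submit(0, "touch event")     # foreground task given the highest priority
mw.submit(1, "notification")
print(mw.process_all())  # ['touch event', 'notification', 'background sync']
```

Processing requests in priority order rather than arrival order is one simple way the scheduling/load-balancing behavior described above could be realized.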

The API 1445 is an interface through which the application 1447 controls functions provided by the kernel 1441 or the middleware 1443, and may include, for example, at least one interface or function (e.g., a command) for file control, window control, or image processing.

The input / output interface 1450 may serve as an interface through which commands or data input from, for example, a user or other external device can be communicated to another component (s) of the electronic device 1401. The input / output interface 1450 may also output commands or data received from other component (s) of the electronic device 1401 to a user or other external device.

Display 1460 may include, for example, a liquid crystal display (LCD), a light-emitting diode (LED) display, an organic light-emitting diode (OLED) display, a microelectromechanical systems (MEMS) display, or an electronic paper display. Display 1460 may display various content (e.g., text, images, video, icons, symbols, etc.) to a user. Display 1460 may include a touch screen and may receive touch, gesture, proximity, or hovering input, for example, using an electronic pen or a portion of the user's body.

The communication interface 1470 may establish communication between the electronic device 1401 and an external device (e.g., the first external electronic device 1402, the second external electronic device 1404, or the server 1406). For example, the communication interface 1470 may be connected to the network 1462 via wireless or wired communication to communicate with an external device (e.g., a second external electronic device 1404 or server 1406).

Wireless communication may use, for example, a cellular communication protocol such as long-term evolution (LTE), LTE-Advanced (LTE-A), code division multiple access (CDMA), wideband CDMA (WCDMA), universal mobile telecommunications system (UMTS), Wireless Broadband (WiBro), or Global System for Mobile Communications (GSM). The wireless communication may also include, for example, local communication 1464. The local communication 1464 may include at least one of, for example, wireless fidelity (WiFi), Bluetooth, near field communication (NFC), or global navigation satellite system (GNSS). The GNSS may include at least one of, for example, the Global Positioning System (GPS), the Global Navigation Satellite System (Glonass), or the Beidou Navigation Satellite System (Beidou), depending on the use area or bandwidth. Hereinafter, in this document, "GPS" may be used interchangeably with "GNSS". The wired communication may include at least one of, for example, a universal serial bus (USB), a high definition multimedia interface (HDMI), a recommended standard 232 (RS-232), or plain old telephone service (POTS). The network 1462 may include at least one telecommunications network, e.g., a computer network (e.g., a LAN or WAN), the Internet, or a telephone network.

Each of the first and second external electronic devices 1402 and 1404 may be the same or a different kind of device as the electronic device 1401. According to one embodiment, the server 1406 may include a group of one or more servers. According to various embodiments, all or a portion of the operations performed on the electronic device 1401 may be performed on one or more other electronic devices (e.g., electronic device 1402 or 1404, or server 1406). According to one embodiment, in the event that the electronic device 1401 has to perform some function or service automatically or upon request, the electronic device 1401 may, instead of or in addition to executing the function or service itself, request at least a portion of the associated function from another device (e.g., electronic device 1402 or 1404, or server 1406). The other electronic device (e.g., electronic device 1402 or 1404, or server 1406) may execute the requested function or an additional function and forward the result to the electronic device 1401. The electronic device 1401 may provide the requested function or service by using the received result as-is or after additional processing. For this purpose, for example, cloud computing, distributed computing, or client-server computing technology may be used.
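The offloading behavior described above, where a device runs a function locally when it can and otherwise requests another device to run it and processes the returned result, can be sketched as follows. This is an assumption-laden illustration: the function names, the failure condition, and the post-processing step are invented for the example.

```python
# Illustrative sketch (not the patent's implementation) of function offloading
# between an electronic device and another device (e.g., a server).

def run_locally(name):
    # Simulate a device that cannot execute the function itself.
    raise RuntimeError("insufficient resources")

def request_remote(name):
    # Stand-in for requesting another electronic device or server to
    # execute the function and return its result.
    return f"result-of-{name}"

def perform(name):
    try:
        return run_locally(name)
    except RuntimeError:
        raw = request_remote(name)
        # The device may use the received result as-is or process it further.
        return raw.upper()

print(perform("image-classification"))  # RESULT-OF-IMAGE-CLASSIFICATION
```

In practice the remote call would go over the network 1462 via the communication interface 1470; the try/except fallback stands in for the "instead of or in addition to executing the function itself" decision.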

FIG. 15 is a block diagram of an electronic device 1501 according to various embodiments. The electronic device 1501 may include, for example, all or part of the electronic device 1401 shown in FIG. 14. The electronic device 1501 may include one or more processors (e.g., APs) 1510, a communication module 1520, a subscriber identification module 1524, a memory 1530, a sensor module 1540, an input device 1550, a display 1560, an interface 1570, an audio module 1580, a camera module 1591, a power management module 1595, a battery 1596, an indicator 1597, and a motor 1598.

The processor 1510 may, for example, drive an operating system or an application program to control a number of hardware or software components connected to the processor 1510, and may perform various data processing and operations. The processor 1510 may be implemented with, for example, a system on chip (SoC). According to one embodiment, the processor 1510 may further include a graphics processing unit (GPU) and/or an image signal processor. The processor 1510 may include at least a portion (e.g., the cellular module 1521) of the components shown in FIG. 15. The processor 1510 may load instructions or data received from at least one of the other components (e.g., a non-volatile memory) into a volatile memory, process them, and store various data in the non-volatile memory.

The communication module 1520 may have the same or a similar configuration as the communication interface 1470 of FIG. 14. The communication module 1520 may include, for example, a cellular module 1521, a WiFi module 1523, a Bluetooth module 1525, a GNSS module 1527 (e.g., a GPS module, a Glonass module, a Beidou module, or a Galileo module), an NFC module 1528, and an RF module 1529.

The cellular module 1521 can provide voice calls, video calls, text services, or Internet services, for example, over a communication network. According to one embodiment, the cellular module 1521 may utilize a subscriber identity module (e.g., a SIM card) 1524 to perform the identification and authentication of the electronic device 1501 within the communication network. According to one embodiment, the cellular module 1521 may perform at least some of the functions that the processor 1510 may provide. According to one embodiment, the cellular module 1521 may include a communication processor (CP).

Each of the WiFi module 1523, the Bluetooth module 1525, the GNSS module 1527, or the NFC module 1528 may include, for example, a processor for processing data transmitted and received through a corresponding module. According to some embodiments, at least some (e.g., two or more) of the cellular module 1521, the WiFi module 1523, the Bluetooth module 1525, the GNSS module 1527, or the NFC module 1528 may be included in one integrated chip (IC) .

The RF module 1529 can transmit and receive a communication signal (e.g., an RF signal), for example. The RF module 1529 may include, for example, a transceiver, a power amplifier module (PAM), a frequency filter, a low noise amplifier (LNA), or an antenna. According to another embodiment, at least one of the cellular module 1521, the WiFi module 1523, the Bluetooth module 1525, the GNSS module 1527 or the NFC module 1528 can transmit and receive an RF signal through a separate RF module.

The subscriber identification module 1524 may include, for example, a card containing a subscriber identification module and/or an embedded SIM, and may contain unique identification information (e.g., an integrated circuit card identifier (ICCID)) or subscriber information (e.g., an international mobile subscriber identity (IMSI)).

The memory 1530 (e.g., the memory 1430) may include, for example, an internal memory 1532 or an external memory 1534. The internal memory 1532 may include at least one of, for example, a volatile memory (e.g., a dynamic RAM (DRAM), a static RAM (SRAM), or a synchronous dynamic RAM (SDRAM)) or a non-volatile memory (e.g., a one-time programmable ROM (OTPROM), a programmable ROM (PROM), an erasable and programmable ROM (EPROM), an electrically erasable and programmable ROM (EEPROM), a mask ROM, a flash ROM, a flash memory (e.g., NAND flash or NOR flash), or a solid state drive (SSD)).

The external memory 1534 may further include a flash drive, for example, a compact flash (CF), a secure digital (SD), a micro secure digital (micro-SD), a mini secure digital (mini-SD), a multi-media card (MMC), a memory stick, or the like. The external memory 1534 may be functionally and/or physically connected to the electronic device 1501 through various interfaces.

The sensor module 1540 may, for example, measure a physical quantity or sense an operating state of the electronic device 1501 and convert the measured or sensed information into an electrical signal. The sensor module 1540 may include at least one of, for example, a gesture sensor 1540A, a gyro sensor 1540B, an air pressure sensor 1540C, a magnetic sensor 1540D, an acceleration sensor 1540E, a grip sensor 1540F, a proximity sensor 1540G, a color sensor 1540H (e.g., a red, green, and blue (RGB) sensor), a biometric sensor 1540I, a temperature/humidity sensor 1540J, an illuminance sensor 1540K, or an ultraviolet (UV) sensor 1540M. Additionally or alternatively, the sensor module 1540 may include, for example, an E-nose sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an infrared (IR) sensor, an iris sensor, and/or a fingerprint sensor. The sensor module 1540 may further include a control circuit for controlling at least one sensor belonging to it. In some embodiments, the electronic device 1501 may further include a processor configured to control the sensor module 1540, either as part of the processor 1510 or separately, so as to control the sensor module 1540 while the processor 1510 is in a sleep state.

The input device 1550 may include, for example, a touch panel 1552, a (digital) pen sensor 1554, a key 1556, or an ultrasonic input device 1558. The touch panel 1552 can employ, for example, at least one of an electrostatic type, a pressure sensitive type, an infrared type, and an ultrasonic type. Further, the touch panel 1552 may further include a control circuit. The touch panel 1552 may further include a tactile layer to provide a tactile response to the user.

The (digital) pen sensor 1554 may be, for example, part of the touch panel or may include a separate recognition sheet. The key 1556 may include, for example, a physical button, an optical key, or a keypad. The ultrasonic input device 1558 may detect ultrasonic waves generated by an input tool through a microphone (e.g., the microphone 1588) and identify data corresponding to the detected ultrasonic waves.

The display 1560 (e.g., the display 1460) may include a panel 1562, a hologram device 1564, or a projector 1566. The panel 1562 may include the same or a similar configuration as the display 1460 of FIG. 14. The panel 1562 may be implemented to be, for example, flexible, transparent, or wearable. The panel 1562 may be integrated with the touch panel 1552 into one module. The hologram device 1564 may display stereoscopic images in the air using the interference of light. The projector 1566 may display an image by projecting light onto a screen. The screen may, for example, be located inside or outside the electronic device 1501. According to one embodiment, the display 1560 may further include control circuitry for controlling the panel 1562, the hologram device 1564, or the projector 1566.

The interface 1570 may include, for example, a high-definition multimedia interface (HDMI) 1572, a universal serial bus (USB) 1574, an optical interface 1576, or a D-subminiature (D-sub) 1578. The interface 1570 may be included in, for example, the communication interface 1470 shown in FIG. 14. Additionally or alternatively, the interface 1570 may include, for example, a mobile high-definition link (MHL) interface or a secure digital (SD) card / multi-media card (MMC) interface.

The audio module 1580 may, for example, bidirectionally convert between sound and electrical signals. At least some components of the audio module 1580 may be included in, for example, the input/output interface 1450 shown in FIG. 14. The audio module 1580 may process sound information input or output through, for example, a speaker 1582, a receiver 1584, earphones 1586, or a microphone 1588.

The camera module 1591 is, for example, a device capable of capturing still images and moving images, and according to one embodiment, may include one or more image sensors (e.g., a front sensor or a rear sensor), a lens, an image signal processor (ISP), or a flash (e.g., an LED or a xenon lamp).

The power management module 1595 may manage, for example, the power of the electronic device 1501. According to one embodiment, the power management module 1595 may include a power management integrated circuit (PMIC), a charger integrated circuit (IC), or a battery or fuel gauge. The PMIC may have a wired and/or wireless charging scheme. The wireless charging scheme may include, for example, a magnetic resonance scheme, a magnetic induction scheme, or an electromagnetic wave scheme, and may further include an additional circuit for wireless charging, for example, a coil loop or a resonant circuit. The battery gauge may measure, for example, the remaining capacity of the battery 1596 and the voltage, current, or temperature during charging. The battery 1596 may include, for example, a rechargeable battery and/or a solar battery.

The indicator 1597 may indicate a particular state of the electronic device 1501 or a portion thereof (e.g., the processor 1510), such as a boot state, a message state, or a charging state. The motor 1598 may convert an electrical signal into mechanical vibration and may generate vibration, haptic effects, and the like. Although not shown, the electronic device 1501 may include a processing unit (e.g., a GPU) for mobile TV support. The processing unit for mobile TV support may process media data conforming to standards such as digital multimedia broadcasting (DMB), digital video broadcasting (DVB), or MediaFLO™.

Each of the components described in this document may be composed of one or more parts, and the name of a component may vary according to the type of the electronic device. In various embodiments, the electronic device may include at least one of the components described herein; some components may be omitted, or additional components may be further included. In addition, some of the components of the electronic device according to various embodiments may be combined into one entity, which may perform the same functions as those of the components before the combination.

FIG. 16 is a block diagram of a program module according to various embodiments. According to one embodiment, the program module 1610 (e.g., the program 1440) may include an operating system (OS) that controls resources associated with an electronic device (e.g., the electronic device 1401) and/or various applications (e.g., the application programs 1447) running on the operating system. The operating system may be, for example, Android, iOS, Windows, Symbian, Tizen, or Bada.

The program module 1610 may include a kernel 1620, a middleware 1630, an application programming interface (API) 1660, and / or an application 1670. At least a portion of the program module 1610 may be preloaded on an electronic device or downloaded from an external electronic device (e.g., electronic device 1402, 1404, server 1406, etc.).

The kernel 1620 (e.g., kernel 1441) may include, for example, a system resource manager 1621 and / or a device driver 1623. The system resource manager 1621 can perform control, allocation, or recovery of system resources. According to one embodiment, the system resource manager 1621 may include a process manager, a memory manager, or a file system manager. The device driver 1623 may include, for example, a display driver, a camera driver, a Bluetooth driver, a shared memory driver, a USB driver, a keypad driver, a WiFi driver, an audio driver, or an inter-process communication (IPC) driver.

The middleware 1630 may, for example, provide functions commonly needed by the applications 1670, or may provide various functions to the applications 1670 through the API 1660 so that the applications 1670 can efficiently use the limited system resources within the electronic device. According to one embodiment, the middleware 1630 (e.g., the middleware 1443) may include at least one of a runtime library 1635, an application manager 1641, a window manager 1642, a multimedia manager 1643, a resource manager 1644, a power manager 1645, a database manager 1646, a package manager 1647, a connectivity manager 1648, a notification manager 1649, a location manager 1650, a graphic manager 1651, or a security manager 1652.

The runtime library 1635 may include, for example, a library module used by the compiler to add new functionality via a programming language while the application 1670 is running. The runtime library 1635 may perform input / output management, memory management, or functions for arithmetic functions.

The application manager 1641 may, for example, manage the life cycle of at least one of the applications 1670. The window manager 1642 can manage the GUI resources used on the screen. The multimedia manager 1643 can identify the format required for playback of various media files and can encode or decode a media file using a codec suitable for the format. The resource manager 1644 can manage resources such as source code, memory or storage space of at least one of the applications 1670.

The power manager 1645 operates in conjunction with a basic input / output system (BIOS), for example, to manage a battery or a power source, and to provide power information necessary for the operation of the electronic device. The database manager 1646 may create, retrieve, or modify a database to be used in at least one of the applications 1670. The package manager 1647 can manage installation or update of an application distributed in the form of a package file.

The connectivity manager 1648 may manage wireless connections such as, for example, WiFi or Bluetooth. The notification manager 1649 may display or notify events such as arrival messages, appointments, and proximity notifications in a manner that does not disturb the user. The location manager 1650 may manage the location information of the electronic device. The graphic manager 1651 may manage graphic effects to be provided to the user, or a related user interface. The security manager 1652 may provide all the security functions necessary for system security or user authentication. According to one embodiment, when an electronic device (e.g., the electronic device 1401) includes a telephone function, the middleware 1630 may further include a telephony manager for managing the voice or video call capabilities of the electronic device.

Middleware 1630 may include a middleware module that forms a combination of various functions of the above-described components. The middleware 1630 can provide a module specialized for each type of operating system to provide differentiated functions. In addition, the middleware 1630 may dynamically delete some existing components or add new ones.

The API 1660 (e.g., the API 1445) is, for example, a collection of API programming functions, and may be provided in different configurations depending on the operating system. For example, in the case of Android or iOS, one API set may be provided per platform; in the case of Tizen, two or more API sets may be provided per platform.

An application 1670 (e.g., an application program 1447) may include, for example, one or more applications capable of performing functions such as a home 1671, a dialer 1672, an SMS/MMS 1673, an instant message 1674, a browser 1675, a camera 1676, an alarm 1677, an email 1680, a calendar 1681, a media player 1682, an album 1683, or a clock 1684, or of providing health care (e.g., measuring the amount of exercise or blood glucose) or environmental information (e.g., air pressure, humidity, or temperature information).

According to one embodiment, the application 1670 may include an application that supports the exchange of information between the electronic device (e.g., the electronic device 1401) and an external electronic device (e.g., the electronic device 1402 or 1404) (hereinafter, an "information exchange application"). The information exchange application may include, for example, a notification relay application for communicating specific information to an external electronic device, or a device management application for managing an external electronic device.

For example, the notification relay application may relay notification information generated in other applications of the electronic device (e.g., an SMS/MMS application, an email application, a health care application, or an environmental information application) to an external electronic device (e.g., the electronic device 1402 or 1404) via a network. The notification relay application may also, for example, receive notification information from an external electronic device and provide it to the user.

The device management application may, for example, manage (e.g., install, delete, or update) at least one function (e.g., turn-on/turn-off) of an external electronic device communicating with the electronic device, an application operating in the external electronic device, or a service provided by the external electronic device (e.g., a call service or a message service).

According to one embodiment, the application 1670 may include an application (e.g., a healthcare application of a mobile medical device, etc.) designated according to attributes of an external electronic device (e.g., electronic device 1402, 1404). According to one embodiment, the application 1670 may include an application received from an external electronic device (e.g., server 1406 or electronic devices 1402, 1404). According to one embodiment, application 1670 may include a preloaded application or a third party application downloadable from a server. The names of the components of the program module 1610 according to the illustrated embodiment may vary depending on the type of the operating system.

According to various embodiments, at least some of the program modules 1610 may be implemented in software, firmware, hardware, or a combination of at least two of them. At least some of the program modules 1610 may be implemented (e.g., executed) by, for example, a processor (e.g., processor 1510). At least some of the program modules 1610 may include, for example, modules, programs, routines, sets of instructions or processes, etc., to perform one or more functions.

As used in this document, the term "module" may refer to a unit comprising, for example, one or a combination of two or more of hardware, software, or firmware. The term "module" may be used interchangeably with terms such as unit, logic, logical block, component, or circuit. A "module" may be the minimum unit, or a portion, of an integrally constructed component. A "module" may be the minimum unit, or a portion thereof, that performs one or more functions. A "module" may be implemented mechanically or electronically. For example, a "module" may include at least one of an application-specific integrated circuit (ASIC) chip, field-programmable gate arrays (FPGAs), or a programmable-logic device that performs certain operations.

At least a portion of a device (e.g., modules or functions thereof) or a method (e.g., operations) according to various embodiments may be implemented, for example, as instructions stored in a computer-readable storage medium in the form of a program module. When the instructions are executed by a processor (e.g., the processor 1420), the one or more processors may perform the functions corresponding to the instructions. The computer-readable storage medium may be, for example, the memory 1430.

The computer-readable recording medium may include a hard disk, a floppy disk, magnetic media (e.g., magnetic tape), optical media (e.g., a compact disc read only memory (CD-ROM) or a digital versatile disc (DVD)), magneto-optical media (e.g., a floptical disk), and hardware devices (e.g., a read only memory (ROM), a random access memory (RAM), or a flash memory). The program instructions may include machine language code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter. The above-described hardware devices may be configured to operate as one or more software modules to perform the operations of the various embodiments, and vice versa.

Modules or program modules according to various embodiments may include at least one of the elements described above, some of the elements may be omitted, or additional other elements may be further included. Operations performed by modules, program modules, or other components according to various embodiments may be executed in a sequential, parallel, repetitive, or heuristic manner. In addition, some operations may be executed in a different order or omitted, or other operations may be added. The embodiments disclosed in this document are presented for the purpose of explanation and understanding of the disclosed technology and do not limit the scope of the technology described in this document. Accordingly, the scope of this document should be interpreted to include all modifications or various other embodiments based on the technical idea of this document.

Claims (20)

In an electronic device,
A first touch sensor that senses a touch object and collects sensor information,
A processor for determining a state of the touch object corresponding to the sensor information,
And a second touch sensor whose touch sensitivity is adjusted according to the state of the touch object.
The method according to claim 1,
Wherein the first touch sensor includes at least one of a fingerprint recognition sensor, a tactile sensor, a pH concentration sensor, or a touch sensor.
The method according to claim 1,
The processor
Determines the state of the touch object by comparing the sensor information with at least one of steady-state information corresponding to a state in which no foreign matter is present between the touch object and the first touch sensor, or pattern information of a foreign matter including at least one of pH concentration information, surface state information, electric conductivity information, or frequency spectrum information of the foreign matter sensed based on the first touch sensor.
The method according to claim 1,
The processor
And adjusts the touch area of display objects output to the display according to the state of the touch object.
The method of claim 4,
The processor
And changes at least one of a size or a position of the display objects to correspond to the touch area.
The method according to claim 1,
The processor
And controls to output a touch function selection object including at least one item corresponding to the state of the touch object.
The method of claim 6,
The processor
Controls to display the touch function selection object at at least one of: a time point at which the screen of the electronic device changes from a turn-off state to a turn-on state, a time point at which the touch input processing function changes from a turn-off state to a turn-on state, a time point at which a specific application program included in the electronic device is executed or a specific application program requests a certain object, a time point at which a certain area of the screen is pressed by the touch object, or a time point at which the electronic device moves or rotates in a predetermined direction a predetermined number of times at a predetermined interval.
The method according to claim 1,
The processor
Updates the state of the touch object based on sensor information obtained by sensing the touch object at a time point at which a predetermined time has elapsed from the time point at which the state of the touch object was determined, and
The second touch sensor
And the touch sensitivity is adjusted according to the updated state of the touch object.
The method according to claim 1,
The processor
And controls to display, in a predetermined area of the display, an icon of which at least one of a shape, a color, or a size is set differently according to the state of the touch object.
The method according to claim 1,
The processor
And controls, when the state of the touch object changes, at least one of an output of an object including at least one of text, an image, or an icon related to the change of the touch object state, or an output of voice information associated with the change of the touch object state.
A method of processing a touch input of an electronic device,
An operation of sensing the touch object based on the first touch sensor to collect sensor information,
Determining the state of the touch object corresponding to the sensor information, and
And adjusting a touch sensitivity of the second touch sensor according to a state of the touch object.
The method of claim 11,
The operation of collecting the sensor information
And collecting the sensor information corresponding to approach or contact of the touch object based on at least one of a fingerprint recognition sensor, a tactile sensor, a pH concentration sensor, or a touch sensor.
The method of claim 11,
The operation of determining the state of the touch object
Includes comparing the sensor information with at least one of steady-state information corresponding to a state in which no foreign matter is present between the touch object and the first touch sensor, or pattern information of a foreign matter including at least one of pH concentration information, surface state information, electric conductivity information, or frequency spectrum information of the foreign matter sensed based on the first touch sensor.
The method of claim 11,
The operation of determining the state of the touch object
Comprises controlling to display, on a display, a touch function selection object including at least one item corresponding to the state of the touch object.
The method of claim 14,
The operation of controlling to display the touch function selection object on the display
Is performed at at least one of: a time when a screen of the electronic device changes from a turn-off state to a turn-on state, a time when a touch input processing function changes from a turn-off state to a turn-on state, a time when a specific application program included in the electronic device is executed or requests a certain object, a time when a certain area of the screen is pressed by the touch object for at least a predetermined time, a time when the touch object is moved to within a predetermined distance of the electronic device, or a time when the electronic device moves or swivels in a predetermined direction a predetermined number of times at a predetermined interval.
The method of claim 11,
The touch input processing method
Further comprises controlling to adjust a touch area of display objects displayed on the display according to the state of the touch object.
The method of claim 16,
The operation of controlling to adjust the touch area
Comprises controlling to change at least one of a size and a position of the display objects to correspond to the touch area.
The method of claim 11,
The touch input processing method
Further comprises updating the state of the touch object based on sensor information obtained by sensing the touch object at a point in time when a predetermined time has elapsed from the time at which the state of the touch object was determined, and
Adjusting the touch sensitivity of the second touch sensor according to the updated state of the touch object.
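The update operation above — re-sense the touch object after a predetermined time and readjust sensitivity from the new state — amounts to a polling loop. A minimal sketch follows; the callback interface and cycle count are invented for illustration.

```python
import time
from typing import Callable, Optional

def monitor_touch_state(read_state: Callable[[], str],
                        apply_sensitivity: Callable[[str], None],
                        interval_s: float = 0.0,
                        cycles: int = 3) -> int:
    """Re-read the touch-object state every `interval_s` seconds and
    retune the second sensor only when the state actually changed.
    Returns the number of sensitivity adjustments made."""
    last_state: Optional[str] = None
    changes = 0
    for _ in range(cycles):
        state = read_state()
        if state != last_state:
            apply_sensitivity(state)  # e.g., push a new threshold to firmware
            last_state = state
            changes += 1
        time.sleep(interval_s)
    return changes
```

Skipping the `apply_sensitivity` call when the state is unchanged avoids needlessly reprogramming the touch controller on every polling cycle.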
The method of claim 11,
The touch input processing method
Further comprises controlling to display, in a predetermined area of the display, an icon having at least one of a shape, a color, and a size that is set differently according to the state of the touch object.
The method of claim 11,
The touch input processing method
Further comprises controlling, when the state of the touch object changes, at least one of an output of an object including at least one of a text, an image, and an icon related to the change of the touch object state, or an output of voice information related to the change of the touch object state.
KR1020150065395A 2015-04-15 2015-05-11 Touch input processing method and electronic device supporting the same KR20160123200A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US15/095,413 US9904409B2 (en) 2015-04-15 2016-04-11 Touch input processing method that adjusts touch sensitivity based on the state of a touch object and electronic device for supporting the same
EP16165404.1A EP3082025A1 (en) 2015-04-15 2016-04-14 Touch input processing method and electronic device for supporting the same
CN201610236856.8A CN106055138A (en) 2015-04-15 2016-04-15 Touch input processing method and electronic device for supporting the same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020150052965 2015-04-15
KR20150052965 2015-04-15

Publications (1)

Publication Number Publication Date
KR20160123200A true KR20160123200A (en) 2016-10-25

Family

ID=57446615

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150065395A KR20160123200A (en) 2015-04-15 2015-05-11 Touch input processing method and electronic device supporting the same

Country Status (1)

Country Link
KR (1) KR20160123200A (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018128422A1 (en) * 2017-01-06 2018-07-12 삼성전자 주식회사 Method and apparatus for processing distortion of fingerprint image
US11093776B2 (en) 2017-01-06 2021-08-17 Samsung Electronics Co., Ltd. Method and apparatus for processing distortion of fingerprint image
WO2019160173A1 (en) * 2018-02-14 2019-08-22 엘지전자 주식회사 Mobile terminal and control method thereof
CN112639805A (en) * 2018-09-04 2021-04-09 三星电子株式会社 Electronic device including fingerprint sensor in ultrasonic display and method of operating the same
US11841933B2 (en) 2019-06-26 2023-12-12 Google Llc Radar-based authentication status feedback
US11790693B2 (en) 2019-07-26 2023-10-17 Google Llc Authentication management through IMU and radar
US11868537B2 (en) 2019-07-26 2024-01-09 Google Llc Robust radar-based gesture-recognition by user equipment
US12093463B2 (en) 2019-07-26 2024-09-17 Google Llc Context-sensitive control of radar-based gesture-recognition
US11687167B2 (en) 2019-08-30 2023-06-27 Google Llc Visual indicator for paused radar gestures
US12008169B2 (en) 2019-08-30 2024-06-11 Google Llc Radar gesture input methods for mobile devices
WO2022050627A1 (en) * 2020-09-01 2022-03-10 삼성전자 주식회사 Electronic device comprising flexible display, and operation method thereof
US11614829B2 (en) 2020-12-04 2023-03-28 Samsung Display Co., Ltd. Display device and driving method thereof
CN113591666A (en) * 2021-07-26 2021-11-02 深圳创联时代电子商务有限公司 Control method and device applied to mobile phone, computer readable medium and mobile phone
WO2024177199A1 (en) * 2023-02-20 2024-08-29 삼성메디슨 주식회사 Ultrasound imaging device and operation method thereof
WO2024214912A1 (en) * 2023-04-12 2024-10-17 삼성전자주식회사 Electronic device for sensing contact with fluid via touch sensor

Similar Documents

Publication Publication Date Title
US9904409B2 (en) Touch input processing method that adjusts touch sensitivity based on the state of a touch object and electronic device for supporting the same
KR20160123200A (en) Touch input processing method and electronic device supporting the same
KR102319803B1 (en) Electronic device, operating method thereof and recording medium
KR102399557B1 (en) Apparatus and method for obtaining coordinate through touch panel thereof
US10282019B2 (en) Electronic device and method for processing gesture input
KR102413108B1 (en) Method and electronic device for recognizing touch
KR20160104976A (en) Touch module and electronic device and operating method thereof
KR102340480B1 (en) Electronic device and method for controlling thereof
KR20180106527A (en) Electronic device and method for identifying falsification of biometric information
US10551980B2 (en) Electronic device and method for determining touch coordinate thereof
KR20160128872A (en) Fingerprint information processing method and electronic device supporting the same
KR102398503B1 (en) Electronic device for detecting pressure of input and operating method thereof
KR102386480B1 (en) Apparatus and method for distinguishing input by external device thereof
KR102388590B1 (en) Electronic device and method for inputting in electronic device
US20190324640A1 (en) Electronic device for providing user interface according to electronic device usage environment and method therefor
KR20170046410A (en) Method and eletronic device for providing user interface
KR20160128606A (en) Device For Providing Shortcut User Interface and Method Thereof
KR20160036927A (en) Method for reducing ghost touch and electronic device thereof
KR20160124536A (en) Method and electronic apparatus for providing user interface
KR20170066050A (en) Object notification method and electronic device supporting the same
EP3079046A1 (en) Method and apparatus for operating sensor of electronic device
KR20160147432A (en) Device For Controlling Respectively Multiple Areas of Display and Method thereof
KR102692984B1 (en) Touch input processing method and electronic device supporting the same
KR20180014446A (en) Electronic device and method for controlling touch screen display
KR20170086806A (en) Electronic device and method for recognizing touch input using the same