CN101714055A - Method and apparatus for displaying graphical user interface depending on a user's contact pattern - Google Patents
- Publication number
- CN101714055A CN101714055A CN200910169036A CN200910169036A CN101714055A CN 101714055 A CN101714055 A CN 101714055A CN 200910169036 A CN200910169036 A CN 200910169036A CN 200910169036 A CN200910169036 A CN 200910169036A CN 101714055 A CN101714055 A CN 101714055A
- Authority
- CN
- China
- Prior art keywords
- contact
- user
- hand
- sensor
- contact mode
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0414—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
- G06F3/04142—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position the force sensing means being located peripherally, e.g. disposed at the corners or at the side of a touch sensing plate
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
- G06F3/04855—Interaction with scrollbars
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention relates to a method and an apparatus for displaying a graphical user interface (GUI) depending on a user's contact pattern. A GUI may be displayed on a display unit in an apparatus that includes a tactile sensor unit. When a user's contact is detected at the tactile sensor unit, a control unit may receive a contact detection signal from it. Based on the contact detection signal, the control unit may determine a contact pattern and may then display the GUI corresponding to that pattern. The GUI may be displayed and modified depending on the location and pressure of contacts by the user's manipulating fingers. A user can therefore manipulate the apparatus without inconvenience or accidental touches.
Description
Technical field
Exemplary embodiments of the present invention relate to a graphical user interface (GUI) for an electronic device and, more particularly, to a method and apparatus for displaying a GUI according to a user's contact pattern.
Background technology
A touch screen can serve as both a display unit and an input unit. An electronic device equipped with a touch screen therefore does not require an additional display unit or input unit. Because of this advantage, touch screens are widely used in size-limited electronic devices such as mobile devices (also referred to as portable or handheld devices).

Typically, a user operates a touch screen with one hand or two hands to command execution of a desired function or application. When using two hands, one hand grips the device while the other touches the touch screen. When using only one hand, however, a finger of the gripping hand (for example, the thumb) often covers part of the touch screen.

Figure 10A is a schematic example showing a user's left thumb selecting one of the menu icons displayed on a touch screen. In this example, if the user touches a specific icon (for example, a music icon) located in the upper-right portion of the touch screen, the finger may wholly or partly cover some of the other icons displayed on the touch screen (for example, a game icon, a display icon, and a schedule icon). Moreover, these covered icons may come into contact with the thumb, so that the functions associated with them may be executed unintentionally.

Figure 10B is another schematic example showing a user's left thumb touching a scroll bar displayed on a touch screen. If the user touches the scroll bar located on the right side of the touch screen, the displayed content (for example, a scene) may be hidden by the thumb. In addition, some of the displayed content may be touched by the thumb and accessed unintentionally.

An electronic device equipped with tactile sensors can offer control of its applications whenever the user keeps contact with a specific part of the device, without requiring a touch screen or keyboard. Such a device can provide a display screen with a GUI that guides contact-based input. However, if the GUI is displayed in a fixed form that does not consider the user's contact pattern, some positions in the GUI may fail to register the user's input. This can happen because of differences in hand size, finger size, and grip style, which makes it difficult to realize a GUI suitable for many different users. If the positions in the GUI do not match the contact points the user actually touches, the user may be confused while operating an application on the device.
Summary of the invention
Exemplary embodiments of the present invention provide a method and apparatus for displaying a graphical user interface (GUI) adapted to the user's manipulating hand.

Exemplary embodiments of the present invention also provide an apparatus having a touch screen and tactile sensors.

Additional features of the invention will be set forth in the description that follows, will in part be apparent from the description, or may be learned by practice of the invention.

Exemplary embodiments of the present invention disclose a method for displaying a GUI on a display unit in an apparatus that includes a tactile sensor unit. The method includes: detecting a user's contact at the tactile sensor unit; determining a contact pattern from the detected contact; and displaying the GUI corresponding to the contact pattern.

Exemplary embodiments of the present invention provide an apparatus for displaying a GUI. The apparatus includes: a tactile sensor unit configured to create a contact detection signal when a user's contact is detected, the tactile sensor unit including a left sensor part and a right sensor part, each having a plurality of sensor modules; a display unit configured to display the GUI; and a control unit configured to receive the contact detection signal from the tactile sensor unit, to determine a contact pattern based on the contact detection signal, and to control the display unit to display the GUI corresponding to the contact pattern.

It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory, and are intended to provide further explanation of the invention as claimed.
Description of drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the invention and, together with the description, serve to explain the principles of the invention.

Figure 1A is a block diagram showing the internal structure of an apparatus according to an exemplary embodiment of the present invention;

Figure 1B shows an example of a tactile sensor unit located on the sides of the apparatus shown in Figure 1A, according to an exemplary embodiment of the present invention;

Figure 2 is a flow diagram showing a method for displaying a GUI according to the manipulating hand, according to an exemplary embodiment of the present invention;

Figure 3 is a flow diagram showing an example of the detailed process of determining the manipulating hand in the GUI display method shown in Figure 2, according to an exemplary embodiment of the present invention;

Figure 4 is a flow diagram showing another example of the detailed process of determining the manipulating hand in the GUI display method shown in Figure 2, according to an exemplary embodiment of the present invention;

Figure 5 is a flow diagram showing yet another example of the detailed process of determining the manipulating hand in the GUI display method shown in Figure 2, according to an exemplary embodiment of the present invention;

Figure 6A shows an example in which a left hand grips the apparatus and the tactile sensor unit is located on the sides of the apparatus, according to an exemplary embodiment of the present invention;

Figure 6B shows an example in which a right hand grips the apparatus and the tactile sensor unit is located on the sides of the apparatus, according to an exemplary embodiment of the present invention;

Figure 7A shows another example in which a left hand grips the apparatus and the tactile sensor unit is located on the sides of the apparatus, according to an exemplary embodiment of the present invention;

Figure 7B shows another example in which a right hand grips the apparatus and the tactile sensor unit is located on the sides of the apparatus, according to an exemplary embodiment of the present invention;

Figure 8 shows an example of a GUI according to an exemplary embodiment of the present invention;

Figure 9 shows another example of a GUI according to an exemplary embodiment of the present invention;

Figure 10A is a schematic example showing a user's left thumb selecting one of the menu icons displayed on a touch screen;

Figure 10B is another schematic example showing a user's left thumb touching a scroll bar located on a touch screen under a conventional GUI;

Figure 11 is a flow diagram showing a method for displaying a GUI based on the manipulating hand, according to an exemplary embodiment of the present invention;

Figure 12A shows an example of a screen displaying menu icons in response to a user's contact in an idle screen application, according to an exemplary embodiment of the present invention;

Figure 12B shows an example of a screen in which the displayed menu icons change in response to a user's new contact in an idle screen application, according to an exemplary embodiment of the present invention;

Figure 13A shows an example of a screen displaying icons in response to a user's contact in a camera application, according to an exemplary embodiment of the present invention;

Figure 13B shows another example of a screen in which the displayed icons change in response to a user's new contact in a camera application, according to an exemplary embodiment of the present invention;

Figure 14A shows an example of a screen displaying icons in response to a user's contact in an MP3 application, according to an exemplary embodiment of the present invention;

Figure 14B shows an example of a screen in which the displayed icons change in response to a user's new contact in an MP3 application, according to an exemplary embodiment of the present invention.
Detailed description
The present invention is described more fully hereinafter with reference to the accompanying drawings, in which embodiments of the invention are shown. The invention may, however, be embodied in many different forms and should not be construed as limited to the exemplary embodiments set forth herein. Rather, these exemplary embodiments are provided so that this disclosure will be thorough and will fully convey the scope of the invention to those skilled in the art. In the drawings, the size and relative sizes of layers and regions may be exaggerated for clarity. Like reference numerals in the drawings denote like elements.

It will be understood that, although the terms first, second, third, etc. may be used herein to describe various elements, components, regions, layers, and/or sections, these elements, components, regions, layers, and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer, or section from another. Thus, a first element, component, region, layer, or section discussed below could be termed a second element, component, region, layer, or section without departing from the teachings of the present invention.

Spatially relative terms, such as "beneath", "below", "lower", "above", "upper", and the like, may be used herein for ease of description to describe the relationship of one element or feature to another element or feature as illustrated in the figures. It will be understood that spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as "below" or "beneath" other elements or features would then be oriented "above" the other elements or features. Thus, the exemplary term "below" can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations), and the spatially relative descriptors used herein interpreted accordingly.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. As used herein, the singular forms are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising", when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.

Unless otherwise defined, all terms used herein (including technical and scientific terms) have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.

Furthermore, well-known or widely used techniques, elements, structures, and processes may not be described or illustrated in detail to avoid obscuring the essence of the present invention.
Before the exemplary embodiments of the present invention are explained, relevant terminology is defined below.

A graphical user interface (GUI) may refer to a graphical display provided on a display (for example, a screen) of an electronic device. The GUI may include at least one window, at least one icon, at least one scroll bar, and any other graphical items the user uses to input commands to the device. It should be understood that exemplary embodiments of the present invention may include various GUIs of different shapes, designs, and structures.

The manipulating hand may refer to a user's hand that manipulates the touch screen of an electronic device. The manipulating hand may include one or both hands performing a touch action on the touch screen. In addition, the manipulating hand may include one or both hands in contact with an electronic device equipped with tactile sensors. The manipulating hand may be the user's left hand, right hand, or both hands.

A tactile sensor unit, or tactile sensor, may refer to at least one sensor that is sensitive to a user's touch. The tactile sensor unit may differ from a touch sensor included in a touch screen, and may typically be located on at least one side of the electronic device. When the user grips the device, the tactile sensor unit can detect contact between the user's hand and the device, create a contact detection signal, and send the contact detection signal to the control unit. The tactile sensor unit may include at least one tactile sensor capable of detecting the magnitude of contact pressure and the position of the contact/pressure. Alternatively, a combination of a pressure sensor and a touch sensor may be used for the tactile sensor unit. The tactile sensor unit may include a left sensor part and a right sensor part, and each of the left sensor part and the right sensor part may include a plurality of sensor modules. The tactile sensor unit may be formed on the upper and/or lower sides of the device, or may be formed on any or all sides of the device.

A sensor module may refer to an element constituting the tactile sensor unit. Each sensor module can detect a user's contact independently. The number of sensor modules included in the tactile sensor unit may be determined by the kind or size of the sensor modules.

A component group may refer to a set of contact-detecting sensor modules arranged in sequence. Component groups may be used to create contact pattern information. The position of a component group and the number of sensor modules included in a single component group may vary according to the user's grip.
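The notion of a component group, a run of consecutively arranged sensor modules that all detect contact, can be sketched as a simple run-length grouping. This is an illustrative sketch rather than code from the patent; the function name and the boolean-list representation of one sensor part's modules are assumptions:

```python
def component_groups(contact_flags):
    """Group consecutive contact-detecting sensor modules.

    contact_flags: list of booleans, one per sensor module along one
    sensor part, True where that module currently detects contact.
    Returns a list of (start_index, length) tuples, one per group.
    """
    groups = []
    start = None
    for i, touched in enumerate(contact_flags):
        if touched and start is None:
            start = i          # a new run of touched modules begins
        elif not touched and start is not None:
            groups.append((start, i - start))
            start = None
    if start is not None:      # the final run extends to the last module
        groups.append((start, len(contact_flags) - start))
    return groups
```

A grip like the one in Figure 6A would produce one long group on the palm side and several short groups on the finger side.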
Hereinafter, exemplary embodiments of the present invention are described in detail with reference to the accompanying drawings.
Figure 1A is a block diagram showing the internal structure of an apparatus according to an exemplary embodiment of the present invention.

Referring to Figure 1A, the apparatus 100 may be a mobile communication terminal, a portable terminal such as a personal digital assistant (PDA), a computer, a TV, or any other electronic device having a touch screen. The apparatus 100 may include a tactile sensor unit 110, a memory unit 120, a touch screen 130, and a control unit 140.

When the user grips the apparatus 100, the tactile sensor unit 110 can detect the contact of the user's hand. The tactile sensor unit 110 can detect the magnitude of contact pressure and the position of the contact/pressure. The tactile sensor unit 110 may include touch sensors, touch screens, and/or a combination of pressure sensors and touch screens. The tactile sensor unit 110 may be located on the sides of the apparatus 100, but is not limited thereto; in some exemplary embodiments, the tactile sensor unit 110 may be located on every face of the apparatus 100.

Figure 1B shows an example of the tactile sensor unit 110 located on the sides of the apparatus 100. After detecting the contact of the user's hand, the tactile sensor unit 110 can send a contact detection signal to the control unit 140. The tactile sensor unit 110 may include a left sensor part and a right sensor part, and each of the left sensor part and the right sensor part may include a plurality of sensor modules.

The memory unit 120 can store a plurality of programs required for performing the functions of the apparatus 100, together with data created while those functions are performed. The memory unit 120 can store data related to the process of estimating the manipulating hand, as well as contact pattern information.

The touch screen 130 can display information and can receive user input. The touch screen 130 may include a display unit 132 and a touch sensor unit 134.

The control unit 140 can control the states and operations of one or more elements of the apparatus 100. For example, the control unit 140 can receive a contact detection signal from the tactile sensor unit 110 and can determine the user's contact pattern by using the contact detection signal. The control unit 140 can then instruct the display unit 132 to display the GUI according to the user's contact pattern.
Figure 2 is a flow diagram showing a method for displaying a GUI according to the manipulating hand, according to an exemplary embodiment of the present invention.

Referring to Figure 2, the tactile sensor unit 110 can detect a user's contact (S210). The user's contact may be the result of the user gripping the apparatus 100. When the user's contact is detected, the tactile sensor unit 110 can send a contact detection signal to the control unit 140.

After step S230, the control unit 140 can instruct the touch screen 130 to display the GUI according to the manipulating hand (S240). Then, the control unit 140 can determine whether an additional contact detection signal is received from the tactile sensor unit 110 (S250). If the control unit 140 determines that an additional contact detection signal is received, the method returns to step S230 to redetermine the user's contact pattern and the manipulating hand. An additional contact detection signal may be provided by the tactile sensor unit 110 when the user changes grip.

If the control unit 140 receives no additional contact detection signal from the tactile sensor unit 110, the display unit 132 can keep the current GUI. The user can then manipulate the GUI displayed on the touch screen 130 to input commands to the apparatus 100.
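The detect-determine-display loop of Figure 2 can be sketched as follows. This is an illustrative sketch only; the function names and the layout labels are assumptions, and the contact-pattern determination of step S230 is passed in as a callable rather than implemented:

```python
def gui_for_hand(hand):
    """Map the determined manipulating hand to a GUI layout (assumed mapping)."""
    return {"left": "left-handed layout",
            "right": "right-handed layout",
            "both": "two-handed layout"}[hand]

def run_gui_loop(signals, determine_hand):
    """Process a stream of contact detection signals (S210/S250).

    signals: iterable of contact detection signals from the tactile
    sensor unit; determine_hand: callable standing in for step S230.
    Returns the sequence of GUIs displayed (step S240).
    """
    displayed = []
    for signal in signals:             # each new signal re-triggers S230
        hand = determine_hand(signal)  # S230: determine the manipulating hand
        displayed.append(gui_for_hand(hand))  # S240: display matching GUI
    return displayed                   # no further signal: last GUI is kept
```

When no further signal arrives, the loop simply ends and the most recently displayed GUI remains in place, mirroring the behavior described above.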
Figure 3 is a flow diagram showing an example of the detailed process of step S230 shown in Figure 2, according to an exemplary embodiment of the present invention.

Referring to Figure 3, the control unit 140 can produce at least one component group based on the contact detection signal received from the tactile sensor unit 110 (S310). As described above, a component group may refer to a set of contact-detecting sensor modules arranged in sequence.

Exemplary embodiments of sensor modules and component groups are shown in Figure 6A, Figure 6B, Figure 7A, and Figure 7B.

Figure 6A shows an example in which the user grips the apparatus 100 with the left hand. The tactile sensor unit 110 can be located on the sides of the apparatus 100: the left sensor part on the left side of the apparatus 100 and the right sensor part on the right side. Each sensor part may include a plurality of sensor modules. The number of sensor modules varies with their size; for example, the smaller the sensor modules are, the more of them can be arranged on a side of the apparatus 100. In Figure 6A, for example, the number of sensor modules belonging to each sensor part may be 23. In the left sensor part, the marked modules may represent the modules detecting contact with the left hand. In the right sensor part, the marked modules may represent the modules detecting contact with the fingers of the left hand (for example, the four fingers other than the thumb). The contact-detecting modules can be grouped in the order of their arrangement. For example, nine consecutively arranged modules in the left sensor part can be classified into one group, and four pairs of modules in the right sensor part, two modules per pair, can be classified into four groups.

Returning to Figure 3, after producing the component groups in step S310, the control unit 140 can create contact pattern information based on the component groups (S320). The contact pattern information therefore varies with how the user grips the apparatus 100. The contact pattern information may include, for example, the number of component groups in each sensor part, the positions of the component groups, the spacing between the component groups, the number of sensor modules in each component group, and/or the pressure detection data of each sensor module.

Referring to Figure 6A, the contact pattern information of the left sensor part may include the following data: one component group of nine sensor modules, located, for example, from the twelfth to the twentieth sensor module. The contact pattern information of the right sensor part may include the following data: four component groups of two sensor modules each, with the eight sensor modules located, for example, at the fourth, fifth, ninth, tenth, fourteenth, fifteenth, nineteenth, and twentieth module positions. Three sensor modules may lie between two adjacent component groups.
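The contact pattern information for one sensor part can be sketched as a small summary structure built from the component groups. This is an illustrative sketch; the field names are assumptions, and the example values mirror the Figure 6A description (module positions are 1-indexed here to match the text):

```python
def contact_pattern_info(groups):
    """Summarize the component groups of one sensor part.

    groups: list of (start_position, length) tuples, 1-indexed as in
    the Figure 6A description. Returns the fields named in the text:
    group count, group positions, group sizes, and inter-group gaps.
    """
    positions = [start for start, _ in groups]
    sizes = [length for _, length in groups]
    # gap between the end of one group and the start of the next
    gaps = [groups[i + 1][0] - (groups[i][0] + groups[i][1])
            for i in range(len(groups) - 1)]
    return {"count": len(groups), "positions": positions,
            "sizes": sizes, "gaps": gaps}

# Figure 6A, right sensor part: four two-module groups at positions
# 4-5, 9-10, 14-15, and 19-20, with three untouched modules between
# adjacent groups.
right_part = contact_pattern_info([(4, 2), (9, 2), (14, 2), (19, 2)])
```

Pressure detection data, also named in the text, is omitted from this sketch for brevity.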
Turn back to Fig. 3, control module 140 can be from the contact mode information (S330) of memory cell 120 retrieve stored.Memory cell 120 can be stored contact mode information, and can store usually and the different corresponding different contact mode information of type of holding.The contact mode information that is stored in the memory cell 120 can comprise, for example, the pressure detection data of the quantity of the position of the quantity of the component groups in each Sensor section, component groups, the spacing between the component groups, the sensor module in each component groups and/or each sensor module.
If the created contact pattern information falls within the range of the retrieved contact pattern information, the control unit 140 may determine the operating hand corresponding to the created contact pattern information (S360). The storage unit 120 may store information about the operating hand associated with each item of contact pattern information. The determined operating hand may be the left hand or the right hand.
If the created contact pattern information does not fall within the range of the retrieved contact pattern information, the control unit 140 may determine that the operating hand is both hands (S370). After determining the operating hand, the control unit 140 may return to the previous step S240 of displaying the GUI according to the operating hand.
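Steps S330 to S370 amount to a range comparison between the created pattern information and stored per-grip entries. A minimal sketch follows, assuming (since the patent does not specify a data layout) that each stored entry carries per-field (min, max) ranges and the hand associated with that grip:

```python
def match_stored_pattern(created, stored_patterns):
    """Compare created contact pattern information with stored entries.

    `created` maps field names to values; each stored entry maps the
    same field names to (min, max) ranges. Returns the stored hand of
    the first matching entry, or "both" if nothing matches (S370).
    Field names here are illustrative.
    """
    for entry in stored_patterns:
        ranges, hand = entry["ranges"], entry["hand"]
        if all(lo <= created[field] <= hi
               for field, (lo, hi) in ranges.items()):
            return hand
    return "both"

stored = [
    {"ranges": {"left_groups": (1, 1), "right_groups": (3, 5)}, "hand": "left"},
    {"ranges": {"left_groups": (3, 5), "right_groups": (1, 1)}, "hand": "right"},
]
print(match_stored_pattern({"left_groups": 1, "right_groups": 4}, stored))  # left
print(match_stored_pattern({"left_groups": 2, "right_groups": 2}, stored))  # both
```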
Fig. 4 is a flowchart illustrating another example of the detailed process of step S230 in Fig. 2 according to an exemplary embodiment of the present invention.
Referring to Fig. 4, the control unit 140 may generate at least one component group (S410); each component group may be a set of consecutively arranged contact-detecting sensor components. Then, the control unit 140 may calculate the number of sensor components included in each component group (S420). For example, as illustrated in Fig. 6A, one component group in the left sensor section may have nine sensor components, and each of the four component groups in the right sensor section may have two sensor components.
After calculating the number of sensor components in each component group, the control unit 140 may determine which component group, and which sensor section, has the most contact-detecting sensor components (S430). For example, the largest component group may be in either the left or the right sensor section; thus, in step S430, the control unit 140 may determine whether the largest sensor section is the left or the right sensor section. Referring to Fig. 6A, the control unit 140 may determine that the largest component group has nine sensor components and is located in the left sensor section. Referring to another example illustrated in Fig. 6B, the largest component group may have nine sensor components and be located in the right sensor section; the control unit 140 may therefore determine that the largest sensor section is the right sensor section. Similarly, the largest sensor section may be the left sensor section in Fig. 7A and the right sensor section in Fig. 7B.
If the largest sensor section is the left sensor section, the control unit 140 may further determine whether the left sensor section has an additional component group (S440). An additional component group refers to one or more component groups that are located in the largest sensor section but are not the largest component group. In Fig. 6A, for example, the left sensor section, as the largest sensor section, may have the largest component group but no additional component group. In Fig. 7A, however, the left sensor section containing the largest component group may have one additional component group comprising three sensor components, from the 3rd to the 5th sensor component.
If there is no additional component group, as shown in Fig. 6A, the control unit 140 may determine that the operating hand is the left hand (S450). The largest component group may then be regarded as being in contact with the palm of the left hand. Moreover, the absence of an additional component group may indicate that the thumb of the left hand is not in contact with the touch sensor unit 110. In this case, the control unit 140 may determine that the user is manipulating the touch screen 130 with the left thumb; that is, the user may be holding the device 100 and touching the touch screen 130 with his or her left hand. The control unit 140 may accordingly determine that the operating hand is the left hand.
Similar steps may be performed for determining that the operating hand is the right hand. For example, if the largest sensor section is the right sensor section, the control unit 140 may determine whether the right sensor section has an additional component group (S460). If the right sensor section has no additional component group, as shown in Fig. 6B, the control unit 140 may determine that the operating hand is the right hand (S470).
If the right sensor section has an additional component group, the control unit 140 may determine that the operating hand may be both hands (S480). The presence of an additional component group may indicate that the thumb of the holding hand is in contact with the touch sensor unit 110. The control unit 140 may determine that the user may manipulate the touch screen 130 with the thumb of the non-holding hand, and may therefore determine that the operating hand is both hands.
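The Fig. 4 decision path (S410 to S480) can be condensed into one function. This sketch assumes each sensor section is summarized by its list of component-group sizes; by symmetry with S480, it also treats an additional group in the left sensor section as indicating both hands, which this excerpt does not state explicitly:

```python
def operating_hand(left_groups, right_groups):
    """Decide the operating hand from per-section component groups.

    `left_groups` and `right_groups` are lists of group sizes for the
    left and right sensor sections, respectively.
    """
    # S430: find the sensor section containing the largest component group.
    left_max = max(left_groups, default=0)
    right_max = max(right_groups, default=0)
    if left_max >= right_max:
        groups, hand = left_groups, "left"
    else:
        groups, hand = right_groups, "right"
    # S440/S460: any group besides the largest one in that section means
    # the holding hand's thumb also touches the sensor -> both hands (S480).
    has_additional = len(groups) > 1
    return "both" if has_additional else hand

# Fig. 6A: one 9-component group on the left, four 2-component groups on the right.
print(operating_hand([9], [2, 2, 2, 2]))     # left
# Fig. 7A: the left section has the largest group plus an additional group.
print(operating_hand([9, 3], [2, 2, 2, 2]))  # both
```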
Fig. 5 is a flowchart illustrating another example of the detailed process of step S230 in Fig. 2 according to an exemplary embodiment of the present invention.
Referring to Fig. 5, the control unit 140 may generate at least one component group (S510); each component group may be a set of consecutively arranged contact-detecting sensor components. The control unit 140 may calculate the number of component groups in each sensor section (S520). In Figs. 6A, 6B, 7A, and 7B, the sensor sections may be the left and right sensor sections. In some cases, the control unit 140 may calculate the numbers of component groups in the left and right sensor sections simultaneously.
For example, in Fig. 6A the number of component groups may be one in the left sensor section and four in the right sensor section. In Fig. 6B it may be four in the left sensor section and one in the right sensor section. In Fig. 7A it is two in the left sensor section and four in the right sensor section. In Fig. 7B it may be four in the left sensor section and two in the right sensor section.
Similarly, in determining whether the operating hand is the left hand, the control unit 140 may determine whether the number of component groups in the right sensor section is three or more and whether the number of component groups in the left sensor section is one or fewer (S550).
If the answers in both step S530 and step S550 are negative, the control unit 140 may determine that the operating hand is both hands (S570).
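The Fig. 5 decision path reduces to two threshold tests on the group counts. The S530 condition is not spelled out in this excerpt; the sketch below assumes it mirrors S550 with left and right swapped:

```python
def operating_hand_by_counts(left_count, right_count):
    """Decide the operating hand from the number of component groups
    in the left and right sensor sections (Fig. 5 logic)."""
    # S530 (assumed mirror of S550): a right-hand grip leaves one palm
    # group on the left... no, many finger groups on the left side.
    if left_count >= 3 and right_count <= 1:
        return "right"
    # S550: a left-hand grip leaves the palm on the left (one group)
    # and the four fingers on the right (three or more groups).
    if right_count >= 3 and left_count <= 1:
        return "left"
    # S570: neither condition met.
    return "both"

print(operating_hand_by_counts(1, 4))  # left  (Fig. 6A)
print(operating_hand_by_counts(4, 1))  # right (Fig. 6B)
print(operating_hand_by_counts(2, 4))  # both  (Fig. 7A)
```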
Fig. 8 illustrates examples of a GUI according to an exemplary embodiment of the present invention.
Fig. 8 illustrates an exemplary embodiment of a display screen 810 having menu icons when the left hand is determined to be the operating hand. The control unit 140 may arrange the menu icons to correspond with the movement path of the left thumb, from the upper-left corner to the lower-right corner of the display screen 810. The user may therefore select an icon by touching the display screen 810 with his or her left thumb to execute the desired function corresponding to the selected icon. Because the menu icons are arranged along the movement path of the left thumb, the icons are not obscured by the thumb, and unintended touches of the icons can be prevented.
Fig. 8 also shows an exemplary embodiment of a display screen 820 when the right hand is determined to be the operating hand. In this case, the control unit 140 may arrange the menu icons along the movement path of the right thumb. In other cases, as shown in display screen 830 of Fig. 8, when both hands are determined to be the operating hand, the control unit 140 may keep the normal GUI, which may be changed according to the user's intention.
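Arranging icons along the thumb's movement path could be approximated by interpolating positions along a diagonal of the screen. The linear interpolation below is purely a stand-in for the thumb's actual arc, and the coordinate system and screen size are illustrative assumptions:

```python
def thumb_path_positions(n_icons, width, height, hand):
    """Place n icons along a diagonal approximating the thumb's sweep:
    upper-left to lower-right for the left hand, mirrored for the right.
    Coordinates are (x, y) with the origin at the upper-left corner.
    """
    positions = []
    for i in range(n_icons):
        t = i / (n_icons - 1) if n_icons > 1 else 0.0
        y = t * height
        x = t * width if hand == "left" else (1 - t) * width
        positions.append((round(x), round(y)))
    return positions

print(thumb_path_positions(3, 300, 480, "left"))
# [(0, 0), (150, 240), (300, 480)]
```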
Fig. 9 illustrates another example of a GUI according to an exemplary embodiment of the present invention.
Fig. 9 illustrates an exemplary embodiment of a display screen 910 having a scroll bar when the left hand is determined to be the operating hand. The control unit 140 may arrange the scroll bar along the left side of the display screen 910 to correspond with the movement path of the left thumb. The user may therefore move the scroll bar up or down by dragging it with the left thumb. Because the scroll bar is arranged along the left side, the displayed content is not obscured by the thumb, and unintended touches of the displayed content can be prevented.
Fig. 9 also shows an exemplary embodiment of a display screen 920 having a scroll bar when the right hand is determined to be the operating hand. In this case, the control unit 140 may arrange the scroll bar along the right side of the display screen 920 to correspond with the movement path of the right thumb. The user may therefore move or drag the scroll bar with the right thumb without obscuring or touching the displayed content; unintended touches of the displayed content while dragging the scroll bar can thus be prevented. In other cases, as shown in display screen 930 of Fig. 9, when both hands are determined to be the operating hand, the control unit 140 may keep the normal GUI, which may be changed.
Fig. 11 is a flowchart illustrating a method of displaying a GUI based on the operating hand according to an exemplary embodiment of the present invention. The method described with reference to Fig. 11 may be used, for example, when the user inputs a command to the device 100 through the touch sensor unit 110.
Referring to Fig. 11, the touch sensor unit 110 may detect a user's contact (S1110). The user's contact may correspond to the user's grip on the device 100. For example, the user may hold the device 100 with one hand (as shown in Figs. 12A and 12B) or with both hands (as shown in Figs. 13A and 13B). Upon detecting the user's contact, the touch sensor unit 110 may send a contact detection signal, including information about the contact positions and pressures, to the control unit 140.
The control unit 140 may then instruct the display unit 132 to display the GUI at particular positions on the display unit 132 according to the user's contact positions (S1140). Specifically, the control unit 140 may first identify the currently executing application (before displaying the GUI) and may then select the GUI elements corresponding to that application. For example, when an idle screen application is running, the control unit 140 may select menu icons as the GUI elements for the idle screen. In other cases, if a camera application is running, the control unit 140 may select, and display, an icon for taking pictures and a scroll bar for zooming in and out. After selecting the customized GUI elements, the control unit 140 may determine a GUI arrangement pattern based on the currently executing application and the user's grip. For example, referring to Figs. 12A and 12B, the control unit 140 may identify the idle screen application as the currently executing application and may also determine that the user's left hand is holding the device 100. The control unit 140 may then determine the GUI arrangement pattern such that the menu icons are arranged near the contact position of at least one of the four fingers (other than the thumb) of the user's left hand.
After determining the GUI arrangement pattern, the control unit 140 may instruct the display unit 132 to display the GUI elements based on the GUI arrangement pattern. That is, the previously selected GUI elements may be displayed on the display unit 132 according to the GUI arrangement pattern.
Figs. 12A and 12B illustrate two examples of screens displaying menu icons in an idle screen application according to an exemplary embodiment of the present invention. As shown in Figs. 12A and 12B, three menu icons may be positioned in the transverse direction from the contact positions of three fingers. The storage unit 120 may store the menus and the usage-frequency rankings of the menu icons, and the control unit 140 may arrange the menu icons in order of usage frequency. For example, when the user holds the device 100 while the idle screen application is executing, the control unit 140 may retrieve the usage-frequency rankings of the menus from the storage unit 120 and may instruct the display unit 132 to display the menu icons according to the retrieved rankings. The icons displayed on the display unit 132 may be changed according to the user's preference.
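Ordering menu icons by the stored usage-frequency ranking is a plain sort, most frequently used first. The icon names and counts below are illustrative, not from the patent:

```python
def order_icons_by_frequency(icons, usage_counts):
    """Order menu icons by stored usage frequency, most used first.
    Icons with no recorded usage sort last (count 0)."""
    return sorted(icons,
                  key=lambda name: usage_counts.get(name, 0),
                  reverse=True)

usage = {"messages": 42, "camera": 17, "music": 8}
print(order_icons_by_frequency(["camera", "music", "messages"], usage))
# ['messages', 'camera', 'music']
```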
Figs. 13A and 13B illustrate two examples of screens displaying icons in a camera application according to an exemplary embodiment of the present invention. Referring to Figs. 13A and 13B, the user may hold the device 100 with the thumbs and index fingers of both hands. The icon for taking pictures may be positioned near the index finger of the right hand, and the scroll bar for zooming in and out may be positioned in the longitudinal direction from the thumb of the right hand. If the user increases the contact pressure with the right index finger, the icon for taking pictures moves toward the right index finger; when this icon reaches the upper side of the display unit 132, the picture-taking function may be executed. In addition, the user may control zooming in and out by increasing the contact pressure with the right thumb.
Figs. 14A and 14B illustrate two examples of screens displaying icons in an MP3 application. Referring to Figs. 14A and 14B, the user may hold the device 100 with the left hand. Function icons may be displayed according to the contact positions of the fingers of the left hand other than the thumb, whereas a volume control bar may be displayed according to the contact position of the thumb. The displayed icons may follow a predetermined GUI arrangement pattern. While holding the device 100, the user may control the execution of the MP3 application by increasing the contact pressure or by performing an action such as tapping. For example, the positions, sizes, and/or presentation effects of the icons may change according to the user's contact.
Returning to Fig. 11, after displaying the GUI, the control unit 140 may determine whether the user's contact positions have changed (S1150). Specifically, when the user changes grip while holding the device 100, the touch sensor unit 110 may detect the change in the user's contact and may generate a new contact detection signal. The control unit 140 may then receive the new contact detection signal from the touch sensor unit 110, may determine the user's contact pattern again, and may modify the display of the GUI according to the new contact pattern.
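The change check of step S1150 could be as simple as comparing the old and new sets of contacted component indices. The one-component tolerance below is an assumed debounce against sensor jitter, not something this excerpt specifies:

```python
def contact_changed(old_pattern, new_pattern, tolerance=1):
    """Return True if the contact pattern moved enough to warrant
    re-laying-out the GUI (S1150). Patterns are lists of contacted
    sensor component indices."""
    if len(old_pattern) != len(new_pattern):
        return True   # a finger was added or lifted
    # Compare position-by-position after sorting; small shifts within
    # the tolerance are ignored.
    return any(abs(a - b) > tolerance
               for a, b in zip(sorted(old_pattern), sorted(new_pattern)))

print(contact_changed([4, 9, 14, 19], [4, 9, 14, 19]))   # False
print(contact_changed([4, 9, 14, 19], [6, 11, 16, 21]))  # True
```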
Referring to Figs. 12A and 12B, the user's contact in Fig. 12B differs from that in Fig. 12A. For example, in Fig. 12B the positions of the fingers may have moved downward. The control unit 140 may receive a new contact detection signal from the touch sensor unit 110 and may determine a new contact pattern based on the new information about the contact positions and pressures. The control unit 140 may then instruct the display of the GUI to be changed according to the new contact pattern.
Comparing Figs. 13A and 13B, the user's right index finger may have moved (for example, to the left) in Fig. 13B relative to Fig. 13A. As shown in Fig. 13B, the control unit 140 may receive a new contact detection signal, determine a new contact pattern, and move the picture-taking icon toward the current contact position of the index finger.
Referring to Figs. 14A and 14B, the number and positions of the contacts may change. For example, the four contacts on the right side in Fig. 14A may move downward and may be reduced to three contacts in Fig. 14B. In addition, the contact on the left side may move downward, and the volume control bar may also move downward along the left side. Furthermore, the rewind icon, play/pause icon, and stop icon may move downward along the right side, while the fast-forward icon corresponding to a finger of the left hand (for example, the little finger) may be removed from the display unit 132.
As described above, exemplary embodiments of the present invention provide a method and apparatus for displaying and modifying a GUI according to the positions and pressures of a user's contact. Exemplary embodiments of the present invention can therefore avoid inconvenience when the user operates the device.
It will be apparent to those skilled in the art that various modifications and changes can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention cover the modifications and variations of this invention provided they come within the scope of the claims and their equivalents.
Claims (15)
1. the method for a display graphics user interface GUI on the display unit of the equipment that comprises the touch sensor unit, described method comprises:
Detect the contact of touch sensor unit;
Determine contact mode according to the contact that detects;
Show GUI based on contact mode.
2. The method of claim 1, further comprising: determining an operating hand based on the contact pattern.
3. The method of claim 2, wherein determining the contact pattern comprises:
generating one or more component groups, each component group comprising a set of one or more consecutively arranged sensor components that detect the contact, the set of sensor components being arranged in at least one sensor section of the touch sensor unit;
creating contact pattern information based on the one or more component groups;
comparing the created contact pattern information with stored contact pattern information; and
if the created contact pattern information is within a range associated with the stored contact pattern information, determining the operating hand corresponding to the created contact pattern information.
4. The method of claim 3, wherein the contact pattern information comprises: the number of the one or more component groups in each sensor section, the positions of the one or more component groups, the spacing between the one or more component groups, the number of sensor components in each component group, and/or the pressure detection data of each sensor component.
5. The method of claim 2, wherein determining the contact pattern comprises:
generating one or more component groups, each component group comprising a set of one or more consecutively arranged sensor components that detect the contact, the set of sensor components being arranged in at least one sensor section of the touch sensor unit;
determining the sensor section having a largest component group, the largest component group comprising the most sensor components;
determining whether the sensor section having the largest component group has an additional component group; and
if there is no additional component group, determining that the operating hand is the user's left hand when the largest component group belongs to a left sensor section of the touch sensor unit, or determining that the operating hand is the user's right hand when the largest component group belongs to a right sensor section of the touch sensor unit.
6. The method of claim 2, wherein determining the contact pattern comprises:
generating one or more component groups, each component group comprising a set of one or more consecutively arranged sensor components that detect the contact, the set of sensor components being arranged in at least one sensor section of the touch sensor unit;
calculating the number of component groups included in each sensor section;
when the number of component groups in a left sensor section of the touch sensor unit is three or more and the number of component groups in a right sensor section of the touch sensor unit is one or fewer, determining that the user's right hand is the operating hand; and
when the number of component groups in the right sensor section is three or more and the number of component groups in the left sensor section is one or fewer, determining that the user's left hand is the operating hand.
7. The method of claim 2, wherein displaying the GUI comprises:
when the operating hand is the user's left hand, arranging menu icons on a display screen to correspond with a movement path of the user's left thumb; and
when the operating hand is the user's right hand, arranging menu icons on the display screen to correspond with a movement path of the user's right thumb.
8. The method of claim 2, wherein displaying the GUI comprises:
when the operating hand is the user's left hand, arranging a scroll bar along a left portion of a display screen; and
when the operating hand is the user's right hand, arranging the scroll bar along a right portion of the display screen.
9. The method of claim 1, wherein determining the contact pattern comprises: determining the user's grip based on the detected contact positions.
10. The method of claim 9, wherein determining the contact pattern further comprises: determining whether a contact pressure is greater than a threshold.
11. The method of claim 9, further comprising:
determining a GUI arrangement pattern according to a currently executing application and the user's grip.
12. The method of claim 1, wherein displaying the GUI comprises: displaying the GUI at a certain position of a display screen based on the detected contact positions.
13. The method of claim 1, wherein displaying the GUI comprises: changing the position of the GUI or the displayed size of the GUI according to the pressure of the detected contact.
14. An apparatus for displaying a graphical user interface (GUI), the apparatus comprising:
a touch sensor unit configured to generate a contact detection signal in response to detection of a contact, the touch sensor unit comprising a plurality of sensor sections, the plurality of sensor sections comprising a left sensor section and a right sensor section, each sensor section having a plurality of sensor components;
a display unit configured to display the GUI; and
a control unit configured to receive the contact detection signal from the touch sensor unit, to determine a contact pattern based on the contact detection signal, and to instruct the display unit to display the GUI based on the contact pattern.
15. The apparatus of claim 14, wherein the control unit is further configured to generate one or more component groups, to create contact pattern information corresponding to the contact pattern based on the one or more component groups, to compare the created contact pattern information with stored contact pattern information, and, if the created contact pattern information is within a range associated with the stored contact pattern information, to determine an operating hand based on the created contact pattern information, the one or more component groups comprising a set of one or more consecutively arranged sensor components that detect the contact.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710119581.4A CN106909304B (en) | 2008-10-06 | 2009-09-14 | Method and apparatus for displaying graphical user interface |
CN201710119962.2A CN106909305B (en) | 2008-10-06 | 2009-09-14 | Method and apparatus for displaying graphical user interface |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20080097591 | 2008-10-06 | ||
KR10-2008-0097591 | 2008-10-06 | ||
KR10-2009-0012687 | 2009-02-17 | ||
KR1020090012687A KR20100039194A (en) | 2008-10-06 | 2009-02-17 | Method for displaying graphic user interface according to user's touch pattern and apparatus having the same |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710119581.4A Division CN106909304B (en) | 2008-10-06 | 2009-09-14 | Method and apparatus for displaying graphical user interface |
CN201710119962.2A Division CN106909305B (en) | 2008-10-06 | 2009-09-14 | Method and apparatus for displaying graphical user interface |
Publications (1)
Publication Number | Publication Date |
---|---|
CN101714055A (en) | 2010-05-26 |
Family
ID=42215793
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710119962.2A Active CN106909305B (en) | 2008-10-06 | 2009-09-14 | Method and apparatus for displaying graphical user interface |
CN201710119581.4A Active CN106909304B (en) | 2008-10-06 | 2009-09-14 | Method and apparatus for displaying graphical user interface |
CN200910169036A Pending CN101714055A (en) | 2008-10-06 | 2009-09-14 | Method and apparatus for displaying graphical user interface depending on a user's contact pattern |
Country Status (3)
Country | Link |
---|---|
KR (1) | KR20100039194A (en) |
CN (3) | CN106909305B (en) |
ES (1) | ES2776103T3 (en) |
Cited By (70)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102256015A (en) * | 2011-04-06 | 2011-11-23 | 罗蒙明 | Virtual keyboard finger key-pressing judgment method for touch mobile phone |
CN102299996A (en) * | 2011-08-19 | 2011-12-28 | 华为终端有限公司 | Handheld device operating mode distinguishing method and handheld device |
CN102375652A (en) * | 2010-08-16 | 2012-03-14 | 中国移动通信集团公司 | Mobile terminal user interface regulation system and method |
CN102402275A (en) * | 2010-09-13 | 2012-04-04 | 联想(北京)有限公司 | Portable electronic equipment and holding gesture detection method |
CN102479035A (en) * | 2010-11-23 | 2012-05-30 | 汉王科技股份有限公司 | Electronic device with touch screen, and method for displaying left or right hand control interface |
CN102662603A (en) * | 2012-05-18 | 2012-09-12 | 广州市渡明信息技术有限公司 | Input method display method and input method display system for mobile phone with touch screen |
CN102722247A (en) * | 2012-03-09 | 2012-10-10 | 张伟明 | Operation and control component, information processing system using same and information processing method thereof |
CN102790816A (en) * | 2011-05-16 | 2012-11-21 | 中兴通讯股份有限公司 | Processing method and device of pushbutton function |
CN102810039A (en) * | 2011-05-31 | 2012-12-05 | 中兴通讯股份有限公司 | Left or right hand adapting virtual keyboard display method and terminal |
CN102841723A (en) * | 2011-06-20 | 2012-12-26 | 联想(北京)有限公司 | Portable terminal and display switching method thereof |
CN102890558A (en) * | 2012-10-26 | 2013-01-23 | 北京金和软件股份有限公司 | Method for detecting handheld motion state of mobile handheld device based on sensor |
CN103118166A (en) * | 2012-11-27 | 2013-05-22 | 广东欧珀移动通信有限公司 | Method of realizing single hand operation of mobile phone based on pressure sensing |
CN103140822A (en) * | 2010-10-13 | 2013-06-05 | Nec卡西欧移动通信株式会社 | Mobile terminal device and display method for touch panel in mobile terminal device |
CN103136130A (en) * | 2011-11-23 | 2013-06-05 | 三星电子株式会社 | Method and apparatus for peripheral connection |
CN103282869A (en) * | 2010-08-12 | 2013-09-04 | 谷歌公司 | Finger identification on a touchscreen |
CN103324423A (en) * | 2012-03-21 | 2013-09-25 | 北京三星通信技术研究有限公司 | Terminal and user interface display method thereof |
CN103502924A (en) * | 2011-06-24 | 2014-01-08 | 株式会社Ntt都科摩 | Mobile information terminal and operational state assessment method |
CN103502914A (en) * | 2011-06-29 | 2014-01-08 | 株式会社Ntt都科摩 | Mobile information terminal and position region acquisition method |
CN103513763A (en) * | 2012-06-26 | 2014-01-15 | Lg电子株式会社 | Mobile terminal and control method thereof |
CN103576850A (en) * | 2012-12-26 | 2014-02-12 | 深圳市创荣发电子有限公司 | Method and system for judging holding mode of handheld device |
CN103795949A (en) * | 2014-01-14 | 2014-05-14 | 四川长虹电器股份有限公司 | Control terminal, device terminal and system for adjusting volume of device terminal |
CN103809866A (en) * | 2012-11-13 | 2014-05-21 | 联想(北京)有限公司 | Operation mode switching method and electronic equipment |
CN103870140A (en) * | 2012-12-13 | 2014-06-18 | 联想(北京)有限公司 | Object processing method and device |
CN103902141A (en) * | 2012-12-27 | 2014-07-02 | 北京富纳特创新科技有限公司 | Device and method for achieving dynamic arrangement of desktop functional icons |
CN103947286A (en) * | 2011-09-30 | 2014-07-23 | 英特尔公司 | Mobile device rejection of unintentional touch sensor contact |
CN104049734A (en) * | 2013-03-13 | 2014-09-17 | 伊梅森公司 | Method and devices for displaying graphical user interfaces based on user contact |
CN104216602A (en) * | 2013-05-31 | 2014-12-17 | 国际商业机器公司 | Method and system for controlling slider |
CN104281408A (en) * | 2013-07-10 | 2015-01-14 | Lg电子株式会社 | Mobile terminal and controlling method thereof |
CN104360813A (en) * | 2013-04-12 | 2015-02-18 | 深圳市中兴移动通信有限公司 | Display equipment and information processing method thereof |
CN104461322A (en) * | 2014-12-30 | 2015-03-25 | 中科创达软件股份有限公司 | Display method and system for user interface of handheld device |
CN104571919A (en) * | 2015-01-26 | 2015-04-29 | 深圳市中兴移动通信有限公司 | Terminal screen display method and device |
CN104615368A (en) * | 2015-01-21 | 2015-05-13 | 上海华豚科技有限公司 | Following switching method of keyboard interface |
CN104679427A (en) * | 2015-01-29 | 2015-06-03 | 深圳市中兴移动通信有限公司 | Terminal split-screen display method and system |
CN104679323A (en) * | 2013-09-06 | 2015-06-03 | 意美森公司 | Dynamic haptic conversion system |
CN104714731A (en) * | 2013-12-12 | 2015-06-17 | 中兴通讯股份有限公司 | Display method and device for terminal interface |
CN104731501A (en) * | 2015-03-25 | 2015-06-24 | 努比亚技术有限公司 | Icon control method and mobile terminal |
CN104765541A (en) * | 2015-04-10 | 2015-07-08 | 南京理工大学 | Method and system for identifying whether left hand or right hand operates mobile phone |
CN104765446A (en) * | 2014-01-07 | 2015-07-08 | 三星电子株式会社 | Electronic device and method of controlling electronic device |
CN104793856A (en) * | 2014-01-22 | 2015-07-22 | Lg电子株式会社 | Mobile terminal and method of controlling the mobile terminal |
CN104798030A (en) * | 2012-12-28 | 2015-07-22 | 英特尔公司 | Adapting user interface based on handedness of use of mobile computing device |
CN104834463A (en) * | 2015-03-31 | 2015-08-12 | 努比亚技术有限公司 | Holding recognition method and device of mobile terminal |
CN104850339A (en) * | 2014-02-19 | 2015-08-19 | 联想(北京)有限公司 | Information processing method and electronic equipment |
CN104866136A (en) * | 2015-05-11 | 2015-08-26 | 努比亚技术有限公司 | Terminal operation mode determining method and apparatus |
CN104898959A (en) * | 2015-04-30 | 2015-09-09 | 努比亚技术有限公司 | Method and device for regulating position of virtual push button |
CN104915143A (en) * | 2015-06-19 | 2015-09-16 | 努比亚技术有限公司 | Frameless mobile terminal control method and terminal |
CN104915073A (en) * | 2014-03-14 | 2015-09-16 | 敦泰科技有限公司 | Hand-held type touch device |
CN105183235A (en) * | 2015-10-19 | 2015-12-23 | 上海斐讯数据通信技术有限公司 | Method for preventing mistakenly touching edge of touch control screen |
CN105227768A (en) * | 2015-09-18 | 2016-01-06 | 努比亚技术有限公司 | A kind of application APP display system and method |
CN105224181A (en) * | 2015-10-20 | 2016-01-06 | 魅族科技(中国)有限公司 | A kind of sidebar display packing and device |
CN105468245A (en) * | 2014-08-22 | 2016-04-06 | 中兴通讯股份有限公司 | Terminal and display method for terminal operation interface |
CN105468269A (en) * | 2014-08-15 | 2016-04-06 | 深圳市中兴微电子技术有限公司 | Mobile terminal capable of automatically identifying holding by left hand or right hand, and implementation method thereof |
CN105573622A (en) * | 2015-12-15 | 2016-05-11 | 广东欧珀移动通信有限公司 | Single-hand control method and device of user interface and terminal device |
JP2016099355A (en) * | 2014-11-20 | 2016-05-30 | ケースレー・インスツルメンツ・インコーポレイテッドKeithley Instruments,Inc. | Graphical display method |
WO2016123890A1 (en) * | 2015-02-02 | 2016-08-11 | 中兴通讯股份有限公司 | Handheld electronic device and control method, apparatus, and computer storage medium thereof |
WO2016155509A1 (en) * | 2015-03-27 | 2016-10-06 | 努比亚技术有限公司 | Method and device for determining holding mode of mobile terminal |
CN106104434A (en) * | 2014-03-17 | 2016-11-09 | 谷歌公司 | Determining user handedness and orientation using a touchscreen device |
CN106227375A (en) * | 2015-06-02 | 2016-12-14 | 三星电子株式会社 | Method for controlling display of electronic device and electronic device thereof |
CN106406656A (en) * | 2016-08-30 | 2017-02-15 | 维沃移动通信有限公司 | Application program toolbar control method and mobile terminal |
CN106462341A (en) * | 2014-06-12 | 2017-02-22 | 微软技术许可有限责任公司 | Sensor correlation for pen and touch-sensitive computing device interaction |
CN106610746A (en) * | 2015-10-26 | 2017-05-03 | 青岛海信移动通信技术股份有限公司 | Mobile terminal and control method thereof |
CN106648329A (en) * | 2016-12-30 | 2017-05-10 | 维沃移动通信有限公司 | Application icon display method and mobile terminal |
CN107704082A (en) * | 2012-05-15 | 2018-02-16 | 三星电子株式会社 | Method of operating display unit and terminal supporting the method |
CN108124054A (en) * | 2016-11-29 | 2018-06-05 | 三星电子株式会社 | Device displaying a user interface based on the sensing signal of a grip sensor |
CN108737633A (en) * | 2017-04-18 | 2018-11-02 | 谷歌有限责任公司 | Electronic device responsive to a force-sensitive interface |
CN109710099A (en) * | 2017-10-26 | 2019-05-03 | 南昌欧菲生物识别技术有限公司 | Electronic device |
CN110709808A (en) * | 2017-12-14 | 2020-01-17 | 深圳市柔宇科技有限公司 | Control method and electronic device |
CN110874117A (en) * | 2018-09-03 | 2020-03-10 | 宏达国际电子股份有限公司 | Method for operating handheld device, and computer-readable recording medium |
CN112543362A (en) * | 2020-11-02 | 2021-03-23 | 当趣网络科技(杭州)有限公司 | Display interface switching method, remote controller, television system and electronic equipment |
CN113867594A (en) * | 2021-10-21 | 2021-12-31 | 元心信息科技集团有限公司 | Information input panel switching method and device, electronic equipment and storage medium |
CN115087952A (en) * | 2020-02-10 | 2022-09-20 | 日本电气株式会社 | Program for portable terminal, processing method, and portable terminal |
Families Citing this family (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012168932A (en) | 2011-02-10 | 2012-09-06 | Sony Computer Entertainment Inc | Input device, information processing device and input value acquisition method |
KR101866272B1 (en) * | 2011-12-15 | 2018-06-12 | 삼성전자주식회사 | Apparatus and method for user-based use of a grip sensor in a portable terminal |
US9591339B1 (en) | 2012-11-27 | 2017-03-07 | Apple Inc. | Agnostic media delivery system |
US9774917B1 (en) | 2012-12-10 | 2017-09-26 | Apple Inc. | Channel bar user interface |
US10200761B1 (en) | 2012-12-13 | 2019-02-05 | Apple Inc. | TV side bar user interface |
US9532111B1 (en) | 2012-12-18 | 2016-12-27 | Apple Inc. | Devices and method for providing remote control hints on a display |
US10521188B1 (en) | 2012-12-31 | 2019-12-31 | Apple Inc. | Multi-user TV user interface |
JP5995171B2 (en) * | 2013-03-13 | 2016-09-21 | シャープ株式会社 | Electronic device, information processing method, and information processing program |
KR102139110B1 (en) * | 2013-06-20 | 2020-07-30 | 삼성전자주식회사 | Electronic device and method for controlling using grip sensing in the electronic device |
US9134818B2 (en) * | 2013-07-12 | 2015-09-15 | Facebook, Inc. | Isolating mobile device electrode |
WO2015200227A1 (en) | 2014-06-24 | 2015-12-30 | Apple Inc. | Column interface for navigating in a user interface |
KR102707403B1 (en) | 2014-06-24 | 2024-09-20 | 애플 인크. | Input device and user interface interactions |
KR102291565B1 (en) | 2014-12-03 | 2021-08-19 | 삼성디스플레이 주식회사 | Display device and driving method for display device using the same |
KR101686629B1 (en) * | 2015-01-30 | 2016-12-14 | 한국과학기술연구원 | Method for determining location in virtual space indicated by users input regarding information on pressure and apparatus and computer-readable recording medium using the same |
KR102358110B1 (en) | 2015-03-05 | 2022-02-07 | 삼성디스플레이 주식회사 | Display apparatus |
KR102384284B1 (en) * | 2015-04-01 | 2022-04-08 | 삼성전자주식회사 | Apparatus and method for controlling volume using touch screen |
US10157410B2 (en) * | 2015-07-14 | 2018-12-18 | Ebay Inc. | Enhanced shopping actions on a mobile device |
KR101876020B1 (en) * | 2016-05-10 | 2018-07-06 | 홍익대학교세종캠퍼스산학협력단 | Cursor Scrolling Control Method Using A 3D Touch Of A Mobile Device |
DK201670581A1 (en) | 2016-06-12 | 2018-01-08 | Apple Inc | Device-level authorization for viewing content |
DK201670582A1 (en) | 2016-06-12 | 2018-01-02 | Apple Inc | Identifying applications on which content is available |
US11966560B2 (en) | 2016-10-26 | 2024-04-23 | Apple Inc. | User interfaces for browsing content from multiple content applications on an electronic device |
DK201870354A1 (en) | 2018-06-03 | 2019-12-20 | Apple Inc. | Setup procedures for an electronic device |
KR102539579B1 (en) | 2018-12-18 | 2023-06-05 | 삼성전자주식회사 | Electronic device for adaptively changing display area of information and operation method thereof |
EP3928526A1 (en) | 2019-03-24 | 2021-12-29 | Apple Inc. | User interfaces for viewing and accessing content on an electronic device |
US11445263B2 (en) | 2019-03-24 | 2022-09-13 | Apple Inc. | User interfaces including selectable representations of content items |
CN114297620A (en) | 2019-03-24 | 2022-04-08 | 苹果公司 | User interface for media browsing application |
US11683565B2 (en) | 2019-03-24 | 2023-06-20 | Apple Inc. | User interfaces for interacting with channels that provide content that plays in a media browsing application |
CN113906380A (en) | 2019-05-31 | 2022-01-07 | 苹果公司 | User interface for podcast browsing and playback applications |
US11863837B2 (en) | 2019-05-31 | 2024-01-02 | Apple Inc. | Notification of augmented reality content on an electronic device |
CN112486346B (en) | 2019-09-12 | 2023-05-30 | 北京小米移动软件有限公司 | Key mode setting method, device and storage medium |
JP7279622B2 (en) * | 2019-11-22 | 2023-05-23 | トヨタ自動車株式会社 | display device and display program |
US11843838B2 (en) | 2020-03-24 | 2023-12-12 | Apple Inc. | User interfaces for accessing episodes of a content series |
US11899895B2 (en) | 2020-06-21 | 2024-02-13 | Apple Inc. | User interfaces for setting up an electronic device |
KR20220064162A (en) * | 2020-11-11 | 2022-05-18 | 삼성전자주식회사 | An electronic device including a stretchable display |
US11720229B2 (en) | 2020-12-07 | 2023-08-08 | Apple Inc. | User interfaces for browsing and presenting content |
US11934640B2 (en) | 2021-01-29 | 2024-03-19 | Apple Inc. | User interfaces for record labels |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7800592B2 (en) * | 2005-03-04 | 2010-09-21 | Apple Inc. | Hand held electronic device with multiple touch sensing devices |
US20050134578A1 (en) * | 2001-07-13 | 2005-06-23 | Universal Electronics Inc. | System and methods for interacting with a control environment |
CN1241109C (en) * | 2001-07-17 | 2006-02-08 | 仁宝电脑工业股份有限公司 | Touch display able to control amplification ratio by pressure |
US7009599B2 (en) * | 2001-11-20 | 2006-03-07 | Nokia Corporation | Form factor for portable device |
US20030117376A1 (en) * | 2001-12-21 | 2003-06-26 | Elen Ghulam | Hand gesturing input device |
GB0201074D0 (en) * | 2002-01-18 | 2002-03-06 | 3G Lab Ltd | Graphic user interface for data processing device |
US7116314B2 (en) * | 2003-05-06 | 2006-10-03 | International Business Machines Corporation | Method for distribution wear for a touch entry display |
WO2005008444A2 (en) * | 2003-07-14 | 2005-01-27 | Matt Pallakoff | System and method for a portable multimedia client |
EP1557744B1 (en) * | 2004-01-20 | 2008-04-16 | Sony Deutschland GmbH | Haptic key controlled data input |
KR100608576B1 (en) * | 2004-11-19 | 2006-08-03 | 삼성전자주식회사 | Apparatus and method for controlling a potable electronic device |
CN101133385B (en) * | 2005-03-04 | 2014-05-07 | 苹果公司 | Hand held electronic device, hand held device and operation method thereof |
CN1901785B (en) * | 2005-07-22 | 2012-08-29 | 鸿富锦精密工业(深圳)有限公司 | Display device and its display control method |
CN100592247C (en) * | 2005-09-21 | 2010-02-24 | 鸿富锦精密工业(深圳)有限公司 | Multi-gradation menu displaying device and display control method |
CN1940834B (en) * | 2005-09-30 | 2014-10-29 | 鸿富锦精密工业(深圳)有限公司 | Circular menu display device and its display controlling method |
JP4699955B2 (en) * | 2006-07-21 | 2011-06-15 | シャープ株式会社 | Information processing device |
KR101144423B1 (en) * | 2006-11-16 | 2012-05-10 | 엘지전자 주식회사 | Mobile phone and display method of the same |
JP2008204402A (en) * | 2007-02-22 | 2008-09-04 | Eastman Kodak Co | User interface device |
-
2009
- 2009-02-17 KR KR1020090012687A patent/KR20100039194A/en active Search and Examination
- 2009-08-10 ES ES09167533T patent/ES2776103T3/en active Active
- 2009-09-14 CN CN201710119962.2A patent/CN106909305B/en active Active
- 2009-09-14 CN CN201710119581.4A patent/CN106909304B/en active Active
- 2009-09-14 CN CN200910169036A patent/CN101714055A/en active Pending
Cited By (106)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107665089B (en) * | 2010-08-12 | 2021-01-22 | 谷歌有限责任公司 | Finger recognition on touch screens |
CN103282869A (en) * | 2010-08-12 | 2013-09-04 | 谷歌公司 | Finger identification on a touchscreen |
CN107665089A (en) * | 2010-08-12 | 2018-02-06 | 谷歌公司 | Finger identification on touch-screen |
CN102375652A (en) * | 2010-08-16 | 2012-03-14 | 中国移动通信集团公司 | Mobile terminal user interface regulation system and method |
CN102402275A (en) * | 2010-09-13 | 2012-04-04 | 联想(北京)有限公司 | Portable electronic equipment and holding gesture detection method |
CN103140822A (en) * | 2010-10-13 | 2013-06-05 | Nec卡西欧移动通信株式会社 | Mobile terminal device and display method for touch panel in mobile terminal device |
CN102479035A (en) * | 2010-11-23 | 2012-05-30 | 汉王科技股份有限公司 | Electronic device with touch screen, and method for displaying left or right hand control interface |
CN102256015A (en) * | 2011-04-06 | 2011-11-23 | 罗蒙明 | Virtual keyboard finger key-pressing judgment method for touch mobile phone |
CN102790816A (en) * | 2011-05-16 | 2012-11-21 | 中兴通讯股份有限公司 | Processing method and device of pushbutton function |
CN102810039A (en) * | 2011-05-31 | 2012-12-05 | 中兴通讯股份有限公司 | Left or right hand adapting virtual keyboard display method and terminal |
CN102841723A (en) * | 2011-06-20 | 2012-12-26 | 联想(北京)有限公司 | Portable terminal and display switching method thereof |
CN102841723B (en) * | 2011-06-20 | 2016-08-10 | 联想(北京)有限公司 | Portable terminal and display switching method thereof |
CN103502924B (en) * | 2011-06-24 | 2016-08-31 | 株式会社Ntt都科摩 | Mobile information terminal and operating mode determination method |
CN103502924A (en) * | 2011-06-24 | 2014-01-08 | 株式会社Ntt都科摩 | Mobile information terminal and operational state assessment method |
CN103502914B (en) * | 2011-06-29 | 2016-05-25 | 株式会社Ntt都科摩 | Mobile information terminal and position region acquisition method |
CN103502914A (en) * | 2011-06-29 | 2014-01-08 | 株式会社Ntt都科摩 | Mobile information terminal and position region acquisition method |
CN102299996A (en) * | 2011-08-19 | 2011-12-28 | 华为终端有限公司 | Handheld device operating mode distinguishing method and handheld device |
CN103947286A (en) * | 2011-09-30 | 2014-07-23 | 英特尔公司 | Mobile device rejection of unintentional touch sensor contact |
CN103947286B (en) * | 2011-09-30 | 2019-01-01 | 英特尔公司 | Mobile device and method for rejecting unintentional touch sensor contact |
US9389772B2 (en) | 2011-11-23 | 2016-07-12 | Samsung Electronics Co., Ltd. | Method and apparatus for peripheral connection |
CN103136130B (en) * | 2011-11-23 | 2016-06-08 | 三星电子株式会社 | Peripheral method of attachment and device |
US10120548B2 (en) | 2011-11-23 | 2018-11-06 | Samsung Electronics Co., Ltd. | Method and apparatus for peripheral connection |
CN103136130A (en) * | 2011-11-23 | 2013-06-05 | 三星电子株式会社 | Method and apparatus for peripheral connection |
CN102722247A (en) * | 2012-03-09 | 2012-10-10 | 张伟明 | Operation and control component, information processing system using same and information processing method thereof |
CN103324423A (en) * | 2012-03-21 | 2013-09-25 | 北京三星通信技术研究有限公司 | Terminal and user interface display method thereof |
US11461004B2 (en) | 2012-05-15 | 2022-10-04 | Samsung Electronics Co., Ltd. | User interface supporting one-handed operation and terminal supporting the same |
CN107704082A (en) * | 2012-05-15 | 2018-02-16 | 三星电子株式会社 | Method of operating display unit and terminal supporting the method |
CN107704082B (en) * | 2012-05-15 | 2021-11-30 | 三星电子株式会社 | Method of operating display unit and terminal supporting the same |
CN102662603A (en) * | 2012-05-18 | 2012-09-12 | 广州市渡明信息技术有限公司 | Input method display method and input method display system for mobile phone with touch screen |
CN103513763A (en) * | 2012-06-26 | 2014-01-15 | Lg电子株式会社 | Mobile terminal and control method thereof |
CN102890558B (en) * | 2012-10-26 | 2015-08-19 | 北京金和软件股份有限公司 | Method for detecting handheld motion state of mobile handheld device based on sensor |
CN102890558A (en) * | 2012-10-26 | 2013-01-23 | 北京金和软件股份有限公司 | Method for detecting handheld motion state of mobile handheld device based on sensor |
CN103809866A (en) * | 2012-11-13 | 2014-05-21 | 联想(北京)有限公司 | Operation mode switching method and electronic equipment |
CN103118166A (en) * | 2012-11-27 | 2013-05-22 | 广东欧珀移动通信有限公司 | Method of realizing single hand operation of mobile phone based on pressure sensing |
CN103870140A (en) * | 2012-12-13 | 2014-06-18 | 联想(北京)有限公司 | Object processing method and device |
CN103576850A (en) * | 2012-12-26 | 2014-02-12 | 深圳市创荣发电子有限公司 | Method and system for judging holding mode of handheld device |
CN103902141A (en) * | 2012-12-27 | 2014-07-02 | 北京富纳特创新科技有限公司 | Device and method for achieving dynamic arrangement of desktop functional icons |
CN104798030B (en) * | 2012-12-28 | 2020-06-09 | 英特尔公司 | Adapting user interface based on handedness of use of mobile computing device |
CN104798030A (en) * | 2012-12-28 | 2015-07-22 | 英特尔公司 | Adapting user interface based on handedness of use of mobile computing device |
US9904394B2 (en) | 2013-03-13 | 2018-02-27 | Immersion Corporation | Method and devices for displaying graphical user interfaces based on user contact |
CN104049734B (en) * | 2013-03-13 | 2019-05-03 | 意美森公司 | Method and apparatus for displaying graphical user interfaces based on user contact |
CN104049734A (en) * | 2013-03-13 | 2014-09-17 | 伊梅森公司 | Method and devices for displaying graphical user interfaces based on user contact |
CN104360813A (en) * | 2013-04-12 | 2015-02-18 | 深圳市中兴移动通信有限公司 | Display equipment and information processing method thereof |
CN104216602B (en) * | 2013-05-31 | 2017-10-20 | 国际商业机器公司 | Method and system for controlling a slider |
CN104216602A (en) * | 2013-05-31 | 2014-12-17 | 国际商业机器公司 | Method and system for controlling slider |
CN104281408A (en) * | 2013-07-10 | 2015-01-14 | Lg电子株式会社 | Mobile terminal and controlling method thereof |
CN104281408B (en) * | 2013-07-10 | 2019-05-31 | Lg电子株式会社 | Mobile terminal and its control method |
CN104679323B (en) * | 2013-09-06 | 2019-03-08 | 意美森公司 | Dynamic haptic conversion system |
CN104679323A (en) * | 2013-09-06 | 2015-06-03 | 意美森公司 | Dynamic haptic conversion system |
CN104714731A (en) * | 2013-12-12 | 2015-06-17 | 中兴通讯股份有限公司 | Display method and device for terminal interface |
CN104714731B (en) * | 2013-12-12 | 2019-10-11 | 南京中兴软件有限责任公司 | Display method and device for terminal interface |
CN104765446A (en) * | 2014-01-07 | 2015-07-08 | 三星电子株式会社 | Electronic device and method of controlling electronic device |
CN103795949A (en) * | 2014-01-14 | 2014-05-14 | 四川长虹电器股份有限公司 | Control terminal, device terminal and system for adjusting volume of device terminal |
CN104793856A (en) * | 2014-01-22 | 2015-07-22 | Lg电子株式会社 | Mobile terminal and method of controlling the mobile terminal |
CN104793856B (en) * | 2014-01-22 | 2020-04-14 | Lg电子株式会社 | Mobile terminal and method for controlling mobile terminal |
CN104850339B (en) * | 2014-02-19 | 2018-06-01 | 联想(北京)有限公司 | Information processing method and electronic device |
CN104850339A (en) * | 2014-02-19 | 2015-08-19 | 联想(北京)有限公司 | Information processing method and electronic equipment |
CN104915073B (en) * | 2014-03-14 | 2018-06-01 | 敦泰科技有限公司 | Hand-held type touch device |
CN104915073A (en) * | 2014-03-14 | 2015-09-16 | 敦泰科技有限公司 | Hand-held type touch device |
CN106104434B (en) * | 2014-03-17 | 2019-07-16 | 谷歌有限责任公司 | Determining user handedness and orientation using a touchscreen device |
CN106104434A (en) * | 2014-03-17 | 2016-11-09 | 谷歌公司 | Determining user handedness and orientation using a touchscreen device |
CN106462341B (en) * | 2014-06-12 | 2020-02-28 | 微软技术许可有限责任公司 | Sensor correlation for pen and touch sensitive computing device interaction |
CN106462341A (en) * | 2014-06-12 | 2017-02-22 | 微软技术许可有限责任公司 | Sensor correlation for pen and touch-sensitive computing device interaction |
CN105468269A (en) * | 2014-08-15 | 2016-04-06 | 深圳市中兴微电子技术有限公司 | Mobile terminal capable of automatically identifying holding by left hand or right hand, and implementation method thereof |
CN105468245B (en) * | 2014-08-22 | 2020-05-01 | 中兴通讯股份有限公司 | Terminal and display method of terminal operation interface |
CN105468245A (en) * | 2014-08-22 | 2016-04-06 | 中兴通讯股份有限公司 | Terminal and display method for terminal operation interface |
JP2016099355A (en) * | 2014-11-20 | 2016-05-30 | ケースレー・インスツルメンツ・インコーポレイテッドKeithley Instruments,Inc. | Graphical display method |
CN104461322A (en) * | 2014-12-30 | 2015-03-25 | 中科创达软件股份有限公司 | Display method and system for user interface of handheld device |
CN104615368A (en) * | 2015-01-21 | 2015-05-13 | 上海华豚科技有限公司 | Following switching method of keyboard interface |
CN104571919A (en) * | 2015-01-26 | 2015-04-29 | 深圳市中兴移动通信有限公司 | Terminal screen display method and device |
CN104679427A (en) * | 2015-01-29 | 2015-06-03 | 深圳市中兴移动通信有限公司 | Terminal split-screen display method and system |
CN105988692A (en) * | 2015-02-02 | 2016-10-05 | 中兴通讯股份有限公司 | Handheld electronic equipment, and method and device for controlling handheld electronic equipment |
WO2016123890A1 (en) * | 2015-02-02 | 2016-08-11 | 中兴通讯股份有限公司 | Handheld electronic device and control method, apparatus, and computer storage medium thereof |
CN104731501A (en) * | 2015-03-25 | 2015-06-24 | 努比亚技术有限公司 | Icon control method and mobile terminal |
WO2016155509A1 (en) * | 2015-03-27 | 2016-10-06 | 努比亚技术有限公司 | Method and device for determining holding mode of mobile terminal |
WO2016155434A1 (en) * | 2015-03-31 | 2016-10-06 | 努比亚技术有限公司 | Method and device for recognizing holding of mobile terminal, storage medium, and terminal |
CN104834463A (en) * | 2015-03-31 | 2015-08-12 | 努比亚技术有限公司 | Holding recognition method and device of mobile terminal |
CN104765541A (en) * | 2015-04-10 | 2015-07-08 | 南京理工大学 | Method and system for identifying whether left hand or right hand operates mobile phone |
CN104898959B (en) * | 2015-04-30 | 2018-06-05 | 努比亚技术有限公司 | Method and apparatus for adjusting the position of a virtual button |
CN104898959A (en) * | 2015-04-30 | 2015-09-09 | 努比亚技术有限公司 | Method and device for regulating position of virtual push button |
CN104866136A (en) * | 2015-05-11 | 2015-08-26 | 努比亚技术有限公司 | Terminal operation mode determining method and apparatus |
CN104866136B (en) * | 2015-05-11 | 2019-02-15 | 努比亚技术有限公司 | Method and device for determining terminal operating mode |
CN106227375A (en) * | 2015-06-02 | 2016-12-14 | 三星电子株式会社 | Method for controlling display of electronic device and electronic device thereof |
CN104915143A (en) * | 2015-06-19 | 2015-09-16 | 努比亚技术有限公司 | Frameless mobile terminal control method and terminal |
CN105227768A (en) * | 2015-09-18 | 2016-01-06 | 努比亚技术有限公司 | A kind of application APP display system and method |
CN105183235B (en) * | 2015-10-19 | 2018-02-06 | 上海斐讯数据通信技术有限公司 | Method for preventing accidental touches on the edge of a touch screen |
CN105183235A (en) * | 2015-10-19 | 2015-12-23 | 上海斐讯数据通信技术有限公司 | Method for preventing accidental touches on the edge of a touch screen |
CN105224181B (en) * | 2015-10-20 | 2018-05-25 | 魅族科技(中国)有限公司 | Sidebar display method and device |
CN105224181A (en) * | 2015-10-20 | 2016-01-06 | 魅族科技(中国)有限公司 | Sidebar display method and device |
CN106610746A (en) * | 2015-10-26 | 2017-05-03 | 青岛海信移动通信技术股份有限公司 | Mobile terminal and control method thereof |
CN105573622A (en) * | 2015-12-15 | 2016-05-11 | 广东欧珀移动通信有限公司 | Single-hand control method and device of user interface and terminal device |
CN106406656B (en) * | 2016-08-30 | 2019-07-26 | 维沃移动通信有限公司 | Control method for an application toolbar and mobile terminal |
CN106406656A (en) * | 2016-08-30 | 2017-02-15 | 维沃移动通信有限公司 | Application program toolbar control method and mobile terminal |
CN108124054B (en) * | 2016-11-29 | 2022-06-17 | 三星电子株式会社 | Apparatus for displaying user interface based on sensing signal of grip sensor |
CN108124054A (en) * | 2016-11-29 | 2018-06-05 | 三星电子株式会社 | Device displaying a user interface based on the sensing signal of a grip sensor |
CN106648329A (en) * | 2016-12-30 | 2017-05-10 | 维沃移动通信有限公司 | Application icon display method and mobile terminal |
US10635255B2 (en) | 2017-04-18 | 2020-04-28 | Google Llc | Electronic device response to force-sensitive interface |
US11237660B2 (en) | 2017-04-18 | 2022-02-01 | Google Llc | Electronic device response to force-sensitive interface |
CN108737633A (en) * | 2017-04-18 | 2018-11-02 | 谷歌有限责任公司 | Electronic device responsive to a force-sensitive interface |
CN109710099A (en) * | 2017-10-26 | 2019-05-03 | 南昌欧菲生物识别技术有限公司 | Electronic device |
CN110709808A (en) * | 2017-12-14 | 2020-01-17 | 深圳市柔宇科技有限公司 | Control method and electronic device |
CN110874117B (en) * | 2018-09-03 | 2021-05-18 | 宏达国际电子股份有限公司 | Method for operating handheld device, and computer-readable recording medium |
CN110874117A (en) * | 2018-09-03 | 2020-03-10 | 宏达国际电子股份有限公司 | Method for operating handheld device, and computer-readable recording medium |
CN115087952A (en) * | 2020-02-10 | 2022-09-20 | 日本电气株式会社 | Program for portable terminal, processing method, and portable terminal |
CN112543362A (en) * | 2020-11-02 | 2021-03-23 | 当趣网络科技(杭州)有限公司 | Display interface switching method, remote controller, television system and electronic equipment |
CN113867594A (en) * | 2021-10-21 | 2021-12-31 | 元心信息科技集团有限公司 | Information input panel switching method and device, electronic equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN106909304B (en) | 2020-08-14 |
CN106909304A (en) | 2017-06-30 |
KR20100039194A (en) | 2010-04-15 |
ES2776103T3 (en) | 2020-07-29 |
CN106909305B (en) | 2020-10-27 |
CN106909305A (en) | 2017-06-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN101714055A (en) | Method and apparatus for displaying graphical user interface depending on a user's contact pattern | |
EP2175344B1 (en) | Method and apparatus for displaying graphical user interface depending on a user's contact pattern | |
US8159469B2 (en) | User interface for initiating activities in an electronic device | |
US20140123049A1 (en) | Keyboard with gesture-redundant keys removed | |
US8856674B2 (en) | Electronic device and method for character deletion | |
US20140152585A1 (en) | Scroll jump interface for touchscreen input/output device | |
US20070229472A1 (en) | Circular scrolling touchpad functionality determined by starting position of pointing object on touchpad surface | |
CN105630327B (en) | Portable electronic device and method of controlling display of selectable elements |
CN101675410A (en) | Virtual keyboard input system using pointing apparatus in digital device |
KR20070085631A (en) | Portable electronic device having user interactive visual interface | |
WO2012101710A1 (en) | Input device, input method, and computer program | |
CN103733162A (en) | Method and apparatus for providing character input interface | |
KR20080111453A (en) | User interface for scrolling | |
CN103543945A (en) | System and method for displaying keypad via various types of gestures | |
JP2012155483A (en) | Input device, input method and computer program | |
US20110227844A1 (en) | Method and apparatus for inputting character in portable terminal | |
US20120120004A1 (en) | Touch control device and touch control method with multi-touch function | |
CN102279699A (en) | Information processing apparatus, information processing method, and program | |
KR20130035857A (en) | Apparatus and method for mobile screen navigation | |
WO2014056338A1 (en) | Method and device for interaction of list data of mobile terminal | |
KR20110082494A (en) | Method for data transferring between applications and terminal apparatus using the method | |
KR101879856B1 (en) | Apparatus and method for setting idle screen | |
US20150042585A1 (en) | System and electronic device of transiently switching operational status of touch panel | |
US20130111390A1 (en) | Electronic device and method of character entry | |
KR101208202B1 (en) | System and method for non-roman text input |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20100526 |
|
RJ01 | Rejection of invention patent application after publication |