CN105511726B - user input - Google Patents
- Publication number
- CN105511726B (application CN201510919403.0A)
- Authority
- CN
- China
- Prior art keywords
- item
- image item
- graph
- image
- touch
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000004590 computer program Methods 0.000 claims abstract description 25
- 230000004044 response Effects 0.000 claims abstract description 21
- 238000000034 method Methods 0.000 claims abstract description 17
- 238000001514 detection method Methods 0.000 claims description 6
- 230000003068 static effect Effects 0.000 claims description 3
- 210000003811 finger Anatomy 0.000 description 58
- 210000003813 thumb Anatomy 0.000 description 28
- 230000006870 function Effects 0.000 description 6
- 230000007246 mechanism Effects 0.000 description 3
- 230000001413 cellular effect Effects 0.000 description 2
- 238000010586 diagram Methods 0.000 description 2
- 238000010295 mobile communication Methods 0.000 description 2
- 230000004913 activation Effects 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 230000008859 change Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 238000007689 inspection Methods 0.000 description 1
- 230000002045 lasting effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Embodiments of the disclosure relate to user input. An apparatus, a method and a computer program are provided. The apparatus comprises: at least one processor; and at least one memory storing computer program instructions, the at least one processor being configured to execute the computer program instructions to cause the apparatus at least to perform: controlling a touch-sensitive display to display a first graphical item at a first position and a second graphical item at a second position, the second graphical item being separated from the first graphical item by a first distance; detecting a first user finger at the first position and a second user finger at the second position; in response to detecting movement of at least one of the first user finger and the second user finger across the touch-sensitive display, reducing the first distance by moving at least one of the first graphical item and the second graphical item across the touch-sensitive display; and after determining that the reduced first distance places the first graphical item proximate the second graphical item, controlling the touch-sensitive display to display a third graphical item.
Description
This application is a divisional of invention patent application No. 201080062072.9 (entitled "User input"), which has an international filing date of 20 January 2010 and entered the Chinese national phase on 20 July 2012.
Technical field
Embodiments of the present invention relate to user input, and in particular to organizing graphical menu items using a touch-sensitive display.
Background
Some electronic devices include a touch-sensitive display. A touch-sensitive display enables a user to provide user input by touching the display.
Summary of the invention
According to various, but not necessarily all, embodiments of the invention there is provided an apparatus comprising: at least one processor; and at least one memory storing computer program instructions, the at least one processor being configured to execute the computer program instructions to cause the apparatus at least to perform: controlling a touch-sensitive display to display a first graphical item at a first position and a second graphical item at a second position, the second graphical item being separated from the first graphical item by a first distance; detecting a first user finger at the first position and a second user finger at the second position; in response to detecting movement of at least one of the first user finger and the second user finger across the touch-sensitive display, reducing the first distance by moving at least one of the first graphical item and the second graphical item across the touch-sensitive display; and after determining that the reduced first distance places the first graphical item proximate the second graphical item, controlling the touch-sensitive display to display a third graphical item.
According to various, but not necessarily all, embodiments of the invention there is provided a method comprising: controlling a touch-sensitive display to display a first graphical item at a first position and a second graphical item at a second position, the second graphical item being separated from the first graphical item by a first distance; detecting a first user finger at the first position and a second user finger at the second position; in response to detecting movement of at least one of the first user finger and the second user finger across the touch-sensitive display, reducing the first distance by moving at least one of the first graphical item and the second graphical item across the touch-sensitive display; and after determining that the reduced first distance places the first graphical item proximate the second graphical item, controlling the touch-sensitive display to display a third graphical item.
According to various, but not necessarily all, embodiments of the invention there is provided a computer program comprising computer program instructions which, when executed by at least one processor, cause an apparatus at least to perform: controlling a touch-sensitive display to display a first graphical item at a first position and a second graphical item at a second position, the second graphical item being separated from the first graphical item by a first distance; detecting a first user finger at the first position and a second user finger at the second position; in response to detecting movement of at least one of the first user finger and the second user finger across the touch-sensitive display, reducing the first distance by moving at least one of the first graphical item and the second graphical item across the touch-sensitive display; and after determining that the reduced first distance places the first graphical item proximate the second graphical item, controlling the touch-sensitive display to display a third graphical item.
According to various, but not necessarily all, embodiments of the invention there is provided an apparatus comprising: means for controlling a touch-sensitive display to display a first graphical item at a first position and a second graphical item at a second position, the second graphical item being separated from the first graphical item by a first distance; means for detecting a first user finger at the first position and a second user finger at the second position; means for reducing the first distance, in response to detecting movement of at least one of the first user finger and the second user finger across the touch-sensitive display, by moving at least one of the first graphical item and the second graphical item across the touch-sensitive display; and means for controlling the touch-sensitive display to display a third graphical item after determining that the reduced first distance places the first graphical item proximate the second graphical item.
According to various, but not necessarily all, embodiments of the invention there is provided a graphical user interface comprising: a first graphical item displayed at a first position on a touch-sensitive display and a second graphical item displayed at a second position on the touch-sensitive display, the second graphical item being separated from the first graphical item by a first distance, wherein placement of a first user finger at the first position and of a second user finger at the second position causes at least one of the first graphical item and the second graphical item to be identified for movement, and movement of at least one of the first finger and the second finger across the touch-sensitive display reduces the first distance; and, after the reduced first distance places the first graphical item proximate the second graphical item, a third graphical item is displayed on the touch-sensitive display.
Brief description of the drawings
For a better understanding of various examples of embodiments of the present invention, reference will now be made, by way of example only, to the accompanying drawings, in which:
Fig. 1 illustrates an apparatus;
Fig. 2 illustrates another apparatus;
Fig. 3 illustrates a flow chart of a method;
Fig. 4A illustrates an apparatus displaying a first graphical item and a second graphical item;
Fig. 4B illustrates the first and second graphical items after they have been moved by a user;
Fig. 4C illustrates the apparatus displaying a third graphical item;
Fig. 5A illustrates an exemplary first graphical item;
Fig. 5B illustrates an exemplary second graphical item;
Fig. 5C illustrates an exemplary third graphical item;
Fig. 6A illustrates another exemplary first graphical item;
Fig. 6B illustrates another exemplary second graphical item;
Fig. 6C illustrates another exemplary third graphical item;
Fig. 7A illustrates the action performed in response to selection of the first graphical item;
Fig. 7B illustrates the action performed in response to selection of the second graphical item;
Fig. 7C illustrates the action performed in response to selection of the third graphical item;
Fig. 8A illustrates the apparatus displaying the third graphical item; and
Fig. 8B illustrates the apparatus displaying the first and second graphical items.
Detailed description
Embodiments of the present invention enable a user to organize graphical menu items using a touch-sensitive display. A touch-sensitive display that detects multiple simultaneous touches provides an intuitive user experience.
The figures illustrate an apparatus 10/30 comprising: at least one processor 12; and at least one memory 14 storing computer program instructions 18, the at least one processor 12 being configured to execute the computer program instructions 18 to cause the apparatus 10/30 at least to perform: controlling a touch-sensitive display 22 to display a first graphical item 50/150 at a first position 51 and a second graphical item 60/160 at a second position 61, the second graphical item 60/160 being separated from the first graphical item 50/150 by a first distance 100; detecting a first user finger 120 at the first position 51 and a second user finger 130 at the second position 61; in response to detecting movement of at least one of the first user finger 120 and the second user finger 130 across the touch-sensitive display 22, reducing the first distance 100 by moving at least one of the first graphical item 50/150 and the second graphical item 60/160 across the touch-sensitive display 22; and after determining that the reduced first distance 100 places the first graphical item 50/150 proximate the second graphical item 60/160, controlling the touch-sensitive display 22 to display a third graphical item 70/170.
Fig. 1 illustrates an apparatus 10. The apparatus may, for example, be a chip or a chipset. The apparatus 10 illustrated in Fig. 1 comprises a processor 12 and a memory 14. In alternative embodiments of the invention, the apparatus 10 may comprise multiple processors.
The processor 12 is configured to read from and write to the memory 14. The processor 12 may also comprise an output interface via which data and/or commands are output by the processor 12, and an input interface via which data and/or commands are input to the processor 12.
Although the memory 14 is illustrated as a single component, it may be implemented as one or more separate components, some or all of which may be integrated/removable and/or may provide permanent/semi-permanent/dynamic/cached storage.
The memory 14 stores a computer program 16 comprising computer program instructions 18 that control the operation of the apparatus 10/30 when loaded into the processor 12. The computer program instructions 18 provide the logic and routines that enable the apparatus 10/30 to perform the method illustrated in Fig. 3. The processor 12, by reading the memory 14, is able to load and execute the computer program instructions 18.
The computer program 16 may arrive at the apparatus 10/30 via any suitable delivery mechanism 40. The delivery mechanism 40 may be, for example, a tangible computer-readable storage medium, a computer program product, a memory device, a recording medium such as a CD-ROM, DVD or Blu-ray disc, or any article of manufacture that tangibly embodies the computer program 16. The delivery mechanism 40 may be a signal configured to reliably transfer the computer program 16.
Fig. 2 illustrates another apparatus 30. The apparatus 30 illustrated in Fig. 2 may, for example, be a hand-held portable electronic device such as a mobile telephone, a personal music player, a personal digital assistant, a computer, a games console or a camera.
The apparatus 30 illustrated in Fig. 2 comprises the apparatus 10 illustrated in Fig. 1. The apparatus 30 further comprises a housing 28, a touch-sensitive display 22 and, optionally, a radio frequency transceiver 24. The housing 28 houses the processor 12, the memory 14, the touch-sensitive display 22 and the radio frequency transceiver 24. The elements 12, 14, 22 and 24 are co-located within the housing 28 and are operationally coupled; any number or combination of intervening elements may exist between them (including no intervening elements).
The processor 12 is configured to provide output to the touch-sensitive display 22 and the radio frequency transceiver 24. The processor 12 is configured to receive input from the radio frequency transceiver 24 and the touch-sensitive display 22.
The touch-sensitive display 22 is configured to provide a graphical user interface. The touch-sensitive display 22 may be any type of touch-sensitive display, such as a resistive touch-sensitive display or a capacitive touch-sensitive display. The touch-sensitive display 22 is configured to detect multiple different touch inputs simultaneously.
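As an illustration of the kind of multi-touch handling this implies, the following self-contained Kotlin sketch maps each simultaneously reported touch point to the graphical item displayed at that position. It is only a model under assumed names and item sizes, not code taken from the patent.

```kotlin
data class Point(val x: Float, val y: Float)
data class GraphicalItem(val id: String, val position: Point, val halfSize: Float = 32f)

// Returns the graphical item (if any) whose on-screen bounds contain the touch point.
fun itemAt(touch: Point, items: List<GraphicalItem>): GraphicalItem? =
    items.firstOrNull { item ->
        kotlin.math.abs(touch.x - item.position.x) <= item.halfSize &&
        kotlin.math.abs(touch.y - item.position.y) <= item.halfSize
    }

fun main() {
    val items = listOf(
        GraphicalItem("messaging", Point(60f, 200f)),
        GraphicalItem("call log", Point(260f, 200f)),
    )
    // Two simultaneous touch points reported by the touch-sensitive display.
    val touches = listOf(Point(58f, 205f), Point(262f, 198f))
    println(touches.map { itemAt(it, items)?.id })   // [messaging, call log]
}
```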
The radio frequency transceiver 24 is configured to transmit and receive radio frequency signals. The radio frequency transceiver 24 may, for example, be a cellular transceiver that is compatible with one or more cellular protocols such as GSM (Global System for Mobile Communications), IS-95 (Interim Standard 95) or UMTS (Universal Mobile Telecommunications System). Alternatively, the radio frequency transceiver 24 may be a short-range transceiver that is compatible with one or more short-range protocols such as the Bluetooth protocol or an IEEE (Institute of Electrical and Electronics Engineers) protocol. In some embodiments of the invention, the apparatus 30 comprises one or more cellular transceivers and one or more short-range transceivers.
An exemplary method according to embodiments of the invention will now be described in relation to Figs. 3 to 7C.
At block 301 of Fig. 3, the processor 12 controls the touch-sensitive display 22 to display a first graphical item 50 at a first position 51 and a second graphical item 60 at a second position 61. The first graphical item 50 is separated from the second graphical item 60 by a first distance 100. Fig. 4A illustrates the first graphical item 50 and the second graphical item 60 displayed on the touch-sensitive display 22.
In this example, the first graphical item 50 and the second graphical item 60 are individually selectable by a user. That is, the first graphical item 50 may be selected without selecting the second graphical item 60, and the second graphical item 60 may be selected without selecting the first graphical item 50.
The user may provide user input at the position 51, 61 of a graphical item 50, 60 that is interpreted by the processor 12 as selection of that graphical item 50, 60. The type of touch input that the processor 12 interprets as selection of a graphical item 50, 60 may differ in different embodiments of the invention. For example, in some embodiments of the invention the processor 12 may interpret a "double touch" at a graphical item 50, 60 as selection of that graphical item 50, 60. A "double touch" involves the user providing a first touch and a second touch in rapid succession at the position 51, 61 of the graphical item 50, 60. The processor 12 interprets the first touch and the second touch as a "double touch" if the time interval between the first touch and the second touch is less than a threshold period of time.
In other embodiments of the invention, the processor 12 interprets a "single touch" at a graphical item 50, 60 as selection of that graphical item 50, 60.
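A minimal Kotlin sketch of the time-interval test described above is given below. The 300 ms value and the function names are assumptions for illustration; the patent only requires the interval to be compared against some threshold period of time.

```kotlin
enum class TouchGesture { SINGLE_TOUCH, DOUBLE_TOUCH }

// Hypothetical threshold; the description only requires "less than a threshold period of time".
const val DOUBLE_TOUCH_THRESHOLD_MS = 300L

// A second touch arriving within the threshold makes a "double touch" (interpreted as selection);
// otherwise the touch stands alone and may, for example, identify the item for movement.
fun classifyTouch(firstTouchMs: Long, secondTouchMs: Long?): TouchGesture =
    if (secondTouchMs != null && secondTouchMs - firstTouchMs < DOUBLE_TOUCH_THRESHOLD_MS)
        TouchGesture.DOUBLE_TOUCH
    else
        TouchGesture.SINGLE_TOUCH

fun main() {
    println(classifyTouch(0L, 180L))   // DOUBLE_TOUCH -> interpreted as selection
    println(classifyTouch(0L, null))   // SINGLE_TOUCH -> may identify the item for movement
}
```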
The processor 12 is configured to perform an action in response to selection of a graphical item 50, 60. The action performed depends upon which graphical item 50, 60 is selected. In this example, selection of the first graphical item 50 causes the processor 12 to perform a first function, and selection of the second graphical item 60 causes the processor 12 to perform a second function that is different from the first function.
The first graphical item 50 and the second graphical item 60 may relate to different software applications. Selection of the first graphical item 50 may cause the processor 12 to open a first software application. Selection of the second graphical item 60 may cause the processor 12 to open a second software application.
Fig. 5A illustrates an example of the first graphical item 50. In this example, the first graphical item 50 relates to a messaging software application. The graphical appearance of the first graphical item 50 in Fig. 5A indicates to the user that it relates to the messaging software application. Selection of the first graphical item 50 causes the processor 12 to open the messaging software application. The messaging software application may, for example, provide the user with access to sent and received messages.
Fig. 5B illustrates an example of the second graphical item 60. In this example, the second graphical item 60 relates to a call log software application. The graphical appearance of the second graphical item 60 in Fig. 5B indicates to the user that it relates to the call log software application. Selection of the second graphical item 60 causes the processor 12 to open the call log software application. The call log software application may, for example, provide the user with access to one or more call logs (for example, a log of received calls, a log of missed calls, a log of dialled numbers, etc.).
The method enables the user to create a third graphical item 70 by combining the first graphical item 50 and the second graphical item 60. This is achieved by moving the first graphical item 50 towards the second graphical item 60, by moving the second graphical item 60 towards the first graphical item 50, or by moving the first graphical item 50 and the second graphical item 60 towards one another simultaneously.
Before moving a graphical item 50, 60, the user provides user input that causes the processor 12 to identify one or more graphical items for movement. The user input provided to identify one or more graphical items for movement is different from the user input provided to select a graphical item 50, 60.
For example, in embodiments of the invention in which a "double touch" at a graphical item 50, 60 is interpreted as selection of that graphical item 50, 60, a single touch at a graphical item 50, 60 may cause the processor 12 to identify that graphical item 50, 60 for movement.
In embodiments of the invention in which a single touch at a graphical item 50, 60 is interpreted as selection of that graphical item 50, 60, a prolonged touch at the graphical item 50, 60 (lasting longer than a threshold period of time) may cause the processor 12 to identify that graphical item 50, 60 for movement.
Alternatively, some other form of user input (selection of a different graphical item on the display 22, or actuation of a key separate from the display 22) may be required before a single touch at a graphical item 50, 60 causes the processor 12 to identify that graphical item 50, 60 for movement.
At block 302 of Fig. 3, the user provides user input by placing a first user finger 120 at the first graphical item 50 (at the first position 51), and provides user input by placing a second user finger 130 at the second graphical item 60 (at the second position 61). In this example, the first user finger 120 is a finger other than the thumb, and the second user finger 130 is the thumb. Fig. 4A illustrates the first user finger 120 at the first position 51 and the second user finger 130 at the second position 61.
In some embodiments of the invention, only one of the first graphical item 50 and the second graphical item 60 is movable. In this example, however, both the first graphical item 50 and the second graphical item 60 are movable.
When the user touches the first graphical item 50 at the first position 51 with the first user finger 120, the processor 12 identifies the first graphical item 50 for movement. The processor 12 may indicate to the user that the first graphical item 50 has been identified for movement, for example by highlighting the first graphical item 50. Similarly, when the user touches the second graphical item 60 at the second position 61 with the second finger 130, the processor 12 identifies the second graphical item 60 for movement. The processor 12 may indicate to the user that the second graphical item 60 has been identified for movement, for example by highlighting the second graphical item 60.
Once a graphical item 50, 60 has been identified for movement using a user finger 120, 130, movement of that user finger 120, 130 across the display 22 (without removing it from the display 22) causes the graphical item 50, 60 to move around the display 22 in tandem with the user finger 120, 130. That is, the user may drag a graphical item 50, 60 around the display 22 by moving a user finger 120, 130 across the display 22.
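The dragging behaviour, where an identified graphical item tracks the finger that identified it, can be modelled as follows. This is an illustrative Kotlin sketch with assumed data types, not an implementation taken from the patent.

```kotlin
data class Point(val x: Float, val y: Float)
data class GraphicalItem(val id: String, var position: Point)

// While a finger identified this item for movement, keep the item under the finger.
fun dragItem(item: GraphicalItem, fingerPosition: Point) {
    item.position = fingerPosition
}

fun main() {
    val first = GraphicalItem("messaging", Point(40f, 200f))
    dragItem(first, Point(120f, 200f))   // finger moved across the display
    println(first)                        // item has tracked the finger
}
```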
In the example illustrated in Figs. 4A and 4B, at block 303 of Fig. 3, the user performs a "pinching movement" by moving his finger 120 (a finger other than the thumb) and his thumb 130 towards one another. The processor 12 detects the movement of the finger 120 across the display 22 towards the second graphical item 60, and controls the display 22 to move the first graphical item 50 towards the second graphical item 60 in tandem with the finger 120. For example, the processor 12 may control the display 22 so that the first graphical item 50 remains closest to the portion of the finger 120 that is in contact with the display 22 (for instance, at least partially underneath it).
The processor 12 also detects the movement of the thumb 130 across the display 22 towards the first graphical item 50, and controls the display 22 to move the second graphical item 60 towards the first graphical item 50 in tandem with the thumb 130. For example, the processor 12 may control the display 22 so that the second graphical item 60 remains closest to the portion of the thumb 130 that is in contact with the display 22 (for instance, at least partially underneath it).
Since the user moves the finger 120 and the thumb 130 towards one another simultaneously, the processor 12 moves the first graphical item 50 and the second graphical item 60 towards one another simultaneously. By moving the first graphical item 50 and the second graphical item 60 towards one another, the processor 12 reduces the distance 100 between them on the display 22.
In embodiments of the invention in which one of the graphical items 50, 60 is not moved (or is not movable), movement of only one of the graphical items 50, 60 causes the distance 100 between them to be reduced.
In some embodiments of the invention, when the processor 12 determines that the first graphical item 50 is proximate the second graphical item 60, it controls the display 22 to display the first graphical item 50 and the second graphical item 60 overlapping or merging. Fig. 4B illustrates an embodiment of the invention in which the processor 12 controls the display 22 to display the first graphical item 50 and the second graphical item 60 merging.
The processor 12 may determine that the first graphical item 50 is proximate the second graphical item 60 when the distance 100 between the first graphical item 50 and the second graphical item 60 is less than a threshold value. The threshold value may, for example, be zero or non-zero.
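A Kotlin sketch of that proximity determination is shown below. The 24-pixel threshold is purely an assumed value; as stated above, the threshold could equally be zero.

```kotlin
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)

// Hypothetical proximity threshold in pixels; the patent only says it may be zero or non-zero.
const val PROXIMITY_THRESHOLD = 24f

fun distance(a: Point, b: Point): Float = hypot(a.x - b.x, a.y - b.y)

// The first item is "proximate" the second once the reduced distance falls below the threshold.
fun isProximate(first: Point, second: Point): Boolean =
    distance(first, second) < PROXIMITY_THRESHOLD

fun main() {
    println(isProximate(Point(100f, 100f), Point(300f, 100f)))   // false: still far apart
    println(isProximate(Point(290f, 100f), Point(300f, 100f)))   // true: pinch almost complete
}
```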
In some embodiments of the invention, after the display 22 has displayed the first graphical item 50 and the second graphical item 60 overlapping or merging, the user may increase the distance 100 between them by keeping his fingers 120, 130 on the display 22 and moving them apart. When the processor 12 detects that the first graphical item 50 is no longer proximate the second graphical item 60, it controls the display to remove the overlap or merger.
In this example, when the user has moved the first graphical item 50 and/or the second graphical item 60 such that the first graphical item 50 is proximate the second graphical item 60 (as illustrated in Fig. 4B), he removes his fingers 120, 130 from the display 22.
At block 304 of Fig. 3, the processor 12 detects that the fingers 120, 130 have been removed from the display 22 while the first graphical item 50 is proximate the second graphical item 60. In response, the processor 12 controls the display 22 to display a third graphical item 70 that is different from the first graphical item 50 and the second graphical item 60. This is illustrated in Fig. 4C.
In this example, when the processor 12 controls the display 22 to display the third graphical item 70, it also controls the display 22 to remove the first graphical item 50 and the second graphical item 60. In other words, the processor 12 groups the first graphical item 50 and the second graphical item 60 together by replacing them on the display 22 with the third graphical item 70. The graphical appearance of the third graphical item 70 may indicate to the user that it relates to the combination of the first graphical item 50 and the second graphical item 60.
Fig. 5C illustrates an example of the third graphical item 70. In this example, the third graphical item 70 includes the graphical appearances of the first graphical item 50 and the second graphical item 60, at a smaller scale than the first graphical item 50 and the second graphical item 60.
Figs. 7A, 7B and 7C illustrate the actions performed by the processor 12 when the exemplary first graphical item 50, second graphical item 60 and third graphical item 70 of Figs. 5A to 5C are selected.
In this example, the processor 12 is configured to provide a hierarchical menu. The first graphical item 50 and the second graphical item 60 are individually selectable items in a particular level of the menu. The third graphical item 70 is an individually selectable item in a different (higher) level of the menu than the first graphical item 50 and the second graphical item 60.
The third graphical item 70 may provide the user with access to the first graphical item 50 and the second graphical item 60. Fig. 7C indicates an example in which selection of the third graphical item 70 (as indicated by the arrow 78 in the figure) causes the processor 12 to control the display 22 to display the first graphical item 50 and the second graphical item 60, which may then be selected individually. The third graphical item 70 may be removed from the display after its selection.
Fig. 7A illustrates a plurality of graphical items 52-55 that are at a lower level of the menu structure than the first graphical item 50 and are accessible via the first graphical item 50. The graphical items 52-55 are presented to the user in response to selection of the first graphical item 50. Selection of the first graphical item 50 is indicated by the arrow 58 in Fig. 7A.
Fig. 7B illustrates a plurality of graphical items 62-65 that are at a lower level of the menu structure than the second graphical item 60 and are accessible via the second graphical item 60. The graphical items 62-65 are presented to the user in response to selection of the second graphical item 60. Selection of the second graphical item 60 is indicated by the arrow 68 in Fig. 7B.
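The two-level grouping that Figs. 7A to 7C describe can be modelled as a small hierarchy in which the third graphical item replaces the first and second items at the top level and exposes them again when selected. The following Kotlin sketch uses assumed names and is only an illustration of the structure, not the patent's implementation.

```kotlin
// Minimal model of the two-level menu: a group node replaces the two grouped items
// at the top level and exposes them again when it is selected.
sealed class MenuEntry(val label: String)
class Item(label: String) : MenuEntry(label)
class Group(label: String, val members: List<MenuEntry>) : MenuEntry(label)

fun group(topLevel: MutableList<MenuEntry>, a: MenuEntry, b: MenuEntry, label: String): Group {
    val g = Group(label, listOf(a, b))
    topLevel.remove(a)
    topLevel.remove(b)
    topLevel.add(g)          // third graphical item shown in place of the first and second
    return g
}

fun select(entry: MenuEntry): List<MenuEntry> = when (entry) {
    is Group -> entry.members                      // selection reveals the grouped items
    is Item -> { println("open ${entry.label}"); emptyList() }
}

fun main() {
    val messaging = Item("messaging")
    val callLog = Item("call log")
    val menu = mutableListOf<MenuEntry>(messaging, callLog, Item("contacts"))
    val g = group(menu, messaging, callLog, "communications")
    println(menu.map { it.label })        // [contacts, communications]
    println(select(g).map { it.label })   // [messaging, call log]
}
```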
Advantageously, embodiments of the invention enable the user to organize graphical items in an intuitive manner, according to his preferences. For example, if the user selects the first graphical item 50 and the second graphical item 60 infrequently, he may choose to group them together by creating the third graphical item 70. The removal of the first graphical item 50 and the second graphical item 60 when the third graphical item 70 is created may free up space on the display 22 that can be used to display graphical items that the user selects more often.
Figs. 6A to 6C illustrate a first graphical item 150, a second graphical item 160 and a third graphical item 170 that are alternatives to those illustrated in Figs. 5A to 5C. Figs. 6A to 6C relate to alternative embodiments of the invention in which the first graphical item 150 and the second graphical item 160 are combined to create a playlist.
In these embodiments of the invention, the first graphical item 150 represents first media content and the second graphical item 160 represents second media content. The first media content and the second media content may be audio content (such as music files), visual content (such as image files) or audiovisual content (such as video files). In this particular example, the first media content and the second media content are audio content.
The graphical appearance of the first graphical item 150 illustrated in Fig. 6A indicates to the user that it relates to first audio content, which in this example is the song "Gotta Be Somebody" by Nickelback. Similarly, the graphical appearance of the second graphical item 160 indicates to the user that it relates to second audio content, which in this example is the song "Leave Out All The Rest" by Linkin Park.
The first graphical item 150 and the second graphical item 160 may be individually selectable. For example, selection of the first graphical item 150 may cause the processor 12 to control audio output apparatus to play back the first audio content. Selection of the second graphical item 160 may cause the processor 12 to control the audio output apparatus to play back the second audio content.
The user may combine the first graphical item 150 and the second graphical item 160 illustrated in Figs. 6A and 6B in the manner described above in relation to Figs. 3 to 4C. When this combination is performed, the processor 12 controls the display 22 to display the third graphical item 170 illustrated in Fig. 6C. The third graphical item 170 relates to a playlist containing the first audio content and the second audio content. The first audio content and the second audio content are accessible via the third graphical item 170. For example, selection of the third graphical item 170 may cause the processor 12 to control the audio output apparatus to play back the first audio content and the second audio content, one after the other.
When the processor 12 controls the display 22 to display the third graphical item 170, the first graphical item 150 and the second graphical item 160 may or may not be removed from the display.
When the processor 12 controls the display to display the third graphical item 170, it may also control the display 22 to display a prompt requesting that the user name the playlist. For example, the processor 12 may control the display 22 to display a virtual keyboard, enabling the user to name the playlist. In the example illustrated in Fig. 6C, the playlist has been named "Rock".
The first graphical item 150 and the second graphical item 160 may be considered to be part of a particular level in a hierarchical menu structure. The third graphical item 170 may be considered to be in a different (higher) level of the hierarchical menu structure than the first graphical item 150 and the second graphical item 160.
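A Kotlin sketch of the playlist variant, under the same caveat that the types and names are assumptions rather than anything specified in the patent, might look like this: grouping the two audio items yields a named playlist whose selection plays its contents in sequence.

```kotlin
data class Track(val title: String, val artist: String)
data class Playlist(var name: String, val tracks: MutableList<Track>)

// Grouping the two audio items creates a playlist represented by the third graphical item;
// selecting that item plays the contents one after the other.
fun combineIntoPlaylist(first: Track, second: Track, name: String) =
    Playlist(name, mutableListOf(first, second))

fun play(playlist: Playlist) =
    playlist.tracks.forEach { println("playing ${it.artist} - ${it.title}") }

fun main() {
    val a = Track("Gotta Be Somebody", "Nickelback")
    val b = Track("Leave Out All The Rest", "Linkin Park")
    val rock = combineIntoPlaylist(a, b, "Rock")   // user is prompted to name the playlist
    play(rock)
}
```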
Advantageously, the embodiments of the invention illustrated in Figs. 6A to 6C enable the user to organize his media content in an intuitive manner.
Figs. 8A and 8B relate to embodiments of the invention in which the user may "ungroup" graphical items that have been grouped together. For example, the user may ungroup graphical items that he previously grouped together in the manner described above in relation to Figs. 3 to 7C.
Fig. 8A illustrates an example in which the processor 12 controls the display 22 to display the third graphical item 70 illustrated in Fig. 4C. The third graphical item 70 relates to the combination of the first graphical item 50 and the second graphical item 60 illustrated in Fig. 4A.
In order to ungroup the first graphical item 50 and the second graphical item 60, the user places a first user finger 120 and a second user finger 130 at the third graphical item 70. In this example, the first user finger 120 is a finger other than the thumb, and the second user finger 130 is the thumb. When the processor 12 detects the first user finger and the second user finger on the display 22, it identifies the third graphical item 70 for ungrouping.
In this particular example, after the third graphical item 70 has been identified for ungrouping, the user moves the finger 120 and the thumb 130 away from one another simultaneously, while keeping them at the display 22. The processor 12 detects the movement of the finger 120 and the thumb 130 across the display 22. Once the processor 12 detects that the finger 120 and the thumb 130 are more than a threshold distance apart, it controls the display 22 to display the first graphical item 50 and the second graphical item 60, and removes the third graphical item 70 from the display. The first graphical item 50 is displayed at the position of the finger 120 on the display 22, and the second graphical item 60 is displayed at the position of the thumb 130 on the display 22.
The processor 12 moves the first graphical item 50 so that it remains closest to the portion of the finger 120 that is in contact with the display 22 (for instance, at least partially underneath it). This enables the user to position the first graphical item 50 at an appropriate place on the display 22.
The processor 12 moves the second graphical item 60 so that it remains closest to the portion of the thumb 130 that is in contact with the display 22 (for instance, at least partially underneath it). This enables the user to position the second graphical item 60 at an appropriate place on the display 22. Fig. 8B illustrates the first graphical item 50 and the second graphical item 60 after they have been ungrouped and separated by a distance 100.
In some embodiments of the invention, it is not necessary for the user to move the finger 120 and the thumb 130 away from one another simultaneously in order to perform the ungrouping. For example, the user may instead keep the finger 120 or the thumb 130 static relative to the other digit 130/120 and move only that other digit 130/120.
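The ungrouping condition, namely the two digits moving more than a threshold distance apart while on the third graphical item, can be sketched in Kotlin as follows. The 80-pixel threshold is an assumed value for illustration only.

```kotlin
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)
data class UngroupResult(val firstItemAt: Point, val secondItemAt: Point)

// Hypothetical spread threshold; the patent only requires "more than a threshold" apart.
const val UNGROUP_THRESHOLD = 80f

// Once the two digits placed on the group icon move further apart than the threshold,
// the group icon is removed and the two original items reappear under the digits.
fun tryUngroup(fingerA: Point, fingerB: Point): UngroupResult? =
    if (hypot(fingerA.x - fingerB.x, fingerA.y - fingerB.y) > UNGROUP_THRESHOLD)
        UngroupResult(firstItemAt = fingerA, secondItemAt = fingerB)
    else
        null   // digits not far enough apart yet; keep showing the third graphical item

fun main() {
    println(tryUngroup(Point(200f, 300f), Point(230f, 300f)))   // null
    println(tryUngroup(Point(150f, 300f), Point(320f, 300f)))   // items placed under each digit
}
```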
References to "tangible computer-readable storage medium", "computer program product", "computer", "processor", etc. should be understood to encompass not only computers having different architectures, such as single/multi-processor architectures and sequential (von Neumann)/parallel architectures, but also specialised circuits such as field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), signal processing devices and other devices. References to computer program, instructions, code, etc. should be understood to encompass software for a programmable processor, or firmware such as the programmable content of a hardware device, whether instructions for a processor or configuration settings for a fixed-function device, gate array or programmable logic device, etc.
The blocks illustrated in Fig. 3 may represent steps in a method and/or sections of code in the computer program 16. The illustration of a particular order for the blocks does not necessarily imply that there is a required or preferred order for them; the order and arrangement of the blocks may be varied. Furthermore, some steps may be omitted.
Although embodiments of the present invention have been described in the preceding paragraphs with reference to various examples, it should be appreciated that modifications to the examples given can be made without departing from the scope of the invention as claimed. For example, the first graphical item 50/150, the second graphical item 60/160 and the third graphical item 70/170 may have a size and/or shape different from those illustrated in Figs. 5A to 6C. In some embodiments of the invention, some or all of the illustrated graphical items 50/150, 60/160 and 70/170 may differ from one another in size and shape.
The description above describes combining the first graphical item 50/150 and the second graphical item 60/160 to create the third graphical item 70/170. In some embodiments of the invention, the user may add further items to the group by performing the actions described above (and illustrated in Figs. 4A and 4B). For example, the user may combine a fourth graphical item with the third graphical item 70 to create a new, fifth graphical item. The fifth graphical item represents the combination of the first graphical item, the second graphical item and the fourth graphical item. In some embodiments of the invention, the fourth graphical item may itself have been created in response to the user grouping items that were previously presented as separate items on the display 22.
In the context of the example of Figs. 5A to 5C, the fourth graphical item may, for example, represent a contacts software application and/or an internet browsing software application. In the context of the example of Figs. 6A to 6C, the fourth graphical item may, for example, represent a playlist for media content, or an item of media content.
The skilled person will also appreciate that the user need not necessarily physically touch the display 22 in order to provide user input. For example, in some embodiments of the invention the display 22 may employ capacitive sensing technology. In such embodiments, user input may be detected when the user places a finger close to, but not in contact with, the display 22.
Features described in the preceding description may be used in combinations other than the combinations explicitly described.
Although functions have been described with reference to certain features, those functions may be performable by other features, whether described or not.
Although features have been described with reference to certain embodiments, those features may also be present in other embodiments, whether described or not.
Whilst endeavouring in the foregoing specification to draw attention to those features of the invention believed to be of particular importance, it should be understood that the applicant claims protection in respect of any patentable feature or combination of features referred to above and/or shown in the drawings, whether or not particular emphasis has been placed thereon.
Claims (25)
1. An apparatus for organizing graphical items, comprising:
at least one processor; and
at least one memory storing computer program instructions, the computer program instructions being configured to operate with the at least one processor to cause the apparatus to:
cause a touch-sensitive display to display a first graphical item and a second graphical item, wherein the first graphical item is at a first position, the second graphical item is at a second position, and the second graphical item is separated from the first graphical item by a first distance greater than a threshold, and wherein a first touch input at the first position causes a first software application to be opened, a second touch input at the first position causes the first graphical item to be identified for movement, a third touch input at the second position causes a second software application to be opened, and a fourth touch input causes the second graphical item to be identified for movement;
respond to the second touch input by identifying the first graphical item at the first position for movement;
cause the touch-sensitive display to visually indicate that the first graphical item has been identified for movement;
after determining that the first distance has been reduced such that the first graphical item is within a distance of the second graphical item that is less than the threshold, create a hierarchy in which the first graphical item and the second graphical item are grouped together and are represented by a third graphical item on the touch-sensitive display; and
respond to selection of the third graphical item by providing access to the first graphical item and the second graphical item, the first graphical item being selectable to open the first software application and the second graphical item being selectable to open the second software application.
2. The apparatus according to claim 1, wherein the first touch input is a touch input lasting less than a threshold period of time, and the second touch input is a touch input lasting longer than the threshold period of time.
3. The apparatus according to claim 2, wherein the third touch input is a touch input lasting less than the threshold period of time, and the fourth touch input is a touch input lasting longer than the threshold period of time.
4. The apparatus according to claim 1, 2 or 3, wherein the first distance is reduced by moving the first graphical item towards the second graphical item in response to user input at the touch-sensitive display.
5. The apparatus according to claim 1, 2 or 3, wherein the first distance is reduced by moving the first graphical item simultaneously with, and across the touch-sensitive display in tandem with, a user finger, the user finger not having been removed from the touch-sensitive display after being used to provide the second touch input at the first position.
6. The apparatus according to claim 5, wherein the second graphical item is static, and the first graphical item moves simultaneously with the user finger moving across the touch-sensitive display.
7. The apparatus according to claim 5, wherein the computer program instructions are configured to operate with the at least one processor to cause the apparatus to: after determining that the first distance has been reduced such that the first graphical item is within a distance of the second graphical item that is less than the threshold, cause the first graphical item and the second graphical item to be displayed merged on the touch-sensitive display.
8. The apparatus according to claim 1, 2 or 3, wherein, after the third graphical item has been displayed on the touch-sensitive display, the first graphical item and the second graphical item are removed from the display.
9. The apparatus according to claim 1, 2 or 3, further comprising the touch-sensitive display.
10. The apparatus according to claim 9, wherein the apparatus is a hand-held portable electronic device.
11. A method for organizing graphical items, comprising:
causing a touch-sensitive display to display a first graphical item and a second graphical item, wherein the first graphical item is at a first position, the second graphical item is at a second position, and the second graphical item is separated from the first graphical item by a first distance greater than a threshold, and wherein a first touch input at the first position causes a first software application to be opened, a second touch input at the first position causes the first graphical item to be identified for movement, a third touch input at the second position causes a second software application to be opened, and a fourth touch input causes the second graphical item to be identified for movement;
responding to the second touch input by identifying the first graphical item at the first position for movement;
causing the touch-sensitive display to visually indicate that the first graphical item has been identified for movement;
after determining that the first distance has been reduced such that the first graphical item is within a distance of the second graphical item that is less than the threshold, creating a hierarchy in which the first graphical item and the second graphical item are grouped together and are represented by a third graphical item on the touch-sensitive display; and
responding to selection of the third graphical item by providing access to the first graphical item and the second graphical item, the first graphical item being selectable to open the first software application and the second graphical item being selectable to open the second software application.
12. The method according to claim 11, wherein the first touch input is a touch input lasting less than a threshold period of time, and the second touch input is a touch input lasting longer than the threshold period of time.
13. The method according to claim 12, wherein the third touch input is a touch input lasting less than the threshold period of time, and the fourth touch input is a touch input lasting longer than the threshold period of time.
14. The method according to claim 11, 12 or 13, wherein the first distance is reduced by moving the first graphical item towards the second graphical item in response to user input at the touch-sensitive display.
15. The method according to claim 11, 12 or 13, wherein the first distance is reduced by moving the first graphical item simultaneously with, and across the touch-sensitive display in tandem with, a user finger, the user finger not having been removed from the touch-sensitive display after being used to provide the second touch input at the first position.
16. The method according to claim 15, wherein the second graphical item is static, and the first graphical item moves simultaneously with the user finger moving across the touch-sensitive display.
17. An apparatus for organizing graphical items, comprising:
means for causing a touch-sensitive display to display a first graphical item and a second graphical item, wherein the first graphical item is at a first position, the second graphical item is at a second position, and the second graphical item is separated from the first graphical item by a first distance greater than a threshold, and wherein a first touch input at the first position causes a first software application to be opened, a second touch input at the first position causes the first graphical item to be identified for movement, a third touch input at the second position causes a second software application to be opened, and a fourth touch input causes the second graphical item to be identified for movement;
means for responding to the second touch input by identifying the first graphical item at the first position for movement;
means for causing the touch-sensitive display to visually indicate that the first graphical item has been identified for movement;
means for creating, after determining that the first distance has been reduced such that the first graphical item is within a distance of the second graphical item that is less than the threshold, a hierarchy in which the first graphical item and the second graphical item are grouped together and are represented by a third graphical item on the touch-sensitive display; and
means for responding to selection of the third graphical item by providing access to the first graphical item and the second graphical item, the first graphical item being selectable to open the first software application and the second graphical item being selectable to open the second software application.
18. The apparatus according to claim 17, wherein the first touch input is a touch input lasting less than a threshold period of time, and the second touch input is a touch input lasting longer than the threshold period of time.
19. The apparatus according to claim 18, wherein the third touch input is a touch input lasting less than the threshold period of time, and the fourth touch input is a touch input lasting longer than the threshold period of time.
20. An apparatus for organizing graphical items, comprising:
at least one processor; and
at least one memory storing computer program instructions, the computer program instructions being configured to operate with the at least one processor to cause the apparatus to:
cause a touch-sensitive display to display a menu comprising a first graphical item and a second graphical item, the first graphical item being selectable to open a first software application and the second graphical item being selectable to open a second software application, wherein the first graphical item is at a first position, the second graphical item is at a second position, and the second graphical item is separated from the first graphical item by a first distance greater than a threshold;
detect a first user finger at the first position;
in response to detecting movement of the first user finger across the touch-sensitive display, reduce the first distance by moving the first graphical item across the touch-sensitive display;
after determining that the first distance has been reduced such that the first graphical item is within a distance of the second graphical item that is less than the threshold, create a hierarchy in the menu, the hierarchy comprising the first graphical item, the second graphical item and a third graphical item representing a grouping of the first graphical item and the second graphical item, wherein the third graphical item is in a first level of the hierarchy and the first graphical item and the second graphical item are in a second level of the hierarchy, different from the first level of the hierarchy; and
respond to selection of the third graphical item in the first level of the hierarchy by providing access to the second level of the hierarchy, in which the first graphical item is selectable to open the first software application and the second graphical item is selectable to open the second software application.
21. The apparatus according to claim 20, wherein the computer program instructions are configured to operate with the at least one processor to cause the apparatus to: before the first graphical item is moved across the touch-sensitive display, detect user input that identifies the first graphical item for movement, wherein the user input that identifies the first graphical item for movement is different from the user input provided to select the first graphical item.
22. A method for organizing graphical items, comprising:
causing a touch-sensitive display to display a menu comprising a first graphical item and a second graphical item, the first graphical item being selectable to open a first software application and the second graphical item being selectable to open a second software application, wherein the first graphical item is at a first position, the second graphical item is at a second position, and the second graphical item is separated from the first graphical item by a first distance greater than a threshold;
detecting a first user finger at the first position;
in response to detecting movement of the first user finger across the touch-sensitive display, reducing the first distance by moving the first graphical item across the touch-sensitive display;
after determining that the first distance has been reduced such that the first graphical item is within a distance of the second graphical item that is less than the threshold, creating a hierarchy in the menu, the hierarchy comprising the first graphical item, the second graphical item and a third graphical item representing a grouping of the first graphical item and the second graphical item, wherein the third graphical item is in a first level of the hierarchy and the first graphical item and the second graphical item are in a second level of the hierarchy, different from the first level of the hierarchy; and
responding to selection of the third graphical item in the first level of the hierarchy by providing access to the second level of the hierarchy, in which the first graphical item is selectable to open the first software application and the second graphical item is selectable to open the second software application.
23. The method according to claim 22, further comprising: before the first graphical item is moved across the touch-sensitive display, detecting user input that identifies the first graphical item for movement, wherein the user input that identifies the first graphical item for movement is different from the user input provided to select the first graphical item.
24. An apparatus for organizing graphical items, comprising:
means for causing a touch-sensitive display to display a menu comprising a first graphical item and a second graphical item, the first graphical item being selectable to open a first software application and the second graphical item being selectable to open a second software application, wherein the first graphical item is at a first position, the second graphical item is at a second position, and the second graphical item is separated from the first graphical item by a first distance greater than a threshold;
means for detecting a first user finger at the first position;
means for reducing the first distance, in response to detecting movement of the first user finger across the touch-sensitive display, by moving the first graphical item across the touch-sensitive display;
means for creating, after determining that the first distance has been reduced such that the first graphical item is within a distance of the second graphical item that is less than the threshold, a hierarchy in the menu, the hierarchy comprising the first graphical item, the second graphical item and a third graphical item representing a grouping of the first graphical item and the second graphical item, wherein the third graphical item is in a first level of the hierarchy and the first graphical item and the second graphical item are in a second level of the hierarchy, different from the first level of the hierarchy; and
means for responding to selection of the third graphical item in the first level of the hierarchy by providing access to the second level of the hierarchy, in which the first graphical item is selectable to open the first software application and the second graphical item is selectable to open the second software application.
25. The apparatus according to claim 24, further comprising: means for detecting, before the first graphical item is moved across the touch-sensitive display, user input that identifies the first graphical item for movement, wherein the user input that identifies the first graphical item for movement is different from the user input provided to select the first graphical item.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510919403.0A CN105511726B (en) | 2010-01-20 | 2010-01-20 | user input |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201080062072.9A CN102770835B (en) | 2010-01-20 | 2010-01-20 | Method and apparatus for organizing graphical items
CN201510919403.0A CN105511726B (en) | 2010-01-20 | 2010-01-20 | user input |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201080062072.9A Division CN102770835B (en) | 2010-01-20 | 2010-01-20 | Method and apparatus for organizing graphical items
Publications (2)
Publication Number | Publication Date |
---|---|
CN105511726A CN105511726A (en) | 2016-04-20 |
CN105511726B true CN105511726B (en) | 2019-02-12 |
Family
ID=55747930
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510919403.0A Active CN105511726B (en) | 2010-01-20 | 2010-01-20 | user input |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105511726B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101344848A (en) * | 2007-07-12 | 2009-01-14 | 辉达公司 | Management of icons in a display interface |
CN101527745A (en) * | 2008-03-07 | 2009-09-09 | 三星电子株式会社 | User interface method and apparatus for mobile terminal having touchscreen |
CN101561758A (en) * | 2009-05-20 | 2009-10-21 | 深圳北控信息发展有限公司 | Method for classifying application program of embedded terminal |
CN102207788A (en) * | 2010-02-19 | 2011-10-05 | 微软公司 | Radial menus with bezel gestures |
CN102257794A (en) * | 2008-12-18 | 2011-11-23 | 诺基亚公司 | Mobile communication device with a sliding display screen and screen-dividing member |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101651134B1 (en) * | 2010-06-24 | 2016-08-29 | 엘지전자 주식회사 | Mobile terminal and group operation control method thereof |
-
2010
- 2010-01-20 CN CN201510919403.0A patent/CN105511726B/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101344848A (en) * | 2007-07-12 | 2009-01-14 | 辉达公司 | Management of icons in a display interface |
CN101527745A (en) * | 2008-03-07 | 2009-09-09 | 三星电子株式会社 | User interface method and apparatus for mobile terminal having touchscreen |
CN102257794A (en) * | 2008-12-18 | 2011-11-23 | 诺基亚公司 | Mobile communication device with a sliding display screen and screen-dividing member |
CN101561758A (en) * | 2009-05-20 | 2009-10-21 | 深圳北控信息发展有限公司 | Method for classifying application program of embedded terminal |
CN102207788A (en) * | 2010-02-19 | 2011-10-05 | 微软公司 | Radial menus with bezel gestures |
Also Published As
Publication number | Publication date |
---|---|
CN105511726A (en) | 2016-04-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10198173B2 (en) | User input | |
US11567640B2 (en) | Gesture-alteration of media files | |
CN108476168B (en) | Applying confirmation options in a graphical messaging user interface | |
KR102636000B1 (en) | Applying acknowledgement of options in a graphical messaging user interface | |
US8413075B2 (en) | Gesture movies | |
JP5483908B2 (en) | Portable electronic device with interface reconfiguration mode | |
TWI459282B (en) | Method and system and computer readable product for providing a user interface for accessing multimedia items | |
CN104685470B (en) | For the device and method from template generation user interface | |
US8269736B2 (en) | Drop target gestures | |
KR101544364B1 (en) | Mobile terminal having dual touch screen and method for controlling contents thereof | |
US8508475B2 (en) | User interface elements positioned for display | |
EP2804178A1 (en) | Reproduction of file series | |
EP3336672A1 (en) | Method and apparatus for providing graphic user interface in mobile terminal | |
US20100175008A1 (en) | Apparatus and method for playing of multimedia item | |
CN103026329B (en) | For the method and apparatus controlling user interface | |
US20170060374A1 (en) | Combined Tablet Screen Drag-and-drop Interface | |
KR101377735B1 (en) | Electronic device and method for allowing a user to select a menu option | |
WO2016183912A1 (en) | Menu layout arrangement method and apparatus | |
WO2011079438A1 (en) | An apparatus, method, computer program and user interface | |
AU2017330785A1 (en) | Electronic apparatus and controlling method thereof | |
JP6405143B2 (en) | Content display apparatus and display method | |
CN105511726B (en) | user input | |
US20090327968A1 (en) | Apparatus and method for enabling user input | |
CN106062667A (en) | Apparatus and method for processing user input | |
US20150095778A1 (en) | Media content management |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |