US20140359538A1 - Systems and methods for moving display objects based on user gestures - Google Patents
Systems and methods for moving display objects based on user gestures
- Publication number
- US20140359538A1 (application US 13/903,056)
- Authority
- US
- United States
- Prior art keywords
- output device
- user
- user gesture
- gesture
- display object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Certain embodiments herein relate to systems and methods for moving display objects based on user gestures. In one embodiment, a system can include at least one memory configured to store computer-executable instructions and at least one control device configured to access the at least one memory and execute the computer-executable instructions. The instructions may be configured to detect a first user gesture adjacent to an output device in order to identify a display object displayed on the output device. The instructions may be configured to detect a second user gesture adjacent to the output device in order to identify a location to move the display object. The instructions may be configured to update the output device to display the display object at the identified location on the output device.
Description
- Embodiments of the disclosure generally relate to moving display objects displayed on an output device, and more particularly, to systems and methods for moving display objects based on user gestures.
- It has become increasingly popular to provide touch-sensitive displays in mobile and communication-type computing devices, such as handheld tablets or smartphones. Typically, objects being displayed on such devices can be moved by touching and dragging the object using one or more fingers. While users of relatively smaller computing devices can comfortably use conventional touch-and-drag gestures to move objects on associated displays, relatively large surface computers have much larger display areas, making it relatively uncomfortable for users to move display objects using conventional touch-and-drag gestures.
- Some or all of the above needs and/or problems may be addressed by certain embodiments of the disclosure. Certain embodiments may include systems and methods for moving display objects based on user gestures, such as objects displayed on an output device of a surface computer. According to one embodiment of the disclosure, there is disclosed a system. The system may include at least one memory configured to store computer-executable instructions and at least one control device configured to access the at least one memory and execute the computer-executable instructions. The instructions may be configured to detect a first user gesture adjacent to an output device of a surface computer to identify a display object displayed on the output device. The instructions may be further configured to detect a second user gesture adjacent to the output device identifying a location to move the display object on the output device. The instructions may further be configured to update the output device to display the display object at the identified location on the output device.
- According to another embodiment of the disclosure, there is disclosed a method. The method can include detecting, by a control device, a first user gesture adjacent to an output device of a surface computer identifying at least one display object displayed on the output device. The method may further include detecting, by the control device, a second user gesture adjacent to the output device identifying a location to move the display object. The method may also include updating, by the control device, the output device to display the identified display object at the identified location on the output device.
- Other embodiments, systems, methods, aspects, and features of the disclosure will become apparent to those skilled in the art from the following detailed description, the accompanying drawings, and the appended claims.
- The detailed description is set forth with reference to the accompanying drawings, which are not necessarily drawn to scale. The use of the same reference numbers in different figures indicates similar or identical items.
- FIG. 1 illustrates an example system for moving display objects based on user gestures, according to an embodiment of the disclosure.
- FIG. 2 is a flow diagram of an example method for moving display objects based on user gestures, according to an embodiment of the disclosure.
- FIG. 3A is an example method for identifying a display object based on user gestures, according to an embodiment of the disclosure.
- FIG. 3B is an example method for identifying a location to move a display object based on user gestures, according to an embodiment of the disclosure.
- FIG. 3C is an example method for updating an output device to display an identified display object at an identified location on the output device, according to an embodiment of the disclosure.
- Illustrative embodiments of the disclosure will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the disclosure are shown. The disclosure may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements.
- Certain embodiments disclosed herein relate to moving one or more display objects based on user gestures. Accordingly, a system can be provided to facilitate moving display objects based upon detecting a first user gesture and a second user gesture generated by one or more users interacting with the output device. For example, a user may interact with the output device by a finger stroke and/or a finger tap adjacent to the surface of the output device. Based upon the first and second user gestures, a display object may be selected and a location to move the display object on the output device may be identified. Thereafter, the output device may be updated to display the identified display object at the identified location on the output device. One or more technical effects associated with certain embodiments herein may include, but are not limited to, reduced time and expense for a user to move display objects to new positions on a relatively large output device of a surface computer without employing traditional touch-and-drag methods as briefly described above. Furthermore, one or more technical effects associated with certain embodiments can include providing
- FIG. 1 depicts a block diagram of one example system 100 that facilitates moving display objects based on user gestures. According to an embodiment of the disclosure, the system 100 may include a surface computer 110 that includes an output device 120. The output device 120 may be configured to display to a user one or more display objects 130, such as, for instance, user interface controls, that may include text, colors, images, icons, and the like.
- With continued reference to FIG. 1, the surface computer 110 may further include one or more input devices 140 configured to detect and/or capture user gestures adjacent to the output device 120. In certain embodiments, the input devices 140 may include a user gesture capturing device, such as, for instance, one or more cameras and/or transparent ink pad controls disposed in close proximity to the output device 120. In certain embodiments, an input device 140 can include a gesture reader software module and/or a transparent ink pad user interface control. In any instance, the input devices 140 can be configured to detect a first user gesture and a second user gesture adjacent to the output device 120 and communicate them in real-time or near real-time to a control device, such as control device 150 in FIG. 1, via a network, such as network 105 in FIG. 1. In certain embodiments, the control device 150 may be configured to receive and analyze the first and second user gestures from the input devices 140.
- Based at least upon the first and second user gestures, the control device 150 may also be configured to identify and/or select a display object 130 and a location on the output device 120 to move the display object 130, and/or to generate and transmit to the surface computer 110, via network 105, an updated presentation for the output device 120 that displays the identified display object 130 at the identified location, as will be described.
- The control device 150 may include any number of suitable computer processing components that may, among other things, analyze user gestures detected by the input devices 140. Examples of suitable processing devices that may be incorporated into the control device 150 include, but are not limited to, personal computers, server computers, application-specific circuits, microcontrollers, minicomputers, other computing devices, and the like. As such, the control device 150 may include any number of processors 155 that facilitate the execution of computer-readable instructions. By executing computer-readable instructions, the control device 150 may include or form a special purpose computer or particular machine that facilitates processing of user gestures in order to move display objects displayed on the output device 120.
- In addition to one or more processor(s) 155, the control device 150 may include one or more memory devices 160, one or more input/output (“I/O”) interfaces 165, and/or one or more communications and/or network interfaces 170. The one or more memory devices 160 or memories may include any suitable memory devices, for example, caches, read-only memory devices, random access memory devices, magnetic storage devices, etc. The one or more memory devices 160 may store user gestures or other data, executable instructions, and/or various program modules utilized by the control device 150, for example, data files 170, an operating system (“OS”) 180, and/or a user gesture analyzer module 185. The data files 170 may include any suitable data that facilitates the operation of the control device 150 including, but not limited to, information associated with one or more detected user gestures and/or information associated with one or more control actions directed by the control device 150 based on detected user gestures. The OS 180 may include executable instructions and/or program modules that facilitate and/or control the general operation of the control device 150.
- Additionally, the OS 180 may facilitate the execution of other software programs and/or program modules by the processors 155, such as the user gesture analyzer module 185. The user gesture analyzer module 185 may be a suitable software module configured to analyze and/or process user gestures detected by the input devices 140. For instance, the user gesture analyzer module 185 may analyze user gestures detected by the input devices 140, which may be collected and stored in memory 160.
- According to one embodiment, the control device 150 may be configured to detect a first user gesture via the one or more input devices 140. For instance, upon viewing one or more display objects 130 displayed on the output device 120, a first user may generate a first user gesture using one or more fingers in order to select, or otherwise identify, a display object 130 the user would like to move. To do so, in one embodiment, a user may tap the screen of the output device 120 with a finger where the display object 130 is displayed in order to indicate that the user would like to move the display object 130 to another location on the output device 120. As another non-limiting example, the user may generate a finger stroke gesture on the screen of the output device 120 in order to identify the display object 130 the user would like to move.
- In certain embodiments, the input devices 140 may include one or more program modules that facilitate capturing detected user gestures and any other information associated with the user gestures. For instance, the input devices 140 may include one or more cameras that detect a user gesture. Thereafter, a user gesture reader software module may be executed and configured to automatically, or in response to some other trigger, transmit the captured user gesture and any other information associated with the user gesture to the control device 150 via network 105. Similarly, in another example, the input devices 140 may include one or more transparent ink pad controls; upon detecting a user gesture, the transparent ink pad control interface transmits the user gesture to the control device 150 via network 105.
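- As a rough illustration of what such a captured gesture message might contain, the following Python sketch defines a hypothetical gesture-event payload; the field names, the JSON encoding, and the source labels are assumptions chosen for illustration rather than details taken from the disclosure.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class GestureEvent:
    """Hypothetical payload a gesture-capture module might send to the control device."""
    gesture_type: str   # e.g. "tap" or "stroke"
    x: float            # gesture location in output-device coordinates
    y: float
    timestamp: float    # capture time, seconds since the epoch
    source: str         # e.g. "camera-1" or "ink-pad-overlay"

def serialize(event: GestureEvent) -> str:
    """Encode the event for transmission over the network to the control device."""
    return json.dumps(asdict(event))

# Example: a finger tap captured by a camera-based input device
print(serialize(GestureEvent("tap", 412.0, 287.5, time.time(), "camera-1")))
```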
- Upon receiving the first user gesture, the control device 150 may be configured to execute the user gesture analyzer module 185. The user gesture analyzer module 185 may be configured to analyze the first user gesture. For instance, the user gesture analyzer module 185 may be configured to associate the location of the first user gesture on the output device 120 with the location of a display object 130 on the output device 120. In this way, the user gesture analyzer module 185 may determine the particular display object 130 the user would like to move. Having identified the display object 130 the user would like to move, in one embodiment, the user gesture analyzer module 185 may be configured to select the display object 130 and/or wait to receive a second user gesture detected by the input device 140.
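- One simple way an analyzer might perform that association is a hit test against the bounds of each display object. The sketch below assumes rectangular bounds in output-device coordinates; the class and function names are illustrative and not drawn from the disclosure.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class DisplayObject:
    object_id: str
    x: float       # top-left corner in output-device coordinates
    y: float
    width: float
    height: float

def hit_test(objects: List[DisplayObject], gx: float, gy: float) -> Optional[DisplayObject]:
    """Return the display object whose bounds contain the gesture location, if any."""
    for obj in objects:
        if obj.x <= gx <= obj.x + obj.width and obj.y <= gy <= obj.y + obj.height:
            return obj
    return None

# Example: a tap at (120, 80) falls inside the icon drawn at (100, 50)
icons = [DisplayObject("icon-1", 100, 50, 64, 64), DisplayObject("icon-2", 400, 300, 64, 64)]
selected = hit_test(icons, 120, 80)
print(selected.object_id if selected else "no object under the gesture")
```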
- Continuing with the same example, after a first user gesture by a first user, a second user gesture may be generated by a second user. According to one embodiment, a second user may generate a second user gesture, such as a finger tap gesture, using one or more fingers in order to select or otherwise identify a location on the output device 120 to move the identified display object 130. To do so, in one embodiment, a user may tap a location on the screen of the output device 120 with a finger in order to indicate the location to move the display object 130 on the output device 120.
- Similar to the first user gesture, the input device 140 may be configured to automatically, or in response to some other trigger, transmit to the control device 150 via network 105 the captured second user gesture and any other information associated with the second user gesture. Upon receiving the second user gesture, the control device 150 may be configured to execute the user gesture analyzer module 185. The user gesture analyzer module 185 may be configured to analyze the second user gesture. For instance, the user gesture analyzer module 185 may be configured to associate the second user gesture with the first user gesture. In this way, the user gesture analyzer module 185 may be configured to associate the display object 130 identified by the first user gesture with the location on the output device 120 identified by the second user gesture. Thereafter, the user gesture analyzer module 185 may be configured to update the output device 120 to display the identified display object 130 at the identified location on the output device 120. For instance, the user gesture analyzer module 185 may direct the communication by the control device 150 of an updated presentation of the display objects 130 to the surface computer 110 for display on the output device 120.
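- The pairing of the two gestures can be pictured as a small state machine: the first gesture records a pending selection, and the second resolves it into a move request. The sketch below reuses the DisplayObject, hit_test, and icons definitions from the previous sketch; it is one possible arrangement under those assumptions, not the module described in the disclosure.

```python
from typing import Callable, List, Optional

class GestureAnalyzer:
    """Toy analyzer: the first gesture selects an object, the second supplies its destination."""

    def __init__(self, objects: List[DisplayObject],
                 update_display: Callable[[str, float, float], None]) -> None:
        self.objects = objects
        self.pending: Optional[DisplayObject] = None
        self.update_display = update_display  # callback that pushes the new layout to the output device

    def on_first_gesture(self, gx: float, gy: float) -> None:
        # Associate the first gesture's location with a display object and remember the selection.
        self.pending = hit_test(self.objects, gx, gy)

    def on_second_gesture(self, gx: float, gy: float) -> None:
        # Pair the destination with the pending selection, then request a display update.
        if self.pending is None:
            return  # the first gesture did not identify any display object
        self.pending.x, self.pending.y = gx, gy
        self.update_display(self.pending.object_id, gx, gy)
        self.pending = None

# Example: select icon-1 with a tap, then tap the new location
analyzer = GestureAnalyzer(icons, lambda oid, x, y: print(f"move {oid} to ({x}, {y})"))
analyzer.on_first_gesture(120, 80)
analyzer.on_second_gesture(640, 220)
```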
- In another non-limiting example, the user gesture analyzer module 185 may be configured to make more complex user gesture assessments. For instance, a first user gesture may be a finger stroke that forms a circle gesture representing a command or a text character, such as the letter “o.” Upon receiving such a first user gesture, according to one embodiment, the user gesture analyzer module 185 may be executed and configured to analyze the first user gesture in order to identify the command or the text character associated with it. Thereafter, in certain embodiments, the user gesture analyzer module 185 may search one or more data files 170 that identify, for each command or text character, a corresponding gesture action, such as selecting and/or moving the display object 130, to be executed by the control device 150 in response to detecting a second user gesture by the input device 140 or some other trigger.
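- A minimal sketch of how such a lookup might be organized follows; the disclosure only states that the associations live in data files, so the in-memory dictionary and its entries here are assumptions for illustration.

```python
# Hypothetical mapping from a recognized command character to a gesture action.
# The disclosure keeps such associations in data files; a dictionary stands in for them here.
GESTURE_ACTIONS = {
    "o": "select_and_move",  # a circular stroke arms a select-and-move action
    "x": "deselect",         # illustrative additional entry
}

def resolve_gesture_action(recognized_character: str) -> str:
    """Look up the action to execute once the second user gesture (or another trigger) arrives."""
    return GESTURE_ACTIONS.get(recognized_character, "ignore")

print(resolve_gesture_action("o"))  # -> "select_and_move"
```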
- Upon detecting the second user gesture, the user gesture analyzer module 185 may then execute the identified gesture action and/or direct the communication from the control device 150 of an updated presentation of the display objects 130 to the surface computer 110 for display on the output device 120.
- As desired, embodiments of the disclosure may include a system 100 with more or fewer components than illustrated in FIG. 1. Additionally, certain components of the system 100 may be combined in various embodiments of the disclosure. The system 100 of FIG. 1 is provided by way of example only.
- Referring now to FIG. 2, shown is a flow diagram of an example method 200 for moving one or more display objects displayed on an output device of a surface computer based on one or more user gestures, according to an illustrative embodiment of the disclosure. The method 200 may be utilized in association with various systems, such as the system 100 illustrated in FIG. 1.
- The method 200 may begin at block 205. At block 205, a control device, such as 150 in FIG. 1, may detect a first user gesture adjacent to an output device, such as 120 in FIG. 1, of a surface computer, such as 110 in FIG. 1. In certain embodiments, the first user gesture may be analyzed by, for example, a user gesture analyzer module such as 185 in FIG. 1, in order to identify a display object, such as 130 in FIG. 1, on the output device that a user would like to move to another location on the output device. In certain embodiments, the first user gesture may be detected by an input device, such as input device 140 illustrated in FIG. 1. As described above, the first user gesture may include a finger-based gesture, such as a finger stroke gesture, that may be generated by a first user.
- Next, at block 210, the control device 150 may detect, via the input device 140, a second user gesture adjacent to the output device 120 of the surface computer 110 identifying a location to move the display object 130 on the output device 120. In certain embodiments, the second user gesture may be a finger tap gesture generated by a second user.
- Lastly, at block 215, the control device 150 may update the output device 120 to display the identified display object 130 at the identified location on the output device 120. As described above, based on the detected first and second user gestures, the control device 150 may be configured to communicate an updated presentation of the display object 130 to the output device 120 for display to one or more users.
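- For block 215, the updated presentation the control device communicates could be as small as an object identifier and its new coordinates; the message shape below is an assumption chosen to keep the sketch concrete.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class DisplayUpdate:
    """Hypothetical update message sent from the control device to the surface computer."""
    object_id: str
    new_x: float
    new_y: float

def apply_update(layout: Dict[str, Tuple[float, float]], update: DisplayUpdate) -> None:
    """Apply the update to the output device's layout state before it is redrawn."""
    layout[update.object_id] = (update.new_x, update.new_y)

# Example: move icon-1 to the location identified by the second user gesture
layout = {"icon-1": (100.0, 50.0)}
apply_update(layout, DisplayUpdate("icon-1", 640.0, 220.0))
print(layout)  # {'icon-1': (640.0, 220.0)}
```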
- The method 200 of FIG. 2 may optionally end following block 215.
- The operations described and shown in the method 200 of FIG. 2 may be carried out or performed in any suitable order as desired in various embodiments of the disclosure. Additionally, in certain embodiments, at least a portion of the operations may be carried out in parallel. Furthermore, in certain embodiments, fewer or more operations than described in FIG. 2 may be performed. As desired, the operations set forth in FIG. 2 may also be performed in a loop as a rotating machine is monitored. For example, the operations may be performed every twenty minutes.
- Referring now to FIG. 3A, shown is an example method for identifying a display object based on user gestures, as described in block 205 of FIG. 2. As illustrated in FIG. 3A, one or more display objects 320 a may be displayed on an output device 310 a. A user may identify or otherwise select the display object 320 a by generating a first user gesture. For example, as shown in FIG. 3A, a user may tap the screen of the output device 310 a with a finger where the display object 320 a is displayed in order to indicate that the user would like to move the display object 320 a to another location on the output device 310 a.
- Next, in FIG. 3B, shown is an example method for identifying a location to move a display object based on user gestures, as described in block 210 of FIG. 2. As shown in FIG. 3B, a user may generate a second user gesture using one or more fingers in order to identify a location on an output device 310 b to move an identified display object 320 b. For instance, as shown in FIG. 3B, a user may tap a location on the screen of the output device 310 b with a finger in order to indicate the location to move the display object 320 b on the output device 310 b.
- Lastly, in FIG. 3C, shown is an example method for updating an output device to display an identified display object at an identified location on the output device, as described in block 215 of FIG. 2. According to one embodiment, based upon the first and second user gestures, an updated presentation of a selected display object 320 c at an identified location on an output device 310 c may be displayed to one or more users.
- The disclosure is described above with reference to block and flow diagrams of systems, methods, apparatus, and/or computer program products according to example embodiments of the disclosure. It will be understood that one or more blocks of the block diagrams and flow diagrams, and combinations of blocks in the block diagrams and flow diagrams, respectively, can be implemented by computer-executable program instructions. Likewise, some blocks of the block diagrams and flow diagrams may not necessarily need to be performed in the order presented, or may not necessarily need to be performed at all, according to some embodiments of the disclosure.
- These computer-executable program instructions may be loaded onto a general purpose computer, a special purpose computer, a processor, or other programmable data processing apparatus to produce a particular machine, such that the instructions that execute on the computer, processor, or other programmable data processing apparatus create means for implementing one or more functions specified in the flow diagram block or blocks. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement one or more functions specified in the flow diagram block or blocks. As an example, embodiments of the disclosure may provide for a computer program product, comprising a computer usable medium having a computer-readable program code or program instructions embodied therein, said computer-readable program code adapted to be executed to implement one or more functions specified in the flow diagram block or blocks. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational elements or steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions that execute on the computer or other programmable apparatus provide elements or steps for implementing the functions specified in the flow diagram block or blocks.
- Accordingly, blocks of the block diagrams and flow diagrams support combinations of means for performing the specified functions, combinations of elements or steps for performing the specified functions and program instruction means for performing the specified functions. It will also be understood that each block of the block diagrams and flow diagrams, and combinations of blocks in the block diagrams and flow diagrams, can be implemented by special purpose, hardware-based computer systems that perform the specified functions, elements or steps, or combinations of special purpose hardware and computer instructions.
- While the disclosure has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the disclosure is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.
- This written description uses examples to disclose the disclosure, including the best mode, and also to enable any person skilled in the art to practice the disclosure, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the disclosure is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
Claims (20)
1. A method for moving display objects based on user gestures, the method comprising:
detecting, by at least one control device, a first user gesture adjacent to an output device to identify at least one display object displayed on the output device;
detecting, by the at least one control device, a second user gesture adjacent to the output device identifying a location to move the at least one display object; and
updating, by the at least one control device, the output device to display the at least one display object at the identified location on the output device.
2. The method of claim 1 , wherein the first user gesture is generated by a first user.
3. The method of claim 1 , wherein the first user gesture is a finger stroke gesture.
4. The method of claim 1 , wherein either the first user gesture or the second user gesture is detected by the at least one control device via an input device disposed in close proximity to the output device.
5. The method of claim 4 , wherein the input device comprises at least one of: a camera, a transparent ink pad, a gesture reader software module or a transparent ink pad user interface control.
6. The method of claim 4 , wherein detecting, by at least one control device, a first user gesture adjacent to the output device to identify at least one display object displayed on the output device further comprises:
receiving, by the at least one control device from the input device, the first user gesture;
determining, by the at least one control device, a text character or command associated with the first user gesture; and
identifying, by the at least one control device, a gesture action associated with the text character or command.
7. The method of claim 6 , wherein the gesture action comprises selecting and moving the at least one display object on the output device.
8. The method of claim 1 , wherein the second user gesture is generated by a second user.
9. The method of claim 1 , wherein the second user gesture is a finger tap gesture.
10. The method of claim 6 , wherein updating, by the at least one control device, the output device to display the at least one display object at the identified location on the output device further comprises:
executing, by the at least one control device, the gesture action to move the at least one display object to the identified location on the output device based at least in part on detecting the second user gesture.
11. A system for moving display objects being displayed on an output device of a computer based on one or more user gestures, the system comprising:
an input unit configured to detect at least one of a first user gesture or a second user gesture on an output device of a surface computer; and
at least one control device in communication with the input unit that is configured to:
detect a first user gesture adjacent to an output device in order to identify at least one display object displayed on the output device;
detect a second user gesture adjacent to the output device in order to identify a location to move the at least one display object; and
update the output device to display the at least one display object at the identified location on the output device.
12. The system of claim 11 , wherein the first user gesture is generated by a first user.
13. The system of claim 11 , wherein the first user gesture is a finger stroke gesture.
14. The system of claim 11 , wherein the input device is disposed in close proximity to the output device.
15. The system of claim 11 , wherein the input device comprises at least one of: a camera, a transparent ink pad, a gesture reader software module, or a transparent ink pad user interface control.
16. The system of claim 15 , wherein the input unit detects the first user gesture or the second user gesture via the transparent ink pad user interface control.
17. The system of claim 11 , wherein the at least one control device is further configured to select the at least one display object based on the first user gesture.
18. The system of claim 11 , wherein the second user gesture is generated from a second user.
19. The system of claim 11 , wherein the second user gesture is a finger tap gesture.
20. The system of claim 11 , wherein the at least one control device is further configured to update the output device to display the at least one display object at the identified location on the output device in response to detecting the second user gesture.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/903,056 US20140359538A1 (en) | 2013-05-28 | 2013-05-28 | Systems and methods for moving display objects based on user gestures |
US14/136,840 US9459786B2 (en) | 2013-05-28 | 2013-12-20 | Systems and methods for sharing a user interface element based on user gestures |
PCT/US2014/038109 WO2014193657A1 (en) | 2013-05-28 | 2014-05-15 | Systems and methods for moving display objects based on user gestures |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/903,056 US20140359538A1 (en) | 2013-05-28 | 2013-05-28 | Systems and methods for moving display objects based on user gestures |
Related Child Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/136,840 Continuation US9459786B2 (en) | 2013-05-28 | 2013-12-20 | Systems and methods for sharing a user interface element based on user gestures |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140359538A1 true US20140359538A1 (en) | 2014-12-04 |
Family
ID=50942897
Family Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/903,056 Abandoned US20140359538A1 (en) | 2013-05-28 | 2013-05-28 | Systems and methods for moving display objects based on user gestures |
US14/136,840 Active 2034-04-09 US9459786B2 (en) | 2013-05-28 | 2013-12-20 | Systems and methods for sharing a user interface element based on user gestures |
Family Applications After (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/136,840 Active 2034-04-09 US9459786B2 (en) | 2013-05-28 | 2013-12-20 | Systems and methods for sharing a user interface element based on user gestures |
Country Status (2)
Country | Link |
---|---|
US (2) | US20140359538A1 (en) |
WO (1) | WO2014193657A1 (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9860145B2 (en) | 2015-07-02 | 2018-01-02 | Microsoft Technology Licensing, Llc | Recording of inter-application data flow |
US9733915B2 (en) | 2015-07-02 | 2017-08-15 | Microsoft Technology Licensing, Llc | Building of compound application chain applications |
US9733993B2 (en) | 2015-07-02 | 2017-08-15 | Microsoft Technology Licensing, Llc | Application sharing using endpoint interface entities |
US9712472B2 (en) | 2015-07-02 | 2017-07-18 | Microsoft Technology Licensing, Llc | Application spawning responsive to communication |
US9785484B2 (en) | 2015-07-02 | 2017-10-10 | Microsoft Technology Licensing, Llc | Distributed application interfacing across different hardware |
US10198252B2 (en) | 2015-07-02 | 2019-02-05 | Microsoft Technology Licensing, Llc | Transformation chain application splitting |
US9658836B2 (en) | 2015-07-02 | 2017-05-23 | Microsoft Technology Licensing, Llc | Automated generation of transformation chain compatible class |
US10261985B2 (en) | 2015-07-02 | 2019-04-16 | Microsoft Technology Licensing, Llc | Output rendering in dynamic redefining application |
US10031724B2 (en) | 2015-07-08 | 2018-07-24 | Microsoft Technology Licensing, Llc | Application operation responsive to object spatial status |
US10198405B2 (en) | 2015-07-08 | 2019-02-05 | Microsoft Technology Licensing, Llc | Rule-based layout of changing information |
US10277582B2 (en) | 2015-08-27 | 2019-04-30 | Microsoft Technology Licensing, Llc | Application service architecture |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020018051A1 (en) * | 1998-09-15 | 2002-02-14 | Mona Singh | Apparatus and method for moving objects on a touchscreen display |
US20050091595A1 (en) | 2003-10-24 | 2005-04-28 | Microsoft Corporation | Group shared spaces |
US20060007174A1 (en) | 2004-07-06 | 2006-01-12 | Chung-Yi Shen | Touch control method for a drag gesture and control module thereof |
CA2635499A1 (en) * | 2005-02-12 | 2006-08-24 | Teresis Media Management, Inc. | Methods and apparatuses for assisting the production of media works and the like |
US8914733B2 (en) | 2005-10-04 | 2014-12-16 | International Business Machines Corporation | User interface widget unit sharing for application user interface distribution |
US8689115B2 (en) | 2008-09-19 | 2014-04-01 | Net Power And Light, Inc. | Method and system for distributed computing interface |
US8269736B2 (en) * | 2009-05-22 | 2012-09-18 | Microsoft Corporation | Drop target gestures |
US9065927B2 (en) * | 2010-10-13 | 2015-06-23 | Verizon Patent And Licensing Inc. | Method and system for providing context based multimedia intercom services |
TW201234223A (en) | 2011-02-01 | 2012-08-16 | Novatek Microelectronics Corp | Moving point gesture determination method, touch control chip, touch control system and computer system |
TW201237725A (en) | 2011-03-04 | 2012-09-16 | Novatek Microelectronics Corp | Single-finger and multi-touch gesture determination method, touch control chip, touch control system and computer system |
US20130290855A1 (en) | 2012-04-29 | 2013-10-31 | Britt C. Ashcraft | Virtual shared office bulletin board |
- 2013-05-28 US US13/903,056 patent/US20140359538A1/en not_active Abandoned
- 2013-12-20 US US14/136,840 patent/US9459786B2/en active Active
- 2014-05-15 WO PCT/US2014/038109 patent/WO2014193657A1/en active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060007166A1 (en) * | 2004-07-06 | 2006-01-12 | Jao-Ching Lin | Method and controller for identifying a drag gesture |
US20060238515A1 (en) * | 2005-04-26 | 2006-10-26 | Alps Electric Co., Ltd. | Input device |
US20080055272A1 (en) * | 2006-09-06 | 2008-03-06 | Freddy Allen Anzures | Video Manager for Portable Multifunction Device |
US20090066666A1 (en) * | 2007-09-12 | 2009-03-12 | Casio Hitachi Mobile Communications Co., Ltd. | Information Display Device and Program Storing Medium |
US20130073980A1 (en) * | 2011-09-21 | 2013-03-21 | Sony Corporation, A Japanese Corporation | Method and apparatus for establishing user-specific windows on a multi-user interactive table |
US20140282066A1 (en) * | 2013-03-13 | 2014-09-18 | Promontory Financial Group, Llc | Distributed, interactive, collaborative, touchscreen, computing systems, media, and methods |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150067456A1 (en) * | 2013-08-28 | 2015-03-05 | Canon Kabushiki Kaisha | Image display apparatus, control method therefor, and storage medium |
US9563606B2 (en) * | 2013-08-28 | 2017-02-07 | Canon Kabushiki Kaisha | Image display apparatus, control method therefor, and storage medium |
US10650489B2 (en) | 2013-08-28 | 2020-05-12 | Canon Kabushiki Kaisha | Image display apparatus, control method therefor, and storage medium |
CN105354726A (en) * | 2015-11-30 | 2016-02-24 | 陕西服装工程学院 | Garment customization process-based digital management system and method |
US20190286245A1 (en) * | 2016-11-25 | 2019-09-19 | Sony Corporation | Display control device, display control method, and computer program |
US11023050B2 (en) * | 2016-11-25 | 2021-06-01 | Sony Corporation | Display control device, display control method, and computer program |
CN112055842A (en) * | 2018-05-08 | 2020-12-08 | 谷歌有限责任公司 | Drag gesture animation |
Also Published As
Publication number | Publication date |
---|---|
US9459786B2 (en) | 2016-10-04 |
WO2014193657A1 (en) | 2014-12-04 |
US20140359478A1 (en) | 2014-12-04 |
Similar Documents
Publication | Title | Publication Date |
---|---|---|
US20140359538A1 (en) | Systems and methods for moving display objects based on user gestures | |
CN103914260B (en) | Control method and device for operation object based on touch screen | |
US8390577B2 (en) | Continuous recognition of multi-touch gestures | |
US20130187923A1 (en) | Legend indicator for selecting an active graph series | |
US11262895B2 (en) | Screen capturing method and apparatus | |
US12093506B2 (en) | Systems and methods for a touchscreen user interface for a collaborative editing tool | |
US20120304199A1 (en) | Information processing apparatus, information processing method, and computer program | |
US10254950B2 (en) | Display method of terminal device and terminal device | |
US20140215393A1 (en) | Touch-based multiple selection | |
CN103064627A (en) | Application management method and device | |
US9025878B2 (en) | Electronic apparatus and handwritten document processing method | |
US9588678B2 (en) | Method of operating electronic handwriting and electronic device for supporting the same | |
US20150007118A1 (en) | Software development using gestures | |
CN113194024B (en) | Information display method and device and electronic equipment | |
CN104035714A (en) | Event processing method, device and equipment based on Android system | |
CN104679389B (en) | Interface display method and device | |
US10970476B2 (en) | Augmenting digital ink strokes | |
US10162518B2 (en) | Reversible digital ink for inking application user interfaces | |
CN103809912A (en) | Tablet personal computer based on multi-touch screen | |
CN105892895A (en) | Multi-finger sliding gesture recognition method and device as well as terminal equipment | |
US10318047B2 (en) | User interface for electronic device, input processing method, and electronic device | |
WO2017201655A1 (en) | Background application display method and apparatus, electronic device and computer program product | |
CN105824694A (en) | Adjustment method and device for resources of applications | |
CN104407763A (en) | Content input method and system | |
CN110262747B (en) | Method and device for controlling terminal, terminal and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GENERAL ELECTRIC COMPANY, NEW YORK Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:THAKUR, PAVAN KUMAR SINGH;GRUBBS, ROBERT WILLIAM;JOHN, JUSTIN V.;SIGNING DATES FROM 20130419 TO 20130502;REEL/FRAME:030497/0729 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |