WO2011029100A1 - User interface methods for ending an application - Google Patents
User interface methods for ending an application
- Publication number
- WO2011029100A1 (PCT/US2010/048008)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- path
- measure
- processor
- path event
- event
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
Definitions
- the present invention relates generally to computer user interface systems and more particularly to user interface systems providing functionality to end an application.
- Personal computing devices (e.g., cell phones, PDAs, laptops, gaming devices)
- personal computing devices serve as personal organizers, storing documents, photographs, videos and music, and serving as portals to the Internet and electronic mail.
- computing devices shrink in size and reduce in weight to become even more portable.
- the space available for the display also reduces in size. Therefore, to provide users with the largest available display area, electronic device manufacturers are reducing the number of mechanical keys available on the computing device. The fewer keys available on the computing devices, the more space there is for larger displays.
- Various embodiments provide methods and devices for enabling users of a computing device to end present applications by tracing a circular or ellipsoidal shape on a touchscreen.
- the computing device receives a series of user pointing events from a user interface, such as a touchscreen or touchpad, and examines the event data to determine the shape and direction of a path traced in a continuous pointing event. If the traced path is circular or ellipsoidal in shape, an application ending function may be initiated by which a present application may be ended and a home image may be displayed. Depending on the direction of the traced path, the application may be ended in different manners.
- an ellipsoid-shaped path traced in the clockwise direction may be interpreted to terminate the present application and return to a home image, while an ellipsoid-shaped path traced in the counterclockwise direction may be interpreted to minimize the present application (without terminating it) and return to the home image.
- path lengths may be used to determine rotation angles that may be required to end a present application.
- sensory indicators such as display image distortions, may be used to inform the user about the progress towards ending the present application based upon rotation angles while the user traces an ellipsoidal path event on the touchscreen display.
- FIG. 1 is a frontal view of a portable computing device illustrating application ending functionality activated by a finger moving in a clockwise direction on a touchscreen display.
- FIGs. 2A - 2D are frontal views of a portable computing device illustrating application ending functionality activated by a finger moving in a clockwise direction on a touchscreen display.
- FIG. 3 is a frontal view of a portable computing device illustrating a return to home display after activating the application ending functionality.
- FIG. 4 is a frontal view of a portable computing device illustrating application ending functionality activated by a finger moving in a counterclockwise direction on a touchscreen display.
- FIGs. 5A - 5D are frontal views of a portable computing device illustrating application ending functionality when deactivated by lifting the finger from the touchscreen display.
- FIG. 6 is a frontal view of a portable computing device illustrating application ending function display aids that may be presented on a touchscreen display.
- FIG. 7 is a system block diagram of a computer device suitable for use with the various embodiments.
- FIG. 8 is a process flow diagram of an embodiment method for implementing an application ending function user interface.
- FIG. 9 is a process flow diagram of an embodiment method for determining whether touch data constitutes an ellipsoid shape for implementing the application ending function.
- FIG. 10 is a component block diagram of an example portable computing device suitable for use with the various embodiments.
- a "touchscreen" is a touch sensing input device or a touch sensitive input device with an associated image display.
- a "touchpad" is a touch sensing input device without an associated image display.
- a touchpad for example, can be implemented on any surface of an electronic device outside the image display area. Touchscreens and touchpads are generically referred to herein as a "touch surface.” Touch surfaces may be integral parts of an electronic device, such as a touchscreen display, or a separate module, such as a touchpad, which can be coupled to the electronic device by a wired or wireless data link. Touchscreen, touchpad and touch surface may be used interchangeably hereinafter.
- the terms "personal electronic device,” “computing device” and “portable computing device” refer to any one or all of cellular telephones, personal data assistants (PDA's), palm-top computers, notebook computers, personal computers, wireless electronic mail receivers and cellular telephone receivers (e.g., the Blackberry® and Treo® devices), multimedia Internet enabled cellular telephones (e.g., the Blackberry Storm®), and similar electronic devices which include a programmable processor, memory and a connected or integral touch surface or other pointing device (e.g., a computer mouse).
- the electronic device is a cellular telephone including an integral touchscreen display.
- this embodiment is presented merely as one example implementation of the various embodiments, and as such is not intended to exclude other possible implementations of the subject matter recited in the claims.
- a touch event refers to a detected user input on a touch surface which may include information regarding location or relative location of the touch.
- a touch event refers to the detection of a user touching the device and may include information regarding the location on the device being touched.
- single continuous touch event refers to any input received on a user interface device (e.g., touchscreen or touchpad) in which the touch event (e.g., touch of touchscreen or touchpad) continues without significant interruption.
- a single continuous touch event occurs so long as a user's finger continues to touch the surface.
- path refers to a sequence of touch event locations that trace a path within a graphical user interface (GUI) display during a single continuous touch event.
- path event refers to a detected user input on a touch surface which traces a path during a single continuous touch event.
- a path event may include information regarding the locations or relative locations (e.g., within a GUI display) of the touch events which constitute the traced path.
- "ellipsoid-shape" and "ellipsoidal" refer to any path traced in a single continuous touch event that approximately closes on itself, such as a circle, ellipse, triangle, square, rectangle, or polygon.
- An "ellipsoid-shape” may be detected before the path closes on itself and may include paths that overlap without closing such as a spiral path traced in a single continuous touch event.
- a single continuous touch event can be differentiated from other discrete touch events such as taps on a touchscreen for selecting items or activating an icon.
- the various embodiment methods and devices provide an intuitive user interface for initiating an application ending function. Users simply trace a path on a touchscreen or touch surface in a single continuous touch event. For example, users may use their fingers to touch and trace a circle on a touchscreen of a portable computing device.
- the processor of a computing device may be programmed to recognize paths traced in a single continuous touch event as an ellipsoid-shape and, in response, end the operation of an application.
- An ellipsoid-shaped path may then be differentiated from other path shapes, such as movement of a finger in one direction on a touchscreen for panning or pinching (e.g., in the case of the iPhone® two-finger pinch commands for zooming display images).
- the application ending functionality may be enabled automatically.
- the GUI software may include instructions for automatically recognizing a closed-shape path traced in a single continuous touch event and activating the application ending functionality. Automatic activation of the application ending features may be provided with any application. Also, whenever an application is activated, a GUI may automatically enable the application ending functionality to allow the user to terminate the application by tracing a closed-shape path on the touchscreen.
- the application ending functionality or end mode may be automatically disabled, such as may be useful in applications in which a closed-shape path traced in a single continuous touch event may be common (e.g., drawing programs) or used for other functions (e.g., zooming or rotating images).
- the user may be allowed to manually enable the application ending functionality when required.
- a user may select and activate the application ending function by pressing a button or activating an icon on a GUI display.
- the application ending operation may be assigned to a soft key which the user may activate (e.g., by pressing or clicking) to launch the application ending functionality.
- the application ending functionality may be activated by a user command. For example, the user may use a voice command such as "ACTIVATE APPLICATION ENDING" to enable the application ending mode. Once activated, the application ending functionality may be used in the manner described below.
- a user may terminate or minimize (without termination) an application to return to the home display by beginning a single continuous touch event (e.g., by touching a touchscreen or touchpad) and tracing a closed shape, such as a circle, as illustrated in FIG. 1.
- the direction, length, and rotation angle of the path traced in a single continuous touch event may control the initiation and progress of the application ending functionality.
- the direction of the path traced in the single continuous touch event may determine the method by which an application is ended (i.e., terminating or minimizing the application). For example, to terminate an application and return to the home display, the user may trace a circle in the clockwise direction using a touchscreen in a single continuous touch event.
- the user may trace a circle in the counterclockwise direction using the touchscreen with a single continuous touch event.
- the direction of the traced circle for purposes of activating either the close or minimize functions is arbitrary and may be reversed from the example description, and may be configured as a user-definable preference (e.g., to accommodate left or right handedness).
- the application ending functionality may depend upon or be triggered by the rotation angle traced in the single continuous touch event. This rotation angle may be calculated based upon the length of the traced path or upon a geometric analysis of the traced path. Thus, an application may be ended when a predetermined rotation angle is traced. For example, the application ending functionality may be assigned to a single loop or ellipsoid path which achieves a rotation angle of 360°. Alternatively, the ending functionality may be assigned to multiple loops or ellipsoid paths (i.e., two or more rotations through 360°) traced on the touchscreen.
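- The rotation-angle calculation described above is not specified algorithmically in this disclosure; one non-authoritative sketch (the centroid-based center estimate and function name are assumptions) accumulates unwrapped `atan2` differences about an estimated center point, so that a full loop yields ±360° with the sign conveying direction:

```python
import math

def rotation_angle(points):
    """Estimate the cumulative rotation angle (degrees) swept by a traced
    path about the centroid of its sampled touch locations."""
    if len(points) < 3:
        return 0.0
    # Estimate the center point as the centroid of the samples.
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    total = 0.0
    prev = math.atan2(points[0][1] - cy, points[0][0] - cx)
    for x, y in points[1:]:
        ang = math.atan2(y - cy, x - cx)
        delta = ang - prev
        # Unwrap jumps across the +/-180 degree boundary.
        if delta > math.pi:
            delta -= 2 * math.pi
        elif delta < -math.pi:
            delta += 2 * math.pi
        total += delta
        prev = ang
    return math.degrees(total)
```

A path that traces one full loop around the estimated center accumulates roughly 360° in one direction or -360° in the other, which is how the single-loop and multiple-loop trigger conditions above could be tested.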
- the application ending functionality may be executed in multiple phases, with each phase being triggered by a different rotation angle. For example, when a traced path with a rotation angle of 180° is detected, the application ending process may commence by generating a perceptible indicia to inform a user that the application ending function is being selected, with the functionality completing (i.e., closing the application and returning to the home display) when the traced path reaches a rotation angle of 360°. Other rotation angles which may fall within a range between the initiation and completion rotation angles may be linked to other sub- functions, such as alerting the user about the application ending process.
- the application ending functionality may contort, swirl or fade the display image of the application which is being ended prior to completely ending the application and returning to home display.
- the degree to which the display image is contorted may depend on the rotation angles achieved in the path of the touch event. For example, the degree of contortion may increase as the rotation angles approach 360° until the application is ended at 360°, as illustrated in FIGs. 2A - 2D and 3. Further, before achieving a rotation angle of 360°, a user may be allowed to reverse or halt the application ending functionality. For example, if at any time before achieving a maximum rotation angle a user stops the touch path event (i.e., by either stopping the touch movement or lifting the finger off the touch surface), the application ending process may abort, leaving the present application open on the display.
- the visual indicia of the ending process (e.g., a swirl distortion linked to the angle of rotation), combined with the ability to abort the function, provide users with the visual feedback to recognize when an application is about to be ended so they can confirm their intent or change their mind, without a further step of presenting an "Are You Sure?" prompt to which the user must respond.
- the gradual image distortions may be tied to the traced path length or the traced rotation angle.
- Path length may be measured from the starting point of the single continuous touch event (i.e., the GUI location of the first point where the touchscreen or touch pad was touched or the mouse button was depressed).
- the rotation angle may be determined in terms of the number of radians about an estimated center point spanned by the ellipsoidal path. Also, in a preferred embodiment, the degree of contortion applied to an image is linearly dependent upon the length of the traced path (or radians spanned).
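- The linear dependence described above can be sketched as a simple mapping from rotation angle to a distortion strength in [0, 1], assuming for illustration the 180° initiation and 360° completion thresholds used in the examples below (the function name and normalization are not from the patent):

```python
def swirl_strength(rotation_deg, start_deg=180.0, end_deg=360.0):
    """Map the traced rotation angle to a swirl-distortion strength in
    [0, 1], rising linearly between the initiation threshold (no visible
    change below it) and the completion threshold (application ended)."""
    if rotation_deg <= start_deg:
        return 0.0
    if rotation_deg >= end_deg:
        return 1.0
    return (rotation_deg - start_deg) / (end_deg - start_deg)
```

Back-tracing reduces the rotation angle and hence the strength, matching the "unwinding the swirl" behavior described later.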
- the application ending functionality may be implemented on any touch surface.
- the touch surface is a touchscreen that is touched by a finger, since touchscreens are generally superimposed on a display image, enabling users to interact with the display image with the touch of a finger.
- the user interacts with an image by touching the touchscreen with a finger and tracing an elliptical path (thus the user's finger activating the touchscreen serves as the pointing device).
- touchscreen touch event acquisition (i.e., detection of a finger touch on a touchscreen) and processing are well known, such as disclosed in U.S. Patent No. 6,323,846, the entire contents of which are hereby incorporated by reference.
- an example computing device 100 includes a touchscreen display 102 and function keys 106 for interfacing with a graphical user interface.
- the computing device 100 is running an address book application which displays the names of several contacts on the touchscreen display 102.
- a user can end the address book application and return to the home display by touching the touchscreen display 102 with, for example, a finger 108 and moving the finger 108 to trace a closed path (e.g., a circle) in a single continuous touch event (i.e., without raising the finger from the touchscreen display 102).
- An example direction and the general shape of the path that a user may trace are shown by a dotted circle 110 with arrows.
- the dotted circle 110 is shown only to indicate the shape and direction of the finger 108 movement and is not included as part of the touchscreen display 102 in the embodiment illustrated in FIG. 1.
- the application ending function may be configured to recognize a minimum rotation angle based upon the path event traced. Once a minimum rotation angle is achieved, the application ending function may be initiated to inform the user about the progression of the ending function by providing an indication (e.g., visual contortion of the present image, sound effects or vibrations) linked to the ending function. Additionally, the application ending function may be configured to recognize a maximum rotation angle based upon the path event traced at which angle the present application may be ended and the user is returned to the home display.
- a minimum rotation angle may be set at 180° (i.e., half circle) at which the ending function may be started and a maximum rotation angle may be set at 360° (i.e., a full circle) at which point the application is ended.
- a user may use his/her finger 108 to trace a half circle on the touchscreen to reach the rotation angle 180°, as shown in FIG. 1. No change will be shown on the display image as long as the rotation angle remains less than 180° in this example implementation.
- the application ending function may activate and begin contorting the display, such as swirling the present image to provide the user with visual indications about the progress of the application ending function as illustrated in FIGs. 2A - 2D.
- FIG. 2A illustrates the starting position of the finger 108 as the finger 108 touches the touchscreen 102.
- the touch of a finger 108 to the touchscreen 102 is referred to herein sometimes as a "touch down" event.
- FIG. 2B illustrates an example of the swirling effect that may be initiated as the user traces a path with a rotation angle larger than 180° and smaller than 360° on the touchscreen display shown in FIG. 1.
- the dotted lines 111 in FIGs. 2B - 2C illustrate the progress of the finger 108 as it traces a closed shape by touching the touchscreen 102.
- the increasing image distortion serves to indicate the approaching end of the application.
- FIGs. 2A - 2D also illustrate how the display image may appear when the user pauses while tracing a path without lifting the finger off of the touchscreen. Pausing the trace stops the swirling function at the current swirl state. A user may then continue towards ending the application by continuing to trace the circular path on the touchscreen display 102, which will be indicated by increasing swirl distortions. A user may also back trace, which may be reflected in decreasing swirl distortions (i.e., unwinding the swirl) to indicate that the user is backing away from ending the application.
- the application end functionality may terminate the present application and return to the home display as illustrated in FIG. 3 which shows a home display comprising a Mona Lisa image.
- FIGs. 1 - 3 link the application ending function to a clockwise ellipsoidal path.
- the application ending function can also be activated by tracing a counterclockwise ellipsoidal path as illustrated in FIG. 4.
- the application ending function may be configured to minimize an application instead of ending it when a user traces a circular path in the counterclockwise direction as shown in FIG. 4.
- a minimized application continues to reside in memory or potentially perform operations in the background.
- a user can return to a minimized application at the same state as when the application was minimized.
- a user touches the touchscreen 102 using, for example, a finger 108 and, while touching the surface, traces a circular or ellipsoidal path in the counterclockwise direction as shown by the dotted circle 110.
- the application ending function may be configured to abort when the user lifts his/her finger off of the touchscreen before the maximum rotation angle is achieved as is illustrated in FIGs. 5A - 5D.
- FIG. 5A illustrates the starting position of finger 108 as it comes into contact with the touchscreen 102 (at touch down event).
- FIG. 5B shows an image of a present application display being transformed into a swirl as the user's finger 108 traces a path event past a half circle.
- the dotted lines 111 in FIGs. 5B - 5D illustrate the progress of the finger 108 in tracing a closed-shape on the touchscreen 102.
- FIGs. 5B - 5C illustrate the progression of the swirl transformation as the user continues tracing a path towards a full circle.
- if the user lifts the finger off of the touchscreen before the maximum rotation angle is achieved, the application ending function is aborted and the present application display (i.e., the contact names in the address book) is returned to normal as illustrated in FIG. 5D.
- the application ending function within the GUI may be configured to display a visual aid within the GUI display to assist the user in tracing a closed path.
- a guide wheel 112 may be presented on the touchscreen display 102 to illustrate the shape that the user can trace to end the present application.
- the GUI may be configured so the guide wheel 112 is displayed in response to a number of different triggers.
- a guide wheel 112 may appear on the touchscreen display 102 in response to the touch of the user's finger.
- the guide wheel 112 may appear each time the application ending function is enabled and the user touches the touchscreen display 102.
- the guide wheel 112 may appear in response to the user touching and applying pressure to the touchscreen 102 or a touchpad. In this case, just touching the touchscreen 102 (or a touchpad) and tracing a shape will not cause a guide wheel 112 to appear; however, the guide wheel 112 appears if the user touches and presses the touchscreen 102 (or touchpad).
- a soft key may be designated which, when pressed by the user, initiates display of the guide wheel 112. In this case, the user may view the guide wheel 112 on the touchscreen display 102 by pressing the soft key, and then touch the touchscreen to begin tracing the shape of the guide wheel 112 to end the present application and return to the home display.
- the guide wheel 112 may be activated when the GUI detects a continuous path that spans more than 90° of angular rotation but less than the minimum rotation angle.
- the guide wheel 112 may be activated by voice command in the manner of other voice activated functions that may be implemented on the portable computing device. In this case, when the user's voice command is received and recognized by the portable computing device 100, the guide wheel 112 is presented on the touchscreen display 102 to serve as a visual aid or guide for the user.
- the guide wheel 112 implementation description provided above is only one example of the visual aids that may be implemented as part of the application ending functionality. As such, these examples are not intended to limit the scope of the present invention.
- the application ending functionality may be configured to enable users to change the display and other features of the function based on their individual preferences by using known methods. For example, users may turn off the guide wheel 112 feature or configure the application ending functionality to show a guide wheel 112 only when the user touches and holds a finger in one place on the touchscreen for a period of time, such as more than 5 seconds.
- FIG. 7 illustrates a system block diagram of software and/or hardware components of a computing device 100 suitable for use in implementing the various embodiments.
- the computing device 100 may include a touch surface 101, such as a touchscreen or touchpad, a display 104, a processor 103 and a memory device 105.
- the touch surface 101 and the display 104 may be the same device, such as a touchscreen 102.
- the processor 103 may be programmed to receive and process the touch information and recognize a single continuous touch event, such as an uninterrupted stream of touch location data received from the touch surface 101.
- the processor 103 may also be configured to recognize the path traced during a single continuous touch event by, for example, noting the location of the touch at each instant and movement of the touch location over time. Using such information, the processor 103 can determine the traced path length and direction, and from this information recognize a closed path and calculate a rotation angle based upon the path length. The processor 103 can apply the determined rotation angle to provide the user with indications about the progress of the ending function, for example, by generating the appropriate image sent to the display 104.
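- The path-length bookkeeping the processor performs might look like the following sketch (a straightforward sum of distances between successive touch samples; the name is illustrative, not from the patent):

```python
import math

def path_length(points):
    """Total length of a traced path, computed as the sum of distances
    between successive touch locations recorded during a single
    continuous touch event."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))
```

The result can be compared against the threshold length "X" in determination 1212 of FIG. 8, or converted to a rotation angle given an estimated radius.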
- the processor may also be coupled to memory 105, which may be used to store information related to touch events, traced paths, and image processing data.
- FIGs. 8 - 9 illustrate an embodiment method for implementing the application ending function on a computing device 100 equipped with a touchscreen 102.
- the processor 103 of a computing device 100 may be programmed to receive touch events (i.e., detect a touch down event) from the touchscreen 102, step 1200, such as in the form of an interrupt or message indicating that the touchscreen 102 is being touched.
- the processor 103 may then determine a new touch path start location, step 1202.
- the processor 103 may obtain the touch location information from the touchscreen 102 and store the touch location information in memory as touch path data, step 1204. This operation may involve storing the location of the touch in memory in a data structure that the processor 103 can use to determine a traced path length.
- when a touch up event is detected (i.e., the user's finger has been lifted and is no longer contacting the touchscreen), the processor 103 may determine whether the length of the path that has been traced exceeds a predetermined threshold length "X," determination 1212.
- if the threshold length is not exceeded, the touch data may be passed to other GUI functions, step 1210, such as image panning or scrolling functions.
- the processor 103 may determine the length of the traced path (or the number of revolutions or radians spanned about the display center), step 1218. The processor 103 may also calculate the rotation angle based on the path length, step 1220, and determine whether the calculated rotation angle is equal to or greater than a predetermined first rotation angle threshold value n1°.
- the processor 103 may generate a distorted image display, such as by increasing the swirl effect based on the calculated rotation angle, step 1226.
- when the calculated rotation angle reaches a maximum rotation angle second threshold value n2°, determination 1228, the present application may be ended.
- the application ending function may enable users to either end the application, as described above, or minimize the application but leave it running in the background and return to the home display.
- the processor 103 may compare the path length to a first threshold value to determine when to initiate an operation. More generally, the processor 103 may determine a measure of the path, in which the measure may be a path length, spanned radians about a center point of the touch surface, or some other measure of the degree to which the ellipsoid (or other shape) is complete. This measure of the path may be compared to a first threshold value and a second threshold value to determine when to initiate the operation, such as contorting the image, and to complete the operation, such as terminating the present application.
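- The two-threshold comparison described above can be sketched as a small phase classifier; the threshold values and return labels are assumptions for illustration, with the measure taken here to be a rotation angle in degrees:

```python
def ending_phase(measure, first_threshold=180.0, second_threshold=360.0):
    """Classify progress of the application ending gesture by comparing
    a path measure against the initiation and completion thresholds."""
    if measure < first_threshold:
        return "idle"            # no visible change yet
    if measure < second_threshold:
        return "distorting"      # contort the image in proportion to measure
    return "end_application"     # terminate/minimize and return to home
```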
- FIG. 9 illustrates an embodiment method for implementing the operations for determining whether a traced path is ellipsoidal in shape included in step 1214 of FIG. 8.
- the processor may access path data stored in memory, step 1300, and process the data using known methods to eliminate or interpolate among small path segments (i.e., "smooth" the path), step 1302. Once the small segments have been smoothed, the processor may check whether the smoothed path data includes a series of touch locations Q containing at least a minimum number of points, determination 1304, such as a minimum of five points. In alternative embodiments, the minimum number of points in the stroke point array Q may be 3, 10, or more.
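A minimal sketch of the smoothing and minimum-point check of steps 1302-1304, assuming smoothing simply drops segments shorter than a jitter threshold (the threshold values are illustrative, not from the patent):

```python
import math

MIN_POINTS = 5  # illustrative minimum; the text allows 3, 10, or more

def smooth_path(points, min_segment=4.0):
    """Drop points closer than `min_segment` to the previously kept
    point, eliminating the small jittery segments of step 1302."""
    if not points:
        return []
    kept = [points[0]]
    for p in points[1:]:
        if math.dist(p, kept[-1]) >= min_segment:
            kept.append(p)
    return kept

def has_enough_points(points, minimum=MIN_POINTS):
    """Determination 1304: does the smoothed stroke point array Q
    contain at least the minimum number of points?"""
    return len(points) >= minimum
```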
- the invention and the claims encompass an embodiment in which a clockwise path trace is interpreted as an application minimizing command and a counterclockwise path trace is interpreted as an application terminating command.
- when the processor detects an ellipsoidal path with a clockwise rotation, it calculates a rotation angle; if that angle is equal to or greater than a predetermined maximum rotation angle, the processor minimizes the present application and returns to the home display. When the processor detects an ellipsoidal path with a counterclockwise rotation, it likewise calculates a rotation angle; if that angle is equal to or greater than the predetermined maximum rotation angle, the processor terminates the present application and returns to the home display.
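One common way to obtain the rotation direction this embodiment depends on is the shoelace signed area. A sketch, assuming screen coordinates with y increasing downward (so a positive signed area corresponds to a clockwise trace on screen); the dispatch mirrors the clockwise-minimize / counterclockwise-terminate behavior described above:

```python
def rotation_direction(points):
    """Classify a closed traced path via the shoelace signed area.
    Assumes screen coordinates (y down), where positive area means a
    clockwise trace as seen by the user."""
    area2 = 0.0
    n = len(points)
    for i in range(n):
        x0, y0 = points[i]
        x1, y1 = points[(i + 1) % n]
        area2 += x0 * y1 - x1 * y0
    return "clockwise" if area2 > 0 else "counterclockwise"

def handle_gesture(points, rotation_deg, max_angle=360.0):
    """Dispatch as in the embodiment above: a clockwise ellipse at or
    beyond the maximum rotation angle minimizes the application, a
    counterclockwise one terminates it (both return to home)."""
    if rotation_deg < max_angle:
        return "none"
    return "minimize" if rotation_direction(points) == "clockwise" else "terminate"
```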
- the application ending function associated with a direction of rotation (i.e., clockwise or counterclockwise) may be a user-selectable option, so that the user can determine whether a clockwise rotation results in termination or minimizing of a present application. This may be useful to enable users to configure their computing device according to whether they are left- or right-handed.
- the various embodiments are described in the context of activating an application termination or minimizing function; however, the circular gesture with corresponding image contortions may also be used to activate other types of computer and application functions and operations.
- the gesture may be used to activate a device shutdown operation, initiate a screen saver or battery saver mode, initiate a sleep mode, suspend an application operation (i.e., terminate a present operation of an application without terminating the application itself), or switch to another application.
- the operational functionality of initiating the function when a traced path achieves a minimum length or rotational angle, contorting the display image based upon the length of the path or rotation angle, and completing the function upon achieving a maximum length or rotational angle provides the same user feedback benefits as described above, enabling users to recognize when the corresponding function is about to be activated and to confirm the action by completing the circular motion.
- the path traced on the touch surface may be shapes other than circular or elliptical, and different functions or operations may correspond to different shapes.
- a triangular path shape may correspond to a different function, such as shutting down the device or activating a screen or battery saving mode of operation.
- a square path shape may correspond to a third function, such as switching to another application or operating mode (e.g., activating the telephone mode).
- the methods used to implement these further embodiments are substantially the same as described above with reference to FIGs. 8-10, with the exception that the other path shapes are recognized as being different from the circular or elliptical shape.
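Distinguishing circular, triangular, and square path shapes could be done, for example, by counting sharp direction changes along the smoothed path. In this sketch every threshold and function-name mapping is an illustrative assumption, not taken from the patent:

```python
import math

def count_corners(points, angle_threshold_deg=40.0):
    """Count sharp direction changes along a (smoothed) path: a circle
    has none, a triangle roughly three, a square roughly four. The
    40-degree threshold is illustrative."""
    corners = 0
    for i in range(1, len(points) - 1):
        ax, ay = points[i][0] - points[i - 1][0], points[i][1] - points[i - 1][1]
        bx, by = points[i + 1][0] - points[i][0], points[i + 1][1] - points[i][1]
        turn = abs(math.degrees(math.atan2(by, bx) - math.atan2(ay, ax)))
        if turn > 180:
            turn = 360 - turn  # fold reflex angles back into [0, 180]
        if turn > angle_threshold_deg:
            corners += 1
    return corners

def classify_shape(points):
    """Map corner count to the functions suggested in the text; the
    returned names are placeholders, not the patent's terminology."""
    c = count_corners(points)
    if c == 0:
        return "end_application"         # circular/elliptical path
    if c == 3:
        return "shutdown_or_power_save"  # triangular path
    if c == 4:
        return "switch_application"      # square path
    return "unrecognized"
```

A production recognizer would be more robust (e.g., fitting model shapes), but this illustrates how different path shapes could be routed to different functions.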
- the portable computing devices 100 may include a processor 103 coupled to internal memory 105 and a touch surface input device 101 or display 104.
- the touch surface input device 101 can be any type of touchscreen 102, such as a resistive-sensing touchscreen, capacitive-sensing touchscreen, infrared sensing touchscreen, acoustic/piezoelectric sensing touchscreen or the like.
- the various aspects are not limited to any particular type of touchscreen 102 or touchpad technology.
- the portable computing device 100 may have an antenna 134 for sending and receiving
- Portable computing devices 100 that do not include a touchscreen 102 (but typically include a display 104) typically include a keypad 136 or miniature keyboard and menu selection keys or rocker switches 137 that serve as pointing devices.
- the processor 103 may further be connected to a wired network interface 138, such as a universal serial bus (USB) or FireWire® connector socket, for connecting the processor 103 to an external touchpad, touch surface, or external local area network.
- a touch surface can be provided in areas of the electronic device 100 outside of the touchscreen 102 or display 104.
- the keypad 136 can include a touch surface with buried capacitive touch sensors.
- the keypad 136 may be eliminated so the touchscreen 102 provides the complete GUI.
- a touch surface may be an external touchpad that can be connected to the electronic device 100 by means of a cable to a cable connector 138 or a wireless transceiver (e.g., transceiver 135) coupled to the processor 103.
- the processor 103 may be any programmable microprocessor, microcomputer or multiple processor chip or chips that can be configured by software instructions (applications) to perform a variety of functions, including the functions of the various embodiments described above.
- multiple processors 103 may be provided, such as one processor dedicated to wireless communication functions and one processor dedicated to running other applications.
- the processor may also be included as part of a communication chipset.
- software applications may be stored in the internal memory 105 before they are accessed and loaded into the processor 103.
- the processor 103 may include internal memory sufficient to store the application software instructions.
- the term memory refers to all memory accessible by the processor 103, including internal memory 105 and memory within the processor 103 itself.
- Application data files are typically stored in the memory 105.
- the memory 105 may be a volatile or nonvolatile memory, such as flash memory, or a mixture of both.
- An exemplary storage medium is coupled to a processor such that the processor can read information from, and write information to, the storage medium.
- the storage medium may be integral to the processor.
- the processor and the storage medium may reside in an ASIC.
- the ASIC may reside in a user terminal or computing device.
- the processor and the storage medium may reside as discrete components in a user terminal or computing device.
- the blocks and/or actions of a method or algorithm may reside as one or any combination or set of codes and/or instructions on a machine readable medium and/or computer readable medium, which may be incorporated into a computer program product.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020127008863A KR101445944B1 (en) | 2009-09-07 | 2010-09-07 | User interface methods for ending an application |
CN201080043128.6A CN102576284B (en) | 2009-09-07 | 2010-09-07 | User interface methods for ending an application |
JP2012528119A JP5438221B2 (en) | 2009-09-07 | 2010-09-07 | User interface method to exit the application |
EP10760824.2A EP2476048B1 (en) | 2009-09-07 | 2010-09-07 | User interface methods for ending an application |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/554,973 US8413065B2 (en) | 2009-09-07 | 2009-09-07 | User interface methods for ending an application |
US12/554,973 | 2009-09-07 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011029100A1 true WO2011029100A1 (en) | 2011-03-10 |
Family
ID=43304188
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2010/048008 WO2011029100A1 (en) | 2009-09-07 | 2010-09-07 | User interface methods for ending an application |
Country Status (6)
Country | Link |
---|---|
US (1) | US8413065B2 (en) |
EP (1) | EP2476048B1 (en) |
JP (1) | JP5438221B2 (en) |
KR (1) | KR101445944B1 (en) |
CN (1) | CN102576284B (en) |
WO (1) | WO2011029100A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013228953A (en) * | 2012-04-26 | 2013-11-07 | Kyocera Corp | Device, method, and program |
CN103543920A (en) * | 2012-07-10 | 2014-01-29 | 联想(北京)有限公司 | Method for processing information and electronic device |
Families Citing this family (46)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6610917B2 (en) | 1998-05-15 | 2003-08-26 | Lester F. Ludwig | Activity indication, external source, and processing loop provisions for driven vibrating-element environments |
US20050134578A1 (en) * | 2001-07-13 | 2005-06-23 | Universal Electronics Inc. | System and methods for interacting with a control environment |
US8665272B2 (en) * | 2007-09-26 | 2014-03-04 | Autodesk, Inc. | Navigation system for a 3D virtual scene |
US9019237B2 (en) | 2008-04-06 | 2015-04-28 | Lester F. Ludwig | Multitouch parameter and gesture user interface employing an LED-array tactile sensor that can also operate as a display |
JP4636131B2 (en) * | 2008-07-04 | 2011-02-23 | ソニー株式会社 | Information providing apparatus, information providing method, and program |
US8345014B2 (en) | 2008-07-12 | 2013-01-01 | Lester F. Ludwig | Control of the operating system on a computing device via finger angle using a high dimensional touchpad (HDTP) touch user interface |
US9213426B2 (en) * | 2010-05-26 | 2015-12-15 | Cirque Corporation | Reenable delay of a touchpad or touch screen to prevent erroneous input when typing |
TWI525480B (en) * | 2010-06-14 | 2016-03-11 | Sitronix Technology Corp | Position detection device and detection method |
US9626023B2 (en) | 2010-07-09 | 2017-04-18 | Lester F. Ludwig | LED/OLED array approach to integrated display, lensless-camera, and touch-screen user interface devices and associated processors |
US20120060123A1 (en) * | 2010-09-03 | 2012-03-08 | Hugh Smith | Systems and methods for deterministic control of instant-on mobile devices with touch screens |
US20120159395A1 (en) | 2010-12-20 | 2012-06-21 | Microsoft Corporation | Application-launching interface for multiple modes |
US8689123B2 (en) | 2010-12-23 | 2014-04-01 | Microsoft Corporation | Application reporting in an application-selectable user interface |
US8612874B2 (en) | 2010-12-23 | 2013-12-17 | Microsoft Corporation | Presenting an application change through a tile |
US9223495B2 (en) * | 2011-03-25 | 2015-12-29 | Samsung Electronics Co., Ltd. | System and method for crossing navigation for use in an electronic terminal |
US8893033B2 (en) | 2011-05-27 | 2014-11-18 | Microsoft Corporation | Application notifications |
JP5830935B2 (en) * | 2011-05-27 | 2015-12-09 | ソニー株式会社 | Information processing apparatus, information processing method, and computer program |
US9104307B2 (en) | 2011-05-27 | 2015-08-11 | Microsoft Technology Licensing, Llc | Multi-application environment |
TWI456434B (en) * | 2011-05-31 | 2014-10-11 | Compal Electronics Inc | Electronic apparatus with touch input system |
US9703382B2 (en) * | 2011-08-29 | 2017-07-11 | Kyocera Corporation | Device, method, and storage medium storing program with control for terminating a program |
US20130057587A1 (en) | 2011-09-01 | 2013-03-07 | Microsoft Corporation | Arranging tiles |
US9146670B2 (en) | 2011-09-10 | 2015-09-29 | Microsoft Technology Licensing, Llc | Progressively indicating new content in an application-selectable user interface |
TW201319921A (en) * | 2011-11-07 | 2013-05-16 | Benq Corp | Method for screen control and method for screen display on a touch screen |
US9223472B2 (en) * | 2011-12-22 | 2015-12-29 | Microsoft Technology Licensing, Llc | Closing applications |
TWI528235B (en) * | 2012-02-08 | 2016-04-01 | 緯創資通股份有限公司 | Touch display device and touch method |
CN102622178B (en) * | 2012-03-09 | 2013-06-12 | 游图明 | Touch screen electronic equipment-based method for warping plane image |
CN106133748B (en) | 2012-05-18 | 2020-01-31 | 苹果公司 | Device, method and graphical user interface for manipulating a user interface based on fingerprint sensor input |
CN104321729B (en) * | 2012-06-14 | 2017-11-17 | 宇龙计算机通信科技(深圳)有限公司 | The method of toch control of terminal and terminal |
KR20130143160A (en) * | 2012-06-20 | 2013-12-31 | 삼성전자주식회사 | Apparatus and method for scrolling a information of terminal equipment having touch device |
US20140004942A1 (en) * | 2012-07-02 | 2014-01-02 | Peter Steinau | Methods and systems for providing commands using repeating geometric shapes |
CN103970395A (en) * | 2013-01-30 | 2014-08-06 | 腾讯科技(深圳)有限公司 | Method and device for stopping background programs |
US9507425B2 (en) * | 2013-03-06 | 2016-11-29 | Sony Corporation | Apparatus and method for operating a user interface of a device |
CN108810256B (en) * | 2013-03-11 | 2022-04-19 | 联想(北京)有限公司 | Control method and device |
US9767076B2 (en) * | 2013-03-15 | 2017-09-19 | Google Inc. | Document scale and position optimization |
US9588675B2 (en) | 2013-03-15 | 2017-03-07 | Google Inc. | Document scale and position optimization |
KR102134443B1 (en) | 2013-05-03 | 2020-07-15 | 삼성전자주식회사 | Electronic device and method for manipulation screen of electronic device based control motion |
CN103309618A (en) * | 2013-07-02 | 2013-09-18 | 姜洪明 | Mobile operating system |
US9177362B2 (en) * | 2013-08-02 | 2015-11-03 | Facebook, Inc. | Systems and methods for transforming an image |
CN105359094A (en) | 2014-04-04 | 2016-02-24 | 微软技术许可有限责任公司 | Expandable Application Representation |
CN105359055A (en) | 2014-04-10 | 2016-02-24 | 微软技术许可有限责任公司 | Slider cover for computing device |
KR102107275B1 (en) | 2014-04-10 | 2020-05-06 | 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 | Collapsible shell cover for computing device |
CN106662891B (en) | 2014-10-30 | 2019-10-11 | 微软技术许可有限责任公司 | Multi-configuration input equipment |
JP2016224919A (en) * | 2015-06-01 | 2016-12-28 | キヤノン株式会社 | Data browsing device, data browsing method, and program |
CN105094801B (en) | 2015-06-12 | 2019-12-24 | 阿里巴巴集团控股有限公司 | Application function activation method and device |
US10497464B2 (en) | 2015-10-28 | 2019-12-03 | Samsung Electronics Co., Ltd. | Method and device for in silico prediction of chemical pathway |
US11409410B2 (en) | 2020-09-14 | 2022-08-09 | Apple Inc. | User input interfaces |
CN112817445A (en) * | 2021-01-25 | 2021-05-18 | 暗物智能科技(广州)有限公司 | Information acquisition method and device, electronic equipment and storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5590219A (en) | 1993-09-30 | 1996-12-31 | Apple Computer, Inc. | Method and apparatus for recognizing gestures on a computer system |
US6323846B1 (en) | 1998-01-26 | 2001-11-27 | University Of Delaware | Method and apparatus for integrating manual input |
- US20080168403A1 (en) * | 2007-01-06 | 2008-07-10 | Apple Inc. | Detecting and interpreting real-world and security gestures on touch and hover sensitive devices |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3301079B2 (en) * | 1990-06-18 | 2002-07-15 | ソニー株式会社 | Information input device, information input method, information processing device, and information processing method |
JPH09305300A (en) * | 1996-05-15 | 1997-11-28 | Mitsubishi Electric Corp | Application control unit |
US7290285B2 (en) * | 2000-06-30 | 2007-10-30 | Zinio Systems, Inc. | Systems and methods for distributing and viewing electronic documents |
EP1479065A4 (en) * | 2002-02-26 | 2009-11-11 | Cirque Corp | Touchpad having fine and coarse input resolution |
JP2004133086A (en) * | 2002-10-09 | 2004-04-30 | Seiko Epson Corp | Display apparatus, electronic device and watch |
ATE332528T1 (en) * | 2002-11-20 | 2006-07-15 | Nokia Corp | METHOD AND USER INTERFACE FOR CHARACTER ENTRY |
US7886236B2 (en) * | 2003-03-28 | 2011-02-08 | Microsoft Corporation | Dynamic feedback for gestures |
JP2005190058A (en) * | 2003-12-25 | 2005-07-14 | Fuji Xerox Co Ltd | User interface designing support device, user interface evaluating method, and computer program |
CN101208700A (en) * | 2005-06-27 | 2008-06-25 | 诺沃-诺迪斯克有限公司 | User interface for delivery system providing graphical programming of profile |
JP5239328B2 (en) * | 2007-12-21 | 2013-07-17 | ソニー株式会社 | Information processing apparatus and touch motion recognition method |
CN101482787B (en) * | 2008-01-09 | 2011-11-09 | 宏达国际电子股份有限公司 | Hand-hold electronic device operation method, touch type interface device |
US8286099B2 (en) * | 2008-03-24 | 2012-10-09 | Lenovo (Singapore) Pte. Ltd. | Apparatus, system, and method for rotational graphical user interface navigation |
US20090300554A1 (en) * | 2008-06-03 | 2009-12-03 | Nokia Corporation | Gesture Recognition for Display Zoom Feature |
US8407623B2 (en) * | 2009-06-25 | 2013-03-26 | Apple Inc. | Playback control using a touch interface |
- 2009
- 2009-09-07 US US12/554,973 patent/US8413065B2/en active Active
- 2010
- 2010-09-07 KR KR1020127008863A patent/KR101445944B1/en not_active IP Right Cessation
- 2010-09-07 EP EP10760824.2A patent/EP2476048B1/en not_active Not-in-force
- 2010-09-07 JP JP2012528119A patent/JP5438221B2/en not_active Expired - Fee Related
- 2010-09-07 CN CN201080043128.6A patent/CN102576284B/en not_active Expired - Fee Related
- 2010-09-07 WO PCT/US2010/048008 patent/WO2011029100A1/en active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5590219A (en) | 1993-09-30 | 1996-12-31 | Apple Computer, Inc. | Method and apparatus for recognizing gestures on a computer system |
US6323846B1 (en) | 1998-01-26 | 2001-11-27 | University Of Delaware | Method and apparatus for integrating manual input |
- US20080168403A1 (en) * | 2007-01-06 | 2008-07-10 | Apple Inc. | Detecting and interpreting real-world and security gestures on touch and hover sensitive devices |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013228953A (en) * | 2012-04-26 | 2013-11-07 | Kyocera Corp | Device, method, and program |
CN103543920A (en) * | 2012-07-10 | 2014-01-29 | 联想(北京)有限公司 | Method for processing information and electronic device |
Also Published As
Publication number | Publication date |
---|---|
KR20120059615A (en) | 2012-06-08 |
US20110057953A1 (en) | 2011-03-10 |
EP2476048A1 (en) | 2012-07-18 |
CN102576284A (en) | 2012-07-11 |
JP5438221B2 (en) | 2014-03-12 |
CN102576284B (en) | 2015-05-27 |
KR101445944B1 (en) | 2014-09-29 |
EP2476048B1 (en) | 2017-01-11 |
US8413065B2 (en) | 2013-04-02 |
JP2013504135A (en) | 2013-02-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8413065B2 (en) | User interface methods for ending an application | |
JP6619526B2 (en) | Display interface control method, apparatus and terminal for preventing erroneous operation | |
US8823749B2 (en) | User interface methods providing continuous zoom functionality | |
EP3521994B1 (en) | Method and apparatus for replicating physical key function with soft keys in an electronic device | |
EP3258366B1 (en) | Event recognition | |
EP2710455B1 (en) | Method and apparatus for providing quick access to device functionality | |
WO2013094371A1 (en) | Display control device, display control method, and computer program | |
US20150103013A9 (en) | Electronic Device and Method Using a Touch-Detecting Surface | |
US20100083108A1 (en) | Touch-screen device having soft escape key | |
WO2014200732A1 (en) | Proxy gesture recognizer | |
WO2014047247A1 (en) | Augmented touch control for hand-held devices | |
US20140104170A1 (en) | Method of performing keypad input in a portable terminal and apparatus | |
EP2169521A1 (en) | Touch-screen device having soft escape key | |
AU2021290380B2 (en) | Event recognition | |
CN114721574A (en) | Control method and device for one-hand operation mode, electronic equipment and storage medium | |
CN108983960A (en) | Display methods, intelligent terminal and the storage medium of terminal device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080043128.6 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10760824 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012528119 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 603/MUMNP/2012 Country of ref document: IN |
|
REEP | Request for entry into the european phase |
Ref document number: 2010760824 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010760824 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 20127008863 Country of ref document: KR Kind code of ref document: A |