
US20130113719A1 - Touch-sensitive display method and apparatus - Google Patents

Touch-sensitive display method and apparatus

Info

Publication number
US20130113719A1
Authority
US
United States
Prior art keywords
touch
indicator
information
display
displayed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/339,151
Inventor
Jason Tyler Griffin
Jerome Pasquero
Andrew Mark Earnshaw
Jianfeng Weng
Scott Duncan Inglis
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Malikie Innovations Ltd
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US13/339,151 priority Critical patent/US20130113719A1/en
Assigned to RESEARCH IN MOTION LIMITED reassignment RESEARCH IN MOTION LIMITED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Weng, Jianfeng, INGLIS, Scott Duncan, GRIFFIN, JASON TYLER, EARNSHAW, ANDREW MARK, PASQUERO, JEROME
Publication of US20130113719A1 publication Critical patent/US20130113719A1/en
Assigned to BLACKBERRY LIMITED reassignment BLACKBERRY LIMITED CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: RESEARCH IN MOTION LIMITED
Assigned to MALIKIE INNOVATIONS LIMITED reassignment MALIKIE INNOVATIONS LIMITED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BLACKBERRY LIMITED
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/033Indexing scheme relating to G06F3/033
    • G06F2203/0339Touch strips, e.g. orthogonal touch strips to control cursor movement or scrolling; single touch strip to adjust parameter or to implement a row of soft keys
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04805Virtual magnifying lens, i.e. window or frame movable on top of displayed information to enlarge it for better reading or selection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • the present disclosure relates to electronic devices, including but not limited to, portable electronic devices having touch-sensitive displays and their control.
  • Portable electronic devices include, for example, several types of mobile stations such as simple cellular telephones, smart phones, wireless personal digital assistants (PDAs), and laptop computers with wireless 802.11 or Bluetooth capabilities.
  • Portable electronic devices such as PDAs or smart telephones are generally intended for handheld use and ease of portability. Smaller devices are generally desirable for portability.
  • a touch-sensitive display, also known as a touchscreen display, is particularly useful on handheld devices, which are small and have limited space for user input and output.
  • the information displayed on the touch-sensitive displays may be modified depending on the functions and operations being performed. With continued demand for decreased size of portable electronic devices, touch-sensitive displays continue to decrease in size.
  • FIG. 1 is a block diagram of a portable electronic device in accordance with the disclosure.
  • FIG. 2 , FIG. 3 , FIG. 4 , FIG. 5 , FIG. 7 , FIG. 8 , FIG. 12 , FIG. 13 , and FIG. 14 illustrate examples of displaying touch-sensitive controls on an electronic device in accordance with the disclosure.
  • FIG. 10 illustrates an example of touch-sensitive controls on an electronic device in accordance with the disclosure.
  • FIG. 6 , FIG. 9 , and FIG. 11 are flowcharts illustrating methods of touch-sensitive control on an electronic device in accordance with the disclosure.
  • FIG. 15 is a flowchart illustrating a method of displaying an enlargement of information on an electronic device in accordance with the disclosure.
  • FIG. 16 and FIG. 19 illustrate examples of invoking an enlargement on an electronic device in accordance with the disclosure.
  • FIG. 17 illustrates an example of moving an indicator through information on an electronic device in accordance with the disclosure.
  • FIG. 18 illustrates an example of highlighting of information on an electronic device in accordance with the disclosure.
  • the apparatus may be an electronic device.
  • the electronic device displays information and at least two controls on the touch-sensitive display. A touch associated with the controls results in moving an indicator through the information in at least a first direction and a second direction.
  • the controls do not move with the movement of the indicator.
  • the electronic device detects at least two touches on the touch-sensitive display that overlap at least partially in time.
  • an editing control is displayed on the touch-sensitive display.
  • a virtual keyboard is displayed to replace the display of the editing control.
  • the apparatus may be a portable electronic device that includes a touch-sensitive display.
  • the electronic device displays information and, for example, a virtual keyboard on the touch-sensitive display.
  • the electronic device displays an enlargement of at least part of the information to replace at least part of the information displayed, such as a virtual keyboard.
  • the electronic device moves an indicator in the enlargement in response to detecting a touch on the touch-sensitive display.
  • the disclosure generally relates to an electronic device, such as a portable electronic device or non-portable electronic device.
  • portable electronic devices include mobile, or handheld, wireless communication devices such as pagers, cellular phones, cellular smart-phones, wireless organizers, personal digital assistants, wirelessly enabled notebook computers, tablet computers, mobile internet devices, and so forth.
  • the portable electronic device may be a portable electronic device without wireless communication capabilities, such as handheld electronic games, digital photograph albums, digital cameras, media players, e-book readers, and so forth.
  • Examples of non-portable electronic devices include desktop computers, electronic white boards, smart boards utilized for collaboration, built-in monitors or displays in furniture or appliances, and so forth.
  • A block diagram of an example of a portable electronic device 100 is shown in FIG. 1.
  • the portable electronic device 100 includes multiple components, such as a processor 102 that controls the overall operation of the portable electronic device 100 . Communication functions, including data and voice communications, are performed through a communication subsystem 104 . Data received by the portable electronic device 100 is decompressed and decrypted by a decoder 106 .
  • the communication subsystem 104 receives messages from and sends messages to a wireless network 150 .
  • the wireless network 150 may be any type of wireless network, including, but not limited to, data wireless networks, voice wireless networks, and networks that support both voice and data communications.
  • a power source 142 such as one or more rechargeable batteries or a port to an external power supply, powers the portable electronic device 100 .
  • the processor 102 interacts with other components, such as Random Access Memory (RAM) 108 , memory 110 , a display 112 with a touch-sensitive overlay 114 operably coupled to an electronic controller 116 that together comprise a touch-sensitive display 118 , one or more actuators 120 , one or more force sensors 122 , an auxiliary input/output (I/O) subsystem 124 , a data port 126 , a speaker 128 , a microphone 130 , short-range communications 132 , and other device subsystems 134 .
  • Input via a graphical user interface is provided via the touch-sensitive overlay 114 .
  • the processor 102 interacts with the touch-sensitive overlay 114 via the electronic controller 116 .
  • Information such as text, characters including spaces, symbols, images, icons, and other items that may be displayed or rendered on a portable electronic device, is displayed on the touch-sensitive display 118 via the processor 102 .
  • the processor 102 may interact with an accelerometer 136 that may be utilized to detect direction of gravitational forces or gravity-induced reaction forces.
  • the portable electronic device 100 may utilize a Subscriber Identity Module or a Removable User Identity Module (SIM/RUIM) card 138 for communication with a network, such as the wireless network 150 .
  • user identification information may be programmed into memory 110 .
  • the portable electronic device 100 includes an operating system 146 and software programs, applications, or components 148 that are executed by the processor 102 and are typically stored in a persistent, updatable store such as the memory 110 . Additional applications or programs may be loaded onto the portable electronic device 100 through the wireless network 150 , the auxiliary I/O subsystem 124 , the data port 126 , the short-range communications subsystem 132 , or any other suitable subsystem 134 .
  • a received signal such as a text message, an e-mail message, or web page download is processed by the communication subsystem 104 and input to the processor 102 .
  • the processor 102 processes the received signal for output to the display 112 and/or to the auxiliary I/O subsystem 124 .
  • a subscriber may generate data items, for example e-mail messages, which may be transmitted over the wireless network 150 through the communication subsystem 104 .
  • the speaker 128 outputs audible information converted from electrical signals, and the microphone 130 converts audible information into electrical signals for processing.
  • the touch-sensitive display 118 may be any suitable touch-sensitive display, such as a capacitive, resistive, infrared, surface acoustic wave (SAW) touch-sensitive display, strain gauge, optical imaging, dispersive signal technology, acoustic pulse recognition, and so forth, as known in the art.
  • a capacitive touch-sensitive display includes a capacitive touch-sensitive overlay 114 .
  • the overlay 114 may be an assembly of multiple layers in a stack including, for example, a substrate, a ground shield layer, a barrier layer, one or more capacitive touch sensor layers separated by a substrate or other barrier, and a cover.
  • the capacitive touch sensor layers may comprise any suitable material, such as indium tin oxide (ITO).
  • One or more touches may be detected by the touch-sensitive display 118 .
  • the processor 102 may determine attributes of the touch, including a location of a touch.
  • Touch location data may include data for an area of contact or data for a single point of contact, such as a point at or near a center of the area of contact.
  • the location of a detected touch may include x and y components, e.g., horizontal and vertical components, respectively, with respect to one's view of the touch-sensitive display 118 .
  • the x location component may be determined by a signal generated from one touch sensor, and the y location component may be determined by a signal generated from another touch sensor.
  • a signal is provided to the controller 116 in response to detection of a touch.
  • a touch may be detected from any suitable input member, such as a finger, thumb, appendage, or other objects, for example, a stylus, pen, or other pointer, depending on the nature of the touch-sensitive display 118 . Multiple simultaneous touches may be detected.
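  • As a minimal illustration of the x/y determination described above, the following Kotlin sketch combines hypothetical per-axis sensor signals into a single touch location; the type and function names are assumptions for illustration and are not taken from the patent.

```kotlin
// Hypothetical sketch only: one sensor layer yields the x component,
// another yields the y component, and the controller combines them.
data class TouchLocation(val x: Int, val y: Int)

fun resolveTouchLocation(xSignal: Int?, ySignal: Int?): TouchLocation? {
    // A touch is reported only when both axis signals are present.
    if (xSignal == null || ySignal == null) return null
    return TouchLocation(xSignal, ySignal)
}

fun main() {
    println(resolveTouchLocation(120, 240))  // TouchLocation(x=120, y=240)
    println(resolveTouchLocation(120, null)) // null: no complete touch detected
}
```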
  • the actuator(s) 120 may be depressed or activated by applying sufficient force to the touch-sensitive display 118 to overcome the actuation force of the actuator 120 .
  • the actuator(s) 120 may be actuated by pressing anywhere on the touch-sensitive display 118 .
  • the actuator(s) 120 may provide input to the processor 102 when actuated. Actuation of the actuator(s) 120 may result in provision of tactile feedback.
  • the touch-sensitive display 118 is depressible, pivotable, and/or movable. Such a force may actuate the actuator(s) 120 .
  • the touch-sensitive display 118 may, for example, float with respect to the housing of the portable electronic device, i.e., the touch-sensitive display 118 may not be fastened to the housing.
  • a mechanical dome switch actuator may be utilized.
  • tactile feedback is provided when the dome collapses due to imparted force and when the dome returns to the rest position after release of the switch.
  • the actuator 120 may comprise one or more piezoelectric (piezo) devices that provide tactile feedback for the touch-sensitive display 118 .
  • Optional force sensors 122 may be disposed in conjunction with the touch-sensitive display 118 to determine or react to forces applied to the touch-sensitive display 118 .
  • the force sensor 122 may be disposed in line with a piezo actuator 120 .
  • the force sensors 122 may be force-sensitive resistors, strain gauges, piezoelectric or piezoresistive devices, pressure sensors, quantum tunneling composites, force-sensitive switches, or other suitable devices.
  • force information related to a detected touch may be utilized to select information, such as information associated with a location of a touch.
  • a touch that does not meet a force threshold may highlight a selection option, whereas a touch that meets a force threshold may select or input that selection option.
  • Selection options include, for example, displayed or virtual keys of a keyboard; selection boxes or windows, e.g., “cancel,” “delete,” or “unlock”; function buttons, such as play or stop on a music player; and so forth.
  • Different magnitudes of force may be associated with different functions or input. For example, a lesser force may result in panning, and a higher force may result in zooming.
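  • A hedged Kotlin sketch of the force handling described above: a touch below a selection threshold only highlights a selection option, a touch meeting the threshold inputs it, and distinct force bands map to panning versus zooming. The threshold values and names are assumptions used only for illustration.

```kotlin
// Illustrative force handling; thresholds are arbitrary example values.
enum class TouchAction { HIGHLIGHT_OPTION, SELECT_OPTION, PAN, ZOOM }

const val SELECT_FORCE_THRESHOLD = 2.0  // hypothetical force units
const val ZOOM_FORCE_THRESHOLD = 3.5    // hypothetical force units

fun actionForSelectionTouch(force: Double): TouchAction =
    if (force >= SELECT_FORCE_THRESHOLD) TouchAction.SELECT_OPTION
    else TouchAction.HIGHLIGHT_OPTION

fun actionForContentTouch(force: Double): TouchAction =
    if (force >= ZOOM_FORCE_THRESHOLD) TouchAction.ZOOM else TouchAction.PAN

fun main() {
    println(actionForSelectionTouch(1.2)) // HIGHLIGHT_OPTION
    println(actionForSelectionTouch(2.8)) // SELECT_OPTION
    println(actionForContentTouch(1.2))   // PAN
    println(actionForContentTouch(4.0))   // ZOOM
}
```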
  • the touch-sensitive display 118 includes a display area in which information may be displayed, and a non-display area extending around the periphery of the display area.
  • the display area generally corresponds to the area of the display 112 .
  • Information is not displayed in the non-display area by the display, which non-display area is utilized to accommodate, for example, electronic traces or electrical connections, adhesives or other sealants, and/or protective coatings around the edges of the display area.
  • the non-display area may be referred to as an inactive area and is not part of the physical housing or frame of the electronic device. Typically, no pixels of the display are in the non-display area, thus no image can be displayed by the display 112 in the non-display area.
  • a secondary display not part of the primary display 112 , may be disposed under the non-display area.
  • Touch sensors may be disposed in the non-display area; these touch sensors may be extensions of the touch sensors in the display area or may be distinct or separate from the touch sensors in the display area.
  • a touch, including a gesture, may be associated with the display area, the non-display area, or both areas.
  • the touch sensors may extend across substantially the entire non-display area or may be disposed in only part of the non-display area.
  • a user may choose to manipulate the information. For example, a user may choose to edit information by copying, cutting, deleting, or pasting information, for which moving an indicator through the information, highlighting parts of the information, moving the information, and so forth is advantageous.
  • An indicator includes, for example, a cursor, a blinking character, a colored area, an insertion marker, highlighting, and so forth. Fine control or movement of an indicator through information is facilitated through input such as one or more detected touches associated with one or more controls displayed on a touch-sensitive display, although direct touch on the touch-sensitive display may be unable to provide such fine control due to coarse touch sensitivity.
  • a user may have difficulty touching a position between a first displayed character and a second displayed character on a touch-sensitive display.
  • touch-sensitive controls are displayed to facilitate movement of an indicator through the information, which indicator may indicate a single position within the text or highlight multiple characters of text.
  • One or more controls may be provided to move the indicator in one or more directions through the information. Each control may move the indicator in all possible directions, in one direction, or in a subset of all the possible directions. Each control may provide the same functionality, e.g., moving the indicator in the same direction(s), or the controls may provide different functionality from one another, e.g., moving the indicator in different directions, such as up and down or left and right.
  • the one or more controls may be displayed at or near one or more sides of the electronic device to facilitate use of one or both hands to interact with the electronic device.
  • an editing control is displayed on the touch-sensitive display.
  • the at least two touches at least partially overlap in time.
  • the editing control may be displayed until release of another of the remaining touches is detected, until a menu option or selection option is selected, after a time period of no detected activity, and so forth.
  • a virtual keyboard may be displayed to replace the editing control when the release of the additional one of the two or more touches is detected.
  • the editing control is an individual control or group of controls that provide editing functions.
  • the editing control may include one or more controls for moving an indicator, one or more selection options to facilitate performing cut, copy, delete, and paste functions, one or more selection options to highlight information, and so forth. While typing on a virtual keyboard, multiple input members may be at or near the virtual keyboard displayed on the touch-sensitive display 118 . Because the at least two touches may be at locations associated with the virtual keyboard, the edit controls may be quickly accessed during typing on the virtual keyboard.
  • Touch-sensitive controls are displayed on the touch-sensitive display 118 as shown in the example of FIG. 2 .
  • a first control 202 and a second control 204 are displayed on the touch-sensitive display 118 .
  • the controls 202 , 204 may optionally have a default control direction, such as indicated by a displayed arrow as shown in FIG. 2 .
  • a touch, such as a tap, at a location associated with the controls 202, 204 results in moving the indicator in the default direction associated with the one of the controls 202, 204 with which the touch is associated.
  • an indicator 208 moves to the left by one character through displayed information 210 , in the default direction for the first control 202 .
  • the indicator 208 may move by more than one character, at least one word, at least one sentence, at least one paragraph, at least one page, and so forth.
  • a touch such as a tap at location 206 is associated with the second control 204 and results in moving the indicator 208 to the right through the information 210 by one character, in the default direction for the second control 204 .
  • the touch at the location 206 is a tap that results in the indicator 208 moving from the position shown on the left device 100 to the position shown on the right device 100 .
  • the default direction associated with each of the controls 202 , 204 may be up, down, left, right, up and left, up and right, down and left, down and right, and so forth. Controls may have a default direction but need not have a default direction. Although the default direction associated with the controls 202 , 204 is invoked in the above example with a tap, any type of touch may result in a movement in the default direction, such as a double tap, flick, swipe, hovering or held touch, and so forth. Although two controls 202 , 204 are shown in FIG. 2 , any number of controls may be displayed.
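  • The following Kotlin sketch illustrates, under assumed names, how a tap associated with one of two stationary controls might move the indicator one character in that control's default direction while the controls themselves stay in place.

```kotlin
// Hypothetical model of the stationary controls 202, 204 and the indicator 208.
data class Control(val defaultStep: Int)          // -1 = left, +1 = right

class Editor(private val text: String, var indicator: Int = 0) {
    fun tap(control: Control) {
        // The control does not move; only the indicator position changes.
        indicator = (indicator + control.defaultStep).coerceIn(0, text.length)
    }
}

fun main() {
    val leftControl = Control(defaultStep = -1)
    val rightControl = Control(defaultStep = +1)
    val editor = Editor("The quick brown fox", indicator = 4)

    editor.tap(rightControl)
    println(editor.indicator) // 5: moved one character to the right
    editor.tap(leftControl)
    println(editor.indicator) // 4: moved one character back to the left
}
```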
  • the first control 202 is displayed adjacent to a virtual keyboard 212 and on the left side of the display 118 , which may facilitate easy use or operation by a finger of a left hand.
  • the second control 204 is displayed adjacent to the virtual keyboard 212 and on the right side of the display 118 , which may facilitate easy use by a finger of a right hand.
  • Other locations for the controls may also be successfully implemented, including locations in the non-display area of the touch-sensitive display 118 .
  • the controls 202 , 204 may be at or near a location where a touch was recently detected, at or near a position where information is not currently displayed, at or near an outer edge of the display 118 , away from an edge of the display 118 , and so forth.
  • the controls 202 , 204 may be any shape or design.
  • a touch associated with the controls 202 , 204 may be detected while the controls 202 , 204 are displayed, are displayed with any opaque or translucent or see-through level, are associated with a location in the non-display area, and so forth.
  • the controls 202 , 204 may be temporarily displayed to indicate the location of the controls 202 , 204 , and the controls 202 , 204 may cease to be displayed, e.g., to facilitate the display of other information on the display 118 .
  • indications of the location of the controls 202 , 204 may be utilized, such as when the controls 202 , 204 are located in the non-display area.
  • the controls 202 , 204 are stationary in that they do not move when the indicator 208 moves.
  • the controls 202 , 204 may be displayed in an area outside the area in which the information 210 is displayed, may be displayed in the area in which the information 210 is displayed, may be displayed adjacent to the area in which the information 210 is displayed, may be displayed to replace a part of the information 210 , and so forth.
  • the controls 202 , 204 may optionally be moved to different locations.
  • the controls 202 , 204 may move based on a location of a touch, may move based on a setting specifying a location for the controls 202 , 204 , may move based on movement of the indicator 208 , may move based on the position of information displayed on the display 118 , and so forth.
  • the electronic device 100 detects touches at two locations 302 , 304 , wherein these touches at least partially overlap in time.
  • the first touch location 302 is associated with the first control 202 and the second touch location 304 is associated with the second control 204 .
  • the touches may overlap in time for 200 milliseconds, 0.75 seconds, 1.25 seconds, or any other suitable period of time.
  • highlighting 310 is initiated, e.g., an end of the highlighting is established. Any other action, e.g., cut, copy, delete, paste, may optionally be performed in response to detecting that the touches at locations 302 , 304 overlap in time.
  • Detection of a third touch at a location 308 associated with the second control 204 results in extending the highlighting by one character to the right, as shown on the right device 100 in this example.
  • the end characters of the highlighting 310 may be moved, e.g., ends of the highlighting may be changed, in any direction by one or more touches associated with the first control 202 and/or the second control 204 .
  • a first indicator may be moved in response to detection of a touch associated with the first control 202 and a second indicator may be moved in response to detection of a touch associated with the second control 204 .
  • the highlighting 310 selects a part of the information 210 that is between the first indicator and the second indicator.
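  • A minimal Kotlin sketch, with assumed names, of the highlighting behavior just described: two touches on the two controls that overlap in time establish a highlight anchored at the indicator, and a later tap on one control moves that end of the highlight by one character.

```kotlin
// Illustrative only; the names are not taken from the patent text.
data class Highlight(var start: Int, var end: Int)

class HighlightController(private var indicator: Int) {
    var highlight: Highlight? = null
        private set

    // Called when touches on the first and second controls overlap in time.
    fun onOverlappingTouches() {
        highlight = Highlight(start = indicator, end = indicator)
    }

    // Called when a later tap is associated with the right-hand control.
    fun onTapRightControl() {
        val h = highlight
        if (h != null) h.end += 1 else indicator += 1
    }
}

fun main() {
    val controller = HighlightController(indicator = 10)
    controller.onOverlappingTouches()
    controller.onTapRightControl()
    println(controller.highlight) // Highlight(start=10, end=11)
}
```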
  • the controls 202 , 204 function as virtual joysticks in the examples shown in the figures.
  • functioning as a virtual joystick includes detecting movement of a touch associated with the virtual joystick and moving an indicator in response to the detected movement, where the movement may be in any direction; maintaining a touch at a fixed location continues to move the indicator along the current direction of movement; and other physical joystick-like functionality may be provided.
  • the touch associated with the virtual joystick may move in multiple directions before the touch is released. In the example shown in FIG. 4, when a touch that originates at a location 402 associated with the second control 204 moves upward in the direction associated with the arrow 404, the indicator 208 is moved from the position after "for" shown on the left device 100 to the position after "has" shown on the right device 100.
  • the indicator 208 is generally moved in a direction based on the direction of movement of the touch, e.g., the indicator 208 may be moved in the same direction as the movement of the touch, in the opposite direction as the direction of the touch, e.g., as in an airplane control, and so forth.
  • the indicator 208 may continue to move, for example, as long as the touch continues or continues to move, until the touch is released, or until the touch returns to the original location 402.
  • the touch may move in any direction, including multiple directions, resulting in the indicator 208 being moved in the same direction(s) along with the movement of the touch.
  • the indicator 208 may alternatively move at a constant speed regardless of the distance that the touch moves.
  • the indicator 208 may move at a speed substantially the same as the speed of movement of the touch.
  • the indicator 208 may move a distance based on the distance of the movement of the touch. For example, when the touch moves a distance of the height of two lines of characters, the indicator 208 moves two lines of characters, four lines of characters, or any other proportional movement.
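  • The virtual-joystick behavior above can be sketched as follows in Kotlin; the line height, proportionality factor, and names are assumptions used only to illustrate moving the indicator by a distance based on how far the touch moves.

```kotlin
// Hypothetical joystick mapping: the indicator moves a number of lines
// proportional to the vertical distance the touch has travelled.
const val LINE_HEIGHT_PX = 20           // assumed character line height in pixels
const val LINES_PER_LINE_OF_DRAG = 1    // 1:1 mapping; could be 2:1, and so forth

fun linesToMove(touchStartY: Int, touchCurrentY: Int): Int {
    val draggedLines = (touchStartY - touchCurrentY) / LINE_HEIGHT_PX
    return draggedLines * LINES_PER_LINE_OF_DRAG
}

fun main() {
    // Touch moved upward by the height of two lines: indicator moves up two lines.
    println(linesToMove(touchStartY = 400, touchCurrentY = 360)) // 2
    // Touch moved downward by one line: indicator moves down one line.
    println(linesToMove(touchStartY = 400, touchCurrentY = 420)) // -1
}
```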
  • the controls 202 , 204 may be displayed in accordance with the example of FIG. 5 .
  • a gesture in the area where a keyboard is displayed may be utilized to display the controls 202 , 204 , and a gesture in the opposite direction discontinues display of the controls 202 , 204 .
  • a touch that is a swipe is detected beginning at a location 502 and moves in the direction of the arrow 504 while the controls 202 , 204 are not displayed.
  • the controls 202 , 204 are displayed in response as shown on the right device 100 .
  • a touch that is a swipe is detected beginning at a location 506 and moves in the direction of the arrow 508 while the controls 202 , 204 are displayed in this example, and the controls 202 , 204 are no longer displayed on the display 118 in response to this detection.
  • the display of the controls 202 , 204 may be provided in response to detecting movement such as a swipe in any direction(s). Any type of touch may be detected to provide the display of the controls 202 , 204 . For example, two or more separate touches at any different locations may be detected.
  • the controls 202, 204 may be displayed when a touch associated with the information 210 is detected.
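  • As a hedged sketch of the show/hide gesture described above, the Kotlin below displays the controls on an upward swipe in the keyboard area and hides them again on a downward swipe; the gesture classification is simplified and the names are illustrative assumptions.

```kotlin
// Simplified gesture handling for showing and hiding the controls 202, 204.
enum class SwipeDirection { UP, DOWN }

class ControlVisibility {
    var controlsDisplayed = false
        private set

    fun onSwipeInKeyboardArea(direction: SwipeDirection) {
        when (direction) {
            SwipeDirection.UP -> controlsDisplayed = true    // display the controls
            SwipeDirection.DOWN -> controlsDisplayed = false // discontinue display
        }
    }
}

fun main() {
    val visibility = ControlVisibility()
    visibility.onSwipeInKeyboardArea(SwipeDirection.UP)
    println(visibility.controlsDisplayed)  // true
    visibility.onSwipeInKeyboardArea(SwipeDirection.DOWN)
    println(visibility.controlsDisplayed)  // false
}
```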
  • A flowchart illustrating a method of touch-sensitive control is shown in FIG. 6.
  • the method may be carried out by software executed, for example, by the processor 102 . Coding of software for carrying out such a method is within the scope of a person of ordinary skill in the art given the present description.
  • the method may contain additional or fewer processes than shown and/or described, and may be performed in a different order.
  • Computer-readable code executable by at least one processor of the portable electronic device to perform the method may be stored in a computer-readable medium, such as a non-transitory computer-readable medium.
  • Information is displayed 602 on the touch-sensitive display 118 .
  • An indicator such as described above is optionally displayed within the information.
  • the information 210 may be information input into the portable electronic device 100 or received in a communication by the portable electronic device 100, e.g., an electronic mail message (e-mail), a short message service (SMS) message, a webpage, a document, a calendar event, a contact, and so forth.
  • One or more controls are displayed 604 on the touch-sensitive display 118 .
  • the controls may be, for example, the controls 202 , 204 shown in FIG. 2 , FIG. 3 , FIG. 4 , and FIG. 5 , or any other controls.
  • when a touch associated with one of the displayed controls is detected 606, the indicator 208 is moved 608 through the information in accordance with the touch characteristics, e.g., type and direction.
  • the touch may move in any direction, including multiple directions, resulting in the indicator 208 being moved in the same direction(s) along with the movement of the touch.
  • a detected touch associated with the first control 202 results in moving the indicator 208 through the information 210 in a first direction and a second direction as shown in the example of FIG. 2 .
  • a detected touch associated with the second control 204 may result in moving the indicator 208 through the information 210 in the first direction and the second direction.
  • a first touch associated with the first control 202 and a second touch associated with the second control 204 may result in the indicator 208 moving in the same direction. For example, when a touch associated with the second control 204 , as shown in FIG. 2 , is detected 606 , the indicator is moved 608 in the right direction.
  • When a gesture indicating movement to the right, e.g., a swipe to the right, associated with the first control 202 is detected 606, the indicator 208 is also moved 608 to the right.
  • Example gestures include a swipe, a tap, a double tap, a flick, and so forth. The gesture may indicate movement in any direction.
  • the detection 606 and movement 608 may be repeated any number of times. Although a first direction and a second direction are discussed above, any number of directions may be associated with the controls displayed at 604 .
  • the movement 608 may be up, down, left, right, or any combination of directions.
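  • The steps 602 through 608 can be summarized in a short Kotlin sketch; the loop structure and names are assumptions that only mirror the flow of displaying information and controls, detecting a touch, and moving the indicator accordingly.

```kotlin
// Hypothetical skeleton of the method of FIG. 6.
data class DetectedTouch(val controlId: Int, val dx: Int, val dy: Int)

class IndicatorState(var column: Int = 0, var line: Int = 0)

fun runEditingLoop(touches: Sequence<DetectedTouch>, indicator: IndicatorState) {
    // 602: information displayed; 604: controls displayed (not modeled here).
    for (touch in touches) {              // 606: touch associated with a control
        indicator.column += touch.dx      // 608: move the indicator per touch direction
        indicator.line += touch.dy
    }
}

fun main() {
    val indicator = IndicatorState(column = 3, line = 1)
    runEditingLoop(sequenceOf(DetectedTouch(1, dx = 1, dy = 0),
                              DetectedTouch(2, dx = 0, dy = -1)), indicator)
    println("column=${indicator.column}, line=${indicator.line}") // column=4, line=0
}
```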
  • Touch-sensitive controls are displayed to facilitate the movement of an indicator through information displayed on a touch-sensitive display of an electronic device.
  • the touch-sensitive controls facilitate fine control of movement of the indicator, which is advantageous when an input device, such as a touch-sensitive display, has limited or coarse sensitivity, such as limited ability to locate a touch at a specific point on a display.
  • Multiple controls for moving the indicator in the same directions or different directions may be displayed to facilitate the detection of touches that do not overlap in time and touches that at least partially overlap in time. For example, a detected touch associated with a first control results in moving the indicator in four directions and a detected touch associated with a second control may result in moving the indicator in the same four directions.
  • a touch associated with the first control may result in movement of the indicator in one direction, e.g., the up direction, and a touch associated with the second control may result in movement of the indicator in another direction, e.g., to the right.
  • the touch may be a tap, a flick in a direction, touch and movement associated with a virtual joystick, a gesture in a direction, multiple touches, and so forth.
  • the movements may be performed substantially simultaneously, e.g., movement up and to the right. Touches on both controls that overlap in time may also result in other events and/or actions such as initiation of highlighting, selection, and so forth.
  • Although the examples described in connection with FIG. 2, FIG. 3, FIG. 4, FIG. 5, and FIG. 6 include touches associated with a particular one of the controls 202, 204 and a resulting action, the touch and the resulting action may alternatively be associated with the other of the controls 202, 204 or any other control.
  • Although the method of FIG. 6 is described with reference to FIG. 2, FIG. 3, FIG. 4, and FIG. 5, the method is applicable to any electronic device.
  • An alternative editing control is displayed on the touch-sensitive display 118 of the electronic device 100 as shown in the example of FIG. 7 .
  • a touch held at one location 702, referred to as a hold touch, and a touch detected and released at a second location 704, referred to as a release touch, are detected in an area associated with a virtual keyboard 212.
  • the hold touch and the release touch overlap partially in time, and the release touch ends prior to the hold touch.
  • the hold touch and the release touch may be associated with any other area of the touch-sensitive display 118 .
  • the hold touch and the release touch may be associated with physical buttons or keys such as keys of a physical keyboard or keys along the housing of the device 100 .
  • the locations 702 , 704 of the touches may be at any locations on the touch-sensitive display 118 , e.g., both may be located in an area in which the virtual keyboard 212 is displayed, both may be located in an area in which information is displayed, one may be located in the area of the keyboard 212 and another in the area in which the information is displayed, and so forth.
  • an editing control 706 including a highlight selection option 708 is displayed.
  • the editing control 706 is displayed at or near the end location 704 of the release touch, which is on a right side of the display 118 in the example shown on the right device 100 of FIG. 7 .
  • the editing control 706 may be displayed in any location such as at or near a left side of the display 118 when the release touch has an end location on the left side of the display 118 , may be displayed at or near a center of the touch-sensitive display 118 , may be displayed in an area in which the virtual keyboard 212 was previously displayed, may be displayed at any location associated with an end location for a touch, at any location associated with an origin location of a touch, at or near a location opposite an origin location or end location of a touch, and so forth.
  • the editing control 706 may be in an editing window, an editing area, and so forth.
  • the editing control 706 shown in the example of FIG. 7 includes selection options for the editing functions CUT, COPY, PASTE, and DELETE.
  • the editing control may include more or fewer selection options, controls, and so forth. Additional selection options include, for example, END, HOME, PAGE UP, PAGE DOWN, SEND, GO, and SPELL.
  • An indicator 714 is also displayed within the information. The indicator 714 may be displayed whenever the information is displayed, only when the editing control 706 is displayed, and so forth.
  • the editing control 706 also includes selection options or controls designated with arrows for up, down, left, and right directions relative to information displayed or the device 100 .
  • a detected touch, e.g., a tap, associated with one of the directional selection options results in moving an indicator one character through information in the direction associated with that directional selection option.
  • a gesture associated with a directional selection option results in moving the indicator multiple characters through the information in the direction associated with the directional selection option. For example, when a swipe associated with a right directional selection option is detected, the indicator is moved from a first word to the start of a second word to the right of the first word. The indicator may be moved through one or more characters of the first word and through a space to the start of the second word.
  • a swipe associated with an up directional selection option results in moving the indicator to a start of a paragraph within which the indicator is located.
  • Any other type of gesture may be detected. Any other action may be associated with a detected gesture.
  • a touch associated with a directional selection option results in moving the indicator a single character and a gesture associated with the directional selection option results in moving the indicator multiple characters.
  • a touch associated with a directional selection option results in moving the indicator multiple characters and a gesture associated with the directional selection option results in moving the indicator a single character.
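  • As an illustration of the tap-versus-gesture distinction above, the Kotlin sketch below moves the indicator by one character for a tap and to the start of the next word for a swipe on the right directional selection option; the tokenization is simplified and the names are assumptions.

```kotlin
// Hypothetical directional handling for the right arrow of the editing control.
fun moveRightOneCharacter(text: String, indicator: Int): Int =
    minOf(indicator + 1, text.length)

fun moveRightToNextWord(text: String, indicator: Int): Int {
    val nextSpace = text.indexOf(' ', indicator)
    // Skip over the space to the start of the following word, if any.
    return if (nextSpace == -1) text.length else nextSpace + 1
}

fun main() {
    val text = "portable electronic device"
    println(moveRightOneCharacter(text, 3)) // 4: a tap moves a single character
    println(moveRightToNextWord(text, 3))   // 9: a swipe moves to the start of "electronic"
}
```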
  • the editing control 706 may be a toggle button, a switch, a drop-down menu, a virtual trackpad, a virtual joystick, a virtual directional pad (D-pad), any combination of the foregoing, and so forth.
  • a detected touch associated with the highlight selection option 708 is detected to initiate and end highlighting of information in this example.
  • an end point for highlighting is initiated at the position of the indicator 714 in the information and is moved as a result of a detected touch associated with an editing control for moving the indicator, e.g., a directional selection option.
  • a subsequent selection of the highlight selection option 708 may result in initiating a different (another) end point for the highlighting.
  • the highlighting may remain while the highlight selection option 708 is selected, e.g., while a touch is detected, until a second touch associated with the highlight selection option 708 is detected, and so forth.
  • the highlighting may end when the highlighting selection option 708 is not selected, e.g., when a touch is released, when the highlight selection option 708 is selected a second time to toggle the highlight selection option 708 , and so forth.
  • the highlighting may remain when the highlight selection option 708 is not selected, e.g., the highlighting may remain until another editing control is selected, until the editing control is no longer displayed, and so forth.
  • the highlight selection option 708 may behave similarly to the SHIFT key on a keyboard. This process may be utilized to select or change an end point for the highlighting.
  • the highlight selection option 708 may be displayed at or near the location 704 of the release touch to facilitate easy selection of the highlight selection option by the input member of the release touch.
  • the highlight selection option 708 is optionally displayed at or near the location 702 of the hold touch to facilitate easy selection of the highlight selection option 708 by the input member of the hold touch.
  • the input member of the hold touch may move along the display 118 to the highlight selection option 708 to select the highlight selection option 708 .
  • the highlight selection option 708 may be displayed in any other location on the display 118 .
  • Although the highlight selection option 708 shown in FIG. 7 is displayed as a key or button, highlighting may alternatively be engaged or disengaged by a drop-down menu, a physical key, a radio button, a slider, an option button, a menu, a checkbox, and so forth.
  • the selection options 710 , 712 in the example of FIG. 7 may optionally toggle between display of the virtual keyboard 212 and display of the editing control 706 , may toggle display of other selection options or controls, and so forth.
  • the EDIT selection option 710 initiates display of the editing control 706 as shown on the right device 100 of FIG. 7 .
  • the KEYBOARD selection option 712 may initiate display of the virtual keyboard 212 as shown on the left device 100 .
  • the display controls 710 , 712 may be selected as an alternative to using the hold touch and the release touch to toggle display of the editing control.
  • the selection options 710 , 712 may be associated with the hold touch and the release touch. For example, after detection of the hold touch and the release touch results in the display of the editing control 706 , a touch associated with the display selection option 712 results in display of the virtual keyboard 212 to replace the display of the editing control 706 .
  • Release of the hold touch results in display of the virtual keyboard 212 instead of the editing control 706 .
  • the editing control 706 may be displayed while the hold touch is detected until the hold touch is released, and thus is no longer detected.
  • a touch associated with the KEYBOARD selection option 712 may result in display of the virtual keyboard instead of the editing control 706 .
  • the hold touch may optionally move to any location on the touch-sensitive display 118 . Release of the hold touch may alternatively result in display of any other selection options or controls.
  • A flowchart illustrating a method of touch-sensitive control including displaying an editing control on the touch-sensitive display 118 is shown in FIG. 9.
  • the method may be carried out by software executed, for example, by the processor 102 . Coding of software for carrying out such a method is within the scope of a person of ordinary skill in the art given the present description.
  • the method may contain additional or fewer processes than shown and/or described, and may be performed in a different order.
  • Computer-readable code executable by at least one processor of the portable electronic device to perform the method may be stored in a computer-readable medium, such as a non-transitory computer-readable medium.
  • the hold touch may be the hold touch at location 702 and the release touch may be the release touch at location 704 of the examples of FIG. 7 and FIG. 8 .
  • the hold touch and the release touch may be any other touch or gesture.
  • the electronic device 100 may determine that the hold touch and the release touch at least partially overlap in time, e.g., for 0.5 seconds, 1 second, or any other suitable time. Further, determining that the hold touch and the release touch at least partially overlap in time may include detecting that the hold touch and the release touch overlap in time for at least a time value.
  • the time value may be any threshold amount of time that may be established by a manufacturer of the electronic device 100 , by a programmer of the electronic device 100 , by a user of the electronic device 100 , and so forth. Detecting that the hold touch and the release touch overlap in time for at least a time value may prevent detection of two or more touches that are not intended to initiate display of the editing control, e.g., multiple touches that are intended to select keys of the virtual keyboard, inadvertent touches, and so forth.
  • the editing control displayed at 906 may be the editing control 706 and the highlight control 708 of the examples of FIG. 7 and FIG. 8 .
  • the editing control may include any number and type of controls, such as one or more selection options for editing information, one or more controls for inputting information, e.g., a SPACE key, a keyboard key, and so forth, one or more controls for moving an indicator through information, and so forth.
  • when release of the hold touch is detected, display of the editing control is discontinued 910.
  • the display of the editing control 706 and the highlight control 708 is replaced by display of the virtual keyboard 212 .
  • the display of the editing control may be replaced by any display such as display of information, display of a background, display of one or more controls, and so forth.
  • any other display or action may be initiated by the combination of the hold touch and the release touch.
  • the first control 202 and the second control 204 as shown in FIG. 2 are displayed at 906 in response to detection at 902 , 904 of the combination of the hold touch and the release touch.
  • Although the method of FIG. 9 is described with reference to FIG. 7 and FIG. 8, the method is applicable to any other electronic device.
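  • A hedged Kotlin sketch of the hold-touch/release-touch sequence of FIG. 9: the editing control is shown only when the two touches overlap in time for at least a threshold, and the virtual keyboard is shown otherwise or once the hold touch is released. The threshold value and names are assumptions for illustration.

```kotlin
// Hypothetical model of detecting the hold touch and the release touch.
const val OVERLAP_THRESHOLD_MS = 500L   // example value; configurable in practice

enum class DisplayedPanel { KEYBOARD, EDITING_CONTROL }

fun panelAfterTouches(
    holdStart: Long, holdEnd: Long,       // hold touch interval (ms)
    releaseStart: Long, releaseEnd: Long  // release touch interval (ms)
): DisplayedPanel {
    val overlap = minOf(holdEnd, releaseEnd) - maxOf(holdStart, releaseStart)
    // Both touches detected and overlapping long enough: show the editing control.
    return if (releaseEnd < holdEnd && overlap >= OVERLAP_THRESHOLD_MS)
        DisplayedPanel.EDITING_CONTROL
    else
        DisplayedPanel.KEYBOARD           // otherwise the keyboard remains displayed
}

fun main() {
    // Release touch ends 600 ms after both touches began: editing control displayed.
    println(panelAfterTouches(0, 2_000, 100, 700))   // EDITING_CONTROL
    // Touches overlap only 50 ms: treated as ordinary typing, keyboard stays.
    println(panelAfterTouches(0, 2_000, 100, 150))   // KEYBOARD
}
```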
  • the electronic device 100 is a tablet 1000 .
  • the tablet 1000 includes a touch-sensitive display 1002 that includes a non-display area 1004 and a display area 1006 .
  • the tablet 1000 includes a first control 1008 and a second control 1010 positioned in the non-display area 1004. While the controls 1008, 1010 are shown in FIG. 10, the controls 1008, 1010 are not displayed because they are associated with the non-display area 1004. In this example, touch sensors are disposed in the non-display area 1004.
  • the controls 1008 , 1010 may operate in the same manner as the controls 202 , 204 of FIG. 2 , FIG. 3 , FIG. 4 , and FIG. 5 . For example, a touch associated with the first control 1008 moves the indicator 1012 .
  • a marker may be displayed in the display area 1006 of the touch-sensitive display 1002 to indicate the positions of the controls 1008, 1010 in the non-display area 1004.
  • the touch-sensitive display 1002 may display the marker at or near the border of the display area 1006 adjacent to the positions of the controls 1008, 1010.
  • the marker may be a line, a symbol, an icon, a bar, an arrow, and so forth.
  • a light emitting diode or other small visual indicator may be disposed under the non-display area 1004 to indicate the control location.
  • the areas associated with the controls 1008 , 1010 may be anywhere in the non-display area 1004 , for example, next to the display area 1006 .
  • FIG. 10 also illustrates optional locations for the controls 1008 , 1010 in the non-display area 1004 .
  • the control 1014 is at or near the center of a side of the non-display area 1004 .
  • One or more controls may be positioned at or near the center of any of the other sides of the non-display area 1004 .
  • Control 1016 is positioned at a corner of the non-display area 1004 .
  • One or more controls may be positioned at any other corner of the non-display area 1004 .
  • Control 1018 extends along substantially an entire length of a side of the non-display area 1004 .
  • One or more controls may extend along other sides of the non-display area.
  • a control may comprise a substantial area of the non-display area 1004 to facilitate selection of the control, e.g., substantially an entire side of the non-display area 1004 , an area larger than an area encompassed by a touch, and so forth.
  • a control may be associated with an area at or near a location of a touch detected in the non-display area.
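  • The non-display-area controls can be modeled as hit regions outside the display area, as in the Kotlin sketch below; the coordinates and region names are illustrative assumptions.

```kotlin
// Hypothetical hit-testing of touches in the non-display area of the tablet 1000.
data class Region(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun contains(x: Int, y: Int) = x in left..right && y in top..bottom
}

// Example regions: one near a corner, one along an entire side strip.
val cornerControl = Region(left = 0, top = 0, right = 40, bottom = 40)
val sideStripControl = Region(left = 0, top = 40, right = 40, bottom = 760)

fun controlForTouch(x: Int, y: Int): String? = when {
    cornerControl.contains(x, y) -> "corner control"
    sideStripControl.contains(x, y) -> "side strip control"
    else -> null  // touch is not associated with a non-display-area control
}

fun main() {
    println(controlForTouch(10, 10))   // corner control
    println(controlForTouch(20, 300))  // side strip control
    println(controlForTouch(500, 300)) // null: inside the display area
}
```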
  • A flowchart illustrating a method of touch-sensitive control is shown in FIG. 11.
  • the method may be carried out by software executed, for example, by the processor 102 . Coding of software for carrying out such a method is within the scope of a person of ordinary skill in the art given the present description.
  • the method may contain additional or fewer processes than shown and/or described, and may be performed in a different order.
  • Computer-readable code executable by at least one processor of the portable electronic device to perform the method may be stored in a computer-readable medium, such as a non-transitory computer-readable medium.
  • Information is displayed 1102 in the display area 1006 of the touch-sensitive display 1002 .
  • An indicator such as described above is optionally displayed within the information.
  • the information may be information input into the tablet 1000 or received in a communication by the tablet 1000 , e.g., an electronic mail message (e-mail), a short message service (SMS) message, a webpage, a document, a calendar event, a contact, and so forth.
  • One or more controls are associated 1104 with areas of the non-display area 1004 of the touch-sensitive display 1002 .
  • the controls may be, for example, the controls 1008 , 1010 shown in FIG. 10 , which may be substantially similar to the controls 202 , 204 shown in FIG. 2 , FIG. 3 , FIG. 4 , and FIG. 5 , or any other controls.
  • when a touch associated with one of the controls is detected 1106, the indicator 1012 is moved 1108 through the information in accordance with the touch characteristics, e.g., type and direction.
  • the touch may move in any direction, including multiple directions, resulting in the indicator 1012 being moved in the same direction(s) along with the movement of the touch.
  • a first touch associated with the first control 1008 and a second touch associated with the second control 1010 may result in the indicator 1012 moving in the same direction. For example, when a touch associated with the second control 1010 , is detected 1106 , the indicator is moved 1108 in the right direction. When a gesture indicating movement to the right, e.g., a swipe to the right, associated with the first control 1008 is detected 1106 , the indicator 1012 is also moved 1108 to the right.
  • the detection 1106 and movement 1108 may be repeated any number of times. Although a first direction and a second direction are discussed above, any number of directions may be associated with the controls 1008 , 1010 .
  • the movement 1108 may be up, down, left, right, or any combination of directions.
  • the editing control 706 includes a virtual trackpad 1202 .
  • the example editing control 706 also includes selection options to facilitate performing cut, copy, paste, and select functions. Additional selection options may be included in the editing control 706 .
  • the editing control 706 may include selection options for delete, page up, page down, and so forth.
  • the editing control 706 including the virtual trackpad 1202 is displayed in response to detecting the hold touch at hold touch location 702 and detecting release of the release touch at release touch location 704. When release of the hold touch is detected, display of the editing control 706 including the virtual trackpad 1202 is discontinued. Alternatively, any other touch, gesture, instruction, and so forth may result in display of the editing control 706 including the virtual trackpad 1202.
  • the editing control 706 may be displayed in response to detecting a touch associated with the selection option 710 and display of the editing control 706 may be discontinued in response to detecting a touch associated with selection option 712 .
  • the display of the editing control 706 may be discontinued in response to detecting a touch in an area not associated with the editing control 706 , e.g., a touch in an area not associated with the virtual trackpad 1202 and not associated with the selection options.
  • the display of the editing control 706 may be discontinued in response to detecting a double tap associated with the virtual trackpad 1202 .
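  • The display lifecycle described above can be pictured as a small state machine: the editing control appears when the release touch lifts while the hold touch is still detected, and its display is discontinued when the hold touch lifts, when a touch outside the control is detected, or when a double tap on the virtual trackpad is detected. The Kotlin sketch below is a minimal illustration under those assumptions, not the device's implementation.

```kotlin
// Illustrative state machine for showing and hiding the editing control
// (a sketch of the behavior described above, not the patent's implementation).

class EditingControlState {
    var holdTouchDown = false
        private set
    var releaseTouchDown = false
        private set
    var editingControlVisible = false
        private set

    fun onTouchDown(isHoldTouch: Boolean) {
        if (isHoldTouch) holdTouchDown = true else releaseTouchDown = true
    }

    // The editing control (including the virtual trackpad) is displayed when the
    // release touch lifts while the hold touch is still being detected.
    fun onReleaseTouchUp() {
        releaseTouchDown = false
        if (holdTouchDown) editingControlVisible = true
    }

    // Display of the editing control is discontinued when the hold touch lifts.
    fun onHoldTouchUp() {
        holdTouchDown = false
        editingControlVisible = false
    }

    // Alternative dismissals: a touch outside the editing control, or a double tap
    // on the virtual trackpad, also discontinue the display.
    fun onOutsideTouch() { editingControlVisible = false }
    fun onTrackpadDoubleTap() { editingControlVisible = false }
}

fun main() {
    val state = EditingControlState()
    state.onTouchDown(isHoldTouch = true)   // hold touch begins
    state.onTouchDown(isHoldTouch = false)  // release touch begins (overlaps in time)
    state.onReleaseTouchUp()                // release touch lifts -> control shown
    println(state.editingControlVisible)    // true
    state.onHoldTouchUp()                   // hold touch lifts -> control hidden
    println(state.editingControlVisible)    // false
}
```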
  • the virtual trackpad 1202 is displayed as a border surrounding an area in which touches associated with the virtual trackpad 1202 are detected. Alternatively, any information that identifies the area of the touch-sensitive display 118 associated with the virtual trackpad 1202 may be displayed.
  • the virtual trackpad 1202 and any of the selection options or controls of the editing control 706 may be overlaid over the virtual keyboard 212 such that some or all of the virtual keyboard 212 remains visible. Alternatively, display of the editing control 706 advantageously replaces the display of the virtual keyboard 212 to increase the amount of the touch-sensitive display 118 available for display of information, e.g., when the editing control 706 is displayed on a portable electronic device.
  • the example editing control 706 replaces the display of the virtual keyboard 212 and is displayed within the same dimensions as the virtual keyboard 212.
  • the editing control 706 including the virtual trackpad 1202 may be displayed in any suitable size.
  • When a touch associated with the virtual trackpad 1202 is detected as a swipe, the indicator 714 is moved in the direction of the swipe. The indicator 714 is moved a distance based on the distance of the swipe. Alternatively, the indicator 714 may be moved a distance that is not based on the distance of the swipe. For example, the indicator may move by one character or other unit for each detected swipe. Any other touch or gesture associated with the virtual trackpad 1202 may be detected and any other action may be performed in response to a touch or gesture associated with the virtual trackpad 1202.
  • a touch at or near a side of the virtual trackpad 1202 may result in moving the indicator 714 in a direction of the associated side, e.g., a touch at or near the top of the virtual trackpad 1202 may result in moving the indicator 714 up, a touch at or near the left side of the virtual trackpad 1202 may result in moving the indicator 714 to the left.
  • a touch at or near the center or a corner of the virtual trackpad 1202 may engage and disengage selection.
  • a first touch associated with the virtual trackpad 1202 may be detected to move the indicator 714 to a first position in displayed information.
  • a touch associated with the selection option labeled “select” may be detected to initiate highlighting.
  • a second touch associated with the virtual trackpad 1202 may be detected to move the second indicator to a second position in the displayed information. The displayed information between the indicator 714 and the second indicator is highlighted.
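  • The move, select, and highlight sequence just described can be modeled as follows. This Kotlin sketch is illustrative only; positions are character offsets, and the one-character-per-unit swipe mapping is an assumption rather than a required behavior.

```kotlin
// Minimal model of the trackpad-driven selection flow described above
// (illustrative only; positions are character offsets into the displayed text).

class TrackpadSelection(private val text: String) {
    var indicator = 0                // position moved by trackpad swipes
        private set
    private var anchor: Int? = null  // first end point, set when "select" is tapped

    // A swipe on the virtual trackpad moves the indicator; here one character per
    // unit of swipe distance, though the mapping need not be distance based.
    fun onSwipe(deltaChars: Int) {
        indicator = (indicator + deltaChars).coerceIn(0, text.length)
    }

    // Tapping the "select" option initiates highlighting from the current position.
    fun onSelectTapped() { anchor = indicator }

    // The highlighted span is the text between the anchor and the current indicator.
    fun highlighted(): String {
        val a = anchor ?: return ""
        val (start, end) = if (a <= indicator) a to indicator else indicator to a
        return text.substring(start, end)
    }
}

fun main() {
    val sel = TrackpadSelection("Meet me for lunch")
    sel.onSwipe(5)          // move indicator to offset 5
    sel.onSelectTapped()    // establish the first end point
    sel.onSwipe(3)          // move the second end point to offset 8
    println(sel.highlighted()) // prints "me " (the word plus trailing space)
}
```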
  • the virtual trackpad 1202 may be displayed in conjunction with display of an enlargement of information 1302 .
  • the enlargement of information 1302 includes an enlarged display of information as described in further detail below.
  • a second virtual trackpad may be displayed instead of the enlargement, or a second enlargement may be displayed instead of the virtual trackpad.
  • a touch associated with the first virtual trackpad or first enlargement results in moving a first end point of highlighting or a first indicator
  • a touch associated with the second virtual trackpad or second enlargement of information results in moving a second end point of highlighting or a second indicator.
  • one or more enlargements and/or virtual trackpads 1404 are displayed between a left side 1402 of a virtual keyboard and a right side 1406 of the virtual keyboard.
  • the enlargement includes an enlarged display of information as described in further detail below.
  • the virtual trackpad 1404 may be implemented as described in conjunction with the virtual trackpad 1202 .
  • the one or more enlargements and/or virtual trackpads 1404 may be an enlargement and a trackpad in any order, two virtual trackpads, or two enlargements. Each enlargement or virtual trackpad may control a different end point of highlighting or a different indicator.
  • highlighting may be controlled by receiving input to move and establish a first end point of the highlighting and receiving input to move and establish a second end point of the highlighting.
  • input may be received to simultaneously or substantially simultaneously move two end points of the highlighting, e.g., input associated with a first selection option or control may result in moving a first end point of the highlighting and input associated with a second selection option or control may result in moving a second end point of the highlighting.
  • a first end point of highlighting may be fixed, e.g., a first end point may be fixed at a location of an indicator when highlighting is initiated, and input to move and establish a second end point may be received.
  • input results in selecting an end point of highlighting for moving.
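  • One possible model of the end-point behavior listed above is a highlight with two independently movable ends, where input first selects an end point and then moves it. The Kotlin sketch below illustrates that model; the names and the character-offset representation are assumptions.

```kotlin
// Illustrative model of highlighting with two independently movable end points.

data class Highlight(var first: Int, var second: Int) {
    enum class EndPoint { FIRST, SECOND }

    // Move the chosen end point by a signed number of characters, clamped to the text.
    fun move(endPoint: EndPoint, deltaChars: Int, textLength: Int) {
        when (endPoint) {
            EndPoint.FIRST -> first = (first + deltaChars).coerceIn(0, textLength)
            EndPoint.SECOND -> second = (second + deltaChars).coerceIn(0, textLength)
        }
    }

    // The highlighted range is always reported with its ends in order.
    fun range(): IntRange = minOf(first, second) until maxOf(first, second)
}

fun main() {
    val text = "Dinner at eight tonight"
    // First end point fixed where highlighting was initiated; second end point moves.
    val h = Highlight(first = 10, second = 10)
    h.move(Highlight.EndPoint.SECOND, 5, text.length)   // extend to the right
    println(text.substring(h.range().first, h.range().last + 1)) // "eight"
    // Either end point may also be moved, e.g., from two separate controls.
    h.move(Highlight.EndPoint.FIRST, -3, text.length)
    println(h.range()) // 7..14
}
```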
  • a density of touch sensors may be uniform or may vary throughout the touch-sensitive display 118 .
  • the density of the touch sensors may vary between display area(s) and non-display area(s).
  • the density of the touch sensors may be greater in areas where editing controls are provided, e.g., the virtual trackpad 1202 ; the controls 202 , 204 , 1008 , 1010 , 1014 , 1016 , and 1018 ; the editing control 706 ; and so forth.
  • the touch sensors may be disposed in only part(s) of the touch-sensitive display 118 .
  • the touch sensors may be disposed at or near a location where the display area meets the non-display area of the touch-sensitive display 118 .
  • a touch-sensitive editing control is displayed to facilitate the movement of an indicator through information displayed on a touch-sensitive display of an electronic device.
  • the touch-sensitive controls are displayed when two touches that at least partially overlap in time are detected and release of one of the touches is detected.
  • the display of the touch-sensitive controls is replaced when release of the other one of the touches is detected.
  • the combination of touches, e.g., the two touches followed by release of a first touch and later release of a second touch, facilitates easier access to the editing control and easier return to a previous display, e.g., a virtual keyboard.
  • An electronic device comprises a touch-sensitive display and a processor coupled to the touch-sensitive display and configured to display information on the touch-sensitive display, display a first control, wherein a touch associated with the first control results in moving an indicator through the information in a first direction and in a second direction, wherein the first control does not move with movement of the indicator, display a second control, wherein a touch associated with the second control results in moving the indicator through the information in the first direction and the second direction, wherein the second control does not move with movement of the indicator, detect a first touch associated with the first control, in response to the detecting, move the indicator in the first direction, and in response to detecting a second touch associated with the second control, move the indicator in the first direction.
  • a method comprises displaying information on a touch-sensitive display of an electronic device, displaying a first control, wherein a touch associated with the first control results in moving an indicator through the information in a first direction and in a second direction, wherein the first control does not move with movement of the indicator, displaying a second control, wherein a touch associated with the second control results in moving the indicator through the information in the first direction and in the second direction, wherein the second control does not move with movement of the indicator, detecting a first touch associated with the first control, in response to the detecting, moving the indicator in the first direction, and in response to detecting a second touch associated with the second control, moving the indicator in the first direction.
  • the method may also comprise initiating highlighting of the information in response to detecting the first touch and the second touch.
  • An electronic device comprises a touch-sensitive display and a processor coupled to the touch-sensitive display and configured to detect a hold touch and a release touch on the touch-sensitive display, wherein the hold touch and the release touch overlap at least partially in time, detect release of the release touch, and in response to detecting the release of the release touch, display an editing control while the hold touch is detected.
  • the editing control may include a highlight control for identifying one or more end points in displayed information.
  • a method comprises detecting a hold touch and a release touch on a touch-sensitive display of an electronic device, wherein the hold touch and the release touch overlap at least partially in time, detecting release of the release touch, and in response to detecting the release of the release touch, displaying an editing control while the hold touch is detected.
  • the method may comprise determining that the hold touch and the release touch overlap in time for at least a first time value.
  • the method may include moving an indicator from a first word to a second word in response to detecting a gesture associated with the editing control.
  • a user may manipulate the information, e.g., make changes, move, cut, copy, paste, delete, and perform other functions with the information. For example, a user may edit information by moving an indicator within the information.
  • An indicator includes a cursor, a marker, a blinking character, a pointer, highlighting, and so forth. Editing the information may be difficult when the information is displayed in a small size.
  • portable electronic devices typically include small displays. Coarse input resolution of an input device, such as coarse sensor resolution of a touch-sensitive display, may cause difficulty in performing fine selection or movement of an indicator within information displayed in a small size.
  • a user may have difficulty positioning or moving a cursor because accurately touching a position between two characters is difficult.
  • at least part of the information is enlarged or magnified, also referred to as zooming, and displayed on the touch-sensitive display 118 .
  • the enlargement may replace at least part of other displayed information, such as a virtual keyboard, virtual keys, controls, or other information, that is displayed on the touch-sensitive display 118 .
  • A flowchart illustrating a method of displaying an enlargement of information on the touch-sensitive display 118 is shown in FIG. 15.
  • the method may be carried out by software executed, for example, by the processor 102 . Coding of software for carrying out such a method is within the scope of a person of ordinary skill in the art given the present description.
  • the method may contain additional or fewer processes than shown and/or described, and may be performed in a different order.
  • Computer-readable code executable by at least one processor of the portable electronic device to perform the method may be stored in a computer-readable medium, such as a non-transitory computer-readable medium.
  • Information is displayed 1502 on the touch-sensitive display 118 .
  • Information is displayed in one area 1602 and information in the form of a virtual keyboard is displayed in another area 1604 in the example of FIG. 16 .
  • the information from the area 1602 may continue into the area 1604 , such as shown in the example of FIG. 19 .
  • Information displayed in the areas 1602 , 1604 may include one or more controls, selection options, and so forth.
  • the controls include but are not limited to one or more selection options, switches, drop-down menus, dials, scrollbars, sliders, and so forth.
  • the displayed information may be information input to the portable electronic device 100 , information received in a communication by the portable electronic device 100 , e.g., in an electronic mail message, in a short message service (SMS) message, a webpage, a document, a calendar event, a contact, and so forth.
  • An indicator, e.g., a cursor, a marker, a blinking character, a pointer, highlighting, and so forth, may be displayed in the information, such as the indicator 1606 shown in FIG. 16.
  • the enlargement may be invoked, also referred to as initiated or activated, by detecting a touch associated with a selection option, such as the “EDIT” selection option 1608 in the example of FIG. 16 , detecting a gesture, detecting one or more touches or gestures associated with the information, detecting selection of a physical button or key, detecting a touch associated with a menu item, and so forth.
  • a selection option may be utilized to alternate between display of the enlargement and display of other information.
  • the selection option may be displayed in the same location on the display, e.g., the “EDIT” selection option 1608 and the “ABC” selection option 1616 .
  • An indicator, e.g., the indicator 1614 in the example of FIG. 16, may be displayed in the enlargement.
  • the indicator may be an enlarged indicator displayed at a position corresponding with the location of the indicator in the information, e.g., between the same characters of the information, such as shown in FIG. 16 , the same highlighted character(s) of the information, and so forth.
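  • The correspondence between the indicator in the information and the enlarged indicator can be modeled as a window of characters around the indicator, redrawn at a larger size with the indicator at the matching offset inside the window. The Kotlin sketch below captures only that mapping, under assumed names, and is not the rendering code of the device.

```kotlin
// Illustrative mapping from an indicator position in the full information to the
// slice of text shown in the enlargement and the indicator position within it.

data class Enlargement(val text: String, val indicatorInEnlargement: Int)

// Build an enlargement window of up to `windowChars` characters centered on the
// indicator, clamped so the window never runs past the ends of the information.
fun enlargementAround(information: String, indicator: Int, windowChars: Int): Enlargement {
    val half = windowChars / 2
    val start = (indicator - half).coerceAtLeast(0)
    val end = (start + windowChars).coerceAtMost(information.length)
    val clampedStart = (end - windowChars).coerceAtLeast(0)
    return Enlargement(
        text = information.substring(clampedStart, end),
        indicatorInEnlargement = indicator - clampedStart
    )
}

fun main() {
    val info = "Are we still on for dinner at eight tonight?"
    val e = enlargementAround(info, indicator = 23, windowChars = 16)
    println(e.text)                   // the characters around the indicator
    println(e.indicatorInEnlargement) // the indicator between the same characters
}
```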
  • the touch-sensitive display 118 may optionally display other selection options, such as cut, copy, paste, delete, directional options or controls 1618 , 1620 , 1622 , 1624 in the example of FIG. 16 , and so forth.
  • the information included in the enlargement corresponds with information at or near the indicator, such as shown in the example of FIG. 16 .
  • In the example of FIG. 16, information from a row above and a row below the indicator 1606, as well as information to the left and right of the indicator 1606, is shown. Because more information than, for example, a single word is displayed, a user is able to see more information to facilitate easier and more flexible movement of the indicator through the information.
  • information included in the enlargement may correspond with any other information, such as the beginning of information, the end of information, information at or near a misspelled word, information that was previously included in the enlargement, and so forth.
  • Indicators may be displayed in the enlargement, the information, or both.
  • When a touch associated with an indicator is detected 1508, the indicator in the information and/or the indicator in the enlargement are moved 1510 in accordance with the touch, e.g., up, down, left, and/or right. If, after a period of time, a touch associated with an indicator is not detected at 1508, the method proceeds to 1512.
  • a touch associated with an indicator includes a touch on, at, or near either indicator, a touch associated with a control for an indicator, such as the directional options 1618, 1620, 1622, 1624 in the example of FIG. 16, and so forth.
  • the touch associated with the indicator may be a touch at touch location 1702 associated with the area 1604 of the touch-sensitive display 118 associated with the display of the enlargement.
  • input may be provided from a control other than the touch-sensitive display 118 , such as a physical button, a key of a physical keyboard, a mouse, a navigation device, e.g., a joystick, an optical navigation device, track pad, and so forth.
  • the indicator in the enlargement and/or the indicator in the information may be moved to a different position, may highlight information, may perform another function, or may be modified in any way.
  • the enlargement may function similar to a virtual trackpad, e.g., touches in the area associated with the enlargement move the indicator.
  • the indicator may highlight the information.
  • One or both ends of the highlighting may be adjusted to select different end points, e.g., characters, of the highlighting within the information.
  • the end points may be moved one at a time, e.g., selection and optional movement of one end point followed by selection and optional movement of the other end point.
  • the end points may be selected/moved in any order, and selection/movement may be repeated for either or both end points.
  • both end points may be moved simultaneously, e.g., by separate touches, one associated with each end point.
  • the highlighting may be any type of marking of the information to cause the highlighted information to appear different than unhighlighted information, such as background color or style, underlining, outlining, bolding, italics, shading, text coloring, font, relative information size, and so forth.
  • Information displayed in the enlargement may change as the indicator moves through the information.
  • the information displayed in the enlargement may change responsive to the movement of the indicator to maintain the indicator at or near the center of the enlargement.
  • the indicator or the word in which the indicator is located may be centered in the area of the enlargement.
  • An indicator may be at or near the center of the enlargement when the indicator is close to the center, is about the center, is away from the center by a character, a word, a line of text, and so forth.
  • An indicator may be offset from the center due to the size of information displayed in an enlargement, due to the length of a line of the information, due to the length of a word, and so forth.
  • When an indication to end the enlargement is detected 1512, the display of the enlargement ends.
  • For example, the virtual keyboard may be displayed to replace the display of the enlargement, the display of the information may be expanded to replace the display of the enlargement, or additional controls may be displayed to replace the display of the enlargement.
  • the indication to end the enlargement may be detected at 1512 upon detecting selection of a selection option to end enlargement, such as the “ABC” selection option 1616, after a period of time without detecting a touch, upon detection of a gesture indicating the end of the enlargement, upon completion of an editing function such as cut, copy, paste, or delete, and so forth.
  • Although FIG. 15 is described with reference to FIG. 16 and FIG. 17, the method is applicable to FIG. 18 and FIG. 19 and to any other electronic device.
  • Information is displayed on the device 100 in an upper area 1602 and a virtual keyboard is displayed in a lower area 1604 of the left device 100 in the example of FIG. 16 .
  • An indicator 1606 is displayed in the information in this example.
  • the two areas may be separated from each other, adjacent to each other, distanced from each other, side by side vertically or horizontally, one area at least partially surrounding the other area, and so forth.
  • the information in the upper area 1602 may be displayed in the lower area 1604 and the information in the lower area 1604 may be displayed in the upper area 1602 .
  • a boundary line or other visual element may separate the areas.
  • the information in the upper area 1602 may continue into the lower area 1604 and no visual element may separate the areas, as shown in the example of FIG. 19 .
  • the areas 1602 , 1604 may at least partially overlap, where the enlargement or the information is partially translucent.
  • the sizes of the areas may vary. As shown in the example of FIG. 16 , the size of the upper area 1602 on the right device 100 is smaller than the upper area 1602 of the left device 100 , and the lower area 1604 of the right device 100 is larger than the lower area 1604 of the left device 100 .
  • the size of the upper area may increase and the size of the lower area may decrease when the enlargement is displayed, or the sizes of the areas may remain the same.
  • the sizes of the areas may change while the enlargement is displayed, for example, to facilitate display of larger words in the enlargement, to facilitate faster movement of the indicator, and so forth.
  • the display of the virtual keyboard on the left device 100 in the example of FIG. 16 includes a selection option 1608 labeled “EDIT” that may be utilized to invoke the enlargement. Detection of a touch at a touch location associated with the selection option 1608 , such as the location 1610 in FIG. 16 , invokes the enlargement 1612 .
  • the selection option 1608 may include a label, such as a text label, a graphic label, a symbolic label, and so forth.
  • the enlargement 1612 includes display of some of the information from the upper area 1602 in a larger size than the information displayed in the upper area 1602 .
  • the enlarged information is centered at or near the cursor 1614 displayed in the enlargement. Any part of the information may be displayed in the enlargement.
  • the amount by which the display of the information is increased in size, e.g., the amount of “zoom,” the amount of magnification, and so forth, and the number of lines of information displayed may vary from the example of FIG. 16 .
  • the amount of enlargement may be uniform or variable, e.g., information near an indicator or in the same row as an indicator may be enlarged more than other information.
  • the amount of enlargement may be adjusted in response to a touch associated with a selection option, a control, a gesture, a menu selection, or any other input.
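  • A variable amount of enlargement, e.g., the row containing the indicator enlarged more than neighboring rows, can be expressed as a per-line scale policy. The Kotlin sketch below shows one such policy; the scale factors are arbitrary assumptions chosen for illustration.

```kotlin
// Illustrative per-line zoom policy for the enlargement: the line holding the
// indicator gets the largest scale factor, adjacent lines a smaller one, and the
// overall zoom can be adjusted in response to user input.

fun lineScale(lineIndex: Int, indicatorLine: Int, baseZoom: Double): Double =
    when (kotlin.math.abs(lineIndex - indicatorLine)) {
        0 -> baseZoom          // row containing the indicator: full zoom
        1 -> baseZoom * 0.75   // rows immediately above/below: somewhat enlarged
        else -> 1.0            // remaining rows: original size
    }

fun main() {
    var zoom = 2.0                                  // uniform base zoom factor
    println((0..3).map { lineScale(it, indicatorLine = 1, baseZoom = zoom) })
    zoom += 0.5                                     // e.g., adjusted via a control
    println(lineScale(1, indicatorLine = 1, baseZoom = zoom)) // 2.5
}
```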
  • the indicator 1614 is displayed at a position in the enlargement 1612 corresponding to a position of the indicator 1606 in the information displayed in the upper area 1602 .
  • the indicator 1614 is displayed at or near the center of the enlargement 1612 in this example.
  • the enlargement 1612 and the indicator 1614 may be displayed such that the indicator 1614 is in another position relative to the enlargement 1612 , such as at or near the top left corner, or any other position.
  • a selection option may be displayed, such as the “ABC” selection option 1616 displayed in the lower area 1604 of the right device 100 in the example of FIG. 16 .
  • When the enlargement is displayed, the electronic device 100 optionally displays editing controls. For example, directional options 1618, 1620, 1622, 1624 are shown displayed in the lower area 1604 in the example of FIG. 16. When a touch associated with any of the directional options 1618, 1620, 1622, 1624 is detected, the indicator 1614 and the indicator 1606 are moved in accordance with the direction associated with the control. In this example, the directional options 1618, 1620, 1622, 1624 are associated with the directions up, left, right, and down with respect to the orientation of the text.
  • a touch at a touch location 1702 associated with the indicator 1614 is detected, and the indicator 1614 is moved in accordance with the touch from the position of the indicator 1614 shown on the left device 100 to the position of the touch location 1702 on the right device 100 .
  • the touch is detected at the touch location 1702 between “o” and “r” in the word “for” in this example.
  • the indicators 1606 , 1614 are moved to the corresponding position in the information displayed in both areas 1602 , 1604 .
  • the part of the information displayed in the enlargement changes, such that the indicator 1614 remains at or near the center of the enlargement.
  • the information displayed in the enlargement moves to the left as the indicator 1614 moves to the right to maintain the indicator 1614 at or near the center of the enlargement.
  • the information may not change based on the movement of the indicator, may be changed but delayed from movement of the indicator, may be changed in response to detecting a touch associated with a control, may be changed in response to detecting movement of a touch associated with the indicator, may be changed to facilitate viewing of other parts of the information, and so forth.
  • Moving includes changing the position of information, which movement may or may not be animated to appear as though the information is moving across the touch-sensitive display 118 .
  • An indicator illustrating highlighting of information is shown in the example of FIG. 18 .
  • Highlighting may be initiated by detection of a touch associated with a selection option, a gesture, selection from a menu, depression of a physical key, movement of a mouse, a touch on a physical navigation device, a combination of inputs, and so forth.
  • editing controls displayed in the lower area 1604 may include a selection option for initiating highlighting or a double tap may initiate highlighting.
  • when the indicator is moved, an end point of the highlighting moves accordingly.
  • editing functions may be performed, such as copying highlighted information, cutting highlighted information, deleting highlighted information, pasting highlighted information, and so forth.
  • a second editing function, such as pasting information, may optionally be performed, for example, by moving the indicator to the paste position as described above.
  • controls 1708 and 1710, which may be similar to the controls 202, 204 described above, may also be provided.
  • the controls 1708 and 1710 may be displayed or not displayed, e.g., provided in a non-display area.
  • the controls 1708 and 1710 may be provided in addition to or as an alternative to the selection options 1618 , 1620 , 1622 , and 1624 .
  • a touch moves from a first touch location 1802 along the display to a second location 1804 .
  • the first touch location 1802 is associated with the first “t” in “tonight” and the second touch location 1804 is associated with the second “t” in “tonight” in this example.
  • the highlighting is displayed with the information in the enlargement between the two positions in the text, thus highlighting 1808 of “tonight” is displayed.
  • an indication of the end points of the highlighting may be displayed and a visual indication may be displayed to indicate the end point that is currently being manipulated. Highlighting may be displayed in one or both of the areas 1602 , 1604 .
  • highlighted information 1810 corresponding to highlighted information 1808 from the enlargement is displayed. Highlighting may persist in the area 1602 after display of the enlargement is ended. Additionally or alternatively, the directional options 1618 , 1620 , 1622 , 1624 may persist after display of the enlargement is ended. Alternatively, highlighting may not be displayed in the upper area. Optionally, highlighting 1810 may be displayed when display of the enlargement ends.
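  • The drag-to-highlight behavior in the enlargement amounts to hit-testing two touch positions against the enlarged text and highlighting the span between the resulting offsets. The Kotlin sketch below illustrates that hit test under simplifying assumptions, including monospaced glyphs and a fixed character width; it is not the device's text layout code.

```kotlin
// Illustrative hit test for drag highlighting in the enlargement: touch x positions
// are mapped to character offsets (monospaced glyphs are assumed purely to keep the
// sketch simple), and the span between the two offsets is highlighted in both the
// enlargement and the underlying information.

data class HighlightResult(val startOffset: Int, val endOffset: Int, val text: String)

fun highlightFromDrag(
    information: String,
    enlargementStartOffset: Int, // offset in `information` of the first enlarged char
    charWidthPx: Double,         // width of one enlarged character
    dragStartX: Double,
    dragEndX: Double
): HighlightResult {
    fun offsetAt(x: Double) =
        (enlargementStartOffset + (x / charWidthPx).toInt()).coerceIn(0, information.length)
    val a = offsetAt(dragStartX)
    val b = offsetAt(dragEndX)
    val (start, end) = if (a <= b) a to b else b to a
    return HighlightResult(start, end, information.substring(start, end))
}

fun main() {
    val info = "See you tonight after work"
    // Enlargement shows the text from offset 8 onward; drag across "tonight".
    val result = highlightFromDrag(info, 8, charWidthPx = 24.0, dragStartX = 0.0, dragEndX = 168.0)
    println(result.text) // "tonight"
}
```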
  • Prior to invocation of the enlargement, information may be displayed seamlessly or continuously in both areas 1602, 1604 of the device 100 as shown in the example of FIG. 19.
  • the information may be a single continuous set of information, may be multiple discrete sets of information, and so forth.
  • an email, a webpage, a document, and so forth may be displayed.
  • When the enlargement 1612 is invoked, the enlargement replaces part of the display of the information in the second area 1604.
  • Although characters are shown in FIG. 19, the information may include images, graphics, symbols, other types of information, and so forth.
  • Display of the enlargement of information facilitates movement of an indicator through information displayed on a touch-sensitive display, making editing of the information easier. Because the information is displayed in a larger size, movement of an indicator through the information, such as moving a cursor or highlighting information, facilitates reviewing or editing of the information.
  • the enlargement is advantageously applied to portable electronic devices, which typically include relatively small touch-sensitive displays. Selection options may be provided to invoke display of the enlargement and to indicate the end of display of the enlargement. Additional selection options for editing or manipulating the information may be provided.
  • An electronic device comprises a touch-sensitive display and a processor operably coupled with the touch-sensitive display and configured to display information in a first area of a touch-sensitive display of an electronic device, display an enlargement including at least part of the information in a second area outside the first area to replace at least part of a virtual keyboard, detect a touch associated with the second area, and move a first indicator in the first area and a second indicator in the second area along with the touch.
  • a method comprises displaying information in a first area of a touch-sensitive display of an electronic device, displaying an enlargement including at least part of the information in a second area outside the first area to replace at least part of a virtual keyboard, detecting a touch associated with the second area, and moving a first indicator in the first area and a second indicator in the second area along with the touch.
  • the method may also include displaying a control in the second area to control the first indicator and the second indicator.
  • the method may also include changing the at least part of the information displayed in the enlargement based on the movement of the second indicator.
  • an indicator may be at or near the center of an area when the indicator is close to the center, is about the center, is away from the center by a character, a line of text, or a word, and so forth.
  • Although touch locations are shown as circles with dashed lines, the actual touch locations may be larger or smaller, e.g., a point.
  • selection options, controls, and other elements may be located at any locations such as at the top of a display, at the bottom of a display, along a side of the display, in any area of a non-display area, and so forth.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

An example method includes detecting a hold touch and a release touch on a touch-sensitive display of an electronic device, wherein the hold touch and the release touch overlap at least partially in time, detecting release of the release touch, and in response to detecting the release of the release touch, displaying an editing control while the hold touch is detected. The editing control may include a virtual trackpad.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This patent application claims priority to U.S. Provisional Patent Application No. 61/557,873, filed on Nov. 9, 2011, titled “TOUCH-SENSITIVE DISPLAY METHOD AND APPARATUS,” which is hereby incorporated herein by reference in its entirety. This patent application is related to U.S. application Ser. No. ______ (Atty Docket No. 40848-US-PAT), titled “TOUCH-SENSITIVE DISPLAY METHOD AND APPARATUS,” U.S. application Ser. No. ______ (Atty Docket No. 40860-US-PAT), titled “TOUCH-SENSITIVE DISPLAY METHOD AND APPARATUS,” and U.S. application Ser. No. ______ (Atty Docket No. 42417-US-PAT), titled “TOUCH-SENSITIVE DISPLAY METHOD AND APPARATUS.”
  • FIELD OF TECHNOLOGY
  • The present disclosure relates to electronic devices, including but not limited to, portable electronic devices having touch-sensitive displays and their control.
  • BACKGROUND
  • Electronic devices, including portable electronic devices, have gained widespread use and may provide a variety of functions including, for example, telephonic, electronic messaging and other personal information manager (PIM) application functions. Portable electronic devices include, for example, several types of mobile stations such as simple cellular telephones, smart phones, wireless personal digital assistants (PDAs), and laptop computers with wireless 802.11 or Bluetooth capabilities.
  • Portable electronic devices such as PDAs or smart telephones are generally intended for handheld use and ease of portability. Smaller devices are generally desirable for portability. A touch-sensitive display, also known as a touchscreen display, is particularly useful on handheld devices, which are small and have limited space for user input and output. The information displayed on the touch-sensitive displays may be modified depending on the functions and operations being performed. With continued demand for decreased size of portable electronic devices, touch-sensitive displays continue to decrease in size.
  • Improvements in devices with touch-sensitive displays are desirable.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of a portable electronic device in accordance with the disclosure.
  • FIG. 2, FIG. 3, FIG. 4, FIG. 5, FIG. 7, FIG. 8, FIG. 12, FIG. 13, and FIG. 14 illustrate examples of displaying touch-sensitive controls on an electronic device in accordance with the disclosure.
  • FIG. 10 illustrates an example of touch-sensitive controls on an electronic device in accordance with the disclosure.
  • FIG. 6, FIG. 9, and FIG. 11 are flowcharts illustrating methods of touch-sensitive control on an electronic device in accordance with the disclosure.
  • FIG. 15 is a flowchart illustrating a method of displaying an enlargement of information on an electronic device in accordance with the disclosure.
  • FIG. 16 and FIG. 19 illustrate examples of invoking an enlargement on an electronic device in accordance with the disclosure.
  • FIG. 17 illustrates an example of moving an indicator through information on an electronic device in accordance with the disclosure.
  • FIG. 18 illustrates an example of highlighting of information on an electronic device in accordance with the disclosure.
  • DETAILED DESCRIPTION
  • The following describes an apparatus for and method of touch-sensitive control on a touch-sensitive display. The apparatus may be an electronic device. The electronic device displays information and at least two controls on the touch-sensitive display. A touch associated with the controls results in moving an indicator through the information in at least a first direction and a second direction. The controls do not move with the movement of the indicator. In another example, the electronic device detects at least two touches on the touch-sensitive display that overlap at least partially in time. When the electronic device detects release of one of the touches, an editing control is displayed on the touch-sensitive display. When the electronic device detects release of the other of the touches, a virtual keyboard is displayed to replace the display of the editing control.
  • The following describes a method and apparatus to control an electronic device. The apparatus may be a portable electronic device that includes a touch-sensitive display. The electronic device displays information and, for example, a virtual keyboard on the touch-sensitive display. In response to an invocation, the electronic device displays an enlargement of at least part of the information to replace at least part of the information displayed, such as a virtual keyboard. The electronic device moves an indicator in the enlargement in response to detecting a touch on the touch-sensitive display.
  • For simplicity and clarity of illustration, reference numerals may be repeated among the figures to indicate corresponding or analogous elements. Numerous details are set forth to provide an understanding of the examples described herein. The examples may be practiced without these details. In other instances, well-known methods, procedures, and components are not described in detail to avoid obscuring the examples described. The description is not to be considered as limited to the scope of the examples described herein.
  • The disclosure generally relates to an electronic device, such as a portable electronic device or non-portable electronic device. Examples of portable electronic devices include mobile, or handheld, wireless communication devices such as pagers, cellular phones, cellular smart-phones, wireless organizers, personal digital assistants, wirelessly enabled notebook computers, tablet computers, mobile internet devices, and so forth. The portable electronic device may be a portable electronic device without wireless communication capabilities, such as handheld electronic games, digital photograph albums, digital cameras, media players, e-book readers, and so forth. Examples of non-portable electronic devices include desktop computers, electronic white boards, smart boards utilized for collaboration, built-in monitors or displays in furniture or appliances, and so forth.
  • A block diagram of an example of a portable electronic device 100 is shown in FIG. 1. The portable electronic device 100 includes multiple components, such as a processor 102 that controls the overall operation of the portable electronic device 100. Communication functions, including data and voice communications, are performed through a communication subsystem 104. Data received by the portable electronic device 100 is decompressed and decrypted by a decoder 106. The communication subsystem 104 receives messages from and sends messages to a wireless network 150. The wireless network 150 may be any type of wireless network, including, but not limited to, data wireless networks, voice wireless networks, and networks that support both voice and data communications. A power source 142, such as one or more rechargeable batteries or a port to an external power supply, powers the portable electronic device 100.
  • The processor 102 interacts with other components, such as Random Access Memory (RAM) 108, memory 110, a display 112 with a touch-sensitive overlay 114 operably coupled to an electronic controller 116 that together comprise a touch-sensitive display 118, one or more actuators 120, one or more force sensors 122, an auxiliary input/output (I/O) subsystem 124, a data port 126, a speaker 128, a microphone 130, short-range communications 132, and other device subsystems 134. Input via a graphical user interface is provided via the touch-sensitive overlay 114. The processor 102 interacts with the touch-sensitive overlay 114 via the electronic controller 116. Information, such as text, characters including spaces, symbols, images, icons, and other items that may be displayed or rendered on a portable electronic device, is displayed on the touch-sensitive display 118 via the processor 102. The processor 102 may interact with an accelerometer 136 that may be utilized to detect direction of gravitational forces or gravity-induced reaction forces.
  • To identify a subscriber for network access, the portable electronic device 100 may utilize a Subscriber Identity Module or a Removable User Identity Module (SIM/RUIM) card 138 for communication with a network, such as the wireless network 150. Alternatively, user identification information may be programmed into memory 110.
  • The portable electronic device 100 includes an operating system 146 and software programs, applications, or components 148 that are executed by the processor 102 and are typically stored in a persistent, updatable store such as the memory 110. Additional applications or programs may be loaded onto the portable electronic device 100 through the wireless network 150, the auxiliary I/O subsystem 124, the data port 126, the short-range communications subsystem 132, or any other suitable subsystem 134.
  • A received signal such as a text message, an e-mail message, or web page download is processed by the communication subsystem 104 and input to the processor 102. The processor 102 processes the received signal for output to the display 112 and/or to the auxiliary I/O subsystem 124. A subscriber may generate data items, for example e-mail messages, which may be transmitted over the wireless network 150 through the communication subsystem 104. For voice communications, the overall operation of the portable electronic device 100 is similar. The speaker 128 outputs audible information converted from electrical signals, and the microphone 130 converts audible information into electrical signals for processing.
  • The touch-sensitive display 118 may be any suitable touch-sensitive display, such as a capacitive, resistive, infrared, surface acoustic wave (SAW) touch-sensitive display, strain gauge, optical imaging, dispersive signal technology, acoustic pulse recognition, and so forth, as known in the art. A capacitive touch-sensitive display includes a capacitive touch-sensitive overlay 114. The overlay 114 may be an assembly of multiple layers in a stack including, for example, a substrate, a ground shield layer, a barrier layer, one or more capacitive touch sensor layers separated by a substrate or other barrier, and a cover. The capacitive touch sensor layers may comprise any suitable material, such as indium tin oxide (ITO).
  • One or more touches, also known as touch contacts or touch events, may be detected by the touch-sensitive display 118. The processor 102 may determine attributes of the touch, including a location of a touch. Touch location data may include data for an area of contact or data for a single point of contact, such as a point at or near a center of the area of contact. The location of a detected touch may include x and y components, e.g., horizontal and vertical components, respectively, with respect to one's view of the touch-sensitive display 118. For example, the x location component may be determined by a signal generated from one touch sensor, and the y location component may be determined by a signal generated from another touch sensor. A signal is provided to the controller 116 in response to detection of a touch. A touch may be detected from any suitable input member, such as a finger, thumb, appendage, or other objects, for example, a stylus, pen, or other pointer, depending on the nature of the touch-sensitive display 118. Multiple simultaneous touches may be detected.
  • The actuator(s) 120 may be depressed or activated by applying sufficient force to the touch-sensitive display 118 to overcome the actuation force of the actuator 120. The actuator(s) 120 may be actuated by pressing anywhere on the touch-sensitive display 118. The actuator(s) 120 may provide input to the processor 102 when actuated. Actuation of the actuator(s) 120 may result in provision of tactile feedback. When force is applied, the touch-sensitive display 118 is depressible, pivotable, and/or movable. Such a force may actuate the actuator(s) 120. The touch-sensitive display 118 may, for example, float with respect to the housing of the portable electronic device, i.e., the touch-sensitive display 118 may not be fastened to the housing. A mechanical dome switch actuator may be utilized. In this example, tactile feedback is provided when the dome collapses due to imparted force and when the dome returns to the rest position after release of the switch. Alternatively, the actuator 120 may comprise one or more piezoelectric (piezo) devices that provide tactile feedback for the touch-sensitive display 118.
  • Optional force sensors 122 may be disposed in conjunction with the touch-sensitive display 118 to determine or react to forces applied to the touch-sensitive display 118. The force sensor 122 may be disposed in line with a piezo actuator 120. The force sensors 122 may be force-sensitive resistors, strain gauges, piezoelectric or piezoresistive devices, pressure sensors, quantum tunneling composites, force-sensitive switches, or other suitable devices. Force as utilized throughout the specification, including the claims, refers to force measurements, estimates, and/or calculations, such as pressure, deformation, stress, strain, force density, force-area relationships, thrust, torque, and other effects that include force or related quantities. Optionally, force information related to a detected touch may be utilized to select information, such as information associated with a location of a touch. For example, a touch that does not meet a force threshold may highlight a selection option, whereas a touch that meets a force threshold may select or input that selection option. Selection options include, for example, displayed or virtual keys of a keyboard; selection boxes or windows, e.g., “cancel,” “delete,” or “unlock”; function buttons, such as play or stop on a music player; and so forth. Different magnitudes of force may be associated with different functions or input. For example, a lesser force may result in panning, and a higher force may result in zooming.
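  • The two-level force behavior described above, where a lighter touch highlights a selection option and a firmer touch selects it, and where lesser force may pan while higher force zooms, reduces to simple threshold dispatch. The Kotlin sketch below is a hedged illustration; the threshold values and action names are assumptions, not values from the device.

```kotlin
// Illustrative force-threshold dispatch: force below the selection threshold only
// highlights the option under the touch; force at or above it selects the option.
// A second threshold maps lesser force to panning and higher force to zooming.

enum class OptionAction { HIGHLIGHT, SELECT }
enum class ViewAction { PAN, ZOOM }

// Assumed thresholds, in arbitrary force units.
const val SELECT_FORCE = 2.0
const val ZOOM_FORCE = 3.0

fun optionActionFor(force: Double): OptionAction =
    if (force >= SELECT_FORCE) OptionAction.SELECT else OptionAction.HIGHLIGHT

fun viewActionFor(force: Double): ViewAction =
    if (force >= ZOOM_FORCE) ViewAction.ZOOM else ViewAction.PAN

fun main() {
    println(optionActionFor(1.2)) // HIGHLIGHT
    println(optionActionFor(2.6)) // SELECT
    println(viewActionFor(1.2))   // PAN
    println(viewActionFor(3.4))   // ZOOM
}
```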
  • The touch-sensitive display 118 includes a display area in which information may be displayed, and a non-display area extending around the periphery of the display area. The display area generally corresponds to the area of the display 112. Information is not displayed in the non-display area by the display, which non-display area is utilized to accommodate, for example, electronic traces or electrical connections, adhesives or other sealants, and/or protective coatings around the edges of the display area. The non-display area may be referred to as an inactive area and is not part of the physical housing or frame of the electronic device. Typically, no pixels of the display are in the non-display area, thus no image can be displayed by the display 112 in the non-display area. Optionally, a secondary display, not part of the primary display 112, may be disposed under the non-display area. Touch sensors may be disposed in the non-display area, which touch sensors may be extended from the touch sensors in the display area or distinct or separate touch sensors from the touch sensors in the display area. A touch, including a gesture, may be associated with the display area, the non-display area, or both areas. The touch sensors may extend across substantially the entire non-display area or may be disposed in only part of the non-display area.
  • When viewing information on an electronic device, e.g., information input to or received by the electronic device, a user may choose to manipulate the information. For example, a user may choose to edit information by copying, cutting, deleting, or pasting information, for which moving an indicator through the information, highlighting parts of the information, moving the information, and so forth is advantageous. An indicator includes, for example, a cursor, a blinking character, a colored area, an insertion marker, highlighting, and so forth. Fine control or movement of an indicator through information is facilitated through input such as one or more detected touches associated with one or more controls displayed on a touch-sensitive display, although the touch-sensitive display may be unable to provide such fine control due to coarse touch sensitivity. For example, a user may have difficulty touching a position between a first displayed character and a second displayed character on a touch-sensitive display. To support the manipulation of information, touch-sensitive controls are displayed to facilitate movement of an indicator through the information, which indicator may indicate a single position within the text or highlight multiple characters of text. One or more controls may be provided to move the indicator in one or more directions through the information. Each control may move the indicator in all possible directions, in one direction, or in a subset of all the possible directions. Each control may provide the same functionality, e.g., moving the indicator in the same direction(s), or the controls may provide different functionality from one another, e.g., moving the indicator in different directions, such as up and down or left and right. The one or more controls may be displayed at or near one or more sides of the electronic device to facilitate use of one or both hands to interact with the electronic device.
  • When at least two touches are detected on the touch-sensitive display, and release of one of the touches is subsequently detected, an editing control is displayed on the touch-sensitive display. Advantageously, the at least two touches at least partially overlap in time. The editing control may be displayed until release of another of the remaining touches is detected, until a menu option or selection option is selected, after a time period of no detected activity, and so forth. Optionally, a virtual keyboard may be displayed to replace the editing control when the release of the additional one of the two or more touches is detected. The editing control is an individual control or group of controls that provide editing functions. The editing control may include one or more controls for moving an indicator, one or more selection options to facilitate performing cut, copy, delete, and paste functions, one or more selection options to highlight information, and so forth. While typing on a virtual keyboard, multiple input members may be at or near the virtual keyboard displayed on the touch-sensitive display 118. Because the at least two touches may be at locations associated with the virtual keyboard, the edit controls may be quickly accessed during typing on the virtual keyboard.
  • Touch-sensitive controls are displayed on the touch-sensitive display 118 as shown in the example of FIG. 2. In the left device 100 of FIG. 2, a first control 202 and a second control 204 are displayed on the touch-sensitive display 118. The controls 202, 204 may optionally have a default control direction, such as indicated by a displayed arrow as shown in FIG. 2. For example, a touch such as a tap at a location associated with the controls 202, 204, results in moving the indicator in the default direction associated with the one of the controls 202, 204 with which the touch is associated. For example, when a touch, such as a tap, associated with the first control 202 is detected, an indicator 208 moves to the left by one character through displayed information 210, in the default direction for the first control 202. Optionally, the indicator 208 may move by more than one character, at least one word, at least one sentence, at least one paragraph, at least one page, and so forth. A touch such as a tap at location 206 is associated with the second control 204 and results in moving the indicator 208 to the right through the information 210 by one character, in the default direction for the second control 204. In the example shown in FIG. 2, the touch at the location 206 is a tap that results in the indicator 208 moving from the position shown on the left device 100 to the position shown on the right device 100. The default direction associated with each of the controls 202, 204 may be up, down, left, right, up and left, up and right, down and left, down and right, and so forth. Controls may have a default direction but need not have a default direction. Although the default direction associated with the controls 202, 204 is invoked in the above example with a tap, any type of touch may result in a movement in the default direction, such as a double tap, flick, swipe, hovering or held touch, and so forth. Although two controls 202, 204 are shown in FIG. 2, any number of controls may be displayed.
  • The first control 202 is displayed adjacent to a virtual keyboard 212 and on the left side of the display 118, which may facilitate easy use or operation by a finger of a left hand. The second control 204 is displayed adjacent to the virtual keyboard 212 and on the right side of the display 118, which may facilitate easy use by a finger of a right hand. Other locations for the controls may also be successfully implemented, including locations in the non-display area of the touch-sensitive display 118. For example, the controls 202, 204 may be at or near a location where a touch was recently detected, at or near a position where information is not currently displayed, at or near an outer edge of the display 118, away from an edge of the display 118, and so forth.
  • Although an example shape of the controls 202, 204 is shown in FIG. 2, the controls 202, 204 may be any shape or design. A touch associated with the controls 202, 204 may be detected while the controls 202, 204 are displayed, are displayed with any opaque or translucent or see-through level, are associated with a location in the non-display area, and so forth. For example, the controls 202, 204 may be temporarily displayed to indicate the location of the controls 202, 204, and the controls 202, 204 may cease to be displayed, e.g., to facilitate the display of other information on the display 118. Alternatively, indications of the location of the controls 202, 204 may be utilized, such as when the controls 202, 204 are located in the non-display area.
  • The controls 202, 204 are stationary in that they do not move when the indicator 208 moves. The controls 202, 204 may be displayed in an area outside the area in which the information 210 is displayed, may be displayed in the area in which the information 210 is displayed, may be displayed adjacent to the area in which the information 210 is displayed, may be displayed to replace a part of the information 210, and so forth. The controls 202, 204 may optionally be moved to different locations. For example, the controls 202, 204 may move based on a location of a touch, may move based on a setting specifying a location for the controls 202, 204, may move based on movement of the indicator 208, may move based on the position of information displayed on the display 118, and so forth.
  • In the example of FIG. 3, the electronic device 100 detects touches at two locations 302, 304, wherein these touches at least partially overlap in time. The first touch location 302 is associated with the first control 202 and the second touch location 304 is associated with the second control 204. For example, the touches may overlap in time for 200 milliseconds, 0.75 seconds, 1.25 seconds, or any other suitable period of time. In the left device 100 of FIG. 3, highlighting 310 is initiated, e.g., an end of the highlighting is established. Any other action, e.g., cut, copy, delete, paste, may optionally be performed in response to detecting that the touches at locations 302, 304 overlap in time. Detection of a third touch at a location 308 associated with the second control 204, as shown in the right device 100, results in extending the highlighting by one character to the right as shown on the right device 100 in this example. The end characters of the highlighting 310 may be moved, e.g., ends of the highlighting may be changed, in any direction by one or more touches associated with the first control 202 and/or the second control 204.
  • In other examples, after initiating highlighting 310, a first indicator may be moved in response to detection of a touch associated with the first control 202 and a second indicator may be moved in response to detection of a touch associated with the second control 204. In such an example, the highlighting 310 selects a part of the information 210 that is between the first indicator and the second indicator.
  • The controls 202, 204 function as virtual joysticks in the examples shown in the figures. For example, functioning as a virtual joystick includes detecting movement of a touch associated with the virtual joystick and moving an indicator in response to the detected movement, where the movement may be in any direction, maintaining a touch at a fixed location to continue moving the indicator along the current direction of movement, and other physical joystick-like functionality. Optionally, the touch associated with the virtual joystick may move in multiple directions before the touch is released. In the example shown in FIG. 4, when a touch that originates at a location 402 associated with the second control 204 moves upward in the direction associated with the arrow 404, the indicator 208 is moved from the position after "for" shown on the left device 100 to the position after "has" shown on the right device 100. The indicator 208 is generally moved in a direction based on the direction of movement of the touch, e.g., the indicator 208 may be moved in the same direction as the movement of the touch, in the opposite direction from the movement of the touch, e.g., as with an airplane control, and so forth.
  • The indicator 208 may continue to move, for example, as long as the touch continues/continues to move, until the touch is released, or until the touch returns to the original location 402. The touch may move in any direction, including multiple directions, resulting in the indicator 208 being moved in the same direction(s) along with the movement of the touch.
  • Optionally, the further the touch moves from the original location 402 of the touch, the faster or further the indicator 208 is moved, e.g., the faster the movement of the indicator repeats. The indicator 208 may alternatively move at a constant speed regardless of the distance that the touch moves. The indicator 208 may move at a speed substantially the same as the speed of movement of the touch. Alternatively, the indicator 208 may move a distance based on the distance of the movement of the touch. For example, when the touch moves a distance of the height of two lines of characters, the indicator 208 may move two lines of characters, four lines of characters, or any other proportional distance.
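A minimal sketch of one possible distance-to-speed mapping for such a virtual joystick follows; the Point type, the pixel scale, and the rate cap are assumptions made only for this example and are not specified in the disclosure.

```kotlin
import kotlin.math.hypot

// Illustrative geometry type; not taken from the disclosure.
data class Point(val x: Float, val y: Float)

// The further the touch drifts from its origin on the virtual joystick,
// the faster the indicator movement repeats (characters per second).
fun repeatRate(origin: Point, current: Point,
               charsPerSecondPerPixel: Float = 0.1f,
               maxCharsPerSecond: Float = 30f): Float {
    val distance = hypot(current.x - origin.x, current.y - origin.y)
    return (distance * charsPerSecondPerPixel).coerceAtMost(maxCharsPerSecond)
}

// The indicator moves in the same direction as the movement of the touch.
fun direction(origin: Point, current: Point): Pair<Int, Int> {
    val dx = current.x - origin.x
    val dy = current.y - origin.y
    return Pair(if (dx > 0f) 1 else if (dx < 0f) -1 else 0,
                if (dy > 0f) 1 else if (dy < 0f) -1 else 0)
}

fun main() {
    val origin = Point(100f, 400f)          // location where the touch originated
    val current = Point(100f, 340f)         // the touch has moved 60 px upward
    println(repeatRate(origin, current))    // 6.0 characters per second
    println(direction(origin, current))     // (0, -1): move the indicator up
}
```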
  • The controls 202, 204 may be displayed in accordance with the example of FIG. 5. Optionally, a gesture in the area where a keyboard is displayed may be utilized to display the controls 202, 204, and a gesture in the opposite direction discontinues display of the controls 202, 204. As shown on the left device 100 in this example, a touch that is a swipe is detected beginning at a location 502 and moving in the direction of the arrow 504 while the controls 202, 204 are not displayed. The controls 202, 204 are displayed in response, as shown on the right device 100. A touch that is a swipe is detected beginning at a location 506 and moving in the direction of the arrow 508 while the controls 202, 204 are displayed in this example, and the controls 202, 204 are no longer displayed on the display 118 in response to this detection. Although the movement of the touches in these examples is up and down, the display of the controls 202, 204 may be provided in response to detecting movement such as a swipe in any direction(s). Any type of touch may be detected to provide the display of the controls 202, 204. For example, two or more separate touches at any different locations may be detected. Optionally, the controls 202, 204 may be displayed when a touch associated with the information 210 is detected.
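The show/hide swipe behaviour could be modelled roughly as below; the swipe representation and the 40-pixel travel threshold are assumptions for this sketch only.

```kotlin
// Illustrative swipe representation; not taken from the disclosure.
data class Swipe(val startY: Float, val endY: Float, val startsInKeyboardArea: Boolean)

class ControlVisibility(var controlsShown: Boolean = false) {
    // An upward swipe in the keyboard area shows the controls;
    // a downward swipe hides them again.
    fun onSwipe(swipe: Swipe, minTravelPx: Float = 40f) {
        if (!swipe.startsInKeyboardArea) return
        val dy = swipe.endY - swipe.startY
        when {
            dy <= -minTravelPx && !controlsShown -> controlsShown = true   // swipe up
            dy >= minTravelPx && controlsShown -> controlsShown = false    // swipe down
        }
    }
}

fun main() {
    val visibility = ControlVisibility()
    visibility.onSwipe(Swipe(startY = 500f, endY = 420f, startsInKeyboardArea = true))
    println(visibility.controlsShown)   // true: the controls are displayed
    visibility.onSwipe(Swipe(startY = 420f, endY = 500f, startsInKeyboardArea = true))
    println(visibility.controlsShown)   // false: display of the controls is discontinued
}
```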
  • A flowchart illustrating a method of touch-sensitive control is shown in FIG. 6. The method may be carried out by software executed, for example, by the processor 102. Coding of software for carrying out such a method is within the scope of a person of ordinary skill in the art given the present description. The method may contain additional or fewer processes than shown and/or described, and may be performed in a different order. Computer-readable code executable by at least one processor of the portable electronic device to perform the method may be stored in a computer-readable medium, such as a non-transitory computer-readable medium.
  • Information is displayed 602 on the touch-sensitive display 118. An indicator such as described above is optionally displayed within the information. The information 210 may be information input into the portable electronic device 100 or received in a communication by the portable electronic device 100, e.g., an electronic mail message (e-mail), a short message service (SMS) message, a webpage, a document, a calendar event, a contact, and so forth.
  • One or more controls are displayed 604 on the touch-sensitive display 118. The controls may be, for example, the controls 202, 204 shown in FIG. 2, FIG. 3, FIG. 4, and FIG. 5, or any other controls. When a touch associated with one of the controls 202, 204 is detected 606, the indicator 208 is moved 608 through the information in accordance with the touch characteristics, e.g., type and direction. The touch may move in any direction, including multiple directions, resulting in the indicator 208 being moved in the same direction(s) along with the movement of the touch. For example, a detected touch associated with the first control 202 results in moving the indicator 208 through the information 210 in a first direction and a second direction as shown in the example of FIG. 2. A detected touch associated with the second control 204 may result in moving the indicator 208 through the information 210 in the first direction and the second direction. A first touch associated with the first control 202 and a second touch associated with the second control 204 may result in the indicator 208 moving in the same direction. For example, when a touch associated with the second control 204, as shown in FIG. 2, is detected 606, the indicator is moved 608 in the right direction. When a gesture indicating movement to the right, e.g., a swipe to the right, associated with the first control 202 is detected 606, the indicator 208 is also moved 608 to the right. Example gestures include a swipe, a tap, a double tap, a flick, and so forth. The gesture may indicate movement in any direction.
  • The detection 606 and movement 608 may be repeated any number of times. Although a first direction and a second direction are discussed above, any number of directions may be associated with the controls displayed at 604. The movement 608 may be up, down, left, right, or any combination of directions.
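As a rough illustration of the detect-and-move loop of FIG. 6 (blocks 606 and 608), the following sketch maps an assumed set of gestures to indicator movement; the gesture set and the one-character step are illustrative choices, not taken from the disclosure.

```kotlin
// Illustrative gesture set and indicator model; not taken from the disclosure.
enum class Gesture { TAP_RIGHT, SWIPE_LEFT, SWIPE_RIGHT, SWIPE_UP, SWIPE_DOWN }

data class Indicator(var line: Int, var column: Int)

// Block 608: move the indicator according to the characteristics of the
// detected touch, regardless of which of the two controls it was associated with.
fun moveIndicator(indicator: Indicator, gesture: Gesture) {
    when (gesture) {
        Gesture.TAP_RIGHT, Gesture.SWIPE_RIGHT -> indicator.column += 1
        Gesture.SWIPE_LEFT -> indicator.column -= 1
        Gesture.SWIPE_UP -> indicator.line -= 1
        Gesture.SWIPE_DOWN -> indicator.line += 1
    }
}

fun main() {
    val indicator = Indicator(line = 2, column = 5)
    // Blocks 606 and 608 repeat for each detected touch associated with either control.
    listOf(Gesture.SWIPE_RIGHT, Gesture.TAP_RIGHT, Gesture.SWIPE_UP)
        .forEach { moveIndicator(indicator, it) }
    println(indicator)   // Indicator(line=1, column=7)
}
```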
  • Touch-sensitive controls are displayed to facilitate the movement of an indicator through information displayed on a touch-sensitive display of an electronic device. The touch-sensitive controls facilitate fine control of movement of the indicator, which is advantageous when an input device, such as a touch-sensitive display, has limited or coarse sensitivity, such as limited ability to locate a touch at a specific point on a display. Multiple controls for moving the indicator in the same directions or different directions may be displayed to facilitate the detection of touches that do not overlap in time and touches that at least partially overlap in time. For example, a detected touch associated with a first control results in moving the indicator in four directions and a detected touch associated with a second control may result in moving the indicator in the same four directions. A touch associated with the first control may result in movement of the indicator in one direction, e.g., the up direction, and a touch associated with the second control may result in movement of the indicator in another direction, e.g., to the right. The touch may be a tap, a flick in a direction, touch and movement associated with a virtual joystick, a gesture in a direction, multiple touches, and so forth. When the first touch and the second touch at least partially overlap in time, the movements may be performed substantially simultaneously, e.g., movement up and to the right. Touches on both controls that overlap in time may also result in other events and/or actions such as initiation of highlighting, selection, and so forth.
  • Although examples described in connection with FIG. 2, FIG. 3, FIG. 4, FIG. 5, and FIG. 6 include touches associated with a particular one of the controls 202, 204 and a resulting action, the touch and the resulting action may alternatively be associated with the other of the controls 202, 204 or any other control. Although the method of FIG. 6 is described with reference to FIG. 2, FIG. 3, FIG. 4, and FIG. 5, the method is applicable to any electronic device.
  • An alternative editing control is displayed on the touch-sensitive display 118 of the electronic device 100 as shown in the example of FIG. 7. As shown on the left device 100 of FIG. 7, a touch held at one location 702, referred to as a hold touch, and a touch detected and released at a second location 704, referred to as a release touch, are detected in an area associated with a virtual keyboard 212. The hold touch and the release touch overlap partially in time, and the release touch ends prior to the hold touch. Alternatively, the hold touch and the release touch may be associated with any other area of the touch-sensitive display 118. Alternatively, the hold touch and the release touch may be associated with physical buttons or keys, such as keys of a physical keyboard or keys along the housing of the device 100. Although the hold touch location 702 is to the left of the release touch location 704, the locations 702, 704 of the touches may be at any locations on the touch-sensitive display 118, e.g., both may be located in an area in which the virtual keyboard 212 is displayed, both may be located in an area in which information is displayed, one may be located in the area of the keyboard 212 and another in the area in which the information is displayed, and so forth.
  • As shown on the right device 100 in the example of FIG. 7, when the release touch is released, an editing control 706 including a highlight selection option 708 is displayed. As shown in FIG. 7, the editing control 706 is displayed at or near the end location 704 of the release touch, which is on a right side of the display 118 in the example shown on the right device 100 of FIG. 7. In other examples, the editing control 706 may be displayed in any location, such as at or near a left side of the display 118 when the release touch has an end location on the left side of the display 118, may be displayed at or near a center of the touch-sensitive display 118, may be displayed in an area in which the virtual keyboard 212 was previously displayed, may be displayed at any location associated with an end location for a touch, at any location associated with an origin location of a touch, at or near a location opposite an origin location or end location of a touch, and so forth. The editing control 706 may be in an editing window, an editing area, and so forth.
  • The editing control 706 shown in the example of FIG. 7 includes selection options for the editing functions CUT, COPY, PASTE, and DELETE. The editing control may include more or fewer selection options, controls, and so forth. Additional selection options include, for example, END, HOME, PAGE UP, PAGE DOWN, SEND, GO, and SPELL. An indicator 714 is also displayed within the information. The indicator 714 may be displayed whenever the information is displayed, only when the editing control 706 is displayed, and so forth.
  • The editing control 706 also includes selection options or controls designated with arrows for up, down, left, and right directions relative to information displayed or the device 100. A detected touch, e.g., a tap, associated with a directional selection option results in moving an indicator one character through information in the direction associated with that directional selection option. A gesture associated with a directional selection option results in moving the indicator multiple characters through the information in the direction associated with the directional selection option. For example, when a swipe associated with a right directional selection option is detected, the indicator is moved from a first word to the start of a second word to the right of the first word. The indicator may be moved through one or more characters of the first word and through a space to the start of the second word. As another example, a swipe associated with an up directional selection option results in moving the indicator to the start of a paragraph within which the indicator is located. Any other type of gesture may be detected, and any other action may be associated with a detected gesture. Thus, a touch associated with a directional selection option may result in moving the indicator a single character while a gesture associated with the directional selection option results in moving the indicator multiple characters, or, optionally, a touch associated with a directional selection option may result in moving the indicator multiple characters while a gesture results in moving the indicator a single character. Although movement of the indicator through the information is discussed, the electronic device 100 may determine a position to which the indicator is to be moved and may display the indicator at that location rather than moving the indicator through the information.
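The tap-versus-gesture behaviour of the directional selection options might be sketched as follows; the word and paragraph boundary rules are simplified assumptions made for this illustration.

```kotlin
// Simplified word and paragraph boundary rules; illustrative only.
fun moveRightOneCharacter(text: String, pos: Int): Int =
    (pos + 1).coerceAtMost(text.length)

// A swipe on the right directional option: move to the start of the next word.
fun moveToNextWordStart(text: String, pos: Int): Int {
    var i = pos
    while (i < text.length && !text[i].isWhitespace()) i++   // finish the current word
    while (i < text.length && text[i].isWhitespace()) i++    // skip the space(s)
    return i
}

// A swipe on the up directional option: move to the start of the current paragraph.
fun moveToParagraphStart(text: String, pos: Int): Int {
    val newline = text.lastIndexOf('\n', (pos - 1).coerceAtLeast(0))
    return if (newline < 0) 0 else newline + 1
}

fun main() {
    val text = "Dinner tonight?\nSee you at seven."
    println(moveRightOneCharacter(text, 3))   // 4: a tap moves one character
    println(moveToNextWordStart(text, 3))     // 7: the start of "tonight?"
    println(moveToParagraphStart(text, 20))   // 16: the start of "See you at seven."
}
```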
  • Any other controls may be included with the editing control 706 such as any other editing control, any other keyboard key, and so forth. The editing control 706 may be a toggle button, a switch, a drop-down menu, a virtual trackpad, a virtual joystick, a virtual directional pad (D-pad), any combination of the foregoing, and so forth.
  • A touch associated with the highlight selection option 708 is detected to initiate and end highlighting of information in this example. When the highlight selection option 708 is selected, e.g., a touch associated with the highlight selection option 708 is detected, a detected touch associated with an editing control for moving the indicator, e.g., a directional selection option, results in highlighting information. For example, when selection of the highlight selection option 708 is detected, an end point for highlighting is initiated at the position of the indicator 714 in the information and is moved as a result of a detected touch associated with an editing control. A subsequent selection of the highlight selection option 708 may result in initiating a different (another) end point for the highlighting. The highlighting may remain while the highlight selection option 708 is selected, e.g., while a touch is detected, until a second touch associated with the highlight selection option 708 is detected, and so forth. The highlighting may end when the highlight selection option 708 is not selected, e.g., when a touch is released, when the highlight selection option 708 is selected a second time to toggle the highlight selection option 708, and so forth. Alternatively, the highlighting may remain when the highlight selection option 708 is not selected, e.g., the highlighting may remain until another editing control is selected, until the editing control is no longer displayed, and so forth. The highlight selection option 708 may behave similarly to the SHIFT key on a keyboard. This process may be utilized to select or change an end point for the highlighting.
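A minimal sketch of the SHIFT-like behaviour of the highlight selection option follows; the character-index selection model and its names are assumptions, not part of the disclosure.

```kotlin
// Illustrative character-index selection model; not part of the disclosure.
class SelectionModel(var indicator: Int) {
    private var anchor: Int? = null

    // Selecting the highlight option anchors an end point at the indicator;
    // selecting it again toggles highlighting off, similar to a SHIFT key.
    fun toggleHighlight() {
        anchor = if (anchor == null) indicator else null
    }

    // Directional selection options move the indicator; while highlighting is
    // engaged this moves one end point, and the other end point stays at the anchor.
    fun moveIndicator(delta: Int) {
        indicator += delta
    }

    // The highlighted character range, or null when highlighting is not engaged.
    fun highlightedRange(): IntRange? = anchor?.let {
        minOf(it, indicator) until maxOf(it, indicator)
    }
}

fun main() {
    val selection = SelectionModel(indicator = 12)
    selection.toggleHighlight()              // first selection: end point anchored at 12
    selection.moveIndicator(+5)              // touches on a directional option
    println(selection.highlightedRange())    // 12..16
    selection.toggleHighlight()              // second selection toggles highlighting off
    println(selection.highlightedRange())    // null
}
```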
  • The highlight selection option 708 may be displayed at or near the location 704 of the release touch to facilitate easy selection of the highlight selection option by the input member of the release touch. The highlight selection option 708 is optionally displayed at or near the location 702 of the hold touch to facilitate easy selection of the highlight selection option 708 by the input member of the hold touch. For example, the input member of the hold touch may move along the display 118 to the highlight selection option 708 to select the highlight selection option 708. The highlight selection option 708 may be displayed in any other location on the display 118. Although the highlight selection option 708 shown in FIG. 7 is displayed as a key or button, highlighting may be engaged or disengaged by a drop-down menu, a physical key, a radio button, a slider, an option button, a menu, a checkbox, and so forth.
  • The selection options 710, 712 in the example of FIG. 7 may optionally toggle between display of the virtual keyboard 212 and display of the editing control 706, may toggle display of other selection options or controls, and so forth. For example, the EDIT selection option 710 initiates display of the editing control 706 as shown on the right device 100 of FIG. 7. The KEYBOARD selection option 712 may initiate display of the virtual keyboard 212 as shown on the left device 100. The selection options 710, 712 may be selected as an alternative to using the hold touch and the release touch to toggle display of the editing control. The selection options 710, 712 may also be associated with the hold touch and the release touch. For example, after detection of the hold touch and the release touch results in the display of the editing control 706, a touch associated with the KEYBOARD selection option 712 results in display of the virtual keyboard 212 to replace the display of the editing control 706.
  • Release of the hold touch results in display of the virtual keyboard 212 instead of the editing control 706. The editing control 706 may be displayed while the hold touch is detected, until the hold touch is released and is thus no longer detected. Alternatively, a touch associated with the KEYBOARD selection option 712 may result in display of the virtual keyboard instead of the editing control 706. The hold touch may optionally move to any location on the touch-sensitive display 118. Release of the hold touch may alternatively result in display of any other selection options or controls.
  • A flowchart illustrating a method of touch-sensitive control including displaying an editing control on the touch-sensitive display 118 is shown in FIG. 9. The method may be carried out by software executed, for example, by the processor 102. Coding of software for carrying out such a method is within the scope of a person of ordinary skill in the art given the present description. The method may contain additional or fewer processes than shown and/or described, and may be performed in a different order. Computer-readable code executable by at least one processor of the portable electronic device to perform the method may be stored in a computer-readable medium, such as a non-transitory computer-readable medium.
  • When a hold touch and a release touch are detected 902 and release of the release touch is detected 904, an editing control is displayed 906. The hold touch may be the hold touch at location 702 and the release touch may be the release touch at location 704 of the examples of FIG. 7 and FIG. 8. The hold touch and the release touch may be any other touch or gesture. The electronic device 100 may determine that the hold touch and the release touch at least partially overlap in time, e.g., for 0.5 seconds, 1 second, or any other suitable time. Further, determining that the hold touch and the release touch at least partially overlap in time may include detecting that the hold touch and the release touch overlap in time for at least a time value. The time value may be any threshold amount of time that may be established by a manufacturer of the electronic device 100, by a programmer of the electronic device 100, by a user of the electronic device 100, and so forth. Detecting that the hold touch and the release touch overlap in time for at least a time value may prevent detection of two or more touches that are not intended to initiate display of the editing control, e.g., multiple touches that are intended to select keys of the virtual keyboard, inadvertent touches, and so forth.
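The overlap-for-at-least-a-time-value check at 902 could look roughly like the following; the 500 ms threshold and the ActiveTouch type are assumptions introduced for this illustration.

```kotlin
// Illustrative touch bookkeeping; the 500 ms threshold is an assumed value.
data class ActiveTouch(val id: Int, val downMs: Long, val upMs: Long? = null)

// Block 902: the editing control is displayed only if the hold touch is still
// held, the release touch has been released, and the two touches overlapped
// in time for at least minOverlapMs.
fun shouldDisplayEditingControl(hold: ActiveTouch, release: ActiveTouch,
                                nowMs: Long, minOverlapMs: Long = 500): Boolean {
    val releaseUp = release.upMs ?: return false   // release touch not yet released
    if (hold.upMs != null) return false            // hold touch must still be held
    val overlapStart = maxOf(hold.downMs, release.downMs)
    val overlapEnd = minOf(releaseUp, nowMs)
    return overlapEnd - overlapStart >= minOverlapMs
}

fun main() {
    val hold = ActiveTouch(id = 1, downMs = 0)                   // still held down
    val release = ActiveTouch(id = 2, downMs = 100, upMs = 700)  // released at 700 ms
    println(shouldDisplayEditingControl(hold, release, nowMs = 750))                       // true
    println(shouldDisplayEditingControl(hold, release, nowMs = 750, minOverlapMs = 1000))  // false
}
```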
  • The editing control displayed at 906 may be the editing control 706 and the highlight control 708 of the examples of FIG. 7 and FIG. 8. Alternatively, the editing control may include any number and type of controls, such as one or more selection options for editing information, one or more controls for inputting information, e.g., a SPACE key, a keyboard key, and so forth, one or more controls for moving an indicator through information, and so forth.
  • When release of the hold touch is detected 908, display of the editing control is discontinued 910. For example, as shown in the example of FIG. 8, the display of the editing control 706 and the highlight control 708 is replaced by display of the virtual keyboard 212. The display of the editing control may be replaced by any display such as display of information, display of a background, display of one or more controls, and so forth.
  • Although display of an editing control is initiated by the combination of the hold touch and the release touch as shown in FIG. 9, any other display or action may be initiated by the combination of the hold touch and the release touch. For example, the first control 202 and the second control 204 as shown in FIG. 2 are displayed at 906 in response to detection at 902, 904 of the combination of the hold touch and the release touch. Although the method of FIG. 9 is described with reference to FIG. 7 and FIG. 8, the method is applicable to any other electronic device.
  • In the example of FIG. 10, the electronic device 100 is a tablet 1000. As shown in the example of FIG. 10, the tablet 1000 includes a touch-sensitive display 1002 that includes a non-display area 1004 and a display area 1006. The tablet 1000 includes a first control 1008 and a second control 1010 positioned in the non-display area 1004. While the controls 1008, 1010 are shown in FIG. 10, the controls 1008, 1010 are not displayed because they are associated with the non-display area 1004. In this example, touch sensors are disposed in the non-display area 1004. The controls 1008, 1010 may operate in the same manner as the controls 202, 204 of FIG. 2, FIG. 3, FIG. 4, and FIG. 5. For example, a touch associated with the first control 1008 moves the indicator 1012.
  • A marker may be displayed in the display area 1006 of the touch-sensitive display 1002 to indicate the positions of the controls 1008, 1010 in the non-display area 1004. The touch-sensitive display 1002 may display the marker at or near the border of the display area 1006 adjacent to the positions of the controls 1008, 1010. The marker may be a line, a symbol, an icon, a bar, an arrow, and so forth. A light-emitting diode or other small visual indicator may be disposed under the non-display area 1004 to indicate the control location. The areas associated with the controls 1008, 1010 may be anywhere in the non-display area 1004, for example, next to the display area 1006.
  • FIG. 10 also illustrates optional locations for the controls 1008, 1010 in the non-display area 1004. The control 1014 is at or near the center of a side of the non-display area 1004. One or more controls may be positioned at or near the center of any of the other sides of the non-display area 1004. Control 1016 is positioned at a corner of the non-display area 1004. One or more controls may be positioned at any other corner of the non-display area 1004. Control 1018 extends along substantially an entire length of a side of the non-display area 1004. One or more controls may extend along other sides of the non-display area. Any combination of the example controls 1008, 1010, 1014, 1016, and 1018 may be provided. A control may comprise a substantial area of the non-display area 1004 to facilitate selection of the control, e.g., substantially an entire side of the non-display area 1004, an area larger than an area encompassed by a touch, and so forth. Optionally, a control may be associated with an area at or near a location of a touch detected in the non-display area.
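Associating controls with areas of the non-display area amounts to hit-testing touch coordinates against per-control regions, sketched below with coordinates and region shapes that are purely illustrative.

```kotlin
// Illustrative coordinates and region layout; not taken from the disclosure.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

data class NonDisplayControl(val name: String, val region: Rect)

// Hit-test a touch reported in the non-display area against the control regions.
fun controlForTouch(controls: List<NonDisplayControl>, x: Float, y: Float): NonDisplayControl? =
    controls.firstOrNull { it.region.contains(x, y) }

fun main() {
    val controls = listOf(
        NonDisplayControl("first control", Rect(0f, 900f, 40f, 1000f)),      // left edge
        NonDisplayControl("second control", Rect(560f, 900f, 600f, 1000f)),  // right edge
        NonDisplayControl("side-length control", Rect(0f, 0f, 600f, 30f))    // entire top side
    )
    println(controlForTouch(controls, x = 20f, y = 950f)?.name)   // first control
    println(controlForTouch(controls, x = 300f, y = 500f)?.name)  // null: not in any region
}
```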
  • A flowchart illustrating a method of touch-sensitive control is shown in FIG. 11. The method may be carried out by software executed, for example, by the processor 102. Coding of software for carrying out such a method is within the scope of a person of ordinary skill in the art given the present description. The method may contain additional or fewer processes than shown and/or described, and may be performed in a different order. Computer-readable code executable by at least one processor of the portable electronic device to perform the method may be stored in a computer-readable medium, such as a non-transitory computer-readable medium.
  • Information is displayed 1102 in the display area 1006 of the touch-sensitive display 1002. An indicator such as described above is optionally displayed within the information. The information may be information input into the tablet 1000 or received in a communication by the tablet 1000, e.g., an electronic mail message (e-mail), a short message service (SMS) message, a webpage, a document, a calendar event, a contact, and so forth.
  • One or more controls are associated 1104 with areas of the non-display area 1004 of the touch-sensitive display 1002. The controls may be, for example, the controls 1008, 1010 shown in FIG. 10, which may be substantially similar to the controls 202, 204 shown in FIG. 2, FIG. 3, FIG. 4, and FIG. 5, or any other controls. When a touch associated with one of the controls 1008, 1010 is detected 1106, the indicator 1012 is moved 1108 through the information in accordance with the touch characteristics, e.g., type and direction. The touch may move in any direction, including multiple directions, resulting in the indicator 1012 being moved in the same direction(s) along with the movement of the touch. A first touch associated with the first control 1008 and a second touch associated with the second control 1010 may result in the indicator 1012 moving in the same direction. For example, when a touch associated with the second control 1010 is detected 1106, the indicator is moved 1108 in the right direction. When a gesture indicating movement to the right, e.g., a swipe to the right, associated with the first control 1008 is detected 1106, the indicator 1012 is also moved 1108 to the right.
  • The detection 1106 and movement 1108 may be repeated any number of times. Although a first direction and a second direction are discussed above, any number of directions may be associated with the controls 1008, 1010. The movement 1108 may be up, down, left, right, or any combination of directions.
  • In the example of FIG. 12, the editing control 706 includes a virtual trackpad 1202. The example editing control 706 also includes selection options to facilitate performing cut, copy, paste, and select functions. Additional selection options may be included in the editing control 706. For example, the editing control 706 may include selection options for delete, page up, page down, and so forth. The editing control 706 including the virtual trackpad 1202 is displayed in response to detecting the hold touch at hold touch location 702 and detecting release of the release touch at release touch location 704. When release of the hold touch is detected, display of the editing control 706 including the virtual trackpad 1202 is discontinued. Alternatively, any other touch, gesture, instruction, and so forth may result in display of the editing control 706 including the virtual trackpad 1202. The editing control 706 may be displayed in response to detecting a touch associated with the selection option 710, and display of the editing control 706 may be discontinued in response to detecting a touch associated with the selection option 712. The display of the editing control 706 may be discontinued in response to detecting a touch in an area not associated with the editing control 706, e.g., a touch in an area not associated with the virtual trackpad 1202 and not associated with the selection options. The display of the editing control 706 may also be discontinued in response to detecting a double tap associated with the virtual trackpad 1202.
  • The virtual trackpad 1202 is displayed as a border surrounding an area in which touches associated with the virtual trackpad 1202 are detected. Alternatively, any information that identifies the area of the touch-sensitive display 118 associated with the virtual trackpad 1202 may be displayed. The virtual trackpad 1202 and any of the selection options or controls of the editing control 706 may be overlaid over the virtual keyboard 212 such that some or all of the virtual keyboard 212 remains visible. Display of the editing control 706 may advantageously replace the display of the virtual keyboard 212 to increase the amount of the touch-sensitive display 118 available for display of information, e.g., when the editing control 706 is displayed on a portable electronic device. Although the example editing control 706 replaces the display of the virtual keyboard 212 and is displayed within the same dimensions as the virtual keyboard 212, the editing control 706 including the virtual trackpad 1202 may be displayed in any suitable size.
  • When a touch associated with the virtual trackpad 1202 is detected as a swipe, the indicator 714 is moved in the direction of the swipe. The indicator 714 is moved a distance based on the distance of the swipe. Alternatively, the indicator 714 may be moved a distance that is not based on the distance of the swipe. For example, the indicator may move by one character or other unit for each detected swipe. Any other touch or gesture associated with the virtual trackpad 1202 may be detected, and any other action may be performed in response to a touch or gesture associated with the virtual trackpad 1202. For example, a touch at or near a side of the virtual trackpad 1202 may result in moving the indicator 714 in the direction of the associated side, e.g., a touch at or near the top of the virtual trackpad 1202 may result in moving the indicator 714 up, and a touch at or near the left side of the virtual trackpad 1202 may result in moving the indicator 714 to the left. As another example, a touch at or near the center or a corner of the virtual trackpad 1202 may engage and disengage selection.
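One way the swipe-distance-to-indicator-distance mapping of the virtual trackpad might be expressed is sketched below; the pixels-per-character and pixels-per-line scales are assumed values used only for illustration.

```kotlin
// Illustrative pixel-to-character scales; not taken from the disclosure.
data class SwipeDelta(val dx: Float, val dy: Float)

data class IndicatorMove(val characters: Int, val lines: Int)

// The indicator moves a distance based on the distance of the swipe.
fun trackpadMove(delta: SwipeDelta,
                 pxPerCharacter: Float = 12f,
                 pxPerLine: Float = 24f): IndicatorMove =
    IndicatorMove(
        characters = (delta.dx / pxPerCharacter).toInt(),
        lines = (delta.dy / pxPerLine).toInt()
    )

fun main() {
    // A swipe 60 px right and 24 px down moves the indicator five characters
    // to the right and one line down under these assumed scales.
    println(trackpadMove(SwipeDelta(dx = 60f, dy = 24f)))   // IndicatorMove(characters=5, lines=1)
}
```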
  • A first touch associated with the virtual trackpad 1202 may be detected to move the indicator 714 to a first position in displayed information. A touch associated with the selection option identified "select" may be detected to initiate highlighting. A second touch associated with the virtual trackpad 1202 may be detected to move a second indicator to a second position in the displayed information. The displayed information between the indicator 714 and the second indicator is highlighted.
  • In the example of FIG. 13, the virtual trackpad 1202 may be displayed in conjunction with display of an enlargement of information 1302. The enlargement of information 1302 includes an enlarged display of information as described in further detail below. Alternatively, a second virtual trackpad may be displayed instead of the enlargement, or a second enlargement may be displayed instead of the virtual trackpad. For example, a touch associated with the first virtual trackpad or first enlargement results in moving a first end point of highlighting or a first indicator, and a touch associated with the second virtual trackpad or second enlargement of information results in moving a second end point of highlighting or a second indicator.
  • In the example of FIG. 14, one or more enlargements and/or virtual trackpads 1404 are displayed between a left side of a virtual keyboard 1402 and a right side of the virtual keyboard 1406. The enlargement includes an enlarged display of information as described in further detail below. The virtual trackpad 1404 may be implemented as described in conjunction with the virtual trackpad 1202. For example, the one or more enlargements and/or virtual trackpads 1404 may be an enlargement and a trackpad in any order, two virtual trackpads, or two enlargements. Each enlargement or virtual trackpad may control a different end point of highlighting or a different indicator.
  • For embodiments herein, highlighting may be controlled by receiving input to move and establish a first end point of the highlighting and receiving input to move and establish a second end point of the highlighting. Optionally, input may be received to simultaneously or substantially simultaneously move two end points of the highlighting, e.g., input associated with a first selection option or control may result in moving a first end point of the highlighting and input associated with a second selection option or control may result in moving a second end point of the highlighting. Alternatively, a first end point of highlighting may be fixed, e.g., a first end point may be fixed at a location of an indicator when highlighting is initiated, and input to move and establish a second end point may be received. Optionally, input results in selecting an end point of highlighting for moving.
  • For embodiments herein, a density of touch sensors may be uniform or may vary throughout the touch-sensitive display 118. For example, the density of the touch sensors may vary between display area(s) and non-display area(s). The density of the touch sensors may be greater in areas where editing controls are provided, e.g., the virtual trackpad 1202; the controls 202, 204, 1008, 1010, 1014, 1016, and 1018; the editing control 706; and so forth. The touch sensors may be disposed in only part(s) of the touch-sensitive display 118. For example, the touch sensors may be disposed at or near a location where the display area meets the non-display area of the touch-sensitive display 118.
  • A touch-sensitive editing control is displayed to facilitate the movement of an indicator through information displayed on a touch-sensitive display of an electronic device. The touch-sensitive controls are displayed when two touches that at least partially overlap in time are detected and release of one of the touches is detected. The display of the touch-sensitive controls is replaced when release of the other one of the touches is detected. The combination of touches, e.g., the two touches followed by release of a first touch and later release of a second touch, facilitates easier access to the editing control and easier return to a previous display, e.g., a virtual keyboard.
  • An electronic device comprises a touch-sensitive display and a processor coupled to the touch-sensitive display and configured to display information on the touch-sensitive display, display a first control, wherein a touch associated with the first control results in moving an indicator through the information in a first direction and in a second direction, wherein the first control does not move with movement of the indicator, display a second control, wherein a touch associated with the second control results in moving the indicator through the information in the first direction and the second direction, wherein the second control does not move with movement of the indicator, detect a first touch associated with the first control, in response to the detecting, move the indicator in the first direction, and in response to detecting a second touch associated with the second control, move the indicator in the first direction.
  • A method comprises displaying information on a touch-sensitive display of an electronic device, displaying a first control, wherein a touch associated with the first control results in moving an indicator through the information in a first direction and in a second direction, wherein the first control does not move with movement of the indicator, displaying a second control, wherein a touch associated with the second control results in moving the indicator through the information in the first direction and in the second direction, wherein the second control does not move with movement of the indicator, detecting a first touch associated with the first control, in response to the detecting, moving the indicator in the first direction, and in response to detecting a second touch associated with the second control, moving the indicator in the first direction. The method may also comprise initiating highlighting of the information in response to detecting the first touch and the second touch.
  • An electronic device comprises a touch-sensitive display and a processor coupled to the touch-sensitive display and configured to detect a hold touch and a release touch on a touch-sensitive display of an electronic device wherein the hold touch and the release touch overlap at least partially in time, detect release of the release touch, and in response to detecting the release of the release touch, display an editing control while the hold touch is detected. The editing control may include a highlight control for identifying one or more end points in displayed information.
  • A method comprises detecting a hold touch and a release touch on a touch-sensitive display of an electronic device, wherein the hold touch and the release touch overlap at least partially in time, detecting release of the release touch, and in response to detecting the release of the release touch, displaying an editing control while the hold touch is detected. The method may comprise determining that the hold touch and the release touch overlap in time for at least a first time value. The method may include moving an indicator from a first word to a second word in response to detecting a gesture associated with the editing control.
  • When viewing information on an electronic device, e.g., information input to or received by the electronic device, a user may manipulate the information, e.g., make changes, move, cut, copy, paste, delete, and perform other functions with the information. For example, a user may edit information by moving an indicator within the information. An indicator includes a cursor, a marker, a blinking character, a pointer, highlighting, and so forth. Editing the information may be difficult when the information is displayed in a small size. For example, portable electronic devices typically include small displays. Coarse input resolution of an input device, such as coarse sensor resolution of a touch-sensitive display, may cause difficulty in performing fine selection or movement of an indicator within information displayed in a small size. For example, a user may have difficulty positioning or moving a cursor because accurately touching a position between two characters is difficult. To aid in the manipulation of information, at least part of the information is enlarged or magnified, also referred to as zooming, and displayed on the touch-sensitive display 118. The enlargement may replace at least part of other displayed information, such as a virtual keyboard, virtual keys, controls, or other information displayed on the touch-sensitive display 118.
  • A flowchart illustrating a method of displaying an enlargement of information on the touch-sensitive display 118 is shown in FIG. 15. The method may be carried out by software executed, for example, by the processor 102. Coding of software for carrying out such a method is within the scope of a person of ordinary skill in the art given the present description. The method may contain additional or fewer processes than shown and/or described, and may be performed in a different order. Computer-readable code executable by at least one processor of the portable electronic device to perform the method may be stored in a computer-readable medium, such as a non-transitory computer-readable medium.
  • Information is displayed 1502 on the touch-sensitive display 118. Information is displayed in one area 1602 and information in the form of a virtual keyboard is displayed in another area 1604 in the example of FIG. 16. Alternatively, the information from the area 1602 may continue into the area 1604, such as shown in the example of FIG. 19. Information displayed in the areas 1602, 1604 may include one or more controls, selection options, and so forth. The controls include but are not limited to one or more selection options, switches, drop-down menus, dials, scrollbars, sliders, and so forth. The displayed information may be information input to the portable electronic device 100, information received in a communication by the portable electronic device 100, e.g., in an electronic mail message, in a short message service (SMS) message, a webpage, a document, a calendar event, a contact, and so forth. An indicator, e.g., a cursor, a marker, a blinking character, a pointer, highlighting, and so forth, may be displayed in the information, such as the indicator 1606 shown in FIG. 16.
  • When a touch that invokes an enlargement is detected 1504, at least part of the information is displayed 1506 in an enlarged form. The enlargement may be invoked, also referred to as initiated or activated, by detecting a touch associated with a selection option, such as the “EDIT” selection option 1608 in the example of FIG. 16, detecting a gesture, detecting one or more touches or gestures associated with the information, detecting selection of a physical button or key, detecting a touch associated with a menu item, and so forth. A selection option may be utilized to alternate between display of the enlargement and display of other information. The selection option may be displayed in the same location on the display, e.g., the “EDIT” selection option 1608 and the “ABC” selection option 1616.
  • An indicator, e.g., the indicator 1614 in the example of FIG. 16, may be displayed in the enlargement. For example, the indicator may be an enlarged indicator displayed at a position corresponding with the location of the indicator in the information, e.g., between the same characters of the information, such as shown in FIG. 16, the same highlighted character(s) of the information, and so forth. The touch-sensitive display 118 may optionally display other selection options, such as cut, copy, paste, delete, directional options or controls 1618, 1620, 1622, 1624 in the example of FIG. 16, and so forth.
  • The information included in the enlargement corresponds with information at or near the indicator, such as shown in the example of FIG. 16. In this example, information from a row above and a row below the indicator 1606 as well as information to the left and right of the indicator 1606 is shown. Because more information than, for example, a single word, is displayed, a user is able to see more information to facilitate easier and more flexible movement of the indicator through the information. Alternatively, information included in the enlargement may correspond with any other information, such as the beginning of information, the end of information, information at or near a misspelled word, information that was previously included in the enlargement, and so forth.
  • Indicators may be displayed in the enlargement, the information, or both. When a touch associated with any indicator is detected at 1508, the indicator in the information and/or the indicator in the enlargement are moved 1510 in accordance with the touch, e.g., up, down, left, and/or right. If, after a period of time, a touch associated with an indicator is not detected at 1508, the method proceeds to 1512. A touch associated with an indicator includes a touch on, at, or near either indicator, a touch associated with a control for an indicator, such as the directional options 1618, 1620, 1622, 1624 in the example of FIG. 16 or an editing control such as a joystick, any touch in the enlargement or the trackpad, and so forth. In the example of FIG. 17, the touch associated with the indicator may be a touch at touch location 1702 associated with the area 1604 of the touch-sensitive display 118 associated with the display of the enlargement. Alternatively, input may be provided from a control other than the touch-sensitive display 118, such as a physical button, a key of a physical keyboard, a mouse, a navigation device, e.g., a joystick, an optical navigation device, track pad, and so forth. The indicator in the enlargement and/or the indicator in the information may be moved to a different position, may highlight information, may perform another function, or may be modified in any way. The enlargement may function similar to a virtual trackpad, e.g., touches in the area associated with the enlargement move the indicator.
  • The indicator may highlight the information. One or both ends of the highlighting may be adjusted to select different end points, e.g., characters, of the highlighting within the information. The end points may be moved one at a time, e.g., selection and optional movement of one end point followed by selection and optional movement of the other end point. The end points may be selected/moved in any order, and selection/movement may be repeated for either or both end points. Optionally, both end points may be moved simultaneously, e.g., by separate touches, one associated with each end point. The highlighting may be any type of marking of the information to cause the highlighted information to appear different than unhighlighted information, such as background color or style, underlining, outlining, bolding, italics, shading, text coloring, font, relative information size, and so forth.
  • Information displayed in the enlargement may change as the indicator moves through the information. For example, the information displayed in the enlargement may change responsive to the movement of the indicator to maintain the indicator at or near the center of the enlargement. For example, the indicator or the word in which the indicator is located may be centered in the area of the enlargement. An indicator may be at or near the center of enlargement when the indicator is close to the center, is about the center, is away from the center by a character, a word, a line of text, and so forth. An indicator may be offset from the center due to the size of information displayed in an enlargement, due to the length of a line of the information, due to the length of a word, and so forth.
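Keeping the indicator at or near the center of the enlargement can be thought of as choosing a window of the information around the indicator, as in this sketch; the flat character index, window size, and clamping behaviour are assumptions made for illustration (real text layout and line wrapping are ignored).

```kotlin
// Illustrative windowing over a flat character index; line wrapping is ignored.
fun enlargementWindow(textLength: Int, indicatorPos: Int, windowSize: Int): IntRange {
    var start = indicatorPos - windowSize / 2
    start = start.coerceIn(0, maxOf(0, textLength - windowSize))
    val end = minOf(textLength, start + windowSize)
    return start until end
}

fun main() {
    val text = "Are we still on for dinner tonight? See you at seven."
    // With the indicator at the start of "tonight", the window keeps it near the centre.
    val window = enlargementWindow(text.length, indicatorPos = 27, windowSize = 20)
    println(window)                                         // 17..36
    println(text.substring(window.first, window.last + 1))  // the enlarged part of the text
}
```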
  • When an indication to end the enlargement is detected 1512, the display of the enlargement ends. For example, the virtual keyboard may be displayed to replace the display of the enlargement, the display of the information may be expanded to replace the display of the enlargement, additional controls may be displayed to replace the display of the enlargement, and so forth. The indication to end the enlargement may be detected at 1512 upon detecting selection of a selection option to end the enlargement, such as the "ABC" selection option 1616, after a period of time occurs without detecting a touch, upon detection of a gesture indicating the end of the enlargement, upon completion of an editing function such as cut, copy, paste, or delete, and so forth.
  • Although the method of FIG. 15 is described with reference to FIG. 16 and FIG. 17, the method is applicable to FIG. 18 and FIG. 19 and any other electronic device.
  • Information is displayed on the device 100 in an upper area 1602 and a virtual keyboard is displayed in a lower area 1604 of the left device 100 in the example of FIG. 16. An indicator 1606 is displayed in the information in this example. The two areas may be separated from each other, adjacent to each other, distanced from each other, side by side vertically or horizontally, one area at least partially surrounding the other area, and so forth. The information in the upper area 1602 may be displayed in the lower area 1604 and the information in the lower area 1604 may be displayed in the upper area 1602. A boundary line or other visual element may separate the areas. The information in the upper area 1602 may continue into the lower area 1604 and no visual element may separate the areas, as shown in the example of FIG. 19. Alternatively, the areas 1602, 1604 may at least partially overlap, where the enlargement or the information is partially translucent.
  • The sizes of the areas may vary. As shown in the example of FIG. 16, the size of the upper area 1602 on the right device 100 is smaller than the upper area 1602 of the left device 100, and the lower area 1604 of the right device 100 is larger than the lower area 1604 of the left device 100. Alternatively, the size of the upper area may increase and the size of the lower area may decrease when the enlargement is displayed, or the sizes of the areas may remain the same. Optionally, the sizes of the areas may change while the enlargement is displayed, for example, to facilitate display of larger words in the enlargement, to facilitate faster movement of the indicator, and so forth.
  • The display of the virtual keyboard on the left device 100 in the example of FIG. 16 includes a selection option 1608 labeled “EDIT” that may be utilized to invoke the enlargement. Detection of a touch at a touch location associated with the selection option 1608, such as the location 1610 in FIG. 16, invokes the enlargement 1612. The selection option 1608 may include a label, such as a text label, a graphic label, a symbolic label, and so forth.
  • The enlargement 1612 includes display of some of the information from the upper area 1602 in a larger size than the information displayed in the upper area 1602. As shown on the right device 100 of FIG. 16, the enlarged information is centered at or near the cursor 1614 displayed in the enlargement. Any part of the information may be displayed in the enlargement. The amount by which the display of the information is increased in size, e.g., the amount of “zoom,” the amount of magnification, and so forth, and the number of lines of information displayed may vary from the example of FIG. 16. The amount of enlargement may be uniform or variable, e.g., information near an indicator or in the same row as an indicator may be enlarged more than other information. The amount of enlargement may be adjusted in response to a touch associated with a selection option, a control, a gesture, a menu selection, or any other input.
  • The indicator 1614 is displayed at a position in the enlargement 1612 corresponding to a position of the indicator 1606 in the information displayed in the upper area 1602. The indicator 1614 is displayed at or near the center of the enlargement 1612 in this example. Alternatively, the enlargement 1612 and the indicator 1614 may be displayed such that the indicator 1614 is in another position relative to the enlargement 1612, such as at or near the top left corner, or any other position.
  • To facilitate end of display of the enlargement, a selection option may be displayed, such as the “ABC” selection option 1616 displayed in the lower area 1604 of the right device 100 in the example of FIG. 16.
  • When the enlargement is displayed, the electronic device 100 optionally displays editing controls. For example, directional options 1618, 1620, 1622, 1624 are shown displayed in the lower area 1604 in the example of FIG. 16. When a touch associated with any of the directional options 1618, 1620, 1622, 1624 is detected, the indicator 1614 and the indicator 1606 are moved in accordance with the direction associated with the control. In this example, the directional options 1618, 1620, 1622, 1624 are associated with the directions up, left, right, and down with respect to the orientation of the text.
  • As shown in the example of FIG. 17, a touch at a touch location 1702 associated with the indicator 1614 is detected, and the indicator 1614 is moved in accordance with the touch from the position of the indicator 1614 shown on the left device 100 to the position of the touch location 1702 on the right device 100. The touch is detected at the touch location 1702 between “o” and “r” in the word “for” in this example. The indicators 1606, 1614 are moved to the corresponding position in the information displayed in both areas 1602, 1604. As shown in FIG. 17, the part of the information displayed in the enlargement changes, such that the indicator 1614 remains at or near the center of the enlargement. In this example, the information displayed in the enlargement moves to the left as the indicator 1614 moves to the right to maintain the indicator 1614 at or near the center of the enlargement. Alternatively, the information may not change based on the movement of the indicator, may be changed but delayed from movement of the indicator, may be changed in response to detecting a touch associated with a control, may be changed in response to detecting movement of a touch associated with the indicator, may be changed to facilitate viewing of other parts of the information, and so forth. Moving includes changing the position of information, which movement may or may not be animated to appear as though the information is moving across the touch-sensitive display 118.
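Translating a touch location inside the enlargement into a position between characters, as in the touch between "o" and "r" above, might be sketched as below; the fixed character width is assumed purely for illustration, since real text layout would use per-character advance widths.

```kotlin
import kotlin.math.roundToInt

// A fixed character width is assumed purely for illustration; real text layout
// would query per-character advance widths.
fun touchToCharOffset(touchX: Float, enlargementLeft: Float, charWidthPx: Float): Int =
    ((touchX - enlargementLeft) / charWidthPx).roundToInt().coerceAtLeast(0)

fun main() {
    // With 32 px wide enlarged characters, a touch 76 px from the left edge of
    // the enlargement maps to offset 2: the boundary between the second and
    // third visible characters, where the indicator would be placed.
    println(touchToCharOffset(touchX = 176f, enlargementLeft = 100f, charWidthPx = 32f))   // 2
}
```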
  • An indicator illustrating highlighting of information is shown in the example of FIG. 18. Highlighting may be initiated by detection of a touch associated with a selection option, a gesture, selection from a menu, depression of a physical key, movement of a mouse, a touch on a physical navigation device, a combination of inputs, and so forth. For example, editing controls displayed in the lower area 1604 may include a selection option for initiating highlighting, or a double tap may initiate highlighting. In the example of FIG. 18, when a touch associated with any of the directional options 1618, 1620, 1622, 1624 is detected after highlighting is initiated, an end point of the highlighting moves accordingly. After highlighting is initiated, editing functions may be performed such as copying highlighted information, cutting highlighted information, deleting highlighted information, pasting highlighted information, and so forth. A second editing function, such as pasting information, may optionally be performed, for example, by moving the indicator to the paste position as described above.
  • As shown in the example of FIG. 17, controls 1708 and 1710, which may be similar to the controls 202, 204 described above, may also be provided. The controls 1708 and 1710 may be displayed or not displayed, e.g., provided in a non-display area. The controls 1708 and 1710 may be provided in addition to or as an alternative to the selection options 1618, 1620, 1622, and 1624.
  • In the example of FIG. 18, after a double tap in the lower area 1604 initiates highlighting, a touch moves from a first touch location 1802 along the display to a second location 1804. The first touch location 1802 is associated with the first “t” in “tonight” and the second touch location 1804 is associated with the second “t” in “tonight” in this example. The highlighting is displayed with the information in the enlargement between the two positions in the text, thus highlighting 1808 of “tonight” is displayed. Optionally, an indication of the end points of the highlighting may be displayed and a visual indication may be displayed to indicate the end point that is currently being manipulated. Highlighting may be displayed in one or both of the areas 1602, 1604. In this example, highlighted information 1810 corresponding to highlighted information 1808 from the enlargement is displayed. Highlighting may persist in the area 1602 after display of the enlargement is ended. Additionally or alternatively, the directional options 1618, 1620, 1622, 1624 may persist after display of the enlargement is ended. Alternatively, highlighting may not be displayed in the upper area. Optionally, highlighting 1810 may be displayed when display of the enlargement ends.
  • Prior to invocation of the enlargement, information may be displayed seamlessly or continuously in both areas 1602, 1604 of the device 100 as shown in the example of FIG. 19. The information may be a single continuous set of information, may be multiple discrete sets of information, and so forth. For example, an email, a webpage, a document, and so forth may be displayed. When the enlargement 1612 is invoked, the enlargement replaces part of the display of the information in the second area 1604. Although characters are shown in FIG. 19, the information may include images, graphics, symbols, other types of information, and so forth.
  • Display of the enlargement of information facilitates movement of an indicator through information displayed on a touch-sensitive display, making editing of the information easier. Because the information is displayed in a larger size, movement of an indicator through the information, such as moving a cursor or highlighting information, is easier, which facilitates reviewing or editing of the information. The enlargement is advantageously applied to portable electronic devices, which typically include relatively small touch-sensitive displays. Selection options may be provided to invoke display of the enlargement and to indicate the end of display of the enlargement. Additional selection options for editing or manipulating the information may be provided.
  • An electronic device comprises a touch-sensitive display and a processor operably coupled with the touch-sensitive display and configured to display information in a first area of the touch-sensitive display, display an enlargement including at least part of the information in a second area outside the first area to replace at least part of a virtual keyboard, detect a touch associated with the second area, and move a first indicator in the first area and a second indicator in the second area along with the touch. A method comprises displaying information in a first area of a touch-sensitive display of an electronic device, displaying an enlargement including at least part of the information in a second area outside the first area to replace at least part of a virtual keyboard, detecting a touch associated with the second area, and moving a first indicator in the first area and a second indicator in the second area along with the touch. The method may also include displaying a control in the second area to control the first indicator and the second indicator. The method may also include changing the at least part of the information displayed in the enlargement based on the movement of the second indicator.
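For illustration, a small Python sketch of the dual-indicator behavior described above: a single logical position drives an indicator in each area, and the slice of information shown in the enlargement shifts when that position reaches the edge of the enlarged view. The class, its fields, and the window size are assumptions, not the claimed implementation.

    # Hypothetical dual-indicator model: one logical cursor position is shown by a
    # first indicator in the first area and a second indicator in the enlargement,
    # and the enlarged slice scrolls so the second indicator stays visible.
    # Names and sizes are assumptions for illustration only.
    class DualView:
        def __init__(self, text, window=10):
            self.text = text
            self.cursor = 0          # logical position shown by both indicators
            self.window = window     # characters visible in the enlargement
            self.first_visible = 0   # start of the enlarged slice

        def move_with_touch(self, delta_chars):
            """Move both indicators along with a horizontal touch movement."""
            self.cursor = max(0, min(len(self.text), self.cursor + delta_chars))
            # Change the part of the information shown enlarged so the second
            # indicator remains in view.
            if self.cursor < self.first_visible:
                self.first_visible = self.cursor
            elif self.cursor >= self.first_visible + self.window:
                self.first_visible = self.cursor - self.window + 1

        def enlargement_slice(self):
            return self.text[self.first_visible:self.first_visible + self.window]

    view = DualView("Dinner tonight at eight o'clock")
    view.move_with_touch(12)            # drag to the right over the enlargement
    print(view.cursor)                  # position of the first indicator in the full text
    print(view.enlargement_slice())     # information currently shown enlarged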
  • The words above, below, upper, lower, up, down, left, and right provide a perspective for the drawings and are not otherwise limiting. In the present disclosure, an indicator may be at or near the center of an area when the indicator is close to the center, is about the center, or is away from the center by a character, a line of text, or a word, and so forth. Although touch locations are shown as circles with dashed lines, the actual touch locations may be larger or smaller, e.g., a point. Although example locations of the selection options and controls 202, 204, 212, 706, 708, 710, 712, 1008, 1010, 1012, 1014, 1016, 1202, 1302, 1402, 1404, 1406, 1604, 1608, 1616, 1618, 1620, 1622, 1624, 1708, 1710, and other elements are shown in FIG. 2, FIG. 3, FIG. 4, FIG. 5, FIG. 7, FIG. 8, FIG. 10, FIG. 12, FIG. 13, FIG. 14, FIG. 16, FIG. 17, FIG. 18, and FIG. 19, selection options, controls, and other elements may be located at any location, such as at the top of a display, at the bottom of a display, along a side of the display, anywhere in a non-display area, and so forth.
  • Elements of the examples described herein are interchangeable. Any of the elements of the various examples are combinable to the extent that the elements are not mutually exclusive or do not conflict.
  • The present disclosure may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the disclosure is, therefore, indicated by the appended claims rather than by the foregoing description. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims (20)

What is claimed is:
1. A method comprising:
detecting a hold touch and a release touch on a touch-sensitive display of an electronic device, wherein the hold touch and the release touch overlap at least partially in time;
detecting release of the release touch; and
in response to detecting the release of the release touch, displaying an editing control while the hold touch is detected.
2. The method according to claim 1, comprising detecting release of the hold touch.
3. The method according to claim 2, comprising displaying a virtual keyboard in response to detecting the release of the hold touch.
4. The method according to claim 1, wherein the hold touch and the release touch are associated with a virtual keyboard.
5. The method according to claim 1, wherein displaying the editing control replaces the display of a virtual keyboard.
6. The method according to claim 1, wherein the detecting comprises determining that the hold touch and the release touch overlap in time for at least a first time value.
7. The method according to claim 1, wherein the release touch has an end location, further comprising displaying the editing control at or near the end location.
8. The method according to claim 1, wherein the editing control includes a highlight selection option for initiating highlighting of displayed information.
9. The method according to claim 8, comprising displaying the highlight selection option at or near a location of the hold touch.
10. The method according to claim 1, comprising moving an indicator through displayed information in response to detecting a touch associated with the editing control.
11. The method according to claim 1, comprising detecting a gesture associated with the editing control.
12. The method according to claim 1, comprising moving an indicator from a first word to a second word in response to detecting a gesture associated with the editing control.
13. The method according to claim 1, wherein the editing control includes a first control and a second control, wherein a touch associated with the first control results in moving an indicator through displayed information in a first direction and a touch associated with the second control results in moving the indicator in a second direction other than the first direction.
14. The method according to claim 1, wherein the editing control includes a virtual trackpad.
15. The method according to claim 14, comprising moving an indicator through displayed information in response to detecting a touch associated with the virtual trackpad.
16. The method according to claim 15, wherein the touch is a swipe.
17. A non-transitory computer-readable medium having computer-readable code executable by at least one processor of an electronic device to perform the method of claim 1.
18. An electronic device comprising:
a touch-sensitive display;
a processor coupled to the touch-sensitive display and configured to:
detect a hold touch and a release touch on the touch-sensitive display, wherein the hold touch and the release touch overlap at least partially in time;
detect release of the release touch; and
in response to detecting the release of the release touch, display an editing control while the hold touch is detected.
19. The electronic device according to claim 18, wherein the processor is configured to display information.
20. The electronic device according to claim 18, wherein the editing control includes a virtual trackpad.
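Purely as an illustration of the gesture recited in claim 1, the following Python sketch detects two touches that overlap in time, displays an editing control when the later (release) touch lifts while the other (hold) touch is still detected, and restores the virtual keyboard when the hold touch is released, as in claims 2 and 3. The simplified event model and all names are assumptions, not the claimed implementation.

    # Hypothetical detector for the hold-touch / release-touch gesture of claim 1.
    # The event model and all names are assumptions for illustration only.
    class HoldReleaseDetector:
        def __init__(self):
            self.active = {}                 # touch_id -> time the touch went down
            self.editing_control_shown = False

        def touch_down(self, touch_id, t):
            self.active[touch_id] = t

        def touch_up(self, touch_id, t):
            if self.active.pop(touch_id, None) is None:
                return
            if self.active and not self.editing_control_shown:
                # Another touch is still held, so the two touches overlapped in time:
                # display the editing control while the hold touch remains detected.
                self.editing_control_shown = True
                print("editing control displayed")
            elif not self.active and self.editing_control_shown:
                # The hold touch itself was released: restore the virtual keyboard.
                self.editing_control_shown = False
                print("virtual keyboard displayed")

    d = HoldReleaseDetector()
    d.touch_down(1, t=0.00)   # hold touch
    d.touch_down(2, t=0.10)   # release touch, overlapping in time with the hold touch
    d.touch_up(2, t=0.35)     # -> editing control displayed
    d.touch_up(1, t=1.20)     # -> virtual keyboard displayed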
US13/339,151 2011-11-09 2011-12-28 Touch-sensitive display method and apparatus Abandoned US20130113719A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/339,151 US20130113719A1 (en) 2011-11-09 2011-12-28 Touch-sensitive display method and apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161557873P 2011-11-09 2011-11-09
US13/339,151 US20130113719A1 (en) 2011-11-09 2011-12-28 Touch-sensitive display method and apparatus

Publications (1)

Publication Number Publication Date
US20130113719A1 true US20130113719A1 (en) 2013-05-09

Family

ID=48223360

Family Applications (5)

Application Number Title Priority Date Filing Date
US13/339,151 Abandoned US20130113719A1 (en) 2011-11-09 2011-12-28 Touch-sensitive display method and apparatus
US13/339,146 Active 2032-11-15 US9588680B2 (en) 2011-11-09 2011-12-28 Touch-sensitive display method and apparatus
US13/339,138 Abandoned US20130113717A1 (en) 2011-11-09 2011-12-28 Touch-sensitive display method and apparatus
US13/339,155 Active 2033-10-11 US9141280B2 (en) 2011-11-09 2011-12-28 Touch-sensitive display method and apparatus
US14/849,050 Active US9383921B2 (en) 2011-11-09 2015-09-09 Touch-sensitive display method and apparatus

Family Applications After (4)

Application Number Title Priority Date Filing Date
US13/339,146 Active 2032-11-15 US9588680B2 (en) 2011-11-09 2011-12-28 Touch-sensitive display method and apparatus
US13/339,138 Abandoned US20130113717A1 (en) 2011-11-09 2011-12-28 Touch-sensitive display method and apparatus
US13/339,155 Active 2033-10-11 US9141280B2 (en) 2011-11-09 2011-12-28 Touch-sensitive display method and apparatus
US14/849,050 Active US9383921B2 (en) 2011-11-09 2015-09-09 Touch-sensitive display method and apparatus

Country Status (4)

Country Link
US (5) US20130113719A1 (en)
EP (3) EP2776907A4 (en)
CA (3) CA2856209C (en)
WO (3) WO2013067616A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130249809A1 (en) * 2012-03-22 2013-09-26 Honeywell International Inc. Touch screen display user interface and method for improving touch interface utility on the same employing a rules-based masking system
US8766936B2 (en) 2011-03-25 2014-07-01 Honeywell International Inc. Touch screen and method for providing stable touches
US9128580B2 (en) 2012-12-07 2015-09-08 Honeywell International Inc. System and method for interacting with a touch screen interface utilizing an intelligent stencil mask
US9141280B2 (en) 2011-11-09 2015-09-22 Blackberry Limited Touch-sensitive display method and apparatus
US9423871B2 (en) 2012-08-07 2016-08-23 Honeywell International Inc. System and method for reducing the effects of inadvertent touch on a touch screen controller
WO2022093346A1 (en) * 2020-10-26 2022-05-05 Microsoft Technology Licensing, Llc Touch screen display with virtual trackpad

Families Citing this family (121)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9223426B2 (en) 2010-10-01 2015-12-29 Z124 Repositioning windows in the pop-up window
US8756522B2 (en) * 2010-03-19 2014-06-17 Blackberry Limited Portable electronic device and method of controlling same
US9542091B2 (en) 2010-06-04 2017-01-10 Apple Inc. Device, method, and graphical user interface for navigating through a user interface using a dynamic object selection indicator
US8635560B2 (en) * 2011-01-21 2014-01-21 Blackberry Limited System and method for reducing power consumption in an electronic device having a touch-sensitive display
KR101842457B1 (en) * 2011-03-09 2018-03-27 엘지전자 주식회사 Mobile twrminal and text cusor operating method thereof
US9032338B2 (en) 2011-05-30 2015-05-12 Apple Inc. Devices, methods, and graphical user interfaces for navigating and editing text
US9417754B2 (en) 2011-08-05 2016-08-16 P4tents1, LLC User interface system, method, and computer program product
US20130076654A1 (en) * 2011-09-27 2013-03-28 Imerj LLC Handset states and state diagrams: open, closed transitional and easel
KR101924835B1 (en) * 2011-10-10 2018-12-05 삼성전자주식회사 Method and apparatus for function of touch device
US9785281B2 (en) 2011-11-09 2017-10-10 Microsoft Technology Licensing, Llc. Acoustic touch sensitive testing
WO2013099362A1 (en) * 2011-12-28 2013-07-04 Ikeda Hiroyuki Portable terminal
US20130239041A1 (en) * 2012-03-06 2013-09-12 Sony Corporation Gesture control techniques for use with displayed virtual keyboards
TWI485577B (en) * 2012-05-03 2015-05-21 Compal Electronics Inc Electronic apparatus and operating method thereof
WO2013169849A2 (en) 2012-05-09 2013-11-14 Industries Llc Yknots Device, method, and graphical user interface for displaying user interface objects corresponding to an application
CN104471521B (en) 2012-05-09 2018-10-23 苹果公司 For providing the equipment, method and graphic user interface of feedback for the state of activation for changing user interface object
WO2013169845A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for scrolling nested regions
WO2013169851A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for facilitating user interaction with controls in a user interface
WO2013169842A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for selecting object within a group of objects
KR101823288B1 (en) 2012-05-09 2018-01-29 애플 인크. Device, method, and graphical user interface for transitioning between display states in response to gesture
WO2013169875A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for displaying content associated with a corresponding affordance
EP2847657B1 (en) 2012-05-09 2016-08-10 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
WO2013169843A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for manipulating framed graphical objects
CN106201316B (en) 2012-05-09 2020-09-29 苹果公司 Apparatus, method and graphical user interface for selecting user interface objects
EP2847658B1 (en) 2012-05-09 2017-06-21 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface
EP3264252B1 (en) 2012-05-09 2019-11-27 Apple Inc. Device, method, and graphical user interface for performing an operation in accordance with a selected mode of operation
WO2013169865A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
US20140007008A1 (en) * 2012-06-11 2014-01-02 Jim S. Baca Techniques for select-hold-release electronic device navigation menu system
JP6071107B2 (en) * 2012-06-14 2017-02-01 裕行 池田 Mobile device
CN103513878A (en) * 2012-06-29 2014-01-15 国际商业机器公司 Touch input method and device
US8842088B2 (en) * 2012-07-27 2014-09-23 Apple Inc. Touch gesture with visible point of interaction on a touch screen
KR20140035183A (en) * 2012-09-13 2014-03-21 엘지전자 주식회사 Mobile terminal and control method thereof
US20140109016A1 (en) * 2012-10-16 2014-04-17 Yu Ouyang Gesture-based cursor control
KR101329584B1 (en) * 2012-10-22 2013-11-14 신근호 Multi-touch method of providing text block editing, and computer-readable recording medium for the same
US20140129933A1 (en) * 2012-11-08 2014-05-08 Syntellia, Inc. User interface for input functions
WO2014105277A2 (en) 2012-12-29 2014-07-03 Yknots Industries Llc Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics
KR101905174B1 (en) 2012-12-29 2018-10-08 애플 인크. Device, method, and graphical user interface for navigating user interface hierachies
CN107831991B (en) 2012-12-29 2020-11-27 苹果公司 Device, method and graphical user interface for determining whether to scroll or select content
EP2912542B1 (en) 2012-12-29 2022-07-13 Apple Inc. Device and method for forgoing generation of tactile output for a multi-contact gesture
EP3435220B1 (en) 2012-12-29 2020-09-16 Apple Inc. Device, method and graphical user interface for transitioning between touch input to display output relationships
WO2014105279A1 (en) 2012-12-29 2014-07-03 Yknots Industries Llc Device, method, and graphical user interface for switching between user interfaces
GB2509541A (en) * 2013-01-08 2014-07-09 Ibm Display tool with a magnifier with a crosshair tool.
KR20140097820A (en) * 2013-01-30 2014-08-07 삼성전자주식회사 Method and apparatus for adjusting attribute of specific object in web page in electronic device
JP5998964B2 (en) * 2013-01-31 2016-09-28 カシオ計算機株式会社 Dictionary information display device, dictionary information display method, dictionary information display program, dictionary information display system, server device thereof, and terminal device
KR102091235B1 (en) * 2013-04-10 2020-03-18 삼성전자주식회사 Apparatus and method for editing a message in a portable terminal
US9146672B2 (en) * 2013-04-10 2015-09-29 Barnes & Noble College Booksellers, Llc Multidirectional swipe key for virtual keyboard
US20140306897A1 (en) * 2013-04-10 2014-10-16 Barnesandnoble.Com Llc Virtual keyboard swipe gestures for cursor movement
US9767477B2 (en) * 2013-04-16 2017-09-19 Apple Inc. Accidental selection of invitational content
JP6851197B2 (en) 2013-05-30 2021-03-31 ティーケー ホールディングス インク.Tk Holdings Inc. Multidimensional trackpad
US10282067B2 (en) * 2013-06-04 2019-05-07 Sony Corporation Method and apparatus of controlling an interface based on touch operations
JP2015022567A (en) * 2013-07-19 2015-02-02 富士ゼロックス株式会社 Information processing apparatus and information processing program
CN104345944B (en) * 2013-08-05 2019-01-18 中兴通讯股份有限公司 Device, method and the mobile terminal of adaptive adjustment touch input panel layout
KR20150019165A (en) * 2013-08-12 2015-02-25 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN110058697B (en) * 2013-10-08 2023-02-28 Tk控股公司 Force-based touch interface with integrated multi-sensory feedback
US10990267B2 (en) 2013-11-08 2021-04-27 Microsoft Technology Licensing, Llc Two step content selection
US9841881B2 (en) * 2013-11-08 2017-12-12 Microsoft Technology Licensing, Llc Two step content selection with auto content categorization
EP2887199B1 (en) * 2013-12-20 2020-06-03 Dassault Systèmes A device with a touch-sensitive display comprising a mecanism to copy and manipulate modeled objects
US20150212671A1 (en) * 2014-01-30 2015-07-30 Honeywell International Inc. System and method to view, verify, and send datalink downlink messaging
US9348495B2 (en) 2014-03-07 2016-05-24 Sony Corporation Control of large screen display using wireless portable computer and facilitating selection of audio on a headphone
KR102217560B1 (en) * 2014-03-20 2021-02-19 엘지전자 주식회사 Mobile terminal and control method therof
WO2015141102A1 (en) * 2014-03-20 2015-09-24 日本電気株式会社 Information-processing device, information processing method, and information-processing program
EP3125088B1 (en) * 2014-03-25 2018-08-22 Fujitsu Limited Terminal device, display control method, and program
JP6249851B2 (en) * 2014-03-26 2017-12-20 Kddi株式会社 INPUT CONTROL DEVICE, INPUT CONTROL METHOD, AND PROGRAM
KR102206385B1 (en) 2014-04-11 2021-01-22 엘지전자 주식회사 Mobile terminal and method for controlling the same
KR102177607B1 (en) * 2014-05-16 2020-11-11 엘지전자 주식회사 Mobile terminal and method for controlling the same
US9898162B2 (en) 2014-05-30 2018-02-20 Apple Inc. Swiping functions for messaging applications
US9971500B2 (en) 2014-06-01 2018-05-15 Apple Inc. Displaying options, assigning notification, ignoring messages, and simultaneous user interface displays in a messaging application
KR20160037508A (en) * 2014-09-29 2016-04-06 삼성전자주식회사 Display apparatus and displaying method of thereof
US10466826B2 (en) 2014-10-08 2019-11-05 Joyson Safety Systems Acquisition Llc Systems and methods for illuminating a track pad system
US9864515B1 (en) * 2014-10-24 2018-01-09 Google Llc Virtual joystick on a touch-sensitive screen
US10949075B2 (en) 2014-11-06 2021-03-16 Microsoft Technology Licensing, Llc Application command control for small screen display
US20160132992A1 (en) * 2014-11-06 2016-05-12 Microsoft Technology Licensing, Llc User interface scaling for devices based on display size
CN104375655A (en) * 2014-11-24 2015-02-25 合肥鑫晟光电科技有限公司 Keyboard
US10459608B2 (en) * 2014-12-01 2019-10-29 Ebay Inc. Mobile optimized shopping comparison
KR102367184B1 (en) * 2014-12-11 2022-02-25 삼성메디슨 주식회사 Method and apparatus for inputting information by using a screen keyboard
US9645732B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9990107B2 (en) 2015-03-08 2018-06-05 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US10048757B2 (en) 2015-03-08 2018-08-14 Apple Inc. Devices and methods for controlling media presentation
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9639184B2 (en) 2015-03-19 2017-05-02 Apple Inc. Touch input cursor manipulation
US9785305B2 (en) * 2015-03-19 2017-10-10 Apple Inc. Touch input cursor manipulation
CN104778006B (en) * 2015-03-31 2019-05-10 深圳市万普拉斯科技有限公司 Information edit method and system
US20170045981A1 (en) 2015-08-10 2017-02-16 Apple Inc. Devices and Methods for Processing Touch Inputs Based on Their Intensities
US10067653B2 (en) 2015-04-01 2018-09-04 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10120555B2 (en) * 2015-05-15 2018-11-06 International Business Machines Corporation Cursor positioning on display screen
US9860451B2 (en) 2015-06-07 2018-01-02 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10346030B2 (en) 2015-06-07 2019-07-09 Apple Inc. Devices and methods for navigating between user interfaces
US9830048B2 (en) 2015-06-07 2017-11-28 Apple Inc. Devices and methods for processing touch inputs with instructions in a web page
US10200598B2 (en) 2015-06-07 2019-02-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9891811B2 (en) 2015-06-07 2018-02-13 Apple Inc. Devices and methods for navigating between user interfaces
US9880735B2 (en) 2015-08-10 2018-01-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10248308B2 (en) 2015-08-10 2019-04-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures
US10235035B2 (en) 2015-08-10 2019-03-19 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US10416800B2 (en) 2015-08-10 2019-09-17 Apple Inc. Devices, methods, and graphical user interfaces for adjusting user interface objects
US11209976B2 (en) 2016-04-29 2021-12-28 Myscript System and method for editing input management
WO2018159579A1 (en) * 2017-02-28 2018-09-07 株式会社teamS Display/operation terminal and display/operation separation system using same
JP6822232B2 (en) * 2017-03-14 2021-01-27 オムロン株式会社 Character input device, character input method, and character input program
US10771558B2 (en) 2017-04-10 2020-09-08 Honeywell International Inc. System and method for modifying multiple request datalink messages in avionics system
JP6496345B2 (en) * 2017-04-13 2019-04-03 ファナック株式会社 Numerical controller
US10481791B2 (en) * 2017-06-07 2019-11-19 Microsoft Technology Licensing, Llc Magnified input panels
US11488053B2 (en) 2017-10-06 2022-11-01 Adobe Inc. Automatically controlling modifications to typeface designs with machine-learning models
DE102018005621A1 (en) * 2017-10-06 2019-04-11 Adobe Inc. Selectively enable trackpad functionality in graphical interfaces
US10983679B2 (en) 2017-10-06 2021-04-20 Adobe Inc. Selectively enabling trackpad functionality in graphical interfaces
US11061556B2 (en) * 2018-01-12 2021-07-13 Microsoft Technology Licensing, Llc Computer device having variable display output based on user input with variable time and/or pressure patterns
CN109189303B (en) * 2018-08-30 2021-02-09 维沃移动通信有限公司 Text editing method and mobile terminal
US11314409B2 (en) 2018-12-03 2022-04-26 Microsoft Technology Licensing, Llc Modeless augmentations to a virtual trackpad on a multiple screen computing device
US11137905B2 (en) * 2018-12-03 2021-10-05 Microsoft Technology Licensing, Llc Modeless augmentations to a virtual trackpad on a multiple screen computing device
US11294463B2 (en) 2018-12-03 2022-04-05 Microsoft Technology Licensing, Llc Augmenting the functionality of user input devices using a digital glove
US11199901B2 (en) 2018-12-03 2021-12-14 Microsoft Technology Licensing, Llc Augmenting the functionality of non-digital objects using a digital glove
CN110124314A (en) * 2019-05-15 2019-08-16 网易(杭州)网络有限公司 Clue search method and device, electronic equipment, storage medium in game
CN110141854B (en) * 2019-05-17 2023-03-28 网易(杭州)网络有限公司 Information processing method and device in game, storage medium and electronic equipment
CN115176216A (en) 2019-12-30 2022-10-11 乔伊森安全系统收购有限责任公司 System and method for intelligent waveform interrupts
US11429957B1 (en) 2020-10-26 2022-08-30 Wells Fargo Bank, N.A. Smart table assisted financial health
US11397956B1 (en) 2020-10-26 2022-07-26 Wells Fargo Bank, N.A. Two way screen mirroring using a smart table
US11727483B1 (en) 2020-10-26 2023-08-15 Wells Fargo Bank, N.A. Smart table assisted financial health
US11741517B1 (en) 2020-10-26 2023-08-29 Wells Fargo Bank, N.A. Smart table system for document management
US11740853B1 (en) 2020-10-26 2023-08-29 Wells Fargo Bank, N.A. Smart table system utilizing extended reality
US11457730B1 (en) 2020-10-26 2022-10-04 Wells Fargo Bank, N.A. Tactile input device for a touch screen
US11572733B1 (en) 2020-10-26 2023-02-07 Wells Fargo Bank, N.A. Smart table with built-in lockers
US20220147223A1 (en) * 2020-11-07 2022-05-12 Saad Al Mohizea System and method for correcting typing errors
US20220382374A1 (en) * 2021-05-26 2022-12-01 Da-Yuan Huang Methods, devices, and computer-readable storage media for performing a function based on user input


Family Cites Families (95)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5347295A (en) 1990-10-31 1994-09-13 Go Corporation Control of a computer through a position-sensed stylus
US5442742A (en) 1990-12-21 1995-08-15 Apple Computer, Inc. Method and apparatus for the manipulation of text on a computer display screen
US5231698A (en) 1991-03-20 1993-07-27 Forcier Mitchell D Script/binary-encoded-character processing method and system
US5523775A (en) 1992-05-26 1996-06-04 Apple Computer, Inc. Method for selecting objects on a computer display
DE19519417A1 (en) 1995-05-26 1996-11-28 Kodak Ag Lenticular optical device, esp. e.g. large projection screen
US5682439A (en) * 1995-08-07 1997-10-28 Apple Computer, Inc. Boxed input correction system and method for pen based computer systems
US5739946A (en) 1995-09-21 1998-04-14 Kabushiki Kaisha Toshiba Display device
US5825352A (en) * 1996-01-04 1998-10-20 Logitech, Inc. Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad
US6240430B1 (en) 1996-12-13 2001-05-29 International Business Machines Corporation Method of multiple text selection and manipulation
US6073036A (en) 1997-04-28 2000-06-06 Nokia Mobile Phones Limited Mobile station with touch input having automatic symbol magnification function
JP3492493B2 (en) 1997-06-13 2004-02-03 日本電気株式会社 Touch panel and method of detecting pressed position on touch panel
JP3543641B2 (en) 1997-12-15 2004-07-14 富士ゼロックス株式会社 Volume modulation type color forming material, volume modulation type color forming composition, optical element using the same, and light modulation method
US7614008B2 (en) 2004-07-30 2009-11-03 Apple Inc. Operation of a computer with touch screen interface
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US6211856B1 (en) 1998-04-17 2001-04-03 Sung M. Choi Graphical user interface touch screen with an auto zoom feature
US6169538B1 (en) 1998-08-13 2001-01-02 Motorola, Inc. Method and apparatus for implementing a graphical user interface keyboard and a text buffer on electronic devices
US6370282B1 (en) * 1999-03-03 2002-04-09 Flashpoint Technology, Inc. Method and system for advanced text editing in a portable digital electronic device using a button interface
US7197718B1 (en) 1999-10-18 2007-03-27 Sharp Laboratories Of America, Inc. Interactive virtual area browser for selecting and rescaling graphical representations of displayed data
US6847347B1 (en) 2000-08-17 2005-01-25 Xerox Corporation Electromagnetophoretic display system and method
US7190348B2 (en) 2000-12-26 2007-03-13 International Business Machines Corporation Method for touchscreen data input
US7730401B2 (en) 2001-05-16 2010-06-01 Synaptics Incorporated Touch screen with user interface enhancement
US6970159B2 (en) 2001-06-25 2005-11-29 Gray Robin S Mouse printing device with integrated touch pad buttons
US6956979B2 (en) * 2001-10-04 2005-10-18 International Business Machines Corporation Magnification of information with user controlled look ahead and look behind contextual information
JP2003296015A (en) 2002-01-30 2003-10-17 Casio Comput Co Ltd Electronic equipment
US7062723B2 (en) 2002-05-20 2006-06-13 Gateway Inc. Systems, methods and apparatus for magnifying portions of a display
US7656393B2 (en) * 2005-03-04 2010-02-02 Apple Inc. Electronic device having display and surrounding touch sensitive bezel for user interface and control
US7272489B2 (en) 2002-07-18 2007-09-18 Alpine Electronics, Inc. Navigation method and system for extracting, sorting and displaying POI information
JP4215549B2 (en) 2003-04-02 2009-01-28 富士通株式会社 Information processing device that operates in touch panel mode and pointing device mode
US6986614B2 (en) 2003-07-31 2006-01-17 Microsoft Corporation Dual navigation control computer keyboard
KR100537280B1 (en) 2003-10-29 2005-12-16 삼성전자주식회사 Apparatus and method for inputting character using touch screen in portable terminal
FI20045149A (en) 2004-04-23 2005-10-24 Nokia Corp User interface
US7394453B2 (en) * 2004-04-23 2008-07-01 Cirque Corporation Method for scrolling and edge motion on a touchpad
US7212332B2 (en) 2004-06-01 2007-05-01 Intel Corporation Micro-electromechanical system (MEMS) polyelectrolyte gel network pump
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
WO2006020305A2 (en) 2004-07-30 2006-02-23 Apple Computer, Inc. Gestures for touch sensitive input devices
US20060059437A1 (en) 2004-09-14 2006-03-16 Conklin Kenneth E Iii Interactive pointing guide
US20060077179A1 (en) 2004-10-08 2006-04-13 Inventec Corporation Keyboard having automatic adjusting key intervals and a method thereof
US20060189278A1 (en) 2005-02-24 2006-08-24 Research In Motion Limited System and method for making an electronic handheld device more accessible to a disabled person
US20060218492A1 (en) 2005-03-22 2006-09-28 Andrade Jose O Copy and paste with citation attributes
US20060267803A1 (en) 2005-05-26 2006-11-30 Tele Atlas North America, Inc. Non-perspective variable-scale map displays
US20070075922A1 (en) 2005-09-28 2007-04-05 Jessop Richard V Electronic display systems
CN100583014C (en) * 2005-09-30 2010-01-20 鸿富锦精密工业(深圳)有限公司 Page information processor and process
US20070100800A1 (en) 2005-10-31 2007-05-03 Rose Daniel E Methods for visually enhancing the navigation of collections of information
DE102005056459A1 (en) 2005-11-26 2007-01-25 Daimlerchrysler Ag Vehicle operating system, has evaluation and control unit provided for determining starting point of operator control unit and adjusting surface of input field based on determined starting point, such that surface is increased
GB2434286B (en) 2006-01-12 2008-05-28 Motorola Inc User interface for a touch-screen based computing device and method therefor
TWI328185B (en) 2006-04-19 2010-08-01 Lg Electronics Inc Touch screen device for potable terminal and method of displaying and selecting menus thereon
US20090278806A1 (en) 2008-05-06 2009-11-12 Matias Gonzalo Duarte Extended touch-sensitive control area for electronic device
KR100813062B1 (en) 2006-05-03 2008-03-14 엘지전자 주식회사 Portable Terminal And Method Of Displaying Text Using Same
US20100179958A1 (en) 2006-07-19 2010-07-15 Michael James Carr Apparatus, methods, and products for surfing the internet
US8564544B2 (en) 2006-09-06 2013-10-22 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US9304675B2 (en) 2006-09-06 2016-04-05 Apple Inc. Portable electronic device for instant messaging
US8570278B2 (en) 2006-10-26 2013-10-29 Apple Inc. Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
US8130203B2 (en) 2007-01-03 2012-03-06 Apple Inc. Multi-touch input discrimination
US20080167081A1 (en) 2007-01-10 2008-07-10 Eng U P Peter Keyless touch-screen cellular telephone
TWI380201B (en) 2007-05-15 2012-12-21 Htc Corp Method for browsing a user interface for an electronic device and the software thereof
TWI367436B (en) 2007-05-15 2012-07-01 Htc Corp Method for operating user interfaces of handheld device
JP4501018B2 (en) * 2007-06-13 2010-07-14 株式会社ヤッパ Portable terminal device and input device
US8289193B2 (en) * 2007-08-31 2012-10-16 Research In Motion Limited Mobile wireless communications device providing enhanced predictive word entry and related methods
US9274698B2 (en) 2007-10-26 2016-03-01 Blackberry Limited Electronic device and method of controlling same
US20090125848A1 (en) 2007-11-14 2009-05-14 Susann Marie Keohane Touch surface-sensitive edit system
JP4372188B2 (en) 2007-12-21 2009-11-25 株式会社東芝 Information processing apparatus and display control method
US8610671B2 (en) 2007-12-27 2013-12-17 Apple Inc. Insertion marker placement on touch sensitive display
TWI361376B (en) 2007-12-31 2012-04-01 Htc Corp Method for operating handheld electronic device and touch interface apparatus and storage media using the same
US20090207140A1 (en) * 2008-02-19 2009-08-20 Sony Ericsson Mobile Communications Ab Identifying and responding to multiple time-overlapping touches on a touch panel
US20090227369A1 (en) 2008-03-10 2009-09-10 Merit Entertainment Amusement Device Having a Configurable Display for Presenting Games Having Different Aspect Ratios
JP2009271777A (en) 2008-05-08 2009-11-19 Sharp Corp Information processor, text display program, and text display method
KR20100000514A (en) 2008-06-25 2010-01-06 엘지전자 주식회사 Image display device with touch screen and method of controlling the same
CA2680666A1 (en) * 2008-10-08 2010-04-08 Research In Motion Limited An electronic device having a state aware touchscreen
US9104311B2 (en) 2008-10-09 2015-08-11 Lenovo (Singapore) Pte. Ltd. Slate computer with tactile home keys
EP2579143B1 (en) 2008-12-01 2019-09-18 BlackBerry Limited Portable electronic device and method of controlling same
US8451236B2 (en) 2008-12-22 2013-05-28 Hewlett-Packard Development Company L.P. Touch-sensitive display screen with absolute and relative input modes
US20100166404A1 (en) * 2008-12-31 2010-07-01 Lombardi Michael J Device and Method Using a Touch-Detecting Surface
US9141768B2 (en) * 2009-06-10 2015-09-22 Lg Electronics Inc. Terminal and control method thereof
KR101608770B1 (en) 2009-08-03 2016-04-04 엘지전자 주식회사 Mobile terminal and method for controlling the same
US9152317B2 (en) 2009-08-14 2015-10-06 Microsoft Technology Licensing, Llc Manipulation of graphical elements via gestures
US10101898B2 (en) 2009-10-23 2018-10-16 Autodesk, Inc. Multi-touch graphical user interface for interacting with menus on a handheld device
US20110148774A1 (en) 2009-12-23 2011-06-23 Nokia Corporation Handling Tactile Inputs
US20110169750A1 (en) 2010-01-14 2011-07-14 Continental Automotive Systems, Inc. Multi-touchpad multi-touch user interface
US20110175826A1 (en) 2010-01-15 2011-07-21 Bradford Allen Moore Automatically Displaying and Hiding an On-screen Keyboard
US8432362B2 (en) * 2010-03-07 2013-04-30 Ice Computer, Inc. Keyboards and methods thereof
US8924882B2 (en) * 2010-03-24 2014-12-30 Htc Corporation Method for controlling a software direction pad of an electronic device, electronic device and computer-readable medium thereof
US9405404B2 (en) * 2010-03-26 2016-08-02 Autodesk, Inc. Multi-touch marking menus and directional chording gestures
US9542091B2 (en) 2010-06-04 2017-01-10 Apple Inc. Device, method, and graphical user interface for navigating through a user interface using a dynamic object selection indicator
US8773370B2 (en) * 2010-07-13 2014-07-08 Apple Inc. Table editing systems with gesture-based insertion and deletion of columns and rows
EP2407892B1 (en) 2010-07-14 2020-02-19 BlackBerry Limited Portable electronic device and method of controlling same
US20120013541A1 (en) * 2010-07-14 2012-01-19 Research In Motion Limited Portable electronic device and method of controlling same
US8531417B2 (en) * 2010-09-02 2013-09-10 Blackberry Limited Location of a touch-sensitive control method and apparatus
EP2431849B1 (en) 2010-09-02 2013-11-20 BlackBerry Limited Location of a touch-sensitive control method and apparatus
US8453186B2 (en) * 2010-09-15 2013-05-28 At&T Intellectual Property I, L.P. Method and system for remote control
US20120154295A1 (en) * 2010-12-17 2012-06-21 Microsoft Corporation Cooperative use of plural input mechanisms to convey gestures
US9411509B2 (en) * 2010-12-29 2016-08-09 Microsoft Technology Licensing, Llc Virtual controller for touch display
US8656315B2 (en) 2011-05-27 2014-02-18 Google Inc. Moving a graphical selector
US9323415B2 (en) * 2011-06-29 2016-04-26 Nokia Technologies Oy Apparatus and associated methods related to touch sensitive displays
US20130113719A1 (en) 2011-11-09 2013-05-09 Jason Tyler Griffin Touch-sensitive display method and apparatus

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070245269A1 (en) * 2006-04-18 2007-10-18 Lg Electronics Inc. Functional icon display system and method
US20090228792A1 (en) * 2008-03-04 2009-09-10 Van Os Marcel Methods and Graphical User Interfaces for Editing on a Portable Multifunction Device
US20090322687A1 (en) * 2008-06-27 2009-12-31 Microsoft Corporation Virtual touchpad
US20100171713A1 (en) * 2008-10-07 2010-07-08 Research In Motion Limited Portable electronic device and method of controlling same
US20100171711A1 (en) * 2008-11-28 2010-07-08 Research In Motion Limited Portable electronic device with touch-sensitive display and method of controlling same
US20100235783A1 (en) * 2009-03-16 2010-09-16 Bas Ording Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display
US20120011438A1 (en) * 2010-07-12 2012-01-12 Lg Electronics Inc. Mobile terminal and controlling method thereof
US20120206363A1 (en) * 2011-02-10 2012-08-16 Research In Motion Limited Portable electronic device and method of controlling same

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8766936B2 (en) 2011-03-25 2014-07-01 Honeywell International Inc. Touch screen and method for providing stable touches
US9141280B2 (en) 2011-11-09 2015-09-22 Blackberry Limited Touch-sensitive display method and apparatus
US9383921B2 (en) 2011-11-09 2016-07-05 Blackberry Limited Touch-sensitive display method and apparatus
US9588680B2 (en) 2011-11-09 2017-03-07 Blackberry Limited Touch-sensitive display method and apparatus
US20130249809A1 (en) * 2012-03-22 2013-09-26 Honeywell International Inc. Touch screen display user interface and method for improving touch interface utility on the same employing a rules-based masking system
US9733707B2 (en) * 2012-03-22 2017-08-15 Honeywell International Inc. Touch screen display user interface and method for improving touch interface utility on the same employing a rules-based masking system
US9423871B2 (en) 2012-08-07 2016-08-23 Honeywell International Inc. System and method for reducing the effects of inadvertent touch on a touch screen controller
US9128580B2 (en) 2012-12-07 2015-09-08 Honeywell International Inc. System and method for interacting with a touch screen interface utilizing an intelligent stencil mask
WO2022093346A1 (en) * 2020-10-26 2022-05-05 Microsoft Technology Licensing, Llc Touch screen display with virtual trackpad
US11543961B2 (en) 2020-10-26 2023-01-03 Microsoft Technology Licensing, Llc Touch screen display with virtual trackpad

Also Published As

Publication number Publication date
EP2776908A1 (en) 2014-09-17
US9141280B2 (en) 2015-09-22
WO2013067617A9 (en) 2013-09-06
US20130113718A1 (en) 2013-05-09
CA2855153C (en) 2019-04-30
EP2776908A4 (en) 2015-07-15
US9383921B2 (en) 2016-07-05
WO2013067618A1 (en) 2013-05-16
CA2855153A1 (en) 2013-05-16
US20130113720A1 (en) 2013-05-09
WO2013067618A9 (en) 2014-01-23
EP2776906A4 (en) 2015-07-22
EP2776907A1 (en) 2014-09-17
US20130113717A1 (en) 2013-05-09
EP2776907A4 (en) 2015-07-15
EP2776906A1 (en) 2014-09-17
US20150378601A1 (en) 2015-12-31
WO2013067617A1 (en) 2013-05-16
US9588680B2 (en) 2017-03-07
WO2013067616A9 (en) 2013-07-25
CA2855154A1 (en) 2013-05-16
CA2856209C (en) 2020-04-07
CA2856209A1 (en) 2013-05-16
WO2013067616A1 (en) 2013-05-16

Similar Documents

Publication Publication Date Title
US9383921B2 (en) Touch-sensitive display method and apparatus
US10331313B2 (en) Method and apparatus for text selection
US10025487B2 (en) Method and apparatus for text selection
EP2660696B1 (en) Method and apparatus for text selection
US20130285930A1 (en) Method and apparatus for text selection
EP2660697B1 (en) Method and apparatus for text selection
CA2821814C (en) Method and apparatus for text selection
EP2660727B1 (en) Method and apparatus for text selection
US9098127B2 (en) Electronic device including touch-sensitive display and method of controlling same
US20150012874A1 (en) Electronic device and method for character deletion
EP2722746A1 (en) Electronic device including touch-sensitive display and method of controlling same
US20130293483A1 (en) Selectable object display method and apparatus
CA2821772A1 (en) Method and apparatus for text selection
EP2660698A9 (en) Selectable object display method and apparatus
WO2014059510A1 (en) Electronic device including touch-sensitive display and method of controlling same
CA2821784A1 (en) Method and apparatus for text selection

Legal Events

Date Code Title Description
AS Assignment

Owner name: RESEARCH IN MOTION LIMITED, CANADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GRIFFIN, JASON TYLER;PASQUERO, JEROME;EARNSHAW, ANDREW MARK;AND OTHERS;SIGNING DATES FROM 20120808 TO 20120904;REEL/FRAME:029187/0050

AS Assignment

Owner name: BLACKBERRY LIMITED, ONTARIO

Free format text: CHANGE OF NAME;ASSIGNOR:RESEARCH IN MOTION LIMITED;REEL/FRAME:034012/0111

Effective date: 20130709

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: MALIKIE INNOVATIONS LIMITED, IRELAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BLACKBERRY LIMITED;REEL/FRAME:064104/0103

Effective date: 20230511