US20100156833A1 - Electronic device having touch screen and method for changing data displayed on the touch screen
- Publication number
- US20100156833A1 (application Ser. No. 12/643,538)
- Authority
- US
- United States
- Prior art keywords
- time data
- touch
- gesture
- display
- display block
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
- G—PHYSICS
- G04—HOROLOGY
- G04G—ELECTRONIC TIME-PIECES
- G04G21/00—Input or output devices integrated in time-pieces
- G04G21/08—Touch switches specially adapted for time-pieces
- G—PHYSICS
- G04—HOROLOGY
- G04G—ELECTRONIC TIME-PIECES
- G04G5/00—Setting, i.e. correcting or changing, the time-indication
- G04G5/04—Setting, i.e. correcting or changing, the time-indication by setting each of the displayed values, e.g. date, hour, independently
- G—PHYSICS
- G04—HOROLOGY
- G04G—ELECTRONIC TIME-PIECES
- G04G9/00—Visual time or date indication means
- G04G9/02—Visual time or date indication means by selecting desired characters out of a number of characters or by selecting indicating elements the position of which represent the time, e.g. by using multiplexing techniques
- G04G9/025—Visual time or date indication means by selecting desired characters out of a number of characters or by selecting indicating elements the position of which represent the time, e.g. by using multiplexing techniques provided with date indication
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1643—Details related to the display arrangement, including those related to the mounting of the display in the housing the display being associated to a digitizer, e.g. laptops that can be used as penpads
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
Definitions
- A ‘time data’ may refer to data related to a date (e.g., day, month, and year), a unit of time (e.g., hour, minute, and second), an interval of time (e.g., a period, ante meridiem (A.M.), or post meridiem (P.M.)), and a day of the week.
- A ‘display block’ may refer to a virtual block of a touch screen to display the time data.
- A single display block may contain at least two touch zones, each of which can individually detect a contact and release of an input tool, such as, for example, a user's finger or a stylus pen.
- A ‘time data display mode’ may refer to a mode in which the time data may be displayed.
- The time data display mode may include, for example, a schedule mode to manage a user's schedule, an alarm mode to establish an alarm time, and a current time display mode to exhibit and to set a current time.
- An ‘electronic device’ may refer to an apparatus having a touch screen and displaying, on the touch screen, a variety of data including the time data.
- The electronic device can be, for example, a personal computer, a notebook, a mobile phone (e.g., a cellular handset), a cordless phone, a mobile transmitter, a stationary wireless transmitter, a personal digital assistant (PDA), a smart phone, a portable multimedia player (PMP), a music player (e.g., an MP3 player), a digital multimedia broadcasting (DMB) receiver, a car navigation system, a pager, or any other type of portable or handheld terminal.
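- As a rough illustration (not part of the original disclosure), the terms defined above can be modeled in code. The following Python sketch is a minimal, assumed data model; all class and field names are hypothetical:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TouchZone:
    """Axis-aligned region inside a display block (screen coordinates)."""
    left: int
    top: int
    right: int
    bottom: int

    def contains(self, x: int, y: int) -> bool:
        return self.left <= x < self.right and self.top <= y < self.bottom

@dataclass
class DisplayBlock:
    """A virtual block of the touch screen showing one piece of time data."""
    values: List[str]       # cyclic value sequence, e.g. ['Jan', ..., 'Dec']
    index: int              # position of the currently displayed value
    zones: List[TouchZone]  # at least two touch zones, e.g. upper and lower

    @property
    def current(self) -> str:
        return self.values[self.index]

    def step(self, amount: int) -> str:
        """Move forward (amount > 0) or backward (amount < 0), wrapping around."""
        self.index = (self.index + amount) % len(self.values)
        return self.current
```

- In this sketch, a month block built with values ['Jan', ..., 'Dec'] wraps from ‘Dec’ back to ‘Jan’ when stepped forward, mirroring the rollover behavior described later in the text.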
- FIG. 1 is an exemplary view illustrating an electronic device displaying time data according to exemplary embodiments of the present invention.
- The device may include a display unit 120 formed as part of a touch screen, and time data may be output on the display unit 120.
- The display unit 120 may display time data such as, for example, a day, a month, and a year.
- The display unit 120 may further display a number of display blocks (e.g., three display blocks 125a, 125b, and 125c) on which different time data (e.g., a day, a month, and a year) can be arranged, respectively.
- Each of the display blocks 125a, 125b, and 125c may have two touch zones 130a and 130b, which can be arranged, for example, at an upper part and a lower part of each of the display blocks 125a, 125b, and 125c.
- Although FIG. 1 shows a day, a month, and a year as time data, exemplary embodiments of the present invention are not limited to time data related to a date; any other time data described above may be alternatively or additionally displayed.
- Also, the number of display blocks may vary, and the touch zones may be divided widthwise or diagonally.
- When one of the display blocks is selected by a touch gesture or a drag gesture, the display unit 120 may change the time data displayed in the selected display block under the control of a control unit 140.
- For example, if the first touch zone 130a is touched, the display unit 120 may replace the current time data displayed in the selected display block with a following (e.g., next) time data.
- If the second touch zone 130b is touched, the display unit 120 may replace the currently displayed time data with a foregoing (e.g., previous) time data.
- Alternatively, the current time data displayed in the selected display block may be replaced with a following or foregoing time data depending on a direction of a drag gesture, regardless of the location of the touch zone at which the contact and release are detected.
- The extent of a change in time data may be determined depending on a duration of a touch gesture or a speed of a drag gesture applied to the first or second touch zones 130a and 130b.
- Examples of screen views in which time data is changed by a touch gesture or a drag gesture will be described below with reference to FIG. 2A, FIG. 2B, FIG. 3, FIG. 4A, FIG. 4B, FIG. 5A, and FIG. 5B.
- FIG. 2A and FIG. 2B are exemplary views each of which illustrate a change in time data by a touch gesture according to exemplary embodiments of the present invention.
- Referring to FIG. 2A, the display unit 120 may output a first time data, for example, ‘28,’ which may correspond to a day of the month, in the first display block 125a.
- The display unit 120 may output a second time data, for example, ‘Aug,’ which may correspond to a month, in the second display block 125b, and may output a third time data, for example, ‘2008,’ which may correspond to a year, in the third display block 125c.
- The display unit 120 may replace the first time data ‘28’ of the first display block 125a with a following first time data, for example, ‘29,’ under the control of the control unit 140.
- The first touch zone 130a may be touched at a touch point 105 by the input tool to replace the first time data ‘28’ with the following first time data ‘29.’ Accordingly, a change of time data may require contact and release in the same touch zone.
- In some cases, a time data may be changed even when contact and release are detected from different touch zones.
- For example, a drag gesture may be applied from one touch zone to another. If a contact is detected in the first touch zone 130a and a release is detected in the second touch zone 130b, the display unit 120 may replace a currently displayed time data (e.g., the first time data) with a following time data (e.g., the following first time data).
- A time data may also be changed based on the starting location of a drag gesture, irrespective of the finishing location of the drag gesture. For example, if a drag gesture starts from the first touch zone 130a, the display unit 120 may replace the currently displayed time data with the following time data.
- Alternatively, a time data may be changed based on a direction of a drag gesture, irrespective of the location of the touch zone at which the contact and release are detected.
- Referring to FIG. 2B, the display unit 120 may replace the first time data ‘28’ of the first display block 125a with a previous first time data, for example, ‘27,’ under the control of the control unit 140.
- If a contact is detected in the second touch zone 130b and a release is detected in the first touch zone 130a (e.g., a drag gesture is applied from the second touch zone 130b to the first touch zone 130a), the display unit 120 may replace the first time data with the previous time data.
- Accordingly, a time data may be changed depending on a starting location or a direction of a drag gesture.
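- A minimal sketch (an illustration added here, not part of the disclosure) of how a contact/release pair reported by the touch sensor might be classified as a touch gesture or a drag gesture, and mapped onto a touch zone, assuming the DisplayBlock/TouchZone model sketched earlier; the slop threshold is an assumed value:

```python
from math import hypot

TAP_SLOP_PX = 10  # assumed movement tolerance separating a tap from a drag

def classify_gesture(contact_xy, release_xy):
    """Return 'touch' for a tap (contact and release nearly in place) or 'drag'."""
    dx = release_xy[0] - contact_xy[0]
    dy = release_xy[1] - contact_xy[1]
    return "drag" if hypot(dx, dy) >= TAP_SLOP_PX else "touch"

def zone_at(block, x, y):
    """Find which of the block's touch zones contains the point, if any."""
    for i, zone in enumerate(block.zones):
        if zone.contains(x, y):
            return i  # e.g., 0 for the first zone 130a, 1 for the second 130b
    return None
```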
- FIG. 3 is an exemplary view which illustrates a change in time data by a touch gesture according to exemplary embodiments of the present invention.
- The display unit 120 may output a first hour time data, for example, ‘6,’ which may correspond to an hour of a day, in the first display block 125a.
- The display unit 120 may also output a first minute time data, for example, ‘15,’ which may correspond to a minute of an hour, in the second display block 125b, and a first interval time data, for example, ‘AM,’ which may correspond to a time interval (e.g., A.M./P.M.), in the third display block 125c.
- The display unit 120 may replace the first interval time data ‘AM’ with a following first interval time data ‘PM’ under the control of the control unit 140.
- As described above, individual time data can be displayed in each display block 125a, 125b, and 125c, and one of the display blocks 125a, 125b, and 125c can be selected by a contact and release of the input tool.
- A time data being displayed in the selected display block may be replaced with the following or previous time data, as noted above, according to the location of the touch zone at which the contact and release are detected. Therefore, the change in time data may be determined according to a touch location.
- Alternatively, a time data may be replaced with the following or previous time data according to the direction of a drag gesture, which may be implemented by moving the input tool from one touch zone to another.
- In some cases, a drag gesture may be applied within a single touch zone, and a drag direction may be determined by using start and finish coordinates of the drag gesture.
- The change in time data may also be determined, at least partially, based on a duration of a touch gesture applied to a touch zone or a speed of the drag gesture.
- FIG. 4A and FIG. 4B are exemplary views each of which illustrate a change in time data by a drag gesture according to exemplary embodiments of the present invention.
- Referring to FIG. 4A, the display unit 120 may output the first time data, for example, ‘28,’ which may correspond to a day in a month, in the first display block 125a.
- The display unit 120 may output the second time data ‘Aug,’ which may correspond to a month, in the second display block 125b, and may output the third time data ‘2008,’ which may correspond to a year, in the third display block 125c.
- The extent of change in the time data may be determined according to a speed of a drag gesture.
- The speed of the drag gesture may be obtained, for example, by dividing a distance between the touch and the release of the drag gesture by a time between the touch and the release of the drag gesture.
- A drag gesture may be completed when an input tool touches the first touch zone 130a in the first display block at a touch point 105 and travels toward the second touch zone 130b, as indicated by arrow 135a.
- The input tool may then be released at the second touch zone 130b.
- The display unit 120 may replace the time data (e.g., current time data) being displayed in the first display block 125a at the instant the input tool touches the first touch zone 130a with following time data, depending on the downward direction of the drag gesture. For example, if a drag gesture having a downward direction starts and ends within the first touch zone 130a, the display unit 120 may replace the current time data with the following time data.
- The control unit 140 may calculate a speed of the drag gesture. If, for example, a calculated speed of the drag gesture corresponds to a change value ‘2,’ the display unit 120 may replace the current time data ‘28’ with the following time data ‘30.’
- The control unit 140 may calculate a velocity of the drag gesture using the touch and release coordinates of the drag gesture.
- The velocity of the drag gesture can be a vector quantity having a magnitude corresponding to a drag speed and a direction corresponding to a drag direction.
- Referring to FIG. 4B, a drag gesture may be completed when the input tool touches the second touch zone 130b in the first display block 125a, moves toward the first touch zone 130a as indicated by arrow 135b, and is released from the first touch zone 130a.
- The display unit 120 may replace a current time data in the first display block 125a with the previous time data, according to the upward direction of the drag gesture.
- The control unit 140 may calculate a speed of the drag gesture. For example, if the speed of the drag gesture corresponds to a change value ‘8,’ the display unit 120 may replace the current time data, for example, ‘28,’ with the previous time data, for example, ‘20.’
- Accordingly, the change in time data can be determined according to a speed or velocity of a drag gesture.
- Exemplary embodiments of the invention are not limited thereto.
- For example, the change in time data may also be determined according to a travel distance of a drag gesture.
- Likewise, the change in time data may be determined according to the duration of a touch gesture.
- In some cases, a user may configure a relation between the touch location and a directionality of a change in time data, or a relation between the drag direction and the directionality of a change in time data.
- As noted above, the touch zones 130a and 130b may be divided widthwise or diagonally in the display blocks 125a, 125b, and 125c.
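- The speed and direction computation described above for FIG. 4A and FIG. 4B might look as follows. This is a sketch under the assumption that coordinates are reported in millimeters and timestamps in milliseconds, matching the units used in the text (e.g., 4 mm in 400 ms yields 0.01 mm/ms):

```python
from math import hypot

def drag_velocity(contact_xy, release_xy, t_contact_ms, t_release_ms):
    """Return (speed, direction) of a drag from its contact/release samples.

    Speed is travelled distance divided by elapsed time; direction is
    'forward' for a downward drag and 'backward' for an upward one
    (screen y grows downward).
    """
    dx = release_xy[0] - contact_xy[0]
    dy = release_xy[1] - contact_xy[1]
    elapsed = max(t_release_ms - t_contact_ms, 1)  # guard against zero time
    speed = hypot(dx, dy) / elapsed
    direction = "forward" if dy > 0 else "backward"
    return speed, direction
```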
- FIG. 5A and FIG. 5B are exemplary views each illustrating a change in time data by a touch gesture according to exemplary embodiments of the present invention.
- Referring to FIG. 5A and FIG. 5B, the display unit 120 may output first time data, for example, ‘24,’ which may correspond to a day of a month, in the first display block 125a.
- The display unit 120 may also output second time data, for example, ‘December,’ which may correspond to a month, in the second display block 125b, and third time data, for example, ‘2008,’ which may correspond to a year, in the third display block 125c.
- The first touch zone 130a and the second touch zone 130b may be arranged at a left portion and a right portion of the display block (e.g., display block 125b), respectively.
- If the first (left) touch zone 130a is touched, the display unit 120 may replace the second time data ‘December’ in the second display block 125b with a previous time data ‘November’ under the control of the control unit 140.
- If the second (right) touch zone 130b is touched, the display unit 120 may replace the second time data ‘December’ in the second display block 125b with a following time data ‘January’ under the control of the control unit 140.
- In the examples described above, time data is changed by a touch gesture or a drag gesture.
- Hereinafter, an electronic device for executing a change in time data by a touch or drag gesture will be described with reference to FIG. 6A and FIG. 6B.
- FIG. 6A is a block diagram illustrating a configuration of an electronic device according to exemplary embodiments of the present invention.
- FIG. 6B is a schematic view illustrating a touch zone in a display block according to exemplary embodiments of the present invention.
- An electronic device may include a bus (not shown) or other communication mechanisms for communicating data, and a control unit 140 including a processor (not shown) coupled to the bus for processing information.
- The electronic device may also include a memory unit 150, which may be a random access memory (RAM) or a dynamic storage device coupled to the bus for storing information and instructions to be executed by the processor.
- The memory unit 150 may also be used for storing temporary variables or intermediate information during execution of instructions by the processor.
- Alternatively, the memory unit 150 may be a read-only memory (ROM) or other static storage device coupled to the bus for storing static information and instructions for the processor.
- The memory unit 150 may include a series of applications to operate the electronic device. Examples of suitable applications include a touch application, a pressure application, an image application, and a direction application.
- The display unit 120 may be coupled to the touch screen.
- The display unit 120 may include, for example, a liquid crystal display, a flexible display, or an active-matrix display, for displaying information to the user.
- The touch screen may also serve as an input device, such as a keyboard including alphanumeric and other keys.
- The input device may be coupled to the bus and may communicate information and command selections to the processor.
- The input device may include various types of sensors (e.g., touch sensor 130) and may include a plurality of touch zones 130a and 130b in the display block 125 for detecting user input.
- The input device may further include a cursor control, such as a mouse, a trackball, or cursor direction keys, for communicating direction information and command selections to the processor and for controlling cursor movement on the display unit 120.
- Execution of the instructions contained in the memory unit 150 may cause the processor to perform processes according to the instructions.
- The control unit 140 may include one or more processors in a multi-processing arrangement to execute the instructions contained in the memory unit 150.
- Hard-wired circuitry may be used in place of, or in combination with, software instructions to implement one or more of the exemplary embodiments of the present invention.
- Reconfigurable hardware, such as a Field Programmable Gate Array (FPGA), may also be used; the functionality and connection topology of the FPGA logic gates may be customized at run-time, typically by programming memory look-up tables.
- Thus, exemplary embodiments of the present invention are not limited to any specific combination of hardware circuitry and/or software.
- The electronic device may also include at least one communication interface unit (not shown).
- The communication interface unit may provide a two-way data communication coupling to a network link (not shown).
- The communication interface unit may send and receive electrical, electromagnetic, or optical signals that can carry digital data streams representing various types of information.
- The communication interface unit may include peripheral interface devices, such as a Universal Serial Bus (USB) interface or a PCMCIA (Personal Computer Memory Card International Association) interface.
- The processor may execute transmitted code and/or may store the transmitted code in the memory unit 150, or in other non-volatile storage.
- In some cases, the electronic device may obtain application code in the form of a carrier wave.
- A ‘computer-readable medium’ may refer to any medium that provides instructions to the processor for execution. Such a medium may be implemented in various forms, including, but not limited to, non-volatile media, volatile media, and transmission media.
- Non-volatile media may include, for example, optical or magnetic disks, such as the storage device.
- Volatile media may include dynamic memory, such as main memory.
- Transmission media may include coaxial cables, copper wire, and fiber optics, including the wires that comprise the bus. Transmission media can also take the form of acoustic, optical, or electromagnetic waves, such as those generated during radio frequency (RF) and infrared (IR) data communications.
- Computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, any other suitable magnetic medium, a compact disk read-only memory (CD-ROM), a compact disc rewritable (CDRW), a digital video disc (DVD), any other suitable optical medium, punch cards, optical mark sheets, any other suitable physical medium with patterns of holes or other optically recognizable indicia, a random-access memory (RAM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), a FLASH-EPROM, any other memory chip or cartridge, and a carrier wave.
- The instructions for carrying out at least part of the present invention may initially be stored on a magnetic disk of a remote computer.
- The remote computer may load the instructions into the memory unit 150 and may send the instructions, for example, over a telephone line using a modem.
- A modem of a local system may receive the data on a telephone line and/or may use an infrared transmitter to convert the data to an infrared signal and transmit the infrared signal to the electronic device.
- The electronic device may receive information and instructions provided by the infrared signal.
- The bus may provide the information and instructions to the memory unit 150, from which the processor may retrieve and execute the instructions.
- The instructions received by the memory unit 150 may optionally be stored either before or after execution by the control unit 140.
- Referring to FIG. 6A, the device may include a touch screen 110, a control unit 140, and a memory unit 150.
- The touch screen 110 may include a display unit 120 and a touch sensor 130.
- The display unit 120 may be coupled to the control unit 140 and may be configured to display graphical data on a screen.
- The data may be related to a state or operation of the device and/or may be produced by execution of functions in the device.
- The display unit 120 may display time data using at least one display block 125 (e.g., 125a, 125b, 125c) under the control of the control unit 140. Additionally, the display unit 120 may change time data of one of the display blocks 125 under the control of the control unit 140 in response to a touch gesture or a drag gesture being detected by the touch sensor 130.
- The display block 125 may include the first touch zone 130a and the second touch zone 130b.
- Although the first and second touch zones 130a and 130b may be arranged at an upper part and a lower part of the display block 125, as shown in FIG. 6B, exemplary embodiments of the present invention are not limited thereto.
- The first and second touch zones 130a and 130b may instead be divided widthwise or diagonally in the display block 125, as described above.
- The display unit 120 may use a dynamic graphical effect to provide a user with visual feedback indicating a change in time data. For example, the display unit 120 may render a rightward and/or leftward turning motion on the display block 125, as if pages of a book are turned. In some cases, the display unit 120 may render a rightward or leftward rolling motion on the display block 125, as if a small cube is rotated. In general, a change in time data may be represented in any suitable manner.
- The display block 125 can be divided into a first touch zone 130a and a second touch zone 130b, which may be arranged at a left portion and a right portion of the display block 125, respectively. If a touch gesture is detected on the first touch zone 130a, the display unit 120 may replace the time data being displayed in the display block 125 at the instant the touch gesture is detected with a previous time data under the control of the control unit 140. For example, time data displayed on the left portion corresponding to the first touch zone 130a may appear to turn and be displayed on the right portion corresponding to the second touch zone 130b. The display block 125 may then display the previous time data on the left portion.
- The touch sensor 130 may be provided near the display unit 120 and may detect a touch gesture or a drag gesture by the input tool.
- The touch sensor 130 may detect the contact or release of the input tool on a surface of the touch screen 110.
- The touch sensor 130 may determine coordinates of the contact and the release and may transmit the coordinates to the control unit 140.
- Based on the coordinates, the control unit 140 may determine whether a user's input gesture is a touch gesture or a drag gesture.
- The control unit 140 may further determine which display block is selected and which touch zone is touched.
- The control unit 140 may also determine a direction and a distance of the drag gesture, and may calculate the speed (e.g., velocity) of the drag gesture from the contact and release coordinates.
- The control unit 140 may execute functions and operations of elements in the device. For example, when a time data display mode is selected by a user, the control unit 140 may control the display unit 120 to display a time data. A time data may be displayed in each display block 125. In addition, the control unit 140 may divide each display block 125 into a plurality of touch zones, for example, the first touch zone 130a and the second touch zone 130b.
- When a contact is detected, the control unit 140 may select a display block 125 (e.g., 125a, 125b, or 125c) at which the contact is detected. The control unit 140 may then determine which of the first touch zone 130a and the second touch zone 130b in the selected display block 125 is contacted. Furthermore, the control unit 140 may ascertain the time data displayed in the selected display block 125.
- The control unit 140 may determine a directionality of a change in the time data.
- That is, the control unit 140 may determine whether to replace the time data being displayed with the following time data or with the previous time data.
- The change directionality of time data may be determined according to a location of a touch zone on which a touch gesture is detected. For example, the contact and release on the first touch zone 130a can be regarded as a forward change, and the contact and release on the second touch zone 130b can be regarded as a backward change. Accordingly, if the first touch zone 130a is touched, the control unit 140 may replace the time data being displayed (e.g., ‘28’) with the following time data (e.g., ‘29’).
- The change directionality of time data may also be determined according to a direction of a drag gesture.
- For example, a drag gesture applied from the first touch zone 130a to the second touch zone 130b can be regarded as a forward change.
- Conversely, a drag gesture from the second touch zone 130b to the first touch zone 130a may be regarded as a backward change.
- In some cases, a drag gesture may be detected within a single touch zone, and the drag direction may be determined by using both start and finish coordinates of the drag gesture.
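- The directionality rules above can be condensed into one small decision function. The following sketch is illustrative only; zone index 0 stands for the first touch zone 130a and 1 for the second touch zone 130b (these encodings are assumptions, consistent with the zone_at sketch earlier):

```python
def change_directionality(gesture, contact_zone, release_zone):
    """Resolve whether to step to the following (+1) or previous (-1) time data.

    For a touch gesture the touched zone decides: the first zone 130a (0) is
    treated as a forward change, the second zone 130b (1) as a backward one.
    For a drag gesture the travel between zones decides instead. A drag that
    stays within one zone would instead use the sign of its vertical travel
    (see drag_velocity above).
    """
    if gesture == "touch":
        return +1 if contact_zone == 0 else -1
    if (contact_zone, release_zone) == (0, 1):  # 130a -> 130b: forward
        return +1
    return -1                                   # 130b -> 130a: backward
```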
- The control unit 140 may further determine an extent of change in the time data being displayed.
- The extent of change may depend on a speed of a drag gesture.
- The drag speed may be obtained by dividing a traveled distance by a time of travel.
- The control unit 140 may determine the duration of a drag gesture from a difference between a start time of the drag gesture and a finish time of the drag gesture.
- The control unit 140 may also determine a drag distance from a difference between the start point and the finish point of the drag gesture.
- The control unit 140 may then calculate a drag speed and determine a change value for the time data by referring to mapping data that maps the drag speed to the change value.
- For example, a drag distance may be 4 mm and a drag duration may be 400 ms.
- The control unit 140 may calculate the drag speed to be 0.01 mm/ms by dividing the drag distance (e.g., 4 mm) by the drag duration (e.g., 400 ms). If the drag speed (e.g., 0.01 mm/ms) corresponds to a change value of, for example, ‘5,’ the control unit 140 may replace the time data being displayed (e.g., ‘10’) with a fifth following time data (e.g., ‘15’) in response to a forward change.
- The change in time data may also be determined according to a duration of a touch gesture. For example, if a touch gesture is maintained for more than a predefined time, the control unit 140 may change the time data being displayed periodically. For example, if the predefined time is three seconds and the period is 100 ms, the control unit 140 may change the time data being displayed once every 100 ms after the touch gesture has been maintained for three seconds. The time data may continuously change until the touch gesture is finished (e.g., until the contact is released).
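- A sketch of this press-and-hold behavior, reusing the DisplayBlock sketch above; the polling interval is an assumed implementation detail, and the threshold and period take the example values from the text:

```python
import time

HOLD_THRESHOLD_S = 3.0  # predefined hold time before repeating starts
REPEAT_PERIOD_S = 0.1   # change period while the touch is held (100 ms)

def auto_repeat(block, direction, is_touched):
    """Change the block's value once per period while a long touch is held.

    is_touched: zero-argument callable polled to detect the release.
    """
    start = time.monotonic()
    # Wait out the hold threshold, aborting if the contact is released early.
    while is_touched() and time.monotonic() - start < HOLD_THRESHOLD_S:
        time.sleep(0.01)
    # Then change the time data once every period until the release.
    while is_touched():
        block.step(direction)
        time.sleep(REPEAT_PERIOD_S)
```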
- The memory unit 150 may store various types of application programs and data required for execution of functions in the device. For example, the memory unit 150 may store a time data to be displayed in a time data display mode, such as, for example, a schedule mode or a current time display mode. Additionally, the memory unit 150 may store mapping data that maps a drag speed to a change value for a time data, as shown in TABLE 1.
- The control unit 140 may use the mapping data to determine a change in time data based on a speed of a drag gesture.
- The mapping data shown in TABLE 1 is exemplary only and should not be considered as limiting exemplary embodiments of the present invention. Mapping data may be set or adjusted by the user of the device.
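- TABLE 1 itself is not reproduced in this text, so the speed boundaries in the sketch below are invented placeholders, chosen only so that the lookup agrees with the worked examples given above and below (0.01 mm/ms maps to change value ‘5,’ and 2.5 mm/ms maps to change value ‘10’):

```python
import bisect

# Illustrative stand-in for TABLE 1 (assumed boundaries, not from the source):
# drag speeds (mm/ms) at or below each boundary map to the matching change value.
SPEED_BOUNDS_MM_PER_MS = [0.005, 0.05, 1.0]
CHANGE_VALUES = [2, 5, 8, 10]  # one more entry than there are boundaries

def change_value(drag_speed):
    """Map a drag speed onto the number of time data steps to move."""
    return CHANGE_VALUES[bisect.bisect_right(SPEED_BOUNDS_MM_PER_MS, drag_speed)]

assert change_value(0.01) == 5   # 4 mm / 400 ms example
assert change_value(2.5) == 10   # FIG. 8 example
```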
- FIG. 7 is a flow diagram illustrating a method for changing time data displayed on a touch screen of an electronic device according to exemplary embodiments of the present invention.
- When a user selects a time data display mode, the control unit 140 may execute the time data display mode (step 710).
- The control unit 140 may then control the display unit 120 to display a time data (step 720).
- The time data may correspond to a time when the user selects the time data display mode and may be live time data. If the time data is related to a date, the time data can be displayed in three display blocks (e.g., 125a, 125b, and 125c) of the display unit 120 under the control of the control unit 140. For example, when the date is Dec. 24, 2008, the control unit 140 may output a time data ‘24’ corresponding to the day in the first display block 125a, a time data ‘Dec’ corresponding to the month in the second display block 125b, and a time data ‘2008’ corresponding to the year in the third display block 125c, respectively.
- The control unit 140 may determine whether an input tool (e.g., a user's finger or a stylus pen) touches a surface of the touch screen 110 (step 730). If no touch of the input tool is detected, the process may return to step 720 and the control unit 140 may continue to display the time data.
- If a touch is detected, the control unit 140 may determine the display block 125 at which the touch is detected, and may determine the time data displayed in that display block 125 (step 740). In step 740, the control unit 140 may also determine which touch zone in the display block 125 is touched.
- The control unit 140 may then determine a directionality and an extent of change in the time data.
- The control unit 140 may replace the time data displayed in the display block 125 with new time data determined according to the determined directionality and extent of change (step 750).
- The directionality of the change in time data may be determined depending on a location (i.e., touch point) of the touch or on the drag direction.
- The extent of change in time data may be determined according to the duration of the touch or the drag speed.
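- Combining the sketches above, one pass of this flow (steps 730 through 750) for an already-located display block might read as follows; all helper names come from the earlier illustrative sketches and are hypothetical, and the duration-based hold behavior is handled separately by auto_repeat:

```python
def handle_gesture(block, contact_xy, release_xy, t_contact_ms, t_release_ms):
    """Apply one gesture to a display block and return the new time data.

    Classify the gesture, resolve its directionality from the touch zones,
    scale the change by drag speed, then step the displayed value.
    """
    kind = classify_gesture(contact_xy, release_xy)
    zone_in = zone_at(block, *contact_xy)
    zone_out = zone_at(block, *release_xy)
    direction = change_directionality(kind, zone_in, zone_out)
    if kind == "touch":
        amount = 1  # a short tap changes the time data once
    else:
        speed, _ = drag_velocity(contact_xy, release_xy,
                                 t_contact_ms, t_release_ms)
        amount = change_value(speed)
    return block.step(direction * amount)
```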
- FIG. 8 is a flow diagram illustrating a process of changing time data according to exemplary embodiments of the present invention.
- The control unit 140 may determine whether the touch point moves (step 810), (i.e., whether the user's input is a touch gesture or a drag gesture). If the control unit 140 determines that the touch point does not move (e.g., if the user's input is a touch gesture), the control unit 140 may further determine whether the touch gesture is maintained for more than a predefined time (step 820).
- If the touch gesture is maintained for more than the predefined time, the control unit 140 may continuously change the time data being displayed according to a period (step 830).
- The change directionality of time data can be based on a location of the touch zone at which the touch gesture is detected, and may be determined in step 830 as described above. If the touch gesture is continuously detected on the first display block 125a displaying a time data, for example, ‘24,’ and if the change directionality is determined as a forward change, the control unit 140 may change the time data to the following data (e.g., ‘25,’ ‘26,’ and/or ‘27’). If the change directionality is determined as a backward change, the control unit 140 may change the time data to the previous data (e.g., ‘23,’ ‘22,’ and ‘21’).
- The control unit 140 may determine whether the touch gesture is released from the touch screen (step 835). If the release is detected, the control unit 140 may stop changing the time data (step 840). If no release is detected, the control unit 140 may return to step 830. After step 840, the control unit 140 may maintain the time data finally displayed when the changing of time data is stopped (step 880).
- If the touch gesture is not maintained for more than the predefined time, the control unit 140 may change the time data being displayed once (step 850).
- The change directionality of time data may be based on a location of the touch zone at which the touch gesture is detected, and may be determined as described above.
- If the change directionality is determined as a forward change, the control unit 140 may change the time data once to the following data, for example, ‘25.’ If the change directionality is determined as a backward change, the control unit 140 may change the time data once to the previous data, for example, ‘23.’ The control unit 140 may then display the changed time data (step 880).
- If the user's input is a drag gesture, the control unit 140 may determine whether the drag gesture is released (step 860). If the drag gesture is released, the control unit 140 may determine a drag direction and a drag speed as described above (step 865). For example, the control unit 140 may obtain the drag direction and the drag distance by using contact and release coordinates corresponding to a start point and a finish point of the drag gesture. In addition, the control unit 140 may obtain a drag time by using a start time and a finish time of the drag gesture. The control unit 140 may then calculate a drag speed by dividing the drag distance by the drag time.
- The control unit 140 may change the time data by using both the drag direction and the drag speed (step 870).
- The control unit 140 may then display the changed time data (step 880).
- For example, a touch point of a drag gesture may be detected to move from the first touch zone 130a to the second touch zone 130b in the first display block 125a displaying a time data ‘24,’ and the drag speed may be 2.5 mm/ms.
- The control unit 140 may determine a change value by referring to mapping data that maps a drag speed range to a change value. If the drag speed of 2.5 mm/ms corresponds to a change value ‘10,’ the control unit 140 may replace the time data ‘24’ being displayed in the first display block 125a with the tenth following data ‘3.’ If the date being displayed is, for example, Dec. 24, 2008, the second display block 125b displaying ‘Dec’ and the third display block 125c displaying ‘2008’ may automatically be replaced with ‘Jan’ and ‘2009,’ respectively.
- Conversely, if the drag gesture is applied in the opposite direction, the control unit 140 may change the current time data ‘24’ to the tenth foregoing data ‘14.’
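- The automatic month/year rollover in this example follows directly from calendar arithmetic. A sketch using Python's standard datetime module (an illustration, not part of the disclosure) reproduces the figures given in the text:

```python
from datetime import date, timedelta

def step_day(current, change_val, forward=True):
    """Move the day block by change_val days, letting month and year roll over."""
    return current + timedelta(days=change_val if forward else -change_val)

# The example from the text: Dec. 24, 2008 with change value 10.
assert step_day(date(2008, 12, 24), 10) == date(2009, 1, 3)               # 'Jan', '2009'
assert step_day(date(2008, 12, 24), 10, forward=False) == date(2008, 12, 14)
```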
Abstract
In a time data display mode, a time data is displayed in at least one display block partially assigned to a touch screen of an electronic device. After a touch gesture or a drag gesture is detected in one of the display blocks, the current time data in the display block can be replaced with a new time data according to the touch gesture or the drag gesture. The new time data may be a following time data or a previous time data, which can be determined according to a location of the touch gesture, or a direction and a distance of the drag gesture. In addition, the extent of a change in the time data may be determined according to a duration of the touch gesture or a speed of the drag gesture.
Description
- This application claims priority from and the benefit of Korean Application No. 10-2008-0131299, filed on Dec. 22, 2008, which is hereby incorporated by reference for all purposes as if fully set forth herein.
- 1. Field of the Invention
- Exemplary embodiments of the present invention relate to a device having a touch screen and a method for changing data displayed on the touch screen of the device. In particular, exemplary embodiments of the present invention relate to a method for replacing time-related data displayed on the touch screen with new data according to a touch gesture or a drag gesture.
- 2. Description of the Background
- Electronic devices can provide a user with the convenience of mobility and a rich set of services and features. Examples of electronic devices include a personal computer, a notebook, a mobile phone, a personal digital assistant (PDA), a smart phone, a portable multimedia player (PMP), a music player (e.g. an MP3 player), a digital multimedia broadcasting (DMB) receiver, and a car navigation system.
- Additionally, many electronic devices today can include a touch screen which can be used as a display unit and an input unit. Electronic devices having a touch screen may not require an additional display and other types of input units. Accordingly, a touch screen may be used in small-scale portable devices.
- A reduction in size of an electronic device may, however, restrict the capability of simultaneously displaying data on the touch screen of the device. Also, some functions executed in the device may often require many regions for displaying related data on the screen, so the graphical configuration of elements displayed on the screen may become complicated. These problems may be more serious in cases where time-related functions such as scheduling a task or outputting an alarm need to be executed.
- Additionally, when time-related data such as time, date, and time period are displayed, a user who may want to select or change specific data may often suffer the inconvenience of having to touch the touch screen several times.
- Exemplary embodiments of the present invention provide a device having a touch screen and a method for changing data displayed on the touch screen of the device.
- Additional features of the invention will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the invention.
- Exemplary embodiments of the present invention disclose a method for changing data displayed on a device. The method includes displaying a time data in a display block. And the display block is arranged on a touch screen. The method also includes detecting a touch gesture or a drag gesture on the display block. The method includes replacing the time data with a second time data according to the touch gesture or the drag gesture.
- Exemplary embodiments of the present invention disclose a device including a display unit including a display block to display a time data. The device also includes a touch sensor, arranged in the display block, configured to detect a touch gesture or a drag gesture. The device further includes a control unit configured to control the display unit to display the time data, and configured to replace the time data in the display block with a second time data in response to detection of the touch gesture or the drag gesture.
- Exemplary embodiments of the present invention disclose a method that includes receiving time data currently being indicated from a display block in a time data display mode, the display block being arranged on a screen. The method also includes determining a value of a touch gesture or a drag gesture applied to the display block, the value corresponding to a speed, a distance, a direction, and a duration of the respective gesture and of a coordinated gesture based on the touch gesture and the drag gesture. The method further includes replacing the time data with new time data according to the value determined by the touch gesture, the drag gesture, and the coordinated gesture.
- It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the invention as claimed.
- The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the invention, and together with the description serve to explain the principles of the invention.
- FIG. 1 is an exemplary view illustrating an electronic device displaying time data according to exemplary embodiments of the present invention.
- FIG. 2A and FIG. 2B are exemplary views, each of which illustrates a change in time data by a touch gesture according to exemplary embodiments of the present invention.
- FIG. 3 is an exemplary view illustrating a change in time data by a touch gesture according to exemplary embodiments of the present invention.
- FIG. 4A and FIG. 4B are exemplary views, each of which illustrates a change in time data by a drag gesture according to exemplary embodiments of the present invention.
- FIG. 5A and FIG. 5B are exemplary views, each illustrating a change in time data by a touch gesture according to exemplary embodiments of the present invention.
- FIG. 6A is a block diagram illustrating a configuration of an electronic device according to exemplary embodiments of the present invention.
- FIG. 6B is a schematic view which illustrates touch zones in a display block of a device according to exemplary embodiments of the present invention.
- FIG. 7 is a flow diagram illustrating a method for changing time data displayed on a touch screen of an electronic device according to exemplary embodiments of the present invention.
- FIG. 8 is a flow diagram illustrating a method for changing time data according to exemplary embodiments of the present invention.
- The invention is described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the exemplary embodiments set forth herein. Rather, these exemplary embodiments are provided so that this disclosure is thorough and will fully convey the scope of the invention to those skilled in the art. In the drawings, the size and relative sizes of layers and regions may be exaggerated for clarity. Like reference numerals in the drawings denote like elements.
- Prior to explaining exemplary embodiments of the present invention, relevant terminology will be defined for the description below.
- Among terms set forth herein, a ‘time data’ may refer to data related to a date (e.g., day, month, and year), a unit of time (e.g., hour, minute, and second), an interval of time (e.g., a period, ante meridiem (A.M.), or post meridiem (P.M.)), and a day of the week.
- A ‘display block’ may refer to a virtual block of a touch screen to display the time data. In general, a single display block may contain at least two touch zones, each of which can individually detect a contact and release of an input tool, such as, for example, a user's finger or a stylus pen.
- A ‘time data display mode’ may refer to a mode in which the time data may be displayed. The time data display mode may include, for example, a schedule mode to manage a user's schedule, an alarm mode to establish an alarm time, and a current time display mode to exhibit and to set a current time.
- An ‘electronic device’ (or a ‘device’) may refer to an apparatus having a touch screen and displaying, on the touch screen, a variety of data including the time data. The electronic device can be, for example, a personal computer, a notebook, a mobile phone (e.g., cellular handset), a cordless phone, a mobile transmitter, a stationary wireless transmitter, a personal digital assistant (PDA), a smart phone, a portable multimedia player (PMP), a music player (e.g., an MP3 player), a digital multimedia broadcasting (DMB) receiver, a car navigation system, a pager, and any other type of portable or handheld terminal.
- Hereinafter, exemplary embodiments of the present invention are described in detail with reference to the accompanying drawings.
- FIG. 1 is an exemplary view illustrating an electronic device displaying time data according to exemplary embodiments of the present invention.
- Referring to FIG. 1, the device may include a display unit 120 formed in a touch screen to output time data on the display unit 120. For example, the display unit 120 may display time data such as, for example, a day, a month, and a year. The display unit 120 may further display a number of display blocks (e.g., three display blocks 125 a, 125 b, and 125 c), and each display block may include touch zones 130 a and 130 b. Although FIG. 1 shows a day, a month, and a year as time data, exemplary embodiments of the present invention are not limited to time data related to a date. For example, any other time data described above may be alternatively or additionally displayed. Furthermore, the number of display blocks is not limited to three, and the touch zones may be divided widthwise or diagonally.
- When contact and release by a user's touch or drag gesture to select and change time data are detected at one of the touch zones 130 a and 130 b in a display block, the display unit 120 may change the time data displayed in the selected display block under the control of a control unit 140. For example, if the first touch zone 130 a detects contact and release of an input tool, such as the user's finger or a stylus pen, the display unit 120 may replace the current time data displayed in the selected display block with a following (e.g., next) time data. If the second touch zone 130 b detects contact and release of the input tool, the display unit 120 may replace the currently displayed time data with a foregoing (e.g., previous) time data.
- In some cases, the current time data displayed in the selected display block may be replaced with a following or foregoing time data, depending on a direction of a drag gesture regardless of the location of the touch zone at which the contact and release are detected.
- Furthermore, an extent of a change in time data may be determined depending on a duration of a touch gesture or a speed of a drag gesture applied to the first or second touch zone.
- Hereinafter, examples of a screen view in which time data is changed by a touch gesture or a drag gesture will be described with reference to FIG. 2A, FIG. 2B, FIG. 3, FIG. 4A, FIG. 4B, FIG. 5A, and FIG. 5B.
- FIG. 2A and FIG. 2B are exemplary views, each of which illustrates a change in time data by a touch gesture according to exemplary embodiments of the present invention.
- Referring to FIG. 2A, the display unit 120 may output a first time data, for example, ‘28,’ which may correspond to a day of the month, as shown in the first display block 125 a. The display unit 120 may output a second time data, for example, ‘Aug,’ which may correspond to a month, as shown in the second display block 125 b, and may output a third time data, for example, ‘2008,’ which may correspond to a year, as shown in the third display block 125 c.
- When an input tool (e.g., the user's finger or a stylus pen) touches the first touch zone 130 a of the first display block 125 a and is released from the first touch zone 130 a, the display unit 120 may replace the first time data ‘28’ of the first display block 125 a with a following first time data, for example, ‘29,’ under the control of the control unit 140. The first touch zone 130 a may be touched at a touch point 105 by the input tool to replace the first time data ‘28’ with the following first time data ‘29.’ Accordingly, a change of time data may require contact and release in the same touch zone.
- In some cases, a time data may be changed even when contact and release are detected from different touch zones. For example, a drag gesture may be applied from one touch zone to another. If a contact is detected in the first touch zone 130 a and a release is detected in the second touch zone 130 b, the display unit 120 may replace a currently displayed time data (e.g., the first time data) with a following time data (e.g., the following first time data). In some cases, time data may be changed based on the starting location of a drag gesture irrespective of the finishing location of the drag gesture. For example, if a drag gesture starts from the first touch zone 130 a, the display unit 120 may replace the currently displayed time data with the following time data. In some cases, a time data may be changed based on a direction of a drag gesture irrespective of the location of the touch zone at which the contact and release are detected.
- Referring to FIG. 2B, when the input tool touches the second touch zone 130 b in the first display block 125 a and is subsequently released from the second touch zone 130 b, the display unit 120 may replace the first time data ‘28’ of the first display block 125 a with a previous first time data, for example, ‘27,’ under the control of the control unit 140. In some cases, if a contact is detected in the second touch zone 130 b and a release is detected in the first touch zone 130 a (e.g., a drag gesture is applied from the second touch zone 130 b to the first touch zone 130 a), the display unit 120 may replace the first time data with the previous time data. In some cases, a time data may be changed depending on a starting location or a direction of a drag gesture.
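- A minimal sketch of this zone-based replacement, assuming a simplified day-of-month block that wraps around its range (all class, function, and zone names here are illustrative assumptions, not part of the patent):

```python
# Hypothetical sketch: replace the time data in a display block according to
# which touch zone detects a contact-and-release.

DAYS_IN_MONTH = 31  # simplified; a real device would consult the calendar

class DisplayBlock:
    def __init__(self, value, minimum, maximum):
        self.value = value      # time data currently displayed, e.g. 28
        self.minimum = minimum  # e.g. 1 for a day-of-month block
        self.maximum = maximum  # e.g. 31 for a day-of-month block

    def step(self, forward, amount=1):
        """Replace the current time data with a following or previous value,
        wrapping around the valid range (e.g., 31 -> 1 going forward)."""
        span = self.maximum - self.minimum + 1
        offset = amount if forward else -amount
        self.value = (self.value - self.minimum + offset) % span + self.minimum

def on_touch(block, zone):
    # First touch zone (130 a): following data; second (130 b): previous data.
    block.step(forward=(zone == "first"))

day_block = DisplayBlock(value=28, minimum=1, maximum=DAYS_IN_MONTH)
on_touch(day_block, "first")   # 28 -> 29, as in FIG. 2A
on_touch(day_block, "second")  # 29 -> 28, as in FIG. 2B
```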
- FIG. 3 is an exemplary view which illustrates a change in time data by a touch gesture according to exemplary embodiments of the present invention.
- Referring to FIG. 3, the display unit 120 may output a first hour time data, for example, ‘6,’ which may correspond to an hour of a day, in the first display block 125 a. The display unit 120 may also output a first minute time data, for example, ‘15,’ which may correspond to a minute of an hour, in the second display block 125 b, and a first interval time data, for example, ‘AM,’ which may correspond to a time interval (e.g., A.M./P.M.), in the third display block 125 c.
- When the input tool touches a first touch zone 130 a in the third display block 125 c and is subsequently released from the first touch zone 130 a, the display unit 120 may replace the first interval time data ‘AM’ with a following first interval time data ‘PM’ under the control of the control unit 140.
- As discussed hereinabove, individual time data can be displayed in each display block 125 a, 125 b, and 125 c, and one of the display blocks 125 a, 125 b, and 125 c can be selected by a contact and release of the input tool. A time data being displayed in the selected display block may be replaced with the following or previous time data, as noted above, according to the location of the touch zone at which the contact and release are detected. Therefore, the change in time data may be determined according to a touch location.
- In some cases, a time data may be replaced with the following or previous time data according to the direction of a drag gesture, which may be implemented by moving the input tool from one touch zone to another.
- In some cases, a drag gesture may be applied to a single touch zone and a drag direction may be determined by using start and finish coordinates of the drag gesture.
- The change in time data may also be determined, at least in part, by the duration of a touch gesture applied to a touch zone or by the speed of the drag gesture.
- FIG. 4A and FIG. 4B are exemplary views, each of which illustrates a change in time data by a drag gesture according to exemplary embodiments of the present invention.
- Referring to FIG. 4A, the display unit 120 may output the first time data, for example, ‘28,’ which may correspond to a day in a month, as shown in the first display block 125 a. The display unit 120 may output the second time data ‘Aug,’ which may correspond to a month, as shown in the second display block 125 b, and may output the third time data ‘2008,’ which may correspond to a year, as shown in the third display block 125 c. The extent of a change in the time data may be determined according to a speed of a drag gesture. The speed of the drag gesture may be obtained, for example, by dividing a distance between the touch and the release of the drag gesture by a time between the touch and the release of the drag gesture.
- For example, referring to FIG. 4A, a drag gesture may be completed when an input tool touches the first touch zone 130 a at a touch point 105 in the first display block, travels toward the second touch zone 130 b as indicated by arrow 135 a, and is released at the second touch zone 130 b. Accordingly, the display unit 120 may replace the time data (e.g., the current time data) being displayed in the first display block 125 a at the instant when the input tool touches the first touch zone 130 a with a following time data, according to the downward direction of the drag gesture. For example, if a drag gesture having a downward direction starts and ends within the first touch zone 130 a, the display unit 120 may also replace the current time data with the following time data.
- Additionally, the control unit 140 may calculate a speed of the drag gesture. If, for example, a calculated speed of the drag gesture corresponds to a change value ‘2,’ the display unit 120 may replace the current time data ‘28’ with the following time data ‘30.’
- If a drag gesture is detected within a single touch zone, the control unit 140 may calculate a velocity of the drag gesture using the touch and release coordinates of the drag gesture. The velocity of the drag gesture can be a vector quantity having a magnitude corresponding to a drag speed and a direction corresponding to a drag direction.
- Referring to FIG. 4B, a drag gesture may be completed when the input tool touches the second touch zone 130 b in the first display block 125 a, moves toward the first touch zone 130 a as indicated by arrow 135 b, and is released from the first touch zone 130 a. The display unit 120 may replace a current time data in the first display block 125 a with the previous time data, according to the upward direction of the drag gesture.
- In addition, the control unit 140 may calculate a speed of the drag gesture. For example, if the speed of the drag gesture corresponds to a change value ‘8,’ the display unit 120 may replace the current time data, for example, ‘28,’ with the previous time data, for example, ‘20.’
- In the description hereinabove, the change in time data can be determined according to a speed or velocity of a drag gesture. However, exemplary embodiments of the invention are not limited thereto. The change in time data may also be determined according to a travel distance of a drag gesture. In addition, when a touch gesture is applied instead of a drag gesture, the change in time data may be determined according to the duration of the touch gesture.
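- As a concrete illustration of the speed and velocity calculations described above, the following sketch derives a drag speed and a drag direction from contact and release coordinates; the function name and the millimeter and millisecond conventions are assumptions made for illustration:

```python
import math

def drag_velocity(contact_xy, release_xy, contact_ms, release_ms):
    """Return (speed, direction): speed is distance divided by time, and
    direction is the unit vector from the contact point to the release point."""
    dx = release_xy[0] - contact_xy[0]
    dy = release_xy[1] - contact_xy[1]
    distance = math.hypot(dx, dy)
    duration = release_ms - contact_ms
    if duration <= 0 or distance == 0.0:
        return 0.0, (0.0, 0.0)  # no movement: treat as a touch, not a drag
    return distance / duration, (dx / distance, dy / distance)

# A 4 mm downward drag lasting 400 ms yields 0.01 mm/ms, matching the worked
# example given later in this description.
speed, direction = drag_velocity((10.0, 5.0), (10.0, 9.0), 0, 400)
print(speed)      # 0.01 (mm/ms)
print(direction)  # (0.0, 1.0), i.e., downward: a forward change in FIG. 4A
```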
- In some cases, a user may configure the relation between the touch location and the directionality of a change in time data, or the relation between the drag direction and the directionality of a change in time data.
- In addition, in some cases, as shall be described hereinafter, the touch zones 130 a and 130 b may be arranged at a left portion and a right portion of the display block.
- FIG. 5A and FIG. 5B are exemplary views, each illustrating a change in time data by a touch gesture according to exemplary embodiments of the present invention.
- Referring to FIG. 5A, the display unit 120 may output first time data, for example, ‘24,’ which may correspond to a day of a month, in the first display block 125 a. The display unit 120 may also output second time data, for example, ‘December,’ which may correspond to a month, in the second display block 125 b, and third time data, for example, ‘2008,’ which may correspond to a year, in the third display block 125 c. The first touch zone 130 a and the second touch zone 130 b may be arranged at a left portion and a right portion of the display block (e.g., display block 125 b).
- When the input tool, such as a user's finger or a stylus pen, touches the first touch zone 130 a in the second display block 125 b and is released from the first touch zone 130 a, the display unit 120 may replace the second time data ‘December’ in the second display block 125 b with a previous time data ‘November’ under the control of the control unit 140.
- Referring to FIG. 5B, when the input tool touches the second touch zone 130 b in the second display block 125 b and is released from the second touch zone 130 b, the display unit 120 may replace the second time data ‘December’ in the second display block 125 b with a following time data ‘January’ under the control of the control unit 140.
- Described hereinabove are examples in which time data is changed by a touch gesture or a drag gesture. Hereinafter, an electronic device for executing a change in time data by a touch or drag gesture will be described with reference to FIG. 6A and FIG. 6B.
- FIG. 6A is a block diagram illustrating a configuration of an electronic device according to exemplary embodiments of the present invention. FIG. 6B is a schematic view illustrating a touch zone in a display block according to exemplary embodiments of the present invention.
- An electronic device, as shown in FIG. 6A, may include a bus (not shown) or other communication mechanisms for communicating data, and a control unit 140 including a processor (not shown) coupled to the bus for processing information. The electronic device may also include a memory unit 150, which may be a random access memory (RAM) or a dynamic storage device coupled to the bus for storing information and instructions to be executed by the processor. The memory unit 150 may also be used for storing temporary variables or intermediate information during execution of instructions by the processor. The memory unit 150 may be a read only memory (ROM) or other static storage device coupled to the bus for storing static information and instructions for the processor. The memory unit 150 may include a series of applications to operate the electronic device. Examples of suitable applications include a touch application, a pressure application, an image application, and a direction application.
- The display unit 120, including the display block 125, may be coupled to the touch screen 110. Examples of the display unit 120 include, for example, a liquid crystal display, a flexible display, or an active matrix display, for displaying information to the user. In some cases, the touch screen 110 may serve as an input device, such as a keyboard, including alphanumeric and other keys. The input device may be coupled to the bus and may communicate information and command selections to the processor. The input device may include various types of sensors (e.g., the touch sensor 130) and may include a plurality of touch zones 130 a and 130 b in the display block 125 for detecting user input. The input device may further include a cursor control, such as a mouse, a trackball, or cursor direction keys, for communicating direction information and command selections to the processor and for controlling cursor movement on the display unit 120.
- According to various exemplary embodiments of the invention, execution of the instructions contained in the memory unit 150 may cause the processor to perform processes according to the instructions. The control unit 140 may include one or more processors in a multi-processing arrangement to execute the instructions contained in the memory unit 150. Hard-wired circuitry may be used in place of, or in combination with, software instructions to implement one or more of the exemplary embodiments of the present invention. For example, reconfigurable hardware, such as Field Programmable Gate Arrays (FPGAs), can be used, and the functionality and connection topology of the FPGA logic gates may be customized at run-time, typically by programming memory look-up tables. Thus, exemplary embodiments of the present invention are not limited to any specific combination of hardware circuitry and/or software.
- The processor may execute transmitted code and/or may store the transmitted code in the
memory unit 150, or in other non-volatile storage. In some cases, the electronic device may obtain application code in the form of a carrier wave. - A “computer-readable medium” may refer to any medium that provides instructions to the processor for execution. Such a medium may be implemented in various forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media may include, for example, optical or magnetic disks, such as the storage device. Volatile media may include dynamic memory, such as main memory. Transmission media may include coaxial cables, copper wire and fiber optics, including the wires that comprise the bus. Transmission media can also take the form of acoustic, optical, or electromagnetic waves, such as those generated during radio frequency (RF) and infrared (IR) data communications. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other suitable magnetic medium, a compact disk read-only memory (CD-ROM), compact disc rewritable (CDRW), digital video disc (DVD), any other suitable optical medium, punch cards, optical mark sheets, any other suitable physical medium with patterns of holes or other optically recognizable indicia, a random-access memory (RAM), a programmable read-only memory (PROM), and erasable programmable read-only memory (EPROM), a FLASH-EPROM, any other memory chip or cartridge, and a carrier wave.
- Various forms of computer-readable media may be involved in providing instructions to a processor for execution. For example, the instructions for carrying out at least part of the present invention may be implemented on a magnetic disk of a remote computer. The remote mobile terminal may load the instructions into the
memory unit 150 and may send the instructions, for example, over a telephone line using a modem. A modem of a local system may receive the data on a telephone line and/or may be used an infrared transmitter to convert the data to an infrared signal and may transmit the infrared signal to the electronic device. The electronic device may receive information and instructions provided by the infrared signal. The bus may provide the information and instructions to thememory unit 150, from which a processor may retrieve and execute the instructions. The instructions received bymemory unit 150 may optionally be stored either before or after execution by thecontrol unit 140. - Referring to
- Referring to FIG. 6A, the device may include a touch screen 110, a control unit 140, and a memory unit 150.
- The touch screen 110 may include a display unit 120 and a touch sensor 130. The display unit 120 may be coupled to the control unit 140 and may be configured to display graphical data on a screen. The data may be related to a state or operation of the device and/or may be produced by execution of functions in the device. The display unit 120 may display time data using at least one display block 125 (e.g., 125 a, 125 b, and 125 c) under the control of the control unit 140. Additionally, the display unit 120 may change the time data of one of the display blocks 125 under the control of the control unit 140 in response to a touch gesture or a drag gesture being detected by the touch sensor 130.
- Referring to FIG. 6B, the display block 125 may include the first touch zone 130 a and the second touch zone 130 b. Although the first and second touch zones 130 a and 130 b are divided lengthwise in the display block 125, as shown in FIG. 6B, exemplary embodiments of the present invention are not limited thereto. For example, the first touch zone 130 a and the second touch zone 130 b may be divided widthwise or diagonally in the display block 125, as described above.
- When time data is changed, the display unit 120 may use a dynamic graphical effect to provide a user with visual feedback indicating the change in time data. For example, the display unit 120 may render a rightward turning motion and/or a leftward turning motion on the display block 125, as if pages of a book are turned. In some cases, the display unit 120 may render a rightward rolling motion or a leftward rolling motion on the display block 125, as if a small cube is rotated. In general, a change in time data may be represented in any suitable manner.
- A page-turning effect can be described as follows.
The display block 125 can be divided into a first touch zone 130 a and a second touch zone 130 b, which may be arranged at a left portion and a right portion of the display block 125, respectively. If a touch gesture is detected on the first touch zone 130 a, the display unit 120 may replace the time data being displayed in the display block 125 at the instant the touch gesture is detected with a previous time data under the control of the control unit 140. For example, time data displayed on the left portion corresponding to the first touch zone 130 a may appear to turn and be displayed on the right portion corresponding to the second touch zone 130 b. The display block 125 may then display the previous time data on the left portion.
- The touch sensor 130 may be provided near the display unit 120 and may detect a touch gesture or a drag gesture by the input tool. The touch sensor 130 may detect the contact or release of the input tool on a surface of the touch screen 110, may determine coordinates of the contact and the release, and may transmit the coordinates to the control unit 140. Based on the contact and release coordinates, the control unit 140 may determine whether a user's input gesture is a touch gesture or a drag gesture. In addition, the control unit 140 may further determine which display block is selected and which touch zone is touched. The control unit 140 may also determine a direction and a distance of the drag gesture, and may calculate the speed (e.g., velocity) of the drag gesture from the contact and release coordinates.
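- A minimal sketch of how such a classification might be made from the contact and release coordinates, assuming a small movement threshold that the patent itself does not specify:

```python
import math

DRAG_THRESHOLD_MM = 1.0  # assumed: movement below this counts as a touch

def classify_gesture(contact_xy, release_xy):
    """Classify a user input as a touch gesture or a drag gesture."""
    distance = math.dist(contact_xy, release_xy)
    return "drag" if distance >= DRAG_THRESHOLD_MM else "touch"

print(classify_gesture((3.0, 4.0), (3.1, 4.0)))  # touch: barely moved
print(classify_gesture((3.0, 4.0), (3.0, 8.0)))  # drag: moved 4 mm
```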
- The control unit 140 may execute functions and control the operation of elements in the device. For example, when a time data display mode is selected by a user, the control unit 140 may control the display unit 120 to display a time data. A time data may be displayed in each display block 125. In addition, the control unit 140 may divide each display block 125 into a plurality of touch zones, for example, the first touch zone 130 a and the second touch zone 130 b.
- When a contact of the input tool is detected by the touch sensor 130, the control unit 140 may select the display block 125 (e.g., 125 a, 125 b, or 125 c) at which the contact is detected. The control unit 140 may then determine which of the first touch zone 130 a and the second touch zone 130 b in the selected display block 125 is contacted. Furthermore, the control unit 140 may ascertain the time data displayed in the selected display block 125.
- In order to change the displayed time data, the control unit 140 may determine a directionality of the change in the time data, that is, whether to replace the time data being displayed with the following time data or with the previous time data. The change directionality of time data may be determined according to the location of the touch zone on which a touch gesture is detected. For example, contact and release on the first touch zone 130 a can be regarded as a forward change, and contact and release on the second touch zone 130 b can be regarded as a backward change. Accordingly, if the first touch zone 130 a is touched, the control unit 140 may replace the time data being displayed (e.g., ‘28’) with the following time data (e.g., ‘29’).
first touch zone 130 a to asecond touch zone 130 b can be regarded as a forward change. A drag gesture from thesecond touch zone 130 b to thefirst touch zone 130 a may be regarded as a backward change. In some cases, a drag gesture may be detected within a single touch zone and the drag direction may be determined by using both start and finish coordinates of the drag gesture. - The
- The control unit 140 may further determine an extent of the change in the time data being displayed. For example, the change may depend on a speed of a drag gesture. The drag speed may be obtained by dividing a traveled distance by a time of travel. For example, the control unit 140 may determine the duration of a drag gesture from a difference between a start time of the drag gesture and a finish time of the drag gesture. The control unit 140 may also determine a drag distance from a difference between the start point and the finish point of the drag gesture. The control unit 140 may then calculate a drag speed and determine a change value by referring to mapping data that maps the drag speed to the change in the time data value.
- For example, a drag distance may be 4 mm and a drag duration may be 400 ms. The control unit 140 may calculate the drag speed to be 0.01 mm/ms by dividing the drag distance (e.g., 4 mm) by the drag duration (e.g., 400 ms). If the drag speed (e.g., 0.01 mm/ms) corresponds to a change value of, for example, ‘5,’ the control unit 140 may replace the time data being displayed (e.g., ‘10’) with the fifth following time data (e.g., ‘15’) in response to the forward change.
control unit 140 may change the time data being displayed according to a period. For example, if the predefined time is, for example, three seconds and the period is, for example, 100 ms, thecontrol unit 140 may change the time data being displayed once in 100 ms after the touch gesture is maintained for three seconds. The time data may continuously change until the touch gesture is finished, (e.g., until the contact is released). - The
- The memory unit 150 may store various types of application programs and data required for execution of functions in the device. For example, the memory unit 150 may store a time data to be displayed in a time data display mode, such as, for example, a schedule mode or a current time display mode. Additionally, the memory unit 150 may store mapping data to map a drag speed to a change value for a time data, as shown in TABLE 1.
- TABLE 1

Drag Speed | Change Value
---|---
0.5~1 mm/ms | 2
1~2 mm/ms | 5
2~3 mm/ms | 10
3~4 mm/ms | 20
4~5 mm/ms | 30
- As described above, the control unit 140 may use the mapping data to determine a change in time data based on a drag speed of a drag gesture. The mapping data shown in TABLE 1 is exemplary only and should not be considered as limiting exemplary embodiments of the present invention. The mapping data may be set or adjusted by the user of the device.
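- A minimal lookup sketch using the ranges of TABLE 1; the list representation and the fallback value for out-of-range speeds are assumptions, since the description states only that the mapping data may be set or adjusted by the user:

```python
SPEED_TO_CHANGE = [  # (lower bound, upper bound) in mm/ms -> change value
    (0.5, 1.0, 2),
    (1.0, 2.0, 5),
    (2.0, 3.0, 10),
    (3.0, 4.0, 20),
    (4.0, 5.0, 30),
]

def change_value(drag_speed_mm_per_ms):
    """Map a drag speed to a change value using the TABLE 1 ranges."""
    for low, high, value in SPEED_TO_CHANGE:
        if low <= drag_speed_mm_per_ms < high:
            return value
    return 1  # assumed default when the speed falls outside the table

print(change_value(2.5))  # 10, matching the worked example given with FIG. 7
```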
- Hereinafter, a method for changing time data in an electronic device according to a touch or drag gesture is described with reference to FIG. 7 and FIG. 8.
- FIG. 7 is a flow diagram illustrating a method for changing time data displayed on a touch screen of an electronic device according to exemplary embodiments of the present invention.
- Referring to FIG. 7, when a user selects a time data display mode, such as, for example, a schedule mode or a current time display mode, by using a menu or a function key, the control unit 140 may execute the time data display mode (step 710).
- The control unit 140 may then control the display unit 120 to display a time data (step 720). The time data may correspond to the time at which the user selects the time data display mode and may be live time data. If the time data is related to a date, the time data can be displayed in three display blocks (e.g., 125 a, 125 b, and 125 c) of the display unit 120 under the control of the control unit 140. For example, when the date is Dec. 24, 2008, the control unit 140 may output a time data ‘24’ corresponding to the day in the first display block 125 a, a time data ‘Dec’ corresponding to the month in the second display block 125 b, and a time data ‘2008’ corresponding to the year in the third display block 125 c, respectively.
- Next, the control unit 140 may determine whether an input tool (e.g., a user's finger or a stylus pen) is detected on a surface of the touch screen 110 (step 730). If no touch of the input tool is detected, the process may return to step 720 and the control unit 140 may continue to display the time data.
- If a touch of the input tool is detected on the surface of the touch screen 110, the control unit 140 may determine the display block 125 at which the touch is detected, and may determine the time data displayed in the display block 125 (step 740). In step 740, the control unit 140 may also determine which touch zone in the display block 125 is touched.
- Next, the control unit 140 may determine a directionality and an extent of the change in the time data, and may replace the time data displayed in the display block 125 with new time data determined according to that directionality and extent (step 750). The directionality of the change may be determined depending on a location (i.e., touch point) of the touch or on the drag direction. In addition, the extent of the change in time data may be determined according to the duration of the touch or the drag speed. Step 750 is further described in detail with reference to FIG. 8.
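- The flow of steps 710 through 750 can be condensed into a small, self-contained simulation; every class and method name in the following sketch is a hypothetical placeholder rather than an interface defined by the patent:

```python
class SimulatedDevice:
    def __init__(self):
        self.day = 24  # time data shown in the first display block (step 720)

    def handle_input(self, zone, held_ms=0, drag_speed=None):
        """Steps 730-750: locate the gesture, determine the directionality and
        the extent of the change, and replace the displayed time data."""
        forward = (zone == "first")          # directionality by touch zone
        if drag_speed is not None:           # drag: extent by speed
            amount = 10 if drag_speed >= 2.0 else 1
        elif held_ms > 3000:                 # long touch: periodic repeats
            amount = (held_ms - 3000) // 100
        else:                                # short touch: change once
            amount = 1
        self.day = (self.day - 1 + (amount if forward else -amount)) % 31 + 1

device = SimulatedDevice()
device.handle_input("first")                 # 24 -> 25
device.handle_input("second", held_ms=3200)  # two backward steps: 25 -> 23
print(device.day)  # 23
```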
- FIG. 8 is a flow diagram illustrating a process of changing time data according to exemplary embodiments of the present invention.
- Referring to FIG. 8, the control unit 140 may determine whether the touch point moves (step 810), i.e., whether the user's input is a touch gesture or a drag gesture. If the control unit 140 determines that the touch point does not move (e.g., if the user's input is a touch gesture), the control unit 140 may further determine whether the touch gesture is maintained for more than a predefined time (step 820).
- If the touch gesture is maintained for more than the predefined time, the control unit 140 may continuously change the time data being displayed according to a period (step 830). The change directionality of time data can be based on the location of the touch zone at which the touch gesture is detected, and may be determined in step 830 as described above. If the touch gesture is continuously detected on the first display block 125 a displaying a time data, for example, ‘24,’ and if the change directionality is determined as a forward change, the control unit 140 may change the time data to the following data (e.g., ‘25,’ ‘26,’ and ‘27’). If the change directionality is determined as a backward change, the control unit 140 may change the time data to the previous data (e.g., ‘23,’ ‘22,’ and ‘21’).
- Next, the control unit 140 may determine whether the touch gesture is released from the touch screen (step 835). If the release is detected, the control unit 140 may stop changing the time data (step 840). If no release is detected, the control unit 140 may return to step 830. After step 840, the control unit 140 may maintain the time data finally being displayed when the changing of time data is stopped (step 880).
- If, in step 820, the touch gesture is not maintained for more than the predefined time (e.g., if the touch gesture is released before the predefined time expires), the control unit 140 may change the time data being displayed once (step 850). The change directionality of time data may be based on the location of the touch zone at which the touch gesture is detected, and may be determined as described above. If the touch gesture is detected on the first display block 125 a displaying time data, for example, ‘24,’ and if the change directionality is determined as a forward change, the control unit 140 may change the time data once to the following data, for example, ‘25.’ If the change directionality is determined as a backward change, the control unit 140 may change the time data once to the previous data, for example, ‘23.’ The control unit 140 may then display the changed time data (step 880).
- If, in step 810, the touch point moves (e.g., if the user input is a drag gesture), the control unit 140 may determine whether the drag gesture is released (step 860). If the drag gesture is released, the control unit 140 may determine a drag direction and a drag speed as described above (step 865). For example, the control unit 140 may obtain the drag direction and the drag distance by using contact and release coordinates corresponding to a start point and a finish point of the drag gesture. In addition, the control unit 140 may obtain a drag time by using a start time and a finish time of the drag gesture. The control unit 140 may then calculate the drag speed by dividing the drag distance by the drag time.
- Next, the control unit 140 may change the time data by using both the drag direction and the drag speed (step 870). The control unit 140 may then display the changed time data (step 880).
- For example, a touch point of a drag gesture may be detected to move from the first touch zone 130 a to the second touch zone 130 b in the first display block 125 a displaying a time data ‘24,’ and the drag speed may be 2.5 mm/ms. The control unit 140 may determine a change value by referring to mapping data that maps the drag speed range to a change value. If the drag speed 2.5 mm/ms corresponds to a change value ‘10,’ the control unit 140 may replace the time data ‘24’ being displayed in the first display block 125 a with the tenth following data ‘3.’ If the date being displayed is, for example, Dec. 24, 2008, the second display block 125 b displaying ‘Dec’ and the third display block 125 c displaying ‘2008’ may automatically be replaced with ‘Jan’ and ‘2009,’ respectively.
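- The automatic rollover in this example follows ordinary calendar arithmetic, which can be checked with a short sketch using Python's standard library; the change value of ‘10’ is the one from the example above:

```python
from datetime import date, timedelta

displayed = date(2008, 12, 24)
changed = displayed + timedelta(days=10)   # forward change value of 10
print(changed)   # 2009-01-03: day '3', month 'Jan', year '2009'

backward = displayed - timedelta(days=10)  # the backward case described next
print(backward)  # 2008-12-14: day '14'
```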
- If a touch point of a drag gesture is detected to move from the second touch zone 130 b to the first touch zone 130 a in the first display block 125 a, the control unit 140 may change the current time data ‘24’ to the tenth foregoing data ‘14.’
- It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.
Claims (20)
1. A method, comprising:
displaying a time data in a display block, the display block being arranged on a touch screen;
detecting a touch gesture or a drag gesture on the display block; and
replacing the time data with a second time data according to the touch gesture or the drag gesture.
2. The method of claim 1, wherein the time data is replaced with the second time data comprising a following time data, the following time data being displayed in response to detection of the touch gesture in a first touch zone in the display block, and wherein the time data is replaced with the second time data comprising a previous time data, the previous time data being displayed in response to detection of the touch gesture in a second touch zone in the display block.
3. The method of claim 1 , wherein the time data is replaced with the second time data comprising a following time data, the following time data being displayed in response to detection of the drag gesture from a first touch zone to a second touch zone in the display block, and wherein the time data is replaced with the second time data comprising a previous time data, the previous time data being displayed in response to detection of the drag gesture from the second touch zone to the first touch zone in the display block.
4. The method of claim 1 , wherein the time data is continuously replaced if the touch gesture is maintained for more than a time threshold.
5. The method of claim 1 , wherein the second time data is determined according to a speed of the drag gesture.
6. The method of claim 1 , wherein the display block comprises a first touch zone and a second touch zone, the first touch zone and the second touch zone being disposed at an upper portion and a lower portion of the display block, or at a left portion and a right portion of the display block.
7. The method of claim 1 , wherein the time data and the second time data comprise at least one of:
a date comprising a day, a month, and a year;
a time comprising an hour, a minute, and a second; or
an interval of time comprising ante meridiem (A.M.) and post meridiem (P.M.); and
a day of a week.
8. The method of claim 1 , wherein the time data is displayed in a time data display mode comprising a schedule mode to manage a schedule, an alarm mode to establish an alarm time, and a time display mode to display and to set a time.
9. A device, comprising:
a display unit comprising a display block to display a time data;
a touch sensor to detect a touch gesture or a drag gesture, the touch sensor being arranged in the display block; and
a control unit configured to control the display unit to display the time data, and configured to replace the time data in the display block with a second time data in response to detection of the touch gesture or the drag gesture.
10. The device of claim 9 , wherein the control unit is configured to replace the time data with a following time data in response to detection of the touch gesture in a first touch zone in the display block, and wherein the control unit is configured to replace the time data with a previous time data in response to detection of the touch gesture in a second touch zone in the display block.
11. The device of claim 9 , wherein the control unit is configured to replace the time data with a following time data in response to detection of the drag gesture from a first touch zone to a second touch zone in the display block, and wherein the control unit is configured to replace the time data with a previous time data in response to detection of the drag gesture from the second touch zone to the first touch zone in the display block.
12. The device of claim 9 , wherein the control unit is configured to continuously replace the time data if the touch gesture is maintained for more than a time threshold.
13. The device of claim 9 , wherein the control unit determines the second time data according to a speed of the drag gesture.
14. The device of claim 9 , wherein the display block has a first touch zone and a second touch zone, the first touch zone and the second touch zone are disposed at an upper portion and a lower portion of the display block, or at a left portion and a right portion of the display block.
15. The device of claim 9 , wherein the time data and the second time data comprise at least one of:
a date comprising a day, a month and a year;
a time comprising an hour, a minute and a second; or
an interval of time comprising ante meridiem (A.M.) and post meridiem (P.M.); and
a day of a week.
16. The device of claim 9 , wherein the time data is displayed in a time data display mode comprising a schedule mode to manage a schedule, an alarm mode to establish an alarm time, and a time display mode to display and to set a time.
17. A method, comprising:
receiving time data currently being indicated from a display block in a time data display mode, the display block being arranged on a screen;
determining a value of a touch gesture or a drag gesture applied to the display block, wherein the value corresponds to a speed, a distance, a direction and a duration of the respective gesture and a coordinated gesture based on the touch gesture and the drag gesture; and
replacing the time data with new time data according to the value determined by the touch gesture, the drag gesture, and the coordinated gesture.
18. The method of claim 17, wherein an amount of change in the time data resulting from the replacing of the time data with new time data is determined according to a speed of the drag gesture or a duration of the touch gesture.
19. The method of claim 17 , wherein the replacing the time data with new time data is determined according to the touch gesture maintained during a time threshold.
20. The method of claim 17 , wherein the time data and the new time data comprise at least one of:
a date comprising a day, a month, and a year;
a time comprising an hour, a minute, and a second; or
an interval of time comprising ante meridiem (A.M.) and post meridiem (P.M.); and
a day of a week.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/353,253 US10031665B2 (en) | 2008-12-22 | 2016-11-16 | Electronic device having touch screen and method for changing data displayed on the touch screen |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020080131299A KR101545880B1 (en) | 2008-12-22 | 2008-12-22 | Terminal having touch screen and method for displaying data thereof |
KR10-2008-0131299 | 2008-12-22 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/353,253 Continuation US10031665B2 (en) | 2008-12-22 | 2016-11-16 | Electronic device having touch screen and method for changing data displayed on the touch screen |
Publications (1)
Publication Number | Publication Date |
---|---|
US20100156833A1 true US20100156833A1 (en) | 2010-06-24 |
Family
ID=41667445
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/643,538 Abandoned US20100156833A1 (en) | 2008-12-22 | 2009-12-21 | Electronic device having touch screen and method for changing data displayed on the touch screen |
US15/353,253 Active US10031665B2 (en) | 2008-12-22 | 2016-11-16 | Electronic device having touch screen and method for changing data displayed on the touch screen |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/353,253 Active US10031665B2 (en) | 2008-12-22 | 2016-11-16 | Electronic device having touch screen and method for changing data displayed on the touch screen |
Country Status (3)
Country | Link |
---|---|
US (2) | US20100156833A1 (en) |
EP (1) | EP2199897A3 (en) |
KR (1) | KR101545880B1 (en) |
Cited By (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101907947A (en) * | 2010-09-01 | 2010-12-08 | 无敌科技(西安)有限公司 | Touch-control identification system and method thereof |
US20120030634A1 (en) * | 2010-07-30 | 2012-02-02 | Reiko Miyazaki | Information processing device, information processing method, and information processing program |
US20120098836A1 (en) * | 2010-10-25 | 2012-04-26 | Samsung Electroncs Co., Ltd. | Method and apparatus for turning pages in e-book reader |
WO2013191408A1 (en) * | 2012-06-22 | 2013-12-27 | Samsung Electronics Co., Ltd. | Method for improving touch recognition and electronic device thereof |
US20140298258A1 (en) * | 2013-03-28 | 2014-10-02 | Microsoft Corporation | Switch List Interactions |
US20140304664A1 (en) * | 2013-04-03 | 2014-10-09 | Lg Electronics Inc. | Portable device and method for controlling the same |
US20150002436A1 (en) * | 2012-03-15 | 2015-01-01 | Sony Corporation | Information processing apparatus, method, and non-transitory computer-readable medium |
CN105827799A (en) * | 2015-08-28 | 2016-08-03 | 维沃移动通信有限公司 | Alarm reminding method of terminal equipment and terminal equipment |
US20160283048A1 (en) * | 2014-08-08 | 2016-09-29 | Rakuten, Inc. | Data input system, data input method, data input program, and data input device |
US9459781B2 (en) | 2014-08-02 | 2016-10-04 | Apple Inc. | Context-specific user interfaces for displaying animated sequences |
US9841884B2 (en) | 2014-02-12 | 2017-12-12 | Visteon Global Technologies, Inc. | Providing a single-action multi-mode interface |
US9916075B2 (en) | 2015-06-05 | 2018-03-13 | Apple Inc. | Formatting content for a reduced-size user interface |
USD825584S1 (en) | 2017-03-29 | 2018-08-14 | Becton, Dickinson And Company | Display screen or portion thereof with transitional graphical user interface |
US10055121B2 (en) | 2015-03-07 | 2018-08-21 | Apple Inc. | Activity based thresholds and feedbacks |
US10254948B2 (en) | 2014-09-02 | 2019-04-09 | Apple Inc. | Reduced-size user interfaces for dynamically updated application overviews |
US20190121537A1 (en) * | 2016-05-12 | 2019-04-25 | Beijing Kingsoft Internet Security Software Co., Ltd. | Information displaying method and device, and electronic device |
US10272294B2 (en) | 2016-06-11 | 2019-04-30 | Apple Inc. | Activity and workout updates |
US10304347B2 (en) | 2015-08-20 | 2019-05-28 | Apple Inc. | Exercised-based watch face and complications |
US10452253B2 (en) | 2014-08-15 | 2019-10-22 | Apple Inc. | Weather user interface |
US10613745B2 (en) | 2014-09-02 | 2020-04-07 | Apple Inc. | User interface for receiving user input |
US10620590B1 (en) | 2019-05-06 | 2020-04-14 | Apple Inc. | Clock faces for an electronic device |
US10771606B2 (en) | 2014-09-02 | 2020-09-08 | Apple Inc. | Phone user interface |
US10802703B2 (en) | 2015-03-08 | 2020-10-13 | Apple Inc. | Sharing user-configurable graphical constructs |
US10838586B2 (en) | 2017-05-12 | 2020-11-17 | Apple Inc. | Context-specific user interfaces |
US10852905B1 (en) | 2019-09-09 | 2020-12-01 | Apple Inc. | Techniques for managing display usage |
US10872318B2 (en) | 2014-06-27 | 2020-12-22 | Apple Inc. | Reduced size user interface |
US11061372B1 (en) | 2020-05-11 | 2021-07-13 | Apple Inc. | User interfaces related to time |
US11301130B2 (en) | 2019-05-06 | 2022-04-12 | Apple Inc. | Restricted operation of an electronic device |
US11327650B2 (en) | 2018-05-07 | 2022-05-10 | Apple Inc. | User interfaces having a collection of complications |
US11372659B2 (en) | 2020-05-11 | 2022-06-28 | Apple Inc. | User interfaces for managing user interface sharing |
US20220357838A1 (en) * | 2010-12-22 | 2022-11-10 | Google Llc | Video player with assisted seek |
US11526256B2 (en) | 2020-05-11 | 2022-12-13 | Apple Inc. | User interfaces for managing user interface sharing |
US11604571B2 (en) | 2014-07-21 | 2023-03-14 | Apple Inc. | Remote user interface |
US11694590B2 (en) | 2020-12-21 | 2023-07-04 | Apple Inc. | Dynamic user interface with time indicator |
US11720239B2 (en) | 2021-01-07 | 2023-08-08 | Apple Inc. | Techniques for user interfaces related to an event |
US11921992B2 (en) | 2021-05-14 | 2024-03-05 | Apple Inc. | User interfaces related to time |
US11960701B2 (en) | 2019-05-06 | 2024-04-16 | Apple Inc. | Using an illustration to show the passing of time |
US12045014B2 (en) | 2022-01-24 | 2024-07-23 | Apple Inc. | User interfaces for indicating time |
US12175065B2 (en) | 2016-06-10 | 2024-12-24 | Apple Inc. | Context-specific user interfaces for relocating one or more complications in a watch or clock interface |
US12182373B2 (en) | 2021-04-27 | 2024-12-31 | Apple Inc. | Techniques for managing display usage |
US12229396B2 (en) | 2024-03-01 | 2025-02-18 | Apple Inc. | Weather user interface |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101475021B1 (en) * | 2012-08-21 | 2014-12-22 | 김원섭 | Apparatus having touch screen and method for controlling touch screen |
US20140075379A1 (en) * | 2012-09-11 | 2014-03-13 | Erich Schlaepfer | Direct character display control |
WO2014204069A1 (en) * | 2013-06-21 | 2014-12-24 | 주식회사 데이투라이프 | Apparatus and method for controlling user schedule display |
KR20150078315A (en) * | 2013-12-30 | 2015-07-08 | 삼성전자주식회사 | Method For Displaying User Interface And Electronic Device Using The Same |
CN109213413A (en) | 2017-07-07 | 2019-01-15 | 阿里巴巴集团控股有限公司 | A kind of recommended method, device, equipment and storage medium |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5563996A (en) * | 1992-04-13 | 1996-10-08 | Apple Computer, Inc. | Computer note pad including gesture based note division tools and method |
US6489951B1 (en) * | 1995-06-07 | 2002-12-03 | Microsoft Corporation | Method and system for providing touch-sensitive screens for the visually impaired |
US20040196267A1 (en) * | 2003-04-02 | 2004-10-07 | Fujitsu Limited | Information processing apparatus operating in touch panel mode and pointing device mode |
US20060093177A1 (en) * | 2004-10-18 | 2006-05-04 | Smk Corporation | Microphone attachment device |
US20080094370A1 (en) * | 2006-09-06 | 2008-04-24 | Bas Ording | Portable Electronic Device Performing Similar Operations for Different Gestures |
US20080165150A1 (en) * | 2007-01-04 | 2008-07-10 | Samsung Electronics Co., Ltd. | Data scrolling apparatus and method for mobile terminal |
US20080165151A1 (en) * | 2007-01-07 | 2008-07-10 | Lemay Stephen O | System and Method for Viewing and Managing Calendar Entries |
US20080165149A1 (en) * | 2007-01-07 | 2008-07-10 | Andrew Emilio Platzer | System, Method, and Graphical User Interface for Inputting Date and Time Information on a Portable Multifunction Device |
US20080174562A1 (en) * | 2007-01-20 | 2008-07-24 | Lg Electronics Inc. | Mobile electronic apparatus with touch input device and display method using the same |
US20090174680A1 (en) * | 2008-01-06 | 2009-07-09 | Freddy Allen Anzures | Portable Multifunction Device, Method, and Graphical User Interface for Viewing and Managing Electronic Calendars |
US20100060586A1 (en) * | 2008-09-05 | 2010-03-11 | Pisula Charles J | Portable touch screen device, method, and graphical user interface for providing workout support |
US20100123734A1 (en) * | 2008-11-19 | 2010-05-20 | Sony Corporation | Image processing apparatus, image processing method, and image display program |
US20100162105A1 (en) * | 2008-12-19 | 2010-06-24 | Palm, Inc. | Access and management of cross-platform calendars |
US20100164895A1 (en) * | 2008-12-31 | 2010-07-01 | Samsung Electronics Co., Ltd. | Apparatus and method for performing scroll function in portable terminal |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7312785B2 (en) | 2001-10-22 | 2007-12-25 | Apple Inc. | Method and apparatus for accelerated scrolling |
TWI238348B (en) * | 2002-05-13 | 2005-08-21 | Kyocera Corp | Portable information terminal, display control device, display control method, and recording media |
KR100877829B1 (en) * | 2006-03-21 | 2009-01-12 | 엘지전자 주식회사 | A terminal having a scrolling function and a scrolling method thereof |
KR100826194B1 (en) * | 2006-07-27 | 2008-04-30 | 엘지전자 주식회사 | Touch panel remote controller and how to perform functions on this touch panel remote controller |
KR101239797B1 (en) | 2007-02-07 | 2013-03-06 | 엘지전자 주식회사 | Electronic Device With Touch Screen And Method Of Providing Analog Clock Using Same |
- 2008
  - 2008-12-22 KR KR1020080131299A patent/KR101545880B1/en active IP Right Grant
- 2009
  - 2009-12-21 US US12/643,538 patent/US20100156833A1/en not_active Abandoned
  - 2009-12-21 EP EP20090180182 patent/EP2199897A3/en not_active Ceased
- 2016
  - 2016-11-16 US US15/353,253 patent/US10031665B2/en active Active
Patent Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5563996A (en) * | 1992-04-13 | 1996-10-08 | Apple Computer, Inc. | Computer note pad including gesture based note division tools and method |
US6489951B1 (en) * | 1995-06-07 | 2002-12-03 | Microsoft Corporation | Method and system for providing touch-sensitive screens for the visually impaired |
US20040196267A1 (en) * | 2003-04-02 | 2004-10-07 | Fujitsu Limited | Information processing apparatus operating in touch panel mode and pointing device mode |
US20060093177A1 (en) * | 2004-10-18 | 2006-05-04 | Smk Corporation | Microphone attachment device |
US20080094370A1 (en) * | 2006-09-06 | 2008-04-24 | Bas Ording | Portable Electronic Device Performing Similar Operations for Different Gestures |
US20080165150A1 (en) * | 2007-01-04 | 2008-07-10 | Samsung Electronics Co., Ltd. | Data scrolling apparatus and method for mobile terminal |
US20080165151A1 (en) * | 2007-01-07 | 2008-07-10 | Lemay Stephen O | System and Method for Viewing and Managing Calendar Entries |
US20080165149A1 (en) * | 2007-01-07 | 2008-07-10 | Andrew Emilio Platzer | System, Method, and Graphical User Interface for Inputting Date and Time Information on a Portable Multifunction Device |
US20080174562A1 (en) * | 2007-01-20 | 2008-07-24 | Lg Electronics Inc. | Mobile electronic apparatus with touch input device and display method using the same |
US20090174680A1 (en) * | 2008-01-06 | 2009-07-09 | Freddy Allen Anzures | Portable Multifunction Device, Method, and Graphical User Interface for Viewing and Managing Electronic Calendars |
US8327272B2 (en) * | 2008-01-06 | 2012-12-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars |
US20100060586A1 (en) * | 2008-09-05 | 2010-03-11 | Pisula Charles J | Portable touch screen device, method, and graphical user interface for providing workout support |
US20100123734A1 (en) * | 2008-11-19 | 2010-05-20 | Sony Corporation | Image processing apparatus, image processing method, and image display program |
US20100162105A1 (en) * | 2008-12-19 | 2010-06-24 | Palm, Inc. | Access and management of cross-platform calendars |
US20100164895A1 (en) * | 2008-12-31 | 2010-07-01 | Samsung Electronics Co., Ltd. | Apparatus and method for performing scroll function in portable terminal |
Cited By (86)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10782869B2 (en) | 2010-07-30 | 2020-09-22 | Line Corporation | Information processing device, information processing method, and information processing program for selectively changing a value or a change speed of the value by a user operation |
US20120030634A1 (en) * | 2010-07-30 | 2012-02-02 | Reiko Miyazaki | Information processing device, information processing method, and information processing program |
US9747016B2 (en) * | 2010-07-30 | 2017-08-29 | Line Corporation | Information processing device, information processing method, and information processing program for selectively changing a value or a change speed of the value by a user operation |
US11740779B2 (en) | 2010-07-30 | 2023-08-29 | Line Corporation | Information processing device, information processing method, and information processing program for selectively performing display control operations |
CN101907947A (en) * | 2010-09-01 | 2010-12-08 | Besta Technology (Xi'an) Co., Ltd. | Touch recognition system and method thereof |
US20120098836A1 (en) * | 2010-10-25 | 2012-04-26 | Samsung Electronics Co., Ltd. | Method and apparatus for turning pages in e-book reader |
US12216893B2 (en) * | 2010-12-22 | 2025-02-04 | Google Llc | Video player with assisted seek |
US20220357838A1 (en) * | 2010-12-22 | 2022-11-10 | Google Llc | Video player with assisted seek |
US20150002436A1 (en) * | 2012-03-15 | 2015-01-01 | Sony Corporation | Information processing apparatus, method, and non-transitory computer-readable medium |
US9588607B2 (en) | 2012-06-22 | 2017-03-07 | Samsung Electronics Co., Ltd. | Method for improving touch recognition and electronic device thereof |
WO2013191408A1 (en) * | 2012-06-22 | 2013-12-27 | Samsung Electronics Co., Ltd. | Method for improving touch recognition and electronic device thereof |
US20140298258A1 (en) * | 2013-03-28 | 2014-10-02 | Microsoft Corporation | Switch List Interactions |
US20140304664A1 (en) * | 2013-04-03 | 2014-10-09 | Lg Electronics Inc. | Portable device and method for controlling the same |
US9841884B2 (en) | 2014-02-12 | 2017-12-12 | Visteon Global Technologies, Inc. | Providing a single-action multi-mode interface |
US11720861B2 (en) | 2014-06-27 | 2023-08-08 | Apple Inc. | Reduced size user interface |
US10872318B2 (en) | 2014-06-27 | 2020-12-22 | Apple Inc. | Reduced size user interface |
US11250385B2 (en) | 2014-06-27 | 2022-02-15 | Apple Inc. | Reduced size user interface |
US11604571B2 (en) | 2014-07-21 | 2023-03-14 | Apple Inc. | Remote user interface |
US12093515B2 (en) | 2014-07-21 | 2024-09-17 | Apple Inc. | Remote user interface |
US9459781B2 (en) | 2014-08-02 | 2016-10-04 | Apple Inc. | Context-specific user interfaces for displaying animated sequences |
US10606458B2 (en) | 2014-08-02 | 2020-03-31 | Apple Inc. | Clock face generation based on contact on an affordance in a clock face selection mode |
US11740776B2 (en) | 2014-08-02 | 2023-08-29 | Apple Inc. | Context-specific user interfaces |
US9547425B2 (en) * | 2014-08-02 | 2017-01-17 | Apple Inc. | Context-specific user interfaces |
US9582165B2 (en) * | 2014-08-02 | 2017-02-28 | Apple Inc. | Context-specific user interfaces |
US9804759B2 (en) | 2014-08-02 | 2017-10-31 | Apple Inc. | Context-specific user interfaces |
US10990270B2 (en) | 2014-08-02 | 2021-04-27 | Apple Inc. | Context-specific user interfaces |
US10496259B2 (en) | 2014-08-02 | 2019-12-03 | Apple Inc. | Context-specific user interfaces |
US20160283048A1 (en) * | 2014-08-08 | 2016-09-29 | Rakuten, Inc. | Data input system, data input method, data input program, and data input device |
US10042515B2 (en) * | 2014-08-08 | 2018-08-07 | Rakuten, Inc. | Using gesture direction to input data into multiple spin dial list boxes |
US10452253B2 (en) | 2014-08-15 | 2019-10-22 | Apple Inc. | Weather user interface |
US11042281B2 (en) | 2014-08-15 | 2021-06-22 | Apple Inc. | Weather user interface |
US11922004B2 (en) | 2014-08-15 | 2024-03-05 | Apple Inc. | Weather user interface |
US11550465B2 (en) | 2014-08-15 | 2023-01-10 | Apple Inc. | Weather user interface |
US10254948B2 (en) | 2014-09-02 | 2019-04-09 | Apple Inc. | Reduced-size user interfaces for dynamically updated application overviews |
US10613745B2 (en) | 2014-09-02 | 2020-04-07 | Apple Inc. | User interface for receiving user input |
US10613743B2 (en) | 2014-09-02 | 2020-04-07 | Apple Inc. | User interface for receiving user input |
US11700326B2 (en) | 2014-09-02 | 2023-07-11 | Apple Inc. | Phone user interface |
US10771606B2 (en) | 2014-09-02 | 2020-09-08 | Apple Inc. | Phone user interface |
US10409483B2 (en) | 2015-03-07 | 2019-09-10 | Apple Inc. | Activity based thresholds for providing haptic feedback |
US10055121B2 (en) | 2015-03-07 | 2018-08-21 | Apple Inc. | Activity based thresholds and feedbacks |
US10802703B2 (en) | 2015-03-08 | 2020-10-13 | Apple Inc. | Sharing user-configurable graphical constructs |
US12019862B2 (en) | 2015-03-08 | 2024-06-25 | Apple Inc. | Sharing user-configurable graphical constructs |
US10572132B2 (en) | 2015-06-05 | 2020-02-25 | Apple Inc. | Formatting content for a reduced-size user interface |
US9916075B2 (en) | 2015-06-05 | 2018-03-13 | Apple Inc. | Formatting content for a reduced-size user interface |
US11908343B2 (en) | 2015-08-20 | 2024-02-20 | Apple Inc. | Exercised-based watch face and complications |
US10304347B2 (en) | 2015-08-20 | 2019-05-28 | Apple Inc. | Exercised-based watch face and complications |
US11580867B2 (en) | 2015-08-20 | 2023-02-14 | Apple Inc. | Exercised-based watch face and complications |
CN105827799A (en) * | 2015-08-28 | 2016-08-03 | Vivo Mobile Communication Co., Ltd. | Alarm reminding method for a terminal device, and terminal device |
US20190121537A1 (en) * | 2016-05-12 | 2019-04-25 | Beijing Kingsoft Internet Security Software Co., Ltd. | Information displaying method and device, and electronic device |
US12175065B2 (en) | 2016-06-10 | 2024-12-24 | Apple Inc. | Context-specific user interfaces for relocating one or more complications in a watch or clock interface |
US11660503B2 (en) | 2016-06-11 | 2023-05-30 | Apple Inc. | Activity and workout updates |
US10272294B2 (en) | 2016-06-11 | 2019-04-30 | Apple Inc. | Activity and workout updates |
US11918857B2 (en) | 2016-06-11 | 2024-03-05 | Apple Inc. | Activity and workout updates |
US11161010B2 (en) | 2016-06-11 | 2021-11-02 | Apple Inc. | Activity and workout updates |
US11148007B2 (en) | 2016-06-11 | 2021-10-19 | Apple Inc. | Activity and workout updates |
USD825584S1 (en) | 2017-03-29 | 2018-08-14 | Becton, Dickinson And Company | Display screen or portion thereof with transitional graphical user interface |
US11327634B2 (en) | 2017-05-12 | 2022-05-10 | Apple Inc. | Context-specific user interfaces |
US10838586B2 (en) | 2017-05-12 | 2020-11-17 | Apple Inc. | Context-specific user interfaces |
US11775141B2 (en) | 2017-05-12 | 2023-10-03 | Apple Inc. | Context-specific user interfaces |
US11977411B2 (en) | 2018-05-07 | 2024-05-07 | Apple Inc. | Methods and systems for adding respective complications on a user interface |
US11327650B2 (en) | 2018-05-07 | 2022-05-10 | Apple Inc. | User interfaces having a collection of complications |
US10620590B1 (en) | 2019-05-06 | 2020-04-14 | Apple Inc. | Clock faces for an electronic device |
US11340757B2 (en) | 2019-05-06 | 2022-05-24 | Apple Inc. | Clock faces for an electronic device |
US11131967B2 (en) | 2019-05-06 | 2021-09-28 | Apple Inc. | Clock faces for an electronic device |
US11960701B2 (en) | 2019-05-06 | 2024-04-16 | Apple Inc. | Using an illustration to show the passing of time |
US11301130B2 (en) | 2019-05-06 | 2022-04-12 | Apple Inc. | Restricted operation of an electronic device |
US11340778B2 (en) | 2019-05-06 | 2022-05-24 | Apple Inc. | Restricted operation of an electronic device |
US10788797B1 (en) | 2019-05-06 | 2020-09-29 | Apple Inc. | Clock faces for an electronic device |
US10852905B1 (en) | 2019-09-09 | 2020-12-01 | Apple Inc. | Techniques for managing display usage |
US10936345B1 (en) | 2019-09-09 | 2021-03-02 | Apple Inc. | Techniques for managing display usage |
US10908559B1 (en) | 2019-09-09 | 2021-02-02 | Apple Inc. | Techniques for managing display usage |
US10878782B1 (en) | 2019-09-09 | 2020-12-29 | Apple Inc. | Techniques for managing display usage |
US12008230B2 (en) | 2020-05-11 | 2024-06-11 | Apple Inc. | User interfaces related to time with an editable background |
US11442414B2 (en) | 2020-05-11 | 2022-09-13 | Apple Inc. | User interfaces related to time |
US11526256B2 (en) | 2020-05-11 | 2022-12-13 | Apple Inc. | User interfaces for managing user interface sharing |
US11372659B2 (en) | 2020-05-11 | 2022-06-28 | Apple Inc. | User interfaces for managing user interface sharing |
US11842032B2 (en) | 2020-05-11 | 2023-12-12 | Apple Inc. | User interfaces for managing user interface sharing |
US11061372B1 (en) | 2020-05-11 | 2021-07-13 | Apple Inc. | User interfaces related to time |
US11822778B2 (en) | 2020-05-11 | 2023-11-21 | Apple Inc. | User interfaces related to time |
US12099713B2 (en) | 2020-05-11 | 2024-09-24 | Apple Inc. | User interfaces related to time |
US11694590B2 (en) | 2020-12-21 | 2023-07-04 | Apple Inc. | Dynamic user interface with time indicator |
US11720239B2 (en) | 2021-01-07 | 2023-08-08 | Apple Inc. | Techniques for user interfaces related to an event |
US12182373B2 (en) | 2021-04-27 | 2024-12-31 | Apple Inc. | Techniques for managing display usage |
US11921992B2 (en) | 2021-05-14 | 2024-03-05 | Apple Inc. | User interfaces related to time |
US12045014B2 (en) | 2022-01-24 | 2024-07-23 | Apple Inc. | User interfaces for indicating time |
US12229396B2 (en) | 2024-03-01 | 2025-02-18 | Apple Inc. | Weather user interface |
Also Published As
Publication number | Publication date |
---|---|
EP2199897A2 (en) | 2010-06-23 |
US20170068441A1 (en) | 2017-03-09 |
US10031665B2 (en) | 2018-07-24 |
KR101545880B1 (en) | 2015-08-21 |
KR20100072789A (en) | 2010-07-01 |
EP2199897A3 (en) | 2011-11-30 |
Similar Documents
Publication | Title
---|---
US10031665B2 (en) | Electronic device having touch screen and method for changing data displayed on the touch screen
US11150775B2 (en) | Electronic device and method for controlling screen display using temperature and humidity
US20200333951A1 (en) | Mobile terminal having dual touch screen and method of controlling content therein
KR101973631B1 (en) | Electronic Device And Method Of Controlling The Same
US20200301567A1 (en) | User interfaces for viewing and accessing content on an electronic device
US20170205894A1 (en) | Method and device for switching tasks
US20110216095A1 (en) | Methods, Devices, and Computer Program Products Providing Multi-Touch Drag and Drop Operations for Touch-Sensitive User Interfaces
CN102667697A (en) | User interface control with edge sensor for finger touch and motion sensing
CN107066167A (en) | Region selection method, apparatus, and graphical user interface
CN104364751A (en) | Electronic device and controlling method and program therefor
CN103092502A (en) | Method and apparatus for providing user interface in portable device
US20140035853A1 (en) | Method and apparatus for providing user interaction based on multi touch finger gesture
US20140022182A1 (en) | Techniques for programmable button on bezel of mobile terminal
CN105630307A (en) | Apparatus and method for displaying a plurality of applications on mobile terminal
CN103150093B (en) | Method, apparatus, and terminal for moving an operation indicator
CN105094527A (en) | Icon exchanging method and device
CN110275653A (en) | Page display method, apparatus, terminal, and storage medium
CN106201317A (en) | Icon text zoom method, apparatus, and terminal device
CN107390931A (en) | Touch operation response control method, apparatus, storage medium, and mobile terminal
EP2846239B1 (en) | Apparatus and method for executing function in electronic device
CN106933481A (en) | Screen scrolling method and apparatus
CN103150111A (en) | Symbol input method, apparatus, and terminal
CN110502169B (en) | Display control method and terminal
KR20150026615A (en) | Method for providing schedule management and mobile device thereof
CN105892918A (en) | Mobile terminal with touch screen and control method of mobile terminal
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: KIM, EUN SUN; PARK, KYUNG DAE; KIM, BYUNG JOO; AND OTHERS; REEL/FRAME: 024042/0069; Effective date: 20091218 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |