
KR102028656B1 - Terminal and method for operating thereof - Google Patents

Terminal and method for operating thereof

Info

Publication number
KR102028656B1
KR102028656B1 (application number KR1020120111797A)
Authority
KR
South Korea
Prior art keywords
area
icon
boundary area
user interface
displayed
Prior art date
Application number
KR1020120111797A
Other languages
Korean (ko)
Other versions
KR20140045718A (en)
Inventor
이현정
김종환
정재우
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Priority to KR1020120111797A
Publication of KR20140045718A
Application granted granted Critical
Publication of KR102028656B1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412: Digitisers structurally integrated in a display
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, using icons

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The present invention relates to a terminal and a method of operating the terminal, and more particularly, to a terminal that provides a boundary area including a menu and a method of operating such a terminal. According to an embodiment of the present disclosure, a terminal may include a user input unit configured to receive a boundary area start command or an icon selection command for selecting an icon included in the boundary area; a display unit that displays the boundary area and an active area; and a controller that displays the boundary area on the display unit when the user input unit receives the boundary area start command, performs an operation corresponding to the selected icon when the user input unit receives the icon selection command, and displays the result of the operation on the active area.

Description

Terminal and method of operating the terminal {TERMINAL AND METHOD FOR OPERATING THEREOF}

The present invention relates to a terminal and a method for operating the terminal, and more particularly, to a terminal for providing a boundary area including a menu and a method of operating the terminal.

Terminals may be divided into mobile/portable terminals and stationary terminals according to their mobility. Mobile terminals may be further classified into handheld terminals and vehicle-mounted terminals according to whether a user can carry them directly.

As terminal functions diversify, terminals are being implemented as multimedia players with composite functions such as capturing photos or video, playing music or video files, gaming, and receiving broadcasts.

To support and extend such terminal functions, improvements to the structural and/or software parts of the terminal may be considered.

As mobile terminals with relatively large display units and high-performance CPUs appear, such as touch-based mobile phones, e-books, smart pads, and tablet PCs, it becomes difficult for a user to reach the entire display unit with one hand.

It is an object of the present invention to provide a terminal, and a method of operating the same, that diversify the areas in which a user touch input can be received.

According to an embodiment of the present disclosure, a terminal may include a user input unit configured to receive a boundary area start command or an icon selection command for selecting an icon included in the boundary area; a display unit that displays the boundary area and an active area; and a controller that displays the boundary area on the display unit when the user input unit receives the boundary area start command, performs an operation corresponding to the selected icon when the user input unit receives the icon selection command, and displays the result of the operation on the active area.

According to an embodiment of the present disclosure, a method of operating a terminal may include receiving a boundary area start command; displaying a boundary area including at least one icon based on the boundary area start command; receiving an icon selection command for selecting one of the icons included in the boundary area; performing an operation corresponding to the icon selected by the icon selection command; and displaying a result of the operation on an active area.

According to the terminal and the operating method of the present invention, a user touch input can be received in various areas.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
FIG. 2 is a view for explaining a boundary area according to an exemplary embodiment of the present invention.
FIG. 3 is a diagram for describing a method of starting a boundary area according to an exemplary embodiment of the present invention.
FIG. 4 is a flowchart illustrating a method of operating a terminal according to an exemplary embodiment of the present invention.
FIG. 5 is a diagram illustrating a selection menu for user input provided in a boundary area according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating a keyboard menu for user input provided in a boundary area according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating an existing task list provided in a boundary area according to an embodiment of the present invention.
FIG. 8 is a diagram for describing a method of differently displaying the portion of the non-boundary area that is activated when a boundary area is selected according to a user input, according to an exemplary embodiment.

Hereinafter, a mobile terminal according to the present invention will be described in more detail with reference to the accompanying drawings. The suffixes "module" and "unit" for components used in the following description are given merely for ease of description and do not have distinct meanings or roles by themselves.

The mobile terminal described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, and the like. However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as digital TVs and desktop computers, except where a configuration is applicable only to mobile terminals.

Next, a structure of a mobile terminal according to an embodiment of the present invention will be described with reference to FIG. 1.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an A/V input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190. The components shown in FIG. 1 are not essential, so a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, or between the mobile terminal 100 and a network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short range communication module 114, a location information module 115, and the like.

The broadcast receiving module 111 receives a broadcast signal and/or broadcast related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may mean a server that generates and transmits a broadcast signal and/or broadcast related information, or a server that receives a previously generated broadcast signal and/or broadcast related information and transmits it to a terminal. The broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.

The broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

The broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may be configured to be suitable not only for the above-described digital broadcasting systems but also for other broadcasting systems.

The broadcast signal and/or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives wireless signals with at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signals may include various types of data according to transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.

The wireless internet module 113 refers to a module for wireless internet access and may be embedded in or external to the mobile terminal 100. Wireless internet technologies may include Wireless LAN (Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like.

The short range communication module 114 refers to a module for short range communication. As a short range communication technology, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and the like may be used.

The location information module 115 is a module for obtaining the location of the mobile terminal, and a representative example is a Global Positioning System (GPS) module.

Referring to FIG. 1, the A/V input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or moving images, obtained by the image sensor in the video call mode or the photographing mode. The processed image frames may be displayed on the display unit 151.

The image frame processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided according to the use environment.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, and the like, and processes it into electrical voice data. In the call mode, the processed voice data may be converted into a form transmittable to the mobile communication base station through the mobile communication module 112 and output. The microphone 122 may implement various noise removal algorithms to remove noise generated while receiving the external sound signal.

The user input unit 130 generates input data for the user to control the operation of the terminal. The user input unit 130 may include a keypad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like.

The sensing unit 140 detects the current state of the mobile terminal 100, such as the open/closed state of the mobile terminal 100, the location of the mobile terminal 100, the presence or absence of user contact, the orientation of the mobile terminal, and the acceleration/deceleration of the mobile terminal, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is a slide phone, the sensing unit may sense whether the slide phone is opened or closed. It may also sense whether the power supply unit 190 is supplying power and whether the interface unit 170 is coupled to an external device. The sensing unit 140 may include a proximity sensor 141.

The output unit 150 generates output related to sight, hearing, or touch, and may include a display unit 151, a sound output module 152, an alarm unit 153, and a haptic module 154.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal is in a call mode, it displays a user interface (UI) or a graphical user interface (GUI) related to the call. When the mobile terminal 100 is in a video call mode or a photographing mode, it displays the photographed and/or received images, the UI, and the GUI.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.

Some of these displays can be configured to be transparent or light-transmissive so that the outside can be seen through them. Such a display may be referred to as a transparent display, a representative example of which is the TOLED (Transparent OLED). The rear structure of the display unit 151 may also be configured as a light-transmissive structure. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151.

There may be two or more display units 151 according to the implementation form of the mobile terminal 100. For example, a plurality of display units may be spaced apart or integrally disposed on one surface of the mobile terminal 100, or may be disposed on different surfaces, respectively.

When the display unit 151 and a sensor for detecting a touch operation (hereinafter, a touch sensor) form a mutual layer structure (hereinafter, a touch screen), the display unit 151 may be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 151 or capacitance generated in a specific portion of the display unit 151 into an electrical input signal. The touch sensor may be configured to detect not only the position and area of the touch but also the pressure at the touch.

If there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the controller 180. As a result, the controller 180 can know which area of the display unit 151 was touched.

Referring to FIG. 1, a proximity sensor 141 may be disposed in an inner region of the mobile terminal surrounded by the touch screen, or near the touch screen. The proximity sensor 141 is a sensor that uses electromagnetic force or infrared rays, without mechanical contact, to detect the presence or absence of an object approaching a predetermined detection surface or an object present nearby. The proximity sensor 141 has a longer lifespan and higher utility than a contact sensor.

Examples of the proximity sensor 141 include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, the touch screen is configured to detect the proximity of the pointer by the change of the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of description, the act of bringing the pointer close to the touch screen without contact so that it is recognized as being located on the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The proximity touch position of the pointer on the touch screen refers to the position at which the pointer is vertically opposed to the touch screen during a proximity touch.

The proximity sensor detects a proximity touch and proximity touch patterns (for example, proximity touch distance, proximity touch direction, proximity touch speed, proximity touch time, proximity touch position, and proximity touch movement state). Information corresponding to the detected proximity touch operation and proximity touch pattern may be output on the touch screen.

The sound output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 during call signal reception, in a call mode or recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 152 may also output sound signals related to functions performed in the mobile terminal 100 (e.g., a call signal reception sound or a message reception sound). The sound output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event in the mobile terminal 100. Examples of events occurring in the mobile terminal include call signal reception, message reception, key signal input, and touch input. The alarm unit 153 may output a signal for notifying the occurrence of an event in a form other than a video or audio signal, for example, by vibration. Since the video or audio signal may also be output through the display unit 151 or the sound output module 152, the display unit 151 and the sound output module 152 may be classified as part of the alarm unit 153.

The haptic module 154 generates various haptic effects that a user can feel. Vibration is a representative example of the haptic effect generated by the haptic module 154. The intensity and pattern of vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output or may be sequentially output.

In addition to vibration, the haptic module 154 may generate various tactile effects, such as a pin array moving vertically against the skin surface in contact, a jetting or suction force of air through a jet or suction port, grazing on the skin surface, contact of an electrode, an electrostatic force, and the reproduction of a sense of warmth or cold using an element capable of absorbing or generating heat.

The haptic module 154 may not only deliver haptic effects through direct contact, but may also be implemented so that the user can feel haptic effects through muscle senses such as those of a finger or an arm. Two or more haptic modules 154 may be provided depending on the configuration of the mobile terminal 100.

The memory 160 may store a program for the operation of the controller 180 and may temporarily store input / output data (for example, a phone book, a message, a still image, a video, etc.). The memory 160 may store data regarding vibration and sound of various patterns output when a touch input on the touch screen is performed.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may also operate in connection with web storage that performs the storage function of the memory 160 over the Internet.

The interface unit 170 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 170 receives data or power from an external device and transfers it to each component inside the mobile terminal 100, or transmits data from inside the mobile terminal 100 to an external device. For example, the interface unit 170 may include wired/wireless headset ports, external charger ports, wired/wireless data ports, memory card ports, ports for connecting a device with an identification module, audio input/output (I/O) ports, video input/output (I/O) ports, earphone ports, and the like.

The identification module is a chip that stores various types of information for authenticating the usage authority of the mobile terminal 100, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device equipped with an identification module (hereinafter, an 'identification device') may be manufactured in the form of a smart card; accordingly, the identification device may be connected to the terminal 100 through a port.

When the mobile terminal 100 is connected to an external cradle, the interface unit may be a passage through which power from the cradle is supplied to the mobile terminal 100, or a passage through which various command signals input by the user at the cradle are delivered to the mobile terminal. The various command signals or the power input from the cradle may serve as signals for recognizing that the mobile terminal is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the mobile terminal. For example, it performs the control and processing related to voice calls, data communication, video calls, and the like. The controller 180 may include a multimedia module 181 for playing multimedia. The multimedia module 181 may be implemented within the controller 180 or separately from the controller 180.

The controller 180 may perform a pattern recognition process for recognizing a handwriting input or a drawing input performed on the touch screen as text or an image, respectively.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies the power required for the operation of each component.

Various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, these may be implemented by the controller 180.

In a software implementation, embodiments such as procedures or functions may be implemented with separate software modules that allow at least one function or operation to be performed. The software code may be implemented by a software application written in a suitable programming language. The software code may be stored in the memory 160 and executed by the controller 180.

Next, a boundary area of a mobile terminal according to an embodiment of the present invention will be described with reference to FIG. 2.

FIG. 2 is a view for explaining a boundary area according to an exemplary embodiment of the present invention.

Referring to FIG. 2, the display unit 151 of the mobile terminal 100 displays an active area 250 composed of at least one page. For example, when the controller 180 executes an application, the display unit 151 may display that application on one page.

According to an embodiment of the present disclosure, a user interface may be displayed on the edge area of the entire area of the display unit 151. Hereinafter, "edge area of the entire area of the display unit 151" and "boundary area 210" are used with the same meaning.

The boundary area 210 may mean an area including a physical edge of the display unit 151, or an area including an edge of the active area 250 of the display unit 151 described above, but is not limited thereto.

When the portrait mode is executed, the mobile terminal 100 may display the corresponding page in the vertical orientation on the display unit 151. In this case, the boundary area 210 may be formed along a side of the mobile terminal 100 or the display unit 151. The boundary area 210 may include at least one user interface 212, 214.

In this case, the user interfaces 212 and 214 included in the boundary area 210 may function the same as the user interfaces 252 and 254 included in the active area 250. For example, to receive a user input for installing a specific application or for canceling the installation, the display unit 151 may display an OK button 254 and a Cancel button 252 in the active area 250. In this case, the display unit 151 may display, in the boundary area 210, an OK button 214 that performs the same function as the OK button 254 of the active area 250 and a Cancel button 212 that performs the same function as the Cancel button 252 of the active area 250. The user may input a command to install application A to the mobile terminal 100 by selecting either the OK button 254 displayed on the active area 250 or the OK button 214 displayed on the boundary area 210. Likewise, the user may input a command to cancel the installation of application A by selecting either the Cancel button 252 displayed on the active area 250 or the Cancel button 212 displayed on the boundary area 210.
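The mirrored wiring described above can be pictured with a short sketch. This is a minimal illustration assuming a standard Android button API; the class and callback names (InstallDialogBinder, onConfirm, onCancel) are hypothetical, not from the patent.

```kotlin
import android.widget.Button

// A minimal sketch, not the patent's implementation: the active-area buttons
// (252, 254) and their boundary-area mirrors (212, 214) are wired to the
// same callbacks, so touching either target triggers the identical action.
class InstallDialogBinder(
    private val onConfirm: () -> Unit,  // e.g. install application A
    private val onCancel: () -> Unit    // e.g. cancel the installation
) {
    fun bind(
        activeOk: Button, activeCancel: Button,  // buttons in the active area
        borderOk: Button, borderCancel: Button   // mirrored buttons in the boundary area
    ) {
        listOf(activeOk, borderOk).forEach { b -> b.setOnClickListener { onConfirm() } }
        listOf(activeCancel, borderCancel).forEach { b -> b.setOnClickListener { onCancel() } }
    }
}
```

Because both touch targets dispatch to the same callback, the boundary-area button is functionally identical to its active-area counterpart.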

When the landscape mode is executed, the mobile terminal 100 may change the orientation of the corresponding page so that it is displayed horizontally on the display unit 151. In this case, a boundary area 240 may be formed along a side of the mobile terminal 100 or the display unit 151 and may include at least one user interface 216, 218, 220.

In this case, the user interfaces 216, 218, and 220 included in the boundary area 240 may function in the same manner as the user interfaces 256, 258, and 260 included in the active area 250. For example, when application B, which provides a music service, is executed, the display unit 151 may display a list containing information on a plurality of pieces of music. The list may include a first sound source indicator 256, a second sound source indicator 258, and a third sound source indicator 260. In this case, the display unit 151 may display, on the boundary area 240, a first icon 216 having the same function as the first sound source indicator 256, a second icon 218 having the same function as the second sound source indicator 258, and a third icon 220 having the same function as the third sound source indicator 260. The user may input a first sound source playback command by selecting either the first sound source indicator 256 displayed on the active area 250 or the first icon 216 displayed on the boundary area 240. Likewise, the user may input a second sound source playback command by selecting the second sound source indicator 258 or the second icon 218, or a third sound source playback command by selecting the third sound source indicator 260 or the third icon 220.

According to the exemplary embodiment of the present invention, when the user grips the mobile terminal 100 with one hand and wants to touch a user interface displayed in an area the fingers of that hand cannot reach, the display unit 151 displays the boundary areas 210 and 240 containing user interfaces with the same functions, so that the desired user interface can be touched with a finger of the gripping hand.

Meanwhile, the boundary areas 210 and 240, which include user interfaces performing the same functions as the user interfaces displayed on the active area 250, may be displayed on the display unit 151 only at certain times, such as when a user input is required. Hereinafter, referring to FIG. 3, a method of starting the boundary areas 210 and 240 on the display unit 151 at such a time will be described.

FIG. 3 is a diagram for describing a method of starting a boundary area according to an exemplary embodiment of the present invention.

Referring to FIG. 3, in the case of a user touch input, the user may input the boundary area start command by touching one of the edges 201 and 203 on either side of the touch pad in the portrait mode of the mobile terminal 100 and then moving a predetermined distance without interrupting the touch. Hereinafter, "a user touch that moves a certain distance without interrupting the touch input" and "drag" are used with the same meaning.

For example, in the case of a touch input using a finger, the user may touch an area including the left edge 201 of the touch pad and then drag toward the right side of the display unit 151, thereby inputting to the mobile terminal 100 a command to start the boundary area 210 in a portion of the left side of the touch pad. Although not shown in the drawing, the user may likewise touch an area including the right edge 203 of the touch pad and drag toward the left side of the display unit 151, thereby inputting a command to start the boundary area 210 in a portion of the right side of the touch pad.

In this case, the display unit 151 that receives the boundary area start command may start the boundary area 210 or 240 in a portion of the touch pad and display the active area 250 in the remaining area. As on a page-turning screen that displays a plurality of pages, the display unit 151 may display only part of the active area 250 in the area remaining outside the boundary areas 210 and 240. For example, when the boundary area 210 is started on the left side of the touch pad, the active area 250 may be displayed pushed to the right by the width of the boundary area 210, as shown, but is not limited thereto.

According to an embodiment of the present invention, unlike a general touch-and-drag input, also called a swipe input, a user input for initiating a boundary area may mean an input that touches an area including a predefined edge of the mobile terminal 100 and then drags. Therefore, when a touch-and-drag input is received, the display unit 151 displaying a plurality of pages determines whether the area in which the first touch input was received includes a predefined edge; if the edge is included, the boundary area may be started, and if not, a page-turning operation may be performed.
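The edge-versus-swipe decision described above can be sketched as follows. This is a hedged illustration assuming Android touch handling; the edge-band width, drag threshold, and callback names are assumptions rather than values from the patent.

```kotlin
import android.view.MotionEvent
import kotlin.math.abs

// Sketch of FIG. 3's decision: a drag whose first touch lands inside a
// predefined edge band starts the boundary area, while the same drag
// starting elsewhere falls through to normal page turning.
class EdgeDragDetector(
    private val screenWidthPx: Int,
    private val onStartBoundaryArea: (fromLeftEdge: Boolean) -> Unit,
    private val onPageTurn: (dx: Float) -> Unit
) {
    private val edgeBandPx = 48       // assumed edge-band width
    private val dragThresholdPx = 48f // assumed minimum drag distance
    private var downX = 0f
    private var startedOnEdge = false
    private var consumed = false

    fun onTouchEvent(ev: MotionEvent) {
        when (ev.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                downX = ev.x
                consumed = false
                // Does the first touch include a predefined edge?
                startedOnEdge = downX <= edgeBandPx || downX >= screenWidthPx - edgeBandPx
            }
            MotionEvent.ACTION_MOVE -> {
                val dx = ev.x - downX
                if (consumed || abs(dx) < dragThresholdPx) return
                consumed = true
                if (startedOnEdge) {
                    onStartBoundaryArea(downX <= edgeBandPx)  // edge drag: open the boundary area
                } else {
                    onPageTurn(dx)                            // ordinary swipe: turn the page
                }
            }
        }
    }
}
```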

Hereinafter, a method of operating a terminal using a border area according to an embodiment of the present invention will be described with reference to FIG. 4.

FIG. 4 is a flowchart illustrating a method of operating a terminal according to an exemplary embodiment of the present invention.

The mobile terminal 100 receives a user input for starting the boundary area (S101). The user input for starting the boundary area includes, but is not limited to, a boundary area start command such as the edge touch-and-drag input described above.

Subsequently, the mobile terminal 100 starts the boundary area on the display unit 151 (S103). The boundary area may include an edge of the display unit 151 and is not limited to a side edge.

The mobile terminal 100 then receives a user input for selecting an icon included in the boundary area (S105). An icon included in the boundary area may mean a user interface, and may be displayed so as to perform the same function as an icon that receives user input in the active area. It may be displayed in the same form as the corresponding icon of the active area, or in a different form; in the latter case, it may be displayed so as to intuitively indicate that the two icons represent user interfaces performing the same function.

When the icon displayed in the boundary area is selected, the mobile terminal 100 performs an operation corresponding to the selected icon (S107). The mobile terminal 100 may stop displaying the boundary area while performing the operation of step S107, but is not limited thereto.
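The S101-S107 flow can be summarized in a compact sketch; the Display and BorderIcon abstractions below are illustrative assumptions, not interfaces defined by the patent.

```kotlin
// A compact sketch of the S101-S107 flow in FIG. 4.
interface Display {
    fun showBoundaryArea()
    fun hideBoundaryArea()
    fun showInActiveArea(result: String)
}

interface BorderIcon {
    fun performOperation(): String  // the operation the icon corresponds to
}

class BoundaryAreaFlow(private val display: Display) {
    fun onBoundaryAreaStartCommand() {       // S101: start command received
        display.showBoundaryArea()           // S103: boundary area is started
    }

    fun onIconSelected(icon: BorderIcon) {   // S105: icon selection received
        val result = icon.performOperation() // S107: perform the mapped operation
        display.showInActiveArea(result)     // the result is shown in the active area
        display.hideBoundaryArea()           // display of the boundary area may stop here
    }
}
```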

Hereinafter, user interfaces displayed on a boundary area according to embodiments of the present invention will be described with reference to FIGS. 5 to 7. However, the boundary-area user interface is not limited to those described with reference to FIGS. 5 to 7.

FIG. 5 is a diagram illustrating a selection menu for user input provided in a boundary area according to an embodiment of the present invention.

Referring to FIG. 5, the boundary area 210 may include a plurality of icons 215, 217, and 219. As described above, each of the icons 215, 217, and 219 included in the boundary area 210 may correspond to one of the icons 209, 211, and 213 included in the active area 250. Both the icons 215, 217, and 219 of the boundary area 210 and the icons 209, 211, and 213 of the active area 250 may receive user input; alternatively, while the boundary area 210 is started, only the icons 215, 217, and 219 of the boundary area 210 may receive user input, but this is not limiting.

Reference numerals 231, 233, and 235 denote touch-and-drag inputs that start the boundary area 210 and select one of the icons 215, 217, and 219 displayed on it. That is, when the user touches the left edge of the touch pad, drags to the right, and then moves the drag upward to the position where the first button 215 is displayed, the mobile terminal 100 performs the operation corresponding to the first button 209, just as if it had received a user input selecting the first button 209 in the active area 250. Similarly, when the user touches the left edge of the touch pad, drags to the right, and moves the drag horizontally to where the second button 217 is displayed, the mobile terminal 100 performs the operation corresponding to the second button 211, just as if it had received a user input selecting the second button 211 in the active area 250. Likewise, when the user touches the left edge of the touch pad, drags to the right, and moves the drag downward to where the third button 219 is displayed, the mobile terminal 100 performs the operation corresponding to the third button 213, just as if it had received a user input selecting the third button 213 in the active area 250. In this case, if the user interrupts the touch after the touch-and-drag input corresponding to the boundary area 210 start command, the boundary area 210 may be closed even before the user selects one of the buttons.
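The direction-to-button mapping described above reduces to a small decision; the sketch below is illustrative, and the 40 px threshold is an assumed value, not one from the patent.

```kotlin
// Sketch of FIG. 5's gesture mapping: after the edge drag opens the boundary
// area, continuing the drag upward, roughly horizontally, or downward
// (without lifting the finger) selects the first, second, or third button.
enum class BorderButton { FIRST, SECOND, THIRD }  // buttons 215, 217, 219

fun selectByDragDirection(dy: Float, thresholdPx: Float = 40f): BorderButton =
    when {
        dy < -thresholdPx -> BorderButton.FIRST   // drag moved upward (screen y grows downward)
        dy > thresholdPx  -> BorderButton.THIRD   // drag moved downward
        else              -> BorderButton.SECOND  // drag stayed roughly horizontal
    }
```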

In this case, the first button 215, the second button 217, and the third button 219 may be displayed in predetermined areas of the boundary area 210 as shown in FIG. 5, or the areas where the buttons are displayed may move. When the button areas move, the area representing each button may shift toward the upper or lower side of the boundary area 210 according to the position of the user touch input that includes the boundary area 210 start command.

Although not shown in the drawing, the start of the boundary area 210 may also be maintained even after the user performs the touch-and-drag input corresponding to the boundary area 210 start command and interrupts the touch before selecting one of the buttons. In this case, the mobile terminal 100 may receive a separate touch input selecting one of the buttons displayed on the boundary area 210 and perform the operation corresponding to the selected button.

FIG. 6 is a diagram illustrating a keyboard menu for user input provided in a boundary area according to an embodiment of the present invention.

Referring to FIG. 6, the boundary area 210 may display a plurality of icons, including a character icon 251 and direction keys 255 and 257, for the characters included in the keyboard. In this case, the character icon 251 may represent the same character as the one indicated by the indicator 253 among the characters of the keyboard displayed on the active area 250. For example, as illustrated in FIG. 6, when the indicator 253 points to the lowercase letter d on the keyboard, the character icon 251 of the boundary area 210 also refers to the lowercase letter d. As described above, when the boundary area 210 is started, the mobile terminal 100 may receive a user input from either the character icon 251 of the boundary area 210 or the keyboard of the active area 250, or may receive user input only from the character icon 251 of the boundary area 210, but is not limited thereto.

The direction keys 255 and 257 may receive user input for changing the position of the indicator 253 displayed on the active area 250. For example, as shown in FIG. 6, the first direction key 255 may receive a user input for moving the current position of the indicator 253 to the right, and the second direction key 257 may receive a user input for moving it to the left. Although not shown in the drawing, the first direction key 255 may instead receive a user input for moving the indicator 253 upward, and the second direction key 257 for moving it downward. As such, the directions indicated by the direction keys 255 and 257 are not limited to up, down, left, and right, and at least one direction key may be included.

Reference numerals 259, 271, and 273 denote touch-and-drag inputs that start the boundary area 210 and select one of the icons 251, 255, and 257 displayed on it. According to an embodiment of the present disclosure, when the user touches the left edge of the touch pad, drags to the right, and moves the drag upward to the position where the first direction key 255 is displayed, the mobile terminal 100 moves the indicator 253 to the right so that it points to the lowercase f displayed to the right of the lowercase d. Likewise, when the user moves the drag downward to the position where the second direction key 257 is displayed, the mobile terminal 100 moves the indicator 253 to the left so that it points to the lowercase s displayed to the left of the lowercase d. And when the user moves the drag horizontally to the position where the character icon 251 is displayed, the mobile terminal 100 may enter the lowercase d indicated by the indicator 253 into the input window 239.

Although not shown in the drawings, according to another embodiment of the present invention, when the user touches the left edge of the touch pad, drags to the right, and moves the drag upward to the position where the first direction key 255 is displayed, the mobile terminal 100 may move the indicator 253 upward so that it points to the lowercase e or r displayed above the lowercase d. Likewise, when the user moves the drag downward to the position where the second direction key 257 is displayed, the mobile terminal 100 may move the indicator 253 downward so that it points to the lowercase x displayed below the lowercase d.

In the above case, if the user interrupts the touch after the touch-and-drag input corresponding to the boundary area 210 start command, the boundary area 210 may be closed even before the user selects one of the buttons.

Although not shown in the drawing, the start of the boundary area 210 may also be maintained even after the user performs the touch-and-drag input corresponding to the boundary area 210 start command and interrupts the touch before selecting one of the buttons. In this case, the mobile terminal 100 may receive a separate touch input selecting one of the buttons displayed on the boundary area 210 and perform the corresponding operation. In the other embodiment described above, after the boundary area 210 start command is input and the touch is interrupted, the mobile terminal 100 may receive user input for moving the indicator 253 up and down through the direction keys 255 and 257. Also, after the boundary area 210 start command is input and the touch is interrupted, the mobile terminal may change the character indicated by the character icon 251 when the user touches and drags the character icon 251. For example, when the user touches the character icon 251 of the boundary area 210 and drags to the left, the indicator 253 of the active area 250 may point to the lowercase s, and when dragging to the right, it may point to the lowercase f, but this is not limiting.
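The keyboard-menu behavior of FIG. 6 can be modeled with a small sketch; the single-row keyboard and the method names below are simplifying assumptions for illustration only.

```kotlin
// Sketch of FIG. 6: the direction keys (255, 257) move the indicator 253
// along a keyboard row, and the character icon (251) commits the highlighted
// character to the input window 239. The single-row model is an assumption.
class KeyboardIndicator(private val row: List<Char> = "asdfghjkl".toList()) {
    private var index = row.indexOf('d')  // indicator starts on the lowercase d

    fun moveRight() { if (index < row.lastIndex) index++ }  // first direction key 255
    fun moveLeft()  { if (index > 0) index-- }              // second direction key 257
    fun current(): Char = row[index]                        // character icon 251 mirrors this
    fun commit(): Char = current()                          // entered into input window 239
}
```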

FIG. 7 is a diagram illustrating an existing task list provided in a boundary area according to an embodiment of the present invention.

Referring to FIG. 7, the boundary area 240 may include an existing task list containing a plurality of task icons 275, 277, and 279. Each of the task icons 275, 277, and 279 may refer to an application in which the user recently performed an action on the mobile terminal 100. For example, the first task icon 275 may display a text message history, which may include information on the identifiers of the users who exchanged text messages, the contents of the messages, and the times they were sent or received. The second task icon 277 may display a call history, which may include information on the identifier of the other party, the call time, and so on, but is not limited thereto. The third task icon 279 may display a web site access history, which may include information on the identifier of the web site accessed from the mobile terminal 100, the pages accessed, the access times, and so on, but is not limited thereto. The above description of the task icons 275, 277, and 279 is merely an example, and neither the number of task icons nor the information they contain is limited thereto.

In this way, by displaying the existing task list in the boundary area, the user can start the boundary area and check or select from the existing task list while a separate application is running in the active area 250, which makes the terminal more convenient to use.

Hereinafter, referring to FIG. 8, a method of identifying the user interface of the active area that corresponds to a user interface indicated or displayed on the boundary area when the boundary area is started, according to an embodiment of the present disclosure, will be described.

FIG. 8 is a diagram for describing a method of differently displaying the portion of the non-boundary area that is activated when a boundary area is selected according to a user input, according to an exemplary embodiment. For reference, the non-boundary area may be used with the same meaning as the active area.

According to the exemplary embodiment of the present invention illustrated in FIG. 8, when the mobile terminal 100 receives a user input initiating the boundary areas 210 and 240, it may operate, based on user experience, as if the user interfaces 293 and 297 displayed on the active area 250 were selected. In this case, the mobile terminal 100 may perform the corresponding operation even when no user interface is displayed in the boundary areas 210 and 240, but is not limited thereto.

Referring to FIG. 8, when the mobile terminal 100 receives a user input 291 that touches the left boundary area 210 and drags to the right, the cancel icon 293 may be displayed differently from the other user interfaces of the active area 250. This lets the user know that, based on user experience, the mobile terminal 100 recognizes the start command for the left boundary area 210 as the same command as a selection of the cancel icon 293.

Similarly, when the mobile terminal 100 receives a user input 295 that touches the right boundary area 240 and drags to the left, the OK icon 297 may be displayed differently from the other user interfaces of the active area 250. This lets the user know that, based on user experience, the mobile terminal 100 recognizes the start command for the right boundary area 240 as the same command as a selection of the OK icon 297.

In this case, to display a specific user interface differently from the rest, the color of the specific button may be displayed differently from the other buttons, or an animation effect may be applied only to the specific button, but the method is not limited thereto.
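One plausible realization of this differentiated display, assuming the standard Android animation API, is a brief alpha pulse on the mapped control; the duration and repeat count below are illustrative values, not from the patent.

```kotlin
import android.animation.ObjectAnimator
import android.view.View

// Sketch of FIG. 8's differentiated display: the active-area control that
// the boundary-area gesture maps to is pulsed so the user can see which
// command the gesture will trigger.
fun highlightMappedControl(control: View) {
    ObjectAnimator.ofFloat(control, View.ALPHA, 1f, 0.4f, 1f).apply {
        duration = 400L  // one short pulse
        repeatCount = 2  // pulse a few times so it stands out from other controls
        start()
    }
}
```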

According to an embodiment of the present invention, the above-described method may be implemented as processor-readable code on a medium in which a program is recorded. Examples of processor-readable media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and also include implementations in the form of carrier waves (for example, transmission over the Internet).

The above-described mobile terminal is not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.

Claims (5)

A display unit which displays a boundary area and an active area;
A user input unit for receiving a boundary area start command and a selection command for selecting an icon included in the boundary area or a user interface included in the active area; and
A control unit which, when the user input unit receives the boundary area start command, displays the boundary area on the display unit, and, when the user input unit receives the selection command, performs an operation corresponding to the selected icon or the selected user interface and displays a result of the operation on the active area,
The boundary area is,
An edge of the display unit,
The icon included in the boundary area,
Perform the same function as the user interface included in the active area,
The control unit,
Display the user interface in the active area,
When receiving the boundary area start command, an icon performing the same function as the user interface displayed in the active area is displayed in the boundary area together with the user interface displayed in the active area.
terminal.
Claim 2 has been abandoned upon payment of the setup registration fee. The terminal of claim 1,
wherein the boundary area start command includes a user input of touching an edge of the terminal and dragging.
terminal.
Claim 3 has been abandoned upon payment of the setup registration fee. The terminal of claim 1,
wherein the icon included in the boundary area includes at least one of a character icon and a direction icon,
the character icon includes a letter, number, or symbol included in the keyboard, and
the direction icon performs a function of moving the position of the indicator displayed on the keyboard.
terminal.
Claim 4 has been abandoned upon payment of the setup registration fee. The terminal of claim 3,
wherein, when the user input unit receives a selection command selecting an icon included in the boundary area, the control unit displays the user interface of the active area corresponding to the selected icon so that it is differentiated, and
the differentiated display of the user interface includes an animation effect
terminal.
Displaying an active area comprising a user interface;
Receiving a boundary area start command;
Displaying a border area including an icon based on the border area start command;
Receiving a selection command for selecting an icon included in the boundary area or the user interface included in the active area;
Performing an operation corresponding to the selected icon or the selected user interface according to the selection command; And
Displaying a result of performing the operation on the active area;
The boundary area is,
Including an edge of the display unit,
The icon included in the border area,
Perform the same function as the user interface included in the active area,
The displaying of the boundary area includes,
upon receiving the boundary area start command, displaying on the boundary area an icon performing the same function as the user interface displayed on the active area, together with the user interface displayed on the active area.
Terminal operation method.
KR1020120111797A 2012-10-09 2012-10-09 Terminal and method for operating thereof KR102028656B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120111797A KR102028656B1 (en) 2012-10-09 2012-10-09 Terminal and method for operating thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120111797A KR102028656B1 (en) 2012-10-09 2012-10-09 Terminal and method for operating thereof

Publications (2)

Publication Number Publication Date
KR20140045718A KR20140045718A (en) 2014-04-17
KR102028656B1 (granted) 2019-10-04

Family

ID=50653020

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120111797A KR102028656B1 (en) 2012-10-09 2012-10-09 Terminal and method for operating thereof

Country Status (1)

Country Link
KR (1) KR102028656B1 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007179502A (en) 2005-12-28 2007-07-12 Sharp Corp Information processor

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101416992B1 (en) * 2007-05-03 2014-07-08 엘지전자 주식회사 Mobile Terminal With Touch Input Device And Method Of Displaying Item Using Same

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007179502A (en) 2005-12-28 2007-07-12 Sharp Corp Information processor

Also Published As

Publication number Publication date
KR20140045718A (en) 2014-04-17

Similar Documents

Publication Publication Date Title
KR101078929B1 (en) Terminal and internet-using method thereof
US9207854B2 (en) Mobile terminal and user interface of mobile terminal
EP2302497A2 (en) Mobile terminal and display controlling method thereof
KR20110054452A (en) Method for outputting tts voice data in mobile terminal and mobile terminal thereof
KR20100030968A (en) Terminal and method for displaying menu thereof
KR20100125635A (en) The method for executing menu in mobile terminal and mobile terminal using the same
KR101564108B1 (en) Method for cotrolling item in mobile terminal having dual display units and mobile terminal using the same
KR20110045664A (en) Method for displaying a menu in mobile terminal and mobile terminal thereof
KR20130082190A (en) Terminal and method for diaplaying icons
KR20110045659A (en) Method for controlling icon display in mobile terminal and mobile terminal thereof
KR20110013606A (en) Method for executing menu in mobile terminal and mobile terminal thereof
KR20100062252A (en) Mobile terminal and user interface of mobile terminal
KR20100104562A (en) Mobile terminal and method for controlling wallpaper display thereof
KR101859099B1 (en) Mobile device and control method for the same
KR20100038858A (en) Mobile terminal enable and method of controlling icon using same
KR20130059681A (en) Camera and method for controlling thereof
KR20100054184A (en) Mobile terminal and display method thereof
KR20130076028A (en) Method for controlling mobile terminal
KR20100039977A (en) Portable terminal and method of changing teleccommunication channel
KR20110016340A (en) Method for transmitting data in mobile terminal and mobile terminal thereof
KR102028656B1 (en) Terminal and method for operating thereof
KR20100117417A (en) Method for executing application in mobile terminal and mobile terminal using the same
KR102059007B1 (en) Terminal and control method thereof
KR20140067291A (en) Terminal and method for displaying icon
KR101598227B1 (en) Method for dispalying menu in mobile terminal and mobile terminal using the same

Legal Events

Date Code Title Description
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right