
WO2010116028A2 - Method for controlling an apparatus - Google Patents


Info

Publication number
WO2010116028A2
WO2010116028A2 (application PCT/FI2010/050252)
Authority
WO
WIPO (PCT)
Prior art keywords
menu
item
relating
sector
information relating
Prior art date
Application number
PCT/FI2010/050252
Other languages
French (fr)
Other versions
WO2010116028A3 (en)
Inventor
Raine Kajastila
Tapio Lokki
Original Assignee
Aalto-Korkeakoulusäätiö
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aalto-Korkeakoulusäätiö filed Critical Aalto-Korkeakoulusäätiö
Publication of WO2010116028A2 publication Critical patent/WO2010116028A2/en
Publication of WO2010116028A3 publication Critical patent/WO2010116028A3/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 - Sound input; Sound output
    • G06F3/167 - Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02 - Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023 - Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233 - Character input methods
    • G06F3/0236 - Character input methods using selection techniques to select from displayed items
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 - Interaction with lists of selectable items, e.g. menus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the invention relates to a method for controlling an apparatus by a surface of the apparatus.
  • the invention also relates to an apparatus, which is controlled by a surface of the apparatus.
  • the invention relates to a computer program product for controlling the apparatus by the surface of the apparatus.
  • the invention relates to a carrier medium comprising the computer program product for controlling the apparatus by the surface of the apparatus.
  • a user controls his/her mobile device, which utilizes a three dimensional audio system in order to produce a response to the user, through a keypad.
  • the three dimensional audio system audibly indicates keypad strokes as they are displayed within the display of the device by varying pitch, tone, and/or volume. For example, a sound indicating one key has a first pitch, another pressed key has a second pitch differing from the first pitch, a third pressed key has a third pitch differing from the first and second pitches, and so on. Consequently, when a certain pitch is associated with a certain key, the user can audibly detect a pressed key.
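The pitch-per-key scheme described above can be sketched as follows. This is an illustrative sketch only: the keypad layout, the base frequency, and the semitone spacing are assumptions, not values given in the patent.

```python
# Hypothetical sketch of the prior-art idea above: each keypad key is
# assigned its own pitch so a pressed key can be recognised by ear alone.
KEYS = "123456789*0#"  # assumed keypad ordering

def key_pitch_hz(key: str, base_hz: float = 440.0) -> float:
    """Return a distinct pitch for a key: one semitone apart, starting at base_hz."""
    index = KEYS.index(key)               # position of the key on the keypad
    return base_hz * 2 ** (index / 12.0)  # each key one semitone above the previous
```

Any injective key-to-pitch mapping would do; equal semitone steps simply make neighbouring keys sound clearly different.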
  • the user of the device controls the device through a rotating ball displayed on a touch panel. The ball is rotated by touching the touch panel, and the rotation of the ball, i.e. its direction and rotational speed, is represented by means of sounds.
  • One object of the invention is to provide a method for controlling an apparatus without a visual feedback.
  • the object of the invention is fulfilled by providing a method, wherein information relating to an interacted location on a surface of an apparatus is obtained, an item corresponding to the information relating to the interacted location on the surface of the apparatus is obtained, and an audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus is provided.
  • the object of the invention is also fulfilled by providing an apparatus, which is configured to obtain information relating to an interacted location on a surface of the apparatus, obtain an item corresponding to the information relating to the interacted location on the surface of the apparatus, and provide an audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus.
  • the object of the invention is also fulfilled by providing a computer program product, which, when the computer program product is run in a computer, obtains information relating to an interacted location on a surface of an apparatus, obtains an item corresponding to the information relating to the interacted location on the surface of the apparatus, and provides an audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus.
  • a carrier medium which comprises a computer program product, which, when the computer program product is run in a computer, obtains information relating to an interacted location on a surface of an apparatus, obtains an item corresponding to the information relating to the interacted location on the surface of the apparatus, and provides an audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus.
  • a mobile device is controlled by a user who moves his/her finger(s) on the touch surface of the mobile device in different directions for browsing a circular ego-centric auditory menu comprising one or more auditory menu items.
  • Targeted auditory menu items are indicated with speech or other sounds and reproduced from corresponding directions with three dimensional audio.
  • the synthesised targeted auditory menu items are transmitted from the mobile device, which possibly does not comprise a visual display at all, to the user's headphones so that a three dimensional auditory space is established around the user.
  • An embodiment of the present invention relates to a method according to independent claim 1.
  • an embodiment of the present invention relates to an apparatus according to independent claim 13.
  • an embodiment of the present invention relates to a computer program product according to independent claim 14.
  • an embodiment of the present invention relates to a carrier medium according to independent claim 15.
  • a method comprises obtaining information relating to an interacted location on a surface of an apparatus, obtaining an item corresponding to the information relating to the interacted location on the surface of the apparatus, and providing an audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus.
  • an interacted location refers to a certain location on the surface of the apparatus with which the user of the apparatus interacts somehow, e.g. by touching with a finger, hand, stylus, ordinary pen, or stick.
  • a surface of an apparatus refers to a surface (component), which is capable of detecting the user's interaction directed at the surface of the apparatus, e.g. resistively, by pressure, capacitively, or optically.
  • Such a surface can be e.g. a visual display, auditory display, or tactile display.
  • apparatus refers to e.g. a mobile station, laptop, computer, digital video disc (DVD) device, set-top box, video recorder, sound reproduction equipment, household apparatus such as microwave oven, car stereo, or navigator.
  • an item refers to e.g. an alphabetic character, number, menu, link, or function icon, which enables the user to e.g. dial, send a short message service (SMS) message, browse a phonebook or playlist, control playback, or surf radio stations.
  • the item is illustrated by means of the surface of the apparatus e.g. visually through the visual display, audibly through the auditory display, or by vibrations through the tactile display.
  • the method further comprises providing, by the surface of the apparatus, a circular menu comprising sectors, in which each sector indicates one item.
  • the form of the menu, which the apparatus illustrates visually and/or audibly to a user, can also be e.g. an ellipse or a square.
  • the method which is disclosed in any of the previous embodiments, wherein a centre of the circular menu and the points of the sectors are located at the centre of the surface of the apparatus.
  • the centre of the menu, regardless of whether the menu is circular, elliptical, or square, can be placed freely as long as the menu fits completely on the surface of the apparatus.
  • the method further comprises receiving the information relating to the interacted location of the surface of the apparatus by a touch surface illustrating the circular menu comprising the sectors.
  • the touch surface can be e.g. a touch screen or a mere interaction surface, which is capable of receiving the user's contact by e.g. a finger, hand, or other suitable instrument.
  • the method comprises determining a sector of the menu, which sector determines the item, by the information relating to the interacted location on the surface of the apparatus.
  • the interaction location information defines the menu sector indicating the targeted item.
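A minimal sketch, not given in the patent text, of how an interacted (x, y) location could define the menu sector: the touch point's angle around the menu centre is computed and divided by the equal angular width of the sectors. Sector 0 starting at the 12 o'clock position, clockwise numbering, and y growing downwards (as on most touch surfaces) are illustrative assumptions.

```python
import math

def sector_for_touch(x, y, cx, cy, n_sectors):
    """Return the index of the circular-menu sector containing the touch point
    (x, y), given the menu centre (cx, cy) and the number of equal sectors."""
    # Angle measured clockwise from "up"; cy - y because y grows downwards.
    angle = math.atan2(x - cx, cy - y) % (2 * math.pi)
    width = 2 * math.pi / n_sectors  # equal angular width per sector
    return int(angle // width)
```

For example, with four sectors and the centre at the origin, a touch straight above the centre falls in sector 0 and a touch to the right in sector 1.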
  • the method further comprises expanding the sector determining the item and its neighbouring sectors of the menu, and tapering the other sectors of the menu on the surface of the apparatus. After the user has "activated" the target item (the interacted sector) by touching, the interacted sector and possibly its neighbouring sectors expand and the other sectors shrink in the menu.
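The expansion and tapering described above can be sketched as a redistribution of angular widths: the activated sector and its two neighbours grow while the remaining sectors shrink so the widths still sum to a full circle. The growth factor of 2 and the choice of exactly two neighbours are illustrative assumptions, not values from the patent; the sketch also assumes more than three sectors.

```python
def expanded_widths(n_sectors, active, grow=2.0):
    """Return per-sector angular widths (degrees) after expanding the active
    sector and its two neighbours; the other sectors taper so the total stays 360."""
    base = 360.0 / n_sectors
    expanded = {active, (active - 1) % n_sectors, (active + 1) % n_sectors}
    wide = base * grow                                   # expanded sectors
    narrow = (360.0 - wide * len(expanded)) / (n_sectors - len(expanded))
    return [wide if i in expanded else narrow for i in range(n_sectors)]
```

With eight sectors and sector 0 active, sectors 7, 0, and 1 each widen from 45 to 90 degrees while the other five taper to 18 degrees.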
  • the method which is disclosed in any of the previous embodiments, further comprises receiving selection information relating to the item determined by the expanded sector. The user selects the "activated" item by touching it again with e.g. a finger, hand, stylus, ordinary pen, or stick, or by raising a finger from the touch surface.
  • the method which is disclosed in any of the previous embodiments, further comprises performing a function relating to the selected item determined by the expanded sector. So, when the user has selected the item, the function, which the selected item indicates, is executed in the apparatus.
  • the method further comprises providing an audio feedback, tactile feedback, and/or visual feedback relating to the function, which relates to the selected item determined by the expanded sector.
  • the audio feedback to the user is provided e.g. by the loudspeaker of the apparatus, or by headphones or a loudspeaker system connected to the apparatus.
  • the visual feedback, for one, is provided e.g. through the visual display of the apparatus or a display connected to the apparatus.
  • the tactile feedback indicating the made selection is established e.g. through the apparatus or another device capable of providing a tactile interaction, which is connected to the apparatus.
  • the connection between the apparatus and another device establishing the feedback can be e.g. a cable connection, wireless connection, or the combination of the cable and wireless connections.
  • the method which is disclosed in any of the previous embodiments, wherein the audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus and/or the audio feedback relating to the function, which relates to the selected item determined by the expanded sector, are provided by using a three dimensional sound, which indicates the location of the item in the menu of the surface of the apparatus.
  • the three dimensional sound feedback through the headphones, or loudspeaker system is established so that the targeted or selected items are reproduced from the correct directions of the targeted or selected items in a three dimensional audio space. It is, of course, possible to use mono or stereo sound instead of the three dimensional sound in order to provide the feedback to the user.
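Reproducing an item "from its correct direction" requires, at minimum, a mapping from the item's direction in the menu to the audio output. The sketch below uses a simple constant-power stereo pan as a stand-in for full three dimensional rendering (e.g. HRTF filtering over headphones); the angle convention (0 degrees straight ahead, positive to the right) is an assumption, not part of the patent.

```python
import math

def stereo_gains(azimuth_deg):
    """Return (left, right) gains for a source at the given azimuth using a
    constant-power pan law; full 3D audio would replace this with HRTF filtering."""
    # Map [-90, 90] degrees to a pan position in [0, 1].
    pan = (max(-90.0, min(90.0, azimuth_deg)) + 90.0) / 180.0
    left = math.cos(pan * math.pi / 2)   # gains trade off so total power is constant
    right = math.sin(pan * math.pi / 2)
    return left, right
```

A source straight ahead gets equal gains on both channels; a source at -90 degrees plays only on the left.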
  • the method which is disclosed in any of the previous embodiments, wherein the audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus and/or the audio feedback relating to the function, which relates to the selected item determined by the expanded sector, are provided by speech.
  • when the activation or selection of the item has been established, information describing the activated and/or selected item, or mere selection information, is synthesised into speech or other sounds, such as an earcon, spearcon, or auditory icon, possibly mixed with speech, and reproduced to the user.
  • the method which is disclosed in any of the previous embodiments, wherein items are located around the circular menu in an alphabetical order or numerical order.
  • Item locations in the circular menu can be defined by the alphabetical or numerical order, where menu items are located around the circular menu depending on their first (second, third, etc.) letter.
  • the menu items are thus always found at locations known in advance.
  • the items are not necessarily in a strict alphabetical or numerical order, wherein the first letter or any other letter or a number defines the location of the item on the surface; instead, they can be placed at certain locations on the surface depending on the first or any other letter or number.
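The alphabetical placement described above can be sketched as follows: an item's first letter picks a fixed direction on the circle, so an item starting with "a" is always found at the same prior-known location. The 12 o'clock starting point and clockwise ordering are illustrative assumptions.

```python
import string

def item_angle_deg(name):
    """Return the clockwise angle from 12 o'clock at which an item is placed,
    based on the alphabet position of its first letter (26 equal directions)."""
    letter = name[0].lower()
    index = string.ascii_lowercase.index(letter)  # 0 for 'a', 25 for 'z'
    return 360.0 * index / 26
```

An item starting with "a" sits at 0 degrees (straight up) and one starting with "n", halfway through the alphabet, directly opposite at 180 degrees.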
  • an apparatus is configured to obtain information relating to an interacted location on a surface of the apparatus, obtain an item corresponding to the information relating to the interacted location on the surface of the apparatus, and provide an audio feedback relating to the item corre- sponding to the information relating to the interacted location on the surface of the apparatus.
  • the apparatus which is disclosed in the previous embodiment, is configured to provide, by the surface of the apparatus, a circular menu comprising sectors, in which each sector indicates one item.
  • the apparatus which is disclosed in any of the previous embodiments, wherein a centre of the circular menu and the points of the sectors are located at the centre of the surface of the apparatus.
  • the apparatus which is disclosed in any of the previous embodiments, is further configured to receive the information relating to the interacted location of the surface of the apparatus by a touch surface illustrating the circular menu comprising the sectors.
  • the apparatus which is disclosed in any of the previous embodiments, is configured to determine a sector of the menu, which sector determines the item, by the information relating to the interacted location on the surface of the apparatus.
  • the apparatus which is disclosed in any of the previous embodiments, is further configured to expand the sector determining the item and neighbouring sectors of the menu, and taper other sectors of the menu on the surface of the apparatus.
  • the apparatus which is disclosed in any of the previous embodiments, is further configured to receive selection information relating to the item determined by the expanded sector.
  • the apparatus which is disclosed in any of the previous embodiments, is further configured to perform a function relating to the selected item determined by the expanded sector.
  • the apparatus which is disclosed in any of the previous embodiments, is further configured to provide an audio feedback, tactile feedback, and/or visual feedback relating to the function, which relates to the selected item determined by the expanded sector.
  • the apparatus which is disclosed in any of the previous embodiments, wherein the audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus and/or the audio feedback relating to the function, which relates to the selected item determined by the expanded sector, are provided by using a three dimensional sound, which indicates the location of the item in the menu of the surface of the apparatus.
  • the apparatus which is disclosed in any of the previous embodiments, wherein the audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus and/or the audio feedback relating to the function, which relates to the selected item determined by the expanded sector, are provided by speech.
  • the apparatus which is disclosed in any of the previous embodiments, wherein items are located around the circular menu in an alphabetical order or numerical order.
  • the method according to embodiments of the invention, which utilises a touch surface input with auditory menus, allows a reasonably fast and accurate way to navigate circular auditory menus with a large number of menu items.
  • when a user is browsing the menu items, they are read out loud and the sound is reproduced from the correct direction.
  • the fast browsing is enabled with a reactive interruptible audio design and the accuracy in selection is enhanced with the dynamic movement of the menu items and the expansion of the selection area.
  • the method according to embodiments of the invention can be used to handle all the basic controls of a modern mobile phone or a music player, which contains a touch surface. It is also possible to construct a small multi-functional device consisting only of a touch surface without a screen. Such robust devices without a visual display can be inexpensive and have low energy consumption, but still offer the same functionalities as similar devices with a visual screen.
  • Figure 1 illustrates an exemplary view of an arrangement comprising a mobile device and an auditory user interface according to an advantageous embodiment of the invention
  • Figure 2 illustrates an exemplary view of an arrangement comprising a mobile device and a remote controlled device according to an advantageous embodiment of the invention
  • Figure 3 illustrates an exemplary flowchart of a method according to an advantageous embodiment of the invention
  • Figure 4 illustrates exemplary views of auditory menus according to an advantageous embodiment of the invention
  • Figure 5 illustrates further exemplary views of auditory menus according to an advantageous embodiment of the invention
  • Figure 6 illustrates an exemplary view of a method according to an advantageous embodiment of the invention
  • Figure 7 illustrates an exemplary view of an apparatus according to an advantageous embodiment of the invention.
  • Figure 8 illustrates an exemplary view of a remote device to be controlled by a method according to an advantageous embodiment of the invention.
  • Figure 1 illustrates an arrangement 100, wherein a mobile device 110, e.g. a mobile station or PDA, comprises a user interface 120, such as a touch surface or touch screen implemented in the mobile device 110, for receiving control commands from a user (not shown in the figure).
  • the processor of the mobile device runs a program application, e.g. an auditory user interface application, in the mobile device 110 in order to provide the auditory user interface, which comprises one or more auditory items 130a-130e to the user.
  • the auditory user interface enables the mobile device 110 to comprise only a touch surface, whereupon it does not have a display, or possibly a touch screen including the display, but not any keyboard.
  • the mobile device 110 provides an auditory space around the user by means of a mobile device loudspeaker (not shown), headphones 140 (or one earphone), which have a wired or wireless connection 150 to the mobile device 110, or a loudspeaker system (not shown) comprising at least one loudspeaker established around the user.
  • the loudspeaker system is also connected to the mobile device 110 through a wireless or wired connection.
  • an auditory object 130a illustrates an icon which provides access to e.g. a phonebook.
  • if the mobile device 110 comprises the display, the user has better knowledge of the location of the icon.
  • the user interface application provides an auditory feedback through the mobile device loudspeaker, headphones 140, one earphone, or loudspeaker system, when the user targets the desired auditory icon 130a.
  • the feedback can be provided by using a three dimensional sound, whereupon the feedback comes from the direction in which the targeted (activated) icon is located.
  • it is also possible to use mono or stereo sound instead of the three dimensional sound for providing the feedback to the user.
  • the feedback notifies the user that he/she can access the phonebook, which the targeted icon 130a indicates, by selecting the icon 130a.
  • the user raises his/her finger from the touch surface 120, where the finger rests after the icon activation, presses a button 160 or other suitable key on the mobile device, or provides the selection in some other way in order to access the phonebook.
  • the application provides another audio feedback, which describes that the selection is made, by means of e.g. a fast replay.
  • the user can browse the phonebook for selecting the name of the person whom he/she wants to call, and make a call by pointing at auditory objects 130a-130e, which enable calling, in the auditory menu.
  • the user can also perform e.g. SMS message sending, playlist browsing, playback control, and radio station surfing.
  • Figure 2 represents an arrangement 200, wherein a mobile device 210, which comprises a touch surface or touch screen 220 in order to receive control commands from a user 230, communicates with a device 240, such as a desktop computer having a display 250, keyboard 260, and loudspeakers 270a, 270b, by using a wireless Bluetooth or infrared connection 280.
  • the computer 240 establishes an auditory user interface, which has auditory items such as auditory menus and auditory icons, around the user 230 by the loudspeakers 270a, 270b.
  • the computer 240 can run a graphical user interface comprising e.g. graphical menus and graphical icons displayed on the display 250.
  • when the user 230 wants to control a jukebox application run by the computer 240 in order to select the next song to be played from a song list, which includes hundreds of songs, he/she points at the touch surface 220 in order to target a song icon indicating a desired song in the auditory user interface of the jukebox application, so that, if the desired song icon is located in the top left corner of the display 250 and/or the front left corner of the auditory user interface, the user 230 touches, e.g. with his/her finger(s), the top left corner of the touch surface 220. By pointing at the song icon the user 230 activates the song icon, and the activation is indicated to the user 230 audibly by reproducing a sound through the loudspeakers 270a, 270b.
  • the user 230 raises his/her finger from the touch surface 220, where the finger rests after the activation, or presses a button 290 belonging to the mobile device 210 in order to play (select) the desired song. Then, the loudspeakers 270a, 270b reproduce a sound indicating the selection and the desired song, if the song is next on the play list.
  • the mobile device 210 comprising the touch surface 220 for enabling a song selection can also be utilised as the remote controller of the television (not shown), which visually illustrates a circular menu on the display of the television.
  • the mobile device 210 can be used for e.g. channel selecting or volume level adjusting.
  • a computer program, e.g. the above-mentioned jukebox application or a game application, which the computer 240 runs, is displayed on the touch screen 220 of the mobile device 210, and sounds relating to the application are reproduced through the loudspeaker of the mobile device (not shown) or headphones (not shown) connected to the mobile device 210.
  • the connection between the mobile device 210 and the headphones is either a wired or a wireless connection.
  • the jukebox or game application is displayed on the computer display 250 and sounds are reproduced through the mobile device loudspeaker or headphones connected to the mobile device 210.
  • Figure 3 discloses, by means of an example only, a flow chart describing a method 300 according to one embodiment of the invention.
  • a mobile device and/or an application, such as a user interface application executing the method, is turned on, and necessary stages, such as connection set-up for e.g. external headphones and the initialisation of different parameters relating to an auditory user interface, are provided.
  • a circular menu is established to a user in the mobile device by means of the touch surface of the mobile device in step 320.
  • the circular menu comprising items can be displayed to the user visually through a mobile device display, e.g. a touch screen, and/or audibly through a circular auditory menu comprising auditory items provided by the mobile device and the headphones.
  • when a mobile device user touches a certain location on the touch surface (touch screen) of the mobile device with e.g. his/her finger in order to activate a menu item from the circular menu, the interacted location information, which comprises e.g. the x and y coordinates of the finger touch or information relating to the interacted sector of the circular menu, is obtained resistively, capacitively, or by any other means in step 330.
  • in step 340, the menu item is obtained on the basis of the obtained interacted location information, so that the menu item is defined directly by means of the interacted sector information or by the coordinate information specifying the sector, which then defines the menu item.
  • the menu item activation is indicated to the user by an audio feedback relating to the activated menu item through the headphones.
  • the feedback can be established by using a three dimensional sound so that the feedback relating to the activated menu item is reproduced from the correct direction of the activated item.
  • Mono or stereo sound can also be used instead of the three dimensional sound in order to provide the feedback to the user.
  • the information describing the activated item is synthesised into speech or other sounds and reproduced to the user.
  • recorded samples can be used instead of the synthesised speech.
  • the sector determining the activated item and its neighbouring sectors expand and the other sectors of the circular menu taper on the surface of the apparatus and in the auditory space around the user for helping the user to target his/her selection.
  • in step 360, it is determined whether the mobile device receives, through the touch surface, selection information relating to the activated item.
  • the selection information is indicated e.g. by raising the finger from the touch surface of the mobile device.
  • if no selection information is received, the method returns to step 320.
  • when the selection is established, the mobile device performs a function, which the activated menu item indicates, in step 370.
  • such a function can be e.g. making a call, placing the call on hold, or disconnecting the call.
  • the performed function or function to be performed is indicated to the user by an audio, visual, and/or tactile feedback, which relates to the selected menu item. This step is usually provided together with step 370.
  • control method is ended in step 390.
  • It is also possible to execute the method so that the mobile device obtains the interacted location information and transfers it to e.g. a stereo set capable of producing an auditory user interface around the user, which then obtains the item on the grounds of the obtained interacted location information and provides the audio feedback relating to the activated item to the user of the mobile device.
  • Such a remote device can be e.g. a stereo set or a computer having loudspeakers.
  • Alternatively, the mobile device obtains the item according to step 340 of the previous embodiment and transfers the item information to the stereo set so that it can reproduce the audio feedback indicating the activated item.
  • The mobile device determines whether the selection of the activated item is executed and, when it determines that the user selects the activated item, it transfers the selection information to the stereo set, which performs the function indicated by the activated and selected item.
  • The stereo set also provides the audio feedback relating to the selected item to the user.
  • The above described control methods utilise an auditory user interface.
  • The auditory menu contains e.g. a dial menu, which enables dialling, listening to, and removing selected numbers,
  • an SMS menu, which contains all letters and a "space" for writing and an icon for sending a written message,
  • and a phonebook menu, which comprises an alphabetically organised phonebook.
  • The phonebook can include multiple layers, e.g. alphabets and, for each of them, a submenu with names.
  • Menu browsing happens e.g. by circular finger sweeping on a touch surface, as mentioned earlier, amongst the menu items, which are spread evenly in the surrounding circular space. Depending on the number of items, the space between them is dynamically changed.
  • Figure 4 discloses a circular menu 410 and a square menu 420 according to an embodiment of the invention, wherein the menu centres and sector points are located in the centre of the surface.
  • The circular menu 410 and the square menu 420 both comprise alphabetic character icons and a "space" icon.
  • The lower menus 410, 420 represent how the touch surfaces are divided into sectors having an equal point angle and how each sector comprises one icon. This figure shows one example of how the positions of the alphabetic characters can be arranged around the user in the static auditory menu.
  • The user can access any item directly by placing the finger on the touch surface, and he/she can continue browsing with a circular finger sweep. A selection is made by removing the finger from the surface.
  • The centre of the touch surface is a safe area from which the finger can be lifted without making a selection.
  • Other special actions can be assigned to the centre, corners, or any other specific location of the touch surface.
  • The menus 410, 420 can utilise the static placement of the menu items, which suits three dimensional menus well because the items are always at known locations. Thus, each item can be accessed directly by touching a known location, and the auditory feedback of the targeted item reveals whether the right item was activated; if not, the static order of the items reveals where the desired item will be found, since item locations are easy to remember with longer use of the menu.
  • The menu items are separate items that monitor whether an "interaction" is in their defined sector. Targeted items send information to the other items in order to manage the positions of the other items, fade their sounds, and adjust their sector sizes.
  • Figure 5 illustrates a circular menu 510 and a square menu 520, wherein the sectors are dynamically adjusted to enhance the user's pointing accuracy.
  • The dynamic zoom of the target sector reduces undesired jumping between items and facilitates item selection with a bigger number of items.
  • The user touches a touch surface with his/her finger in order to activate an icon indicating the letter R, and when the apparatus (application) detects the interaction, it expands the activated target sector R in both directions to enable stable browsing and selecting.
  • The neighbouring sectors S, T, U, Q, P, and O also expand, and the other sectors, in turn, regroup so that they taper.
  • The lower menus 510, 520 represent how part of the sectors expand and the other sectors regroup.
  • The touch surface can also comprise a touch area in the middle of the touch surface and/or touchable areas around the sectors.
  • Figure 6 discloses a further enhancement for auditory menu layouts in order to achieve faster and better usability.
  • A start place is defined to always be the first name in alphabetical order.
  • The advanced spreading method can also be adapted to menus with multiple layers and to small menus with only a few menu items.
  • An example relating to browsing a small contact list is depicted in figure 6.
  • A first menu level containing the alphabets can be enhanced with the advanced spreading to gain easier access to the menu items.
  • Names can be positioned so that they are always found according to a second letter. For example, Aaron is always positioned in front and Amber behind in an auditory menu. When the user touches the screen, the closest menu item becomes active and the rest are spread evenly around it. If the menu happens to have only four items, the series of four can be repeated when the user continues browsing the menu, as presented in the centre menu 620. The user can always return to the absolute positioning of the menu items by moving the finger to the centre of the surface (screen), as the menu 630 on the right shows.
  • Figure 7 discloses one example of a mobile device 700 according to an embodiment of the invention.
  • The mobile device 700 comprises a processor 710 for performing instructions and handling data, a memory unit 720 for storing data such as instructions and application data, and a user interface 730, which can comprise at least a touch surface or touch screen.
  • The user interface 730 can also comprise e.g. a single button, a keyboard, or other selection means.
  • The mobile device 700 further comprises data transfer means 740 for transmitting and receiving data and a loudspeaker 750.
  • The mobile device can also comprise a display 760 for providing a visual or tactile feedback, but this is not necessary.
  • The memory 720 stores at least an auditory user interface application 722, an application 724 for determining interacted location data, and a synthesizer application 726.
  • The implemented touch surface 730 obtains the interacted location data, which the processor 710 manipulates according to the instructions of the corresponding application 724, and the synthesizer 726 converts the obtained data from text format to speech, which is reproduced through the loudspeaker 750 e.g. by using the three dimensional sound.
  • Figure 8 discloses a device 800, which is controlled through an air interface by a mobile device capable of receiving control inputs by a touch surface or a touch screen.
  • Such a device 800 can be e.g. a mobile station, computer, laptop, DVD recorder, personal computer, stereo set, etc.
  • The device 800 comprises a processor 810 for performing instructions and handling data, a memory 820 for storing data such as instructions and application data, a user interface 830 comprising e.g. a touchpad, keyboard, or one or more buttons, at least data receiving means 840 for receiving data, a loudspeaker 850, and optionally a display 860.
  • The device 800 can comprise data transmitting means 842 for sending data to an external loudspeaker (not shown).
  • The memory of the device 800 includes e.g. at least an auditory user interface application 822, an application 824 for manipulating interacted location data or item data, and a synthesizer application 826.
  • The processor 810 manipulates the received interacted location data or item data according to the instructions of the corresponding application 824, and the synthesizer 826 converts the obtained data from text format to speech, which is provided through the loudspeaker 850 e.g. by using the three dimensional sound.
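The stepwise control method described in the bullets above can be pictured with a short sketch. The code below is an illustrative Python approximation only, not part of the patent: the item names, the unit-square touch surface, and the print-based feedback are assumptions standing in for synthesised speech and real touch events.

```python
import math

# Hypothetical menu contents; any items could be used.
ITEMS = ["call", "hold", "disconnect", "phonebook", "messages", "radio"]

def sector_from_touch(x, y, n_items, cx=0.5, cy=0.5):
    """Map touch coordinates on a unit touch surface to a sector index of a
    circular menu centred at (cx, cy); the sector defines the item (step 340)."""
    angle = math.atan2(y - cy, x - cx) % (2 * math.pi)
    width = 2 * math.pi / n_items          # equal point angle per sector
    return int(angle // width)

def on_touch(x, y):
    """Steps 330-350: obtain the interacted location and give audio feedback."""
    sector = sector_from_touch(x, y, len(ITEMS))
    item = ITEMS[sector]
    # Stand-in for synthesised speech reproduced from the item's direction.
    print(f"activated: {item}")
    return item

def on_release(item):
    """Steps 360-370: raising the finger selects the activated item."""
    print(f"selected: {item}, performing its function")
```

In use, browsing would call `on_touch` as the finger sweeps, and lifting the finger would call `on_release` with the last activated item.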

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Investigating Or Analyzing Materials By The Use Of Fluid Adsorption Or Reactions (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a method for controlling an apparatus by a surface of the apparatus, which method comprises obtaining information relating to an interacted location on a surface of an apparatus, obtaining an item corresponding to the information relating to the interacted location on the surface of the apparatus, and providing an audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus. The invention also relates to the apparatus controlled by the surface of the apparatus, a computer program product for controlling the apparatus by a surface of the apparatus, and a carrier medium, which comprises the computer program product for controlling the apparatus by a surface of the apparatus.

Description

METHOD FOR CONTROLLING AN APPARATUS
TECHNICAL FIELD OF THE INVENTION
The invention relates to a method for controlling an apparatus by a surface of the apparatus. The invention also relates to an apparatus, which is controlled by a surface of the apparatus. Also, the invention relates to a computer program product for controlling the apparatus by the surface of the apparatus. Furthermore, the invention relates to a carrier medium comprising the computer program product for controlling the apparatus by the surface of the apparatus.
BACKGROUND OF THE INVENTION
The design and use of audio-only and eyes-free interfaces have been emerging in recent years, and with a proper design an auditory interface can be even more effective than its visual counterpart. Although three dimensional auditory user interfaces are not widely used yet, they can bring better usability in situations where eyes-free operation is necessary. Such cases include competition for the visual attention, the absence or limitations of a visual display, a user disability, or the reduction of battery consumption.
Due to the above-mentioned, a need for an easy way to control devices, especially mobile devices such as e.g. mobile stations, laptops, and personal digital assistants (PDAs), which utilize auditory user interfaces, has become real.
In one solution a user controls his/her mobile device, which utilizes a three dimensional audio system in order to produce a response to the user, through a keypad. When the user presses keypad keys, the three dimensional audio system audibly indicates keypad strokes as they are displayed within the display of the device by varying pitch, tone, and/or volume. For example, a sound indicating one key has a first pitch, another pressed key has a second pitch differing from the first pitch, a third pressed key has a third pitch differing from the first and second pitches, and so on. Consequently, when a certain pitch is associated with a certain key, the user can audibly detect a pressed key. In another solution the user of the device controls the device through a rotating ball displayed in a touch panel. The ball is rotated by touching the touch panel and the rotation of the ball, a direction and rotational speed, is represented by means of sounds.
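The pitch-varying keypad feedback of the first solution can be pictured as a simple key-to-frequency mapping. The sketch below is illustrative only; the base pitch and semitone spacing are assumptions, not values from the cited solution.

```python
# Illustrative mapping: every keypad key gets its own pitch so that the
# user can audibly distinguish pressed keys, as in the described solution.
BASE_HZ = 440.0           # assumed base pitch (A4)
KEYS = "123456789*0#"     # standard 12-key keypad

def key_pitch(key):
    """Return a distinct pitch for the given key, one semitone apart."""
    return BASE_HZ * 2 ** (KEYS.index(key) / 12)
```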
SUMMARY
One object of the invention is to provide a method for controlling an apparatus without a visual feedback.
The object of the invention is fulfilled by providing a method, wherein information relating to an interacted location on a surface of an apparatus is obtained, an item corresponding to the information relating to the interacted location on the surface of the apparatus is obtained, and an audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus is provided.
The object of the invention is also fulfilled by providing an apparatus, which is configured to obtain information relating to an interacted location on a surface of the apparatus, obtain an item corresponding to the information relating to the interacted location on the surface of the apparatus, and provide an audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus.
The object of the invention is also fulfilled by providing a computer program product, which, when the computer program product is run in a computer, obtains information relating to an interacted location on a surface of an apparatus, obtains an item corresponding to the information relating to the interacted location on the surface of the apparatus, and provides an audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus.
The object of the invention is also fulfilled by providing a carrier medium, which comprises a computer program product, which, when the computer program product is run in a computer, obtains information relating to an interacted location on a surface of an apparatus, obtains an item corresponding to the information relating to the interacted location on the surface of the apparatus, and provides an audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus.
According to an embodiment of the invention a mobile device is controlled by a user who moves his/her finger(s) on the touch surface of the mobile device in different directions for browsing a circular ego-centric auditory menu comprising one or more auditory menu items. Targeted auditory menu items are indicated with speech or other sounds and reproduced from the corresponding directions with a three dimensional audio. The synthesised targeted auditory menu items are transmitted from the mobile device, which possibly does not comprise a visual display at all, to the user's headphones so that a three dimensional auditory space is established around the user.
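A crude stereo approximation of such direction-dependent reproduction is constant-power panning, sketched below. This is an illustration only: true three dimensional audio would typically use HRTF rendering, and the azimuth convention here is an assumption.

```python
import math

def pan_gains(azimuth_deg):
    """Constant-power stereo gains for a sound source at the given azimuth
    (0 = front, -90 = hard left, +90 = hard right). A simple stand-in for
    full three dimensional (HRTF) rendering of a menu item's direction."""
    # Map [-90, 90] degrees onto the panning angle [0, pi/2].
    theta = (azimuth_deg + 90) / 180 * (math.pi / 2)
    return math.cos(theta), math.sin(theta)  # (left gain, right gain)

left, right = pan_gains(-90)  # an item on the user's left plays only in the left ear
```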
An embodiment of the present invention relates to a method according to independent claim 1.
In addition, an embodiment of the present invention relates to an apparatus according to independent claim 13.
Furthermore, an embodiment of the present invention relates to a computer program product according to independent claim 14.
Also, an embodiment of the present invention relates to a carrier medium according to independent claim 15.
Further embodiments are defined in dependent claims.
According to an embodiment of the invention a method comprises obtaining information relating to an interacted location on a surface of an apparatus, obtaining an item corresponding to the information relating to the interacted location on the surface of the apparatus, and providing an audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus.
The term "interacted location" refers to a certain location on the surface of the apparatus, with which the user of the apparatus interacts somehow, e.g. by touching it with a finger, hand, stylus, ordinary pen, or stick.
The term "surface of an apparatus" refers to a surface (component), which is capable of detecting user's interaction directed at the surface of the apparatus, e.g. resistively, by pressure, capacitively, or optically. Such surface can be e.g. a visual display, auditory display, or tactile display.
The term "apparatus" refers to e.g. a mobile station, laptop, computer, digital video disc (DVD) device, set-top box, video recorder, sound reproduction equipment, household apparatus such as a microwave oven, car stereo, or navigator.
The term "item" refers to e.g. an alphabetic character, a number, a function icon, which enables the user to e.g. dial, send a short message service (SMS) message, browse a phonebook or playlist, control playback, and perform radio station surfing, a menu, or a link. The item is illustrated by means of the surface of the apparatus e.g. visually through the visual display, audibly through the auditory display, or by vibrations through the tactile display.
According to an embodiment of the invention the method, which is disclosed in the previous embodiment, further comprises providing, by the surface of the apparatus, a circular menu comprising sectors, each of which indicates one item. The form of the menu, which the apparatus illustrates visually and/or audibly to a user, can also be e.g. an ellipse or a square.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, wherein a centre of the circular menu and points of the sectors are located in a centre of the surface of the apparatus. The centre of the menu, regardless of whether the menu is circular, elliptical, or square, can be placed freely as long as the menu completely fits on the surface of the apparatus.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, further comprises receiving the information relating to the interacted location of the surface of the apparatus by a touch surface illustrating the circular menu comprising the sectors. The touch surface can be e.g. a touch screen or a mere interaction surface, which is capable of receiving the user's contact by e.g. a finger, hand, or other suitable instrument.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, comprises determining a sector of the menu, which sector determines the item, by the information relating to the interacted location on the surface of the apparatus. The interaction location information defines the menu sector indicating the targeted item.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, further comprises expanding the sector determining the item and neighbouring sectors of the menu, and tapering other sectors of the menu on the surface of the apparatus. After the user has "activated" the target item (the interacted sector) by touching, the interacted sector and possibly its neighbouring sectors expand and the other sectors shrink in the menu.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, further comprises receiving selection information relating to the item determined by the expanded sector. The user selects the "activated" item again by touching by means of e.g. a finger, hand, stylus, ordinary pen, stick, or by raising a finger from the touch surface.
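One way to realise the expansion and tapering described above is to reweight the angular widths of the sectors so that the targeted sector and its neighbours grow while the rest shrink, with the widths still summing to a full circle. The following Python sketch uses an assumed expansion factor; the patent does not specify exact values.

```python
def sector_widths(n_items, target, expand=2.0, neighbours=1):
    """Return per-sector angular widths (degrees) after dynamic zooming.
    The targeted sector and `neighbours` sectors on each side of it get
    `expand` times the weight of the remaining, tapered sectors."""
    weights = []
    for i in range(n_items):
        # Circular distance from the targeted sector.
        dist = min((i - target) % n_items, (target - i) % n_items)
        weights.append(expand if dist <= neighbours else 1.0)
    total = sum(weights)
    return [360.0 * w / total for w in weights]

widths = sector_widths(8, target=3)  # sectors 2, 3, 4 are widened
```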
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, further comprises performing a function relating to the selected item determined by the expanded sector. So, when the user has selected the item, the function, which the selected item indicates, is executed in the apparatus.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, further comprises providing an audio feedback, tactile feedback, and/or visual feedback relating to the function, which relates to the selected item determined by the expanded sector. The audio feedback to the user is provided e.g. by the loudspeaker of the apparatus, headphones, or a loudspeaker system connected to the apparatus. The visual feedback, in turn, is provided e.g. through the visual display of the apparatus or a display connected to the apparatus. The tactile feedback indicating the made selection is established e.g. through the apparatus or another device capable of providing a tactile interaction, which is connected to the apparatus. The connection between the apparatus and another device establishing the feedback can be e.g. a cable connection, a wireless connection, or a combination of cable and wireless connections.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, wherein the audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus and/or the audio feedback relating to the function, which relates to the selected item determined by the expanded sector, are provided by using a three dimensional sound, which indicates the location of the item in the menu of the surface of the apparatus. The three dimensional sound feedback through the headphones or loudspeaker system is established so that the targeted or selected items are reproduced from the correct directions of the targeted or selected items in a three dimensional audio space. It is, of course, possible to use mono or stereo sound instead of the three dimensional sound in order to provide the feedback to the user.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, wherein the audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus and/or the audio feedback relating to the function, which relates to the selected item determined by the expanded sector, are provided by speech. So, when the activation or selection of the item has been established, information describing the activated item and/or selected item, or mere selection information, is synthesised into speech or other sounds such as an earcon, spearcon, auditory icon, and mixed speech, and reproduced to the user. On the other hand, it is possible to play recorded samples instead of the synthesis.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, wherein items are located around the circular menu in an alphabetical order or numerical order. Item locations in the circular menu can be defined by the alphabetical or numerical order, where menu items are located around the circular menu depending on their first (second, third, etc.) letter. Thus, the menu items are always found at prior known locations. The items are not necessarily in a strict alphabetical or numerical order, wherein the first letter or any other letter or a number defines the location of the item on the surface; instead, they can be placed at certain locations on the surface depending on the first or any other letter or number.
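As an illustration of such a static alphabetical layout, each letter can be assigned a fixed azimuth around the circle, so that an item is always found at a known direction determined by its first (or second, third, etc.) letter. The mapping below is a sketch under assumed conventions, not the patent's exact layout.

```python
import string

def letter_azimuth(letter):
    """Fixed azimuth (degrees, 0 = front, increasing clockwise) for each
    letter of a static alphabetical circular menu; an assumed convention."""
    index = string.ascii_lowercase.index(letter.lower())
    return index * 360.0 / 26

def item_azimuth(name, depth=0):
    """Place an item by its first (depth=0), second (depth=1), ... letter."""
    return letter_azimuth(name[depth])

front = item_azimuth("Aaron")           # 'A' items sit straight ahead
behind = item_azimuth("Amber", depth=1) # placed by its second letter 'm'
```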
According to an embodiment of the invention an apparatus is configured to obtain information relating to an interacted location on a surface of the apparatus, obtain an item corresponding to the information relating to the interacted location on the surface of the apparatus, and provide an audio feedback relating to the item corre- sponding to the information relating to the interacted location on the surface of the apparatus.
According to an embodiment of the invention the apparatus, which is disclosed in the previous embodiment, is configured to provide, by the surface of the apparatus, a circular menu comprising sectors, each of which indicates one item.
According to an embodiment of the invention the apparatus, which is disclosed in any of the previous embodiments, wherein a centre of the circular menu and points of the sectors are located in a centre of the surface of the apparatus.
According to an embodiment of the invention the apparatus, which is disclosed in any of the previous embodiments, is further configured to receive the information relating to the interacted location of the surface of the apparatus by a touch surface illustrating the circular menu comprising the sectors.
According to an embodiment of the invention the apparatus, which is disclosed in any of the previous embodiments, is configured to determine a sector of the menu, which sector determines the item, by the information relating to the interacted location on the surface of the apparatus.
According to an embodiment of the invention the apparatus, which is disclosed in any of the previous embodiments, is further configured to expand the sector determining the item and neighbouring sectors of the menu, and taper other sectors of the menu on the surface of the apparatus.
According to an embodiment of the invention the apparatus, which is disclosed in any of the previous embodiments, is further configured to receive selection information relating to the item determined by the expanded sector.
According to an embodiment of the invention the apparatus, which is disclosed in any of the previous embodiments, is further configured to perform a function relating to the selected item determined by the expanded sector.
According to an embodiment of the invention the apparatus, which is disclosed in any of the previous embodiments, is further configured to provide an audio feedback, tactile feedback, and/or visual feedback relating to the function, which relates to the selected item determined by the expanded sector.
According to an embodiment of the invention the apparatus, which is disclosed in any of the previous embodiments, wherein the audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus and/or the audio feedback relating to the function, which relates to the selected item determined by the expanded sector, are provided by using a three dimensional sound, which indicates the location of the item in the menu of the surface of the apparatus.
According to an embodiment of the invention the apparatus, which is disclosed in any of the previous embodiments, wherein the audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus and/or the audio feedback relating to the function, which relates to the selected item determined by the expanded sector, are provided by speech.
According to an embodiment of the invention the apparatus, which is disclosed in any of the previous embodiments, wherein items are located around the circular menu in an alphabetical order or numerical order.
The method according to embodiments of the invention, which utilises a touch surface input with auditory menus, allows a reasonably fast and accurate way to navigate circular auditory menus with a large number of menu items. When a user is browsing the menu items, they are read out loud and the sound is reproduced from the correct direction. The fast browsing is enabled with a reactive interruptible audio design, and the accuracy in selection is enhanced with the dynamic movement of the menu items and the expansion of the selection area.
The method according to embodiments of the invention can be used to handle all the basic controls of a modern mobile phone or a music player which contains a touch surface. It is also possible to construct a small multi-functional device consisting only of a touch surface without a screen. Such robust devices without a visual display can be inexpensive and have low energy consumption, but still offer the same functionalities as similar devices with a visual screen.
BRIEF DESCRIPTION OF THE DRAWINGS
Next, the aspects of the invention will be described in greater detail with reference to exemplary embodiments in accordance with the accompanying drawings, of which
Figure 1 illustrates an exemplary view of an arrangement comprising a mobile device and an auditory user interface according to an advantageous embodiment of the invention,
Figure 2 illustrates an exemplary view of an arrangement comprising a mobile device and a remote controlled device according to an advantageous embodiment of the invention,
Figure 3 illustrates an exemplary flowchart of a method according to an advantageous embodiment of the invention,
Figure 4 illustrates exemplary views of auditory menus according to an advantageous embodiment of the invention,
Figure 5 illustrates further exemplary views of auditory menus according to an advantageous embodiment of the invention,
Figure 6 illustrates an exemplary view of a method according to an advantageous embodiment of the invention,
Figure 7 illustrates an exemplary view of an apparatus according to an advantageous embodiment of the invention, and
Figure 8 illustrates an exemplary view of a remote device to be controlled by a method according to an advantageous embodiment of the invention.
DETAILED DESCRIPTION
Figure 1 illustrates an arrangement 100, wherein a mobile device 110, e.g. a mobile station or PDA, comprises a user interface 120 such as a touch surface or touch screen implemented into the mobile device 110 for receiving control commands from a user (not shown in the figure). The processor of the mobile device runs a program application, e.g. an auditory user interface application, in the mobile device 110 in order to provide the auditory user interface, which comprises one or more auditory items 130a-130e, to the user. The auditory user interface enables the mobile device 110 to comprise only a touch surface, whereupon it does not have a display, or possibly a touch screen including the display, but no keyboard.
The mobile device 110 provides an auditory space around the user by means of a mobile device loudspeaker (not shown), headphones 140 (or one earphone), which have a wired or wireless connection 150 to the mobile device 110, or a loudspeaker system, wherein at least one loudspeaker (not shown) is established around the user. The loudspeaker system is also connected to the mobile device 110 through a wireless or wired connection.
If the user, who has the mobile device 110 available, decides to select an auditory object 130a illustrating an icon, which provides access to e.g. a phonebook, he/she touches the touch surface 120 of the mobile device 110 e.g. with his/her finger, in this case on the left side of the touch surface 120, where the auditory icon 130a is located. Since the user is unable to see the icon 130a, the touching is based on the user's estimation. Sounds aid in finding the icon and help to remember its location. On the other hand, if the mobile device 110 comprises the display, the user has better knowledge of the location of the icon.
The user interface application provides an auditory feedback through the mobile device loudspeaker, headphones 140, one earphone, or loudspeaker system when the user targets the desired auditory icon 130a. Moreover, it is possible to provide the feedback by using a three dimensional sound, whereupon the feedback comes from the direction in which the targeted (activated) icon is located. Naturally, it is possible to use mono or stereo sound instead of the three dimensional sound for providing the feedback to the user. The feedback notifies the user that he/she can access the phonebook, which the targeted icon 130a indicates, by selecting the icon 130a.
Then, the user raises his/her finger, which rests on the touch surface 120 after the icon activation, presses a button 160 or other suitable key of the mobile device, or provides the selection in some other way and by other means in order to access the phonebook. The application provides another audio feedback, which indicates that the selection is made, by means of e.g. a fast replay. After that the user can browse the phonebook for selecting the name of the person to whom he/she wants to call, and make a call by pointing at some auditory objects 130a-130e, which enable calling, in the auditory menu.
Similarly, the user can e.g. send an SMS message, browse a playlist, control playback, and surf radio stations.
Figure 2 represents an arrangement 200, wherein a mobile device 210, which comprises a touch surface or touch screen 220 in order to receive control commands from a user 230, communicates with a device 240 such as a desktop computer having a display 250, a keyboard 260, and loudspeakers 270a, 270b by using a wireless Bluetooth or infrared connection 280. The computer 240 establishes an auditory user interface, which has auditory items such as auditory menus and auditory icons, around the user 230 by means of the loudspeakers 270a, 270b. Also, the computer 240 can run a graphical user interface comprising e.g. graphical menus and graphical icons displayed on the display 250.
If the user 230, who has the mobile device 210 capable of communicating with the computer 240 wirelessly, wants to control a jukebox application run by the computer 240 in order to select the next song to be played from a song list, which includes hundreds of songs, he/she points at the touch surface 220 in order to target a song icon indicating the desired song in the auditory user interface of the jukebox application. For example, if the desired song icon is located in the top left corner of the display 250 and/or the front left corner of the auditory user interface, the user 230 touches, e.g. with his/her finger(s), the top left corner of the touch surface 220. By pointing at the song icon the user 230 activates it, and the activation is indicated to the user 230 audibly by reproducing a sound through the loudspeakers 270a, 270b.
When the song icon has been activated, the user 230 lifts his/her finger from the touch surface 220, where the finger rests after the activation, or presses a button 290 belonging to the mobile device 210 in order to play (select) the desired song. Then, the loudspeakers 270a, 270b reproduce a sound indicating the selection, and the desired song when it is next on the play list.
The mobile device 210 comprising the touch surface 220 for enabling a song selection can also be utilised as a remote controller of a television (not shown), which visually illustrates a circular menu on the display of the television. The mobile device 210 can be used for e.g. channel selection or volume level adjustment.
Naturally, it is possible that a computer program, e.g. the above-mentioned jukebox application or a game application, which the computer 240 runs, is displayed on the display 220 of the mobile device 210 and sounds relating to the application are reproduced through the loudspeaker of the mobile device (not shown) or headphones (not shown) connected to the mobile device 210. The connection between the mobile device 210 and the headphones is either wired or wireless. Moreover, it is possible that the jukebox or game application is displayed on the computer display 250 and the sounds are reproduced through the mobile device loudspeaker or headphones connected to the mobile device 210.
Figure 3 discloses, by means of an example only, a flow chart describing a method 300 according to one embodiment of the invention.
During the method start-up in step 310, a mobile device and/or an application, such as a user interface executing the method, is turned on and the necessary stages, such as setting up a connection to e.g. external headphones and initialising the different parameters of an auditory user interface, are carried out.
Once the start-up stage has been completed, a circular menu is presented to a user of the mobile device by means of the touch surface of the mobile device in step 320. The circular menu comprising items can be displayed to the user visually through a mobile device display, e.g. a touch screen, and/or audibly through a circular auditory menu comprising auditory items provided by the mobile device and the headphones. When the mobile device user touches a certain location on the touch surface (touch screen) of the mobile device with e.g. his/her finger in order to activate a menu item from the circular menu, information on the interacted location, which comprises e.g. the x and y coordinates of the finger touch or information relating to the interacted sector of the circular menu, is obtained resistively, capacitively, or by any other means in step 330.
In step 340 the menu item is obtained on the grounds of the obtained interacted location information, so that the menu item is defined either directly by means of the interacted sector information or by means of the coordinate information specifying the sector, which in turn defines the menu item.
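A minimal sketch of how step 340 could map the obtained coordinates to a sector, and thus to a menu item, is given below; the coordinate convention (y growing upward, sector 0 straight "up", indices growing clockwise), the safe centre area, and the function name are illustrative assumptions, not part of the claimed method:

```python
import math

def sector_for_touch(x, y, cx, cy, num_items, safe_radius=0.1):
    """Map a touch point (x, y) to the index of one of num_items
    equally sized sectors of a circular menu centred at (cx, cy).

    A touch inside the safe centre area returns None, so the finger
    can rest or be lifted there without activating any item.
    """
    dx, dy = x - cx, y - cy
    if math.hypot(dx, dy) < safe_radius:
        return None  # safe area in the middle of the surface
    # Clockwise angle from the positive y axis, normalised to [0, 2*pi).
    angle = math.atan2(dx, dy) % (2 * math.pi)
    return int(angle // (2 * math.pi / num_items))
```

With e.g. 26 alphabetic items, a touch straight above the centre would activate sector 0 ("A"), and a touch straight to the right would fall in the seventh sector.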
Next, during step 350 the menu item activation is indicated to the user by an audio feedback relating to the activated menu item through the headphones. The feedback can be established by using a three dimensional sound so that the feedback relating to the activated menu item is reproduced from the correct direction of the activated item. Mono or stereo sound can also be used instead of the three dimensional sound in order to provide the feedback to the user.
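When only stereo reproduction is available instead of full three dimensional rendering, one possible approximation of the directional feedback is a constant-power pan derived from the item's direction. The angle convention (0 radians straight ahead, clockwise positive) and the function name are assumptions for illustration:

```python
import math

def stereo_gains(item_angle):
    """Constant-power stereo gains for an auditory item located at
    item_angle radians clockwise from straight ahead.

    Returns (left, right) amplitude gains whose squares sum to one,
    so the perceived loudness stays constant across directions.
    """
    # Horizontal position of the item, normalised to [0, 1]:
    # 0 = hard left, 0.5 = centre, 1 = hard right.
    pan = (math.sin(item_angle) + 1.0) / 2.0
    left = math.cos(pan * math.pi / 2.0)
    right = math.sin(pan * math.pi / 2.0)
    return left, right
```

An item straight ahead thus gets equal gains in both channels, while an item to the right is reproduced almost entirely through the right loudspeaker or earphone.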
When the menu item activation is established, the information describing the activated item is synthesised into speech or other sounds and reproduced to the user. Certainly, recorded samples can be used instead of the synthesised speech. In addition, during step 350 the sector determining the activated item and its neighbouring sectors expand, and the other sectors of the circular menu taper, on the surface of the apparatus and in the auditory space around the user, helping the user to target his/her selection.
In step 360, it is determined whether the mobile device receives, through the touch surface, selection information relating to the activated item. The selection information is indicated e.g. by lifting the finger from the touch surface of the mobile device.
If the user does not select the activated item, the method returns to step 320.
Once the selection is established, the mobile device performs the function, which the activated menu item indicates, in step 370. Such a function can be e.g. making a call, placing the call on hold, or disconnecting the call. In step 380 the performed function, or the function to be performed, is indicated to the user by an audio, visual, and/or tactile feedback, which relates to the selected menu item. This step is usually provided together with step 370.
Finally, when the mobile device controlling is successfully completed, the control method is ended in step 390.
Optionally, when the mobile device is used to control a remote device such as a stereo set or a computer having loudspeakers, the method can be executed so that the mobile device obtains the interacted location information and transfers it to e.g. the stereo set, which produces an auditory user interface around the user, obtains the item on the grounds of the transferred interacted location information, and provides the audio feedback relating to the activated item to the user of the mobile device.
It is also possible that the mobile device obtains the item according to step 340 of the previous embodiment and transfers the item information to the stereo set so that it can reproduce the audio feedback indicating the activated item.
Regardless of who obtains the item information, the mobile device determines whether the selection of the activated item is executed and, when it determines that the user selects the activated item, it transfers the selection information to the stereo set, which performs the function indicated by the activated and selected item. The stereo set also provides the audio feedback relating to the selected item to the user.
The above described control methods utilise an auditory user interface. The user interface comprises the auditory menu containing e.g. a dial menu, which enables dialling, listening to, and removing selected numbers, an SMS menu, which contains all letters and a "space" for writing and an icon for sending a written message, and a phonebook menu, which comprises an alphabetically organised phonebook. The phonebook can include multiple layers, e.g. the alphabet and, for each letter, a submenu with names.
Menu browsing happens by e.g. a circular finger sweep on a touch surface, as mentioned earlier, amongst the menu items, which are spread evenly around the surrounding circular space. Depending on the number of items, the space between them is dynamically changed.
Figure 4 discloses a circular menu 410 and a square menu 420 according to an embodiment of the invention, wherein the menu centres and sector points are located in the centre of the surface. The circular menu 410 and the square menu 420 comprise similar alphabetic character icons and a "space" icon. The lower menus 410, 420 represent how the touch surfaces are divided into sectors having an equal point angle and how each sector comprises one icon. This figure shows one example of how the positions of the alphabetic characters can be implemented around the user in the static auditory menu.
The user can access any item directly by placing the finger on the touch surface and he/she can continue browsing with a circular finger sweep. Selection is made by removing the finger from the surface. The centre of the touch surface is a safe area from where the finger can be lifted without making a selection. Furthermore, other special actions can be assigned to the centre, corners, or any other specific location of the touch surface.
The menus 410, 420 can utilise the static placement of the menu items, which suits three dimensional menus well, since there the items are always at known locations. Thus, each item can be accessed directly by touching a known location, and the auditory feedback of the targeted item reveals whether the right item was activated; if not, the static order of the items reveals where the desired item will be found, since item locations are easy to remember with longer use of the menu.
The menu items are separate items that monitor whether an "interaction" is in their defined sector. Targeted items send information to the other items in order to manage the positions of the other items, fade their sounds, and adjust their sector sizes.
Figure 5 illustrates a circular menu 510 and a square menu 520, wherein sectors are dynamically adjusted for enhancing the user's pointing accuracy. The dynamic zoom of the target sector reduces undesired jumping between items and facilitates the item selection with a bigger number of items.
So, the user touches a touch surface with his/her finger in order to activate an icon, which indicates the letter R, and when the apparatus (application) determines an interaction it expands the activated target sector R in both directions, enabling stable browsing and selecting. At the same time the neighbouring sectors S, T, U, Q, P, and O also expand and the other sectors, in turn, regroup so that those sectors taper. Also in this figure the lower menus 510, 520 represent how part of the sectors expands and the other sectors regroup.
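The dynamic zoom described above can be sketched as follows; the zoom factor, the number of expanded neighbours on each side, and the uniform tapering of the remaining sectors are illustrative choices, not values prescribed by the embodiment:

```python
import math

def adjusted_sector_widths(num_items, active, zoom=2.0, neighbours=3):
    """Return per-sector angular widths (in radians) after expanding
    the active sector and `neighbours` sectors on each side of it,
    while tapering the remaining sectors so the widths still cover
    the full circle.
    """
    base = 2 * math.pi / num_items
    widths = [base] * num_items
    expanded = {(active + d) % num_items
                for d in range(-neighbours, neighbours + 1)}
    for i in expanded:
        widths[i] = base * zoom  # widen the target and its neighbours
    # Shrink the other sectors uniformly so the total stays at 2*pi.
    excess = sum(widths) - 2 * math.pi
    others = [i for i in range(num_items) if i not in expanded]
    for i in others:
        widths[i] -= excess / len(others)
    return widths
```

With a 26-letter menu and the letter R (index 17) active, the sectors O to U would double in width while the rest taper, in the spirit of figure 5.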
Even if figures 5 and 6 depict that the centre of the menu and the points of the menu sectors are located in the centre of the touch surface, the centre of the menu, regardless of whether the menu is circular, elliptical, or square, can be placed freely, provided that the menu fits completely on the touch surface. Thus, the user can put his/her finger at any location on the touch surface, which can be as large as a wall or as small as a mobile phone screen, and that touch location is regarded as the centre of the menu. The user can then touch with his/her finger, or sweep the finger in any direction from the newly defined centre, and access menu items.
However, defining the centre of the menu to the centre of the screen helps the user to access predefined item locations.
While only sectors are mentioned above, the touch surface can also comprise a touch area in the middle of the touch surface and/or touchable areas around the sectors.
Figure 6 discloses a further enhancement of auditory menu layouts in order to achieve faster and better usability. In this embodiment the start place is defined to be always the first name in alphabetical order.
Therefore, when a user touches the touch surface (touch screen), e.g. on the letter D, he/she always knows that names starting with D will be heard when browsing clockwise, and with counter clockwise browsing the last name starting with C is found.
The advanced spreading method can be adapted also to menus with multiple layers and to small menus with only a few menu items. An example relating to browsing a small contact list is depicted in figure 6. A first menu level containing the alphabet can be enhanced with the advanced spreading to gain easier access to the menu items. In a second menu level 610, names can be positioned so that they are always found according to a second letter. For example, Aaron is always positioned in front and Amber behind in an auditory menu. When the user touches the screen, the closest menu item becomes active and the rest are spread evenly around it. If the menu happens to have only four items, the series of four can be repeated when the user continues browsing the menu, as presented in the centre menu 620. The user can always return to the absolute positioning of the menu items by moving the finger to the centre of the surface (screen), as the menu 630 on the right shows.
When accessing hundreds of menu items (e.g. when browsing music or phonebook contacts), names starting with the same letter are always found from the direction determined by that letter. The method also works with a few menu items, where the menu items are again located depending on their first letter. So, when the user starts browsing the menu, the menu item closest to the finger becomes active.
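This first-letter placement can be sketched as follows: each name is anchored to the sector of its first letter on a 26-sector alphabetic circle, and the item whose sector is angularly closest to the touched sector becomes active. The function and helper names are illustrative assumptions:

```python
def nearest_item_by_letter(names, touched_sector, num_sectors=26):
    """Return the name whose first-letter sector is angularly closest
    to the touched sector.

    Names starting with the same letter share a sector, so they are
    always found from the direction determined by that letter.
    """
    def letter_sector(name):
        # Sector 0 = "A", sector 25 = "Z".
        return ord(name[0].upper()) - ord('A')

    def circular_distance(a, b):
        # Shortest distance around the circle of sectors.
        d = abs(a - b) % num_sectors
        return min(d, num_sectors - d)

    return min(names,
               key=lambda n: circular_distance(letter_sector(n),
                                               touched_sector))
```

For example, in the small contact list of figure 6, touching near the sector of the letter S would activate a name starting with R or T, whichever lies closer around the circle.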
Figure 7 discloses one example of a mobile device 700 according to an embodiment of the invention.
The mobile device 700 comprises a processor 710 for performing instructions and handling data, a memory unit 720 for storing data such as instructions and application data, and a user interface 730, which comprises at least a touch surface or touch screen. The user interface 730 can also comprise e.g. a single button, a keyboard, or other selection means. In addition, the mobile device 700 comprises data transfer means 740 for transmitting and receiving data and a loudspeaker 750. The mobile device can also comprise a display 760 for providing a visual or tactile feedback, but this is not necessary.
The memory 720 stores at least an auditory user interface application 722, an application 724 for determining interacted location data, and a synthesizer application 726. The implemented touch surface 730 obtains the interacted location data, which the processor 710 manipulates according to the instructions of the corresponding application 724, and the synthesizer 726 converts the obtained data from text format to speech, which is reproduced through the loudspeaker 750 e.g. by using the three dimensional sound.
Figure 8, for one, discloses a device 800, which is controlled over an air interface by a mobile device capable of receiving control inputs through a touch surface or a touch screen. Such a device 800 can be e.g. a mobile station, computer, laptop, DVD recorder, personal computer, or stereo set. The device 800 comprises a processor 810 for performing instructions and handling data, a memory 820 for storing data such as instructions and application data, a user interface 830 comprising e.g. a touchpad, a keyboard, or one or more buttons, at least data receiving means 840 for receiving data, a loudspeaker 850, and optionally a display 860. Furthermore, the device 800 can comprise data transmitting means 842 for sending data to an external loudspeaker (not shown).
The memory of the device 800 includes e.g. at least an auditory user interface application 822, an application 824 for manipulating interacted location data or item data, and a synthesizer application 826. The processor 810 handles the received interacted location data or item data according to the instructions of the corresponding application 824, and the synthesizer 826 converts the obtained data from text format to speech, which is provided through the loudspeaker 850 e.g. by using the three dimensional sound.
The invention has now been explained above with reference to the aforesaid embodiments, and several advantages of the invention have been demonstrated. It is clear that the invention is not restricted only to these embodiments, but comprises all possible embodiments within the spirit and scope of the inventive thought and the following patent claims.

Claims

1. A method, which comprises obtaining (330) information relating to an interacted location on a surface of an apparatus, obtaining (340) an item corresponding to the information relating to the interacted location on the surface of the apparatus, and providing (350) an audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus.
2. The method according to claim 1, wherein the method further comprises providing (320) a circular menu comprising sectors, wherein each sector indicates one item, by the surface of the apparatus.
3. The method according to claim 1 or 2, wherein a centre of the circular menu and points of the sectors are located in a centre of the surface of the apparatus.
4. The method according to any of the previous claims, wherein the method further comprises receiving (330) the information relating to the interacted location of the surface of the apparatus by a touch surface illustrating the circular menu comprising the sectors.
5. The method according to any of the previous claims, wherein the method comprises determining (340) a sector of the menu, which sector determines the item, by the information relating to the interacted location on the surface of the apparatus.
6. The method according to any of the previous claims, wherein the method further comprises expanding the sector determining the item and neighbouring sectors of the menu, and tapering other sectors of the menu on the surface of the apparatus.
7. The method according to any of the previous claims, wherein the method further comprises receiving (360) selection information relating to the item determined by the expanded sector.
8. The method according to any of the previous claims, wherein the method further comprises performing (370) a function relating to the selected item determined by the expanded sector.
9. The method according to any of the previous claims, wherein the method further comprises providing (380) an audio feedback, tactile feedback, and/or visual feedback relating to the function, which relates to the selected item determined by the expanded sector.
10. The method according to any of the previous claims, wherein the audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus and/or the audio feedback relating to the function, which relates to the selected item determined by the expanded sector, are provided by using a three dimensional sound, which indicates the location of the item in the menu of the surface of the apparatus.
11. The method according to any of the previous claims, wherein the audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus and/or the audio feedback relating to the function, which relates to the selected item determined by the expanded sector, are provided by speech.
12. The method according to any of the previous claims, wherein the items are located around the circular menu in an alphabetical order or numerical order.
13. An apparatus configured to perform the method according to any of claims 1-11.
14. A computer program product configured to perform the method according to any of claims 1-11, when the computer program product is run in a computer.
15. A carrier medium comprising a computer program product according to claim 13.
PCT/FI2010/050252 2009-04-06 2010-03-30 Method for controlling an apparatus WO2010116028A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FI20095376 2009-04-06
FI20095376A FI20095376A (en) 2009-04-06 2009-04-06 A method for controlling the device

Publications (2)

Publication Number Publication Date
WO2010116028A2 true WO2010116028A2 (en) 2010-10-14
WO2010116028A3 WO2010116028A3 (en) 2010-12-16

Family

ID=40590258

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2010/050252 WO2010116028A2 (en) 2009-04-06 2010-03-30 Method for controlling an apparatus

Country Status (2)

Country Link
FI (1) FI20095376A (en)
WO (1) WO2010116028A2 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013068793A1 (en) * 2011-11-11 2013-05-16 Nokia Corporation A method, apparatus, computer program and user interface
WO2013093566A1 (en) * 2011-12-22 2013-06-27 Nokia Corporation An audio-visual interface for apparatus
EP2613522A1 (en) * 2012-01-06 2013-07-10 Samsung Electronics Co., Ltd. Method and apparatus for on-screen channel selection
WO2014100839A1 (en) * 2012-12-19 2014-06-26 Willem Morkel Van Der Westhuizen User control of the trade-off between rate of navigation and ease of acquisition in a graphical user interface
EP2761419A1 (en) * 2011-09-30 2014-08-06 Van Der Westhuizen, Willem Morkel Method for human-computer interaction on a graphical user interface (gui)
CN106814966A (en) * 2017-01-24 2017-06-09 腾讯科技(深圳)有限公司 A kind of method and device of control object
CN108139811A (en) * 2015-10-15 2018-06-08 三星电子株式会社 Record performs the method for screen and the electronic equipment of processing this method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6983251B1 (en) * 1999-02-15 2006-01-03 Sharp Kabushiki Kaisha Information selection apparatus selecting desired information from plurality of audio information by mainly using audio
FI118100B (en) * 2005-02-07 2007-06-29 Ilpo Kojo Selector
US8098856B2 (en) * 2006-06-22 2012-01-17 Sony Ericsson Mobile Communications Ab Wireless communications devices with three dimensional audio systems

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2761419A1 (en) * 2011-09-30 2014-08-06 Van Der Westhuizen, Willem Morkel Method for human-computer interaction on a graphical user interface (gui)
WO2013068793A1 (en) * 2011-11-11 2013-05-16 Nokia Corporation A method, apparatus, computer program and user interface
WO2013093566A1 (en) * 2011-12-22 2013-06-27 Nokia Corporation An audio-visual interface for apparatus
US9632744B2 (en) 2011-12-22 2017-04-25 Nokia Technologies Oy Audio-visual interface for apparatus
EP2613522A1 (en) * 2012-01-06 2013-07-10 Samsung Electronics Co., Ltd. Method and apparatus for on-screen channel selection
WO2014100839A1 (en) * 2012-12-19 2014-06-26 Willem Morkel Van Der Westhuizen User control of the trade-off between rate of navigation and ease of acquisition in a graphical user interface
CN105190508A (en) * 2012-12-19 2015-12-23 瑞艾利缇盖特(Pty)有限公司 User control of the trade-off between rate of navigation and ease of acquisition in a graphical user interface
US10732813B2 (en) 2012-12-19 2020-08-04 Flow Labs, Inc. User control of the trade-off between rate of navigation and ease of acquisition in a graphical user interface
CN108139811A (en) * 2015-10-15 2018-06-08 三星电子株式会社 Record performs the method for screen and the electronic equipment of processing this method
CN106814966A (en) * 2017-01-24 2017-06-09 腾讯科技(深圳)有限公司 A kind of method and device of control object

Also Published As

Publication number Publication date
FI20095376A (en) 2010-10-07
FI20095376A0 (en) 2009-04-06
WO2010116028A3 (en) 2010-12-16

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10715911

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10715911

Country of ref document: EP

Kind code of ref document: A2