
US20140026101A1 - Accessible Menu Navigation Techniques For Electronic Devices - Google Patents

Accessible Menu Navigation Techniques For Electronic Devices

Info

Publication number
US20140026101A1
Authority
US
United States
Prior art keywords
menu
user
navigation
gesture
response
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/946,530
Inventor
Matthew Pallakoff
Harold E. Cohn
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nook Digital LLC
Original Assignee
Nook Digital LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nook Digital LLC filed Critical Nook Digital LLC
Priority to US13/946,530
Assigned to BARNESANDNOBLE.COM LLC reassignment BARNESANDNOBLE.COM LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: COHN, HAROLD E., PALLAKOFF, Matthew
Publication of US20140026101A1
Assigned to NOOK DIGITAL LLC reassignment NOOK DIGITAL LLC CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: BARNESANDNOBLE.COM LLC
Assigned to NOOK DIGITAL, LLC reassignment NOOK DIGITAL, LLC CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: NOOK DIGITAL LLC
Assigned to NOOK DIGITAL LLC reassignment NOOK DIGITAL LLC CORRECTIVE ASSIGNMENT TO REMOVE APPLICATION NUMBERS 13924129 AND 13924362 PREVIOUSLY RECORDED ON REEL 035187 FRAME 0469. ASSIGNOR(S) HEREBY CONFIRMS THE CHANGE OF NAME. Assignors: BARNESANDNOBLE.COM LLC
Assigned to NOOK DIGITAL, LLC reassignment NOOK DIGITAL, LLC CORRECTIVE ASSIGNMENT TO REMOVE APPLICATION NUMBERS 13924129 AND 13924362 PREVIOUSLY RECORDED ON REEL 035187 FRAME 0476. ASSIGNOR(S) HEREBY CONFIRMS THE CHANGE OF NAME. Assignors: NOOK DIGITAL LLC
Legal status: Abandoned

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0483Interaction with page-structured environments, e.g. book metaphor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • This disclosure relates to electronic display devices, and more particularly, to user interface (UI) techniques for interacting with computing devices.
  • Electronic display devices such as tablets, eReaders, mobile phones, smart phones, personal digital assistants (PDAs), and other such touch screen electronic display devices are commonly used for displaying consumable content.
  • the content may be, for example, an eBook, an online article or blog, images, a movie or video, or a map, just to name a few types.
  • Such display devices are also useful for displaying a user interface that allows a user to interact with an application running on the device.
  • the textual content and/or screen controls may be spoken aloud to the user.
  • the user interface may include, for example, one or more touch screen controls and/or one or more displayed labels that correspond to nearby hardware buttons.
  • the touch screen display may be backlit or not, and may be implemented for instance with an LED screen or an electrophoretic display.
  • Such devices may also include other touch sensitive surfaces, such as a track pad (e.g., capacitive or resistive touch sensor) or touch sensitive housing (e.g., acoustic sensor).
  • FIGS. 1 a - b illustrate an example electronic touch screen device having an accessible menu navigation mode configured in accordance with an embodiment of the present invention.
  • FIGS. 1 c - d illustrate example configuration screen shots of the user interface of the electronic touch screen device shown in FIGS. 1 a - b , configured in accordance with an embodiment of the present invention.
  • FIG. 2 a illustrates a block diagram of an electronic touch screen device configured in accordance with an embodiment of the present invention.
  • FIG. 2 b illustrates a block diagram of a communication system including the electronic touch screen device of FIG. 2 a , configured in accordance with an embodiment of the present invention.
  • FIG. 3 is a visual representation of an accessible navigation menu, configured in accordance with an embodiment of the present invention.
  • FIGS. 4 a - b show tables of two accessible menu navigation gesture sequences, in accordance with an embodiment of the present invention.
  • FIGS. 5 a - b collectively illustrate an example accessible menu navigation mode of an electronic touch screen device, in accordance with an embodiment of the present invention.
  • FIGS. 6 a - e collectively illustrate example menu navigation mode functions, in accordance with an embodiment of the present invention.
  • FIG. 7 illustrates a method for providing an accessible menu navigation mode in an electronic touch screen device, in accordance with an embodiment of the present invention.
  • the device may be a touch screen mobile device, or any other device with a touch sensitive surface that can detect user gestures.
  • the user can engage a manual reading mode by performing a manual reading mode activation gesture, wherein the manual mode allows the user to navigate through content, share content, adjust the reading rate, font, volume, or other device settings.
  • the user may navigate through a menu structure using menu navigation gestures and the menu and sub-menu options may be read aloud to the user as they are navigated through.
  • the main menu options may be navigated, for example, using vertical up or down swipe gestures, while sub-menu options within a given menu option may be navigated using horizontal swipe gestures.
  • a selection gesture may allow the user to enable or adjust various menu and sub-menu options, and various earcons or sound effects may guide the navigation process and/or confirm a menu or sub-menu selection in some embodiments.
  • the user may configure the navigation gestures and option selection gestures.
  • the menu options may be structured to allow a user to access content navigation options with upward swipe gestures and access device settings options using downward swipe gestures.
  • electronic display devices such as tablets, eReaders, and smart phones are commonly used for displaying user interfaces and consumable content.
  • the user of such devices can typically consume the displayed content with relative ease.
  • users who are unable to view or read text on the screen may wish to navigate through the device's menu screens and/or select one or more application options.
  • some electronic devices may aurally present textual content to a user, offer printed instructions on a screen protector (e.g., using a braille-embossed screen cover), or offer a hunt-and-peck approach for navigating through menu options.
  • an accessible menu navigation user interface as described herein may provide a more intuitive or otherwise positive user experience.
  • accessible menu navigation techniques are disclosed for use in electronic touch screen devices.
  • the techniques facilitate an accessible user interface that may be supported by multiple reading and navigation modes.
  • an accessible mode for an electronic device may read aloud the text of an eBook, an email message, or some other textual content that may be displayed on the device.
  • the accessible mode has an automatic and a manual mode, wherein the manual mode allows the user to actively navigate and/or adjust device settings.
  • the automatic mode, in some embodiments, does not allow manual navigation and settings adjustment, and a specific gesture or control feature activation may switch the device from automatic to manual mode.
  • the automatic reading mode facilitates an electronic device reading automatically and continuously from a predetermined point with a selected voice font, volume, and rate.
  • the automatic mode may also, for example, play an earcon or audio cue upon passing sentences, paragraphs, pages, chapters, or other boundaries.
  • in manual mode, however, the user may change the reading rate, font, volume, and other settings.
  • the entire display screen may be treated as a single button to facilitate transitioning from automatic into manual mode, or the mode transition may be performed using a physical switch or control feature. For example, a single tap on the display screen may transition the device from automatic mode to manual mode.
  • a menu navigation gesture may include, for example, a swipe gesture up or down on the device screen.
  • a swipe gesture may include a sweeping or dragging gesture across at least a portion of the touch sensitive surface whether directly contacting that surface or hovering over that surface (e.g., within a few centimeters or otherwise close enough to be detected by the touch sensitive surface).
  • the swipe gesture may be performed at a constant speed in one single direction, or may also be an accelerated flick gesture.
  • the gestures can be performed, for example, with the tip of a finger or a stylus, or any other suitable implement capable of providing a detectable swipe gesture.
  • any swipe gesture that is, for example, within a range of 45 degrees of the horizontal or vertical may be treated as a horizontal or vertical gesture.
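By way of illustration only (and not as a limitation of the disclosed techniques), the following Kotlin sketch shows one possible way to classify a swipe using the 45-degree tolerance just described; the function name, the minimum-distance threshold, and the coordinate convention are assumptions made for this example.

```kotlin
import kotlin.math.abs

// Hypothetical swipe classifier: any swipe within 45 degrees of an axis is
// treated as a swipe along that axis, per the tolerance described above.
enum class SwipeDirection { UP, DOWN, LEFT, RIGHT, NONE }

fun classifySwipe(
    startX: Float, startY: Float,
    endX: Float, endY: Float,
    minDistancePx: Float = 48f      // assumed threshold to ignore accidental contact
): SwipeDirection {
    val dx = endX - startX
    val dy = endY - startY          // screen coordinates: +y points down
    if (abs(dx) < minDistancePx && abs(dy) < minDistancePx) return SwipeDirection.NONE
    return if (abs(dx) >= abs(dy)) {            // within 45 degrees of horizontal
        if (dx > 0) SwipeDirection.RIGHT else SwipeDirection.LEFT
    } else {                                    // within 45 degrees of vertical
        if (dy > 0) SwipeDirection.DOWN else SwipeDirection.UP
    }
}

fun main() {
    // A swipe that is mostly upward but drifts slightly right is treated as UP.
    println(classifySwipe(100f, 400f, 130f, 150f))  // UP
}
```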
  • the accessible menu navigation mode may present an audio cue to the user (e.g., reading aloud “entering menu” or playing a distinctive sound effect), or the menu navigation mode may simply begin reading aloud the various menu options.
  • the user input gestures for the accessible menu navigation mode may be categorized into two types: navigation gestures and selection gestures. Navigation gestures include those that allow the user to navigate through the various menus and sub-menus, while selection gestures allow the user to select a menu or sub-menu option, adjust a settings option, or otherwise complete an action within the accessible menu navigation mode.
  • Upward and/or downward swipe gestures may be configured to navigate through various menu levels, and the user may scroll through various sub-menu options within a menu level using sideways swipe gestures (either left or right), in some embodiments. Selecting a given option within a menu or sub-menu may be performed with a selection gesture that could include a swipe gesture, releasing a contact that was being held down, a double-tap gesture, or some other uniquely identifiable selection gesture. As will be apparent, different gesture configurations performed with one or more contact points may be used for any of the navigation or selection gestures described herein.
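The following minimal Kotlin sketch illustrates one way the two gesture types could drive a two-level menu, assuming vertical swipes move among menu options, horizontal swipes move among the current option's sub-options, and a double tap selects. The class, its menu contents, and the speak() callback are placeholders invented for this example, not a prescribed implementation.

```kotlin
// Minimal sketch of a two-level accessible menu navigator, assuming the gesture
// scheme described above. Menu contents and speak() are placeholders.
data class MenuOption(val name: String, val subOptions: List<String> = emptyList())

class AccessibleMenuNavigator(
    private val options: List<MenuOption>,
    private val speak: (String) -> Unit = ::println    // stand-in for text-to-speech
) {
    private var menuIndex = 0
    private var subIndex = 0

    private fun step(index: Int, size: Int, forward: Boolean): Int =
        ((index + if (forward) 1 else -1) + size) % size

    // Vertical swipe: move to the next/previous menu option and announce it.
    fun onVerticalSwipe(down: Boolean) {
        menuIndex = step(menuIndex, options.size, down)
        subIndex = 0
        speak(options[menuIndex].name)
    }

    // Horizontal swipe: move among the current option's sub-options, if any.
    fun onHorizontalSwipe(right: Boolean) {
        val subs = options[menuIndex].subOptions
        if (subs.isEmpty()) { speak("No sub-options"); return }
        subIndex = step(subIndex, subs.size, right)
        speak(subs[subIndex])
    }

    // Selection gesture (e.g., double tap): commit the current option/sub-option.
    fun onSelect(): String {
        val subs = options[menuIndex].subOptions
        val chosen = if (subs.isEmpty()) options[menuIndex].name else subs[subIndex]
        speak("Selected $chosen")
        return chosen
    }
}
```

For instance, two downward swipes followed by a right swipe and a double tap would announce two menu options, then one sub-option, and then confirm the selection.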
  • the accessible menu navigation mode may read aloud each menu option and an earcon or sound effect may be played upon accepting or cancelling a menu option, entering a manual or automatic mode, passing a boundary such as a sentence, page, paragraph, or chapter, or upon performing a certain function.
  • An earcon may be, for example, a brief and distinctive sound or chime used to represent a specific action or event, convey other information, or prompt a user action.
  • the accessible menu navigation mode allows for sharing or bookmarking of selected content.
  • a user may set up sub-menus to define different communication means, such as Facebook®, email, etc., or to bookmark or highlight selected content.
  • the user may also define a group of friends and create a customized subgroup, such as, a reading club, church group, etc., and then forward their selection or bookmark to that particular group using one or more communication means.
  • if the user selects the bookmark option and there is no bookmark on the current page, a new bookmark is added to that page. If there is already a bookmark on the current page, however, selecting the bookmark option may delete the bookmark.
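A minimal Kotlin sketch of this toggle behavior, assuming a simple per-page bookmark set; the class and method names are hypothetical.

```kotlin
// Illustrative sketch of the bookmark toggle described above: selecting the
// bookmark option adds a bookmark to the current page if none exists, and
// removes the existing bookmark otherwise.
class BookmarkManager(private val speak: (String) -> Unit = ::println) {
    private val bookmarkedPages = mutableSetOf<Int>()

    fun onBookmarkOptionSelected(currentPage: Int) {
        if (bookmarkedPages.remove(currentPage)) {
            speak("Bookmark removed from page $currentPage")
        } else {
            bookmarkedPages.add(currentPage)
            speak("Bookmark added to page $currentPage")
        }
    }
}
```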
  • the accessible menu navigation UI can be similarly invoked within multiple diverse applications, for example, an eReader, Internet browser, picture viewer, file browser, or any other content containing multiple levels of data, menu options, or settings.
  • the accessible menu navigation UI may be invoked without conflicting with other global gestures that might also be used by the device's operating system.
  • the techniques described herein may be combined with drag-and-drop UI techniques, or other UI techniques to aid in navigating and organizing content. Numerous uniquely identifiable engagement schemes that exploit a touch sensitive surface can be used as will be appreciated in light of this disclosure.
  • the techniques disclosed herein may be used with any touch sensitive device (e.g., track pad, touch screen, or other touch sensitive surface, whether capacitive, resistive, acoustic, or other touch detecting technology), regardless of whether a user is physically contacting the device or using some sort of implement, such as a stylus.
  • for ease of reference, user input is sometimes referred to as contact or user contact; however, direct and/or proximate contact (e.g., hovering within a few centimeters of the touch sensitive surface) can be used. In other words, a user can operate the accessible menu navigation user interface without physically touching the touch sensitive device.
  • FIGS. 1 a - b illustrate an example electronic touch sensitive device having an accessible menu navigation user interface configured in accordance with an embodiment of the present invention.
  • the touch sensitive surface is a touch screen display.
  • the device could be, for example, a tablet such as the NOOK® tablet or eReader by Barnes & Noble.
  • the device may be any electronic device having a touch sensitive user interface for detecting direct touch or otherwise sufficiently proximate contact, and capability for displaying content to a user, such as a mobile phone or mobile computing device such as a laptop, a desktop computing system, a television, a smart display screen, or any other device having a touch sensitive display or a non-sensitive display screen that can be used in conjunction with a touch sensitive surface.
  • the claimed invention is not intended to be limited to any specific kind or type of electronic device or form factor.
  • the device comprises a housing that includes a number of hardware features such as a power button, control features, and a press-button (sometimes called a home button herein).
  • a user interface is also provided, which in this example embodiment includes a quick navigation menu having six main categories to choose from (Home, Library, Shop, Search, Light, and Settings) and a status bar that includes a number of icons (a night-light icon, a wireless network icon, and a book icon), a battery indicator, and a clock.
  • an accessible UI may aurally present to the user the various menu categories from which the user may select the desired menu with a touch screen gesture or by activating a control feature.
  • Some embodiments may have fewer or additional such UI features, or different UI features altogether, depending on the target application of the device. Any such general UI controls and features can be implemented using any suitable conventional or custom technology, as will be appreciated.
  • the hardware control features provided on the device housing in this example embodiment are configured as elongated press-bars and can be used, for example, to page forward (using the top press-bar) or to page backward (using the bottom press-bar), such as might be useful in an eReader application.
  • the power button can be used to turn the device on and off, and may be used in conjunction with a touch-based UI control feature that allows the user to confirm a given power transition action request (e.g., such as a slide bar or tap point graphic to turn power off).
  • the home button is a physical press-button that can be used as follows: when the device is awake and in use, pressing the button will present to the user (either aurally or visually) the quick navigation menu, which is a toolbar that provides quick access to various features of the device.
  • the home button may also be configured to cease an active function that is currently executing on the device (such as an accessible menu navigation mode), or close a configuration sub-menu that is currently open.
  • the button may further control other functionality if, for example, the user presses and holds the home button. For instance, such a press-and-hold function could engage a power conservation routine where the device is put into a sleep or otherwise lower power consumption mode.
  • the home button may be associated with and control different and unrelated actions: 1) present the quick navigation menu; 2) exit a configuration sub-menu; and 3) put the device to sleep.
  • the status bar may also include a book icon (upper left corner). In some cases, selecting the book icon may provide bibliographic information on the content or provide the main menu or table of contents for the book, movie, playlist, or other content.
  • an accessible menu navigation configuration sub-menu such as the one shown in FIG. 1 d
  • An accessible UI may, for example, aurally present to the user the various sub-menus and configuration options displayed in FIGS. 1 c - d , and the user may select the desired sub-menu or option with a touch screen selection gesture, by activating a control feature, by speaking a voice command, or through some other input means.
  • Other accessible selection options will be apparent in light of this disclosure. From this general sub-menu, the user can select any one of a number of options, including one designated Screen/UI in this specific example case.
  • Selecting this sub-menu item may cause the configuration sub-menu of FIG. 1 d to be presented, in accordance with an embodiment.
  • selecting the Screen/UI option may present the user with a number of additional sub-options, one of which may include a so-called “accessible menu navigation” option, which may then be selected by the user so as to cause the accessible menu navigation configuration sub-menu of FIG. 1 d to be presented.
  • the accessible menu navigation UI is hard-coded such that no configuration sub-menus are needed or otherwise provided (e.g., performing the accessible menu navigation gestures as described herein, with no user configuration needed). The degree of hard-coding versus user-configurability can vary from one embodiment to the next, and the claimed invention is not intended to be limited to any particular configuration scheme of any kind, as will be appreciated.
  • the various UI control features and sub-menus displayed to the user are implemented as touch screen controls in this example embodiment.
  • Such UI screen controls can be programmed or otherwise configured using any number of conventional or custom technologies.
  • the touch screen display translates a touch (direct or hovering, by a user's hand, a stylus, or any other suitable implement) in a given location into an electrical signal which is then received and processed by the device's underlying operating system (OS) and circuitry (processor, display controller, etc.).
  • the touch screen display may be configured to detect input based on a finger or stylus hovering over the touch sensitive surface (e.g., within 3 centimeters of the touch screen or otherwise sufficiently proximate to be detected by the touch sensing circuitry). Additional example details of the underlying OS and circuitry in accordance with some embodiments will be discussed in turn with reference to FIG. 2 a.
  • the touch sensitive surface can be any surface that is configured with touch detecting technologies, whether capacitive, resistive, acoustic, active-stylus, and/or other input detecting technology, including direct contact and/or proximate contact.
  • the screen display can be layered above input sensors, such as a capacitive sensor grid for passive touch-based input, such as with a finger or passive stylus contact in the case of a so-called in-plane switching (IPS) panel, or an electro-magnetic resonance (EMR) sensor grid for sensing a resonant circuit of a stylus.
  • the touch sensitive display can be configured with a purely capacitive sensor, while in other embodiments the touch screen display may be configured to provide a hybrid mode that allows for both capacitive input and EMR input, for example.
  • the touch sensitive surface is configured with only an active stylus sensor.
  • Numerous touch screen display configurations can be implemented using any number of known or proprietary screen based input detecting technologies.
  • a touch sensitive controller may be configured to selectively scan the touch sensitive surface and/or selectively report user inputs detected directly on or otherwise sufficiently proximate to (e.g., within a few centimeters, or otherwise sufficiently close so as to allow detection) the detection surface (or touch sensitive display, in this example case).
  • the user can then select the Screen/UI option.
  • the accessible menu navigation configuration sub-menu shown in FIG. 1 d can be provided to the user, in accordance with one such example embodiment.
  • the user can configure a number of features with respect to the accessible menu navigation mode, in this example case.
  • the configuration sub-menu includes a UI check box that when checked or otherwise selected by the user, effectively enables the accessible menu navigation mode (shown in the enabled state); unchecking the box disables the function.
  • other embodiments may have the accessible menu navigation mode always enabled, or enabled by a physical switch or button located on the device, for example.
  • the accessible menu navigation mode may associate a sound effect or earcon with certain menu actions and/or boundaries.
  • the user may enable or disable the earcon function, and in this particular embodiment, the user has enabled earcons.
  • the accessible menu navigation mode may also allow content sharing through one or more communication means such as Facebook®, email, etc.
  • the user may wish to disable content sharing in order to ensure that no external access to the user's content is possible, and the accessible menu navigation configuration sub-menu may allow the user to enable or disable this feature. In this particular embodiment, the user does not have content sharing enabled.
  • the menu and/or sub-menu option selection gesture could include a vertical or horizontal swipe gesture, releasing a contact point that was being held down, a double-tap gesture, or some other gesture.
  • the user may configure the selection gesture by picking from a drop-down menu, and the user has configured the selection gesture to be a double-tap gesture.
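As an illustrative aside, the configuration state described above (mode enabled, earcons enabled, content sharing disabled, double-tap selection gesture) might be captured in a small settings structure such as the following Kotlin sketch; the field names and gesture choices are assumptions, not a prescribed schema.

```kotlin
// Hypothetical representation of the configuration sub-menu of FIG. 1 d.
// Defaults mirror the example state described above.
enum class SelectionGesture { DOUBLE_TAP, VERTICAL_SWIPE, HORIZONTAL_SWIPE, RELEASE_HELD_CONTACT }

data class AccessibleMenuConfig(
    val menuNavigationEnabled: Boolean = true,
    val earconsEnabled: Boolean = true,
    val contentSharingEnabled: Boolean = false,
    val selectionGesture: SelectionGesture = SelectionGesture.DOUBLE_TAP
)
```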
  • a back button arrow UI control feature may be provisioned on the screen for any of the menus provided, so that the user can go back to the previous menu, if so desired.
  • a universal back screen gesture may be performed in order to return to the previous menu.
  • configuration settings provided by the user can be saved automatically (e.g., user input is saved as selections are made or otherwise provided).
  • a save button or other such UI feature can be provisioned, or a save gesture performed, which the user can engage as desired.
  • FIG. 1 d is presented merely as an example of how an accessible menu navigation mode may be configured by the user, and numerous other configurable or hard-codable aspects will be apparent in light of this disclosure.
  • Other embodiments may confirm an action or menu selection using earcons, sound effects, or animations, and such sound effects and/or animations may be used to provide clarity to the function being performed or otherwise enhance the user experience.
  • animations and sound effects may be user-configurable, while in other embodiments they are hard-coded.
  • FIG. 2 a illustrates a block diagram of an electronic touch screen device configured in accordance with an embodiment of the present invention.
  • this example device includes a processor, memory (e.g., RAM and/or ROM for processor workspace and storage), additional storage/memory (e.g., for content), a communications module, a touch screen, and an audio module.
  • a communications bus and interconnect is also provided to allow inter-device communication.
  • Other typical componentry and functionality not reflected in the block diagram will be apparent (e.g., battery, co-processor, etc).
  • the touch screen and underlying circuitry is capable of translating a user's contact (direct or proximate) with the touch screen into an electronic signal that can be manipulated or otherwise used to trigger a specific user interface action, such as those provided herein.
  • the principles provided herein equally apply to any such touch sensitive devices. For ease of description, examples are provided with touch screen technology.
  • the memory includes a number of modules stored therein that can be accessed and executed by the processor (and/or a co-processor).
  • the modules include an operating system (OS), a user interface (UI), and a power conservation routine (Power).
  • the modules can be implemented, for example, in any suitable programming language (e.g., C, C++, Objective-C, JavaScript, custom or proprietary instruction sets, etc.) and encoded on a machine readable medium that, when executed by the processor (and/or co-processors), carries out the functionality of the device, including a UI having an accessible menu navigation mode as variously described herein.
  • the computer readable medium may be, for example, a hard drive, compact disk, memory stick, server, or any suitable non-transitory computer/computing device memory that includes executable instructions, or a plurality or combination of such memories.
  • Other embodiments can be implemented, for instance, with gate-level logic or an application-specific integrated circuit (ASIC) or chip set or other such purpose-built logic, or a microcontroller having input/output capability (e.g., inputs for receiving user inputs and outputs for directing other components) and a number of embedded routines for carrying out the device functionality.
  • the functional modules can be implemented in hardware, software, firmware, or a combination thereof.
  • the processor can be any suitable processor (e.g., Texas Instruments OMAP4, dual-core ARM Cortex-A9, 1.5 GHz), and may include one or more co-processors or controllers to assist in device control.
  • the processor receives input from the user, including input from or otherwise derived from the power button and the home button.
  • the processor can also have a direct connection to a battery so that it can perform base level tasks even during sleep or low power modes.
  • the memory (e.g., for processor workspace and executable file storage) can be implemented with any suitable memory and size.
  • the storage (e.g., for storing consumable content and user files) can also be implemented with any suitable memory and size (e.g., 2 GBytes of flash memory).
  • the display can be implemented, for example, with a 7 to 9 inch 1920×1280 IPS LCD touch screen, or any other suitable display and touch screen interface technology.
  • the communications module can be, for instance, any suitable 802.11b/g/n WLAN chip or chip set, which allows for connection to a local network, and so that content can be exchanged between the device and a remote system (e.g., content provider or repository depending on the application of the device).
  • the device housing that contains all the various componentry measures about 7′′ to 9′′ high by about 5′′ to 6′′ wide by about 0.5′′ thick, and weighs about 7 to 8 ounces. Any number of suitable form factors can be used, depending on the target application (e.g., laptop, desktop, mobile phone, etc).
  • the device may be smaller, for example, for smartphone and tablet applications and larger for smart computer monitor and laptop and desktop computer applications.
  • the operating system (OS) module can be implemented with any suitable OS, but in some example embodiments is implemented with Google Android OS, Linux OS, Microsoft OS, or Apple OS. As will be appreciated in light of this disclosure, the techniques provided herein can be implemented on any such platforms.
  • the power management (Power) module can be configured as typically done, such as to automatically transition the device to a low power consumption or sleep mode after a period of non-use. A wake-up from that sleep mode can be achieved, for example, by a physical button press and/or a touch screen swipe or other action.
  • the user interface (UI) module can be, for example, based on touch screen technology and configured to provide the various example screen shots and use-case scenarios shown in the figures, in conjunction with the accessible menu navigation methodology of FIG. 7.
  • the audio module can be configured to speak or otherwise aurally present, for example, menu options, a selected eBook, or any other textual content, and/or to provide verbal and/or other sound-based cues and earcons to guide the accessible menu navigation process, as will be appreciated in light of this disclosure.
  • Numerous commercially available text-to-speech modules can be used, such as Verbose text-to-speech software by NCH Software.
  • although a touch screen display is provided in this example embodiment, other embodiments may include a non-touch display screen and a touch sensitive surface such as a track pad, or a touch sensitive housing configured with one or more acoustic sensors, etc.
  • FIG. 2 b illustrates a block diagram of a communication system configured in accordance with an embodiment of the present invention.
  • the system generally includes an electronic touch sensitive device (such as the one in FIG. 2 a ) that is capable of communicating with a server via a network/cloud.
  • the electronic touch sensitive device may be, for example, an eBook reader, a mobile cell phone, a laptop, a tablet, desktop, or any other touch sensitive computing device.
  • the network/cloud may be a public and/or private network, such as a private local area network operatively coupled to a wide area network such as the Internet.
  • the server may be programmed or otherwise configured to receive content requests from a user via the touch sensitive device and to respond to those requests by performing a desired function or providing the user with requested or otherwise recommended content.
  • the server is configured to remotely provision an accessible menu navigation mode as provided herein to the touch screen device (e.g., via JavaScript or other browser based technology).
  • portions of the accessible menu navigation methodology can be executed on the server and other portions of the methodology can be executed on the device. Numerous server-side/client-side execution schemes can be implemented to facilitate an accessible menu navigation mode in accordance with an embodiment, as will be apparent in light of this disclosure.
  • FIG. 3 is a visual representation of an accessible navigation menu, configured in accordance with an embodiment of the present invention.
  • a typical sentence in an eBook is represented as “The quick brown fox jumps over the lazy dog.”
  • the sentence may be read aloud to the user, and the current word being read may be “jumps,” so that word is displayed in all-caps in this particular example.
  • the navigation menu described herein may be utilized for a single word, a sentence, a selection of text, or any other content presented on the device.
  • the navigation menu may include, for example, a number of navigation and/or settings options and the user may engage these options by performing a swipe or flick gesture up or down vertically on the touch screen device.
  • the settings options are read aloud if the user scrolls down, while the navigation options are read aloud if the user scrolls up. In other embodiments all menu options may be read aloud when scrolling either up or down, and different gestures and/or menu structures will be apparent in light of this disclosure.
  • the navigation options include options to find a word, define a word, spell a word, select chapters, add/delete bookmarks, add/delete notes, and share content; while the settings options include adjusting the reading rate, adjusting volume, and changing voice font. Some options, like changing the voice font, for example, may have multiple sub-options that can be scrolled through.
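A possible data-level representation of the FIG. 3 menu structure is sketched below in Kotlin: upward swipes walk the navigation options and downward swipes walk the settings options. The grouping into two flat lists and the wrapping behavior are assumptions for illustration.

```kotlin
// Option names follow the FIG. 3 description above; the structure is assumed.
val navigationOptions = listOf(
    "find word", "define word", "spell word", "chapters",
    "add/delete bookmark", "add/delete note", "share content"
)
val settingsOptions = listOf("adjust rate", "adjust volume", "change voice font")

enum class MenuDirection { UP, DOWN }

// Return the option announced after `count` consecutive swipes in one
// direction, wrapping at the end of the list.
fun optionAfterSwipes(direction: MenuDirection, count: Int): String {
    val list = if (direction == MenuDirection.UP) navigationOptions else settingsOptions
    return list[(count - 1).mod(list.size)]
}

fun main() {
    println(optionAfterSwipes(MenuDirection.DOWN, 1))  // "adjust rate"
    println(optionAfterSwipes(MenuDirection.UP, 2))    // "define word"
}
```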
  • the user input gestures for the accessible menu navigation mode may be categorized into two types: menu navigation gestures and selection gestures.
  • menu navigation gestures may include those that allow the user to navigate through the various menus and sub-menus, while selection gestures allow the user to select a menu or sub-menu option, adjust a settings option, or otherwise complete an action within the accessible menu navigation mode.
  • FIGS. 4 a - b illustrate tables of two accessible menu navigation gesture sequences, in accordance with an embodiment of the present invention.
  • the accessible menu navigation mode is configured according to the menu structure shown in FIG. 3 .
  • in this example, the first gesture includes a downward swipe to access the settings options, and the “adjust rate” option is the first one that is read aloud.
  • if the menu option spoken is a setting with a current value, a horizontal slide immediately after the setting name is spoken changes that setting's value. Consequently, if the user performs a swipe gesture in the left or right direction, for example, the reading rate could be decreased or increased, respectively.
  • the user's second gesture is a horizontal swipe toward the right, which adjusts the value of the current setting by increasing the rate once.
  • the user may continue to hold the contact point on the touch screen, in some embodiments, so that the reading rate continues to increase.
  • Such a swipe-and-hold gesture may continuously increase the rate until the contact point is lifted.
  • the user may perform multiple swipe gestures to the right in order to increase the rate a number of times.
  • as the user swipes to the right to increase the reading rate of the currently running application (e.g., an audiobook being read aloud), an audio sample of the increased reading rate may be played, in some embodiments.
  • the user's fourth gesture includes lifting the contact point to select that rate.
  • the user may perform an upward swipe gesture, or some other identifiable selection gesture, in order to select the desired reading rate.
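The FIG. 4 a "adjust rate" sequence might be modeled as in the following Kotlin sketch, assuming each rightward swipe (or repeated calls during a swipe-and-hold) increases the rate by one step, an audio sample is played at the new rate, and lifting the contact point commits the value. The step size, rate limits, and method names are assumptions for this example.

```kotlin
// Illustrative sketch of the FIG. 4 a "adjust rate" interaction.
class ReadingRateAdjuster(
    private var wordsPerMinute: Int = 150,              // assumed starting rate
    private val speak: (String) -> Unit = ::println     // stand-in for text-to-speech
) {
    private val step = 10
    private val range = 60..400

    fun onEnterSetting() =
        speak("Adjust rate. Current rate $wordsPerMinute words per minute.")

    // One rightward swipe increases the rate by a single step; a swipe-and-hold
    // would call this repeatedly until the contact point is lifted.
    fun onSwipeRight() {
        wordsPerMinute = (wordsPerMinute + step).coerceIn(range)
        speak("Rate $wordsPerMinute")   // stand-in for playing a sample at the new rate
    }

    fun onSwipeLeft() {
        wordsPerMinute = (wordsPerMinute - step).coerceIn(range)
        speak("Rate $wordsPerMinute")
    }

    // Lifting the contact point (or another selection gesture) commits the value.
    fun onRelease(): Int {
        speak("Rate set to $wordsPerMinute words per minute")
        return wordsPerMinute
    }
}
```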
  • the first gesture includes an upward swipe to access the navigation options, and the “find word” option is the first one that is read aloud.
  • the user performs gesture 2 , a sideways swipe to the right, in order to select the “find word” option, and the accessible menu navigation mode begins slowly reading the words of the current sentence or selection of text.
  • the “find word” function may include performing a word search, as shown in the example of FIG. 6 a and described in more detail below.
  • an earcon may be played after each word and a short pause may give the user time to select the word.
  • gesture 3 an upward swipe gesture, which in this particular menu structure accesses the “define word” menu option.
  • the words of a sentence may be presented in a list that can be read through using single flick gestures (e.g., horizontal flick gestures), and a double-tap or upward swipe gesture may take the user back to the page associated with the selected word.
  • gesture 4 a selection gesture, gesture 4 , which is a sideways swipe to the right.
  • the accessible menu navigation mode reads the word's definition, and in some cases the definition may be textually displayed on the device in a pop-up window.
  • a select gesture (e.g., a double-tap gesture) may cause the definition to be re-read from the beginning.
  • a dismiss gesture or action (e.g., pressing the home button) at any point during the menu navigation or option selection may close the current window, return to the previous menu structure level, or return to reading content.
  • the content within the definition window may be navigable by section.
  • the other menu options displayed in FIG. 3 may be adjusted or selected using the same or similar procedures as the ones described in reference to FIGS. 4 a - b.
  • FIGS. 5 a - b collectively illustrate an example accessible menu navigation mode of an electronic touch screen device, in accordance with an embodiment of the present invention.
  • This example gesture sequence shows how the accessible menu navigation mode might be implemented on the surface of a touch screen device operating in the manual mode.
  • the device housing surrounds the touch screen of the device, and the user can interact with the touch screen with fingers or other suitable implement.
  • the example in FIG. 5 a shows the user performing two downward menu navigation gestures in order to access the “adjust volume” menu option.
  • the “adjust volume” menu option When the “adjust volume” menu option is being read aloud, it may be accompanied by an earcon or other sound effect that presents the current volume level to the user.
  • a volume icon may also be displayed on the screen to indicate the current volume level.
  • the swipe gesture might be a single swipe motion, or a swipe-and-hold motion, in some embodiments.
  • a single swipe gesture might increase the volume by one volume unit, while a swipe-and-hold gesture may continuously increase the volume until the contact point is lifted.
  • adjusting the volume may be accompanied by an earcon that presents the volume level to the user, or a volume icon that indicates the current volume level.
  • FIGS. 6 a - e collectively illustrate example menu navigation mode functions, in accordance with an embodiment of the present invention.
  • FIG. 6 a shows an example where the “find word” function includes performing a word search through a document for the currently selected word or a word in an application clipboard.
  • a word is selected using a single finger double tap, thus accessing a menu of actions that may be performed on that word, including “find word,” “define word,” “spell word,” etc.
  • results from the lookup may be presented in a list with the first item in the list selected.
  • the list may be visually displayed on the device, in some embodiments, and the results may be read aloud along with audio cues, such as “Result 1 of N” where N is the number of lookup results (capped at some rational maximum).
  • the list may be longer than the display screen, in some embodiments, and navigating “off” the list may merely refresh the visual display with the next (or previous) chunk of search results.
  • the user may navigate the list using touch screen gestures, or the list may be read aloud with pauses between each result allowing the user time to select the desired word. In some cases the entire sentence or phrase containing the word may be read aloud in order to provide the context of the search result. Selecting a search result may return the user to the reading mode at the chosen location.
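A Kotlin sketch of the result presentation just described, assuming results are announced as "Result i of N" (capped at an assumed maximum) and the visible list is refreshed in page-sized chunks as the user flicks past its end; all names and sizes are illustrative.

```kotlin
// Sketch of the "find word" result presentation described above.
class SearchResultPresenter(
    allResults: List<String>,
    maxResults: Int = 50,                // assumed cap on reported results
    private val pageSize: Int = 8,       // assumed number of rows visible on screen
    private val speak: (String) -> Unit = ::println
) {
    private val results = allResults.take(maxResults)
    private var index = 0

    fun announceCurrent() {
        if (results.isEmpty()) { speak("No results"); return }
        speak("Result ${index + 1} of ${results.size}: ${results[index]}")
    }

    // Horizontal flick gestures move through the results; moving past the visible
    // chunk would trigger a refresh of the on-screen list in a full implementation.
    fun onFlick(forward: Boolean) {
        if (results.isEmpty()) return
        index = ((index + if (forward) 1 else -1) + results.size) % results.size
        announceCurrent()
    }

    // The chunk of results currently shown on screen.
    fun visibleChunk(): List<String> {
        val start = (index / pageSize) * pageSize
        return results.subList(start, minOf(start + pageSize, results.size))
    }
}
```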
  • FIG. 6 b shows an example menu navigation function where the “go to page” menu option is selected (using a single finger double tap gesture), in accordance with one embodiment of the present invention.
  • the user is presented with a phone-pad-style keypad, or other page entry window, that allows the input of page numbers. Each number selected may be added to the display window, and the selection process may be aurally presented and guided using audio cues or earcons, in some embodiments.
  • the user has a delete button to correct mistakes and a “go” button to jump to the selected page. If the entry results in a non-existent page, an appropriate error message or earcon may be aurally presented and the keypad may remain visible with the text field cleared for further input.
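One possible shape of the "go to page" keypad logic is sketched below in Kotlin; the spoken prompts, validation behavior, and names are assumptions made for illustration.

```kotlin
// Sketch of the "go to page" keypad behavior described above.
class GoToPageDialog(
    private val totalPages: Int,
    private val speak: (String) -> Unit = ::println
) {
    private val entry = StringBuilder()

    // Each digit tapped is appended to the entry and echoed aurally.
    fun onDigit(d: Char) {
        if (d.isDigit()) { entry.append(d); speak("$d") }
    }

    fun onDelete() {
        if (entry.isNotEmpty()) { entry.deleteCharAt(entry.length - 1); speak("Deleted") }
    }

    // "Go" jumps to the page if it exists; otherwise an error is announced
    // and the text field is cleared for further input.
    fun onGo(): Int? {
        val page = entry.toString().toIntOrNull()
        return if (page != null && page in 1..totalPages) {
            speak("Going to page $page"); page
        } else {
            speak("Page does not exist"); entry.clear(); null
        }
    }
}
```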
  • FIG. 6 c shows an example menu navigation function where the “spell word” menu option has been selected, in accordance with one embodiment of the present invention.
  • the selected word is stated, spelled, and then re-stated, per normal spelling bee rules.
  • other spelling presentations may be implemented.
  • the device remains in the spelling menu layer such that a subsequent selection gesture (e.g., a double tap gesture) will spell the word again.
  • in some cases the “spell word” option is used to spell a pre-selected word; however, if no word is currently selected, a word selection may be performed after the “spell word” function begins.
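A tiny Kotlin sketch of the stated, spelled, re-stated presentation; the announcement format is an assumption.

```kotlin
// Sketch of the "spell word" presentation: the word is stated, spelled out
// letter by letter, and then re-stated, per the spelling-bee style described above.
fun spellWordAnnouncement(word: String): String {
    val letters = word.uppercase().toList().joinToString(", ")
    return "$word. $letters. $word."
}
// spellWordAnnouncement("lazy") == "lazy. L, A, Z, Y. lazy."
```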
  • FIGS. 6 d - e show an example menu navigation function for adding and deleting notes, in accordance with one embodiment of the present invention.
  • the device displays a text entry window that allows the user to input the content of a note.
  • the user can save the note using a selection gesture, or exit the add note function using a dismiss gesture.
  • the user may be aurally prompted to confirm creating a note, e.g., “Double tap to confirm, single tap to return to text entry.”
  • the text entry window may disappear and the device may return to reading content.
  • FIG. 6 e shows an example where the user selects the “Notes” or “Go to Notes” menu option and is presented with a list of all the notes relating to the content currently being presented on the device.
  • the notes may be presented in a list and may be read aloud to the user with audio cues, such as “Note 1 of N” where N is the number of notes.
  • the user can move from note to note, for example, using horizontal flick gestures.
  • the list may also be visually displayed on the device, and the list may be longer than the display screen, in some embodiments. In such cases, navigating “off” the list may merely refresh the visual display with the next (or previous) chunk of notes.
  • a single tap on a note may jump the user to the note in a note editor window similar to the one shown in FIG. 6 d , and a delete gesture (e.g., a double tap with two contact points) may begin a delete dialog.
  • the delete dialog may prompt the user to confirm the deletion, and after deletion the note list may be displayed again until a global dismiss gesture (e.g., a double tap gesture with two fingers) is performed.
  • the various gestures described are provided as examples only, and many other selection and/or dismiss gestures will be apparent in light of this disclosure.
  • FIG. 7 illustrates a method for providing an accessible menu navigation user interface in an electronic touch screen device, in accordance with an embodiment of the present invention.
  • This example methodology may be implemented, for instance, by the UI module of the example touch screen device shown in FIG. 2 a , or the example touch screen device shown in FIG. 2 b (e.g., with the UI provisioned to the client by the server).
  • the accessible menu navigation mode can be implemented in software, hardware, firmware, or any combination thereof, as will be appreciated in light of this disclosure.
  • the method generally includes sensing a user's input by a touch screen display.
  • the UI code (and/or hardware) can assume a swipe gesture has been engaged and track the path of the contact point with respect to any fixed point within the touch screen until the user stops engaging the touch screen surface.
  • the release point can also be captured by the UI as it may be used to commit the action started when the user pressed on the touch sensitive screen.
  • a tap or press or press-and-hold command may be assumed depending on the amount of time the user was continually pressing on the touch sensitive screen.
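The tap, press, and press-and-hold distinction based on contact duration could be sketched as follows in Kotlin; the thresholds are assumptions for this example, not values taken from the disclosure.

```kotlin
// Sketch of distinguishing a tap, a press, and a press-and-hold by the length
// of time the contact was held, as described above.
enum class ContactType { TAP, PRESS, PRESS_AND_HOLD }

fun classifyContact(downTimeMs: Long, upTimeMs: Long): ContactType {
    val heldMs = upTimeMs - downTimeMs
    return when {
        heldMs < 200 -> ContactType.TAP             // assumed tap threshold
        heldMs < 800 -> ContactType.PRESS           // assumed press threshold
        else -> ContactType.PRESS_AND_HOLD
    }
}
```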
  • the method includes detecting 701 a menu navigation gesture on the touch sensitive interface.
  • the gesture contact may be performed in any suitable manner using a stylus, the user's finger, or any other suitable implement, and it may be performed on a touch screen surface, a track pad, acoustic sensor, or other touch sensitive surface.
  • the user contact monitoring is essentially continuous.
  • the method may continue with reading 702 the menu option aloud to the user.
  • an upward swipe may prompt a group of navigation menu options to be read aloud, while a downward swipe may prompt a group of settings options to be read aloud.
  • Other embodiments may read aloud all the menu options, either organized or listed at random, upon performing either an upward or downward swipe gesture. Some embodiments may scroll through the entire menu list after a single menu navigation gesture, with each menu option followed by an earcon or a slight pause, giving the user a chance to select that option. Other embodiments, like the one shown in this example method, may require a separate swipe gesture in order to scroll through each menu option.
  • the method may continue with determining 703 whether the menu option has one or more sub-menus.
  • a navigation menu may include a “chapters” menu option and this option may include multiple sub-menu options for each chapter in a given book. If sub-menu options are available, the method may continue with determining 704 whether a sub-menu navigation gesture is detected. In one embodiment, if the menu navigation gesture is a vertical swipe gesture, the sub-menu navigation gesture may be a horizontal swipe gesture. If no sub-menu navigation gesture is detected, the method may return to monitoring 701 for another menu navigation gesture. If a sub-menu navigation gesture is detected, the method may read aloud 705 the sub-menu options.
  • the method may continue with determining 706 whether a sub-menu selection gesture is detected, and if none is detected the method may continue with determining 704 whether another sub-menu navigation gesture is detected. If a sub-menu selection gesture is detected, the method may continue with selecting 707 the sub-menu option.
  • a user has just opened a book with an eReader application and performs multiple menu navigation gestures (e.g., upward swipe gestures) in order to access the “chapter” menu.
  • the user then performs three sub-menu navigation gestures (e.g., sideways swipe gestures to the right) in order to access the sub-menu option “chapter 3,” and then performs a sub-menu selection gesture (e.g., a double-tap gesture) in order to access the content of chapter 3.
  • the method may continue with determining 708 whether the menu option selection gesture is detected.
  • One such menu option may be the “spell word” menu option, because the option may either be selected or not, and no sub-menu options are available.
  • Another such menu option may include an “adjust volume” or “adjust rate” menu option, wherein the menu option selection gesture includes a value adjustment gesture, such as a horizontal swipe gesture used to adjust the value for that setting or menu option. If a menu selection gesture is detected, the method may continue with performing 709 the menu option function. If no menu selection gesture is detected, the method may continue with abandoning 710 the menu navigation mode.
  • Abandoning the menu navigation mode may occur if no contact is detected after a certain period of time. Furthermore, at any point during the accessible menu navigation mode the mode may be abandoned if the home button is pressed or if some other hard-coded or configurable abandon action is performed.
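Tying the steps of FIG. 7 together, the loop below is a simplified Kotlin sketch of the flow (701 through 710). It reuses the MenuOption type from the earlier sketch, treats both vertical swipe directions as stepping through a single flat menu for brevity, and uses a NONE gesture to stand in for the no-contact timeout; these simplifications and all names are assumptions, not the claimed method itself.

```kotlin
// Simplified sketch of the FIG. 7 flow: detect a menu navigation gesture (701),
// read the option aloud (702), check for and navigate sub-menus (703-705),
// select a sub-menu option (706-707) or the menu option itself (708-709),
// or abandon the mode when no contact is detected (710).
enum class Gesture { SWIPE_UP, SWIPE_DOWN, SWIPE_LEFT, SWIPE_RIGHT, DOUBLE_TAP, NONE }

fun runMenuNavigationMode(
    nextGesture: () -> Gesture,          // stand-in for the touch sensitive interface
    menu: List<MenuOption>,              // MenuOption from the earlier sketch
    speak: (String) -> Unit = ::println
) {
    var menuIndex = -1
    var subIndex = -1
    while (true) {
        when (nextGesture()) {
            Gesture.SWIPE_UP, Gesture.SWIPE_DOWN -> {          // 701
                menuIndex = (menuIndex + 1).mod(menu.size)
                subIndex = -1
                speak(menu[menuIndex].name)                    // 702
            }
            Gesture.SWIPE_LEFT, Gesture.SWIPE_RIGHT -> {       // 703-704
                val subs = menu.getOrNull(menuIndex)?.subOptions.orEmpty()
                if (subs.isNotEmpty()) {
                    subIndex = (subIndex + 1).mod(subs.size)
                    speak(subs[subIndex])                      // 705
                }
            }
            Gesture.DOUBLE_TAP -> {                            // 706-709
                if (menuIndex >= 0) {
                    val option = menu[menuIndex]
                    val chosen = option.subOptions.getOrNull(subIndex) ?: option.name
                    speak("Selected $chosen")
                    return
                }
            }
            Gesture.NONE -> {                                  // 710: no contact detected
                speak("Exiting menu navigation mode")
                return
            }
        }
    }
}
```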
  • the various menu navigation gestures and selection gestures may be hard-coded or configured by the user, and different gesture configurations performed with one or more contact points may be used for any of the navigation or selection gestures.
  • One example embodiment of the present invention provides a device including a touch sensitive surface for allowing user input.
  • the device also includes a user interface including an accessible menu navigation mode configured to navigate through menu and sub-menu options in response to one or more navigation gestures, wherein each menu and sub-menu option is aurally presented to the user in response to a corresponding navigation gesture, and wherein the accessible menu navigation mode is further configured to adjust and/or enable menu and sub-menu options in response to a selection gesture.
  • the accessible menu navigation mode is further configured to activate a manual reading mode in response to a manual mode activation gesture, wherein the manual reading mode allows the user to adjust and select navigation and settings options.
  • the accessible menu navigation mode is further configured to aurally present one or more earcons in response to at least one of: accepting a menu option, cancelling a menu option, entering a manual reading mode, entering an automatic reading mode, passing a content boundary, navigating through a menu, and/or adjusting a settings option.
  • the accessible menu navigation mode is configured to navigate through one or more menu options in response to one or more substantially vertical swipe gestures, and to perform at least one of: navigate through one or more sub-menu options, and/or change a value for a menu option in response to one or more substantially horizontal swipe gestures.
  • the selection gesture includes at least one of: a vertical swipe gesture, a horizontal swipe gesture, a double-tap gesture, and/or lifting a contact point from the touch sensitive surface.
  • the selection gesture is user configurable.
  • the accessible menu navigation mode is configured to navigate through one or more content navigation options in response to a first menu navigation gesture, and to navigate through one or more settings options in response to a second menu navigation gesture, the first and second menu navigation gestures having opposite orientations.
  • the menu and sub-menu options are user configurable.
  • one or more menu and/or sub-menu options allow the user to share selected content over the Internet.
  • one or more menu and/or sub-menu options allow the user to create a customized group of people and share selected content with that group.
  • the accessible menu navigation mode is configured to abandon if no user contact is detected at the touch sensitive surface after a specific period of time.
  • a mobile computing system including a processor and a touch sensitive surface for allowing user input, and a user interface executable on the processor and including an accessible menu navigation mode configured to navigate through menu and sub-menu options in response to one or more navigation gestures, wherein each menu and sub-menu option is aurally presented to the user in response to a corresponding navigation gesture, and wherein the accessible menu navigation mode is further configured to adjust and/or enable menu and sub-menu options in response to a selection gesture.
  • the accessible menu navigation mode is further configured to activate a manual reading mode in response to a manual mode activation gesture, wherein the manual reading mode allows the user to adjust and select navigation and settings options.
  • the accessible menu navigation mode is further configured to aurally present one or more earcons in response to at least one of: accepting a menu option, cancelling a menu option, entering a manual reading mode, entering an automatic reading mode, passing a content boundary, navigating through a menu, and/or adjusting a settings option.
  • the accessible menu navigation mode is configured to navigate through one or more menu options in response to one or more substantially vertical swipe gestures, and to perform at least one of: navigate through one or more sub-menu options, and/or change a value for a menu option in response to one or more substantially horizontal swipe gestures.
  • the computer program product may include one or more computer readable mediums such as, for example, a hard drive, compact disk, memory stick, server, cache memory, register memory, random access memory, read only memory, flash memory, or any suitable non-transitory memory that is encoded with instructions that can be executed by one or more processors, or a plurality or combination of such memories.
  • the process is configured to receive at a touch sensitive surface of the electronic device a menu navigation gesture; aurally present a menu option in response to the menu navigation gesture; receive at the touch sensitive surface a selection gesture; and adjust a navigation option and/or settings option in response to the selection gesture.
  • the process is further configured to receive at the touch sensitive surface a sub-menu navigation gesture; and aurally present a sub-menu option in response to the sub-menu navigation gesture.
  • the process is further configured to receive at the touch sensitive surface a menu option value adjustment gesture; and adjust a menu option value in response to the value adjustment gesture.
  • the process is further configured to activate a manual reading mode in response to a manual reading mode activation gesture detected at the touch sensitive surface, wherein the manual reading mode allows the user to adjust and select navigation and settings options.
  • the process is further configured to: aurally present one or more earcons in response to at least one of: accepting a menu option, cancelling a menu option, entering a manual reading mode, entering an automatic reading mode, passing a content boundary, navigating through a menu, and/or adjusting a settings option.

Abstract

Techniques are disclosed for providing an accessible menu navigation mode in electronic computing devices. The user can engage a manual reading mode, using a manual reading mode activation gesture, wherein the user may navigate through content, share content, or change reading rate, font, volume, or other device settings. The user may navigate through a menu structure using menu navigation gestures and the menu and sub-menu options may be read aloud to the user as they are navigated through. A selection gesture may allow the user to enable or adjust various menu and sub-menu options, and an earcon or sound effect may guide the navigation process and/or confirm a menu selection. The user may configure the navigation gestures and option selection gestures. The menu options may be structured to allow a user to access content navigation options with upward swipe gestures and access device settings using downward swipe gestures.

Description

    RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Application Nos. 61/674,098 and 61/674,102, both filed on Jul. 20, 2012, each of which is herein incorporated by reference in its entirety.
  • FIELD OF THE DISCLOSURE
  • This disclosure relates to electronic display devices, and more particularly, to user interface (UI) techniques for interacting with computing devices.
  • BACKGROUND
  • Electronic display devices such as tablets, eReaders, mobile phones, smart phones, personal digital assistants (PDAs), and other such touch screen electronic display devices are commonly used for displaying consumable content. The content may be, for example, an eBook, an online article or blog, images, a movie or video, a map, just to name a few types. Such display devices are also useful for displaying a user interface that allows a user to interact with an application running on the device. The textual content and/or screen controls may be spoken aloud to the user. The user interface may include, for example, one or more touch screen controls and/or one or more displayed labels that correspond to nearby hardware buttons. The touch screen display may be backlit or not, and may be implemented for instance with an LED screen or an electrophoretic display. Such devices may also include other touch sensitive surfaces, such as a track pad (e.g., capacitive or resistive touch sensor) or touch sensitive housing (e.g., acoustic sensor).
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIGS. 1 a-b illustrate an example electronic touch screen device having an accessible menu navigation mode configured in accordance with an embodiment of the present invention.
  • FIGS. 1 c-d illustrate example configuration screen shots of the user interface of the electronic touch screen device shown in FIGS. 1 a-b, configured in accordance with an embodiment of the present invention.
  • FIG. 2 a illustrates a block diagram of an electronic touch screen device configured in accordance with an embodiment of the present invention.
  • FIG. 2 b illustrates a block diagram of a communication system including the electronic touch screen device of FIG. 2 a, configured in accordance with an embodiment of the present invention.
  • FIG. 3 is a visual representation of an accessible navigation menu, configured in accordance with an embodiment of the present invention.
  • FIGS. 4 a-b show tables of two accessible menu navigation gesture sequences, in accordance with an embodiment of the present invention.
  • FIGS. 5 a-b collectively illustrate an example accessible menu navigation mode of an electronic touch screen device, in accordance with an embodiment of the present invention.
  • FIGS. 6 a-e collectively illustrate example menu navigation mode functions, in accordance with an embodiment of the present invention.
  • FIG. 7 illustrates a method for providing an accessible menu navigation mode in an electronic touch screen device, in accordance with an embodiment of the present invention.
  • DETAILED DESCRIPTION
  • Techniques are disclosed for providing an accessible menu navigation mode in electronic computing devices. The device may be a touch screen mobile device, or any other device with a touch sensitive surface that can detect user gestures. The user can engage a manual reading mode by performing a manual reading mode activation gesture, wherein the manual mode allows the user to navigate through content, share content, adjust the reading rate, font, volume, or other device settings. The user may navigate through a menu structure using menu navigation gestures and the menu and sub-menu options may be read aloud to the user as they are navigated through. The main menu options may be navigated, for example, using vertical up or down swipe gestures, while sub-menu options within a given menu option may be navigated using horizontal swipe gestures. A selection gesture may allow the user to enable or adjust various menu and sub-menu options, and various earcons or sound effects may guide the navigation process and/or confirm a menu or sub-menu selection in some embodiments. The user may configure the navigation gestures and option selection gestures. The menu options may be structured to allow a user to access content navigation options with upward swipe gestures and access device settings options using downward swipe gestures.
  • General Overview
  • As previously explained, electronic display devices such as tablets, eReaders, and smart phones are commonly used for displaying user interfaces and consumable content. The user of such devices can typically consume the displayed content with relative ease. In some instances, users who are unable to view or read text on the screen may wish to navigate through the device's menu screens and/or select one or more application options. While some electronic devices may aurally present textual content to a user, offer printed instructions on a screen protector (e.g., using a braille-embossed screen cover), or offer a hunt-and-peck approach for navigating through menu options, an accessible menu navigation user interface as described herein may provide a more intuitive or otherwise positive user experience.
  • Thus, and in accordance with an embodiment of the present invention, accessible menu navigation techniques are disclosed for use in electronic touch screen devices. The techniques facilitate an accessible user interface that may be supported by multiple reading and navigation modes. In some embodiments, an accessible mode for an electronic device may read aloud the text of an eBook, an email message, or some other textual content that may be displayed on the device. In one such embodiment, the accessible mode has an automatic and a manual mode, wherein the manual mode allows the user to actively navigate and/or adjust device settings. The automatic mode, in some embodiments, does not allow manual navigation and settings adjustment, and a specific gesture or control feature activation may switch the device from automatic to manual mode. In one embodiment, the automatic reading mode facilitates an electronic device reading automatically and continuously from a predetermined point with a selected voice font, volume, and rate. The automatic mode may also, for example, play an earcon or audio cue upon passing sentences, paragraphs, pages, chapters, or other boundaries. In manual mode, however, the user may adjust the reading rate, font, volume, and other settings. In one embodiment, the entire display screen may be treated as a single button to facilitate transitioning from automatic into manual mode, or the mode transition may be performed using a physical switch or control feature. For example, a single tap on the display screen may transition the device from automatic mode to manual mode.
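  • By way of illustration only, the following Kotlin sketch models the automatic-to-manual transition described above, with the whole screen treated as a single button so that any single tap switches modes. The class and method names and the spoken cue are assumptions introduced for the sketch, not features drawn from any particular embodiment.

```kotlin
enum class ReadingMode { AUTOMATIC, MANUAL }

// Minimal sketch of the automatic/manual split: while the device reads
// automatically, any single tap on the screen switches it into manual mode,
// where the navigation and settings gestures described below apply.
class ReadingModeController(private val speak: (String) -> Unit) {
    var mode = ReadingMode.AUTOMATIC
        private set

    fun onSingleTap() {
        if (mode == ReadingMode.AUTOMATIC) {
            mode = ReadingMode.MANUAL
            speak("Manual mode")   // a mode-change earcon could be played instead
        }
    }
}

fun main() {
    val controller = ReadingModeController { println("TTS: $it") }
    controller.onSingleTap()
    println(controller.mode)       // MANUAL
}
```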
  • In the manual accessibility mode the user can perform, for example, the various navigation gestures and selection gestures described herein. A menu navigation gesture may include, for example, a swipe gesture up or down on the device screen. As used herein, a swipe gesture may include a sweeping or dragging gesture across at least a portion of the touch sensitive surface whether directly contacting that surface or hovering over that surface (e.g., within a few centimeters or otherwise close enough to be detected by the touch sensitive surface). In some embodiments, the swipe gesture may be performed at a constant speed in one single direction, or may also be an accelerated flick gesture. The gestures can be performed, for example, with the tip of a finger or a stylus, or any other suitable implement capable of providing a detectable swipe gesture. To facilitate detection of a substantially horizontal and/or vertical swipe gesture with reference to the bottom of the electronic device's screen, any swipe gesture that is, for example, within a range of 45 degrees of the horizontal or vertical may be treated as a horizontal or vertical gesture.
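  • The 45-degree tolerance described above can be sketched as a simple displacement test. In the hypothetical Kotlin snippet below, a swipe whose horizontal travel is at least as large as its vertical travel is treated as horizontal, and as vertical otherwise; the minimum-travel threshold and all names are illustrative assumptions.

```kotlin
import kotlin.math.abs

enum class SwipeDirection { UP, DOWN, LEFT, RIGHT, NONE }

// Classify a swipe by its start and end points, with reference to the bottom of
// the screen. Any swipe within 45 degrees of the horizontal axis is treated as
// horizontal; otherwise it is treated as vertical. minTravel filters out taps.
fun classifySwipe(x0: Float, y0: Float, x1: Float, y1: Float, minTravel: Float = 20f): SwipeDirection {
    val dx = x1 - x0
    val dy = y1 - y0                     // screen coordinates: y grows downward
    if (abs(dx) < minTravel && abs(dy) < minTravel) return SwipeDirection.NONE
    return if (abs(dx) >= abs(dy)) {
        if (dx > 0) SwipeDirection.RIGHT else SwipeDirection.LEFT
    } else {
        if (dy > 0) SwipeDirection.DOWN else SwipeDirection.UP
    }
}

fun main() {
    println(classifySwipe(100f, 300f, 110f, 120f))   // mostly upward -> UP
    println(classifySwipe(100f, 300f, 260f, 280f))   // mostly rightward -> RIGHT
}
```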
  • Once invoked, the accessible menu navigation mode may present an audio cue to the user (e.g., reading aloud “entering menu” or playing a distinctive sound effect), or the menu navigation mode may simply begin reading aloud the various menu options. In one embodiment, the user input gestures for the accessible menu navigation mode may be categorized into two types: navigation gestures and selection gestures. Navigation gestures include those that allow the user to navigate through the various menus and sub-menus, while selection gestures allow the user to select a menu or sub-menu option, adjust a settings option, or otherwise complete an action within the accessible menu navigation mode. Upward and/or downward swipe gestures may be configured to navigate through various menu levels, and the user may scroll through various sub-menu options within a menu level using sideways swipe gestures (either left or right), in some embodiments. Selecting a given option within a menu or sub-menu may be performed with a selection gesture that could include a swipe gesture, releasing a contact that was being held down, a double-tap gesture, or some other uniquely identifiable selection gesture. As will be apparent, different gesture configurations performed with one or more contact points may be used for any of the navigation or selection gestures described herein. In some embodiments, the accessible menu navigation mode may read aloud each menu option and an earcon or sound effect may be played upon accepting or cancelling a menu option, entering a manual or automatic mode, passing a boundary such as a sentence, page, paragraph, or chapter, or upon performing a certain function. An earcon may be, for example, a brief and distinctive sound or chime used to represent a specific action or event, convey other information, or prompt a user action.
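  • One possible way to separate the two gesture categories is a small dispatcher that routes selection gestures and navigation gestures to different spoken and earcon feedback. The Kotlin sketch below is only one such arrangement; the default set of selection gestures and the earcon names are assumptions, and in some embodiments the selection gesture is user configurable.

```kotlin
enum class Gesture { SWIPE_UP, SWIPE_DOWN, SWIPE_LEFT, SWIPE_RIGHT, DOUBLE_TAP, LIFT }

// Hypothetical default: a double tap or lifting the contact point counts as a
// selection gesture; everything else is treated as menu navigation.
val selectionGestures = setOf(Gesture.DOUBLE_TAP, Gesture.LIFT)

fun handleGesture(g: Gesture, speak: (String) -> Unit, playEarcon: (String) -> Unit) {
    if (g in selectionGestures) {
        playEarcon("accept")            // confirm the selection with an earcon
        speak("Option selected")
    } else {
        playEarcon("tick")              // brief cue for each navigation step
        speak("Next menu option")       // a real UI would speak the actual option name
    }
}

fun main() {
    handleGesture(Gesture.SWIPE_DOWN, { msg -> println("TTS: $msg") }, { println("[earcon: $it]") })
    handleGesture(Gesture.DOUBLE_TAP, { msg -> println("TTS: $msg") }, { println("[earcon: $it]") })
}
```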
  • In addition to adjusting device settings and navigating through content, in some embodiments the accessible menu navigation mode allows for sharing or bookmarking of selected content. In such an embodiment, a user may set up sub-menus to define different communication means, such as Facebook®, email, etc., or to bookmark or highlight selected content. The user may also define a group of friends and create a customized subgroup, such as a reading club or church group, and then forward their selection or bookmark to that particular group using one or more communication means. In one embodiment, if no bookmark is currently on a page when the bookmark option is selected, a new bookmark is added to that page. If there is already a bookmark on the current page, however, selecting the bookmark option may delete the bookmark.
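  • The add/delete bookmark behavior described above amounts to a toggle keyed on the current page. A minimal Kotlin sketch, with hypothetical names, might look as follows.

```kotlin
// Toggle behavior for the bookmark option: selecting it adds a bookmark if the
// current page has none, and removes the existing bookmark otherwise.
class BookmarkStore {
    private val bookmarkedPages = mutableSetOf<Int>()

    fun toggle(page: Int): String =
        if (bookmarkedPages.add(page)) "Bookmark added to page $page"
        else { bookmarkedPages.remove(page); "Bookmark removed from page $page" }
}

fun main() {
    val store = BookmarkStore()
    println(store.toggle(42))   // Bookmark added to page 42
    println(store.toggle(42))   // Bookmark removed from page 42
}
```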
  • Given the global nature and/or uniqueness of the engagement mechanism, in accordance with some example embodiments, the accessible menu navigation UI can be similarly invoked within multiple diverse applications, for example, an eReader, Internet browser, picture viewer, file browser, or any other content containing multiple levels of data, menu options, or settings. In such embodiments, the accessible menu navigation UI may be invoked without conflicting with other global gestures that might also be used by the device's operating system. In other embodiments, the techniques described herein may be combined with drag-and-drop UI techniques, or other UI techniques to aid in navigating and organizing content. Numerous uniquely identifiable engagement schemes that exploit a touch sensitive surface can be used as will be appreciated in light of this disclosure. Further note that any touch sensitive device (e.g., track pad, touch screen, or other touch sensitive surface, whether capacitive, resistive, acoustic or other touch detecting technology, regardless of whether a user is physically contacting the device or using some sort of implement, such as a stylus) may be used to detect the user contact, and the claimed invention is not intended to be limited to any particular type of touch sensitive technology, unless expressly stated. For ease of reference, user input is sometimes referred to as contact or user contact; however, direct and/or proximate contact (e.g., hovering within a few centimeters of the touch sensitive surface) can be used. In other words, in some embodiments, a user can operate the accessible menu navigation user interface without physically touching the touch sensitive device.
  • Architecture
  • FIGS. 1 a-b illustrate an example electronic touch sensitive device having an accessible menu navigation user interface configured in accordance with an embodiment of the present invention. As can be seen, in this example embodiment, the touch sensitive surface is a touch screen display. The device could be, for example, a tablet such as the NOOK® tablet or eReader by Barnes & Noble. In a more general sense, the device may be any electronic device having a touch sensitive user interface for detecting direct touch or otherwise sufficiently proximate contact, and capability for displaying content to a user, such as a mobile phone or mobile computing device such as a laptop, a desktop computing system, a television, a smart display screen, or any other device having a touch sensitive display or a non-sensitive display screen that can be used in conjunction with a touch sensitive surface. As will be appreciated in light of this disclosure, the claimed invention is not intended to be limited to any specific kind or type of electronic device or form factor.
  • As can be seen with this example configuration, the device comprises a housing that includes a number of hardware features such as a power button, control features, and a press-button (sometimes called a home button herein). A user interface is also provided, which in this example embodiment includes a quick navigation menu having six main categories to choose from (Home, Library, Shop, Search, Light, and Settings) and a status bar that includes a number of icons (a night-light icon, a wireless network icon, and a book icon), a battery indicator, and a clock. In one embodiment, an accessible UI may aurally present to the user the various menu categories from which the user may select the desired menu with a touch screen gesture or by activating a control feature. Some embodiments may have fewer or additional such UI features, or different UI features altogether, depending on the target application of the device. Any such general UI controls and features can be implemented using any suitable conventional or custom technology, as will be appreciated.
  • The hardware control features provided on the device housing in this example embodiment are configured as elongated press-bars and can be used, for example, to page forward (using the top press-bar) or to page backward (using the bottom press-bar), such as might be useful in an eReader application. The power button can be used to turn the device on and off, and may be used in conjunction with a touch-based UI control feature that allows the user to confirm a given power transition action request (e.g., such as a slide bar or tap point graphic to turn power off). Numerous variations will be apparent, and the claimed invention is not intended to be limited to any particular set of hardware buttons or UI features, or device form factor.
  • In this example configuration, the home button is a physical press-button that can be used as follows: when the device is awake and in use, pressing the button will present to the user (either aurally or visually) the quick navigation menu, which is a toolbar that provides quick access to various features of the device. The home button may also be configured to cease an active function that is currently executing on the device (such as an accessible menu navigation mode), or close a configuration sub-menu that is currently open. The button may further control other functionality if, for example, the user presses and holds the home button. For instance, an example such push-and-hold function could engage a power conservation routine where the device is put to sleep or an otherwise lower power consumption mode. So, a user could grab the device by the button, press and keep holding as the device is stowed into a bag or purse. Thus, one physical gesture may safely put the device to sleep. In such an example embodiment, the home button may be associated with and control different and unrelated actions: 1) present the quick navigation menu; 2) exit a configuration sub-menu; and 3) put the device to sleep. As can be further seen, the status bar may also include a book icon (upper left corner). In some cases, selecting the book icon may provide bibliographic information on the content or provide the main menu or table of contents for the book, movie, playlist, or other content.
  • In one particular embodiment, an accessible menu navigation configuration sub-menu, such as the one shown in FIG. 1 d, may be accessed by selecting the Settings option in the quick navigation menu, which causes the device to present the general sub-menu shown in FIG. 1 c. An accessible UI may, for example, aurally present to the user the various sub-menus and configuration options displayed in FIGS. 1 c-d, and the user may select the desired sub-menu or option with a touch screen selection gesture, by activating a control feature, by speaking a voice command, or through some other input means. Other accessible selection options will be apparent in light of this disclosure. From this general sub-menu, the user can select any one of a number of options, including one designated Screen/UI in this specific example case. Selecting this sub-menu item may cause the configuration sub-menu of FIG. 1 d to be presented, in accordance with an embodiment. In other example embodiments, selecting the Screen/UI option may present the user with a number of additional sub-options, one of which may include a so-called “accessible menu navigation” option, which may then be selected by the user so as to cause the accessible menu navigation configuration sub-menu of FIG. 1 d to be presented. Any number of such menu schemes and nested hierarchies can be used, as will be appreciated in light of this disclosure. In other example embodiments, the accessible menu navigation UI is hard-coded such that no configuration sub-menus are needed or otherwise provided (e.g., performing the accessible menu navigation gestures as described herein, with no user configuration needed). The degree of hard-coding versus user-configurability can vary from one embodiment to the next, and the claimed invention is not intended to be limited to any particular configuration scheme of any kind, as will be appreciated.
  • As will be appreciated, the various UI control features and sub-menus displayed to the user are implemented as touch screen controls in this example embodiment. Such UI screen controls can be programmed or otherwise configured using any number of conventional or custom technologies. In general, the touch screen display translates a touch (direct or hovering, by a user's hand, a stylus, or any other suitable implement) in a given location into an electrical signal which is then received and processed by the device's underlying operating system (OS) and circuitry (processor, display controller, etc.). In some instances, note that the user need not actually physically touch the touch sensitive device to perform an action. For example, the touch screen display may be configured to detect input based on a finger or stylus hovering over the touch sensitive surface (e.g., within 3 centimeters of the touch screen or otherwise sufficiently proximate to be detected by the touch sensing circuitry). Additional example details of the underlying OS and circuitry in accordance with some embodiments will be discussed in turn with reference to FIG. 2 a.
  • The touch sensitive surface (or touch sensitive display, in this example case) can be any surface that is configured with touch detecting technologies, whether capacitive, resistive, acoustic, active-stylus, and/or other input detecting technology, including direct contact and/or proximate contact. In some embodiments, the screen display can be layered above input sensors, such as a capacitive sensor grid for passive touch-based input, such as with a finger or passive stylus contact in the case of a so-called in-plane switching (IPS) panel, or an electro-magnetic resonance (EMR) sensor grid for sensing a resonant circuit of a stylus. In some embodiments, the touch sensitive display can be configured with a purely capacitive sensor, while in other embodiments the touch screen display may be configured to provide a hybrid mode that allows for both capacitive input and EMR input, for example. In still other embodiments, the touch sensitive surface is configured with only an active stylus sensor. Numerous touch screen display configurations can be implemented using any number of known or proprietary screen based input detecting technologies. In any such embodiments, a touch sensitive controller may be configured to selectively scan the touch sensitive surface and/or selectively report user inputs detected directly on or otherwise sufficiently proximate to (e.g., within a few centimeters, or otherwise sufficiently close so as to allow detection) the detection surface (or touch sensitive display, in this example case).
  • As previously explained, and with further reference to FIGS. 1 c and 1 d, once the Settings sub-menu is presented (FIG. 1 c), the user can then select the Screen/UI option. In response to such a selection, the accessible menu navigation configuration sub-menu shown in FIG. 1 d can be provided to the user, in accordance with one such example embodiment. The user can configure a number of features with respect to the accessible menu navigation mode, in this example case. For instance, the configuration sub-menu includes a UI check box that when checked or otherwise selected by the user, effectively enables the accessible menu navigation mode (shown in the enabled state); unchecking the box disables the function. Other embodiments may have the accessible menu navigation mode always enabled or enabled by a physical switch or button located on the device, for example. As previously explained, the accessible menu navigation mode may associate a sound effect or earcon with certain menu actions and/or boundaries. In some cases, the user may enable or disable the earcon function, and in this particular embodiment, the user has enabled earcons. In some cases, the accessible menu navigation mode may also allow content sharing through one or more communication means such as Facebook®, email, etc. In some cases, the user may wish to disable content sharing in order to ensure that no external access to the user's content is possible, and the accessible menu navigation configuration sub-menu may allow the user to enable or disable this feature. In this particular embodiment, the user does not have content sharing enabled. As discussed above, the menu and/or sub-menu option selection gesture could include a vertical or horizontal swipe gesture, releasing a contact point that was being held down, a double-tap gesture, or some other gesture. In this particular embodiment, the user may configure the selection gesture by picking from a drop-down menu, and the user has configured the selection gesture to be a double-tap gesture.
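  • The configuration choices discussed above (enabling the mode, earcons, content sharing, and the selection gesture) could be captured in a small settings structure. The following Kotlin sketch is one hypothetical representation; the field names and defaults simply mirror the example state described above and are not prescribed by the disclosure.

```kotlin
// Hypothetical representation of the configuration sub-menu of FIG. 1d: the mode,
// earcons, and content sharing can be toggled, and the selection gesture can be
// chosen from a small set of alternatives via the drop-down menu.
enum class SelectionGesture { VERTICAL_SWIPE, HORIZONTAL_SWIPE, DOUBLE_TAP, LIFT_CONTACT }

data class AccessibleMenuConfig(
    val modeEnabled: Boolean = true,
    val earconsEnabled: Boolean = true,
    val contentSharingEnabled: Boolean = false,
    val selectionGesture: SelectionGesture = SelectionGesture.DOUBLE_TAP
)

fun main() {
    // Settings mirroring the example state above: mode and earcons enabled,
    // sharing disabled, double tap selects. Changes could be saved as they are made.
    val config = AccessibleMenuConfig()
    println(config)
}
```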
  • As can be further seen, a back button arrow UI control feature may be provisioned on the screen for any of the menus provided, so that the user can go back to the previous menu, if so desired. In other embodiments, a universal back screen gesture may be performed in order to return to the previous menu. Note that configuration settings provided by the user can be saved automatically (e.g., user input is saved as selections are made or otherwise provided). Alternatively, a save button or other such UI feature can be provisioned, or a save gesture performed, which the user can engage as desired. The configuration sub-menu shown in FIG. 1 d is presented merely as an example of how an accessible menu navigation mode may be configured by the user, and numerous other configurable or hard-codable aspects will be apparent in light of this disclosure. Other embodiments may confirm an action or menu selection using earcons, sound effects, or animations, and such sound effects and/or animations may be used to provide clarity to the function being performed or otherwise enhance the user experience. In some embodiments, such animations and sound effects may be user-configurable, while in other embodiments they are hard-coded.
  • FIG. 2 a illustrates a block diagram of an electronic touch screen device configured in accordance with an embodiment of the present invention. As can be seen, this example device includes a processor, memory (e.g., RAM and/or ROM for processor workspace and storage), additional storage/memory (e.g., for content), a communications module, a touch screen, and an audio module. A communications bus and interconnect is also provided to allow inter-device communication. Other typical componentry and functionality not reflected in the block diagram will be apparent (e.g., battery, co-processor, etc). The touch screen and underlying circuitry is capable of translating a user's contact (direct or proximate) with the touch screen into an electronic signal that can be manipulated or otherwise used to trigger a specific user interface action, such as those provided herein. The principles provided herein equally apply to any such touch sensitive devices. For ease of description, examples are provided with touch screen technology.
  • In this example embodiment, the memory includes a number of modules stored therein that can be accessed and executed by the processor (and/or a co-processor). The modules include an operating system (OS), a user interface (UI), and a power conservation routine (Power). The modules can be implemented, for example, in any suitable programming language (e.g., C, C++, objective C, JavaScript, custom or proprietary instruction sets, etc), and encoded on a machine readable medium, that when executed by the processor (and/or co-processors), carries out the functionality of the device including a UI having an accessible menu navigation mode as variously described herein. The computer readable medium may be, for example, a hard drive, compact disk, memory stick, server, or any suitable non-transitory computer/computing device memory that includes executable instructions, or a plurality or combination of such memories. Other embodiments can be implemented, for instance, with gate-level logic or an application-specific integrated circuit (ASIC) or chip set or other such purpose-built logic, or a microcontroller having input/output capability (e.g., inputs for receiving user inputs and outputs for directing other components) and a number of embedded routines for carrying out the device functionality. In short, the functional modules can be implemented in hardware, software, firmware, or a combination thereof.
  • The processor can be any suitable processor (e.g., Texas Instruments OMAP4, dual-core ARM Cortex-A9, 1.5 GHz), and may include one or more co-processors or controllers to assist in device control. In this example case, the processor receives input from the user, including input from or otherwise derived from the power button and the home button. The processor can also have a direct connection to a battery so that it can perform base level tasks even during sleep or low power modes. The memory (e.g., for processor workspace and executable file storage) can be any suitable type of memory and size (e.g., 256 or 512 Mbytes SDRAM), and in other embodiments may be implemented with non-volatile memory or a combination of non-volatile and volatile memory technologies. The storage (e.g., for storing consumable content and user files) can also be implemented with any suitable memory and size (e.g., 2 GBytes of flash memory). The display can be implemented, for example, with a 7 to 9 inch 1920×1280 IPS LCD touch screen, or any other suitable display and touchscreen interface technology. The communications module can be, for instance, any suitable 802.11b/g/n WLAN chip or chip set, which allows for connection to a local network, and so that content can be exchanged between the device and a remote system (e.g., content provider or repository depending on the application of the device). In some specific example embodiments, the device housing that contains all the various componentry measures about 7″ to 9″ high by about 5″ to 6″ wide by about 0.5″ thick, and weighs about 7 to 8 ounces. Any number of suitable form factors can be used, depending on the target application (e.g., laptop, desktop, mobile phone, etc). The device may be smaller, for example, for smartphone and tablet applications and larger for smart computer monitor and laptop and desktop computer applications.
  • The operating system (OS) module can be implemented with any suitable OS, but in some example embodiments is implemented with Google Android OS or Linux OS or Microsoft OS or Apple OS. As will be appreciated in light of this disclosure, the techniques provided herein can be implemented on any such platforms. The power management (Power) module can be configured as typically done, such as to automatically transition the device to a low power consumption or sleep mode after a period of non-use. A wake-up from that sleep mode can be achieved, for example, by a physical button press and/or a touch screen swipe or other action. The user interface (UI) module can be, for example, based on touchscreen technology and the various example screen shots and use-case scenarios shown in FIGS. 1 a, 1 c-d, 3, 4 a-b, 5 a-b, and 6 a-e, and in conjunction with the accessible menu navigation methodologies demonstrated in FIG. 7, which will be discussed in turn. The audio module can be configured to speak or otherwise aurally present, for example, menu options, a selected eBook, or any other textual content, and/or to provide verbal and/or other sound-based cues and earcons to guide the accessible menu navigation process, as will be appreciated in light of this disclosure. Numerous commercially available text-to-speech modules can be used, such as Verbose text-to-speech software by NCH Software. In some example cases, if additional space is desired, for example, to store digital books or other content and media, storage can be expanded via a microSD card or other suitable memory expansion technology (e.g., 32 GBytes, or higher). Further note that although a touch screen display is provided, other embodiments may include a non-touch screen and a touch sensitive surface such as a track pad, or a touch sensitive housing configured with one or more acoustic sensors, etc.
  • Client-Server System
  • FIG. 2 b illustrates a block diagram of a communication system configured in accordance with an embodiment of the present invention. As can be seen, the system generally includes an electronic touch sensitive device (such as the one in FIG. 2 a) that is capable of communicating with a server via a network/cloud. In this example embodiment, the electronic touch sensitive device may be, for example, an eBook reader, a mobile cell phone, a laptop, a tablet, desktop, or any other touch sensitive computing device. The network/cloud may be a public and/or private network, such as a private local area network operatively coupled to a wide area network such as the Internet. In this example embodiment, the server may be programmed or otherwise configured to receive content requests from a user via the touch sensitive device and to respond to those requests by performing a desired function or providing the user with requested or otherwise recommended content. In some such embodiments, the server is configured to remotely provision an accessible menu navigation mode as provided herein to the touch screen device (e.g., via JavaScript or other browser based technology). In other embodiments, portions of the accessible menu navigation methodology can be executed on the server and other portions of the methodology can be executed on the device. Numerous server-side/client-side execution schemes can be implemented to facilitate an accessible menu navigation mode in accordance with an embodiment, as will be apparent in light of this disclosure.
  • Accessible Menu Navigation Mode Examples
  • FIG. 3 is a visual representation of an accessible navigation menu, configured in accordance with an embodiment of the present invention. In this example navigation menu, a typical sentence in an eBook is represented as “The quick brown fox jumps over the lazy dog.” The sentence may be read aloud to the user, and the current word being read may be “jumps,” so that word is displayed in all-caps in this particular example. The navigation menu described herein may be utilized for a single word, a sentence, a selection of text, or any other content presented on the device. The navigation menu may include, for example, a number of navigation and/or settings options and the user may engage these options by performing a swipe or flick gesture up or down vertically on the touch screen device. In this particular embodiment, the settings options are read aloud if the user scrolls down, while the navigation options are read aloud if the user scrolls up. In other embodiments all menu options may be read aloud when scrolling either up or down, and different gestures and/or menu structures will be apparent in light of this disclosure. In this particular example, the navigation options include options to find a word, define a word, spell a word, select chapters, add/delete bookmarks, add/delete notes, and share content; while the settings options include adjusting the reading rate, adjusting volume, and changing voice font. Some options, like changing the voice font, for example, may have multiple sub-options that can be scrolled through. Other options, like defining or spelling a selected word, for example, may only be selected or not selected. Still other options, like adjusting the reading rate or volume, may be adjusted using a specific screen gesture. In one embodiment, the user input gestures for the accessible menu navigation mode may be categorized into two types: menu navigation gestures and selection gestures. As discussed above, menu navigation gestures may include those that allow the user to navigate through the various menus and sub-menus, while selection gestures allow the user to select a menu or sub-menu option, adjust a settings option, or otherwise complete an action within the accessible menu navigation mode.
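  • A minimal sketch of the FIG. 3 menu structure, assuming upward swipes step through the content-navigation options and downward swipes step through the settings options, is shown below in Kotlin. The option strings and the callback used to stand in for text-to-speech output are illustrative assumptions.

```kotlin
// A minimal model of the menu structure of FIG. 3: upward swipes step through
// content-navigation options, downward swipes step through settings options,
// and the current option is handed to a text-to-speech callback.
class AccessibleMenu(private val speak: (String) -> Unit) {
    private val navigationOptions = listOf(
        "find word", "define word", "spell word", "chapters",
        "add or delete bookmark", "add or delete note", "share content")
    private val settingsOptions = listOf("adjust rate", "adjust volume", "change voice font")

    private var list = navigationOptions
    private var index = -1

    fun swipeUp()   = step(navigationOptions)
    fun swipeDown() = step(settingsOptions)

    private fun step(options: List<String>) {
        index = if (list === options) (index + 1) % options.size else 0
        list = options
        speak(options[index])          // each option is read aloud as it is reached
    }
}

fun main() {
    val menu = AccessibleMenu { println("TTS: $it") }
    menu.swipeDown()   // TTS: adjust rate
    menu.swipeDown()   // TTS: adjust volume
    menu.swipeUp()     // TTS: find word
}
```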
  • FIGS. 4 a-b illustrate tables of two accessible menu navigation gesture sequences, in accordance with an embodiment of the present invention. In these examples, the accessible menu navigation mode is configured according to the menu structure shown in FIG. 3. As can be seen in FIG. 4 a, the first gesture includes a downward swipe to access the settings options, and the “adjust rate” option is the first one that is read aloud. In this particular example, the menu option spoken is a setting with a current value, and a horizontal slide immediately after the setting name is spoken results in changing that setting's value. Consequently, if the user performs a swipe gesture in the left or right direction, for example, the reading rate could be decreased or increased respectively. In this example, the user's second gesture is a horizontal swipe toward the right, which adjusts the value of the current setting by increasing the rate once. The user may continue to hold the contact point on the touch screen, in some embodiments, so that the reading rate continues to increase. Such a swipe-and-hold gesture may continuously increase the rate until the contact point is lifted. In another embodiment, the user may perform multiple swipe gestures to the right in order to increase the rate a number of times. In one particular embodiment, whenever the user enters the menu navigation mode, the currently running application (e.g., an audiobook reading) is paused. If the user swipes to the right to increase the reading rate, an audio sample of the increased reading rate may be played, in some embodiments. Upon hearing the desired reading rate, the user's fourth gesture includes lifting the contact point to select that rate. In some embodiments, instead of lifting the contact point the user may perform an upward swipe gesture, or some other identifiable selection gesture, in order to select the desired reading rate.
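  • The rate-adjustment interaction of FIG. 4 a can be sketched as a setting that is nudged by horizontal swipes, previewed aloud, and committed when the contact point is lifted. In the Kotlin sketch below, the step size, bounds, and default rate are assumptions chosen only to make the example concrete.

```kotlin
// Sketch of the "adjust rate" interaction: each rightward swipe raises the reading
// rate one step, leftward swipes lower it, and lifting the contact point commits
// the value. Step size and bounds are illustrative assumptions.
class RateSetting(private var wordsPerMinute: Int = 150,
                  private val speak: (String) -> Unit) {
    fun swipeRight() { wordsPerMinute = (wordsPerMinute + 10).coerceAtMost(400); preview() }
    fun swipeLeft()  { wordsPerMinute = (wordsPerMinute - 10).coerceAtLeast(60); preview() }
    fun liftContact(): Int { speak("Rate set to $wordsPerMinute words per minute"); return wordsPerMinute }

    // An audio sample at the new rate may be played so the user can judge it.
    private fun preview() = speak("Sample at $wordsPerMinute words per minute")
}

fun main() {
    val rate = RateSetting(speak = { println("TTS: $it") })
    rate.swipeRight()
    rate.swipeRight()
    rate.liftContact()
}
```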
  • As can be seen in FIG. 4 b, the first gesture includes an upward swipe to access the navigation options, and the “find word” option is the first one that is read aloud. The user performs gesture 2, a sideways swipe to the right, in order to select the “find word” option, and the accessible menu navigation mode begins slowly reading the words of the current sentence or selection of text. In another example embodiment, the “find word” function may include performing a word search, as shown in the example of FIG. 6 a and described in more detail below. In one embodiment, an earcon may be played after each word and a short pause may give the user time to select the word. Once the desired word has been read aloud, the user performs gesture 3, an upward swipe gesture, which in this particular menu structure accesses the “define word” menu option. In another example embodiment, the words of a sentence may be presented in a list that can be read through using single flick gestures (e.g., horizontal flick gestures), and a double-tap or upward swipe gesture may take the user back to the page associated with the selected word. In order to select the menu option in this example and hear the word's definition, the user performs a selection gesture, gesture 4, which is a sideways swipe to the right. Upon performing gesture 4, the accessible menu navigation mode reads the word's definition, and in some cases the definition may be textually displayed on the device in a pop-up window. In one example embodiment, a select gesture (e.g., a double-tap gesture) may cause the definition to be re-read from the beginning. In some embodiments, a dismiss gesture or action (e.g., pressing the home button) at any point during the menu navigation or option selection may close the current window, return to the previous menu structure level, or return to reading content. In another embodiment, the content within the definition window may be navigable by section. Similarly, the other menu options displayed in FIG. 3 may be adjusted or selected using the same or similar procedures as the ones described in reference to FIGS. 4 a-b.
  • FIGS. 5 a-b collectively illustrate an example accessible menu navigation mode of an electronic touch screen device, in accordance with an embodiment of the present invention. This example gesture sequence shows how the accessible menu navigation mode might be implemented on the surface of a touch screen device operating in the manual mode. As can be seen, the device housing surrounds the touch screen of the device, and the user can interact with the touch screen with fingers or other suitable implement. The example in FIG. 5 a shows the user performing two downward menu navigation gestures in order to access the “adjust volume” menu option. When the “adjust volume” menu option is being read aloud, it may be accompanied by an earcon or other sound effect that presents the current volume level to the user. In some embodiments, a volume icon may also be displayed on the screen to indicate the current volume level. Once the “adjust volume” menu option is read aloud, the user may perform a sideways swipe gesture to the right in order to increase the volume, as shown in the example of FIG. 5 b. The swipe gesture might be a single swipe motion, or a swipe-and-hold motion, in some embodiments. A single swipe gesture might increase the volume by one volume unit, while a swipe-and-hold gesture may continuously increase the volume until the contact point is lifted. In some embodiments, adjusting the volume may be accompanied by an earcon that presents the volume level to the user, or a volume icon that indicates the current volume level.
  • FIGS. 6 a-e collectively illustrate example menu navigation mode functions, in accordance with an embodiment of the present invention. FIG. 6 a shows an example where the “find word” function includes performing a word search through a document for the currently selected word or a word in an application clipboard. In this example, a word is selected using a single finger double tap, thus accessing a menu of actions that may be performed on that word, including “find word,” “define word,” “spell word,” etc. In one such example, when the “find word” option is selected (using a single finger double tap in this example) results from the lookup may be presented in a list with the first item in the list selected. The list may be visually displayed on the device, in some embodiments, and the results may be read aloud along with audio cues, such as “Result 1 of N” where N is the number of lookup results (capped at some rational maximum). The list may be longer than the display screen, in some embodiments, and navigating “off” the list may merely refresh the visual display with the next (or previous) chunk of search results. The user may navigate the list using touch screen gestures, or the list may be read aloud with pauses between each result allowing the user time to select the desired word. In some cases the entire sentence or phrase containing the word may be read aloud in order to provide the context of the search result. Selecting a search result may return the user to the reading mode at the chosen location.
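  • The result-list behavior described above (announcing “Result i of N,” capping N, and refreshing the visible chunk as the user navigates past its edge) can be modeled with a small cursor over the matches. The Kotlin sketch below is illustrative; the cap, chunk size, and result strings are assumptions.

```kotlin
// Sketch of the "find word" result list: results are announced as "Result i of N",
// N is capped, and the visually displayed window is refreshed in chunks as the
// user navigates past its edge.
class SearchResults(matches: List<String>, maxResults: Int = 50, private val pageSize: Int = 5) {
    private val results = matches.take(maxResults)
    private var index = 0

    fun announceCurrent(): String = "Result ${index + 1} of ${results.size}: ${results[index]}"

    fun next(): String { if (index < results.lastIndex) index++; return announceCurrent() }
    fun previous(): String { if (index > 0) index--; return announceCurrent() }

    // The chunk of results that would be visually displayed for the current position.
    fun visibleChunk(): List<String> {
        val start = (index / pageSize) * pageSize
        return results.subList(start, minOf(start + pageSize, results.size))
    }
}

fun main() {
    val results = SearchResults(List(12) { "sentence containing the word (match ${it + 1})" })
    println(results.announceCurrent())
    repeat(5) { println(results.next()) }   // navigating "off" the first chunk
    println(results.visibleChunk())         // display refreshed with the next chunk
}
```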
  • FIG. 6 b shows an example menu navigation function where the “go to page” menu option is selected (using a single finger double tap gesture), in accordance with one embodiment of the present invention. In this example, the user is presented with a phone-pad-style keypad, or other page entry window, that allows the input of page numbers. Each number selected may be added to the display window, and the selection process may be aurally presented and guided using audio cues or earcons, in some embodiments. The user has a delete button to correct mistakes and a “go” button to jump to the selected page. If the entry results in a non-existent page, an appropriate error message or earcon may be aurally presented and the keypad may remain visible with the text field cleared for further input.
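  • The “go to page” keypad described above can be sketched as an entry buffer with digit, delete, and go actions, where a non-existent page triggers an error cue and clears the field. The Kotlin sketch below uses hypothetical names and callbacks for the spoken and earcon feedback.

```kotlin
// Sketch of the "go to page" keypad: digits accumulate in a text field, delete
// removes the last digit, and "go" either jumps to the page or reports an error
// and clears the field when the page does not exist.
class GoToPage(private val pageCount: Int,
               private val speak: (String) -> Unit,
               private val playEarcon: (String) -> Unit) {
    private var entry = StringBuilder()

    fun pressDigit(d: Char) { if (d.isDigit()) { entry.append(d); speak(d.toString()) } }
    fun pressDelete() { if (entry.isNotEmpty()) entry.deleteCharAt(entry.lastIndex) }

    fun pressGo(): Int? {
        val page = entry.toString().toIntOrNull()
        return if (page != null && page in 1..pageCount) {
            speak("Going to page $page"); page
        } else {
            playEarcon("error")              // non-existent page
            speak("Page not found")
            entry = StringBuilder()          // keypad stays up, field cleared
            null
        }
    }
}

fun main() {
    val dialog = GoToPage(pageCount = 250, speak = { println("TTS: $it") },
                          playEarcon = { println("[earcon: $it]") })
    "999".forEach(dialog::pressDigit)
    dialog.pressGo()            // error: page 999 does not exist
    "42".forEach(dialog::pressDigit)
    println(dialog.pressGo())   // 42
}
```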
  • FIG. 6 c shows an example menu navigation function where the “spell word” menu option has been selected, in accordance with one embodiment of the present invention. In this example, the selected word is stated, spelled, and then re-stated, per normal spelling bee rules. In other embodiments, other spelling presentations may be implemented. In this example case, once the word is spelled the device remains in the spelling menu layer such that a subsequent selection gesture (e.g., a double tap gesture) will spell the word again. In some cases, the “spell word” option is used to spell a pre-selected word, however, if no word is currently selected, a word selection may be performed after the “spell word” function begins.
  • FIGS. 6 d-e show an example menu navigation function for adding and deleting notes, in accordance with one embodiment of the present invention. In this example, when the “add note” menu option is selected (e.g., using a single finger double tap gesture), the device displays a text entry window that allows the user to input the content of a note. The user can save the note using a selection gesture, or exit the add note function using a dismiss gesture. In some cases, the user may be aurally prompted to confirm creating a note, e.g., “Double tap to confirm, single tap to return to text entry.” Once the note creation is confirmed, the text entry window may disappear and the device may return to reading content. FIG. 6 e shows an example where the user selects the “Notes” or “Go to Notes” menu option and is presented with a list of all the notes relating to the content currently being presented on the device. The notes may be presented in a list and may be read aloud to the user with audio cues, such as “Note 1 of N” where N is the number of notes. The user can move from note to note, for example, using horizontal flick gestures. The list may also be visually displayed on the device, and the list may be longer than the display screen, in some embodiments. In such cases, navigating “off” the list may merely refresh the visual display with the next (or previous) chunk of notes. A single tap on a note, for example, may jump the user to the note in a note editor window similar to the one shown in FIG. 6 d, and a delete gesture (e.g., a double tap with two contact points) may begin a delete dialog. In some cases, the delete dialog may prompt the user to confirm the deletion, and after deletion the note list may be displayed again until a global dismiss gesture is performed. As discussed above, a global dismiss gesture (e.g., a double tap gesture with two fingers) performed at any point during menu navigation may return the user to a previous menu or back to content reading, in some embodiments. The various gestures described are provided as examples only, and many other selection and/or dismiss gestures will be apparent in light of this disclosure.
  • Methodology
  • FIG. 7 illustrates a method for providing an accessible menu navigation user interface in an electronic touch screen device, in accordance with an embodiment of the present invention. This example methodology may be implemented, for instance, by the UI module of the example touch screen device shown in FIG. 2 a, or the example touch screen device shown in FIG. 2 b (e.g., with the UI provisioned to the client by the server). To this end, the accessible menu navigation mode can be implemented in software, hardware, firmware, or any combination thereof, as will be appreciated in light of this disclosure.
  • As can be seen, the method generally includes sensing a user's input by a touch screen display. As soon as the user begins to swipe, drag or otherwise move a contact point, the UI code (and/or hardware) can assume a swipe gesture has been engaged and track the path of the contact point with respect to any fixed point within the touch screen until the user stops engaging the touch screen surface. The release point can also be captured by the UI as it may be used to commit the action started when the user pressed on the touch sensitive screen. In a similar fashion, if the user releases hold without moving the contact point, a tap or press or press-and-hold command may be assumed depending on the amount of time the user was continually pressing on the touch sensitive screen. These main detections can be used in various ways to implement UI functionality, including an accessible menu navigation mode as variously described herein, as will be appreciated in light of this disclosure.
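  • The distinction drawn above between swipes, taps, and press-and-hold contacts can be sketched as a classification on travel distance and hold time. The thresholds and names in the Kotlin snippet below are assumptions; a given embodiment may tune or replace them.

```kotlin
import kotlin.math.hypot

enum class ContactType { TAP, PRESS_AND_HOLD, SWIPE }

// Classify a completed contact from its travel distance and duration: sufficient
// movement implies a swipe, otherwise the hold time separates a tap from a
// press-and-hold. Thresholds are illustrative assumptions.
fun classifyContact(x0: Float, y0: Float, x1: Float, y1: Float,
                    durationMs: Long,
                    moveThresholdPx: Float = 20f,
                    holdThresholdMs: Long = 500): ContactType {
    val travel = hypot((x1 - x0).toDouble(), (y1 - y0).toDouble())
    return when {
        travel >= moveThresholdPx -> ContactType.SWIPE
        durationMs >= holdThresholdMs -> ContactType.PRESS_AND_HOLD
        else -> ContactType.TAP
    }
}

fun main() {
    println(classifyContact(100f, 100f, 102f, 101f, durationMs = 120))   // TAP
    println(classifyContact(100f, 100f, 101f, 103f, durationMs = 900))   // PRESS_AND_HOLD
    println(classifyContact(100f, 300f, 110f, 120f, durationMs = 200))   // SWIPE
}
```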
  • In this example case, the method includes detecting 701 a menu navigation gesture on the touch sensitive interface. As described above, the gesture contact may be performed in any suitable manner using a stylus, the user's finger, or any other suitable implement, and it may be performed on a touch screen surface, a track pad, acoustic sensor, or other touch sensitive surface. The user contact monitoring is essentially continuous. Once a user contact has been detected, the method may continue with reading 702 the menu option aloud to the user. In one embodiment, an upward swipe may prompt a group of navigation menu options to be read aloud, while a downward swipe may prompt a group of settings options to be read aloud. Other embodiments may read aloud all the menu options, either organized or listed at random, upon performing either an upward or downward swipe gesture. Some embodiments may scroll through the entire menu list after a single menu navigation gesture, with each menu option followed by an earcon or a slight pause, giving the user a chance to select that option. Other embodiments, like the one shown in this example method, may require a separate swipe gesture in order to scroll through each menu option.
  • The method may continue with determining 703 whether the menu option has one or more sub-menus. In one example, a navigation menu may include a “chapters” menu option and this option may include multiple sub-menu options for each chapter in a given book. If sub-menu options are available, the method may continue with determining 704 whether a sub-menu navigation gesture is detected. In one embodiment, if the menu navigation gesture is a vertical swipe gesture, the sub-menu navigation gesture may be a horizontal swipe gesture. If no sub-menu navigation gesture is detected, the method may return to monitoring 701 for another menu navigation gesture. If a sub-menu navigation gesture is detected, the method may read aloud 705 the sub-menu options. The method may continue with determining 706 whether a sub-menu selection gesture is detected, and if none is detected the method may continue with determining 704 whether another sub-menu navigation gesture is detected. If a sub-menu selection gesture is detected, the method may continue with selecting 707 the sub-menu option. In one specific example, a user has just opened a book with an eReader application and performs multiple menu navigation gestures (e.g., upward swipe gestures) in order to access the “chapter” menu. The user then performs three sub-menu navigation gestures (e.g., sideways swipe gestures to the right) in order to access the sub-menu option “chapter 3,” and then performs a sub-menu selection gesture (e.g., a double-tap gesture) in order to access the content of chapter 3.
  • If the menu option read aloud at 702 does not have sub-menu options, the method may continue with determining 708 whether the menu option selection gesture is detected. One such menu option may be the “spell word” menu option, because the option may either be selected or not, and no sub-menu options are available. Another such menu option may include an “adjust volume” or “adjust rate” menu option, wherein the menu option selection gesture includes a value adjustment gesture, such as a horizontal swipe gesture used to adjust the value for that setting or menu option. If a menu selection gesture is detected, the method may continue with performing 709 the menu option function. If no menu selection gesture is detected, the method may continue with abandoning 710 the menu navigation mode. Abandoning the menu navigation mode may occur if no contact is detected after a certain period of time. Furthermore, at any point during the accessible menu navigation mode the mode may be abandoned if the home button is pressed or if some other hard-coded or configurable abandon action is performed. As discussed above, the various menu navigation gestures and selection gestures may be hard-coded or configured by the user, and different gesture configurations performed with one or more contact points may be used for any of the navigation or selection gestures.
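  • The flow of FIG. 7 can be summarized in a compressed Kotlin sketch that replaces the detection steps with a scripted gesture sequence; the step numbers in the comments refer to the figure, while the menu contents and gesture labels are assumptions. The example sequence reproduces the chapter-selection walkthrough above.

```kotlin
// Compressed walk through the method of FIG. 7 with a scripted gesture sequence.
fun runMenuNavigation(gestures: Iterator<String>, speak: (String) -> Unit) {
    val menu = mapOf(
        "chapters" to listOf("chapter 1", "chapter 2", "chapter 3"),   // option with a sub-menu
        "spell word" to emptyList<String>())                           // option without a sub-menu
    var option = ""
    var subIndex = -1
    while (gestures.hasNext()) {
        when (val g = gestures.next()) {                 // step 701: detect a navigation gesture
            "up", "down" -> {                            // step 702: read the menu option aloud
                option = if (g == "up") "chapters" else "spell word"
                subIndex = -1
                speak(option)
            }
            "right" -> {                                 // steps 703-705: sub-menu navigation
                val subs = menu[option].orEmpty()
                if (subIndex + 1 < subs.size) { subIndex++; speak(subs[subIndex]) }
            }
            "double-tap" -> {                            // steps 706-709: select and perform
                val subs = menu[option].orEmpty()
                speak("selected " + if (subIndex >= 0) subs[subIndex] else option)
                return
            }
        }
    }
    speak("menu abandoned")                              // step 710: no further contact detected
}

fun main() {
    val sequence = listOf("up", "right", "right", "right", "double-tap")
    runMenuNavigation(sequence.iterator()) { println("TTS: $it") }
}
```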
  • Numerous variations and embodiments will be apparent in light of this disclosure. One example embodiment of the present invention provides a device including a touch sensitive surface for allowing user input. The device also includes a user interface including an accessible menu navigation mode configured to navigate through menu and sub-menu options in response to one or more navigation gestures, wherein each menu and sub-menu option is aurally presented to the user in response to a corresponding navigation gesture, and wherein the accessible menu navigation mode is further configured to adjust and/or enable menu and sub-menu options in response to a selection gesture. In some cases, the accessible menu navigation mode is further configured to activate a manual reading mode in response to a manual mode activation gesture, wherein the manual reading mode allows the user to adjust and select navigation and settings options. In some cases, the accessible menu navigation mode is further configured to aurally present one or more earcons in response to at least one of: accepting a menu option, cancelling a menu option, entering a manual reading mode, entering an automatic reading mode, passing a content boundary, navigating through a menu, and/or adjusting a settings option. In some cases, the accessible menu navigation mode is configured to navigate through one or more menu options in response to one or more substantially vertical swipe gestures, and to perform at least one of: navigate through one or more sub-menu options, and/or change a value for a menu option in response to one or more substantially horizontal swipe gestures. In some cases, the selection gesture includes at least one of: a vertical swipe gesture, a horizontal swipe gesture, a double-tap gesture, and/or lifting a contact point from the touch sensitive surface. In some cases, the selection gesture is user configurable. In some cases, the accessible menu navigation mode is configured to navigate through one or more content navigation options in response to a first menu navigation gesture, and to navigate through one or more settings options in response to a second menu navigation gesture, the first and second menu navigation gestures having opposite orientations. In some cases, the menu and sub-menu options are user configurable. In some cases, one or more menu and/or sub-menu options allow the user to share selected content over the Internet. In some cases, one or more menu and/or sub-menu options allow the user to create a customized group of people and share selected content with that group. In some cases, the accessible menu navigation mode is configured to abandon if no user contact is detected at the touch sensitive surface after a specific period of time.
  • Another example embodiment of the present invention provides a mobile computing system including a processor and a touch sensitive surface for allowing user input, and a user interface executable on the processor and including an accessible menu navigation mode configured to navigate through menu and sub-menu options in response to one or more navigation gestures, wherein each menu and sub-menu option is aurally presented to the user in response to a corresponding navigation gesture, and wherein the accessible menu navigation mode is further configured to adjust and/or enable menu and sub-menu options in response to a selection gesture. In some cases, the accessible menu navigation mode is further configured to activate a manual reading mode in response to a manual mode activation gesture, wherein the manual reading mode allows the user to adjust and select navigation and settings options. In some cases, the accessible menu navigation mode is further configured to aurally present one or more earcons in response to at least one of: accepting a menu option, cancelling a menu option, entering a manual reading mode, entering an automatic reading mode, passing a content boundary, navigating through a menu, and/or adjusting a settings option. In some cases, the accessible menu navigation mode is configured to navigate through one or more menu options in response to one or more substantially vertical swipe gestures, and to perform at least one of: navigate through one or more sub-menu options, and/or change a value for a menu option in response to one or more substantially horizontal swipe gestures.
  • Another example embodiment of the present invention provides a computer program product including a plurality of instructions non-transiently encoded thereon to facilitate operation of an electronic device according to a process. The computer program product may include one or more computer readable mediums such as, for example, a hard drive, compact disk, memory stick, server, cache memory, register memory, random access memory, read only memory, flash memory, or any suitable non-transitory memory that is encoded with instructions that can be executed by one or more processors, or a plurality or combination of such memories. In this example embodiment, the process is configured to receive at a touch sensitive surface of the electronic device a menu navigation gesture; aurally present a menu option in response to the menu navigation gesture; receive at the touch sensitive surface a selection gesture; and adjust a navigation option and/or settings option in response to the selection gesture. In some cases, the process is further configured to receive at the touch sensitive surface a sub-menu navigation gesture; and aurally present a sub-menu option in response to the sub-menu navigation gesture. In some cases, the process is further configured to receive at the touch sensitive surface a menu option value adjustment gesture; and adjust a menu option value in response to the value adjustment gesture. In some cases, the process is further configured to activate a manual reading mode in response to a manual reading mode activation gesture detected at the touch sensitive surface, wherein the manual reading mode allows the user to adjust and select navigation and settings options. In some cases, the process is further configured to: aurally present one or more earcons in response to at least one of: accepting a menu option, cancelling a menu option, entering a manual reading mode, entering an automatic reading mode, passing a content boundary, navigating through a menu, and/or adjusting a settings option.
  • The foregoing description of the embodiments of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of this disclosure. It is intended that the scope of the invention be limited not by this detailed description, but rather by the claims appended hereto.
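By way of illustration only, the following minimal sketch (in Python) shows one way the menu navigation, value adjustment, selection, and abandonment behavior described above might be organized: substantially vertical swipes step through menu options and each option is announced aloud, substantially horizontal swipes either step through sub-menu options or adjust the value of a settings option, a selection gesture performs the option's function, and the mode is abandoned after a period with no contact. The menu contents, gesture names, timeout value, and the speak and earcon helpers are assumptions made for this sketch and are not taken from the patent; a real implementation would be driven by the device's touch events and text-to-speech engine.

import time

# Hypothetical menu definition: a leaf option carries either an action or an
# adjustable value; a branch option carries sub-menu options. All names,
# gesture strings, and values below are illustrative assumptions.
MENU = [
    {"name": "spell word", "action": lambda: print("[action] spelling current word")},
    {"name": "adjust volume", "value": 5, "min": 0, "max": 10},
    {"name": "adjust rate", "value": 3, "min": 1, "max": 5},
    {"name": "navigation", "sub": [{"name": "next chapter"},
                                   {"name": "previous chapter"}]},
]

ABANDON_TIMEOUT_S = 5.0  # assumed inactivity window before the mode is abandoned


def speak(text):
    """Stand-in for the text-to-speech engine that aurally presents options."""
    print(f"[speak] {text}")


def earcon(name):
    """Stand-in for a short audio cue (earcon), e.g. accept, cancel, menu step."""
    print(f"[earcon] {name}")


class AccessibleMenuNavigator:
    """Toy state machine for an accessible menu navigation mode."""

    def __init__(self, menu):
        self.menu = menu
        self.index = 0          # current top-level menu option
        self.sub_index = None   # current sub-menu option, if any
        self.last_contact = time.monotonic()
        self.active = True
        speak(self.menu[self.index]["name"])  # announce the initial option

    def handle_gesture(self, gesture):
        """Map a recognized gesture name to a navigation or selection action."""
        if not self.active:
            return
        # A real device would abandon the mode from a timer; this sketch simply
        # checks the elapsed time when the next gesture arrives.
        if time.monotonic() - self.last_contact > ABANDON_TIMEOUT_S:
            self.abandon()
            return
        self.last_contact = time.monotonic()
        option = self.menu[self.index]

        if gesture in ("swipe_up", "swipe_down"):
            # Vertical swipes step through the top-level menu options.
            step = 1 if gesture == "swipe_down" else -1
            self.index = (self.index + step) % len(self.menu)
            self.sub_index = None
            earcon("menu_step")
            speak(self.menu[self.index]["name"])
        elif gesture in ("swipe_left", "swipe_right"):
            step = 1 if gesture == "swipe_right" else -1
            if "value" in option:
                # Horizontal swipes adjust the value of a settings option.
                option["value"] = max(option["min"], min(option["max"], option["value"] + step))
                earcon("adjust")
                speak(f'{option["name"]} set to {option["value"]}')
            elif "sub" in option:
                # Horizontal swipes step through sub-menu options.
                subs = option["sub"]
                self.sub_index = 0 if self.sub_index is None else (self.sub_index + step) % len(subs)
                speak(subs[self.sub_index]["name"])
        elif gesture in ("double_tap", "lift"):
            # Selection gesture: perform or accept the current option.
            earcon("accept")
            if "action" in option:
                option["action"]()
            else:
                speak(f'{option["name"]} selected')

    def abandon(self):
        earcon("cancel")
        speak("menu navigation abandoned")
        self.active = False


if __name__ == "__main__":
    nav = AccessibleMenuNavigator(MENU)
    for gesture in ["swipe_down", "swipe_right", "swipe_right", "double_tap"]:
        nav.handle_gesture(gesture)

In this example run, the mode starts on "spell word," a downward swipe announces "adjust volume," two rightward swipes raise its value, and a double-tap accepts the adjusted setting; waiting past the inactivity timeout (or, per the description above, a home-button press or other configurable abandon action) would instead abandon the mode.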

Claims (20)

What is claimed is:
1. A device, comprising:
a touch sensitive surface for allowing user input; and
a user interface including an accessible menu navigation mode configured to navigate through menu and sub-menu options in response to one or more navigation gestures, wherein each menu and sub-menu option is aurally presented to the user in response to a corresponding navigation gesture, and wherein the accessible menu navigation mode is further configured to adjust and/or enable menu and sub-menu options in response to a selection gesture.
2. The device of claim 1 wherein the accessible menu navigation mode is further configured to activate a manual reading mode in response to a manual mode activation gesture, wherein the manual reading mode allows the user to adjust and select navigation and settings options.
3. The device of claim 1 wherein the accessible menu navigation mode is further configured to aurally present one or more earcons in response to at least one of: accepting a menu option, cancelling a menu option, entering a manual reading mode, entering an automatic reading mode, passing a content boundary, navigating through a menu, and/or adjusting a settings option.
4. The device of claim 1 wherein the accessible menu navigation mode is configured to navigate through one or more menu options in response to one or more substantially vertical swipe gestures, and to perform at least one of: navigate through one or more sub-menu options, and/or change a value for a menu option in response to one or more substantially horizontal swipe gestures.
5. The device of claim 1 wherein the selection gesture comprises at least one of: a vertical swipe gesture, a horizontal swipe gesture, a double-tap gesture, and/or lifting a contact point from the touch sensitive surface.
6. The device of claim 1 wherein the selection gesture is user configurable.
7. The device of claim 1 wherein the accessible menu navigation mode is configured to navigate through one or more content navigation options in response to a first menu navigation gesture, and to navigate through one or more settings options in response to a second menu navigation gesture, the first and second menu navigation gestures having opposite orientations.
8. The device of claim 1 wherein the menu and sub-menu options are user configurable.
9. The device of claim 1 wherein one or more menu and/or sub-menu options allow the user to share selected content over the Internet.
10. The device of claim 9 wherein one or more menu and/or sub-menu options allow the user to create a customized group of people and share selected content with that group.
11. The device of claim 1 wherein the accessible menu navigation mode is configured to abandon if no user contact is detected at the touch sensitive surface after a specific period of time.
12. A mobile computing system, comprising:
a processor and a touch sensitive surface for allowing user input; and
a user interface executable on the processor and including an accessible menu navigation mode configured to navigate through menu and sub-menu options in response to one or more navigation gestures, wherein each menu and sub-menu option is aurally presented to the user in response to a corresponding navigation gesture, and wherein the accessible menu navigation mode is further configured to adjust and/or enable menu and sub-menu options in response to a selection gesture.
13. The system of claim 12 wherein the accessible menu navigation mode is further configured to activate a manual reading mode in response to a manual mode activation gesture, wherein the manual reading mode allows the user to adjust and select navigation and settings options.
14. The system of claim 12 wherein the accessible menu navigation mode is further configured to aurally present one or more earcons in response to at least one of: accepting a menu option, cancelling a menu option, entering a manual reading mode, entering an automatic reading mode, passing a content boundary, navigating through a menu, and/or adjusting a settings option.
15. The system of claim 12 wherein the accessible menu navigation mode is configured to navigate through one or more menu options in response to one or more substantially vertical swipe gestures, and to perform at least one of: navigate through one or more sub-menu options, and/or change a value for a menu option in response to one or more substantially horizontal swipe gestures.
16. A computer program product comprising a plurality of instructions non-transiently encoded thereon to facilitate operation of an electronic device according to the following process, the process comprising:
receive at a touch sensitive surface of the electronic device a menu navigation gesture;
aurally present a menu option in response to the menu navigation gesture;
receive at the touch sensitive surface a selection gesture; and
adjust a navigation option and/or settings option in response to the selection gesture.
17. The computer program product of claim 16 wherein the process is further configured to:
receive at the touch sensitive surface a sub-menu navigation gesture; and
aurally present a sub-menu option in response to the sub-menu navigation gesture.
18. The computer program product of claim 16 wherein the process is further configured to:
receive at the touch sensitive surface a menu option value adjustment gesture; and
adjust a menu option value in response to the value adjustment gesture.
19. The computer program product of claim 16 wherein the process is further configured to: activate a manual reading mode in response to a manual reading mode activation gesture detected at the touch sensitive surface, wherein the manual reading mode allows the user to adjust and select navigation and settings options.
20. The computer program product of claim 16 wherein the process is further configured to: aurally present one or more earcons in response to at least one of: accepting a menu option, cancelling a menu option, entering a manual reading mode, entering an automatic reading mode, passing a content boundary, navigating through a menu, and/or adjusting a settings option.
US13/946,530 2012-07-20 2013-07-19 Accessible Menu Navigation Techniques For Electronic Devices Abandoned US20140026101A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/946,530 US20140026101A1 (en) 2012-07-20 2013-07-19 Accessible Menu Navigation Techniques For Electronic Devices

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201261674098P 2012-07-20 2012-07-20
US201261674102P 2012-07-20 2012-07-20
US13/946,530 US20140026101A1 (en) 2012-07-20 2013-07-19 Accessible Menu Navigation Techniques For Electronic Devices

Publications (1)

Publication Number Publication Date
US20140026101A1 true US20140026101A1 (en) 2014-01-23

Family

ID=49947636

Family Applications (3)

Application Number Title Priority Date Filing Date
US13/946,538 Active 2034-07-27 US9658746B2 (en) 2012-07-20 2013-07-19 Accessible reading mode techniques for electronic devices
US13/946,530 Abandoned US20140026101A1 (en) 2012-07-20 2013-07-19 Accessible Menu Navigation Techniques For Electronic Devices
US15/601,153 Active 2034-02-21 US10585563B2 (en) 2012-07-20 2017-05-22 Accessible reading mode techniques for electronic devices

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/946,538 Active 2034-07-27 US9658746B2 (en) 2012-07-20 2013-07-19 Accessible reading mode techniques for electronic devices

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/601,153 Active 2034-02-21 US10585563B2 (en) 2012-07-20 2017-05-22 Accessible reading mode techniques for electronic devices

Country Status (1)

Country Link
US (3) US9658746B2 (en)

Families Citing this family (134)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8677377B2 (en) 2005-09-08 2014-03-18 Apple Inc. Method and apparatus for building an intelligent automated assistant
US9318108B2 (en) 2010-01-18 2016-04-19 Apple Inc. Intelligent automated assistant
US8977255B2 (en) 2007-04-03 2015-03-10 Apple Inc. Method and system for operating a multi-function portable electronic device using voice-activation
US10002189B2 (en) 2007-12-20 2018-06-19 Apple Inc. Method and apparatus for searching using an active ontology
US9330720B2 (en) 2008-01-03 2016-05-03 Apple Inc. Methods and apparatus for altering audio output signals
US8676904B2 (en) 2008-10-02 2014-03-18 Apple Inc. Electronic devices with voice command and contextual data processing capabilities
US10706373B2 (en) 2011-06-03 2020-07-07 Apple Inc. Performing actions associated with task items that represent tasks to perform
US10276170B2 (en) 2010-01-18 2019-04-30 Apple Inc. Intelligent automated assistant
US8682667B2 (en) 2010-02-25 2014-03-25 Apple Inc. User profiling for selecting user specific voice input processing information
US9262612B2 (en) 2011-03-21 2016-02-16 Apple Inc. Device access using voice authentication
US10057736B2 (en) 2011-06-03 2018-08-21 Apple Inc. Active transport based notifications
US10134385B2 (en) 2012-03-02 2018-11-20 Apple Inc. Systems and methods for name pronunciation
US10417037B2 (en) 2012-05-15 2019-09-17 Apple Inc. Systems and methods for integrating third party services with a digital assistant
US9261961B2 (en) 2012-06-07 2016-02-16 Nook Digital, Llc Accessibility aids for users of electronic devices
KR20140019167A (en) * 2012-08-06 2014-02-14 삼성전자주식회사 Method for providing voice guidance function and an electronic device thereof
US9971495B2 (en) 2013-01-28 2018-05-15 Nook Digital, Llc Context based gesture delineation for user interaction in eyes-free mode
KR102380145B1 (en) 2013-02-07 2022-03-29 애플 인크. Voice trigger for a digital assistant
US10652394B2 (en) 2013-03-14 2020-05-12 Apple Inc. System and method for processing voicemail
US10748529B1 (en) 2013-03-15 2020-08-18 Apple Inc. Voice activated device for use with a voice-based digital assistant
WO2014197335A1 (en) 2013-06-08 2014-12-11 Apple Inc. Interpreting and acting upon commands that involve sharing information with remote devices
US10176167B2 (en) 2013-06-09 2019-01-08 Apple Inc. System and method for inferring user intent from speech inputs
CN110442699A (en) 2013-06-09 2019-11-12 苹果公司 Operate method, computer-readable medium, electronic equipment and the system of digital assistants
KR101749009B1 (en) 2013-08-06 2017-06-19 애플 인크. Auto-activating smart responses based on activities from remote devices
JP6144582B2 (en) * 2013-09-13 2017-06-07 Dmg森精機株式会社 NC machine tool operation device
US20150095835A1 (en) * 2013-09-30 2015-04-02 Kobo Incorporated Providing a user specific reader mode on an electronic personal display
US10296160B2 (en) 2013-12-06 2019-05-21 Apple Inc. Method for extracting salient dialog usage from live data
US9965171B2 (en) * 2013-12-12 2018-05-08 Samsung Electronics Co., Ltd. Dynamic application association with hand-written pattern
DE102013021576B4 (en) * 2013-12-19 2024-10-02 Audi Ag Method for selecting a text section on a touch-sensitive screen and display and operating device
US20150227166A1 (en) * 2014-02-13 2015-08-13 Samsung Electronics Co., Ltd. User terminal device and displaying method thereof
US10592095B2 (en) * 2014-05-23 2020-03-17 Apple Inc. Instantaneous speaking of content on touch devices
WO2015184186A1 (en) 2014-05-30 2015-12-03 Apple Inc. Multi-command single utterance input method
US9715875B2 (en) 2014-05-30 2017-07-25 Apple Inc. Reducing the need for manual start/end-pointing and trigger phrases
US9633004B2 (en) 2014-05-30 2017-04-25 Apple Inc. Better resolution when referencing to concepts
US9430463B2 (en) 2014-05-30 2016-08-30 Apple Inc. Exemplar-based natural language processing
US10170123B2 (en) 2014-05-30 2019-01-01 Apple Inc. Intelligent assistant for home automation
US9338493B2 (en) 2014-06-30 2016-05-10 Apple Inc. Intelligent automated assistant for TV user interactions
EP3859728B1 (en) 2014-07-10 2024-02-14 Intelligent Platforms, LLC Apparatus and method for electronic labeling of electronic equipment
CN104199609B (en) * 2014-08-21 2018-02-02 小米科技有限责任公司 The method and device of cursor positioning
US9818400B2 (en) 2014-09-11 2017-11-14 Apple Inc. Method and apparatus for discovering trending terms in speech requests
US9668121B2 (en) 2014-09-30 2017-05-30 Apple Inc. Social reminders
US10074360B2 (en) 2014-09-30 2018-09-11 Apple Inc. Providing an indication of the suitability of speech recognition
US10127911B2 (en) 2014-09-30 2018-11-13 Apple Inc. Speaker identification and unsupervised speaker adaptation techniques
US10235130B2 (en) * 2014-11-06 2019-03-19 Microsoft Technology Licensing, Llc Intent driven command processing
US9922098B2 (en) 2014-11-06 2018-03-20 Microsoft Technology Licensing, Llc Context-based search and relevancy generation
US10152299B2 (en) 2015-03-06 2018-12-11 Apple Inc. Reducing response latency of intelligent automated assistants
US9721566B2 (en) 2015-03-08 2017-08-01 Apple Inc. Competing devices responding to voice triggers
US9886953B2 (en) 2015-03-08 2018-02-06 Apple Inc. Virtual assistant activation
US10126846B2 (en) * 2015-04-09 2018-11-13 Blackberry Limited Portable electronic device including touch-sensitive display and method of controlling selection of information
US10460227B2 (en) 2015-05-15 2019-10-29 Apple Inc. Virtual assistant in a communication session
US10200824B2 (en) 2015-05-27 2019-02-05 Apple Inc. Systems and methods for proactively identifying and surfacing relevant content on a touch-sensitive device
US10083688B2 (en) 2015-05-27 2018-09-25 Apple Inc. Device voice control for selecting a displayed affordance
US9578173B2 (en) 2015-06-05 2017-02-21 Apple Inc. Virtual assistant aided communication with 3rd party service in a communication session
US20160378747A1 (en) 2015-06-29 2016-12-29 Apple Inc. Virtual assistant for media playback
US10740384B2 (en) 2015-09-08 2020-08-11 Apple Inc. Intelligent automated assistant for media search and playback
US10331312B2 (en) 2015-09-08 2019-06-25 Apple Inc. Intelligent automated assistant in a media environment
US10671428B2 (en) 2015-09-08 2020-06-02 Apple Inc. Distributed personal assistant
US10747498B2 (en) 2015-09-08 2020-08-18 Apple Inc. Zero latency digital assistant
US11587559B2 (en) 2015-09-30 2023-02-21 Apple Inc. Intelligent device identification
US10691473B2 (en) 2015-11-06 2020-06-23 Apple Inc. Intelligent automated assistant in a messaging environment
US10956666B2 (en) 2015-11-09 2021-03-23 Apple Inc. Unconventional virtual assistant interactions
US10049668B2 (en) 2015-12-02 2018-08-14 Apple Inc. Applying neural network language models to weighted finite state transducers for automatic speech recognition
US10223066B2 (en) 2015-12-23 2019-03-05 Apple Inc. Proactive assistance based on dialog communication between devices
JP6455466B2 (en) * 2016-03-02 2019-01-23 京セラドキュメントソリューションズ株式会社 Display operation device and program
US10845987B2 (en) * 2016-05-03 2020-11-24 Intelligent Platforms, Llc System and method of using touch interaction based on location of touch on a touch screen
US11079915B2 (en) 2016-05-03 2021-08-03 Intelligent Platforms, Llc System and method of using multiple touch inputs for controller interaction in industrial control systems
US11227589B2 (en) 2016-06-06 2022-01-18 Apple Inc. Intelligent list reading
US10049663B2 (en) 2016-06-08 2018-08-14 Apple, Inc. Intelligent automated assistant for media exploration
US10586535B2 (en) 2016-06-10 2020-03-10 Apple Inc. Intelligent digital assistant in a multi-tasking environment
DK179415B1 (en) 2016-06-11 2018-06-14 Apple Inc Intelligent device arbitration and control
DK201670540A1 (en) 2016-06-11 2018-01-08 Apple Inc Application integration with a digital assistant
US10474753B2 (en) 2016-09-07 2019-11-12 Apple Inc. Language identification using recurrent neural networks
US10043516B2 (en) 2016-09-23 2018-08-07 Apple Inc. Intelligent automated assistant
CN106791155A (en) * 2017-01-03 2017-05-31 努比亚技术有限公司 A kind of volume adjustment device, volume adjusting method and mobile terminal
US11204787B2 (en) 2017-01-09 2021-12-21 Apple Inc. Application integration with a digital assistant
US10417266B2 (en) 2017-05-09 2019-09-17 Apple Inc. Context-aware ranking of intelligent response suggestions
DK201770383A1 (en) 2017-05-09 2018-12-14 Apple Inc. User interface for correcting recognition errors
US10726832B2 (en) 2017-05-11 2020-07-28 Apple Inc. Maintaining privacy of personal information
DK180048B1 (en) 2017-05-11 2020-02-04 Apple Inc. MAINTAINING THE DATA PROTECTION OF PERSONAL INFORMATION
US10395654B2 (en) 2017-05-11 2019-08-27 Apple Inc. Text normalization based on a data-driven learning network
US11301477B2 (en) 2017-05-12 2022-04-12 Apple Inc. Feedback analysis of a digital assistant
DK201770427A1 (en) 2017-05-12 2018-12-20 Apple Inc. Low-latency intelligent automated assistant
DK179745B1 (en) 2017-05-12 2019-05-01 Apple Inc. SYNCHRONIZATION AND TASK DELEGATION OF A DIGITAL ASSISTANT
DK179496B1 (en) 2017-05-12 2019-01-15 Apple Inc. USER-SPECIFIC Acoustic Models
DK201770411A1 (en) 2017-05-15 2018-12-20 Apple Inc. Multi-modal interfaces
DK179560B1 (en) 2017-05-16 2019-02-18 Apple Inc. Far-field extension for digital assistant services
US20180336892A1 (en) 2017-05-16 2018-11-22 Apple Inc. Detecting a trigger of a digital assistant
US10311144B2 (en) 2017-05-16 2019-06-04 Apple Inc. Emoji word sense disambiguation
US20180336275A1 (en) 2017-05-16 2018-11-22 Apple Inc. Intelligent automated assistant for media exploration
US10403278B2 (en) 2017-05-16 2019-09-03 Apple Inc. Methods and systems for phonetic matching in digital assistant services
JP2019008482A (en) * 2017-06-22 2019-01-17 京セラドキュメントソリューションズ株式会社 Braille character tactile sense presentation device and image forming apparatus
US10529315B2 (en) * 2017-08-17 2020-01-07 Wipro Limited System and method for text to speech conversion of an electronic document
CN107728918A (en) * 2017-09-27 2018-02-23 北京三快在线科技有限公司 Browse the method, apparatus and electronic equipment of continuous page
US10733982B2 (en) 2018-01-08 2020-08-04 Apple Inc. Multi-directional dialog
CN108279833A (en) * 2018-01-08 2018-07-13 维沃移动通信有限公司 A kind of reading interactive approach and mobile terminal
US10733375B2 (en) 2018-01-31 2020-08-04 Apple Inc. Knowledge-based framework for improving natural language understanding
US10789959B2 (en) 2018-03-02 2020-09-29 Apple Inc. Training speaker recognition models for digital assistants
US10592604B2 (en) 2018-03-12 2020-03-17 Apple Inc. Inverse text normalization for automatic speech recognition
US10818288B2 (en) 2018-03-26 2020-10-27 Apple Inc. Natural assistant interaction
US10909331B2 (en) 2018-03-30 2021-02-02 Apple Inc. Implicit identification of translation payload with neural machine translation
US11145294B2 (en) 2018-05-07 2021-10-12 Apple Inc. Intelligent automated assistant for delivering content from user experiences
US10928918B2 (en) 2018-05-07 2021-02-23 Apple Inc. Raise to speak
US10984780B2 (en) 2018-05-21 2021-04-20 Apple Inc. Global semantic word embeddings using bi-directional recurrent neural networks
DK201870355A1 (en) 2018-06-01 2019-12-16 Apple Inc. Virtual assistant operation in multi-device environments
US11386266B2 (en) 2018-06-01 2022-07-12 Apple Inc. Text correction
US10892996B2 (en) 2018-06-01 2021-01-12 Apple Inc. Variable latency device coordination
DK180639B1 (en) 2018-06-01 2021-11-04 Apple Inc DISABILITY OF ATTENTION-ATTENTIVE VIRTUAL ASSISTANT
DK179822B1 (en) 2018-06-01 2019-07-12 Apple Inc. Voice interaction at a primary device to access call functionality of a companion device
US10496705B1 (en) 2018-06-03 2019-12-03 Apple Inc. Accelerated task performance
US11010561B2 (en) 2018-09-27 2021-05-18 Apple Inc. Sentiment prediction from textual data
US10839159B2 (en) 2018-09-28 2020-11-17 Apple Inc. Named entity normalization in a spoken dialog system
US11462215B2 (en) 2018-09-28 2022-10-04 Apple Inc. Multi-modal inputs for voice commands
US11170166B2 (en) 2018-09-28 2021-11-09 Apple Inc. Neural typographical error modeling via generative adversarial networks
US11475898B2 (en) 2018-10-26 2022-10-18 Apple Inc. Low-latency multi-speaker speech recognition
US11638059B2 (en) 2019-01-04 2023-04-25 Apple Inc. Content playback on multiple devices
US11348573B2 (en) 2019-03-18 2022-05-31 Apple Inc. Multimodality in digital assistant systems
US11449205B2 (en) * 2019-04-01 2022-09-20 Microsoft Technology Licensing, Llc Status-based reading and authoring assistance
US11423908B2 (en) 2019-05-06 2022-08-23 Apple Inc. Interpreting spoken requests
US11307752B2 (en) 2019-05-06 2022-04-19 Apple Inc. User configurable task triggers
DK201970509A1 (en) 2019-05-06 2021-01-15 Apple Inc Spoken notifications
US11475884B2 (en) 2019-05-06 2022-10-18 Apple Inc. Reducing digital assistant latency when a language is incorrectly determined
US11140099B2 (en) 2019-05-21 2021-10-05 Apple Inc. Providing message response suggestions
DK201970511A1 (en) 2019-05-31 2021-02-15 Apple Inc Voice identification in digital assistant systems
DK180129B1 (en) 2019-05-31 2020-06-02 Apple Inc. User activity shortcut suggestions
US11289073B2 (en) 2019-05-31 2022-03-29 Apple Inc. Device text to speech
US11496600B2 (en) 2019-05-31 2022-11-08 Apple Inc. Remote execution of machine-learned models
US11468890B2 (en) 2019-06-01 2022-10-11 Apple Inc. Methods and user interfaces for voice-based control of electronic devices
US11360641B2 (en) 2019-06-01 2022-06-14 Apple Inc. Increasing the relevance of new available information
US11488406B2 (en) 2019-09-25 2022-11-01 Apple Inc. Text detection using global geometry estimators
US11080463B1 (en) 2020-01-10 2021-08-03 International Business Machines Corporation Scrolling for multi-platforms
US11061543B1 (en) 2020-05-11 2021-07-13 Apple Inc. Providing relevant data items based on context
US11038934B1 (en) 2020-05-11 2021-06-15 Apple Inc. Digital assistant hardware abstraction
US11755276B2 (en) 2020-05-12 2023-09-12 Apple Inc. Reducing description length based on confidence
US11490204B2 (en) 2020-07-20 2022-11-01 Apple Inc. Multi-device audio adjustment coordination
US11438683B2 (en) 2020-07-21 2022-09-06 Apple Inc. User identification using headphones

Family Cites Families (121)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6259438B1 (en) 1998-06-04 2001-07-10 Wacom Co., Ltd. Coordinate input stylus
US4505992A (en) 1983-04-11 1985-03-19 Engelhard Corporation Integral gas seal for fuel cell gas distribution assemblies and method of fabrication
US4896543A (en) 1988-11-15 1990-01-30 Sri International, Inc. Three-axis force measurement stylus
WO1996012222A1 (en) 1994-10-14 1996-04-25 Ast Research, Inc. A system and method for detecting screen hotspots
JP2717774B2 (en) 1995-01-13 1998-02-25 株式会社ワコム Pressure sensitive element and stylus pen with pressure sensitive function
US6115482A (en) * 1996-02-13 2000-09-05 Ascent Technology, Inc. Voice-output reading system with gesture-based navigation
US6334157B1 (en) 1997-03-11 2001-12-25 Microsoft Corporation Programmatically providing direct access to user interface elements of an application program
GB9722766D0 (en) 1997-10-28 1997-12-24 British Telecomm Portable computers
US7840912B2 (en) 2006-01-30 2010-11-23 Apple Inc. Multi-touch gesture dictionary
US20010025289A1 (en) 1998-09-25 2001-09-27 Jenkins Michael D. Wireless pen input device
US6933928B1 (en) 2000-07-18 2005-08-23 Scott E. Lilienthal Electronic book player with audio synchronization
US20020116421A1 (en) 2001-02-17 2002-08-22 Fox Harold L. Method and system for page-like display, formating and processing of computer generated information on networked computers
US7107533B2 (en) 2001-04-09 2006-09-12 International Business Machines Corporation Electronic book with multimode I/O
US7133862B2 (en) 2001-08-13 2006-11-07 Xerox Corporation System with user directed enrichment and import/export control
US9223426B2 (en) * 2010-10-01 2015-12-29 Z124 Repositioning windows in the pop-up window
US20050216839A1 (en) * 2004-03-25 2005-09-29 Keith Salvucci Audio scrubbing
US7649524B2 (en) 2004-07-15 2010-01-19 N-Trig Ltd. Tracking window for a digitizer system
US7620895B2 (en) * 2004-09-08 2009-11-17 Transcensus, Llc Systems and methods for teaching a person to interact with a computer program having a graphical user interface
US8838591B2 (en) 2005-08-23 2014-09-16 Ricoh Co., Ltd. Embedding hot spots in electronic documents
US7898541B2 (en) 2004-12-17 2011-03-01 Palo Alto Research Center Incorporated Systems and methods for turning pages in a three-dimensional electronic document
US7779347B2 (en) 2005-09-02 2010-08-17 Fourteen40, Inc. Systems and methods for collaboratively annotating electronic documents
US7694231B2 (en) 2006-01-05 2010-04-06 Apple Inc. Keyboards for portable electronic devices
US7657849B2 (en) 2005-12-23 2010-02-02 Apple Inc. Unlocking a device by performing gestures on an unlock image
US20080020356A1 (en) 2006-07-24 2008-01-24 Miriam Saba Braille overlay member for a cell phone
US8988357B2 (en) 2006-08-10 2015-03-24 Sony Corporation Stylus activated display/key-lock
US8564544B2 (en) 2006-09-06 2013-10-22 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US8059101B2 (en) 2007-06-22 2011-11-15 Apple Inc. Swipe gestures for touch screen keyboards
US8477095B2 (en) 2007-10-05 2013-07-02 Leapfrog Enterprises, Inc. Audio book for pen-based computer
US8933876B2 (en) * 2010-12-13 2015-01-13 Apple Inc. Three dimensional user interface session control
US20090228798A1 (en) 2008-03-07 2009-09-10 Tandem Readers, Llc Synchronized display of media and recording of audio across a network
US8659555B2 (en) 2008-06-24 2014-02-25 Nokia Corporation Method and apparatus for executing a feature using a tactile cue
US8239201B2 (en) 2008-09-13 2012-08-07 At&T Intellectual Property I, L.P. System and method for audibly presenting selected text
US20100100854A1 (en) 2008-10-16 2010-04-22 Dell Products L.P. Gesture operation input system
US8346557B2 (en) 2009-01-15 2013-01-01 K-Nfb Reading Technology, Inc. Systems and methods document narration
US8631354B2 (en) 2009-03-06 2014-01-14 Microsoft Corporation Focal-control user interface
US8274536B2 (en) 2009-03-16 2012-09-25 Apple Inc. Smart keyboard management for a multifunction device with a touch screen display
US20100259482A1 (en) 2009-04-10 2010-10-14 Microsoft Corporation Keyboard gesturing
US9886936B2 (en) 2009-05-14 2018-02-06 Amazon Technologies, Inc. Presenting panels and sub-panels of a document
US20100295782A1 (en) 2009-05-21 2010-11-25 Yehuda Binder System and method for control based on face ore hand gesture detection
KR101597553B1 (en) 2009-05-25 2016-02-25 엘지전자 주식회사 Function execution method and apparatus thereof
US8416065B2 (en) 2009-06-30 2013-04-09 Research In Motion Limited Overlay for electronic device and method of identifying same
US8451238B2 (en) 2009-09-02 2013-05-28 Amazon Technologies, Inc. Touch-screen user interface
US8766928B2 (en) 2009-09-25 2014-07-01 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US20110167350A1 (en) 2010-01-06 2011-07-07 Apple Inc. Assist Features For Content Display Device
WO2011085386A2 (en) 2010-01-11 2011-07-14 Apple Inc. Electronic text manipulation and display
US20110191675A1 (en) 2010-02-01 2011-08-04 Nokia Corporation Sliding input user interface
US9361018B2 (en) 2010-03-01 2016-06-07 Blackberry Limited Method of providing tactile feedback and apparatus
US8704783B2 (en) * 2010-03-24 2014-04-22 Microsoft Corporation Easy word selection and selection ahead of finger
EP2369443B1 (en) 2010-03-25 2017-01-11 BlackBerry Limited System and method for gesture detection and feedback
EP2381351B1 (en) 2010-04-20 2017-12-06 BlackBerry Limited Portable electronic device having touch-sensitive display with a variable repeat control mode.
US20110273379A1 (en) 2010-05-05 2011-11-10 Google Inc. Directional pad on touchscreen
US20110283241A1 (en) 2010-05-14 2011-11-17 Google Inc. Touch Gesture Actions From A Device's Lock Screen
US8451240B2 (en) 2010-06-11 2013-05-28 Research In Motion Limited Electronic device and method of providing tactile feedback
US8552999B2 (en) 2010-06-14 2013-10-08 Apple Inc. Control selection approximation
US10976784B2 (en) 2010-07-01 2021-04-13 Cox Communications, Inc. Mobile device user interface change based on motion
USD668674S1 (en) 2010-07-26 2012-10-09 Apple Inc. Display screen or portion thereof with icon
KR101058268B1 (en) 2010-08-03 2011-08-22 안명환 Mobile terminal with non-reading part
US8452600B2 (en) * 2010-08-18 2013-05-28 Apple Inc. Assisted reader
TWI564757B (en) 2010-08-31 2017-01-01 萬國商業機器公司 Computer device with touch screen, method, and computer readable medium for operating the same
US8754858B2 (en) 2010-09-07 2014-06-17 STMicroelectronics Aisa Pacific Pte Method to parameterize and recognize circular gestures on touch sensitive surfaces
US8963836B2 (en) 2010-09-17 2015-02-24 Tencent Technology (Shenzhen) Company Limited Method and system for gesture-based human-machine interaction and computer-readable medium thereof
US9678572B2 (en) * 2010-10-01 2017-06-13 Samsung Electronics Co., Ltd. Apparatus and method for turning e-book pages in portable terminal
US20120110517A1 (en) 2010-10-29 2012-05-03 Honeywell International Inc. Method and apparatus for gesture recognition
US8533623B2 (en) * 2010-11-17 2013-09-10 Xerox Corporation Interface that allows a user to riffle through pages of an electronic document
US9639178B2 (en) 2010-11-19 2017-05-02 Apple Inc. Optical stylus
KR101787750B1 (en) 2010-12-01 2017-10-19 삼성전자주식회사 Capacitive stylus pen
US9552015B2 (en) * 2011-01-24 2017-01-24 Apple Inc. Device, method, and graphical user interface for navigating through an electronic document
US10365819B2 (en) * 2011-01-24 2019-07-30 Apple Inc. Device, method, and graphical user interface for displaying a character input user interface
US9645986B2 (en) 2011-02-24 2017-05-09 Google Inc. Method, medium, and system for creating an electronic book with an umbrella policy
US9134899B2 (en) 2011-03-14 2015-09-15 Microsoft Technology Licensing, Llc Touch gesture indicating a scroll on a touch-sensitive display in a single direction
WO2012125989A2 (en) * 2011-03-17 2012-09-20 Laubach Kevin Touch enhanced interface
EP2686755B1 (en) * 2011-03-17 2020-10-14 Laubach, Kevin Input device enhanced interface
US20120242584A1 (en) 2011-03-22 2012-09-27 Nokia Corporation Method and apparatus for providing sight independent activity reports responsive to a touch gesture
US8922489B2 (en) 2011-03-24 2014-12-30 Microsoft Corporation Text input using key and gesture information
US20120272144A1 (en) 2011-04-20 2012-10-25 Microsoft Corporation Compact control menu for touch-enabled command execution
US20130055141A1 (en) 2011-04-28 2013-02-28 Sony Network Entertainment International Llc User interface for accessing books
US20120280947A1 (en) 2011-05-06 2012-11-08 3M Innovative Properties Company Stylus with pressure sensitive membrane
WO2012159254A1 (en) 2011-05-23 2012-11-29 Microsoft Corporation Invisible control
US20120299853A1 (en) 2011-05-26 2012-11-29 Sumit Dagar Haptic interface
US8826190B2 (en) 2011-05-27 2014-09-02 Google Inc. Moving a graphical selector
US8508494B2 (en) 2011-06-01 2013-08-13 Motorola Mobility Llc Using pressure differences with a touch-sensitive display screen
DE112011105305T5 (en) * 2011-06-03 2014-03-13 Google, Inc. Gestures for text selection
WO2012170745A2 (en) 2011-06-07 2012-12-13 Lozinski Christopher Touch typing on a touch screen device
US20120324355A1 (en) 2011-06-20 2012-12-20 Sumbola, Inc. Synchronized reading in a web-based reading system
US20120329529A1 (en) 2011-06-21 2012-12-27 GreatCall, Inc. Gesture activate help process and system
US8913019B2 (en) 2011-07-14 2014-12-16 Microsoft Corporation Multi-finger detection and component resolution
CN103782342B (en) 2011-07-26 2016-08-31 布克查克控股有限公司 The sound channel of e-text
US9256361B2 (en) 2011-08-03 2016-02-09 Ebay Inc. Control of search results with multipoint pinch gestures
WO2013024479A1 (en) 2011-08-17 2013-02-21 Project Ray Ltd. Interface layer and operating system facilitating use, including by blind and visually-impaired users, of touch-screen-controlled consumer electronic devices
BR112014003925A2 (en) 2011-08-19 2017-02-21 Apple Inc digital book content authoring method and digital book content authoring system
AU2012101185B4 (en) 2011-08-19 2013-05-02 Apple Inc. Creating and viewing digital note cards
US8976128B2 (en) 2011-09-12 2015-03-10 Google Technology Holdings LLC Using pressure differences with a touch-sensitive display screen
US20130076654A1 (en) 2011-09-27 2013-03-28 Imerj LLC Handset states and state diagrams: open, closed transitional and easel
US8286104B1 (en) 2011-10-06 2012-10-09 Google Inc. Input method application for a touch-sensitive user interface
US20130159939A1 (en) 2011-10-12 2013-06-20 Qualcomm Incorporated Authenticated gesture recognition
KR101799408B1 (en) 2011-11-03 2017-11-20 삼성전자주식회사 Apparatus and method for controlling controllable device in portable terminal
KR20130050607A (en) 2011-11-08 2013-05-16 삼성전자주식회사 Method and apparatus for managing reading in device
KR20130052151A (en) 2011-11-11 2013-05-22 삼성전자주식회사 Data input method and device in portable terminal having touchscreen
US9031493B2 (en) 2011-11-18 2015-05-12 Google Inc. Custom narration of electronic books
CN102520847A (en) 2011-11-25 2012-06-27 鸿富锦精密工业(深圳)有限公司 Electronic reading device and page processing method thereof
JP6194167B2 (en) 2011-11-25 2017-09-06 京セラ株式会社 Apparatus, method, and program
JP6159078B2 (en) 2011-11-28 2017-07-05 京セラ株式会社 Apparatus, method, and program
US20130151955A1 (en) 2011-12-09 2013-06-13 Mechell Williams Physical effects for electronic books
JP2015509225A (en) 2011-12-16 2015-03-26 ティ−タッチ・インターナショナル・ソシエテ・ア・レスポンサビリテ・リミテT−Touch International S.a.r.l. Touch-sensitive data carrier and method
US8860763B2 (en) 2012-01-31 2014-10-14 Xerox Corporation Reversible user interface component
US9117195B2 (en) 2012-02-13 2015-08-25 Google Inc. Synchronized consumption modes for e-books
US9817568B2 (en) 2012-02-29 2017-11-14 Blackberry Limited System and method for controlling an electronic device
CN103294657B (en) 2012-03-02 2017-10-27 富泰华工业(深圳)有限公司 Method for editing text and system
US20130268826A1 (en) 2012-04-06 2013-10-10 Google Inc. Synchronizing progress in audio and text versions of electronic books
US9261961B2 (en) 2012-06-07 2016-02-16 Nook Digital, Llc Accessibility aids for users of electronic devices
US9542665B2 (en) 2012-10-25 2017-01-10 Interactive Intelligence Group, Inc. Methods for creating, arranging, and leveraging an ad-hoc collection of heterogeneous organization components
US20140127667A1 (en) 2012-11-05 2014-05-08 Marco Iannacone Learning system
US9075462B2 (en) 2012-12-10 2015-07-07 Sap Se Finger-specific input on touchscreen devices
US9781223B2 (en) 2012-12-28 2017-10-03 Facebook, Inc. Conserving battery and data usage
US10249007B2 (en) 2012-12-28 2019-04-02 Facebook, Inc. Social cover feed interface
US10649607B2 (en) 2012-12-28 2020-05-12 Facebook, Inc. Re-ranking story content
US10761672B2 (en) 2012-12-28 2020-09-01 Facebook, Inc. Socialized dash
US20140210729A1 (en) 2013-01-28 2014-07-31 Barnesandnoble.Com Llc Gesture based user interface for use in an eyes-free mode
US9971495B2 (en) 2013-01-28 2018-05-15 Nook Digital, Llc Context based gesture delineation for user interaction in eyes-free mode
US20140215339A1 (en) 2013-01-28 2014-07-31 Barnesandnoble.Com Llc Content navigation and selection in an eyes-free mode
US9615231B2 (en) 2013-06-04 2017-04-04 Sony Corporation Configuring user interface (UI) based on context

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060067577A1 (en) * 2004-03-17 2006-03-30 James Marggraff Method and system for implementing a user interface for a device employing written graphical elements
US20100281374A1 (en) * 2009-04-30 2010-11-04 Egan Schulz Scrollable menus and toolbars
US20110310010A1 (en) * 2010-06-17 2011-12-22 Primesense Ltd. Gesture based user interface
US20120154294A1 (en) * 2010-12-17 2012-06-21 Microsoft Corporation Using movement of a computing device to enhance interpretation of input events produced when interacting with the computing device
US20130132904A1 (en) * 2011-11-22 2013-05-23 Backplane, Inc. Content sharing application utilizing radially-distributed menus
US20130145244A1 (en) * 2011-12-05 2013-06-06 Microsoft Corporation Quick analysis tool for spreadsheet application programs
US20130198690A1 (en) * 2012-02-01 2013-08-01 Microsoft Corporation Visual indication of graphical user interface relationship

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140250407A1 (en) * 2013-03-04 2014-09-04 Tencent Technology (Shenzhen) Company Limited Method, apparatus and computer readable storage medium for displaying sidebar information
US20140325402A1 (en) * 2013-04-26 2014-10-30 Samsung Electronics Co., Ltd. User terminal device with pen and controlling method thereof
US11340759B2 (en) 2013-04-26 2022-05-24 Samsung Electronics Co., Ltd. User terminal device with pen and controlling method thereof
US10289268B2 (en) * 2013-04-26 2019-05-14 Samsung Electronics Co., Ltd. User terminal device with pen and controlling method thereof
US10025465B2 (en) * 2013-10-28 2018-07-17 Rakuten Kobo Inc. Method and system for a user selected zoom level for optimal content display screen rendering
US20150121213A1 (en) * 2013-10-28 2015-04-30 Kobo Incorporated Method and system for a user selected zoom level for optimal content display screen rendering
US20150227269A1 (en) * 2014-02-07 2015-08-13 Charles J. Kulas Fast response graphical user interface
US20150341900A1 (en) * 2014-05-23 2015-11-26 Samsung Electronics Co., Ltd. Wearable device and method of setting reception of notification message therein
US9727182B2 (en) 2014-07-18 2017-08-08 Google Technology Holdings LLC Wearable haptic and touch communication device
US9965036B2 (en) 2014-07-18 2018-05-08 Google Technology Holdings LLC Haptic guides for a touch-sensitive display
US10121335B2 (en) * 2014-07-18 2018-11-06 Google Technology Holdings LLC Wearable haptic device for the visually impaired
US20160055138A1 (en) * 2014-08-25 2016-02-25 International Business Machines Corporation Document order redefinition for assistive technologies
US10203865B2 (en) * 2014-08-25 2019-02-12 International Business Machines Corporation Document content reordering for assistive technologies by connecting traced paths through the content
US11883101B2 (en) * 2015-03-10 2024-01-30 Eyefree Assisting Communication Ltd. System and method for enabling communication through eye feedback
US20200022577A1 (en) * 2015-03-10 2020-01-23 Eyefree Assisting Communication Ltd. System and method for enabling communication through eye feedback
US9250712B1 (en) * 2015-03-20 2016-02-02 Hand Held Products, Inc. Method and application for scanning a barcode with a smart device while continuously running and displaying an application on the smart device display
US11237635B2 (en) 2017-04-26 2022-02-01 Cognixion Nonverbal multi-input and feedback devices for user intended computer control and communication of text, graphics and audio
US11977682B2 (en) 2017-04-26 2024-05-07 Cognixion Corporation Nonverbal multi-input and feedback devices for user intended computer control and communication of text, graphics and audio
US11561616B2 (en) 2017-04-26 2023-01-24 Cognixion Corporation Nonverbal multi-input and feedback devices for user intended computer control and communication of text, graphics and audio
US11402909B2 (en) 2017-04-26 2022-08-02 Cognixion Brain computer interface for augmented reality
US11762467B2 (en) 2017-04-26 2023-09-19 Cognixion Corporation Nonverbal multi-input and feedback devices for user intended computer control and communication of text, graphics and audio
US10691329B2 (en) * 2017-06-19 2020-06-23 Simple Design Ltd. User interface of media player application for controlling media content display
US20190129576A1 (en) * 2017-10-27 2019-05-02 Lenovo Enterprise Solutions (Singapore) Pte. Ltd. Processing of corresponding menu items in response to receiving selection of an item from the respective menu
US11463507B1 (en) 2019-04-22 2022-10-04 Audible, Inc. Systems for generating captions for audio content
US11347379B1 (en) * 2019-04-22 2022-05-31 Audible, Inc. Captions for audio content
US11385789B1 (en) * 2019-07-23 2022-07-12 Facebook Technologies, Llc Systems and methods for interacting with displayed items

Also Published As

Publication number Publication date
US20140026055A1 (en) 2014-01-23
US9658746B2 (en) 2017-05-23
US20170255353A1 (en) 2017-09-07
US10585563B2 (en) 2020-03-10

Similar Documents

Publication Publication Date Title
US10585563B2 (en) Accessible reading mode techniques for electronic devices
US11204687B2 (en) Visual thumbnail, scrubber for digital content
US11126346B2 (en) Digital flash card techniques
US11320931B2 (en) Swipe-based confirmation for touch sensitive devices
US9448643B2 (en) Stylus sensitive device with stylus angle detection functionality
US20180239512A1 (en) Context based gesture delineation for user interaction in eyes-free mode
US9367208B2 (en) Move icon to reveal textual information
US9261985B2 (en) Stylus-based touch-sensitive area for UI control of computing device
US9400601B2 (en) Techniques for paging through digital content on touch screen devices
US9766723B2 (en) Stylus sensitive device with hover over stylus control functionality
US9785259B2 (en) Stylus-based slider functionality for UI control of computing device
US9588979B2 (en) UI techniques for navigating a file manager of an electronic computing device
US20140218343A1 (en) Stylus sensitive device with hover over stylus gesture functionality
US9423932B2 (en) Zoom view mode for digital content including multiple regions of interest
US9134903B2 (en) Content selecting technique for touch screen UI
US20140173483A1 (en) Drag-based content selection technique for touch screen ui
US20140168076A1 (en) Touch sensitive device with concentration mode
US20150185982A1 (en) Content flagging techniques for digital content
US20140210729A1 (en) Gesture based user interface for use in an eyes-free mode
US20150185981A1 (en) User interface for navigating paginated digital content
US20140215339A1 (en) Content navigation and selection in an eyes-free mode

Legal Events

Date Code Title Description
AS Assignment

Owner name: BARNESANDNOBLE.COM LLC, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PALLAKOFF, MATTHEW;COHN, HAROLD E.;SIGNING DATES FROM 20130724 TO 20130805;REEL/FRAME:030942/0823

AS Assignment

Owner name: NOOK DIGITAL, LLC, NEW YORK

Free format text: CHANGE OF NAME;ASSIGNOR:NOOK DIGITAL LLC;REEL/FRAME:035187/0476

Effective date: 20150303

Owner name: NOOK DIGITAL LLC, NEW YORK

Free format text: CHANGE OF NAME;ASSIGNOR:BARNESANDNOBLE.COM LLC;REEL/FRAME:035187/0469

Effective date: 20150225

AS Assignment

Owner name: NOOK DIGITAL, LLC, NEW YORK

Free format text: CORRECTIVE ASSIGNMENT TO REMOVE APPLICATION NUMBERS 13924129 AND 13924362 PREVIOUSLY RECORDED ON REEL 035187 FRAME 0476. ASSIGNOR(S) HEREBY CONFIRMS THE CHANGE OF NAME;ASSIGNOR:NOOK DIGITAL LLC;REEL/FRAME:036131/0801

Effective date: 20150303

Owner name: NOOK DIGITAL LLC, NEW YORK

Free format text: CORRECTIVE ASSIGNMENT TO REMOVE APPLICATION NUMBERS 13924129 AND 13924362 PREVIOUSLY RECORDED ON REEL 035187 FRAME 0469. ASSIGNOR(S) HEREBY CONFIRMS THE CHANGE OF NAME;ASSIGNOR:BARNESANDNOBLE.COM LLC;REEL/FRAME:036131/0409

Effective date: 20150225

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION