
WO2025024055A1 - Simplified user interfaces - Google Patents

Simplified user interfaces

Info

Publication number
WO2025024055A1
Authority
WO
WIPO (PCT)
Prior art keywords
user interface
mode
computing device
user
processors
Application number
PCT/US2024/032647
Other languages
French (fr)
Inventor
Qian Zhang
Bingying Xia
Lingeng WANG
Original Assignee
Google Llc
Application filed by Google Llc
Publication of WO2025024055A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • Smartphones are portable computing devices that integrate mobile telephone and computing functions. Smartphones typically have a display, a keyboard, and other features such as a camera, Internet access, and email. Smartphone applications, also known as mobile apps, are software programs designed to run on smartphones and provide various functionalities and services. Smartphone user interfaces allow users to interact with smartphone devices and apps, and provide for touch-based input.
  • Aspects of this disclosure are directed to techniques that enable a computing device to switch between operating in a standard user interface mode and a simplified user interface mode. While operating in the simplified user interface mode, the computing device may produce a simplified version of the user interface rather than a standard user interface. In generating the simplified version of the user interface, the computing device may identify particular user interface elements (e.g., media control buttons, page navigation buttons, volume control buttons, save, delete, undo, and print buttons, search fields, help buttons, etc.) and convert those identified user interface elements into a smaller set of one or more user interface elements that may be more visually prominent and/or more easily located within the user interface.
  • To generate the simplified user interface, the computing device may identify multiple standard selectable user interface elements within the standard graphical user interface and generate a unitary selectable user interface element that replaces them.
  • The unitary selectable user interface element may allow access to a menu of selectable functionalities, services, or other actions associated with the multiple standard selectable user interface elements, which may simplify the visual presentation of the user interface.
  • The menu may include text descriptions of the actions; a minimal sketch of this pattern follows.
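  • To make the pattern concrete, the following is a minimal Kotlin sketch, assuming an Android app, of how several standard controls could be collapsed into one unitary element backed by a menu of text-labeled actions. The `Action` class, `showSimplifiedMenu` function, and all other identifiers are illustrative names, not from the patent. A unitary button would simply call `showSimplifiedMenu(button, actions)` from its click listener, so the ten media controls of FIG. 1A reduce to one visually prominent element.

```kotlin
// Hypothetical sketch: several standard controls collapsed into one
// unitary element backed by a text-labeled menu. All names are invented.
import android.view.View
import android.widget.PopupMenu

// Each action pairs a text description with the callback the replaced
// selectable element would have invoked.
data class Action(val id: Int, val label: String, val perform: () -> Unit)

fun showSimplifiedMenu(anchor: View, actions: List<Action>) {
    val popup = PopupMenu(anchor.context, anchor)
    // One menu entry per replaced element, identified by text rather
    // than by an icon alone.
    actions.forEach { popup.menu.add(0, it.id, 0, it.label) }
    popup.setOnMenuItemClickListener { item ->
        actions.first { it.id == item.itemId }.perform()
        true
    }
    popup.show()
}
```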
  • While operating in the simplified user interface mode, the computing device may also modify how a user navigates an active application. In the standard user interface, there may not be any graphical indication of how to navigate the user interface, for example, how to scroll horizontally to display additional content.
  • In the simplified user interface mode, the computing device may include arrow buttons that provide a distinct visual indication that additional content is available and enable users to scroll horizontally by tapping on the arrow buttons.
  • the computing device may also replace other gesture functionality of the standard user interface in the simplified user interface mode with user interface elements, such as buttons.
  • The replaced gesture functionality may include vertical scrolling gesture functionality, back gesture functionality, dismiss gesture functionality, and gesture functionality used to display applications; a generic replacement helper is sketched below.
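  • As a generic illustration of gesture replacement, the helper below, again a hedged Kotlin/Android sketch with invented names, makes a previously hidden button visible and routes its clicks to the action the removed gesture used to trigger. For instance, something like `bindGestureReplacement(backButton, "Back") { activity.onBackPressed() }` could stand in for a back gesture.

```kotlin
// Hedged sketch: in the simple mode a visible button stands in for a
// gesture (back, dismiss, scroll, show apps). Names are illustrative.
import android.view.View
import android.widget.Button

fun bindGestureReplacement(button: Button, visibleHint: String, action: () -> Unit) {
    button.text = visibleHint          // distinct visual indication of the action
    button.visibility = View.VISIBLE   // shown only while in the simple mode
    button.setOnClickListener { action() }
}
```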
  • The techniques described herein relate to a method comprising: outputting, by one or more processors of a computing device operating in a first user interface mode, a first user interface of an application, the first user interface including a plurality of selectable user interface elements each of which is positioned at a respective location of the first user interface and each of which corresponds to an action; transitioning, by the one or more processors, from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; and while operating in the second user interface mode: outputting, by the one or more processors, a second user interface of the application, the second user interface including a unitary selectable user interface element that replaces the plurality of selectable user interface elements; detecting, by the one or more processors, a first user input selecting the unitary selectable user interface element; responsive to detecting the first user input, displaying a menu of the actions corresponding to the plurality of selectable user interface elements; detecting, by the one or more processors, a second user input selecting one of the actions from the menu of the actions; and responsive to detecting the second user input, performing the selected action.
  • The techniques described herein relate to a computing device comprising memory; and one or more processors communicably coupled to the memory and configured to output, in a first user interface mode, a first user interface of an application, the first user interface including a plurality of selectable user interface elements each of which is positioned at a respective location of the first user interface and each of which corresponds to an action; transition from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; and while operating in the second user interface mode: output a second user interface of the application, the second user interface including a unitary selectable user interface element that replaces the plurality of selectable user interface elements; detect a first user input selecting the unitary selectable user interface element; responsive to detecting the first user input, displaying a menu of the actions corresponding to the plurality of selectable user interface elements; detect a second user input selecting one of the actions from the menu of the actions; and responsive to detecting the second user input, perform the selected action.
  • The techniques described herein relate to a computer-readable storage medium having stored thereon instructions that, when executed, cause one or more processors of a computing device to: output, in a first user interface mode, a first user interface of an application, the first user interface including a plurality of selectable user interface elements each of which is positioned at a respective location of the first user interface and each of which corresponds to an action; transition from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; and while operating in the second user interface mode: output a second user interface of the application, the second user interface including a unitary selectable user interface element that replaces the plurality of selectable user interface elements; detect a first user input selecting the unitary selectable user interface element; responsive to detecting the first user input, displaying a menu of the actions corresponding to the plurality of selectable user interface elements; detect a second user input selecting one of the actions from the menu of the actions; and responsive to detecting the second user input, perform the selected action.
  • The techniques described herein relate to a computing device including means for outputting, by a computing device operating in a first user interface mode, a first user interface of an application, the first user interface including a plurality of selectable user interface elements each of which is positioned at a respective location of the first user interface and each of which corresponds to an action; means for transitioning from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; and while operating in the second user interface mode: means for outputting a second user interface of the application, the second user interface including a unitary selectable user interface element that replaces the plurality of selectable user interface elements; means for detecting a first user input selecting the unitary selectable user interface element; responsive to detecting the first user input, means for displaying a menu of the actions corresponding to the plurality of selectable user interface elements; means for detecting a second user input selecting one of the actions from the menu of the actions; and responsive to detecting the second user input, means for performing the selected action.
  • FIG. 1A is a conceptual diagram illustrating an example computing device that may operate in a simplified user interface mode in accordance with one or more aspects of the present disclosure.
  • FIG. 1B is a conceptual diagram illustrating an example computing device that may operate in a simplified user interface mode in accordance with one or more aspects of the present disclosure.
  • FIG. 2 is a block diagram illustrating an example computing device, in accordance with one or more aspects of the present disclosure.
  • FIG. 3 is a block diagram illustrating a user interface for selecting a simple mode, in accordance with one or more aspects of the present disclosure.
  • FIG. 4 is a flowchart illustrating example operations of an example computing device, in accordance with one or more aspects of the present disclosure.
  • FIG. 5 is a flowchart illustrating example operations of an example computing device, in accordance with one or more aspects of the present disclosure.
  • FIG. 1A is a conceptual diagram illustrating an example computing device 102 that may operate in a simplified user interface mode in accordance with one or more aspects of the present disclosure.
  • Computing device 102 may be an individual mobile or non-mobile computing device.
  • Examples of computing device 102 include a mobile phone, a tablet computer, a laptop computer, a desktop computer, a server, a mainframe, a set-top box, a television, a wearable device (e.g., a computerized watch, computerized eyewear, computerized headphones, computerized gloves, etc.), a home automation device or system (e.g., an intelligent thermostat or home assistant device), a gaming system, a media player, an e-book reader, a mobile television platform, an automobile navigation or infotainment system, or any other type of mobile, non-mobile, wearable, and non-wearable computing device.
  • Computing device 102 includes a user interface device (UID) 104.
  • UID 104 of computing device 102 may function as an input device for computing device 102 and as an output device for computing device 102.
  • UID 104 may be implemented using various technologies. For instance, UID 104 may function as an input device using a presence-sensitive screen, such as a resistive touchscreen, a surface acoustic wave touchscreen, a capacitive touchscreen, a projective capacitive touchscreen, a pressure sensitive screen, an acoustic pulse recognition touchscreen, or another presence-sensitive display technology. In some examples, UID 104 may function as an input device using one or more audio input devices, such as one or more microphones.
  • UID 104 may function as an output (e.g., display) device using any one or more display devices, such as a liquid crystal display (LCD), dot matrix display, light emitting diode (LED) display, microLED display, organic light-emitting diode (OLED) display, e-ink, or similar monochrome or color display capable of outputting visible information to a user of computing device 102.
  • UID 104 may function as an audio output device and may include one or more speakers, one or more headsets, or any other audio output device capable of outputting audible information to a user of computing device 102.
  • UID 104 of computing device 102 may include a presence-sensitive display that may receive tactile input from a user of computing device 102.
  • UID 104 may receive indications of the tactile input by detecting one or more gestures from a user of computing device 102 (e.g., the user touching or pointing to one or more locations of UID 104 with a finger or a stylus pen).
  • UID 104 may present output to a user, for instance at a presence-sensitive display.
  • UID 104 may present the output as a user interface (e.g., user interfaces 114A, 114B, 121A, and 121B), which may be associated with functionality provided by computing device 102.
  • UID 104 may present various user interfaces of components of a computing platform, operating system, applications (e.g., applications 112), or services executing at or accessible by computing device 102 (e.g., an electronic message application, an Internet browser application, a mobile operating system, etc.). A user may interact with a respective user interface to cause computing device 102 to perform operations relating to a function.
  • Computing device 102 may include a user interface module 106 (“UI module 106”) and mode selection module 108. Modules 106 and 108 may perform operations described herein using software, hardware, firmware, or a mixture of hardware, software, and firmware residing in and executing on computing device 102 or at one or more other remote computing devices.
  • modules 106 and 108 may be implemented as hardware, software, and/or a combination of hardware and software.
  • Computing device 102 may execute modules 106 and 108 with one or more processors.
  • Computing device 102 may execute any of modules 106 and 108 as or within a virtual machine executing on underlying hardware.
  • Modules 106 and 108 may be implemented in various ways. For example, any of modules 106 and 108 may be implemented as a downloadable or pre-installed application or “app.” In another example, any of modules 106 and 108 may be implemented as part of an operating system of computing device 102.
  • Other examples of computing device 102 that implement techniques of this disclosure may include additional components not shown in FIG. 1A.
  • UI module 106 may interpret inputs detected at UID 104. UI module 106 may relay information about the inputs detected at UID 104 to one or more associated platforms, operating systems, applications, and/or services executing at computing device 102 to cause computing device 102 to perform a function. UI module 106 may also receive information and instructions from one or more associated platforms, operating systems, applications, and/or services executing at computing device 102 (e.g., applications 112) for generating a GUI.
  • UI module 106 may act as an intermediary between the one or more associated platforms, operating systems, applications, and/or services executing at computing device 102 and various output devices of computing device 102 (e.g., speakers, LED indicators, vibrators, etc.) to produce output (e.g., graphical, audible, tactile, etc.) with computing device 102.
  • one of applications 112 may send user interface data to UI module 106.
  • UI module 106 may output instructions and information to UID 104 that cause UID 104 to display a user interface according to the information received from the application.
  • When handling input detected by UID 104, UI module 106 may receive information from UID 104 in response to inputs detected at locations of a screen of UID 104 at which elements of the user interface are displayed.
  • UI module 106 disseminates information about inputs detected by UID 104 to other components of computing device 102 for interpreting the inputs and for causing computing device 102 to perform one or more functions in response to the inputs.
  • Computing device 102 includes applications 112 that include functionality to perform any variety of operations on computing device 102.
  • applications 112 may include a web browser, an email application, text messaging application, instant messaging application, weather application, video conferencing application, social networking application, e-commerce application, stock market application, emergency alert application, sports application, office productivity application, multimedia player, etc.
  • Applications 112 may perform operations described herein using software, hardware, firmware, or a mixture of hardware, software, and firmware residing in and executing on computing device 102 or at one or more other remote computing devices.
  • applications 112 may be implemented as hardware, software, and/or a combination of hardware and software.
  • Computing device 102 may execute applications 112 with one or more processors.
  • Computing device 102 may execute any of applications 112 as or within a virtual machine executing on underlying hardware.
  • Applications 112 may be implemented in various ways. For example, any of applications 112 may be implemented as a downloadable or preinstalled application or “app.” In another example, any of applications 112 may be implemented as part of an operating system of computing device 102.
  • Other examples of computing device 102 that implement techniques of this disclosure may include additional components not shown in FIG. 1A.
  • In a standard mode, application 112A produces and outputs user interface 114A including selectable user interface elements 115.
  • user interface 114A includes video display functionality, but it is to be understood that other functionality, such as text display, document processing, social media, and other app functionality, may be provided.
  • Mode selection module 108 provides, to applications 112, a display mode such as the standard mode or a simple mode.
  • mode selection module 108 indicates to application 112A that the computing device 102 is in the standard mode.
  • the standard mode may be the default mode.
  • application 112A provides interface data to UI module 106 for the user interface 114A.
  • UI module 106 may generate selectable user interface elements 115 using user interface controls provided by the operating system.
  • user interface 114A, in the standard mode, includes a plurality of selectable user interface elements 115, each of which is positioned at a respective location in user interface 114A.
  • In the example of FIG. 1A, selectable user interface elements 115 include “Play on TV” selectable user interface element 115A, “Autoplay” selectable user interface element 115B, “Closed Captions” selectable user interface element 115C, “Settings” selectable user interface element 115D, “Close” selectable user interface element 115E, “Previous” selectable user interface element 115F, “Rewind” selectable user interface element 115G, “Pause” selectable user interface element 115H, “Fast Forward” selectable user interface element 115I, and “Next” selectable user interface element 115J.
  • a user may select one of the selectable user interface elements 115 at the user interface 114A through the user interface device 104.
  • “Autoplay” selectable user interface element 115B may be a toggle button user interface control and the user may switch the “Autoplay” selectable user interface element 115B into an on position from an off position.
  • UI module 106 may interpret inputs detected at UID 104 for “Autoplay” selectable user interface element 115B and relay information about the inputs detected to application 112A. For example, application 112A may then use the selected autoplay functionality to play another video after the current video completes. Application 112A or UI module 106 may update the display of “Autoplay” selectable user interface element 115B to indicate that the autoplay functionality is on.
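  • The autoplay relay described above might look like the following plain-Kotlin sketch; `AutoplayController` and its methods are assumed names standing in for application 112A's internal logic, not anything specified by the disclosure.

```kotlin
// Sketch of the toggle relay: the UI layer forwards the Autoplay state
// to the application, which consults it when the current video ends.
class AutoplayController {
    var autoplayEnabled: Boolean = false
        private set

    // Called by the UI module when the Autoplay toggle 115B is switched.
    fun onAutoplayToggled(enabled: Boolean) {
        autoplayEnabled = enabled
    }

    // Called when playback of the current video completes.
    fun onVideoCompleted(playNext: () -> Unit) {
        if (autoplayEnabled) playNext()
    }
}
```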
  • selectable user interface elements 115 may be icons that may be difficult for inexperienced users to understand.
  • “Autoplay” selectable user interface element 115B is a toggle button user interface control which does not have any associated written indication of the functionality enabled by “Autoplay” selectable user interface element 115B. Users that are unfamiliar with the icon for the “Autoplay” selectable user interface element 115B may thus have difficulty enabling or disabling the autoplay functionality.
  • computing device 102 may implement a simple mode. In the simple mode, computing device 102 produces a simplified user interface, such as user interface 114B, rather than a standard user interface, such as user interface 114A.
  • computing device 102 may replace the plurality of selectable user interface elements 115 of the standard mode with a unitary selectable user interface element 119.
  • the unitary selectable user interface element 119 simplifies the user interface compared to the plurality of selectable user interface elements 115 of the standard mode.
  • computing device 102 may display a menu 113 of the actions 117 corresponding to the plurality of selectable user interface elements 115 of the standard mode.
  • computing device 102 may perform the selected action.
  • application 112A provides interface data to UI module 106 for the user interface 114B.
  • Application 112A may provide modified interface data that allows for the production of user interface 114B.
  • UI module 106 may then generate unitary selectable user interface element 119 and menu 113 of actions 117 in user interface 114B using user interface controls provided by the operating system.
  • the user selects unitary selectable user interface element 119 at user interface 114B.
  • UI module 106 may interpret inputs detected at UID 104 for unitary selectable user interface element 119 and relay information about the inputs detected to application 112A.
  • Application 112A may then provide interface data to UI module 106 to produce menu 113 of actions 117.
  • UI module 106 may generate menu 113 of actions 117 using user interface controls provided by the operating system. UI module 106 may use menu-based user interface controls to produce menu 113 with actions 117.
  • user interface 114B includes menu 113, which includes a plurality of actions 117 that each correspond to at least one selectable user interface element 115 from the plurality of selectable user interface elements of user interface 114A; the actions are presented in menu 113 rather than at the respective locations used in user interface 114A.
  • “Autoplay” action 117B in menu 113 of user interface 114B corresponds to “Autoplay” selectable user interface element 115B of user interface 114A.
  • actions 117 include “Play on TV” action 117A, “Autoplay” action 117B, “Closed Captions” action 117C, “Settings” action 117D, “Close” action 117E, “Previous” action 117F, “Rewind” action 117G, “Pause” action 117H, “Fast Forward” action 117I, and “Next” action 117J.
  • Menu 113 of actions 117 may include text descriptions of functionality associated with the plurality of actions.
  • “Rewind” action 117G of FIG. 1A includes the text “Rewind”.
  • the text descriptions may help users understand the functionality without requiring the users to remember a meaning of an icon, such as the icons of selectable user interface elements 115 of user interface 114A.
  • Menu 113 of actions 117 of user interface 114B may also include icons indicating functionality of the plurality of selectable user interface elements.
  • “Rewind” action 117G of menu 113 may include a “Rewind” icon as well as the text “Rewind”.
  • computing device 102 detects user input at the location of “Rewind” action 117G.
  • UID 104 may receive indications of the user input by detecting one or more gestures from a user of computing device 102 at the location. Responsive to detecting the user input, UI module 106 may interpret inputs detected at UID 104 for “Rewind” action 117G and relay information about the inputs detected to application 112A. For example, application 112A may perform the action, such as rewinding a video in display area 111, in response to selection of “Rewind” action 117G.
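  • A dispatch step of this kind could be sketched as below; the `MediaPlayerLike` interface and the action identifiers are assumptions used for illustration only, standing in for whatever interface application 112A actually exposes.

```kotlin
// Illustrative dispatch from a selected menu action to the application.
interface MediaPlayerLike {
    fun rewind()
    fun pause()
    fun fastForward()
}

fun performAction(actionId: String, player: MediaPlayerLike) {
    when (actionId) {
        "rewind" -> player.rewind()            // e.g., “Rewind” action 117G
        "pause" -> player.pause()              // e.g., “Pause” action 117H
        "fast_forward" -> player.fastForward() // e.g., “Fast Forward” action 117I
        else -> Unit                           // remaining actions omitted
    }
}
```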
  • Mode selection module 108 enables the user to select between a first user interface mode, such as the standard mode, and a second user interface mode, such as the simple mode.
  • Computing device 102 may receive user selections provided at a user interface, such as the user interface shown in FIG. 3, that cause the computing device 102 to transition into the second user interface mode.
  • Mode selection module 108 may also select the user interface mode based on an analysis of user behavior with respect to computing device 102, such as by analyzing whether a user has difficulty using user interfaces in the first user interface mode. Mode selection module 108 may use a rule-based or machine-learned model to evaluate user inputs, determine that the user has difficulty using user interfaces in the first user interface mode, and then transition, or suggest that the user transition, to the second user interface mode; a rule-based sketch follows.
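  • A rule-based evaluation of the kind mentioned could be as simple as the sketch below; the signals, thresholds, and names are invented for illustration and are not specified by the disclosure.

```kotlin
// Invented rule-based heuristic: suggest the simple mode when a large
// share of recent interactions show signs of difficulty.
data class InteractionStats(
    val mistappedTargets: Int,  // taps that landed on no selectable element
    val abandonedGestures: Int, // swipes too short to trigger any scrolling
    val totalInteractions: Int,
)

fun shouldSuggestSimpleMode(stats: InteractionStats): Boolean {
    if (stats.totalInteractions < 50) return false // not enough evidence yet
    val errorRate =
        (stats.mistappedTargets + stats.abandonedGestures).toDouble() / stats.totalInteractions
    return errorRate > 0.25 // threshold chosen arbitrarily for the sketch
}
```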
  • Mode selection module 108 receives and stores the simple mode as the current display mode and indicates to application 112A that the computing device 102 is now in the simple mode.
  • Display mode-based changes to the user interface may be applied at the application level.
  • Applications 112 may receive indications of the user interface mode from mode selection module 108 and, based on the user interface mode, determine the user interface elements to include in the user interface.
  • Display mode-based changes to the user interface may also be applied at the operating system level such that when the computing device 102 changes into the second user interface mode, UI module 106 may modify the user interface without requiring a modification of code for application 112A. In either case, rather than generating user interface 114A, in the second user interface mode, UI module 106 generates a modified user interface, such as user interface 114B.
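  • Either way, some component has to hold the current mode and notify interested parties when it changes; a minimal plain-Kotlin sketch with assumed names follows. An application would register via `onModeChanged` and emit either user interface 114A or 114B when notified.

```kotlin
// Minimal sketch of mode storage and propagation; the enum values and
// listener mechanism are assumptions, not the patent's design.
enum class UiMode { STANDARD, SIMPLE }

class ModeSelection {
    private val listeners = mutableListOf<(UiMode) -> Unit>()

    var currentMode: UiMode = UiMode.STANDARD // standard mode as the default
        private set

    fun onModeChanged(listener: (UiMode) -> Unit) {
        listeners += listener
    }

    fun setMode(mode: UiMode) {
        currentMode = mode
        listeners.forEach { it(mode) } // applications regenerate their UIs
    }
}
```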
  • FIG. 1B is a conceptual diagram illustrating an example computing device 102 that may operate in a simplified user interface mode in accordance with one or more aspects of the present disclosure.
  • Computing device 102, user interface device 104, UI module 106, applications 112, and mode selection module 108 are described above with respect to FIG. 1A.
  • Mode selection module 108 may allow the user to select between a first user interface mode, such as a standard mode, and a second user interface mode, such as a simple mode.
  • Computing device 102 may receive user selections through a user interface such as the user interface shown in FIG. 3 that cause computing device 102 to transition into the second user interface mode.
  • Mode selection module 108 may also select the user interface mode based on an analysis of user behavior with respect to computing device 102 such as by analyzing whether a user has difficulty using user interfaces in the first user interface mode.
  • In a standard mode, application 112B produces a user interface 121A including gesture functionality, for example, horizontal scrolling gestures, vertical scrolling gestures, back gestures, dismiss gestures, and gestures to display all applications.
  • the user interface 121A includes a display of selectable content, but it is to be understood that other functionality, such as video display, text display, document processing, social media, and other app functionality, may be provided.
  • Mode selection module 108 provides to application 112B a display mode such as the standard mode or a simple mode.
  • mode selection module 108 indicates to application 112B that the computing device 102 is in the standard mode.
  • the standard mode may be the default mode.
  • Application 112B provides interface data to UI module 106 for the user interface 121A.
  • UI module 106 may generate gesture functionality, such as horizontal scrolling gesture functionality, using user interface controls provided by the operating system.
  • Gesture functionality is often used with touchscreen devices for actions such as navigating through content or selecting items. For example, for horizontal scrolling functionality, to perform a left swipe, a user drags a finger across the screen from the right side to the left side, and to perform a right swipe, the user drags a finger across the screen from the left side to the right side.
  • Such gesture functionality may not be intuitive for all users given that user interfaces often give no visual indication of the gesture functionality.
  • Computing device 102 interprets gestures from the user as associated with actions. For example, computing device 102 may interpret horizontal scrolling gestures to the left or to the right so as to move content in portion 123. For example, when a user contacts portion 123 of user interface 121A, UID 104 receives indications of the tactile input in portion 123 of user interface 121A. UI module 106 may interpret gestures detected at UID 104 and relay information about the inputs detected to application 112B. Application 112B may then update the display of content in portion 123. In the standard mode, the horizontal scrolling functionality may be difficult for inexperienced users to understand and/or use because no visible indication of the horizontal scrolling functionality is provided.
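  • In plain Kotlin, the swipe classification the standard mode performs might reduce to the sketch below; the `touchSlop` default mirrors the idea behind Android's `ViewConfiguration` slop, but the value and names are invented. The UI module would relay the classified swipe to application 112B, which then moves the content in portion 123; in the standard mode nothing on screen advertises that this is possible.

```kotlin
// Compile-anywhere sketch of horizontal swipe classification in the
// standard mode; thresholds and names are invented for illustration.
enum class Swipe { LEFT, RIGHT, NONE }

fun classifyHorizontalDrag(downX: Float, upX: Float, touchSlop: Float = 24f): Swipe =
    when {
        upX - downX > touchSlop -> Swipe.RIGHT // finger dragged left to right
        downX - upX > touchSlop -> Swipe.LEFT  // finger dragged right to left
        else -> Swipe.NONE                     // too short to count as a swipe
    }
```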
  • computing device 102 may implement a simple mode.
  • computing device 102 produces a simplified user interface, such as user interface 121B, rather than a standard user interface, such as user interface 121A.
  • Mode selection module 108 allows the user to select between a first user interface mode, such as the standard mode, and a second user interface mode, such as the simple mode.
  • Computing device 102 may receive user selections through a user interface such as the user interface shown in FIG. 3 that cause the computing device 102 to transition into the second user interface mode (simple mode). Details of the mode selection module 108 are discussed above with respect to FIG. 1A.
  • Mode selection module 108 receives and stores the simple mode as the current display mode and indicates to application 112B that the computing device 102 is now in the simple mode.
  • application 112B provides interface data to UI module 106 for the user interface 121B.
  • Application 112B may provide modified interface data that allows for the production of user interface 121B.
  • UI module 106 may then generate user interface elements, such as arrow buttons 125A and 125B, using user interface controls provided by the operating system.
  • arrow buttons 125A and 125B may replace the horizontal scrolling gesture functionality present in user interface 121A.
  • Arrow buttons 125A and 125B may be at least partially transparent to allow content beneath the buttons to be partially visible.
  • UI module 106 may position arrow buttons 125A and 125B in portion 123.
  • application 112B may make display mode-based changes to the user interface.
  • Application 112B may receive an indication of the user interface mode from mode selection module 108 and, based on the user interface mode, determine the functionality of user interface 121B.
  • Application 112B may use the mode indication and then provide the gesture functionality in the first user interface mode and user interface element(s), such as the arrow buttons 125A and 125B, in the second user interface mode.
  • Display mode-based changes to the user interface may be applied at the operating system level such that when the computing device 102 is put into the second user interface mode, UI module 106 may modify the user interface without requiring a modification of code for application 112B. Display mode-based changes to the user interface may also be applied at the application level. Applications 112 may receive indications of the user interface mode from mode selection module 108 and, based on the user interface mode, determine the user interface elements to include in the user interface. In either case, rather than generating user interface 121A, in the second user interface mode, UI module 106 may generate a modified user interface such as user interface 121B with user interface elements, such as arrow buttons 125A and 125B.
  • the user selects a user interface element, such as arrow button 125A, at the user interface 121B.
  • UI module 106 may interpret inputs detected at UID 104 for a user interface element, such as arrow button 125A, and relay information about the inputs detected to application 112B.
  • application 112B may update user interface 121B by performing an action, such as scrolling the content in portion 123 to the right when arrow button 125A is pressed.
  • Computing device 102 may detect user input at a location at which the user interface element, such as one of arrow buttons 125A and 125B, is located.
  • UID 104 may receive indications of the user input by detecting one or more gestures from a user of computing device 102 at the location. Responsive to detecting the user input, computing device 102 may perform the selected action, such as horizontally scrolling portion 123 of user interface 121B. As a result of horizontal scrolling, content in portion 123 of user interface 121B may move to show hidden pictures, videos, or other content.
  • Responsive to detecting the user input at the location of arrow button 125A, computing device 102 may horizontally scroll portion 123 of user interface 121B to the right and, responsive to detecting the user input at the location of arrow button 125B, computing device 102 may horizontally scroll portion 123 of user interface 121B to the left.
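  • Wiring the arrow buttons could look like the Android-flavored sketch below, assuming the scrollable strip is a `HorizontalScrollView`; all identifiers and the scroll step are placeholders. The comments preserve the direction semantics from the passage: the left arrow (125A) moves content to the right, revealing items on the left.

```kotlin
// Hedged sketch: arrow buttons replacing the horizontal scroll gesture.
import android.widget.Button
import android.widget.HorizontalScrollView

fun wireArrowButtons(
    left: Button,
    right: Button,
    contentStrip: HorizontalScrollView,
    stepPx: Int = 300, // arbitrary scroll step for the sketch
) {
    left.alpha = 0.7f  // partially transparent so content stays visible
    right.alpha = 0.7f
    // Left arrow (125A): content moves right, revealing items on the left.
    left.setOnClickListener { contentStrip.smoothScrollBy(-stepPx, 0) }
    // Right arrow (125B): content moves left, revealing items on the right.
    right.setOnClickListener { contentStrip.smoothScrollBy(stepPx, 0) }
}
```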
  • FIG. 2 is a block diagram illustrating an example computing device, in accordance with one or more aspects of the present disclosure.
  • Computing device 202 of FIG. 2 is an example of computing device 102 of FIG. 1.
  • Computing device 202 is only one particular example of computing device 102 of FIG. 1, and many other examples of computing devices may be used in other instances.
  • computing device 202 may be a mobile computing device (e.g., a smartphone), or any other computing device.
  • Computing device 202 of FIG. 2 may include a subset of the components included in computing device 102 of FIG. 1 or may include additional components not shown in FIG. 2.
  • computing device 202 includes user interface device 204 (“UID 204”), one or more processors 240, one or more input devices 242, one or more communication units 244, one or more output devices 246, and one or more storage devices 248.
  • Storage devices 248 of computing device 202 also include operating system 254, mode selection module 208, UI module 206, and applications 212.
  • Communication channels 250 may interconnect each of the components 240, 242, 244, 246, 248, and 204 for inter-component communications (physically, communicatively, and/or operatively).
  • communication channels 250 may include a system bus, a network connection, an inter-process communication data structure, or any other method for communicating data.
  • One or more input devices 242 of computing device 202 may be configured to receive input. Examples of input are tactile, audio, and video input.
  • Input devices 242 of computing device 202 include a presence-sensitive display, touch-sensitive screen, mouse, keyboard, voice responsive system, video camera, microphone, or any other type of device for detecting input from a human or machine.
  • One or more output devices 246 of computing device 202 may be configured to generate output. Examples of output are tactile, audio, and video output.
  • Output devices 246 of computing device 202 include a presence-sensitive organic light emitting diode (OLED) display, sound card, video graphics adapter card, speaker, monitor, a presence-sensitive liquid crystal display (LCD), or any other type of device for generating output to a human or machine.
  • One or more communication units 244 of computing device 202 may communicate with external devices via one or more wired and/or wireless networks by transmitting and/or receiving network signals on the one or more networks.
  • Examples of communication unit 244 include a network interface card (e.g., such as an Ethernet card), an optical transceiver, a radio frequency transceiver, a GPS receiver, or any other type of device that may send and/or receive information.
  • Other examples of communication units 244 may include short wave radios, cellular data radios, wireless network radios, as well as universal serial bus (USB) controllers.
  • UID 204 of computing device 202 may include functionality of input devices 242 and/or output devices 246.
  • UID 204 may be or may include a presence- sensitive input device.
  • a presence-sensitive input device may detect an object at and/or near a screen.
  • a presence-sensitive input device may detect an object, such as a finger or stylus, that is within 2 inches or less of the screen.
  • the presence-sensitive input device may determine a location (e.g., an (x,y) coordinate) of a screen at which the object was detected.
  • a presence-sensitive input device may detect an object six inches or less from the screen; other ranges are also possible.
  • the presence-sensitive input device may determine the location of the screen selected by a user’s finger using capacitive, inductive, and/or optical recognition techniques.
  • a presence-sensitive input device also provides output to a user using tactile, audio, or video stimuli as described with respect to output device 246, e.g., at a display.
  • UID 204 may present a user interface.
  • While illustrated as an internal component of computing device 202, UID 204 also represents an external component that shares a data path with computing device 202 for transmitting and/or receiving input and output.
  • UID 204 represents a built-in component of computing device 202 located within and physically connected to the external packaging of computing device 202 (e.g., a screen on a mobile phone).
  • UID 204 represents an external component of computing device 202 located outside and physically separated from the packaging of computing device 202 (e.g., a monitor, a projector, etc. that shares a wired and/or wireless data path with a tablet computer).
  • One or more storage devices 248 within computing device 202 may store information for processing during operation of computing device 202.
  • storage device 248 is a temporary memory, meaning that a primary purpose of storage device 248 is not long-term storage.
  • Storage devices 248 on computing device 202 may be configured for short-term storage of information as volatile memory and therefore not retain stored contents if powered off. Examples of volatile memories include random access memories (RAM), dynamic random access memories (DRAM), static random access memories (SRAM), and other forms of volatile memories known in the art.
  • Storage devices 248 may be configured to store larger amounts of information than volatile memory.
  • Storage devices 248 may further be configured for long-term storage of information as non-volatile memory space and retain information after power on/off cycles. Examples of non-volatile memories include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories.
  • Storage devices 248 may store program instructions and/or information (e.g., data) associated with operating system 254, mode selection module 208, and applications 212.
  • processors 240 may implement functionality and/or execute instructions within computing device 202.
  • processors 240 of computing device 202 may receive and execute instructions stored by storage devices 248 that execute the functionality of operating system 254, mode selection module 208, and applications 212.
  • Operating system 254, mode selection module 208, and applications 212 are described below as executing at one or more processors 240. That is, one or more processors 240 are configured to execute the instructions of operating system 254, mode selection module 208, and applications 212 to perform the functionality of operating system 254, mode selection module 208, and applications 212 described below.
  • Operating system 254 may be composed of several layers, each building upon the previous one to provide the overall functionality of the operating system 254.
  • the layers may include a kernel responsible for providing low-level system services such as memory management, process management, and device drivers; native libraries for system services such as graphics, media, and database access; a runtime or virtual machine for executing applications; and an application framework providing a set of Application Programming Interfaces (APIs) and services for developers to build applications.
  • a smartphone launcher may be used to allow users to launch and interact with apps.
  • Smartphone launchers typically include a home screen with app shortcuts, a drawer for accessing all installed apps, and an app tray for holding widgets.
  • Applications 212 may include applications 212A and 212B, and one or more processors 240 are configured to execute applications 212A and 212B.
  • processors 240 may be configured to execute applications 212A and 212B to produce user interfaces with user interface elements that are different based on a selected user interface mode.
  • Mode selection module 208 may allow users to select between two user interface modes: a standard mode and a simple mode. Users may select the user interface mode through a user interface, such as the one shown in FIG. 3. The mode selection module 208 may also select the mode based on an analysis of user behavior, such as whether the user has difficulty using the computing device in the standard mode.
  • Applications 212 may modify the user interfaces based on the selected mode.
  • applications 212 may make the user interfaces easier to use.
  • application 212A may generate a menu using user interface controls provided by the operating system. The menu may include a plurality of actions, each of which corresponds to one or more selectable user interface elements from the user interface of the standard mode as shown in FIG. 1A.
  • application 212B may produce a user interface that includes user interface elements, such as arrow buttons, that replace gesture functionality, such as horizontal scrolling gesture functionality, present in the user interface of the standard mode as shown in FIG. 1B.
  • Operating system 254 may also modify the user interfaces based on the selected mode.
  • FIG. 3 is a block diagram illustrating a user interface for selecting a simple user interface mode, in accordance with one or more aspects of the present disclosure.
  • the computing device may produce page 360 to enable the selection of a user interface mode.
  • Selector 362 may allow a user to switch between a simple mode and a standard mode.
  • The mode selection module and applications may receive input from selector 362 to control the user interface mode.
  • Display options at page 360 may allow text size to be selected upon transitioning into the second user interface mode, such as the simple mode.
  • Page 360 may also allow the user to adjust the display options for the second user interface mode, such as the simple mode.
  • Default values for functionality associated with the simple mode may be shown in page 360, and these values may be updated as discussed below.
  • Selector 364 may be used to toggle the menu selection functionality described above with respect to FIG. 1 A on and off in the simple mode.
  • Selector 366 may be used to toggle the arrow scrolling functionality described above with respect to FIG. 1B on and off.
  • Buttons 368 may be used to select the text size for the simple mode.
  • a default text size of the simple mode may be set larger than the default text size for the standard mode.
  • a larger text size may help users use the computing device in the simple mode.
  • Buttons 370 may be used to select the display size for the simple mode.
  • a default display size of the simple mode may be set larger than the default display size for the standard mode.
  • a larger display size may help users use the computing device in the simple mode.
  • Buttons 372 may be used to select the contrast for the simple mode.
  • a default contrast of the simple mode may be set greater than the default contrast for the standard mode. Greater contrast may help users use the computing device in the simple mode.
  • Slider 374 may be used to select the brightness for the simple mode. A default brightness of the simple mode may be set greater than the default brightness for the standard mode. Greater brightness may help users use the computing device in the simple mode.
  • Slider 376 may be used to select the volume for the simple mode.
  • a default volume of the simple mode may be set greater than the default volume for the standard mode. Increased volume may help users use the computing device in the simple mode.
  • Touch and hold slider 378 may adjust the length of time before a tap on the screen becomes a touch and hold. Touch and hold functionality may enable selection of user interface elements, for example, to move such user interface elements in the display.
  • a default length of time of the simple mode may be set greater than the default length of time for the standard mode. The greater length of time may help users avoid accidentally enabling the touch and hold functionality.
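  • Collecting the defaults described for page 360 into one place might look like the sketch below; every concrete value is invented, chosen only to satisfy the "greater than standard" relationships the passage describes.

```kotlin
// Invented defaults illustrating the simple mode's relationship to the
// standard mode: larger text and display, more contrast, brightness,
// volume, and a longer touch-and-hold delay.
data class DisplayOptions(
    val textScale: Float,
    val displayScale: Float,
    val highContrast: Boolean,
    val brightness: Float,        // 0.0..1.0
    val volume: Float,            // 0.0..1.0
    val touchAndHoldMillis: Long, // delay before a tap becomes touch & hold
)

val STANDARD_DEFAULTS = DisplayOptions(
    textScale = 1.0f, displayScale = 1.0f, highContrast = false,
    brightness = 0.5f, volume = 0.5f, touchAndHoldMillis = 400L,
)

val SIMPLE_DEFAULTS = DisplayOptions(
    textScale = 1.3f,          // buttons 368: larger default text size
    displayScale = 1.2f,       // buttons 370: larger default display size
    highContrast = true,       // buttons 372: greater default contrast
    brightness = 0.8f,         // slider 374: greater default brightness
    volume = 0.8f,             // slider 376: greater default volume
    touchAndHoldMillis = 800L, // slider 378: longer touch-and-hold delay
)
```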
  • FIG. 4 is a flowchart illustrating example operations of an example computing device, in accordance with one or more aspects of the present disclosure. For purposes of illustration only, the example operations are described below within the context of computing device 102 of FIG. 1A.
  • Computing device 102 may output, in a first user interface mode, a first user interface of an application, the first user interface including a plurality of selectable user interface elements each of which is positioned at a respective location of the first user interface (402).
  • the selectable user interface elements may be icons without associated text.
  • Computing device 102 may transition from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode (404).
  • While operating in the second user interface mode, computing device 102 may output a second user interface of the application, the second user interface including a unitary selectable user interface element that replaces the plurality of selectable user interface elements (406).
  • the unitary selectable user interface element combines multiple standard UI elements into a single UI element in the second user interface mode.
  • Computing device 102 may detect a first user input selecting the unitary selectable user interface element (408). Responsive to detecting the first user input, computing device 102 may display a menu of the actions corresponding to the plurality of selectable user interface elements (410). The actions may include text that allows some users to better understand the functionality associated with the actions. Computing device 102 may detect a second user input selecting one of the actions from the menu of the actions (412). Responsive to detecting the second user input, computing device 102 may perform the selected action (414).
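  • The FIG. 4 sequence can be compressed into a small state machine; the sketch below is my framing with invented types, not the patent's design, and the comments track the flowchart's step numbers.

```kotlin
// Self-contained sketch of the FIG. 4 flow; types and names are invented.
sealed interface UiState
data class StandardUi(val elements: List<String>) : UiState // (402)
object SimplifiedUi : UiState                               // (404)/(406)
data class MenuShown(val actions: List<String>) : UiState   // (410)

// (408) selecting the unitary element while simplified shows the menu.
fun onUnitarySelected(state: UiState, actions: List<String>): UiState =
    if (state is SimplifiedUi) MenuShown(actions) else state

// (412)/(414) selecting a menu entry performs it and closes the menu.
fun onActionSelected(state: UiState, action: String, perform: (String) -> Unit): UiState {
    require(state is MenuShown && action in state.actions)
    perform(action)
    return SimplifiedUi
}
```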
  • FIG. 5 is a flowchart illustrating example operations of an example computing device, in accordance with one or more aspects of the present disclosure. For purposes of illustration only, the example operations are described below within the context of computing device 102 of FIG. 1B.
  • Computing device 102 may output, in a first user interface mode, a first user interface of an application, the first user interface including gesture functionality associated with an action (502).
  • Computing device 102 may transition from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode (504).
  • Computing device 102 may, while operating in the second user interface mode, output, for display, a second user interface of the application, the second user interface including a user interface element that replaces the gesture functionality present in the first user interface mode (506).
  • Computing device 102 may detect a user input at a location at which the user interface element is displayed (508). Computing device 102 may, responsive to detecting the user input, perform the action (510).
  • Example 1 A method comprising: outputting, by one or more processors of a computing device operating in a first user interface mode, a first user interface of an application, the first user interface including a plurality of selectable user interface elements each of which is positioned at a respective location of the first user interface and each of which corresponds to an action; transitioning, by the one or more processors, from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; and while operating in the second user interface mode: outputting, by the one or more processors, a second user interface of the application, the second user interface including a unitary selectable user interface element that replaces the plurality of selectable user interface elements; detecting, by the one or more processors, a first user input selecting the unitary selectable user interface element; responsive to detecting the first user input, displaying a menu of the actions corresponding to the plurality of selectable user interface elements; detecting, by the one or more processors, a second user input selecting one of the actions from the menu of the actions; and responsive to detecting the second user input, performing the selected action.
  • Example 2 The method of example 1, wherein the menu of the actions of the second user interface includes text descriptions of the actions.
  • Example 3 The method of example 2, wherein the menu of the actions includes icons associated with the text descriptions.
  • Example 4 The method of any of examples 1-3, further comprising receiving, by the one or more processors, a user selection to enter the second user interface mode before transitioning into the second user interface mode.
  • Example 5 The method of any of examples 1-4, further comprising determining, by the one or more processors, based on user interactions with the computing device to enter the second user interface mode before transitioning into the second user interface mode.
  • Example 6 The method of any of examples 1-5, further comprising changing, by the one or more processors, display options, including text size, based on transitioning into the second user interface mode.
  • Example 7 The method of example 6, further comprising providing a page to adjust the display options for the second user interface mode.
  • Example 8 A device comprising means for performing any combination of the methods of examples 1-7.
  • Example 9 A system implementing any combination of the methods of examples 1-7.
  • Example 10 A non-transitory computer-readable storage medium implementing any combination of the methods of examples 1-7.
  • Example 11 A computing device comprising memory; and one or more processors communicably coupled to the memory and configured to output, in a first user interface mode, a first user interface of an application, the first user interface including a plurality of selectable user interface elements each of which is positioned at a respective location of the first user interface and each of which corresponds to an action; transition from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; and while operating in the second user interface mode: output a second user interface of the application, the second user interface including a unitary selectable user interface element that replaces the plurality of selectable user interface elements; detect a first user input selecting the unitary selectable user interface element; responsive to detecting the first user input, displaying a menu of the actions corresponding to the plurality of selectable user interface elements; detect a second user input selecting one of the actions from the menu of the actions; and responsive to detecting the second user input, perform the selected action.
  • Example 12 The computing device of example 11, wherein the menu of the actions of the second user interface includes text descriptions of the actions.
  • Example 13 The computing device of example 12, wherein the menu of the actions includes icons associated with the text descriptions.
  • Example 14 The computing device of any of examples 11-13, wherein the one or more processors are configured to receive a user selection to enter the second user interface mode before transitioning into the second user interface mode.
  • Example 15 The computing device of any of examples 11-14, wherein the one or more processors are configured to determine based on user interactions with the computing device to enter the second user interface mode before transitioning into the second user interface mode.
  • Example 16 The computing device of any of examples 11-15, wherein the one or more processors are configured to change display options, including text size, based on transitioning into the second user interface mode.
  • Example 17 The computing device of example 16, wherein the one or more processors are configured to provide a page to adjust the display options for the second user interface mode.
  • Example 18 A computer-readable storage medium having stored thereon instructions that, when executed, cause one or more processors of a computing device to: output, in a first user interface mode, a first user interface of an application, the first user interface including a plurality of selectable user interface elements each of which is positioned at a respective location of the first user interface and each of which corresponds to an action; transition from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; and while operating in the second user interface mode: output a second user interface of the application, the second user interface including a unitary selectable user interface element that replaces the plurality of selectable user interface elements; detect a first user input selecting the unitary selectable user interface element; responsive to detecting the first user input, display a menu of the actions corresponding to the plurality of selectable user interface elements; detect a second user input selecting one of the actions from the menu of the actions; and responsive to detecting the second user input, perform the selected action.
  • Example 19 The computer-readable storage medium of example 18, wherein the menu of the actions of the second user interface includes text descriptions of the actions.
  • Example 20 The computer-readable storage medium of example 19, wherein the menu of the actions includes icons associated with the text descriptions.
  • Example 21 The computer-readable storage medium of any of examples 18-20, wherein the instructions, when executed, further cause the one or more processors to receive a user selection to enter the second user interface mode before transitioning into the second user interface mode.
  • Example 22 The computer-readable storage medium of any of examples 18-21, wherein the instructions, when executed, further cause the one or more processors to determine based on user interactions with the computing device to enter the second user interface mode before transitioning into the second user interface mode.
  • Example 23 The computer-readable storage medium of any of examples 18-22, wherein the instructions, when executed, further cause the one or more processors to change display options, including text size, based on transitioning into the second user interface mode.
  • Example 24 The computer-readable storage medium of example 23, wherein the instructions, when executed, further cause the one or more processors to provide a page to adjust the display options for the second user interface mode.
  • Example 25 A method comprising: outputting, by one or more processors of a computing device operating in a first user interface mode, a first user interface of an application, the first user interface including gesture functionality associated with an action; transitioning, by the computing device, from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; while operating in the second user interface mode, outputting, by one or more processors of the computing device, a second user interface of the application, the second user interface including a user interface element that replaces the gesture functionality present in the first user interface mode; detecting, by the computing device, a user input at a location at which the user interface element is displayed; and responsive to detecting the user input, performing the action.
  • Example 26 The method of example 25, wherein the gesture functionality is horizontal scrolling gesture functionality, the user interface element includes a pair of arrow buttons, and the action is horizontally scrolling a portion of the second user interface.
  • Example 27 The method of example 26, wherein one of the arrow buttons is positioned in the portion and points toward additional content for the portion.
  • Example 28 The method of any of examples 26-27, wherein one of the arrow buttons is at least partially transparent.
  • Example 29 The method of any of examples 25-28, further comprising receiving, by the computing device, a user selection to enter the second user interface mode before transitioning into the second user interface mode.
  • Example 30 The method of any of examples 25-29, further comprising determining, by the computing device, based on user interactions with the computing device to enter the second user interface mode before transitioning into the second user interface mode.
  • Example 31 The method of any of examples 25-30, further comprising changing, by the computing device, display options including text size based on transitioning into the second user interface mode.
  • Example 32 The method of example 31, further comprising providing a page to adjust the display options for the second user interface mode.
  • Example 33 A device comprising means for performing any combination of the methods of examples 25-32.
  • Example 34 A system implementing any combination of the methods of examples 25-32.
  • Example 35 A non-transitory computer-readable storage medium implementing any combination of the methods of examples 25-32.
  • Example 36 A computing device comprising: a presence-sensitive screen; memory; and one or more processors communicably coupled to the memory and configured to: output, in a first user interface mode, a first user interface of an application, the first user interface including gesture functionality associated with an action; transition from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; while operating in the second user interface mode, output a second user interface of the application, the second user interface including a user interface element that replaces the gesture functionality present in the first user interface mode; detect a user input at a location at which the user interface element is displayed; and responsive to detecting the user input, perform the action.
  • Example 37 The computing device of example 36, wherein the gesture functionality is horizontal scrolling gesture functionality, the user interface element includes a pair of arrow buttons, and the action is horizontally scrolling a portion of the second user interface.
  • Example 38 The computing device of example 37, wherein one of the arrow buttons is positioned in the portion and points toward additional content for the portion.
  • Example 39 The computing device of any of examples 37-38, wherein one of the arrow buttons is at least partially transparent.
  • Example 40 The computing device of any of examples 36-39, wherein the one or more processors are configured to receive a user selection to enter the second user interface mode before transitioning into the second user interface mode.
  • Example 41 The computing device of any of examples 36-40, wherein the one or more processors are configured to determine, based on user interactions with the computing device, to enter the second user interface mode before transitioning into the second user interface mode.
  • Example 42 The computing device of any of examples 36-41, wherein the one or more processors are configured to change display options, including text size, based on transitioning into the second user interface mode.
  • Example 43 The computing device of example 42, wherein the one or more processors are configured to provide a page to adjust the display options for the second user interface mode.
  • Example 44 A computer-readable storage medium having stored thereon instructions that, when executed, cause one or more processors of a computing device to: output, in a first user interface mode, a first user interface of an application, the first user interface including gesture functionality associated with an action; transition from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; while operating in the second user interface mode, output a second user interface of the application, the second user interface including a user interface element that replaces the gesture functionality present in the first user interface mode; detect a user input at a location at which the user interface element is displayed; and responsive to detecting the user input, perform the action.
  • Example 45 The computer-readable storage medium of example 44, wherein the gesture functionality is horizontal scrolling gesture functionality, the user interface element includes a pair of arrow buttons, and the action is horizontally scrolling a portion of the second user interface.
  • Example 46 The computer-readable storage medium of example 45, wherein one of the arrow buttons is positioned in the portion and points toward additional content for the portion.
  • Example 47 The computer-readable storage medium of any of examples 45-46, wherein one of the arrow buttons is at least partially transparent.
  • Example 48 The computer-readable storage medium of any of examples 44-47, wherein the instructions cause one or more processors to receive a user selection to enter the second user interface mode before transitioning into the second user interface mode.
  • Example 49 The computer-readable storage medium of any of examples 44-48, wherein the instructions cause one or more processors to determine, based on user interactions with the computing device, to enter the second user interface mode before transitioning into the second user interface mode.
  • Example 50 The computer-readable storage medium of any of examples 44-49, wherein the instructions cause one or more processors to change display options, including text size, based on transitioning into the second user interface mode.
  • Example 51 The computer-readable storage medium of example 50, wherein the instructions cause one or more processors to provide a page where the user may adjust the display options for the second user interface mode.
  • Computer-readable media may include computer-readable storage media, which corresponds to a tangible medium such as data storage media, or communication media including any medium that facilitates transfer of a computer program from one place to another, e.g., according to a communication protocol.
  • Computer-readable media generally may correspond to (1) tangible computer-readable storage media, which is non-transitory, or (2) a communication medium such as a signal or carrier wave.
  • Data storage media may be any available media that can be accessed by one or more computers or one or more processors to retrieve instructions, code and/or data structures for implementation of the techniques described in this disclosure.
  • A computer program product may include a computer-readable medium.
  • Such computer-readable storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium.
  • Coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave are included in the definition of medium.
  • computer-readable storage media and data storage media do not include connections, carrier waves, signals, or other transient media, but are instead directed to non-transient, tangible storage media.
  • Disk and disc include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
  • Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
  • The term “processor,” as used herein, may refer to any of the foregoing structures or any other structure suitable for implementation of the techniques described herein.
  • The functionality described herein may be provided within dedicated hardware and/or software modules. Also, the techniques could be fully implemented in one or more circuits or logic elements.
  • The techniques of this disclosure may be implemented in a wide variety of devices or apparatuses, including a wireless handset, an integrated circuit (IC), or a set of ICs (e.g., a chip set).
  • Various components, modules, or units are described in this disclosure to emphasize functional aspects of devices configured to perform the disclosed techniques, but do not necessarily require realization by different hardware units. Rather, as described above, various units may be combined in a hardware unit or provided by a collection of interoperative hardware units, including one or more processors as described above, in conjunction with suitable software and/or firmware.
  • Various examples of the disclosure have been described. Any combination of the described systems, operations, or functions is contemplated. These and other examples are within the scope of the following claims.


Abstract

A computing device may output, in a first user interface mode, a first user interface including a plurality of selectable user interface elements, each positioned at a respective location and each corresponding to an action. The computing device may transition from operating in the first user interface mode to operating in a second user interface mode. While operating in the second user interface mode, the computing device may output a second user interface including a unitary selectable user interface element that replaces the plurality of selectable user interface elements. The computing device may detect a first user input selecting the unitary selectable user interface element and display a menu of the actions corresponding to the plurality of selectable user interface elements. The computing device may detect a second user input selecting one of the actions from the menu of the actions and perform the selected action.

Description

SIMPLIFIED USER INTERFACES
RELATED APPLICATION
[0001] This application claims the benefit of U.S. Provisional Application Number 63/515,550, filed July 25, 2023, the entire contents of which are incorporated herein by reference.
BACKGROUND
[0002] Personal computing devices, including smartphones, are increasingly popular. Smartphones are portable computing devices that integrate mobile telephone and computing functions. Smartphones typically have a display, a keyboard, and other features such as a camera, Internet access, and email. Smartphone applications, also known as mobile apps, are software programs designed to run on smartphones and provide various functionalities and services. Smartphone user interfaces allow users to interact with smartphone devices and apps, typically through touch-based input.
SUMMARY
[0003] In general, aspects of this disclosure are directed to techniques to enable a computing device to switch between operating in a standard user interface mode and a simplified user interface mode. While operating in the simplified user interface mode, the computing device may produce a simplified version of the user interface rather than a standard user interface. In generating the simplified version of the user interface, the computing device may identify particular user interface elements (e.g., media control buttons, page navigation buttons, volume control buttons, save, delete, undo, and print buttons, search fields, help buttons, etc.) and convert those identified user interface elements into a smaller set of one or more user interface elements that may be more visually prominent and/or more easily located within the user interface. For example, the computing device may identify multiple standard selectable user interface elements within the standard graphical user interface and generate a unitary selectable user interface element. The single selectable user interface element may allow access to a menu of selectable functionalities, services, or other actions associated with the multiple standard selectable user interface elements, which may simplify the visual presentation of the user interface. In various examples, the menu may include text descriptions of the actions.
[0004] While operating in the simplified user interface mode, the computing device may also modify how a user navigates an active application. In the standard user interface, there may not be any graphical indication of how to navigate the user interface, for example, how to scroll horizontally to display additional content. Further, even when a horizontal scroll bar is included in the user interface, it may be difficult for a user to provide an input at the scroll bar to effectively scroll the content given the small size of scroll bars. Rather than requiring the user to provide input at a horizontal scroll bar, or to simply know that there is additional content to view and that a swipe gesture is needed to scroll that content, the computing device may include arrow buttons that provide a distinct visual indication that additional content is available and enable users to scroll horizontally by tapping on the arrow buttons. The computing device may also replace other gesture functionality of the standard user interface in the simplified user interface mode with user interface elements, such as buttons. The replaced gesture functionality may include vertical scrolling gesture functionality, back gesture functionality, dismiss gesture functionality, and gesture functionality used to display applications.
[0005] In some aspects, the techniques described herein relate to a method comprising: outputting, by one or more processors of a computing device operating in a first user interface mode, a first user interface of an application, the first user interface including a plurality of selectable user interface elements each of which is positioned at a respective location of the first user interface and each of which corresponds to an action; transitioning, by the one or more processors, from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; and while operating in the second user interface mode: outputting, by the one or more processors, a second user interface of the application, the second user interface including a unitary selectable user interface element that replaces the plurality of selectable user interface elements; detecting, by the one or more processors, a first user input selecting the unitary selectable user interface element; responsive to detecting the first user input, displaying a menu of the actions corresponding to the plurality of selectable user interface elements; detecting, by the one or more processors, a second user input selecting one of the actions from the menu of the actions; and responsive to detecting the second user input, performing, by the one or more processors, the selected action.
[0006] In some aspects, the techniques described herein relate to a computing device comprising memory; and one or more processors communicably coupled to the memory and configured to output, in a first user interface mode, a first user interface of an application, the first user interface including a plurality of selectable user interface elements each of which is positioned at a respective location of the first user interface and each of which corresponds to an action; transition from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; and while operating in the second user interface mode: output a second user interface of the application, the second user interface including a unitary selectable user interface element that replaces the plurality of selectable user interface elements; detect a first user input selecting the unitary selectable user interface element; responsive to detecting the first user input, display a menu of the actions corresponding to the plurality of selectable user interface elements; detect a second user input selecting one of the actions from the menu of the actions; and responsive to detecting the second user input, perform the selected action.
[0007] In some aspects, the techniques described herein relate to a computer-readable storage medium having stored thereon instructions that, when executed, cause one or more processors of a computing device to: output, in a first user interface mode, a first user interface of an application, the first user interface including a plurality of selectable user interface elements each of which is positioned at a respective location of the first user interface and each of which corresponds to an action; transition from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; and while operating in the second user interface mode: output a second user interface of the application, the second user interface including a unitary selectable user interface element that replaces the plurality of selectable user interface elements; detect a first user input selecting the unitary selectable user interface element; responsive to detecting the first user input, display a menu of the actions corresponding to the plurality of selectable user interface elements; detect a second user input selecting one of the actions from the menu of the actions; and responsive to detecting the second user input, perform the selected action.
[0008] In some aspects, the techniques described herein relate to a computing device including means for outputting, by a computing device operating in a first user interface mode, a first user interface of an application, the first user interface including a plurality of selectable user interface elements each of which is positioned at a respective location of the first user interface and each of which corresponds to an action; means for transitioning from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; and while operating in the second user interface mode: means for outputting a second user interface of the application, the second user interface including a unitary selectable user interface element that replaces the plurality of selectable user interface elements; means for detecting a first user input selecting the unitary selectable user interface element; responsive to detecting the first user input, means for displaying a menu of the actions corresponding to the plurality of selectable user interface elements; means for detecting a second user input selecting one of the actions from the menu of the actions; and responsive to detecting the second user input, means for performing the selected action.
[0009] The details of one or more examples of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the disclosure will be apparent from the description and drawings, and from the claims.
BRIEF DESCRIPTION OF DRAWINGS
[0010] FIG. 1A is a conceptual diagram illustrating an example computing device that may operate in a simplified user interface mode in accordance with one or more aspects of the present disclosure.
[0011] FIG. 1B is a conceptual diagram illustrating an example computing device that may operate in a simplified user interface mode in accordance with one or more aspects of the present disclosure.
[0012] FIG. 2 is a block diagram illustrating an example computing device, in accordance with one or more aspects of the present disclosure.
[0013] FIG. 3 is a block diagram illustrating a user interface for selecting a simple mode, in accordance with one or more aspects of the present disclosure.
[0014] FIG. 4 is a flowchart illustrating example operations of an example computing device, in accordance with one or more aspects of the present disclosure.
[0015] FIG. 5 is a flowchart illustrating example operations of an example computing device, in accordance with one or more aspects of the present disclosure.
DETAILED DESCRIPTION
[0016] FIG. 1A is a conceptual diagram illustrating an example computing device 102 that may operate in a simplified user interface mode in accordance with one or more aspects of the present disclosure. Computing device 102 may be an individual mobile or non-mobile computing device. Examples of computing device 102 include a mobile phone, a tablet computer, a laptop computer, a desktop computer, a server, a mainframe, a set-top box, a television, a wearable device (e.g., a computerized watch, computerized eyewear, computerized headphones, computerized gloves, etc.), a home automation device or system (e.g., an intelligent thermostat or home assistant device), a gaming system, a media player, an e-book reader, a mobile television platform, an automobile navigation or infotainment system, or any other type of mobile, non-mobile, wearable, and non-wearable computing device.
[0017] Computing device 102 includes a user interface device (UID) 104. UID 104 of computing device 102 may function as an input device for computing device 102 and as an output device for computing device 102. UID 104 may be implemented using various technologies. For instance, UID 104 may function as an input device using a presence-sensitive screen, such as a resistive touchscreen, a surface acoustic wave touchscreen, a capacitive touchscreen, a projective capacitive touchscreen, a pressure sensitive screen, an acoustic pulse recognition touchscreen, or another presence-sensitive display technology. In some examples, UID 104 may function as an input device using one or more audio input devices, such as one or more microphones. UID 104 may function as an output (e.g., display) device using any one or more display devices, such as a liquid crystal display (LCD), dot matrix display, light emitting diode (LED) display, microLED, organic light-emitting diode (OLED) display, e-ink, or similar monochrome or color display capable of outputting visible information to a user of computing device 102. In some examples, UID 104 may function as an audio output device and may include one or more speakers, one or more headsets, or any other audio output device capable of outputting audible information to a user of computing device 102.
[0018] In some examples, UID 104 of computing device 102 may include a presence-sensitive display that may receive tactile input from a user of computing device 102. UID 104 may receive indications of the tactile input by detecting one or more gestures from a user of computing device 102 (e.g., the user touching or pointing to one or more locations of UID 104 with a finger or a stylus pen). UID 104 may present output to a user, for instance at a presence-sensitive display. UID 104 may present the output as a user interface (e.g., user interfaces 114A, 114B, 121A, and 121B), which may be associated with functionality provided by computing device 102. For example, UID 104 may present various user interfaces of components of a computing platform, operating system, applications (e.g., applications 112), or services executing at or accessible by computing device 102 (e.g., an electronic message application, an Internet browser application, a mobile operating system, etc.). A user may interact with a respective user interface to cause computing device 102 to perform operations relating to a function.
[0019] Computing device 102 may include a user interface module 106 (“UI module 106”) and mode selection module 108. Modules 106 and 108 may perform operations described herein using software, hardware, firmware, or a mixture of hardware, software, and firmware residing in and executing on computing device 102 or at one or more other remote computing devices. In some examples, modules 106 and 108 may be implemented as hardware, software, and/or a combination of hardware and software. Computing device 102 may execute modules 106 and 108 with one or more processors. Computing device 102 may execute any of modules 106 and 108 as or within a virtual machine executing on underlying hardware. Modules 106 and 108 may be implemented in various ways. For example, any of modules 106 and 108 may be implemented as a downloadable or pre-installed application or “app.” In another example, any of modules 106 and 108 may be implemented as part of an operating system of computing device 102. Other examples of computing device 102 that implement techniques of this disclosure may include additional components not shown in FIG. 1A.
[0020] UI module 106 may interpret inputs detected at UID 104. UI module 106 may relay information about the inputs detected at UID 104 to one or more associated platforms, operating systems, applications, and/or services executing at computing device 102 to cause computing device 102 to perform a function. UI module 106 may also receive information and instructions from one or more associated platforms, operating systems, applications, and/or services executing at computing device 102 (e.g., applications 112) for generating a GUI. In addition, UI module 106 may act as an intermediary between the one or more associated platforms, operating systems, applications, and/or services executing at computing device 102 and various output devices of computing device 102 (e.g., speakers, LED indicators, vibrators, etc.) to produce output (e.g., graphical, audible, tactile, etc.) with computing device 102.
[0021] In the example of FIG. 1A, one of applications 112 may send user interface data to UI module 106. In response, UI module 106 may output instructions and information to UID 104 that cause UID 104 to display a user interface according to the information received from the application. When handling input detected by UID 104, UI module 106 may receive information from UID 104 in response to inputs detected at locations of a screen of UID 104 at which elements of the user interface are displayed. UI module 106 disseminates information about inputs detected by UID 104 to other components of computing device 102 for interpreting the inputs and for causing computing device 102 to perform one or more functions in response to the inputs.
[0022] Computing device 102 includes applications 112 that include functionality to perform any variety of operations on computing device 102. For instance, applications 112 may include a web browser, an email application, text messaging application, instant messaging application, weather application, video conferencing application, social networking application, e-commerce application, stock market application, emergency alert application, sports application, office productivity application, multimedia player, etc. Applications 112 may perform operations described herein using software, hardware, firmware, or a mixture of hardware, software, and firmware residing in and executing on computing device 102 or at one or more other remote computing devices. In some examples, applications 112 may be implemented as hardware, software, and/or a combination of hardware and software. Computing device 102 may execute applications 112 with one or more processors. Computing device 102 may execute any of applications 112 as or within a virtual machine executing on underlying hardware. Applications 112 may be implemented in various ways. For example, any of applications 112 may be implemented as a downloadable or preinstalled application or “app.” In another example, any of applications 112 may be implemented as part of an operating system of computing device 102. Other examples of computing device 102 that implement techniques of this disclosure may include additional components not shown in FIG. 1A.
[0023] In the example of FIG. 1A, in a standard mode, application 112A produces and outputs user interface 114A including selectable user interface elements 115. In this example, user interface 114A includes video display functionality, but it is to be understood that other functionality such as text display, document processing, social media, and other app functionality may be provided.
[0024] Mode selection module 108 provides, to applications 112, a display mode such as the standard mode or a simple mode. In this example, mode selection module 108 indicates to application 112A that the computing device 102 is in the standard mode. The standard mode may be the default mode.
[0025] To produce user interface 114A, application 112A provides interface data to UI module 106 for the user interface 114A. UI module 106 may generate selectable user interface elements 115 using user interface controls provided by the operating system.
[0026] In the example of FIG. 1A, in the standard mode, user interface 114A includes a plurality of selectable user interface elements 115 each of which is positioned at a respective location in user interface 114A. In the example of FIG. 1A, selectable user interface elements 115 include “Play on TV” selectable user interface element 115A, “Autoplay” selectable user interface element 115B, “Closed Captions” selectable user interface element 115C, “Settings” selectable user interface element 115D, “Close” selectable user interface element 115E, “Previous” selectable user interface element 115F, “Rewind” selectable user interface element 115G, “Pause” selectable user interface element 115H, “Fast Forward” selectable user interface element 115I, and “Next” selectable user interface element 115J.
[0027] A user may select one of the selectable user interface elements 115 at the user interface 114A through the user interface device 104. For example, “Autoplay” selectable user interface element 115B may be a toggle button user interface control and the user may switch the “Autoplay” selectable user interface element 115B into an on position from an off position.
[0028] UI module 106 may interpret inputs detected at UID 104 for “Autoplay” selectable user interface element 115B and relay information about the inputs detected to application 112A. For example, application 112A may then use the selected autoplay functionality to play another video after the current video completes. Application 112A or UI module 106 may update the display of “Autoplay” selectable user interface element 115B to indicate that the autoplay functionality is on.
[0029] In the standard mode, selectable user interface elements 115 may be icons that may be difficult for inexperienced users to understand. For example, “Autoplay” selectable user interface element 115B is a toggle button user interface control which does not have any associated written indication of the functionality enabled by “Autoplay” selectable user interface element 115B. Users that are unfamiliar with the icon for the “Autoplay” selectable user interface element 115B may thus have difficulty enabling or disabling the autoplay functionality.
[0030] In order to have simplified user interface displays, computing device 102 may implement a simple mode. In the simple mode, computing device 102 produces a simplified user interface, such as user interface 114B, rather than a standard user interface, such as user interface 114A. In the simple mode, computing device 102 may replace the plurality of selectable user interface elements 115 of the standard mode with a unitary selectable user interface element 119. The unitary selectable user interface element 119 simplifies the user interface compared to the plurality of selectable user interface elements 115 of the standard mode. When the user selects the unitary selectable user interface element 119, computing device 102 may display a menu 113 of the actions 117 corresponding to the plurality of selectable user interface elements 115 of the standard mode. When the user selects one of the actions 117 from the menu 113 of the actions, computing device 102 may perform the selected action.
[0031] To produce user interface 114B, application 112A provides interface data to UI module 106 for the user interface 114B. Application 112A may provide modified interface data that allows for the production of user interface 114B. UI module 106 may then generate unitary selectable user interface element 119 and menu 113 of actions 117 in user interface 114B using user interface controls provided by the operating system.
[0032] In one example, the user selects unitary selectable user interface element 119 at user interface 114B. UI module 106 may interpret inputs detected at UID 104 for unitary selectable user interface element 119 and relay information about the inputs detected to application 112A. Application 112A may then provide interface data to UI module 106 to produce menu 113 of actions 117.
[0033] When instructed by application 112A, in the second user interface mode, UI module 106 may generate menu 113 of actions 117 using user interface controls provided by the operating system. UI module 106 may use menu-based user interface controls to produce menu 113 with actions 117.
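As a purely illustrative sketch (not part of this disclosure), the unitary-element-plus-menu pattern of paragraphs [0030]-[0033] might be realized on an Android-style platform in Kotlin as follows; the class name SimpleModePlayerControls, the button label, and the action callbacks are hypothetical assumptions.

    // Hypothetical Kotlin sketch of the unitary element and menu of actions;
    // names, labels, and callbacks below are illustrative assumptions only.
    import android.content.Context
    import android.view.View
    import android.widget.Button
    import android.widget.PopupMenu

    // Pairs a text description of an action with the callback the corresponding
    // standard-mode selectable element would have invoked.
    data class PlayerAction(val label: String, val run: () -> Unit)

    class SimpleModePlayerControls(private val context: Context) {

        // Actions corresponding to the standard mode's individual elements (115A-115J).
        private val actions = listOf(
            PlayerAction("Play on TV") { /* route playback to a TV */ },
            PlayerAction("Autoplay") { /* toggle autoplay */ },
            PlayerAction("Closed Captions") { /* toggle captions */ },
            PlayerAction("Rewind") { /* rewind the video */ },
            PlayerAction("Pause") { /* pause playback */ },
        )

        // Builds the unitary selectable element; selecting it opens a menu whose
        // entries carry text descriptions, as with menu 113 of actions 117.
        fun buildUnitaryElement(): View = Button(context).apply {
            text = "Actions"
            setOnClickListener { anchor ->
                val menu = PopupMenu(context, anchor)
                actions.forEachIndexed { i, action -> menu.menu.add(0, i, i, action.label) }
                menu.setOnMenuItemClickListener { item ->
                    actions[item.itemId].run() // perform the selected action
                    true
                }
                menu.show()
            }
        }
    }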
[0034] In the example of FIG. 1A, user interface 114B includes menu 113 that includes a plurality of actions 117 that each correspond to at least one selectable user interface element 115 from the plurality of selectable user interface elements of user interface 114A, rather than presenting those selectable user interface elements at respective locations of user interface 114B. For example, “Autoplay” action 117B in menu 113 of user interface 114B corresponds to “Autoplay” selectable user interface element 115B of user interface 114A.
[0035] The user may then select one of the actions 117 from menu 113. In the example of FIG. 1A, actions 117 include “Play on TV” action 117A, “Autoplay” action 117B, “Closed Captions” action 117C, “Settings” action 117D, “Close” action 117E, “Previous” action 117F, “Rewind” action 117G, “Pause” action 117H, “Fast Forward” action 117I, and “Next” action 117J.
[0036] Menu 113 of actions 117 may include text descriptions of functionality associated with the plurality of actions. For example, “Rewind” action 117G of FIG. 1A includes the text “Rewind”. The text descriptions may help users understand the functionality without requiring the users to remember a meaning of an icon, such as the icons of selectable user interface elements 115 of user interface 114A. Menu 113 of actions 117 of user interface 114B may also include icons indicating functionality of the plurality of selectable user interface elements. For example, “Rewind” action 117G of menu 113 may include a “Rewind” icon as well as the text “Rewind”.
[0037] When the user selects “Rewind” action 117G, computing device 102 detects user input at the location of “Rewind” action 117G. UID 104 may receive indications of the user input by detecting one or more gestures from a user of computing device 102 at the location. Responsive to detecting the user input, UI module 106 may interpret inputs detected at UID 104 for “Rewind” action 117G and relay information about the inputs detected to application 112A. For example, application 112A may perform the action, such as rewinding a video in display area 111 in response to selecting “Rewind” action 117G.
[0038] Mode selection module 108 enables the user to select between a first user interface mode, such as the standard mode, and a second user interface mode, such as the simple mode. Computing device 102 may receive user selections provided at a user interface, such as the user interface shown in FIG. 3, that cause the computing device 102 to transition into the second user interface mode.
[0039] Mode selection module 108 may also select the user interface mode based on an analysis of user behavior with respect to computing device 102, such as by analyzing whether a user has difficulty using user interfaces in the first user interface mode. Mode selection module 108 may use a rule-based or machine-learned model to evaluate user inputs to determine that the user has difficulty using user interfaces in the first user interface mode, and may then transition, or make suggestions to the user to transition, to the second user interface mode.
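A minimal, rule-based sketch of such an evaluation is shown below; the signals (stray taps, abandoned gestures) and thresholds are assumptions for illustration and are not taken from this disclosure.

    // Hypothetical rule-based sketch of detecting that a user has difficulty with
    // the standard mode; signal names and thresholds are assumptions.
    class SimpleModeSuggester(
        private val strayTapThreshold: Int = 5,        // taps that hit no interactive element
        private val abandonedGestureThreshold: Int = 3 // swipes that moved no content
    ) {
        private var strayTaps = 0
        private var abandonedGestures = 0

        fun onStrayTap() { strayTaps++ }
        fun onAbandonedGesture() { abandonedGestures++ }

        // True when observed interactions suggest the device should suggest,
        // or perform, a transition to the second user interface mode.
        fun shouldSuggestSimpleMode(): Boolean =
            strayTaps >= strayTapThreshold || abandonedGestures >= abandonedGestureThreshold
    }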
[0040] In one example, the user selects the simple mode through a user interface such as the user interface shown in FIG. 3. Mode selection module 108 receives and stores the simple mode as the current display mode and indicates to application 112A that the computing device 102 is now in the simple mode.
[0041] Display mode-based changes to the user interface may be applied at the application level. Applications 112 may receive indications of the user interface mode from mode selection module 108 and, based on the user interface mode, determine the user interface elements to include in the user interface. Display mode-based changes to the user interface may also be applied at the operating system level such that when the computing device 102 changes into the second user interface mode, UI module 106 may modify the user interface without requiring a modification of code for application 112A. In either case, rather than generating user interface 114A, in the second user interface mode, UI module 106 generates a modified user interface, such as user interface 114B.
[0042] FIG. 1B is a conceptual diagram illustrating an example computing device 102 that may operate in a simplified user interface mode in accordance with one or more aspects of the present disclosure. Computing device 102, user interface device 104, UI module 106, applications 112, and mode selection module 108 are described above with respect to FIG. 1A.
[0043] Mode selection module 108 may allow the user to select between a first user interface mode, such as a standard mode, and a second user interface mode, such as a simple mode. Computing device 102 may receive user selections through a user interface such as the user interface shown in FIG. 3 that cause computing device 102 to transition into the second user interface mode. Mode selection module 108 may also select the user interface mode based on an analysis of user behavior with respect to computing device 102 such as by analyzing whether a user has difficulty using user interfaces in the first user interface mode.
[0044] In the example of FIG. 1B, in a standard mode, application 112B produces a user interface 121A including gesture functionality, for example horizontal scrolling gestures, vertical scrolling gestures, back gestures, dismiss gestures, and gestures to display all applications. In this example, user interface 121A includes a display of selectable content, but it is to be understood that other functionality such as video display, text display, document processing, social media, and other app functionality may be provided.
[0045] Mode selection module 108 provides to application 112B a display mode such as the standard mode or a simple mode. In this example, mode selection module 108 indicates to application 112B that the computing device 102 is in the standard mode. The standard mode may be the default mode.
[0046] Application 112B provides interface data to UI module 106 for the user interface 121 A. UI module 106 may generate gesture functionality, such as horizontal scrolling gesture functionality, using user interface controls provided by the operating system.
[0047] Gesture functionality is often used with touchscreen devices for actions such as navigating through content or selecting items. For example, for horizontal scrolling functionality, to perform a left swipe, a user drags a finger across the screen from the right side to the left side, and to perform a right swipe, the user drags a finger across the screen from the left side to the right side. Such gesture functionality may not be intuitive for all users given that user interfaces often give no visual indication of the gesture functionality.
[0048] Computing device 102 interprets gestures from the user as associated with actions. For example, computing device 102 may interpret user horizontal scrolling gestures to the left or to the right so as to move content in portion 123. For example, when a user contacts portion 123 of user interface 121A, UID 104 receives indications of the tactile input in portion 123 of user interface 121A. UI module 106 may interpret gestures detected at UID 104 and relay information about the inputs detected to application 112B. Application 112B may then update the display of content in portion 123. In the standard mode, the horizontal scrolling functionality may be difficult for inexperienced users to understand and/or use because no visible indication of the horizontal scrolling functionality is provided.
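The following is a hypothetical Kotlin sketch of how standard-mode horizontal swipe gestures might be interpreted; the 100-pixel threshold and the function name are illustrative assumptions, not part of this disclosure.

    // Hypothetical sketch of interpreting a horizontal swipe in the standard mode.
    import android.annotation.SuppressLint
    import android.view.MotionEvent
    import android.view.View
    import kotlin.math.abs

    @SuppressLint("ClickableViewAccessibility")
    fun attachSwipeInterpreter(target: View, onSwipe: (toLeft: Boolean) -> Unit) {
        var downX = 0f
        target.setOnTouchListener { _, event ->
            when (event.actionMasked) {
                MotionEvent.ACTION_DOWN -> downX = event.x
                MotionEvent.ACTION_UP -> {
                    val dx = event.x - downX
                    // Treat a drag of at least 100 px (an assumed threshold) as a
                    // swipe; its sign gives the scroll direction.
                    if (abs(dx) >= 100f) onSwipe(dx < 0)
                }
            }
            true // consume the gesture stream
        }
    }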
[0049] In order to have simplified user interface displays, computing device 102 may implement a simple mode. In the simple mode, computing device 102 produces a simplified user interface, such as user interface 121B, rather than a standard user interface, such as user interface 121A.
[0050] Mode selection module 108 allows the user to select between a first user interface mode, such as the standard mode, and a second user interface mode, such as the simple mode. Computing device 102 may receive user selections through a user interface such as the user interface shown in FIG. 3 that cause the computing device 102 to transition into the second user interface mode (simple mode). Details of the mode selection module 108 are discussed above with respect to FIG. 1A.
[0051] In one example, the user selects the simple mode through a user interface such as the user interface shown in FIG. 3. Mode selection module 108 receives and stores the simple mode as the current display mode and indicates to application 112B that the computing device 102 is now in the simple mode.
[0052] To produce user interface 121B, application 112B provides interface data to UI module 106 for the user interface 121B. Application 112B may provide modified interface data that allows for the production of user interface 121B. UI module 106 may then generate user interface elements, such as arrow buttons 125A and 125B, using user interface controls provided by the operating system. In the example of FIG. 1B, arrow buttons 125A and 125B may replace the horizontal scrolling gesture functionality present in user interface 121A. Arrow buttons 125A and 125B may be at least partially transparent to allow content beneath the buttons to be partially visible. UI module 106 may position arrow buttons 125A and 125B in portion 123.
[0053] In one example, application 112B may make display mode-based changes to the user interface. Application 112B may receive an indication of the user interface mode from mode selection module 108 and, based on the user interface mode, determine functionality at user interface 121B. Application 112B may use the mode indication and then provide the gesture functionality in the first user interface mode and user interface element(s), such as the arrow buttons 125A and 125B, in the second user interface mode.
[0054] Display mode-based changes to the user interface may be applied at the operating system level such that when the computing device 102 is put into the second user interface mode, UI module 106 may modify the user interface without requiring a modification of code for application 112B. Display mode-based changes to the user interface may also be applied at the application level. Applications 112 may receive indications of the user interface mode from mode selection module 108 and, based on the user interface mode, determine the user interface elements to include in the user interface. In either case, rather than generating user interface 121A, in the second user interface mode, UI module 106 may generate a modified user interface such as user interface 121B with user interface elements, such as arrow buttons 125A and 125B.
[0055] In one example, the user selects a user interface element, such as arrow button 125A, at the user interface 121B. UI module 106 may interpret inputs detected at UID 104 for the user interface element, such as arrow button 125A, and relay information about the inputs detected to application 112B. In response, application 112B may update user interface 121B by performing an action, such as scrolling the content in portion 123 to the right when arrow button 125A is pressed.
[0056] Computing device 102 may detect user input at a location at which the user interface element, such as one of arrow buttons 125A and 125B, is located. For example, UID 104 may receive indications of the user input by detecting one or more gestures from a user of computing device 102 at the location. Responsive to detecting the user input, computing device 102 may perform the selected action, such as horizontally scrolling portion 123 of user interface 121B. As a result of horizontal scrolling, content in portion 123 of user interface 121B may move to show hidden pictures, videos, or other content. For horizontal scrolling functionality, responsive to detecting the user input at the location of arrow button 125A, computing device 102 may horizontally scroll portion 123 of user interface 121B to the right and, responsive to detecting the user input at the location of arrow button 125B, computing device 102 may horizontally scroll portion 123 of user interface 121B to the left.
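A minimal Kotlin sketch of the arrow-button replacement described in paragraphs [0052]-[0056] follows, assuming the scrollable portion is an Android HorizontalScrollView; the step size and transparency value are illustrative assumptions.

    // Hypothetical sketch of arrow buttons replacing the horizontal swipe gesture.
    import android.widget.HorizontalScrollView
    import android.widget.ImageButton

    fun wireArrowButtons(
        portion: HorizontalScrollView,
        leftArrow: ImageButton,
        rightArrow: ImageButton,
        stepPx: Int = 300 // assumed scroll step per tap
    ) {
        // Tapping an arrow scrolls the content, replacing the swipe gesture.
        leftArrow.setOnClickListener { portion.smoothScrollBy(-stepPx, 0) }
        rightArrow.setOnClickListener { portion.smoothScrollBy(stepPx, 0) }
        // The arrows may be drawn partially transparent so that content beneath
        // them remains partially visible, as described above.
        leftArrow.alpha = 0.6f
        rightArrow.alpha = 0.6f
    }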
[0057] A computing device with a simple mode as described above with respect to FIGS. 1A and 1B has a number of technical advantages. For example, a simplified user interface may result in a more efficient use of the computing device by some users. When a user does not use a device effectively, the user may spend substantial amounts of time attempting to perform a desired function. Such attempts may result in power use that may drain a battery of the computing device as well as cause excess cellular and Wi-Fi bandwidth usage that may affect other computing devices on a wireless network.
[0058] FIG. 2 is a block diagram illustrating an example computing device, in accordance with one or more aspects of the present disclosure. Computing device 202 of FIG. 2 is an example of computing device 102 of FIG. 1. Computing device 202 is only one particular example of computing device 102 of FIG. 1, and many other examples of computing devices may be used in other instances. In the example of FIG. 2, computing device 202 may be a mobile computing device (e.g., a smartphone), or any other computing device. Computing device 202 of FIG. 2 may include a subset of the components included in computing device 102 or may include additional components not shown in FIG. 2.
[0059] As shown in the example of FIG. 2, computing device 202 includes user interface device 204 (“UID 204”), one or more processors 240, one or more input devices 242, one or more communication units 244, one or more output devices 246, and one or more storage devices 248. Storage devices 248 of computing device 202 also include operating system 254, mode selection module 208, UI module 206, and applications 212.
[0060] Communication channels 250 may interconnect each of the components 240, 242, 244, 246, 248, and 204 for inter-component communications (physically, communicatively, and/or operatively). In some examples, communication channels 250 may include a system bus, a network connection, an inter-process communication data structure, or any other method for communicating data.
[0061] One or more input devices 242 of computing device 202 may be configured to receive input. Examples of input are tactile, audio, and video input. Input devices 242 of computing device 202, in one example, include a presence-sensitive display, touch-sensitive screen, mouse, keyboard, voice responsive system, video camera, microphone, or any other type of device for detecting input from a human or machine.
[0062] One or more output devices 246 of computing device 202 may be configured to generate output. Examples of output are tactile, audio, and video output. Output devices 246 of computing device 202, in one example, include a presence-sensitive organic light emitting diode (OLED) display, sound card, video graphics adapter card, speaker, monitor, a presence-sensitive liquid crystal display (LCD), or any other type of device for generating output to a human or machine.
[0063] One or more communication units 244 of computing device 202 may communicate with external devices via one or more wired and/or wireless networks by transmitting and/or receiving network signals on the one or more networks. Examples of communication unit 244 include a network interface card (e.g., such as an Ethernet card), an optical transceiver, a radio frequency transceiver, a GPS receiver, or any other type of device that may send and/or receive information. Other examples of communication units 244 may include short wave radios, cellular data radios, wireless network radios, as well as universal serial bus (USB) controllers.
[0064] In some examples, UID 204 of computing device 202 may include functionality of input devices 242 and/or output devices 246. In the example of FIG. 2, UID 204 may be or may include a presence-sensitive input device. In some examples, a presence-sensitive input device may detect an object at and/or near a screen. As one example range, a presence-sensitive input device may detect an object, such as a finger or stylus, that is within 2 inches or less of the screen. The presence-sensitive input device may determine a location (e.g., an (x,y) coordinate) of a screen at which the object was detected. In another example range, a presence-sensitive input device may detect an object six inches or less from the screen, and other ranges are also possible. The presence-sensitive input device may determine the location of the screen selected by a user’s finger using capacitive, inductive, and/or optical recognition techniques. In some examples, a presence-sensitive input device also provides output to a user using tactile, audio, or video stimuli as described with respect to output device 246, e.g., at a display. In the example of FIG. 2, UID 204 may present a user interface.
[0065] While illustrated as an internal component of computing device 202, UID 204 also represents an external component that shares a data path with computing device 202 for transmitting and/or receiving input and output. For instance, in one example, UID 204 represents a built-in component of computing device 202 located within and physically connected to the external packaging of computing device 202 (e.g., a screen on a mobile phone). In another example, UID 204 represents an external component of computing device 202 located outside and physically separated from the packaging of computing device 202 (e.g., a monitor, a projector, etc. that shares a wired and/or wireless data path with a tablet computer).
[0066] One or more storage devices 248 within computing device 202 may store information for processing during operation of computing device 202. In some examples, storage device 248 is a temporary memory, meaning that a primary purpose of storage device 248 is not long-term storage. Storage devices 248 on computing device 202 may be configured for short-term storage of information as volatile memory and therefore do not retain stored contents if powered off. Examples of volatile memories include random access memories (RAM), dynamic random access memories (DRAM), static random access memories (SRAM), and other forms of volatile memories known in the art.
[0067] Storage devices 248, in some examples, also include one or more computer-readable storage media. Storage devices 248 may be configured to store larger amounts of information than volatile memory. Storage devices 248 may further be configured for long-term storage of information as non-volatile memory space and retain information after power on/off cycles. Examples of non-volatile memories include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories. Storage devices 248 may store program instructions and/or information (e.g., data) associated with operating system 254, mode selection module 208, and applications 212.
[0068] One or more processors 240 may implement functionality and/or execute instructions within computing device 202. For example, processors 240 of computing device 202 may receive and execute instructions stored by storage devices 248 that provide the functionality of operating system 254, mode selection module 208, and applications 212.
[0069] Operating system 254, mode selection module 208, and applications 212 are described below as executing at one or more processors 240. That is, one or more processors 240 are configured to execute the instructions of operating system 254, mode selection module 208, and applications 212 to perform the functionality of those components described below.
[0070] Operating system 254 may be composed of several layers, each building upon the previous one to provide the overall functionality of operating system 254. The layers may include a kernel responsible for providing low-level system services such as memory management, process management, and device drivers; native libraries for system services such as graphics, media, and database access; a runtime or virtual machine executing applications; and an application framework providing a set of Application Programming Interfaces (APIs) and services for developers to build applications. These layers interact with each other to provide the features and functionality of the operating system. A smartphone launcher may be used to allow users to launch and interact with apps. Smartphone launchers typically include a home screen with app shortcuts, a drawer for accessing all installed apps, and an app tray for holding widgets.
[0071] Applications 212 may include applications 212A and 212B, and one or more processors 240 are configured to execute applications 212A and 212B. For example, one or more processors 240 may be configured to execute applications 212A and 212B to produce user interfaces with user interface elements that are different based on a selected user interface mode.
[0072] Mode selection module 208 may allow users to select between two user interface modes: a standard mode and a simple mode. Users may select the user interface mode through a user interface, such as the one shown in FIG. 3. The mode selection module 208 may also select the mode based on an analysis of user behavior, such as whether the user has difficulty using the computing device in the standard mode.
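For purposes of illustration only, the following Kotlin sketch shows one way a mode selection module such as mode selection module 208 might be structured. The `UiMode` and `InteractionStats` types, the behavior signals, and the threshold values are assumptions made for the sketch and are not details taken from this disclosure.

```kotlin
// Hypothetical sketch only; names and thresholds are illustrative
// assumptions, not part of the disclosure.
enum class UiMode { STANDARD, SIMPLE }

// Illustrative signals that an analysis of user behavior might consider.
data class InteractionStats(
    val missedTapsPerSession: Double,       // taps that hit no selectable element
    val abandonedGesturesPerSession: Double // gestures started but not completed
)

class ModeSelectionModule(private var mode: UiMode = UiMode.STANDARD) {

    // Explicit selection, e.g., from a settings page such as page 360 of FIG. 3.
    fun selectMode(selected: UiMode) {
        mode = selected
    }

    // Behavior-based selection: switch to the simple mode when the stats
    // suggest the user has difficulty with the standard mode.
    fun updateFromBehavior(stats: InteractionStats) {
        if (stats.missedTapsPerSession > 5.0 || stats.abandonedGesturesPerSession > 3.0) {
            mode = UiMode.SIMPLE
        }
    }

    fun currentMode(): UiMode = mode
}
```

Applications could query `currentMode()` when building their user interfaces, which is the branching point the following paragraphs describe.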
[0073] Applications 212 may modify their user interfaces based on the selected mode. In the simple mode, applications 212 may make the user interfaces easier to use. In the simple mode, application 212A may generate a menu using user interface controls provided by the operating system. The menu may include a plurality of actions, each of which corresponds to one or more selectable user interface elements from the user interface of the standard mode as shown in FIG. 1A. In the simple mode, application 212B may produce a user interface that includes user interface elements, such as arrow buttons, that replace gesture functionality, such as horizontal scrolling gesture functionality, present in the user interface of the standard mode as shown in FIG. 1B. Operating system 254 may also modify the user interfaces based on the selected mode.

[0074] FIG. 3 is a block diagram illustrating a user interface for selecting a simple user interface mode, in accordance with one or more aspects of the present disclosure. The computing device may produce page 360 to enable the selection of a user interface mode. Selector 362 may allow a user to switch between a simple mode and a standard mode. The mode selection module and applications may receive input from selector 362 to control which user interface mode is displayed. Display options at page 360 may allow text size selection based on transitioning into the second user interface mode, such as the simple mode. Page 360 may also allow the user to adjust the display options for the second user interface mode, such as the simple mode.
[0075] If the simple mode is selected using selector 362, default values for functionality associated with the simple mode may be shown in page 360, and these values may be updated as discussed below. Selector 364 may be used to toggle the menu selection functionality described above with respect to FIG. 1A on and off in the simple mode. Selector 366 may be used to toggle the arrow scrolling functionality described above with respect to FIG. 1B on and off.
[0076] Buttons 368 may be used to select the text size for the simple mode. A default text size of the simple mode may be set larger than the default text size for the standard mode. A larger text size may help users use the computing device in the simple mode.

[0077] Buttons 370 may be used to select the display size for the simple mode. A default display size of the simple mode may be set larger than the default display size for the standard mode. A larger display size may help users use the computing device in the simple mode.
[0078] Buttons 372 may be used to select the contrast for the simple mode. A default contrast of the simple mode may be set greater than the default contrast for the standard mode. Greater contrast may help users use the computing device in the simple mode.

[0079] Slider 374 may be used to select the brightness for the simple mode. A default brightness of the simple mode may be set greater than the default brightness for the standard mode. Greater brightness may help users use the computing device in the simple mode.
[0080] Slider 376 may be used to select the volume for the simple mode. A default volume of the simple mode may be set greater than the default volume for the standard mode. Increased volume may help users use the computing device in the simple mode.

[0081] Touch and hold slider 378 may adjust the length of time before a tap on the screen becomes a touch and hold. Touch and hold functionality may enable selection of user interface elements, for example to move such user interface elements in the display. A default length of time of the simple mode may be set greater than the default length of time for the standard mode. The greater length of time may help users avoid accidentally enabling the touch and hold functionality.
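For purposes of illustration only, the per-mode settings of page 360 (selectors 364 and 366, buttons 368, 370, and 372, and sliders 374, 376, and 378) might be grouped as a bundle of options whose simple-mode defaults are larger or greater than the standard-mode defaults. In the Kotlin sketch below, every field name and default value is an assumption made for the sketch, not a value specified in this disclosure.

```kotlin
// Illustrative bundle of per-mode options; all names and values here are
// assumptions for the sketch, not values from the disclosure.
data class ModeOptions(
    val menuSelectionEnabled: Boolean,  // selector 364 functionality
    val arrowScrollingEnabled: Boolean, // selector 366 functionality
    val textScale: Float,               // buttons 368
    val displayScale: Float,            // buttons 370
    val contrast: Float,                // buttons 372, 0.0..1.0
    val brightness: Float,              // slider 374, 0.0..1.0
    val volume: Float,                  // slider 376, 0.0..1.0
    val touchAndHoldMillis: Long        // slider 378
)

// Simple-mode defaults set larger/greater than standard-mode defaults,
// mirroring the behavior described for page 360.
val standardDefaults = ModeOptions(
    menuSelectionEnabled = false, arrowScrollingEnabled = false,
    textScale = 1.0f, displayScale = 1.0f, contrast = 0.5f,
    brightness = 0.5f, volume = 0.5f, touchAndHoldMillis = 500L
)

val simpleDefaults = ModeOptions(
    menuSelectionEnabled = true, arrowScrollingEnabled = true,
    textScale = 1.3f, displayScale = 1.2f, contrast = 0.7f,
    brightness = 0.7f, volume = 0.7f, touchAndHoldMillis = 900L
)
```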
[0082] FIG. 4 is a flowchart illustrating example operations of an example computing system, in accordance with one or more aspects of the present disclosure. For purposes of illustration only, the example operations are described below within the context of computing device 102 of FIG. 1A.

[0083] Computing device 102 may output, in a first user interface mode, a first user interface of an application, the first user interface including a plurality of selectable user interface elements each of which is positioned at a respective location of the first user interface (402). The selectable user interface elements may be icons without associated text.
[0084] Computing device 102 may transition from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode (404).
[0085] While operating in the second user interface mode, computing device 102 may output a second user interface of the application, the second user interface including a unitary selectable user interface element that replaces the plurality of selectable user interface elements (406). The unitary selectable user interface element combines multiple standard UI elements into a single UI element in the second user interface mode.
[0086] Computing device 102 may detect a first user input selecting the unitary selectable user interface element (408). Responsive to detecting the first user input, computing device 102 may display a menu of the actions corresponding to the plurality of selectable user interface elements (410). The actions may include text that allows some users to better understand the functionality associated with the actions.

[0087] Computing device 102 may detect a second user input selecting one of the actions from the menu of the actions (412). Responsive to detecting the second user input, computing device 102 may perform the selected action (414).
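For purposes of illustration only, the FIG. 4 flow (402-414) might be sketched in Kotlin as follows, with one unitary element opening a text-labeled menu of actions. The `Action` type, the example labels, and the rendering are hypothetical.

```kotlin
// Hypothetical sketch of the FIG. 4 flow; types and names are illustrative.
data class Action(val label: String, val perform: () -> Unit)

class SimpleModeScreen(private val actions: List<Action>) {

    // Step 406: a single unitary element replaces the individual icons.
    fun render(): String = "[ Menu ]"

    // Steps 408-410: selecting the unitary element displays a menu of the
    // actions, each described with text rather than an icon alone.
    fun onUnitaryElementSelected(): List<String> = actions.map { it.label }

    // Steps 412-414: selecting an entry from the menu performs the action.
    fun onMenuEntrySelected(index: Int) = actions[index].perform()
}

fun main() {
    val screen = SimpleModeScreen(
        listOf(
            Action("Reply") { println("replying") },
            Action("Forward") { println("forwarding") },
            Action("Archive") { println("archiving") }
        )
    )
    println(screen.render())                   // [ Menu ]
    println(screen.onUnitaryElementSelected()) // [Reply, Forward, Archive]
    screen.onMenuEntrySelected(2)              // archiving
}
```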
[0088] FIG. 5 is a flowchart illustrating example operations of an example computing system, in accordance with one or more aspects of the present disclosure. For purposes of illustration only, the example operations are described below within the context of computing device 102 of FIG. 1B.

[0089] Computing device 102 may output, in a first user interface mode, a first user interface of an application, the first user interface including gesture functionality associated with an action (502). Computing device 102 may transition from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode (504). Computing device 102 may, while operating in the second user interface mode, output, for display, a second user interface of the application, the second user interface including a user interface element that replaces the gesture functionality present in the first user interface mode (506).
[0090] Computing device 102 may detect a user input at a location at which the user interface element is displayed (508). Computing device 102 may, responsive to detecting the user input, perform the action (510).
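For purposes of illustration only, the FIG. 5 flow (502-510) might be sketched in Kotlin as follows, with arrow buttons standing in for a horizontal scrolling gesture. The class, its state, and its method names are assumptions made for the sketch.

```kotlin
// Hypothetical sketch of the FIG. 5 flow; names and state are illustrative.
class ArrowScrollRow(itemCount: Int, visibleCount: Int) {

    private val maxFirstVisible = maxOf(0, itemCount - visibleCount)
    private var firstVisible = 0

    // Steps 506-510: in the simple mode, taps on arrow buttons replace the
    // horizontal swipe gesture of the standard mode.
    fun onLeftArrowTapped() {
        firstVisible = maxOf(0, firstVisible - 1)
    }

    fun onRightArrowTapped() {
        firstVisible = minOf(maxFirstVisible, firstVisible + 1)
    }

    // An arrow may be shown only while it points toward additional content
    // (cf. examples 27 and 38 below).
    fun showLeftArrow(): Boolean = firstVisible > 0
    fun showRightArrow(): Boolean = firstVisible < maxFirstVisible
}
```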
[0091] This disclosure includes the following examples.
[0092] Example 1. A method comprising: outputting, by one or more processors of a computing device operating in a first user interface mode, a first user interface of an application, the first user interface including a plurality of selectable user interface elements each of which is positioned at a respective location of the first user interface and each of which corresponds to an action; transitioning, by the one or more processors, from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; and while operating in the second user interface mode: outputting, by the one or more processors, a second user interface of the application, the second user interface including a unitary selectable user interface element that replaces the plurality of selectable user interface elements; detecting, by the one or more processors, a first user input selecting the unitary selectable user interface element; responsive to detecting the first user input, displaying a menu of the actions corresponding to the plurality of selectable user interface elements; detecting, by the one or more processors, a second user input selecting one of the actions from the menu of the actions; and responsive to detecting the second user input, performing, by the one or more processors, the selected action.
[0093] Example 2. The method of example 1, wherein the menu of the actions of the second user interface includes text descriptions of the actions.
[0094] Example 3. The method of example 2, wherein the menu of the actions includes icons associated with the text descriptions.
[0095] Example 4. The method of any of examples 1-3, further comprising receiving, by the one or more processors, a user selection to enter the second user interface mode before transitioning into the second user interface mode.
[0096] Example 5. The method of any of examples 1-4, further comprising determining, by the one or more processors, based on user interactions with the computing device, to enter the second user interface mode before transitioning into the second user interface mode.
[0097] Example 6. The method of any of examples 1-5, further comprising changing, by the one or more processors, display options, including text size, based on transitioning into the second user interface mode.
[0098] Example 7. The method of example 6, further comprising providing a page to adjust the display options for the second user interface mode.
[0099] Example 8: A device comprising means for performing any combination of the methods of examples 1-7.
[0100] Example 9: A system implementing any combination of the methods of examples 1-7.
[0101] Example 10: A non-transitory computer-readable storage medium implementing any combination of the methods of examples 1-7.
[0102] Example 11. A computing device comprising: memory; and one or more processors communicably coupled to the memory and configured to: output, in a first user interface mode, a first user interface of an application, the first user interface including a plurality of selectable user interface elements each of which is positioned at a respective location of the first user interface and each of which corresponds to an action; transition from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; and while operating in the second user interface mode: output a second user interface of the application, the second user interface including a unitary selectable user interface element that replaces the plurality of selectable user interface elements; detect a first user input selecting the unitary selectable user interface element; responsive to detecting the first user input, display a menu of the actions corresponding to the plurality of selectable user interface elements; detect a second user input selecting one of the actions from the menu of the actions; and responsive to detecting the second user input, perform the selected action.
[0103] Example 12. The computing device of example 11, wherein the menu of the actions of the second user interface includes text descriptions of the actions.
[0104] Example 13. The computing device of example 12, wherein the menu of the actions includes icons associated with the text descriptions.
[0105] Example 14. The computing device of any of examples 11-13, wherein the one or more processors are configured to receive a user selection to enter the second user interface mode before transitioning into the second user interface mode.
[0106] Example 15. The computing device of any of examples 11-14, wherein the one or more processors are configured to determine, based on user interactions with the computing device, to enter the second user interface mode before transitioning into the second user interface mode.
[0107] Example 16. The computing device of any of examples 11-15, wherein the one or more processors are configured to change display options, including text size, based on transitioning into the second user interface mode.
[0108] Example 17. The computing device of example 16, wherein the one or more processors are configured to provide a page to adjust the display options for the second user interface mode.
[0109] Example 18. A computer-readable storage medium having stored thereon instructions that, when executed, cause one or more processors of a computing device to: output, in a first user interface mode, a first user interface of an application, the first user interface including a plurality of selectable user interface elements each of which is positioned at a respective location of the first user interface and each of which corresponds to an action; transition from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; and while operating in the second user interface mode: output a second user interface of the application, the second user interface including a unitary selectable user interface element that replaces the plurality of selectable user interface elements; detect a first user input selecting the unitary selectable user interface element; responsive to detecting the first user input, display a menu of the actions corresponding to the plurality of selectable user interface elements; detect a second user input selecting one of the actions from the menu of the actions; and responsive to detecting the second user input, perform the selected action.
[0110] Example 19. The computer-readable storage medium of example 18, wherein the menu of the actions of the second user interface includes text descriptions of the actions.

[0111] Example 20. The computer-readable storage medium of example 19, wherein the menu of the actions includes icons associated with the text descriptions.
[0112] Example 21. The computer-readable storage medium of any of examples 18-20, wherein the instructions, when executed, further cause the one or more processors to receive a user selection to enter the second user interface mode before transitioning into the second user interface mode.
[0113] Example 22. The computer-readable storage medium of any of examples 18-21, wherein the instructions, when executed, further cause the one or more processors to determine, based on user interactions with the computing device, to enter the second user interface mode before transitioning into the second user interface mode.
[0114] Example 23. The computer-readable storage medium of any of examples 18-22, wherein the instructions, when executed, further cause the one or more processors to change display options, including text size, based on transitioning into the second user interface mode.
[0115] Example 24. The computer-readable storage medium of example 23, wherein the instructions, when executed, further cause the one or more processors to provide a page to adjust the display options for the second user interface mode.
[0116] Example 25: A method comprising: outputting, by one or more processors of a computing device operating in a first user interface mode, a first user interface of an application, the first user interface including gesture functionality associated with an action; transitioning, by the computing device, from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; while operating in the second user interface mode, outputting, by one or more processors of the computing device, a second user interface of the application, the second user interface including a user interface element that replaces the gesture functionality present in the first user interface mode; detecting, by the computing device, a user input at a location at which the user interface element is displayed; and responsive to detecting the user input, performing the action.

[0117] Example 26. The method of example 25, wherein the gesture functionality is horizontal scrolling gesture functionality, the user interface element includes a pair of arrow buttons, and the action is horizontally scrolling a portion of the second user interface.
[0118] Example 27: The method of example 26, wherein one of the arrow buttons is positioned in the portion and points toward additional content for the portion.
[0119] Example 28: The method of any of examples 26-27, wherein one of the arrow buttons is at least partially transparent.
[0120] Example 29. The method of any of examples 25-28, further comprising receiving, by the computing device, a user selection to enter the second user interface mode before transitioning into the second user interface mode.
[0121] Example 30: The method of any of examples 25-29, further comprising determining, by the computing device, based on user interactions with the computing device, to enter the second user interface mode before transitioning into the second user interface mode.
[0122] Example 31: The method of any of examples 25-30, further comprising changing, by the computing device, display options, including text size, based on transitioning into the second user interface mode.
[0123] Example 32: The method of example 31, further comprising providing a page to adjust the display options for the second user interface mode.
[0124] Example 33: A device comprising means for performing any combination of the methods of examples 25-32.
[0125] Example 34: A system implementing any combination of the methods of examples 25-32.
[0126] Example 35: A non-transitory computer-readable storage medium implementing any combination of the methods of examples 25-32.
[0127] Example 36: A computing device comprising: a presence-sensitive screen; memory; and one or more processors communicably coupled to the memory and configured to: output, in a first user interface mode, a first user interface of an application, the first user interface including gesture functionality associated with an action; transition from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; while operating in the second user interface mode, output a second user interface of the application, the second user interface including a user interface element that replaces the gesture functionality present in the first user interface mode; detect a user input at a location at which the user interface element is displayed; and responsive to detecting the user input, perform the action.

[0128] Example 37. The computing device of example 36, wherein the gesture functionality is horizontal scrolling gesture functionality, the user interface element includes a pair of arrow buttons, and the action is horizontally scrolling a portion of the second user interface.
[0129] Example 38: The computing device of example 37, wherein one of the arrow buttons is positioned in the portion and points toward additional content for the portion.
[0130] Example 39: The computing device of any of examples 37-38, wherein one of the arrow buttons is at least partially transparent.
[0131] Example 40: The computing device of any of examples 36-39, wherein the one or more processors are configured to receive a user selection to enter the second user interface mode before transitioning into the second user interface mode.
[0132] Example 41: The computing device of any of examples 36-40, wherein the one or more processors are configured to determine, based on user interactions with the computing device, to enter the second user interface mode before transitioning into the second user interface mode.
[0133] Example 42: The computing device of any of examples 36-41, wherein the one or more processors are configured to change display options, including text size, based on transitioning into the second user interface mode.
[0134] Example 43: The computing device of example 42, wherein the one or more processors are configured to provide a page to adjust the display options for the second user interface mode.
[0135] Example 44: A computer-readable storage medium having stored thereon instructions that, when executed, cause one or more processors of a computing device to: output, in a first user interface mode, a first user interface of an application, the first user interface including gesture functionality associated with an action; transition from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; while operating in the second user interface mode, output a second user interface of the application, the second user interface including a user interface element that replaces the gesture functionality present in the first user interface mode; detect a user input at a location at which the user interface element is displayed; and responsive to detecting the user input, perform the action.

[0136] Example 45. The computer-readable storage medium of example 44, wherein the gesture functionality is horizontal scrolling gesture functionality, the user interface element includes a pair of arrow buttons, and the action is horizontally scrolling a portion of the second user interface.
[0137] Example 46: The computer-readable storage medium of example 45, wherein one of the arrow buttons is positioned in the portion and points toward additional content for the portion.
[0138] Example 47: The computer-readable storage medium of any of examples 45-46, wherein one of the arrow buttons is at least partially transparent.
[0139] Example 48: The computer-readable storage medium of any of examples 44-47, wherein the instructions cause one or more processors to receive a user selection to enter the second user interface mode before transitioning into the second user interface mode.
[0140] Example 49: The computer-readable storage medium of any of examples 44-48, wherein the instructions cause one or more processors to determine, based on user interactions with the computing device, to enter the second user interface mode before transitioning into the second user interface mode.
[0141] Example 50: The computer-readable storage medium of any of examples 44-49, wherein the instructions cause one or more processors to change display options, including text size, based on transitioning into the second user interface mode.
[0142] Example 51: The computer-readable storage medium of example 50, wherein the instructions cause one or more processors to provide a page where the user may adjust the display options for the second user interface mode.
[0143] In one or more examples, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over, as one or more instructions or code, a computer-readable medium and executed by a hardware-based processing unit. Computer-readable media may include computer-readable storage media, which corresponds to a tangible medium such as data storage media, or communication media including any medium that facilitates transfer of a computer program from one place to another, e.g., according to a communication protocol. In this manner, computer-readable media generally may correspond to (1) tangible computer-readable storage media, which is non-transitory or (2) a communication medium such as a signal or carrier wave. Data storage media may be any available media that can be accessed by one or more computers or one or more processors to retrieve instructions, code and/or data structures for implementation of the techniques described in this disclosure. A computer program product may include a computer-readable medium.
[0144] By way of example, and not limitation, such computer-readable storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage, or other magnetic storage devices, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if instructions are transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. It should be understood, however, that computer-readable storage media and data storage media do not include connections, carrier waves, signals, or other transient media, but are instead directed to non-transient, tangible storage media. Disk and disc, as used herein, includes compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
[0145] Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor,” as used herein may refer to any of the foregoing structures or any other structure suitable for implementation of the techniques described herein. In addition, in some aspects, the functionality described herein may be provided within dedicated hardware and/or software modules. Also, the techniques could be fully implemented in one or more circuits or logic elements.
[0146] The techniques of this disclosure may be implemented in a wide variety of devices or apparatuses, including a wireless handset, an integrated circuit (IC) or a set of ICs (e.g., a chip set). Various components, modules, or units are described in this disclosure to emphasize functional aspects of devices configured to perform the disclosed techniques, but do not necessarily require realization by different hardware units. Rather, as described above, various units may be combined in a hardware unit or provided by a collection of interoperative hardware units, including one or more processors as described above, in conjunction with suitable software and/or firmware.

[0147] Various examples of the disclosure have been described. Any combination of the described systems, operations, or functions is contemplated. These and other examples are within the scope of the following claims.

Claims

WHAT IS CLAIMED IS:
1. A method comprising: outputting, by one or more processors of a computing device operating in a first user interface mode, a first user interface of an application, the first user interface including a plurality of selectable user interface elements each of which is positioned at a respective location of the first user interface and each of which corresponds to an action; transitioning, by the one or more processors, from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; and while operating in the second user interface mode: outputting, by the one or more processors, a second user interface of the application, the second user interface including a unitary selectable user interface element that replaces the plurality of selectable user interface elements; detecting, by the one or more processors, a first user input selecting the unitary selectable user interface element; responsive to detecting the first user input, displaying a menu of the actions corresponding to the plurality of selectable user interface elements; detecting, by the one or more processors, a second user input selecting one of the actions from the menu of the actions; and responsive to detecting the second user input, performing, by the one or more processors, the selected action.
2. The method of claim 1, wherein the menu of the actions of the second user interface includes text descriptions of the actions.
3. The method of claim 2, wherein the menu of the actions includes icons associated with the text descriptions.
4. The method of any of claims 1-3, further comprising receiving, by the one or more processors, a user selection to enter the second user interface mode before transitioning into the second user interface mode.
5. The method of any of claims 1-4, further comprising determining, by the one or more processors, based on user interactions with the computing device, to enter the second user interface mode before transitioning into the second user interface mode.
6. The method of any of claims 1-5, further comprising changing, by the one or more processors, display options, including text size, based on transitioning into the second user interface mode.
7. The method of claim 6, further comprising providing a page to adjust the display options for the second user interface mode.
8. A computing device comprising: memory; and one or more processors communicably coupled to the memory and configured to: output, in a first user interface mode, a first user interface of an application, the first user interface including a plurality of selectable user interface elements each of which is positioned at a respective location of the first user interface and each of which corresponds to an action; transition from operating in the first user interface mode to operating in a second user interface mode different from the first user interface mode; and while operating in the second user interface mode: output a second user interface of the application, the second user interface including a unitary selectable user interface element that replaces the plurality of selectable user interface elements; detect a first user input selecting the unitary selectable user interface element; responsive to detecting the first user input, display a menu of the actions corresponding to the plurality of selectable user interface elements; detect a second user input selecting one of the actions from the menu of the actions; and responsive to detecting the second user input, perform the selected action.
9. The computing device of claim 8, wherein the menu of the actions of the second user interface includes text descriptions of the actions.
10. The computing device of claim 9, wherein the menu of the actions includes icons associated with the text descriptions.
11. The computing device of any of claims 8-10, wherein the one or more processors are configured to receive a user selection to enter the second user interface mode before transitioning into the second user interface mode.
12. The computing device of any of claims 8-11, wherein the one or more processors are configured to determine, based on user interactions with the computing device, to enter the second user interface mode before transitioning into the second user interface mode.
13. The computing device of any of claims 8-12, wherein the one or more processors are configured to change display options, including text size, based on transitioning into the second user interface mode.
14. A non-transitory computer-readable storage medium encoded with instructions that, when executed by one or more processors, cause the one or more processors to perform any combination of the methods of claims 1-7.
15. A computer program product comprising at least one non-transitory computer- readable medium including one or more instructions that, when executed by at least one processor, cause the at least one processor to perform any combination of the methods of claims 1-7.