US20220058935A1 - Context-Based Alerts for an Electronic Device - Google Patents
- Publication number
- US20220058935A1 (U.S. application Ser. No. 17/516,451)
- Authority
- US
- United States
- Prior art keywords
- alert
- haptic
- component
- output
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B23/00—Alarms responsive to unspecified undesired or abnormal conditions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M19/00—Current supply arrangements for telephone systems
- H04M19/02—Current supply arrangements for telephone systems providing ringing current or supervisory tones, e.g. dialling tone or busy tone
- H04M19/04—Current supply arrangements for telephone systems providing ringing current or supervisory tones, e.g. dialling tone or busy tone the ringing-current being generated at the substations
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B7/00—Signalling systems according to more than one of groups G08B3/00 - G08B6/00; Personal calling systems according to more than one of groups G08B3/00 - G08B6/00
- G08B7/06—Signalling systems according to more than one of groups G08B3/00 - G08B6/00; Personal calling systems according to more than one of groups G08B3/00 - G08B6/00 using electric transmission, e.g. involving audible and visible signalling through the use of sound and light sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72448—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
- H04M1/72454—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W52/00—Power management, e.g. TPC [Transmission Power Control], power saving or power classes
- H04W52/02—Power saving arrangements
- H04W52/0209—Power saving arrangements in terminal devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72409—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
- H04M1/72412—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/12—Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Definitions
- the present disclosure is directed to selecting and providing an alert level for an electronic device. Specifically, the present disclosure is directed to providing an alert that is selected from a set of three or more alert modes based on one or more environmental conditions associated with the electronic device.
- Electronic devices have become ubiquitous in our daily lives. Certain electronic devices, including cell phones, tablet computers, personal digital assistants, and the like, have become common items in the workplace and at home. Some of these electronic devices include an ability to notify a user of a particular item of interest, such as, for example, an incoming phone call, or may otherwise attempt to gain the user's attention through the use of an alarm or signal.
- Embodiments of the present disclosure provide a system and method for providing an alert in response to detecting an occurrence of an event.
- in response to detecting the occurrence of the event, a response to the event is determined based on a current alert mode selected from a set of three or more alert modes. The selection may be based on the one or more environmental conditions.
- a first alert may be output in response to the event.
- a second alert may be output in response to the event. The second alert may be different from the first alert.
- Embodiments of the present disclosure provide a system and method for forgoing an alert in response to detecting a level of user activity that exceeds a threshold or receiving a number of event notifications that is below a threshold.
- an output of an alert is forgone in accordance with a determination that an activity level exceeds a threshold.
- the alert is output in accordance with a determination that the activity level does not exceed the threshold.
- in response to detecting the event, an alert is output in accordance with a determination that a number of events that have been detected over a predetermined period exceeds a threshold.
- the output of an alert is forgone in accordance with a determination that the number of events that have been detected over the predetermined period does not exceed the threshold.
- Embodiments of the present disclosure provide a system and method for providing a modified alert sequence in response to detecting an interaction by the user.
- a portion of an alert sequence is output.
- An interaction with the user is detected during the outputting of the portion of the alert sequence, and in response to detecting the interaction, a modified alert sequence is selected.
- the modified alert sequence is output using the device.
- Embodiments of the present disclosure provide a system and method for selecting a device to output an alert in response to detecting another device that is in proximity to the electronic device.
- in response to detecting an event, an alert-output device is selected in accordance with a determination that a second device is in proximity to the first device, and the alert is output on the alert-output device.
- Embodiments of the present disclosure provide a system and method for providing an audio and haptic output that depends on the speed of a user input.
- a first input that is below an input threshold is received on the device.
- a first output is produced using the device.
- the first output includes a haptic component for the first input that is coordinated with an audio component for the first input.
- a second input is received on the device, and in response to detecting the second input, a second output is produced.
- the second output includes a haptic component for the second input that is coordinated with an audio component for the second input, and in accordance with a determination that the second input is above the input threshold, the second output includes a modified haptic component for the second input.
- FIGS. 1A-B depict example electronic devices that may be used to provide an alert according to one or more embodiments of the present disclosure.
- FIG. 2 depicts an example electronic device being worn by a user according to one or more embodiments of the present disclosure.
- FIG. 3 depicts an example electronic device being worn and another example electronic device being carried by the user according to one or more embodiments of the present disclosure.
- FIG. 4 depicts an example electronic device in an exemplary operating environment according to one or more embodiments of the present disclosure.
- FIG. 5 depicts a user interacting with an example electronic device according to one or more embodiments of the present disclosure.
- FIG. 6 depicts example user input to an electronic device according to one or more embodiments of the present disclosure.
- FIG. 7A depicts a process for determining a response to an event according to one or more embodiments of the present disclosure.
- FIG. 7B depicts a process for determining whether or not to respond to an event based on user activity according to one or more embodiments of the present disclosure.
- FIG. 7C depicts a process for determining whether or not to respond to an event based on a number of events according to one or more embodiments of the present disclosure.
- FIG. 7D depicts a process for outputting a modified alert sequence according to one or more embodiments of the present disclosure.
- FIG. 7E depicts a process for determining an output device according to one or more embodiments of the present disclosure.
- FIG. 7F depicts a process for producing an audio and haptic feedback in response to a user input according to one or more embodiments of the present disclosure.
- FIGS. 8-9 are block diagrams of an example electronic device that may be used with one or more embodiments of the present disclosure.
- FIG. 10 depicts an example acoustic module of an electronic device that may be used with one or more embodiments of the present disclosure.
- FIGS. 11A-B depict an example haptic actuator of an electronic device that may be used with one or more embodiments of the present disclosure.
- FIG. 12 depicts an example crown with an optical encoder that may be used with one or more embodiments of the present disclosure.
- FIGS. 13-18 depict functional block diagrams of electronic devices in accordance with some embodiments.
- embodiments of the present disclosure provide a system and method for producing an alert according to an alert mode that is automatically selected based on one or more environmental conditions.
- the environmental conditions optionally relate to the ambient conditions in which the electronic device is being operated.
- the electronic device detects or senses the environmental conditions using one or more sensors associated with an electronic device.
- the output from the one or more sensors is, optionally used to determine or estimate certain qualities of the environmental conditions or operating environment of the electronic device, including, for example, noise level, light level, motion level, and the like.
- an alert mode is, optionally selected from a set of three or more alert modes.
- the device optionally produces an alert in accordance with the selected alert mode that corresponds to the one or more environmental conditions.
- Each alert mode may define a distinct alert that may include multiple components that provide different types of stimuli to the user.
- an alert mode may define an audio component, a visual component, and/or a haptic component.
- an alert mode may define a relative timing between components.
- the alert mode optionally defines a slight delay between an audio component and a haptic component to produce a composite stimulus that is more readily detected by the user in some situations.
- the components of the alert including the relative timing of the components, can be varied to provide a composite stimulus that is tailored to a particular scenario or set of environmental conditions. In some cases, the alert mode can be automatically selected based on the one or more environmental conditions that are detected.
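The mode-selection logic described above can be sketched as follows. This is an illustrative Python sketch, not part of the patent; the mode names, thresholds, and component fields are assumptions chosen for the example:

```python
from dataclasses import dataclass

@dataclass
class AlertMode:
    """One of a set of three or more alert modes (fields are illustrative)."""
    name: str
    audio_volume: float = 0.0      # 0.0 (off) to 1.0 (full)
    haptic_intensity: float = 0.0  # 0.0 (off) to 1.0 (full)
    visual: bool = False
    haptic_delay_ms: int = 0       # delay of haptic onset after audio onset

# A hypothetical set of three alert modes keyed by environment.
MODES = {
    "quiet":  AlertMode("quiet",  audio_volume=0.1, haptic_intensity=0.8),
    "normal": AlertMode("normal", audio_volume=0.5, haptic_intensity=0.5, visual=True),
    "loud":   AlertMode("loud",   audio_volume=1.0, haptic_intensity=1.0,
                        visual=True, haptic_delay_ms=50),
}

def select_alert_mode(ambient_db: float, activity_level: float) -> AlertMode:
    """Pick a mode from an ambient sound level (dB) and a 0-1 activity level."""
    if ambient_db < 40 and activity_level < 0.2:
        return MODES["quiet"]
    if ambient_db > 70 or activity_level > 0.7:
        return MODES["loud"]
    return MODES["normal"]
```

A quiet environment favors a strong haptic component with little audio, while a loud or high-activity environment favors a full composite stimulus, optionally with a slight audio/haptic offset.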
- the environmental sensor is a microphone that is configured to detect an ambient sound level.
- the alert mode may be selected based on ambient sound level detected by the sensor.
- the selected alert mode includes an audio component that corresponds to or is appropriate for the ambient sound level detected by the sensor.
- the environmental sensor is a motion sensor that is configured to detect an activity level, which is used to select an alert mode.
- the alert mode that is selected can have an audio component, a haptic component, and/or visual component that corresponds to the detected activity level.
- the environmental sensor is an image sensor that is configured to detect an ambient light level, which is used to select an alert mode.
- one or more sensors are configured to detect a current battery level, which can, optionally be used to select an alert mode that conserves power or reduces peak power usage. For example, by separating the timing of the audio and haptic components of an alert, the peak power output may be reduced. Also, by reducing the amplitude of the audio and/or haptic components, the peak power output may be reduced.
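Both power-saving measures can be combined in a single scheduling step, sketched below. The battery threshold, scaling factor, and 100 ms stagger are illustrative assumptions, not values from the patent:

```python
def schedule_alert_components(battery_level: float,
                              audio_amp: float,
                              haptic_amp: float) -> tuple:
    """Return (audio_amp, haptic_amp, haptic_offset_ms) for an alert.

    Below a low-battery threshold, stagger the haptic component after
    the audio component and scale both amplitudes down, so that the two
    power draws neither coincide nor peak at full strength.
    """
    LOW_BATTERY = 0.2  # illustrative 20% threshold
    if battery_level < LOW_BATTERY:
        return audio_amp * 0.5, haptic_amp * 0.5, 100  # stagger by 100 ms
    return audio_amp, haptic_amp, 0  # components fire together at full amplitude
```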
- the alert is tailored to represent a series of events that are detected over a predetermined time.
- a series of closely occurring events results in a single, batched alert instead of triggering a series of individual alerts for each event.
- a series of text messages may be received over a relatively short time period.
- an alert output may be held or forgone for a period of time and then a single, batched output may be produced.
- a combined or batched alert may be useful, for example, when a large number of events occur over a period of time, or when the time between events is very small. In these cases, producing a single alert may be more effective in capturing the user's attention and may also prevent alert fatigue. For example, if a user receives a large number of alerts over a short time period, or receives a nearly continuous stream of alerts, the user may begin to ignore or disregard the alerts.
- the number of events that occur over a period of time are monitored by the device. If the number of events is less than a threshold amount, the device can, optionally forgo outputting an alert. However, once the number of events exceeds the threshold, a composite or batched alert can, optionally be produced or output by the device. Events that are monitored include, without limitation, receiving an e-mail, receiving a phone call, receiving a message, and/or receiving a calendar reminder.
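The batching behavior can be sketched with a sliding event window. This Python sketch is illustrative; the 60-second window and threshold of three events are assumptions:

```python
from collections import deque

class EventBatcher:
    """Forgo per-event alerts; emit one batched alert once the number of
    events within a sliding time window exceeds a threshold."""

    def __init__(self, window_s: float = 60.0, threshold: int = 3):
        self.window_s = window_s
        self.threshold = threshold
        self.events = deque()  # timestamps of recent events

    def on_event(self, now: float) -> bool:
        """Record an event at time `now`; return True when a single
        batched alert should be output (resetting the window)."""
        self.events.append(now)
        # Drop events that fell out of the window.
        while self.events and now - self.events[0] > self.window_s:
            self.events.popleft()
        if len(self.events) > self.threshold:
            self.events.clear()
            return True
        return False
```

For example, a series of closely spaced text messages would produce no output until the fourth message, at which point one batched alert is emitted.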
- an alert is conditionally delayed or forgone while a user is active. If, for example, the user is engaged in exercise or heavy activity, the stimulus provided by an alert may not be readily detected. Thus, in some cases it may be advantageous to monitor or detect a user's activity level and, if an event occurs during a period of high or heavy activity, the alert associated with that event is, optionally delayed or forgone until the activity is below a threshold level.
- the activity level is based on the movement of the device, as detected by one or more motion sensors, or using one or more biometric sensors that are configured to detect a user's physiological state, such as a pulse or blood oxygenation.
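The delay-while-active behavior can be sketched as a simple gate. This is an illustrative Python sketch; the 0-1 activity scale and the 0.6 threshold are assumptions:

```python
class ActivityGate:
    """Hold alerts while the user's activity level is above a threshold;
    flush the pending alerts once activity drops below it."""

    def __init__(self, threshold: float = 0.6):
        self.threshold = threshold
        self.pending = []  # alerts delayed during heavy activity

    def on_event(self, event: str, activity_level: float) -> list:
        """Queue the event's alert; return the list of alerts to output
        now (empty while the user is too active to perceive them)."""
        self.pending.append(event)
        if activity_level > self.threshold:
            return []  # forgo/delay output during heavy activity
        out, self.pending = self.pending, []
        return out
```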
- an alert is a sequence of alert outputs that are configured to escalate by producing a stimulus or output that increases in intensity over time.
- the escalation sequence or progression of the alert is interrupted and caused to be modified due to a user interaction with the device.
- an escalating alert sequence is output by the device up until an input or other interaction from the user is received (e.g., the user may touch the screen of the device or provide another form of input that is detected by the device).
- the device may select and output a modified alert sequence.
- the modified alert sequence may be non-escalating or have a substantially uniform stimulus.
- the device is configured to detect or determine if another device is in proximity to the user when an event is received or detected.
- the device can, conditionally determine which device is appropriate for outputting an alert associated with the event.
- the alert is output on only one of the devices that are determined to be in proximity to the user.
- the appropriate device can be selected based on a number of different criteria. For example, the last device that has been used by the user can be selected. Additionally or alternatively, the device that the user is currently interacting with or is predicted to be most likely to capture the user's attention can be selected to output the alert. This feature may be advantageous in reducing the number of alerts that are output and increasing the likelihood that the alert will capture the user's attention.
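A minimal sketch of those selection criteria, in Python; the dictionary layout (`name`, `last_used`, `in_use`) is a hypothetical representation of the nearby devices, not an API from the patent:

```python
def select_alert_device(devices: list) -> str:
    """Choose exactly one nearby device to output the alert on.

    Prefer a device the user is currently interacting with; otherwise
    fall back to the most recently used device.
    """
    in_use = [d for d in devices if d.get("in_use")]
    if in_use:
        return in_use[0]["name"]
    return max(devices, key=lambda d: d["last_used"])["name"]
```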
- the device is configured to produce a stimulus that provides feedback for a user-action or input to the device.
- This feature may be advantageous for some user input components, such as electronic sensors, that may have few or no moving parts to provide feedback to the user that an input is being received. For example, when a user scrolls through a list of items using a touch screen, an audio click and/or a haptic tap may indicate the progression through the list. This may be more readily perceived by the user or more satisfying than, for example, the visual scrolling of the items alone.
- the stimulus may be adapted to mimic a sound or haptic response that the user may associate with a more traditional mechanical device.
- an audio and/or haptic output corresponds to a user input using, for example, an electronic dial or button.
- a user can, optionally provide an input on a device used to drive a function or task and a synchronized audio and haptic response is used to provide the user with feedback.
- because the feedback corresponds to the speed of the input, it may be possible for the input to exceed the mechanical response of, for example, a haptic actuator used to produce the feedback.
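One way to picture the resulting behavior (per the summary above, the haptic component is modified once the input exceeds a threshold): audio clicks can track the input one-for-one, while haptic taps saturate at the actuator's maximum rate. The rate limit below is an illustrative assumption:

```python
def feedback_for_input(speed: float, max_haptic_rate: float = 10.0) -> tuple:
    """Return (audio_rate, haptic_rate) in events/second for an input
    at `speed` detents per second.

    Below the actuator's maximum rate, haptic taps are coordinated
    one-for-one with audio clicks; above it, the haptic component is
    modified (throttled) because the actuator cannot keep up.
    """
    audio_rate = speed
    haptic_rate = speed if speed <= max_haptic_rate else max_haptic_rate
    return audio_rate, haptic_rate
```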
- FIGS. 1A-B illustrate exemplary electronic devices 100 and 130 respectively that can be used to provide an alert or other output according to one or more embodiments of the present disclosure.
- each of the electronic devices 100 and 130 is a portable computing device.
- the electronic device 100 is a wearable electronic device.
- the electronic device 130 is a mobile phone.
- the electronic device of the present disclosure can include various types of portable computing devices, including tablet computers, laptop computers, time keeping devices, computerized glasses, navigation devices, sports devices, portable music players, health devices, medical devices and the like.
- the wearable electronic device 100 includes a display 110 .
- the display 110 can, optionally be formed from a liquid crystal display (LCD), organic light emitting diode (OLED) display, organic electroluminescence (OEL) display, or other type of display device.
- the display 110 can, optionally also include or be integrated with a touch sensor configured to accept touch input from the user over an input area.
- the input area covers the entire area of the display 110 or a portion of the display 110 .
- the touch sensor is able to detect and measure a location and/or a force of a touch in the input area.
- the electronic device 130 also includes one or more buttons 140 or components for receiving input from the user.
- the display 110 is configured to present various forms of visual output to the user.
- the display 110 can, optionally provide a user interface that outputs information generated or received by the wearable electronic device 100 .
- the display 110 presents information corresponding to one or more applications that are executed or stored on the electronic device 100 and/or information related to communications received by the electronic device 100 .
- Such applications can, optionally include e-mail applications, phone applications, calendaring applications, game applications, time keeping applications and the like.
- the display 110 also provides a visual output that corresponds to an alert associated with an event detected by or received by the wearable electronic device 100 .
- Example events include, without limitation, receiving an e-mail message, receiving a phone call, receiving a text message, receiving a calendar reminder, and the like.
- the electronic device 130 can, optionally also include a mobile phone or other such computing device.
- the electronic device 130 includes a display 150 for providing a visual output generated or received by electronic device 130 , as described above with respect to FIG. 1A , including the output of a visual component of an alert.
- the display 150 can, optionally also include or be integrated with a touch sensor configured to detect and measure a location and/or a force of a touch input provided by the user.
- the wearable electronic device 100 and the electronic device 130 can, optionally also include other devices or components for producing output, including, without limitation, a speaker, buzzer, or other device configured to generate an audio output.
- An audio output can be used as part of an alert produced by the device.
- an alert can, optionally include an audio component as part of a composite alert that includes multiple forms of stimuli, including, audio, visual, and/or haptic components.
- an audio output is also used to provide feedback to the user that is related to an action or function being performed on the device.
- an audio output corresponds to a user input to provide the user with feedback that the input is being received by the device.
- the wearable electronic device 100 and the electronic device 130 can, optionally also include other components for producing a visual output, including, for example, a light beacon, a light source, a glowing component, a display, or the like.
- Components that are configured to produce a visual output can be used to provide a visual component of an alert.
- the output produced by these components is combined with the visual output of the display 110 , 150 , and other components as part of a composite or multi-stimulus alert.
- the wearable electronic device 100 and the electronic device 130 can, optionally also include a haptic actuator for producing a haptic output that may be perceived as a stimulus by the user.
- the haptic output can be used as part of an alert produced by the device.
- the haptic output can, optionally form part of an alert associated with an event detected or received by the device 100 , 130 .
- the haptic output can form a haptic component of a (composite) alert that includes multiple forms of stimuli, including, audio, visual, and/or haptic components.
- the haptic output can also be used to provide feedback to the user that is related to an action or function being performed on the device.
- a haptic output corresponds to a user input to provide the user with feedback that the input is being received by the device.
- the wearable electronic device 100 can, optionally also include a band 120 or a strap that is used to connect or secure the wearable electronic device 100 to a user.
- the wearable electronic device 100 includes a lanyard or necklace.
- the wearable electronic device 100 is secured to or within another part of a user's body.
- the strap, band, lanyard, or other securing mechanism can, optionally include one or more electronic components or sensors in wireless or wired communication with an accessory.
- the band 120 can, optionally include a haptic actuator that is configured to produce a haptic output that may be sensed on the wrist of the user.
- the band 120 also includes a component for producing an audio and/or visual output, similar to those discussed above with respect to the devices 100 , 130 . Additionally, in some embodiments, the band 120 includes one or more sensors, an auxiliary battery, a camera, or any other suitable electronic component.
- the wearable electronic device 100 and the electronic device 130 can, optionally also include one or more sensors for monitoring and detecting environmental conditions. Some example sensor components are described in more detail with respect to FIGS. 8-9 .
- the devices 100 , 130 include a microphone or other type of acoustic sensor that is configured to receive acoustic input from the user or from the surrounding environment.
- the microphone or acoustic sensor is configured to function as an environmental sensor that is adapted to receive ambient sound.
- the microphone and other components of the devices 100 , 130 are configured to determine an ambient sound level.
- the devices 100 , 130 are configured to activate the microphone over a predetermined period of time and record ambient sounds that are received.
- the recorded signals are processed to eliminate or reduce outlier input and compute an average or representative audio input. The processed audio signals can be used to determine an ambient sound level.
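One common way to implement this kind of outlier-resistant averaging is a trimmed mean over a window of level readings; the patent does not specify the method, so the sketch below is an assumption:

```python
def ambient_sound_level(samples: list, trim: float = 0.1) -> float:
    """Estimate a representative ambient sound level from a window of
    microphone level readings by discarding the lowest and highest
    `trim` fraction of samples (outliers) and averaging the rest."""
    s = sorted(samples)
    k = int(len(s) * trim)              # samples trimmed from each end
    kept = s[k:len(s) - k] if k else s  # keep the middle of the distribution
    return sum(kept) / len(kept)
```

A single loud spike (e.g. a door slam during the recording period) is discarded rather than inflating the estimate.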
- the devices 100 , 130 also include one or more motion sensors that are configured to detect motion of the device.
- the motion sensor(s) includes one or more of: an accelerometer, a gyro-sensor, a tilt sensor, a rotation sensor, and the like.
- the motion sensor or sensors are configured to function as an environmental sensor that is adapted to detect overall activity of the user.
- the devices 100 , 130 are configured to activate or receive input from the motion sensor(s) over a predetermined period of time and record the motion of the device.
- the number of motion events and the magnitude of the events are used to compute or determine an estimated activity level of the user.
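Combining event count and magnitude into one estimate could look like the following; the weighting (event rate times mean magnitude, clamped to 0-1) is an illustrative assumption:

```python
def estimated_activity_level(motion_events: list, window_s: float) -> float:
    """Estimate a 0-1 user activity level from the motion events
    recorded over a window of `window_s` seconds.

    `motion_events` holds one magnitude (0-1) per detected motion
    event; both the event rate and the average magnitude raise the
    estimate.
    """
    if not motion_events:
        return 0.0  # no motion recorded: user at rest
    rate = len(motion_events) / window_s               # events per second
    mean_mag = sum(motion_events) / len(motion_events)  # average magnitude
    return min(1.0, rate * mean_mag)
```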
- the devices 100 , 130 can, optionally also include one or more optical sensors that are configured to function as an environmental sensor.
- the one or more optical sensors may include, for example, an ambient light sensor (ALS), an image sensor (camera), an optical proximity sensor and the like.
- the one or more optical sensors are used to determine an ambient light level surrounding the device.
- the one or more optical sensors are configured to estimate the environmental lighting conditions based on an optical signal or amount of light received by the one or more sensors. Additionally or alternatively, the one or more optical light sensors can be used to detect the user's face and determine whether or not the user is likely to notice a visual output of the device.
- the devices 100 , 130 can, optionally include other types of environmental sensors for collecting information about one or more environmental conditions.
- the devices 100 , 130 may also include a temperature sensor, a barometric pressure sensor, a moisture sensor, a humidity sensor, a magnetic compass sensor, and the like. These sensors can be used alone or in combination to determine or estimate an environmental condition surrounding the device 100 , 130 .
- the wearable electronic device 100 and the electronic device 130 can, optionally include a processor, a memory, and other components. These components, as well as other components of an exemplary computing device are described in more detail below with respect to FIGS. 8-9 . Further, the wearable electronic device 100 and the electronic device 130 can also, optionally include or be integrated with other components, including, for example, a keyboard or other input mechanism. Additionally, in some embodiments, the devices 100 , 130 include one or more components that enable the devices 100 , 130 to connect to the internet and/or access one or more remote databases or storage devices.
- the devices 100 , 130 also enable communication over wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.
- Such communication channels can be configured to enable the devices 100 , 130 to remotely connect and communicate with one or more additional devices such as, for example, a laptop computer, tablet computer, mobile telephone, personal digital assistant, portable music player, speakers and/or headphones and the like.
- FIG. 2 depicts an example electronic device being worn by a user and subjected to one or more environmental conditions according to one or more embodiments of the present disclosure.
- FIG. 2 may represent an electronic device 100 subjected to one or more environmental conditions that may be relevant to the user's potential interaction with the device 100 , particularly an alert or stimulus produced by the device.
- the device 100 may be subjected to motion 220 due to movement or activity of the user 210 .
- the motion 220 may include movement in more than one direction and may also include a combination of rotational and translational movement. As described above with respect to FIGS. 1A-B , the device 100 can, optionally include one or more motion sensors that are configured to produce an output that can be used to compute or determine an activity level of the user 210 .
- the activity level of the user, as detected by the one or more motion sensors, is indicative of the ability of the user 210 to perceive certain types of stimuli.
- an appropriate alert mode is selected that corresponds to the user's activity level.
- the alert mode that is selected has one or more components (e.g., audio, haptic, visual) that correspond to the activity level of the user 210 .
- the activity level of the user 210 can, conditionally be used to forgo or delay the output of an alert until the user 210 is at rest and may be more likely to perceive the alert.
- the device 100 may be subjected to a particular type of acoustic environmental condition or conditions. For example, if the user 210 is walking through a crowded area or in a noisy environment, the device 100 may be subjected to loud or high acoustic level environmental conditions. Conversely, the device 100 may be subjected to quiet or low acoustic level environmental conditions if, for example, the user 210 is alone in a room or interior space. As described above with respect to FIGS. 1A-B , the device can, optionally include a microphone or other acoustic sensor that is configured to produce an output that can be used to compute or determine an ambient sound level surrounding the user 210 .
- the ambient sound level detected by the sensor(s) is indicative of the user's 210 ability to perceive certain types of stimuli.
- an appropriate alert mode is selected that corresponds to the ambient sound level.
- the alert mode that is selected has one or more components (e.g., audio, haptic, visual) that correspond to the acoustic level detected by the sensor(s).
- the device 100 may also be subjected to ambient lighting conditions, which may be detected using one or more optical sensors, as described above with respect to FIGS. 1A-B .
- the one or more optical sensors are able to detect low level or dark lighting conditions, which may be consistent with the user 210 being located in a movie theater, presentation, or other quiet area.
- the one or more optical sensors are also able to detect if the device 100 is being subjected to sunlight conditions, which may be consistent with an outdoor setting or open public area.
- an appropriate alert mode is selected based on the ambient lighting conditions.
- the alert mode that is selected has one or more components (e.g., audio, haptic, visual) that correspond to the light level detected by the sensor(s).
- the output from one or more types of sensors can be combined to detect an environmental condition or set of conditions.
- the light sensor(s), acoustic sensor(s), and/or motion sensor(s) are used to estimate or detect one or more environmental conditions.
- the activity level of the user can be more accurately determined by using the output of the one or more motion sensors with the output of the acoustic sensor. More specific examples are provided below with respect to FIGS. 7A-B .
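The sensor-fusion idea above can be sketched in code. This is a minimal illustration, not the disclosed implementation: the function name, the weighting, and the 70 dB cutoff are all assumptions chosen to show how a motion estimate can be refined by an acoustic reading.

```python
def fused_activity_level(motion_samples, acoustic_db):
    """Estimate user activity by combining motion-sensor magnitudes
    with an ambient acoustic level (names and weights are illustrative)."""
    motion_score = sum(abs(s) for s in motion_samples) / max(len(motion_samples), 1)
    # A loud environment nudges the estimate upward: vigorous activity
    # (e.g., a gym) tends to coincide with higher ambient noise.
    noise_bonus = 0.2 if acoustic_db > 70 else 0.0
    return min(1.0, motion_score + noise_bonus)
```

A quiet, nearly still reading yields a low level, while strong motion in a loud environment saturates the estimate.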
- FIG. 3 depicts an example electronic device being worn and another example electronic device being carried by the user and subjected to one or more environmental conditions according to one or more embodiments of the present disclosure.
- multiple devices 100 , 130 are located proximate to the user 210 at the same time.
- a wearable electronic device 100 and a mobile phone 130 are located proximate to the user.
- a laptop computer, desktop computer, or other electronic device may be located in the near-immediate vicinity.
- one or more of the devices ( 100 , 130 ) are used to determine the environmental conditions surrounding the user 210 .
- the devices automatically pair via Bluetooth or a similar wireless communications protocol.
- the devices 100 , 130 can be configured to communicate information related to the environmental conditions to each other to obtain more accurate or more complete information about environmental conditions.
- the motion sensor output of the wearable device 100 can be used in combination with the motion sensor output of the other device 130 to compute or determine a more accurate estimate of the activity level of the user 210 .
- the optical sensor output from each device 100 , 130 is compared or combined to estimate an ambient lighting condition. For example, the relative difference between the optical sensors of the respective devices 100 , 130 can be used to determine that the device 130 is located in the pocket of a user rather than in a dark room.
- the output from the acoustic sensors (e.g., microphones) of the respective devices 100 , 130 are combined and/or compared to determine a more accurate or complete estimation of ambient acoustic conditions.
- an alert mode is selected based on environmental conditions detected by one or both devices that are in proximity to the user 210 .
- one output device is selected or designated to output an alert, thereby preventing multiple alerts being sent to the user 210 at or near the same time.
- the device that is most likely to be perceived by the user can be selected or identified as the output device. Specific examples of this functionality are described below with respect to FIG. 7E .
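One way to realize the single-output-device selection above is a simple scoring pass over the paired devices. This is a hypothetical sketch: the dictionary keys (`worn`, `in_pocket`) and the point values are assumptions standing in for whatever sensor summaries the devices actually share.

```python
def choose_output_device(devices):
    """Pick the one device most likely to be perceived by the user,
    so that only a single alert is output (scoring is illustrative)."""
    def score(d):
        s = 0
        if d.get("worn"):
            s += 2   # an on-wrist haptic output is hard to miss
        if not d.get("in_pocket"):
            s += 1   # an occluded light sensor suggests a pocketed device
        return s
    return max(devices, key=score)
```

For example, a worn watch would be chosen over a phone whose optical sensor indicates it is in a pocket.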
- FIG. 4 depicts an example electronic device in an exemplary operating environment according to one or more embodiments of the present disclosure.
- the device 130 is placed on a desk, table, or other surface when, for example, the device 130 is not in use.
- the one or more sensors are used to detect this scenario, which may correspond to a condition where the user is not proximate to the device or may not readily perceive a stimulus or alert output by the device 130 .
- this scenario or environmental condition is detected using one or more motion sensors, which are used to determine a static activity level.
- the output from other sensors, including the microphone and the one or more optical sensors, can also be used to determine that the device 130 is subjected to a static activity level or environmental conditions consistent with a device that is not in use.
- FIG. 5 depicts a user interacting with an example electronic device according to one or more embodiments of the present disclosure.
- the user 210 may interact with the device by, for example, making a selection on a touch-sensitive surface of the device 100 .
- the user may actively interact with the device by touching or pressing a touch-sensitive display of the device 100 .
- the device 100 is able to sense that the user is looking at the display, and therefore, at least passively interacting with the device. Passive interaction may be detected, for example, using one or more optical sensors to detect the position and movement of the user's head.
- the one or more optical sensors are configured to sense the location and movement of the user's eye, which may be consistent with the user 210 reading or watching the display of the device 100 .
- a passive interaction may also be detected, for example, using one or more touch sensors that detect the user's hand position or grip on the device.
- an active mode is selected which corresponds to a scenario or condition in which the user is either actively or passively interacting with the device. Additionally, in some cases, active or passive interaction from the user may be used to interrupt an escalating alert sequence and output a modified alert sequence that is non-escalating or otherwise different.
- FIG. 6 depicts example user input to an electronic device according to one or more embodiments of the present disclosure.
- the device 100 can optionally be configured to output a stimulus in response to a user input on the device.
- the user may provide touch input 615 on a touch display 110 or other touch-sensitive surface of the device.
- a two-dimensional scrolling or panning input may be provided by moving the touch input along one or more directions on the touch-sensitive surface of the device.
- an audible output, such as a beep or click, is produced as items or objects are indexed on the display 110 in response to the user input 615 .
- a haptic output, such as a tap or bump, is coordinated with the audio output. The audio and haptic output can be synchronized.
- an audio and/or haptic output is produced in response to a rotational user input 612 provided using the crown 610 or knob.
- the crown 610 is operatively coupled to a position sensor, such as an optical encoder, that is configured to produce an output signal that corresponds to the rotational input provided by the user.
- an audible click and a haptic tap are output by the device for a predetermined amount of movement of the crown 610 or knob.
- the device may produce an output for every 5 degrees of movement of the crown 610 .
- the output may be dependent, at least in part, on the speed or rate that the user input ( 615 , 612 ) is provided. For example, if the speed of the user input ( 615 , 612 ) exceeds a certain threshold, the device used to produce the haptic output may not be able to keep up.
- the response time of the haptic device may be longer than the time between haptic outputs.
- the haptic output can be configured to change from a synchronous output to an asynchronous output as the user input exceeds a certain threshold.
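The synchronous-to-asynchronous handoff above can be sketched as a rate check. The 5-degree detent spacing comes from the example above; the 50 ms actuator response time and all names are assumptions for illustration only.

```python
DEGREES_PER_TICK = 5       # detent spacing from the example above
HAPTIC_RESPONSE_MS = 50    # assumed actuator response time

def haptic_schedule(rotation_deg, duration_ms):
    """Return ('sync', ticks) when the actuator can follow every detent,
    or ('async', ticks) when the input rate outpaces the actuator."""
    ticks = int(rotation_deg // DEGREES_PER_TICK)
    if ticks == 0:
        return ("sync", 0)
    interval_ms = duration_ms / ticks
    mode = "sync" if interval_ms >= HAPTIC_RESPONSE_MS else "async"
    return (mode, ticks)
```

A slow turn gets one tap per detent; a fast spin would instead be rendered as a decoupled, asynchronous texture.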
- Specific example processes for producing an output using a device are described below with respect to FIGS. 7A-F .
- one or more of the devices described above with respect to FIGS. 1A-B can be used.
- the device(s) can optionally include internal components or elements consistent with FIGS. 8-9 , described in more detail below. While certain processes and device hardware implementations are provided by way of example, it is not intended that the description be limited to those example embodiments.
- FIG. 7A illustrates an example process 700 for determining a response to an event according to one or more embodiments of the present disclosure.
- a device can be configured to produce an alert or stimulus that is formulated to capture the attention of the user.
- the effectiveness of the alert may depend, in part, on one or more environmental conditions, which may change over time or with user activity.
- it may be beneficial to detect the present state of one or more environmental conditions and select an output having a stimulus that corresponds to the detected environmental condition(s).
- the operations of process 700 may be performed using, for example, the example devices described above with respect to FIGS. 1A-B .
- an event is detected by the device.
- the device detects the occurrence of an event by, for example, receiving a notification or a message related to the event.
- the device receives a notification or message that the user has received an incoming e-mail message, text message, telephone call, voicemail message, and the like.
- the notification or message is received by an external device or service via a wired or wireless communication network.
- an event is detected by, for example, receiving a notification or message from an application or program that is being executed on the device. For example, a clock alarm, clock timer, calendar scheduler, or similar program may trigger an event that is detected by the device.
- an event is triggered in relation to a wide range of activities, including satisfying a personal health goal, reaching a geographic location, or meeting some other criteria or condition.
- an event corresponds to a physiological function exceeding a threshold or satisfying a condition.
- an event can be triggered in response to reaching a target heart rate, oxygenation level, or similar physiological condition.
- a response to the event is determined.
- the response to the detected event is determined based on a current alert mode.
- the current alert mode is selected based on one or more environmental conditions.
- the one or more environmental conditions are detected concurrently with either the detection of the event and/or the selection of the current alert mode. However, in some implementations, all of the environmental conditions are not present or occurring exactly when the detection and/or selection occurs.
- the current alert mode is selected from a set of three or more alert modes.
- an alert includes multiple forms of stimuli, including, for example, audio, haptic, or visual components.
- each of the three or more alert modes includes one or more of: an audio component, a haptic component, and a visual component. Additionally, the components (audio, haptic, visual) can vary in intensity and in form depending on the alert mode.
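A set of three or more alert modes with varying components can be represented as plain data. This sketch is illustrative only: the mode names, the 0.0-1.0 intensity scalars, and the use of `None` for an absent component are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AlertMode:
    """One of three or more selectable alert modes; intensities are
    illustrative 0.0-1.0 scalars, None means the component is absent."""
    name: str
    audio: Optional[float]
    haptic: Optional[float]
    visual: Optional[float]

# A hypothetical three-mode set spanning loud, normal, and quiet contexts.
MODES = [
    AlertMode("loud",   audio=1.0,  haptic=1.0, visual=0.5),
    AlertMode("normal", audio=0.5,  haptic=0.5, visual=0.5),
    AlertMode("quiet",  audio=None, haptic=0.3, visual=1.0),
]
```

Note how the quiet mode drops the audio component entirely while strengthening the visual one, matching the component-variation idea above.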
- a first alert is output in response to the event.
- in accordance with a determination that the current alert mode is a second alert mode, a second alert is output in response to the event (operations 705 and 706 ).
- similar determinations can be made for as many alert modes as are defined or available for selection.
- each of the alert modes may vary with respect to the others in some aspect, such as the components that are used (e.g., audio, haptic, visual).
- the alert mode is automatically selected based on the one or more environmental conditions that are detected by the device.
- the current alert mode is selected prior to detecting the occurrence of the event. For example, the relevancy of the alert mode may be checked and the current alert mode may be selected or confirmed according to a regularly repeating interval. Additionally, in some implementations, the current alert mode is selected or confirmed at or near the same time that the event occurs. For example, the occurrence of an event can be used to trigger the selection of the current alert mode.
- environmental conditions relate to the physical environment in which the device is being operated or conditions to which the device is subjected.
- Environmental conditions that are used to select the current alert mode can include, without limitation, acoustic noise, user activity, device motion, device orientation, ambient light, and others.
- Environmental conditions generally do not include specific alert settings established by a user, such as quiet hours or a silent mode. Additionally, environmental conditions may not, in some cases, include the geographic location of the device.
- the environmental conditions are monitored and detected using one or more of the sensors associated with the device.
- Example environmental sensors include, without limitation, accelerometers, gyroscopes, tilt sensors, microphones, light sensors, image sensors, proximity sensors, and the like. Example environmental sensors are described above with respect to FIGS. 1A-B , above, and FIGS. 8-10 , below. It is not necessary that each of the sensors be located on the device. As mentioned previously with respect to FIG. 3 , multiple devices that are located in the same vicinity or proximate to each other can be configured to share sensor data via a data link or other communication scheme.
- one or more environmental sensors can be configured to detect particular environmental conditions, and the output of those sensors used to select an alert mode having elements or components that correspond to the detected conditions.
- the environmental sensors are used to compute a changed or changing environmental condition and automatically provide for different alert outputs for the same type of event.
- the environmental sensor is a microphone that is configured to detect ambient acoustic conditions.
- the microphone can optionally be integrated with the device and configured to record audio signals or input over a sample time period.
- the collected audio data is stored and further analyzed to compute or determine an ambient acoustic level.
- the collected audio data is filtered and processed to remove audio input that may correlate to the user's voice.
- the audio data can optionally also be processed to determine an average or representative acoustic level over a given period of time. In some instances, the audio data from multiple sample time periods is used to compute or determine the acoustic level.
- the acoustic level is used to select an appropriate alert mode as the current alert mode, in accordance with operation 702 .
- the alert mode that is selected includes an audio component that corresponds to the acoustic level determined using the environmental sensors. For example, if the acoustic level represents a loud or noisy ambient acoustic environmental condition, a first alert mode can be selected as the current alert mode, the first alert mode having an audio component with an elevated volume or intensity (as compared to other alert modes).
- a second alert mode can be selected as the current alert mode, the second alert mode having an audio component with a volume or intensity that is reduced with respect to the audio component of the first alert mode. Additional alert modes may be similarly defined and selected according to an audio component that may correspond to a detected ambient acoustic noise level.
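The acoustic-level-to-alert-mode mapping described above reduces to a threshold lookup. The decibel cutoffs below are illustrative assumptions, not values from the disclosure; the mode names refer to a hypothetical loud/normal/quiet mode set.

```python
def select_mode_for_acoustic_level(ambient_db):
    """Map a measured ambient sound level to an alert-mode name
    (thresholds are illustrative, not from the disclosure)."""
    if ambient_db >= 70:   # noisy: gym, crowded street
        return "loud"      # elevated audio volume or intensity
    if ambient_db >= 45:   # ordinary indoor ambience
        return "normal"
    return "quiet"         # library, theater: reduced audio
```

The same pattern extends to additional modes by adding thresholds, and to haptic or visual components by returning a richer mode object.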
- the current alert mode includes or defines another component that corresponds to the detected acoustic level.
- a third alert mode is selected as the current alert mode, the third alert mode having a haptic component that corresponds to the acoustic noise level.
- the intensity or energy of the haptic output may be stronger in accordance with a loud or noisy acoustic level.
- a haptic output may have an intensity or energy that is weaker or reduced in accordance with a quiet or less noisy acoustic level.
- an alert mode is selected as having a visual component that corresponds to the detected acoustic level.
- the alert mode may include a visual component, such as a beacon or strobe having an intensity or frequency that corresponds to the detected acoustic level.
- one or more components are used in conjunction with another component to produce an appropriate level of stimulation to the user, depending on the environmental conditions.
- a user is wearing a wearable electronic device, in accordance with the embodiments described above with respect to FIG. 1A .
- the user and the device are subjected to a noisy environment, such as a gymnasium or workout room.
- the device detects the noisy environmental condition using the microphone, which is used to determine or compute an ambient sound level.
- the device selects an alert mode having an audio component with an increased volume that corresponds to the high-ambient sound level.
- the device selects an alert mode having an increased haptic vibration (example haptic component) and/or a visual strobe (example visual component). The device then outputs an alert in accordance with the selected alert mode.
- the environmental sensor includes one or more motion sensors that are configured to detect device motion and/or user activity.
- the one or more motion sensors can optionally be integrated or associated with the device and may be configured to record motion and/or activity over a sample time period.
- Example motion sensors include, for example, an accelerometer, gyroscope, tilt sensor, and the like, as discussed above with respect to FIGS. 1A-B .
- the collected motion data is stored and further analyzed to compute or determine an activity level.
- the collected motion data is filtered and processed to determine a discrete number of movements (translational or rotational) over a given period of time. In some cases, the number of movements is used to compute or determine an activity level. Additionally or alternatively, the intensity of the movements over a period of time is used to compute or determine an activity level.
- the activity level corresponds to or represents the activity of the user.
- a high activity level may represent an environmental condition in which the user may be exercising or moving rapidly.
- a low activity level may represent an environmental condition in which the user is at rest or sedentary.
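The movement-count and intensity computation described above can be sketched as follows. The 0.5 movement threshold and the blend of rate with average intensity are assumptions made for illustration.

```python
def activity_level(samples, window_s, move_threshold=0.5):
    """Blend a discrete movement count (samples whose magnitude crosses
    an assumed threshold) per unit time with the average intensity."""
    movements = sum(1 for s in samples if abs(s) > move_threshold)
    intensity = sum(abs(s) for s in samples) / max(len(samples), 1)
    return movements / window_s + intensity
```

A sedentary trace yields a low level, while sustained vigorous motion yields a high one, matching the interpretation above.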
- the activity level is used to select an appropriate alert mode as the current alert mode, in accordance with operation 702 .
- the alert mode that is selected includes an audio component that corresponds to the activity level determined using the environmental sensors. For example, if the activity level represents a highly active environmental condition, a first alert mode may be selected as the current alert mode, the first alert mode having an audio component with an elevated volume or intensity (as compared to other alert modes). Similarly, if the activity level represents a condition that is less active, a second alert mode may be selected as the current alert mode, the second alert mode having an audio component with a volume or intensity that is reduced with respect to the audio component of the first alert mode. Additional alert modes may be similarly defined and selected according to an audio component that may correspond to a detected activity level. As in the previous example, the alert mode that is selected may have other components (e.g., haptic, visual) that also correspond to the detected activity level.
- the environmental sensor includes one or more optical sensors that are configured to detect optical or lighting environmental conditions.
- the one or more optical sensors can optionally be integrated with the device and configured to record light quantity or lighting conditions over a sample time period.
- Example optical sensors include, for example, an ALS, an image sensor, a proximity sensor, and the like, as discussed above with respect to FIGS. 1A-B .
- the collected optical data is stored and further analyzed to compute or determine an ambient light level.
- the collected optical data is filtered and processed to determine an average amount of light over a given period of time, which can be used to compute or determine an ambient light level. Additionally or alternatively, the intensity of light received over a period of time can be used to compute or determine a light level.
- the light level corresponds to or represents the setting in which the device is being operated.
- a bright or high light level may represent an outdoor or public operating environment.
- bright environmental conditions may indicate that an alert should be more intense because the user is outdoors.
- a low or dim light level may correspond to or represent an environmental condition in which the user is indoors or in a more private operating environment.
- a low light level may correspond to a user being located in a movie theater or presentation.
- a low light level may indicate that an alert should be less intense to avoid disrupting indoor activities.
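The averaging and dark/indoor/sunlight interpretation above can be sketched as a bucketing function. The lux cutoffs are illustrative assumptions; typical theater, indoor, and daylight illuminance differ by orders of magnitude, which is all the sketch relies on.

```python
def ambient_light_level(lux_samples):
    """Average lux over the sample window, bucketed into the dark /
    indoor / sunlight conditions discussed above (cutoffs assumed)."""
    avg = sum(lux_samples) / max(len(lux_samples), 1)
    if avg < 10:
        return "dark"      # theater, presentation, or covered sensor
    if avg < 10_000:
        return "indoor"    # typical interior lighting
    return "sunlight"      # outdoor setting or open public area
```

The returned condition would then feed the alert-mode selection of operation 702, e.g., a quieter mode for "dark".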
- the light level is used to select an appropriate alert mode as the current alert mode, in accordance with operation 702 .
- the alert mode that is selected includes an audio component that corresponds to the light level determined using the environmental sensors. For example, if the light level represents a brightly lit environmental condition, a first alert mode is selected as the current alert mode, the first alert mode having an audio component with an elevated volume or intensity (as compared to other alert modes). Similarly, if the lighting level represents a condition that is less bright, a second alert mode is selected as the current alert mode, the second alert mode having an audio component with a volume or intensity that is reduced with respect to the audio component of the first alert mode.
- Additional alert modes can be similarly defined and/or selected according to an audio component that may correspond to a detected light level.
- the alert mode that is selected can optionally have other components (e.g., haptic, visual) that also correspond to the detected light level.
- the environmental sensor includes a battery power sensor that is configured to detect a current battery level.
- the battery power sensor includes a circuit integrated into the device that is configured to measure an electrical property of the battery (e.g., voltage, current, impedance) that may be indicative of the remaining battery power.
- an alert mode can be selected as the current alert mode based on a correspondence between the battery power level and one of the components (audio, haptic, visual) of the alert.
- the alert mode is selected based on the power that may be consumed during an alert output. For example, if the battery level is low (e.g., 5%, 10%, or 15% of total battery power), an alert mode is selected that uses less power as compared to some other alert modes.
- One technique may be to eliminate or reduce the intensity of alert components that consume a large amount of energy.
- an alert mode having no haptic component is selected based on a low battery level. Additionally or alternatively, the output of the components is staggered or delayed in some alert modes in order to reduce peak power usage.
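The low-battery selection above can be sketched as a filter over candidate modes. The 0.15 cutoff mirrors the 5-15% examples given; the dictionary shape and the assumption that the haptic component dominates power draw are illustrative.

```python
def battery_aware_mode(battery_fraction, candidate_modes):
    """At low battery, prefer a mode without a (power-hungry) haptic
    component; cutoff and mode shape are assumptions for illustration."""
    if battery_fraction <= 0.15:
        low_power = [m for m in candidate_modes if not m.get("haptic")]
        if low_power:
            return low_power[0]
    return candidate_modes[0]
```

With ample charge the first (full) mode is used; near empty, a haptic-free mode is substituted if one is defined.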
- the alert mode that is selected can optionally include a variety of component combinations.
- a component can optionally be eliminated.
- a first alert mode includes a first haptic component and a first visual component.
- a second alert mode includes a second haptic component and no visual component.
- one or both of the components can vary depending on the alert mode.
- a first alert mode includes a first audio component and a first haptic component
- a second alert mode includes a second audio component and a second haptic component, where the first audio and first haptic component are different than the second audio component and the second haptic component, respectively.
- an alert mode only includes a visual component.
- a first alert mode includes no audio component and no haptic component, and includes only a visual component such as a notification displayed on a display of the device.
- an alert mode includes two or more components that are staggered or offset by a delay.
- a first alert mode includes a first audio component and a first haptic component offset by a first delay.
- a second alert mode includes the first audio component and the first haptic component, but offset by a second delay that is different than the first delay.
- the difference in delay between the alert modes may depend, in part, on the likelihood that the user will be able to perceive a haptic output given certain environmental conditions.
- the delay between components is increased based on the likelihood that the user is distracted or already receiving a high level of stimulation.
- the delay between alert components is increased if the activity level and/or ambient acoustic levels are high.
- a haptic component can precede an acoustic component by a short offset.
- the haptic component provides a priming stimulus that may increase the likelihood that the audio stimulus will be perceived by the user.
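The staggered, context-dependent timing above can be sketched as a small scheduler: a priming haptic tap first, then the audio component after a delay that grows when the user is likely distracted. The 100 ms base delay and the doubling factors are assumptions for illustration.

```python
def alert_timeline(activity_high, noise_high, base_delay_ms=100):
    """Schedule a priming haptic tap before the audio component; the
    offset grows with distraction (delay values are assumptions)."""
    delay = base_delay_ms
    if activity_high:
        delay *= 2   # heavy activity: give the primer more lead time
    if noise_high:
        delay *= 2   # loud ambience: likewise
    return [("haptic", 0), ("audio", delay)]
```

In a calm setting the audio follows the tap almost immediately; during loud, active conditions the offset is quadrupled.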
- FIG. 7B depicts a process 710 for determining whether or not to respond to an event based on user activity according to one or more embodiments of the present disclosure.
- when a level of user activity is high, it may be difficult for a user to perceive an alert associated with an event. Additionally, even when an alert is perceived by a user engaged in heavy activity, the user may not be as likely to respond to the alert until the activity is complete. Thus, in some implementations, it may be advantageous to delay or forgo the output of an alert until the user has completed an activity or is at rest.
- the operations of process 710 may be performed using, for example, the example devices described above with respect to FIGS. 1A-B .
- an event is detected by the device.
- the device detects the occurrence of an event by, for example, receiving a notification or a message related to the event.
- the device receives a notification or message that the user has received an incoming e-mail message, text message, telephone call, voicemail message, and the like.
- an event is detected by, for example, receiving a notification or message from an application or program that is being executed on the device.
- an event can be triggered in relation to a wide range of activities or functions, including satisfying a personal health goal, reaching a geographic location, reaching a target heart rate, reaching an oxygenation level, or satisfaction of other criteria.
- an activity level is determined.
- one or more sensors are used to detect the motion of the device, which is used to determine an activity level. Similar to the example provided above with respect to FIG. 7A , one or more motion sensors can optionally be integrated into or associated with the device and may be configured to record motion and/or activity over a sample time period.
- Example motion sensors include, for example, an accelerometer, gyroscope, tilt sensor, and the like, as discussed above with respect to FIGS. 1A-B .
- the collected motion data is stored and further analyzed to compute or determine an activity level.
- the collected motion data is filtered and processed to determine a discrete number of movements (translational or rotational) over a given period of time.
- the number of movements is used to compute or determine an activity level. Additionally or alternatively, the intensity of the movements over a period of time is used to compute or determine an activity level.
- the activity level may correspond to or represent the activity of the user. Thus, a high activity level may represent an environmental condition in which the user may be exercising or moving rapidly. Similarly, a low activity level may represent an environmental condition in which the user is at rest or sedentary.
- the threshold may correspond to the method used to determine the activity level in operation 712 .
- the threshold may similarly represent a threshold number of motion events over a similar period of time.
- the threshold may also represent a threshold level that is based, at least in part, on the intensity of the activity.
- the threshold is customized based on an average level of user activity or device motion. For example, if a user is more active, then the threshold can optionally be set higher than for a user that is less active.
- the outputting of an alert is forgone or delayed.
- the output of an alert is delayed until a later time when the activity level may be lower.
- the activity level can optionally be periodically determined or checked over a predetermined time period.
- the output of the alert is further delayed or forgone as long as the activity level exceeds a low-activity threshold, which can optionally be the same as or different from the original threshold.
- the alert is output.
- the output is provided in accordance with one or more of the other examples provided herein.
- the alert that is provided can optionally include one or more alert components (audio, haptic, visual) that correspond to the environmental conditions associated with the device.
- the alert that is output can optionally be a fixed alert that is not dependent on one or more environmental conditions.
- the output of the alert is forgone or delayed until a subsequent criteria is satisfied.
- the output is forgone or delayed until the activity level drops below a low-activity threshold, which can optionally be the same as or different from the original threshold.
- the output is forgone or delayed until a predetermined time period has passed since the first time the output of the alert was forgone.
- the device can be configured to wait until an activity level drops below a threshold, and if the level does not drop over the predetermined period of time, the alert is output anyway.
- the alerts are combined into a single alert when a determination is eventually made to output an alert. For example, if multiple events occur during a period of high activity, the user will be notified by a single (combined) alert that represents all of the alerts that were forgone during the period of high activity.
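The forgo-then-combine behavior of process 710 can be sketched as a small stateful class. The class name, the activity scale, and the choice to return the combined event list directly are all assumptions made for illustration.

```python
class AlertBatcher:
    """Hold alerts back while activity exceeds a threshold and emit
    them as one combined alert once the user is at rest (a sketch)."""

    def __init__(self, activity_threshold):
        self.threshold = activity_threshold
        self.pending = []

    def on_event(self, event, activity):
        if activity > self.threshold:
            self.pending.append(event)   # forgo the alert for now
            return None
        combined = self.pending + [event]
        self.pending = []
        return combined                  # one alert covering all events
```

Events arriving mid-workout are silently queued; the first event detected at rest flushes the queue as a single combined alert.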
- process 710 can be applied to user activity associated with an exercise or workout routine.
- the device receives or detects one or more events associated with one or more physiological conditions while the user is performing an exercise.
- the device, in some implementations, automatically detects the reduced activity and outputs the one or more alerts associated with the events that occurred during the exercise.
- one or more alerts are delayed or forgone while a user is typing or performing another activity that introduces high-frequency movement near the device. The delayed or forgone alert output may increase the user's perception of, for example, a haptic component output that may be masked by the high-frequency movement.
- the decision to forgo the alert is also based on other environmental conditions.
- the output of an alert, in some implementations, is forgone or delayed while a user is walking in a busy or loud environment. When the user stops walking at, for example, a traffic light, the device automatically outputs the previously forgone alert.
- FIG. 7C depicts a process 720 for determining whether or not to respond to an event based on a number of events according to one or more embodiments of the present disclosure.
- a series of closely occurring events may result in a single, batched alert instead of triggering a series of individual alerts for each event.
- a combined or batched alert may be useful, for example, when a large number of events occur over a period of time, or when the time between events is very small.
- the operations of process 720 may be performed using, for example, the example devices described above with respect to FIGS. 1A-B .
- an event is detected.
- the device detects the occurrence of an event by, for example, receiving a notification or a message related to the event or receiving a notification or message from an application or program that is being executed on the device.
- the device receives a notification or message that the user has received an incoming e-mail message, text message, telephone call, voicemail message, and the like.
- an event can optionally be triggered in relation to a wide range of activities or functions, including satisfying a personal health goal, reaching a geographic location, reaching a target heart rate, reaching an oxygenation level, or satisfying other criteria.
- the number of events is determined.
- the device is configured to wait a predetermined period of time after detecting an event.
- the number of events (requiring an alert output) that occur over the predetermined amount of time is used to compute the number of events for operation 722 .
- the device is configured to detect and count the number of events received over a predetermined, regularly repeating time period.
- the number of events includes one or more events that occurred prior to the event detected in operation 721 .
- a variety of other techniques can optionally be used to determine the number of events that are received by the device over a period of time.
- the number of events that are determined depends on the type of event that occurred, a person associated with the event, and/or the content associated with the event. For example, in some implementations only text messages are counted pursuant to operation 722 . Similarly, in some implementations, only events associated with communications (e.g., e-mail, text messages, telephone calls) are counted pursuant to operation 722 . Additionally or alternatively, in some implementations, only events that are associated with the same sender or group of senders are counted for operation 722 . Similarly, in some implementations, events that share similar content or subject matter are counted for operation 722 .
- the threshold is a fixed threshold that is determined by the device or device settings. In some instances, the threshold is configurable by the user.
- the output of the alert is forgone.
- an alert is output.
- the alert that is output is based, at least in part, on the previously occurring event.
- the alert includes information indicative of the event and one or more prior events occurring prior to the event.
- the alert includes a visual component that includes a list of the subject lines or senders for one or more e-mail messages that were received during the predefined time period.
- the alert includes an indication of the number of events that have been batched or combined in the alert output.
- the strength of the alert output (e.g., the intensity of the audio or haptic component) is based, in part, on the frequency and/or type of events that have had an alert forgone.
- events of different types are batched together. For example, if the threshold is 3 messages in 2 minutes, receiving an e-mail, an SMS message, and a phone call within 2 minutes will trigger an alert. In other embodiments, only events of the same type are batched together. In that case, receiving an e-mail, an SMS message, and a phone call within 2 minutes will not trigger an alert, while receiving 3 e-mails within 2 minutes will trigger an alert. In accordance with the description of process 720 , a variety of events and criteria may be counted and used to forgo and/or eventually output an alert.
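The sliding-window counting behind process 720 can be sketched as follows. The 3-events-in-2-minutes threshold matches the example in the text; `same_type_only` is an assumed configuration flag for the "only events of the same type are batched" variant.

```python
from collections import deque

WINDOW_SECONDS = 120.0  # 2-minute window, per the example in the text
THRESHOLD = 3           # 3 events, per the example in the text

class EventCounter:
    def __init__(self, same_type_only: bool = False):
        self.same_type_only = same_type_only
        self.events = deque()  # (timestamp, event_type) pairs

    def batched_alert_due(self, now: float, event_type: str) -> bool:
        """Record an event; return True when the count within the sliding
        window reaches the threshold, i.e. a single batched alert should
        be output in place of individual alerts."""
        self.events.append((now, event_type))
        # Drop events that have aged out of the window.
        while self.events and now - self.events[0][0] > WINDOW_SECONDS:
            self.events.popleft()
        if self.same_type_only:
            count = sum(1 for _, t in self.events if t == event_type)
        else:
            count = len(self.events)
        return count >= THRESHOLD
```

With `same_type_only=False`, an e-mail, an SMS, and a call within the window reach the threshold together; with `same_type_only=True`, only three events of the same type do.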
- FIG. 7D depicts a process 730 for outputting a modified alert sequence according to one or more embodiments of the present disclosure.
- an alert is configured to escalate by producing a stimulus or output that increases in intensity over time.
- the escalation sequence or progression of the alert is interrupted and modified as a result of a user interaction with the device.
- an interaction from the user may indicate that the user's attention is already focused on the device and that further escalation of the alert sequence may not be necessary.
- an escalating alert sequence is output by the device until receiving an input or an interaction from the user.
- the device detects passive interaction, such as detecting or estimating when a user is reading content on the display. The operations of process 730 may be performed using, for example, the example devices described above with respect to FIGS. 1A-B .
- a portion of an alert sequence is output using the device.
- the alert sequence includes a predetermined sequence of alert outputs that escalate in intensity over time.
- each of the alert outputs includes one or more alert components (audio, haptic, visual) that increase in intensity over time.
- the alert escalates by adding components to subsequent outputs to increase the overall intensity of the alert.
- a first output includes only a haptic component
- a second output includes a haptic and audio component
- a third or subsequent output includes a haptic, audio, and visual component.
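The component-adding escalation just described can be sketched as follows. The component order (haptic first, then audio, then visual) follows the example in the text; the zero-based indexing scheme is an assumption.

```python
# Components added one per output, per the haptic -> audio -> visual example.
ESCALATION_COMPONENTS = ("haptic", "audio", "visual")

def components_for_output(step: int) -> tuple:
    """Components included in the step-th output (0-based) of an
    escalating alert: each output adds one component until all are used."""
    return ESCALATION_COMPONENTS[:min(step + 1, len(ESCALATION_COMPONENTS))]
```

The first output is haptic only, the second adds audio, and the third and later outputs include all three components.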
- the alert sequence corresponds to the occurrence of a single event.
- the alert sequence corresponds to the receipt of a message or other communication.
- the alert sequence can optionally also correspond to an upcoming calendar event, a timing notification, or other type of reminder.
- the alert sequence can optionally also correspond to a series of activity monitor alerts that provide feedback with regard to the progress of the user toward meeting a fitness goal, such as completing a workout routine.
- an interaction with the user is detected.
- the interaction with the user is optionally detected while the portion of the alert sequence is being output.
- the interaction with the user may include an active and/or passive interaction.
- a user interaction indicates that the user is already paying attention to the device and may not need or want an alert to continue to escalate.
- a user interaction is interpreted as a request to modify the alert sequence to either increase or decrease the intrusiveness of the alert.
- the user 210 may actively interact with the device by, for example, contacting a touch-sensitive surface of the device 100 .
- the user may actively interact with the device by selecting an item displayed on a touch-sensitive display of the device 100 .
- the user 210 may also actively interact with the device by pressing a button, turning a knob, or by providing some other form of user input.
- the user may speak a voice command, which may be detected by the microphone of the device and interpreted as user input.
- the device is configured to detect passive interaction with the user.
- the device 100 can be configured to use one or more sensors to determine or estimate if the user is looking at the display, and therefore, at least passively interacting with the device.
- passive interaction is detected, for example, using one or more optical sensors to detect the position and movement of the user's head.
- the one or more optical sensors are configured to sense the location and movement of the user's eye, which may be consistent with the user 210 reading or viewing the display of the device 100 .
- the device is configured to detect the user's grip on the device, which may also indicate that the user is currently viewing or interacting with the device.
- a modified alert sequence is selected and output in response to the interaction with the user.
- the original alert sequence output in operation 731 is paused or terminated and a new, modified alert sequence is output instead.
- the modified alert sequence is a continuation of the original alert sequence, but is modified in intensity or intrusiveness. For example, if the original alert sequence includes a series of 10 outputs and the user interaction is detected on the 6th output, the modified alert sequence includes 4 outputs, which replace the 4 remaining outputs of the original series of 10.
- the modified alert sequence is a non-escalating alert sequence.
- the modified alert sequence includes a series of outputs that do not increase in intensity over time.
- the modified alert sequence includes a series of outputs that decrease in intensity over time.
- the modified alert sequence is a silent alert sequence.
- the modified alert sequence produces only a visual component in response to the interaction with the user.
- process 730 is used to increase or decrease the intrusiveness of an alert sequence.
- the user interaction of operation 732 includes an input that is a request to reduce the intrusiveness of the portion of the alert sequence output in operation 731 .
- the request to reduce the intrusiveness is input via the touch-sensitive display of the device and results in the output of a modified alert sequence that is less intrusive.
- a less intrusive output includes an output having an audio component that is reduced in volume and/or a haptic component having a shorter or lower energy output.
- the user interaction of operation 732 includes an input that is a request to increase the intrusiveness of the portion of the alert sequence output in operation 731 .
- a user input results in a modified alert sequence having an increased intrusiveness.
- An output having an increased level of intrusiveness can optionally include, for example, an audio component having increased volume and/or a haptic component having an increased duration or energy output.
- a user receives an alert sequence triggered by an event associated with the user meeting a health-related goal.
- the alert sequence may be produced in response to the user reaching a target heart rate, as detected by a heart-rate monitor.
- the alert sequence escalates or increases in intensity until the user interacts with the device by, for example, shaking the device, which is perceived by one or more motion sensors integrated with the device. Additionally or alternatively, the user interacts with the device by touching the touch screen, pushing a button, or turning a knob.
- the alert sequence is interrupted and a modified alert sequence is output.
- the modified alert sequence is a non-escalating sequence, such as a sequence of audio beeps.
- the modified alert sequence continues as long as the user maintains the target heart rate or another event triggers another alert.
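The interrupted-escalation behavior, such as the 10-output example above, can be sketched as follows. The intensity numbers and the flat (non-escalating) replacement sequence are assumptions drawn from the "non-escalating" and "increase intrusiveness" variants in the text.

```python
def modified_sequence(original, interaction_index, request="reduce"):
    """Replace the outputs that remain after a user interaction.

    original: planned escalating intensities, e.g. [1, 2, ..., 10]
    interaction_index: 0-based index of the output during which the
        interaction was detected
    request: "reduce" holds intensity flat (non-escalating);
        "increase" raises the remaining outputs instead.
    """
    done = list(original[:interaction_index + 1])   # outputs already produced
    remaining = len(original) - len(done)           # outputs to replace
    current = original[interaction_index]
    step = 0 if request == "reduce" else 1
    return done + [current + step] * remaining
```

For a 10-output sequence with the interaction detected on the 6th output, the 4 remaining outputs are replaced by 4 non-escalating outputs, matching the example in the text.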
- FIG. 7E depicts a process 740 for determining an output device according to one or more embodiments of the present disclosure.
- multiple devices are proximate to a user when an event is detected. It may be undesirable for each of the devices to output an alert in response to the same event. Thus, in some cases, it may be advantageous to select one device to output the alert.
- the operations of process 740 may be performed using, for example, the example devices described above with respect to FIGS. 1A-B and 3 .
- an event is detected.
- the device detects the occurrence of an event by, for example, receiving a notification or a message related to the event or receiving a notification or message from an application or program that is being executed on the device.
- the device receives a notification or message that the user has received an incoming e-mail message, text message, telephone call, voicemail message, and the like.
- an event can optionally be triggered in relation to a wide range of activities or functions, including satisfying a personal health goal, reaching a geographic location, reaching a target heart rate, reaching an oxygenation level, or satisfying other criteria.
- a determination that a second device is in proximity to the first device is made. In some cases, the determination is made in response to detecting the event.
- the first device is a wearable electronic device 100 and the second device is a mobile telephone 130 , as depicted in FIG. 3 .
- one or more of another type of device such as a notebook computer, a desktop computer, a tablet device, a personal media player device, a television device, or the like are in proximity to the user.
- a third device, a fourth device, and other additional devices are also detected in accordance with operation 742 .
- a second device is detected using, for example, a wireless communication signal or beacon.
- the first and second device have been previously paired using, for example, a Bluetooth communication scheme.
- the second device is detected using an automatic detect and connect process as part of the Bluetooth connection routine.
- the second device is detected using a beacon or intermittent broadcast signal that is transmitted from the first device.
- the second device is detected as being connected to a common wireless communication node.
- the second device can optionally be detected due to a shared WiFi connection with the first device.
- one or both of the devices may include location determining software and/or hardware (e.g., global positioning system (GPS), base-station triangulation) that is used to determine if the second device is proximate to the first device.
- an alert-output device is selected.
- the devices may include a wearable electronic device, a mobile telephone, a notebook computer, a desktop computer, a tablet device, a personal media player device, a television device, or the like.
- the selection of the alert-output device is performed in accordance with a number of different techniques.
- the alert-output device is selected based on a user-provided prioritization. For example, the user may provide or designate an ordered priority of multiple devices that may be used to select an appropriate alert-output device.
- the alert-output device is selected based on a usage of one or more of the devices that are in proximity to each other.
- the usage includes a time of usage.
- a device that has a time of usage that is most recent is selected as the alert-output device.
- a time of usage is only used if the most recent time is within a threshold (e.g., the data is not too old to be relevant).
- the usage includes both a time of usage and an amount of usage.
- the device having an amount of usage that is greater over a predetermined time period is selected as the alert-output device.
- the usage includes a type of usage.
- a device having a usage that corresponds to a predetermined usage type is selected as the alert-output device.
- if the device is being used in accordance with a communications program (e.g., an e-mail program, a text messaging program), the device is selected as the alert-output device.
- if an alert-output device is selected, the alert is output on the alert-output device.
- the alert is output in accordance with one or more other aspects of the present disclosure.
- the alert output includes one or more components (audio, haptic, visual) that correspond to one or more environmental conditions.
- the alert output is fixed or selected by the user.
- only the alert-output device outputs the alert associated with a particular event. For example, no other device that has been determined to be proximate to the user outputs an alert associated with the event. In some embodiments all devices that are determined to be proximate to the user are updated in response to the event, but only the alert-output device outputs an alert. For example, an email or message may be loaded or delivered to all of the devices, but the alert associated with the reception of the e-mail is only output on, for example, a wearable electronic device worn by the user. In some cases, the user may perceive the alert on the wearable device and then check the message on a phone, laptop, or tablet.
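The usage-based selection of operation 743 can be sketched as follows. The `Device` fields, the 10-minute freshness window, and the recency-then-priority fallback combine the criteria listed in the text (user-provided prioritization, time of usage, amount of usage); the exact combination is an assumption.

```python
from dataclasses import dataclass

RECENCY_THRESHOLD_S = 600.0  # assumed: usage data older than this is ignored

@dataclass
class Device:
    name: str
    user_priority: int    # lower = preferred, per user-provided ordering
    last_used: float      # timestamp of most recent use
    usage_seconds: float  # amount of use over a recent period

def select_alert_output_device(devices, now):
    """Pick the single proximate device that outputs the alert."""
    # Only consider usage data that is recent enough to be relevant.
    fresh = [d for d in devices if now - d.last_used <= RECENCY_THRESHOLD_S]
    if fresh:
        # Most recently used wins; amount of usage breaks ties.
        return max(fresh, key=lambda d: (d.last_used, d.usage_seconds))
    # No recent usage data: fall back to the user-provided prioritization.
    return min(devices, key=lambda d: d.user_priority)
```

This reproduces the pocketed-phone scenario below: a recently used phone is selected, but a phone that has been dormant past the threshold loses to the higher-priority wearable.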
- the alert is relayed to the alert-output device using a device that is not selected as the alert-output device.
- a notification of an incoming message may be received by a user's mobile telephone 130 .
- the mobile telephone 130 may relay the notification to the wearable device 100 in order to trigger an alert output.
- the mobile telephone 130 may generate a portion of the alert, which may be relayed and output using the wearable device 100 .
- a subsequent alert is output using one or more of the other devices that were not selected as an alert-output device.
- the alert is output on the first device.
- the alert is output on the device that detected the event.
- the user receives an e-mail, which triggers an event that is detected by the mobile telephone 130 .
- the mobile telephone 130 detects or has detected the proximity of the wearable electronic device 100 by, for example, having previously been paired using a Bluetooth or other wireless connection.
- the mobile telephone 130 has been in a dormant state for a period of time due to its placement in the user's pocket. Due to the low or non-usage of the mobile telephone 130 , the wearable electronic device 100 is selected as the alert-output device. In this scenario, the alert associated with the incoming e-mail is relayed to and output on the wearable electronic device 100 .
- FIG. 7F depicts a process 750 for producing an audio and haptic feedback in response to a user input according to one or more embodiments of the present disclosure.
- the device is configured to produce a stimulus that may provide feedback for a user-action or input to the device.
- a stimulus or feedback may be useful for some user input components, such as electronic sensors, that may have few or no moving parts to provide feedback to the user that an input is being received.
- an audio and haptic component may be output in response to a user's interaction with a touch screen or interaction with a rotational dial or button on the device.
- the stimulus may be adapted to mimic a sound or haptic response that the user may associate with a more traditional mechanical device.
- the operations of process 750 may be performed using, for example, the example devices described above with respect to FIGS. 1A-B and 5 .
- a first input is received on the device.
- the first input is received via, for example, a touch-sensitive surface of the device, a crown or dial, or some other user input device.
- a user input is received as a translational panning or scrolling input 615 on the touch-sensitive surface of the display 110 .
- the user input is received as a rotational input 612 provided using the crown 610 of the device 100 .
- the first input is below an input threshold.
- the speed or rate of the first input may be slower than a threshold.
- the movement of the touch may be below a threshold rate.
- the speed of the rotation may be below a threshold rate.
- in response to detecting the first input, a first output is produced.
- the first output includes a haptic component for the first input that is coordinated with an audio component for the first input.
- the haptic component is a tap or bump created by a haptic actuator integrated with the device.
- the audio component includes a beep or click that corresponds with the tap or bump created by the haptic actuator.
- the haptic component is synchronized with the audio component.
- the haptic component has an output that is simultaneous to or at a fixed timing relationship with respect to the output of the audio component.
- the first output corresponds to the rate or speed of the first input.
- a haptic tap and an audio beep or click correspond to the progression through the list caused by the first input. As previously mentioned, this may be more readily perceived by the user or more satisfying than, for example, the visual scrolling of the items alone.
- a user provides a rotational input via, for example, the crown 610 or knob of the device depicted in FIG. 5 .
- a haptic tap and an audio click can optionally be output for, for example, every 5 degrees of rotation of the crown or knob. In this way, the user receives feedback on the speed at which the input is being received by the device.
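The per-increment feedback just described can be sketched as follows. The 5-degree detent interval follows the example in the text; the caller would emit one synchronized haptic tap and audio click per returned tick.

```python
DEGREES_PER_TICK = 5.0  # per the 5-degree example in the text

class CrownFeedback:
    def __init__(self):
        self.accumulated = 0.0  # rotation not yet "spent" on a tick

    def on_rotation(self, delta_degrees: float) -> int:
        """Accumulate rotation; return how many tap/click pairs to emit."""
        self.accumulated += abs(delta_degrees)
        ticks = int(self.accumulated // DEGREES_PER_TICK)
        self.accumulated -= ticks * DEGREES_PER_TICK
        return ticks
```

Carrying the remainder between updates means slow, fragmented rotations still produce a tick once 5 degrees have accumulated.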
- a second input is received on the device. Similar to operation 751 , in some implementations, the second input is received via a touch-sensitive surface, crown, or other input device. In the present example, the second input is provided via the same input device as the first input of operation 751 . In some implementations, the second input occurs immediately after the first input or, alternatively, the second input occurs after a delay.
- the input threshold is set or determined based, in part, on the limitations of the hardware used to provide the feedback output.
- a haptic actuator has a minimum response time that is an inherent property or physical limitation of the haptic actuator mechanism, which typically includes a moving mass. An example haptic actuator is described in more detail below with respect to FIG. 11 . In some embodiments, the minimum response time is due to the time it takes to initiate movement of the mass, produce a haptic output, and stop movement of the mass.
- the input threshold of operation 754 is determined, at least in part, based on the upper limit of the haptic actuator.
- the output is similar to the output produced for operation 752 .
- the second input output includes a haptic component for the second input that is coordinated with an audio component for the second input. Similar to the previous example, in some implementations, the haptic component is synchronized with the audio component. In some implementations, the haptic component has an output that is simultaneous to or at a fixed timing relationship with respect to the output of the audio component.
- the second output includes a modified haptic component for the second input. In some instances, if it is determined that the rate or speed of the input may exceed the capabilities of the haptic actuator, the output is modified. In some embodiments, if the second input is above the input threshold, the haptic component for the second input is asynchronous with respect to the audio component for the second input. In some embodiments, the asynchronous haptic output is a continuous haptic output. In some instances, the second output includes a continuous haptic output but maintains a distinct audio “click” output that corresponds to an amount of input that is provided. In some cases, the continuous haptic output includes inflection points or periods of varying intensity. In some implementations, the inflection points of the haptic output are not synchronized with the audio output.
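The threshold logic of process 750 can be sketched as follows. The 20 ms minimum actuator cycle is an assumed figure; the text says only that the moving-mass actuator has an inherent minimum response time that sets the input threshold.

```python
ACTUATOR_MIN_CYCLE_S = 0.020  # assumed: time to start the mass, tap, and stop it

def feedback_mode(inputs_per_second: float) -> str:
    """Choose how haptics pair with audio clicks for a scrolling input.

    Below the threshold, each audio click gets its own synchronized tap;
    above it, the actuator cannot complete discrete taps fast enough, so
    a continuous haptic runs while distinct clicks continue asynchronously.
    """
    if inputs_per_second <= 0:
        return "idle"
    if 1.0 / inputs_per_second >= ACTUATOR_MIN_CYCLE_S:
        return "synchronized-taps"
    return "continuous-haptic-async-clicks"
```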
- FIG. 8 depicts a block diagram illustrating exemplary components, such as, for example, hardware components of an electronic device 800 according to one or more embodiments of the present disclosure.
- the electronic device 800 is similar to the wearable electronic device 100 described above with respect to FIG. 1A or the electronic device 130 described above with respect to FIG. 1B .
- while various components of the device 800 are shown, connections and communication channels between the components are omitted for simplicity.
- the electronic device 800 may include at least one processor 805 and an associated memory 810 .
- the memory 810 comprises, but is not limited to, volatile storage such as random access memory, non-volatile storage such as read-only memory, flash memory, or any combination thereof.
- the memory includes removable and non-removable memory components, including, for example, magnetic disks, optical disks, or tape.
- the memory 810 stores an operating system 812 and one or more program modules 814 suitable for running software applications 816 .
- the operating system 812 can be configured to control the electronic device 800 and/or one or more software applications 816 being executed by the operating system 812 .
- various program modules and data files are stored in the system memory 810 .
- the program modules 814 and the processor 805 can be configured to perform processes that include one or more of the operations of methods shown and described with respect to FIGS. 7A-F .
- the electronic device 800 also includes communication connections 808 that facilitate communications with additional computing devices 806 .
- the communication connections 808 include a radio-frequency (RF) transmitter, a receiver, and/or transceiver circuitry, universal serial bus (USB) communications, parallel ports and/or serial ports.
- Computer readable media can optionally include computer storage media.
- Computer storage media can optionally include volatile and nonvolatile media and/or removable and non-removable media implemented in any method or technology for the storage of information. Examples include computer-readable instructions, data structures, or program modules.
- the memory 810 , which can optionally include the removable and non-removable storage devices, is one example of computer storage media.
- Computer storage media can optionally include RAM, ROM, electrically erasable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other article of manufacture which can be used to store information and which can be accessed by the electronic device 800 . Any such computer storage media can optionally be part of the electronic device 800 .
- FIG. 9 depicts additional aspects of the electronic device 800 according to one or more embodiments of the present disclosure.
- FIG. 9 is a block diagram illustrating the architecture of an electronic device such as electronic device 100 shown and described with respect to FIG. 1A or electronic device 130 shown and described with respect to FIG. 1B .
- a touch-sensitive device 820 such as a touch sensor or touch screen is integrated with a surface of the device.
- the touch sensor or touch screen includes a capacitive sensor that is configured to detect the location of one or more touches on the surface of the device.
- the device includes a force sensor that is configured to detect and measure the force of a touch on a device.
- the device 800 includes one or more buttons 822 and knobs 824 that are configured to accept user input. Additional user input can optionally be provided via a keyboard, mouse, pen or stylus, sound input device, and the like.
- the knob 824 can include a crown of a portable electronic device.
- the knob 824 or crown is operatively coupled to a position sensor, such as an optical encoder, that is configured to produce an output in response to a rotational input.
- an example device includes one or more environmental sensors that are configured to monitor and detect one or more environmental conditions.
- Example sensors 830 include motion sensors, including accelerometers, gyroscopes, tilt sensors, and the like.
- the sensors 830 also include one or more optical sensors, including an image sensor, an ambient light sensor (ALS), a proximity sensor, and the like.
- the sensors 830 also include a microphone or other audio sensing device.
- the device 800 includes one or more devices or components for providing output to the user. As shown in FIG. 9 , the device includes a display 840 for presenting visual information or output to the user. In some embodiments, the display 840 is formed from a liquid crystal display (LCD), organic light emitting diode (OLED) display, organic electroluminescence (OEL) display, or other type of display device. In some embodiments, the device 800 includes a visual indicator 842 , such as a beacon or strobe light, that is configured to provide additional visual output to the user.
- the device 800 includes a speaker 844 or other acoustic component.
- the speaker 844 can be used to produce an audio output in accordance with some aspects of the disclosure.
- An example speaker component is described below with respect to FIG. 10 .
- the device 800 also includes a haptic actuator 846 that is configured to produce a haptic output in accordance with some aspects of the disclosure.
- An example haptic actuator is described below with respect to FIGS. 11A-B .
- data and information generated or captured by the electronic device 800 is stored locally. Additionally or alternatively, the data can be stored on any number of storage media that can optionally be accessed by the electronic device 800 using the communications connection ( 808 in FIG. 8 ), a wired connection, or a wireless connection between the electronic device 800 and a remote computing device 806 . Additionally, data and information can be readily transferred between computing devices.
- FIG. 10 depicts an example acoustic module in accordance with some embodiments.
- the device includes one or more devices for transmitting acoustic energy.
- embodiments of the device include a speaker for transmitting acoustic energy.
- FIG. 10 depicts a simplified schematic cross-sectional view of a first embodiment of a device having a speaker 1000 .
- the representation depicted in FIG. 10 is not drawn to scale and does not include all elements of every embodiment of a speaker.
- the speaker 1000 is representative of speakers or acoustic elements described with respect to one or more embodiments described herein.
- the speaker 1000 includes various components for producing and transmitting sound, including a diaphragm 1010 , a voice coil 1009 , a center magnet 1008 , and side magnets/coils 1007 .
- the diaphragm 1010 is configured to produce sound waves or an acoustic signal in response to a stimulus signal in the center magnet 1008 .
- a modulated stimulus signal in the center magnet 1008 causes movement of the voice coil 1009 , which is coupled to the diaphragm 1010 .
- Movement of the diaphragm 1010 creates the sound waves, which propagate through the acoustic cavity 1011 of acoustic module 106 and eventually out the acoustic port 1020 to a region external to the device.
- the acoustic cavity 1011 functions as an acoustical resonator having a shape and size that is configured to amplify and/or dampen sound waves produced by movement of the diaphragm 1010 .
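One simple model of such a cavity-and-port resonator, offered here as an illustrative assumption rather than part of the disclosure: if the acoustic cavity 1011 and acoustic port 1020 behave approximately as a Helmholtz resonator, the frequency that is amplified can be estimated from the geometry:

```latex
f_0 = \frac{c}{2\pi}\sqrt{\frac{A}{V\,L_{\mathrm{eff}}}}
```

where \(c\) is the speed of sound, \(A\) is the port cross-sectional area, \(V\) is the cavity volume, and \(L_{\mathrm{eff}}\) is the effective port length including end corrections. Whether this model applies to the depicted geometry is an assumption.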
- the speaker 1000 also includes a yoke 1014 , support 1013 , connector element 1012 , and a cavity wall 1013 . These elements provide the physical support of the speaker elements. Additionally, the connector element 1012 and the cavity wall 1013 together form at least part of the acoustic cavity 1011 .
- the specific structural configuration of FIG. 10 is not intended to be limiting.
- the acoustic cavity can optionally be formed from additional components or can optionally be formed from a single component.
- the speaker 1000 depicted in FIG. 10 is provided as one example of a type of speaker or acoustic module.
- the speaker includes different configurations for producing and transmitting sound, including, for example, a vibrating membrane, piezoelectric transducer, vibrating ribbon, or the like.
- the acoustic module is a microphone acoustic module having one or more elements for converting acoustic energy into an electrical impulse.
- the acoustic module can alternatively include a piezoelectric microphone element for producing a charge in response to acoustic energy or sound.
- an acoustic port 1020 is formed in the case 1021 of the electronic device.
- the acoustic port 1020 includes a first and second orifice 1031 , 1032 that are formed in the case 1021 and acoustically couple the acoustic cavity 1011 of the speaker 1000 to the external environment (external to the electronic device).
- the first and second orifices 1031 , 1032 are offset with respect to the opening of the acoustic cavity 1011 . This configuration may help reduce the direct ingress of liquid 1001 into the acoustic cavity 1011 of the speaker 1000 .
- the speaker 1000 also includes a screen element 1015 disposed at one end of the acoustic cavity 1011 , which may also prevent the ingress of liquid or other foreign debris into the acoustic cavity 1011 .
- FIGS. 11A-B depict an example haptic actuator in accordance with some embodiments.
- the device includes one or more haptic modules for providing haptic feedback to the user.
- a haptic device is configured to produce a mechanical movement or vibration that is transmitted through the case and/or other component of the device. In some cases, the movement or vibration is transmitted to the skin of the user and perceived as a stimulus or haptic feedback by the user.
- a haptic mechanism may use a moving mass to create the movement or vibration of the haptic output. The larger the moving mass, the easier it may be to create a perceivable stimulus using the haptic mechanism.
- a large moving mass and the supporting mechanism may be difficult to integrate into the compact space of, for example, the case of a wearable electronic wristwatch device.
- FIGS. 11A-B depict one example haptic mechanism suitable for use in a wearable electronic device. While the embodiment described with respect to FIGS. 11A-B is provided as one example, the haptic module is not limited to this particular configuration.
- FIG. 11A depicts a three-quarters perspective view of a haptic module 1100 , with the top, front, and left sidewalls of the case 1120 removed to expose internal components.
- FIG. 11B depicts a cross-sectional perspective view of the haptic module 1100 cut in half to expose the internal components.
- a coil 1101 is used to induce movement of a frame 1160 , which houses a central magnet array 1110 .
- the movement of the frame 1160 is guided by a shaft 1150 that is fixed with respect to a case 1120 .
- the coil 1101 is energized by transmitting a current (e.g., from the battery) along a length of a wire that forms the coil 1101 .
- a direction of the current along the wire of the coil 1101 determines a direction of a magnetic field that emanates from the coil 1101 .
- the direction of the magnetic field determines a direction of movement of the frame 1160 housing the central magnet array 1110 .
- One or more springs can optionally bias the frame 1160 toward the middle region of its travel.
- the frame 1160 and central magnet array 1110 through operation of the coil 1101 function as a moving mass, which generates a tap or vibration.
- the output of the haptic module 1100 created by the moving mass of the frame 1160 and central magnet array 1110 , may be perceived as a haptic feedback or stimulus to the user wearing the device.
- when the coil 1101 is energized, it generates a magnetic field.
- the opposing polarities of the magnets in the magnet array 1110 generate a radial magnetic field that interacts with the magnetic field of the coil 1101 .
- the Lorentz force resulting from the interaction of the magnetic fields causes the frame 1160 to move along the shaft 1150 in a first direction. Reversing current flow through the coil 1101 reverses the Lorentz force.
- the magnetic field or force on the central magnet array 1110 is also reversed and the frame 1160 moves in a second direction.
- the frame 1160 can optionally move in either direction along the shaft 1150 , depending on the direction of current flow through the coil.
- the coil 1101 encircles the central magnet array 1110 , which is disposed near the center of the frame 1160 .
- the coil 1101 can be selectively energized by transmitting a current along the length of the wire forming the coil 1101 , and the direction of the current flow determines the direction of the magnetic flux emanating from the coil 1101 in response to the current. Passing an alternating current through the coil 1101 causes the central magnet array 1110 (and frame 1160 ) to move back and forth along the shaft 1150 .
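By way of illustration, the alternating drive current described above can be sketched in code; the sinusoidal waveform, sample rate, and function name are illustrative assumptions rather than part of the disclosed haptic module.

```python
import math

def drive_waveform(freq_hz, amplitude, duration_s, sample_rate_hz=1000):
    """Generate an alternating coil-current waveform.

    Each sign change of the current reverses the Lorentz force on the
    central magnet array, moving the frame back and forth along the
    shaft.  All parameter values here are illustrative.
    """
    n = int(duration_s * sample_rate_hz)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / sample_rate_hz)
            for i in range(n)]
```

A short burst (e.g., `drive_waveform(175, 1.0, 0.05)`) would correspond to a brief tap, while a longer burst yields a sustained vibration.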
- the shaft 1150 can optionally be formed from a non-ferritic material such as tungsten, titanium, stainless steel, or the like.
- the coil 1101 is positioned within the frame 1160 , which holds the central magnet array 1110 but is not affixed to the coil 1101 . Rather, an air gap separates the coil 1101 from the central magnet array 1110 , and the frame 1160 is free to move with respect to the coil 1101 , which is generally stationary. Further, the frame 1160 generally moves with the central magnet array 1110 . As illustrated in FIGS. 11A-B , the frame 1160 has an aperture formed therein of sufficient size to contain the coil 1101 .
- the coil 1101 does not contact any portion of the frame 1160 .
- the coil 1101 remains stationary in the case 1120 while the frame 1160 and central magnet array 1110 move, although in other embodiments the coil 1101 moves instead of, or in addition to, the frame and/or central magnet array.
- the central magnet array 1110 is formed from at least two magnets 1111 , 1112 of opposing polarities.
- a center interface 1170 can optionally be formed from a ferritic or non-ferritic material, depending on the embodiment.
- a ferritic material for the center interface 1170 may enhance the overall magnetic field generated by the central magnet array 1110 , while a non-ferritic material may provide at least a portion of a return path for magnetic flux and thus assist in localizing the flux within the case 1120 .
- the magnets 1111 , 1112 are formed from neodymium, while the frame is formed from tungsten. This combination may provide a strong magnetic field and a dense mass, yielding a high weight-per-volume structure suitable for use as the moving part of the haptic module 1100 .
- FIG. 12 depicts an example crown with an optical encoder in accordance with some embodiments.
- the crown and optical encoder of FIG. 12 may correspond to the example crown 610 described above with respect to FIG. 6 .
- embodiments of the device include a crown used to accept rotary input from the user, which can be used to control aspects of the device.
- the crown can be turned by the user to scroll a display or select from a range of values.
- the crown can be rotated to move a cursor or other type of selection mechanism from a first displayed location to a second displayed location in order to select an icon or move the selection mechanism between various icons that are output on the display.
- the crown can also be used to adjust the position of watch hands or index digits displayed on the display of the device.
- the crown can also be used to control the volume of a speaker, the brightness of the display screen, or control other hardware settings.
- the embodiments described herein can be used for at least a portion of the crown module integrated into a wearable electronic device.
- the embodiments are provided as examples and do not necessarily include all of the components or elements used in a particular implementation. Additionally, the crown module is not intended to be limited to the specific examples described below and can vary in some aspects depending on the implementation.
- an optical encoder is used to detect the rotational motion of the crown. More specifically, the example provided below with respect to FIG. 12 uses an optical encoder to detect rotational movement, rotational direction and/or rotational speed of a component of the electronic device. Once the rotational movement, rotational direction and/or rotational speed have been determined, this information can be used to output or change information and images that are presented on a display or user interface of the electronic device.
- the optical encoder of the present disclosure includes a light source 1270 , a photodiode array 1280 , and a shaft 1260 .
- the optical encoder of the present disclosure utilizes an encoding pattern 1265 disposed directly on the shaft 1260 .
- the encoding pattern 1265 includes a number of light and dark markings or stripes that are axially disposed along the shaft 1260 . Each stripe or combination of stripes on the shaft can be used to identify a position of the shaft 1260 .
- Light emitted from the light source 1270 is reflected off of the shaft 1260 and into the photodiode array 1280 .
- the reflected light can be used to determine the movement of the encoding pattern 1265 , and thus the movement of the shaft 1260 and the crown 1200 .
- the output from the photodiode array 1280 can be used to determine a position, rotation, rotation direction, and rotation speed of the shaft 1260 . Based on the rotation, rotation direction, and/or speed, the encoder output may be used to change information or images that are presented on the display or user interface of the electronic device.
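The decoding step can be sketched as follows, assuming (hypothetically) that the photodiode outputs are thresholded into a two-channel quadrature signal; the transition table and function names are illustrative, not the disclosed implementation.

```python
# Sketch: decode direction and speed from two quadrature channels (A, B)
# sampled as the encoding pattern's stripes pass the photodiode array.
# Each valid transition of the 2-bit (A, B) code contributes a +1/-1 step.

_TRANSITIONS = {
    ((0, 0), (0, 1)): +1, ((0, 1), (1, 1)): +1,
    ((1, 1), (1, 0)): +1, ((1, 0), (0, 0)): +1,
    ((0, 0), (1, 0)): -1, ((1, 0), (1, 1)): -1,
    ((1, 1), (0, 1)): -1, ((0, 1), (0, 0)): -1,
}

def decode_steps(samples):
    """Return the net step count from a sequence of (A, B) samples."""
    steps = 0
    for prev, curr in zip(samples, samples[1:]):
        steps += _TRANSITIONS.get((prev, curr), 0)  # ignore repeats/glitches
    return steps

def rotation_speed(samples, sample_rate_hz, steps_per_rev):
    """Estimate revolutions per second over the sampled window."""
    duration = (len(samples) - 1) / sample_rate_hz
    return decode_steps(samples) / steps_per_rev / duration if duration else 0.0
```

The sign of `decode_steps` gives the rotation direction, and its magnitude over time gives the speed used to drive the display.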
- although a photodiode array is specifically mentioned, embodiments disclosed herein can optionally use various types of sensors, arranged in various configurations, for detecting the movement described herein.
- the movement of the shaft 1260 is detected by an image sensor, a light sensor such as a CMOS light sensor or imager, a photovoltaic cell or system, a photoresistive component, a laser scanner, or the like.
- the signals or output of the optical encoder can be used to control various aspects of other components or modules of the device.
- the dial 1240 can be rotated in a clockwise manner in order to advance the displayed time forward.
- the optical encoder can be used to detect the rotational movement of the dial 1240 , the direction of the movement, and the speed at which the dial 1240 is being rotated.
- the displayed hands of a time keeping application may rotate or otherwise move in accordance with the user-provided rotational input.
- an audio and/or haptic output may be generated in accordance with the rotational movement of the dial 1240 .
- an audio click and/or a haptic tap can be output for every 5 degrees, 10 degrees, or other degree amount of rotation of the dial 1240 .
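A minimal sketch of this per-degree detent feedback, with hypothetical names and a configurable detent spacing:

```python
class DetentTicker:
    """Emit one 'tick' each time accumulated rotation crosses a detent.

    detent_degrees is the spacing (e.g., 5 or 10 degrees) between
    audio clicks / haptic taps, as described for rotation of the dial.
    The class shape and defaults are illustrative assumptions.
    """

    def __init__(self, detent_degrees=10):
        self.detent = detent_degrees
        self.position = 0.0       # accumulated rotation in degrees
        self.last_detent = 0      # index of the last detent boundary crossed

    def rotate(self, delta_degrees):
        """Apply a rotation delta; return the number of ticks to output."""
        self.position += delta_degrees
        detent_index = int(self.position // self.detent)
        ticks = abs(detent_index - self.last_detent)
        self.last_detent = detent_index
        return ticks
```

Each returned tick would trigger one audio click and/or haptic tap, so fast rotation produces a burst of ticks while slow rotation produces them one at a time.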
- the crown 1200 is formed from dial 1240 that is coupled to the shaft 1260 .
- the shaft 1260 and dial 1240 are formed as a single piece.
- because the shaft 1260 is coupled to, or is otherwise a part of, the dial 1240 , as the dial 1240 rotates or moves in a particular direction and at a particular speed, the shaft 1260 also rotates or moves in the same direction and with the same speed.
- the shaft 1260 of the optical encoder includes an encoding pattern 1265 .
- the encoding pattern 1265 can be used to determine positional information about the shaft 1260 including rotational movement, angular displacement and movement speed.
- the encoding pattern 1265 includes an array of light and dark stripes.
- the encoding pattern 1265 can consist of various types of stripes having various shades or colors that provide surface contrasts.
- the encoding pattern 1265 can include a stripe or marking that has a high reflective surface and another stripe that has a low reflective surface regardless of the color or shading of the stripes or markings.
- a first stripe of the encoding pattern 1265 causes specular reflection while a second stripe of the encoding pattern causes diffuse reflection.
- the light from the light source 1270 diffracts from the shaft. Based on the diffracted light, the photodiode array 1280 can determine the position, movement and direction of movement of the shaft 1260 .
- the stripes of the encoding pattern 1265 extend axially along the shaft 1260 .
- the stripes extend along the entire length of the shaft 1260 or partially along a length of the shaft 1260 .
- the encoding pattern 1265 is disposed or formed around the entire circumference of the shaft 1260 .
- the encoding pattern 1265 includes a radial component. In yet other embodiments, the encoding pattern 1265 includes both a radial component and an axial component.
- FIG. 13 shows a functional block diagram of an electronic device 1300 configured in accordance with the principles of the various described embodiments.
- the electronic device 1300 can be used to perform the process 700 described above with respect to FIG. 7A .
- the functional blocks of the device are, optionally, implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described embodiments. It is understood by persons of skill in the art that the functional blocks described in FIG. 13 are, optionally, combined or separated into sub-blocks to implement the principles of the various described embodiments. Therefore, the description herein optionally supports any possible combination or separation or further definition of the functional blocks described herein.
- an electronic device 1300 includes an event detection unit 1302 configured to detect the occurrence of an event, an alert output unit 1304 configured to output an alert, a sensing unit 1306 configured to detect one or more environmental conditions, and a processing unit 1310 coupled to the event detection unit 1302 , the alert output unit 1304 , and the sensing unit 1306 .
- the processing unit 1310 includes a response determining unit 1312 and a selecting unit 1314 .
- the processing unit 1310 is configured to, while the device is subject to the one or more environmental conditions, detect the occurrence of an event (e.g., using the event detection unit 1302 ) and, in response to detecting the occurrence of the event, determine a response to the event (e.g., using the response determining unit 1312 ) based on a current alert mode selected from a set of three or more alert modes (e.g., using the selecting unit 1314 ), the selection based on the one or more environmental conditions (e.g., using the sensing unit 1306 ).
- the determining the response includes: in accordance with a determination that the current alert mode is a first alert mode, outputting a first alert in response to the event (e.g., using the alert output unit), and in accordance with a determination that the current alert mode is a second alert mode, outputting a second alert in response to the event (e.g., using the alert output unit), wherein the second alert is different from the first alert.
- the current alert mode is automatically selected (e.g., using the selecting unit), based on the one or more environmental conditions, prior to detecting the occurrence of the event.
- the current alert mode is automatically selected using an environmental sensor (e.g., of the sensing unit 1306 ) that is configured to detect the one or more environmental conditions.
- the environmental sensor is a microphone configured to detect an ambient sound level
- the current alert mode that is selected includes one or more of: a visual component that corresponds to the ambient sound level, an audio component that corresponds to the ambient sound level, and a haptic component that corresponds to the ambient sound level.
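As a sketch, mapping a measured ambient sound level to an alert mode might look like the following; the decibel thresholds, mode contents, and function name are invented for illustration and are not part of the disclosed embodiments:

```python
def select_alert_mode(ambient_db):
    """Map an ambient sound level (dB SPL) to an alert mode.

    In a quiet room a subtle haptic-dominated alert suffices; in a loud
    environment the audio volume is raised so the alert remains
    perceivable.  All thresholds and values here are illustrative.
    """
    if ambient_db < 40:          # quiet: avoid disturbing others
        return {"haptic": "subtle", "audio_volume": 0.0, "visual": True}
    if ambient_db < 70:          # moderate: balanced output
        return {"haptic": "normal", "audio_volume": 0.5, "visual": True}
    return {"haptic": "strong", "audio_volume": 1.0, "visual": True}
```

The same pattern applies to the motion-sensor and light-sensor variants described below, with the activity level or ambient light level substituted for the sound level.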
- the environmental sensor is a motion sensor configured to detect an activity level
- the current alert mode that is selected includes one or more of: a visual component that corresponds to the activity level, an audio component that corresponds to the activity level, and a haptic component that corresponds to the activity level.
- the environmental sensor is an image sensor configured to detect an ambient light level
- the current alert mode that is selected includes one or more of: a visual component that corresponds to the ambient light level, an audio component that corresponds to the ambient light level, and a haptic component that corresponds to the ambient light level.
- the environmental sensor is a battery power sensor configured to detect a current battery level
- the current alert mode that is selected includes one or more of an audio component and a haptic component, wherein an estimated peak power output of the current alert mode corresponds to the current battery level.
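A sketch of the battery-constrained selection, assuming a hypothetical peak-power budget that scales linearly with the battery level; the mode table and budget rule are illustrative assumptions:

```python
def select_power_constrained_mode(battery_fraction, modes):
    """Pick the richest alert mode whose estimated peak power output
    fits the current battery level.

    `modes` maps a mode name to its estimated peak power draw (watts).
    The 1 W full-charge budget is an illustrative assumption.
    """
    budget = battery_fraction * 1.0
    feasible = {name: power for name, power in modes.items() if power <= budget}
    if not feasible:
        return None                   # fall back to forgoing alert output
    # Richest feasible mode = highest power draw that still fits the budget
    return max(feasible, key=feasible.get)
```
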
- the first alert mode includes a first haptic component and a first visual component
- the second alert mode includes a second haptic component and no visual component.
- the first alert mode includes a first audio component and a first haptic component
- the second alert mode includes a second audio component and second haptic component, wherein the first audio and first haptic component are different than the second audio component and the second haptic component, respectively.
- the first alert mode includes no audio component and no haptic component.
- the first alert mode includes a first audio component and a first haptic component offset by a first delay
- the second alert mode includes the first audio component and the first haptic component offset by a second delay that is different than the first delay.
- the processing unit 1310 is further configured to, after selecting the current alert mode, select a subsequent current alert mode (e.g., using the selecting unit 1314 ) based on a changed environmental condition.
- determining the response to the event includes, in accordance with a determination that the current alert mode is a third alert mode (e.g., using the response determining unit 1312 ), outputting a third alert in response to the event, wherein the third alert is different from the first alert and the second alert.
- FIG. 14 shows a functional block diagram of an electronic device 1400 configured in accordance with the principles of the various described embodiments.
- the electronic device 1400 can be used to perform the process 710 described above with respect to FIG. 7B .
- the functional blocks of the device are, optionally, implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described embodiments. It is understood by persons of skill in the art that the functional blocks described in FIG. 14 are, optionally, combined or separated into sub-blocks to implement the principles of the various described embodiments. Therefore, the description herein optionally supports any possible combination or separation or further definition of the functional blocks described herein.
- an electronic device 1400 includes an event detection unit 1402 configured to detect the occurrence of an event, an alert output unit 1404 configured to output an alert, a sensing unit 1406 configured to detect an activity level, and a processing unit 1410 coupled to the event detection unit 1402 , the alert output unit 1404 , and the sensing unit 1406 .
- the processing unit 1410 includes a threshold determining unit 1412 configured to determine whether an activity level exceeds a threshold.
- the processing unit 1410 is configured to detect an event (e.g., using the event detection unit 1402 ) and, in response to detecting the event: in accordance with a determination that an activity level exceeds a threshold (e.g., using the threshold determining unit 1412 ), forgo outputting an alert; and in accordance with a determination that the activity level does not exceed the threshold, output the alert (e.g., using the alert output unit 1404 ).
- the activity level is determined using a motion sensor of the electronic device (e.g., using the sensing unit 1406 ). In some embodiments, the activity level is determined using a motion sensor of the electronic device to detect a number of motion events over a predetermined time (e.g., using the sensing unit 1406 ).
- the processing unit 1410 is further configured to, after forgoing outputting the alert, detect that the activity level has dropped below a low-activity threshold (e.g., using the threshold determining unit 1412 ) and output the alert (e.g., using the alert output unit 1404 ).
- the low-activity threshold is different than the threshold.
- the processing unit 1410 is further configured to, a predetermined amount of time after forgoing the alert, make a subsequent determination (e.g., using the threshold determining unit 1412 ) whether the threshold has been exceeded and, in accordance with a subsequent determination that the activity level exceeds the threshold, forgo outputting the alert, and in accordance with a subsequent determination that the activity level does not exceed the threshold, output the alert (e.g., using the alert output unit 1404 ).
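Taken together, the activity-gated behavior of FIG. 14 can be sketched as follows; the thresholds, the motion-event metric, and the class shape are illustrative assumptions, not the disclosed implementation:

```python
class ActivityGatedAlerts:
    """Suppress alerts while the wearer is highly active, then deliver
    them once activity drops below a (possibly different) low-activity
    threshold, as described for FIG. 14.

    Activity is expressed here as motion events per minute from a
    motion sensor; the numeric thresholds are illustrative only.
    """

    def __init__(self, high_threshold=20, low_threshold=10):
        self.high = high_threshold   # above this, alerts are held
        self.low = low_threshold     # below this, held alerts are released
        self.pending = []

    def on_event(self, alert, activity_level):
        """Return alerts to output now; hold the alert if too active."""
        if activity_level > self.high:
            self.pending.append(alert)
            return []
        return [alert]

    def on_activity_update(self, activity_level):
        """Release held alerts once activity falls below the low threshold."""
        if activity_level < self.low and self.pending:
            released, self.pending = self.pending, []
            return released
        return []
```

Using distinct high and low thresholds (hysteresis) prevents alerts from toggling when the activity level hovers near a single threshold.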
- FIG. 15 shows a functional block diagram of an electronic device 1500 configured in accordance with the principles of the various described embodiments.
- the electronic device 1500 can be used to perform the process 720 described above with respect to FIG. 7C .
- the functional blocks of the device are, optionally, implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described embodiments. It is understood by persons of skill in the art that the functional blocks described in FIG. 15 are, optionally, combined or separated into sub-blocks to implement the principles of the various described embodiments. Therefore, the description herein optionally supports any possible combination or separation or further definition of the functional blocks described herein.
- an electronic device 1500 includes an event detection unit 1502 configured to detect the occurrence of an event, an alert output unit 1504 configured to output an alert, and a processing unit 1510 coupled to the event detection unit 1502 and the alert output unit 1504 .
- the processing unit 1510 includes a threshold determining unit 1512 that is configured to determine whether a number of events exceeds a threshold.
- the processing unit 1510 is configured to detect an event (e.g., using the event detection unit 1502 ). In response to detecting the event, the processing unit 1510 is further configured to, in accordance with a determination that a number of events that have been detected over a predetermined period exceeds a threshold (e.g., using the threshold determining unit 1512 ), output an alert (e.g., using the alert output unit 1504 ), and in accordance with a determination that the number of events that have been detected over the predetermined period does not exceed the threshold, forgo outputting the alert.
- the processing unit 1510 is further configured to detect a subsequent event (e.g., using the event detection unit) and, in response to detecting the subsequent event, if an alert associated with a previous event has been forgone: in accordance with a determination that the number of events that have been detected over the predetermined period exceeds the threshold (e.g., using the threshold determining unit 1512 ), output the alert (e.g., using the alert output unit 1504 ), wherein the alert is based, at least in part, on the previous event whose alert was forgone.
- the determination as to whether the number of events that have been detected over the predetermined period exceeds the threshold includes counting the event.
- the determination as to whether the number of events that have been detected over the predetermined period exceeds the threshold includes counting one or more prior events that were detected within the predetermined period before the event was detected.
- the alert includes information indicative of the event and one or more prior events occurring prior to the event.
- a strength of the alert (e.g., using the alert output unit 1504 ) corresponds to the number of detected events.
- the strength of the alert corresponds to a frequency and a type of detected events.
- the event includes one or more of: receiving an e-mail, receiving a phone call, receiving a message, and receiving a calendar reminder.
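A sketch of this count-over-a-window batching, with the window length, threshold, and strength rule invented for illustration:

```python
from collections import deque

class BatchedAlerter:
    """Hold alerts until enough events accumulate in a sliding window,
    then emit one combined alert covering the held events.

    Events whose alerts were forgone are folded into the eventual
    combined alert, and the alert strength scales with how many events
    it represents.  The window, threshold, and strength rule are
    illustrative assumptions.
    """

    def __init__(self, threshold=3, window=60.0):
        self.threshold = threshold
        self.window = window
        self.events = deque()        # held (timestamp, event) pairs

    def on_event(self, event, now):
        """Count this event plus prior events inside the window; return a
        combined alert if the threshold is met, else None (forgo)."""
        self.events.append((now, event))
        while self.events and now - self.events[0][0] > self.window:
            self.events.popleft()
        if len(self.events) < self.threshold:
            return None
        held = [e for _, e in self.events]
        self.events.clear()
        return {"events": held, "strength": min(1.0, len(held) / 5)}
```
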
- FIG. 16 shows a functional block diagram of an electronic device 1600 configured in accordance with the principles of the various described embodiments.
- the electronic device 1600 can be used to perform the process 730 described above with respect to FIG. 7D .
- the functional blocks of the device are, optionally, implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described embodiments. It is understood by persons of skill in the art that the functional blocks described in FIG. 16 are, optionally, combined or separated into sub-blocks to implement the principles of the various described embodiments. Therefore, the description herein optionally supports any possible combination or separation or further definition of the functional blocks described herein.
- an electronic device 1600 includes an alert output unit 1602 configured to output an alert, an input unit 1604 configured to receive an interaction from the user, and a processing unit 1610 coupled to the alert output unit 1602 and the input unit 1604 .
- the processing unit 1610 includes a detection unit 1612 configured to cooperate with the input unit 1604 to detect an interaction received from the user, and a selecting unit 1614 configured to select a modified alert sequence.
- the processing unit 1610 is configured to output a portion of an alert sequence (e.g., using the alert output unit 1602 ).
- the alert sequence includes a predetermined sequence of alert outputs.
- the processing unit 1610 is also configured to detect an interaction from the user (e.g., using the detecting unit) during the output of the portion of the alert sequence (e.g., using the alert output unit 1602 ). In response to detecting the interaction, the processing unit 1610 is further configured to select a modified alert sequence (e.g., using the selecting unit 1614 ) and output the modified alert sequence (e.g., using the alert output unit 1602 ).
- the alert sequence includes a series of alarm outputs that escalate in intensity over time.
- the modified alert sequence is a non-escalating alert sequence.
- the alert sequence is a sequence of alerts that correspond to a single event.
- the modified alert sequence is a silent alert sequence having no audio component.
- the input received at the input unit 1604 includes a request to reduce an intrusiveness of the portion of the alert sequence, and the modified alert sequence has a reduced intrusiveness.
- the input received at the input unit 1604 includes a request to increase an intrusiveness of the portion of the alert sequence and the modified alert sequence has an increased intrusiveness.
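The escalating sequence and its user-modified variants can be sketched as follows; the intensity levels and the mapping from interaction to modified sequence are illustrative assumptions:

```python
class EscalatingAlarm:
    """An alert sequence that escalates in intensity over time until the
    user interacts; an interaction switches to a modified,
    non-escalating sequence.

    The level names and modification rules below are illustrative.
    """

    LEVELS = ["haptic", "haptic+quiet_audio", "haptic+loud_audio"]

    def __init__(self):
        self.step = 0
        self.modified = None

    def next_output(self):
        """Return the output for the next step of the sequence."""
        if self.modified is not None:
            return self.modified          # non-escalating after interaction
        level = self.LEVELS[min(self.step, len(self.LEVELS) - 1)]
        self.step += 1
        return level

    def on_interaction(self, request="reduce"):
        """Pick the modified sequence based on the user's interaction."""
        if request == "reduce":
            self.modified = "haptic"      # silent: no audio component
        elif request == "increase":
            self.modified = "haptic+loud_audio"
```
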
- FIG. 17 shows a functional block diagram of an electronic device 1700 configured in accordance with the principles of the various described embodiments.
- the electronic device 1700 can be used to perform the process 740 described above with respect to FIG. 7E .
- the functional blocks of the device are, optionally, implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described embodiments. It is understood by persons of skill in the art that the functional blocks described in FIG. 17 are, optionally, combined or separated into sub-blocks to implement the principles of the various described embodiments. Therefore, the description herein optionally supports any possible combination or separation or further definition of the functional blocks described herein.
- an electronic device 1700 includes an event detection unit 1702 configured to detect an event, an alert output unit 1704 configured to output an alert, a communication unit 1706 that is configured to conduct communication between the electronic device and an external device, and a processing unit 1710 coupled to the event detection unit 1702 and the alert output unit 1704 .
- the processing unit 1710 includes a proximity determining unit 1712 configured to determine if a second device is proximate to the electronic device and a device selection unit 1714 configured to select a device to output an alert.
- the processing unit 1710 is configured to detect an event (e.g., using the event detection unit 1702 ). In response to detecting the event, the processing unit 1710 is configured to, in accordance with a determination that a second device is in proximity to the first device (e.g., using the proximity determining unit 1712 ), select an alert-output device (e.g., using the device selection unit 1714 ) and output the alert on the alert-output device (e.g., using the alert output unit 1704 ). In some embodiments, the alert is not output on a device that is not selected as the alert-output device. In some embodiments, the alert is relayed to the second device using the first device (e.g., using the communication unit 1706 ). In some embodiments, the first device is a mobile phone and the second device is a wearable computing device. In some embodiments, a communication channel is established between the second device and the first device using a pairing operation (e.g., using the communication unit).
- At least one additional device is in proximity to the first device, and the alert-output device is selected (e.g., using the device selection unit 1714 ) from the first device, the second device, and the at least one additional device.
- the alert-output device is selected (e.g., using the device selection unit 1714 ) based on a user-provided prioritization.
- a second alert is sent using a device that was not selected as the alert-output device.
- the first and second devices are updated in response to the detected event, but the alert is output only on the selected alert-output device.
- the alert-output device is selected (e.g., using the device selection unit 1714 ) based on a usage of one or more of: the first device and the second device, wherein the usage includes a time of usage, and either the first or second device having a time of usage that is most recent is selected as the alert-output device.
- the alert-output device is selected (e.g., using the device selection unit 1714 ) based on a usage of one or more of: the first device and the second device, wherein the usage includes a time of usage and an amount of usage, and either the first or second device having an amount of usage that is greater over a predetermined time period is selected as the alert-output device.
- the alert-output device is selected (e.g., using the device selection unit 1714 ) based on a usage of one or more of: the first device and the second device, wherein the usage includes a type of usage, and either the first or second device having a type of usage that corresponds to a predetermined usage type is selected as the alert-output device.
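A sketch of the usage-based selection, assuming hypothetical usage metadata in which the most recently used device wins and a user-provided priority breaks ties; the field names and tie-breaking rule are illustrative:

```python
def select_alert_output_device(devices):
    """Pick which of several proximate devices should output the alert.

    `devices` maps a device name to usage metadata:
      last_used -- timestamp of most recent use (larger = more recent)
      priority  -- user-provided rank (1 = highest)
    Both fields and the rule below are illustrative assumptions.
    """
    def key(name):
        info = devices[name]
        # Prefer most recent use; on a tie, prefer the higher user priority.
        return (info.get("last_used", 0.0), -info.get("priority", 99))
    return max(devices, key=key)
```

The non-selected devices can still be updated in response to the event; only the selected device actually outputs the alert.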
- FIG. 18 shows a functional block diagram of an electronic device 1800 configured in accordance with the principles of the various described embodiments.
- the electronic device 1800 can be used to perform the process 750 described above with respect to FIG. 7F .
- the functional blocks of the device are, optionally, implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described embodiments. It is understood by persons of skill in the art that the functional blocks described in FIG. 18 are, optionally, combined or separated into sub-blocks to implement the principles of the various described embodiments. Therefore, the description herein optionally supports any possible combination or separation or further definition of the functional blocks described herein.
- an electronic device 1800 includes an input unit 1802 configured to receive an input from the user, an alert output unit 1804 configured to output an alert, and a processing unit 1810 coupled to the input unit 1802 and the alert output unit 1804 .
- the processing unit 1810 includes a detection unit 1812 configured to cooperate with the input unit 1802 to detect a property of the input provided by the user, and a selecting unit 1814 configured to select a modified alert sequence.
- the input unit 1802 is configured to receive a first input on the device, the first input being below an input threshold.
- the processing unit 1810 is configured to produce a first output (e.g., using the alert output unit 1804 ).
- the first output includes a haptic component for the first input that is coordinated with an audio component for the first input.
- the input unit 1802 is also configured to receive a second input on the device.
- the processing unit 1810 is configured to, in response to detecting the second input (e.g., using the detection unit 1812 ), produce a second output (e.g., using the alert output unit 1804 ).
- the processing unit 1810 is further configured to, in accordance with a determination that the second input is below the input threshold (e.g., using the determining unit 1816 ), produce a second output (e.g., using the alert output unit 1804 ), which includes a haptic component for the second input that is coordinated with an audio component for the second input.
- the processing unit 1810 is further configured to, in accordance with a determination that the second input is above the input threshold (e.g., using the determining unit 1816 ), produce a second output (e.g., using the alert output unit 1804 ) that includes a modified haptic component for the second input.
- the haptic component for the first input is synchronized with the audio component for the first input; if the second input is below the input threshold, the haptic component for the second input is synchronized with the audio component for the second input; and if the second input is above the input threshold, the haptic component for the second input is asynchronous with respect to the audio component for the second input.
- the input threshold includes a speed threshold.
- the synchronous haptic is a discrete haptic output that corresponds to a discrete audio output.
- the asynchronous haptic is a continuous haptic output.
- the input unit 1802 receives a first input and a second input that are rotation inputs. In some embodiments, the input unit 1802 receives a rotation that is a circular motion on a touch-sensitive region of the electronic device. In some embodiments, the input unit 1802 receives a rotation that is rotation of a physical knob integrated into the device. In some embodiments, the alert output unit 1804 produces an audio component that includes a series of click sounds that corresponds to changes in angular position of the knob. In some embodiments, the input unit 1802 receives a first input and a second input that are scrolling inputs for a display of the electronic device. In some embodiments, the alert output unit 1804 produces an audio component that includes a series of click sounds that correspond to a movement through a list of items on the display of the electronic device.
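The per-detent click feedback described in the embodiments above can be sketched as follows. This is an illustrative sketch only; the detent spacing, function name, and output encoding are hypothetical and are not taken from the disclosure:

```python
# Illustrative sketch: emit one coordinated audio click and haptic tap
# per angular detent crossed by a rotation input. All names and the
# detent spacing are hypothetical.

DETENT_DEGREES = 12  # assumed angular spacing between "clicks"

def detent_outputs(start_angle, end_angle):
    """Return one (audio, haptic) output pair per detent crossed."""
    crossed = abs(int(end_angle // DETENT_DEGREES) - int(start_angle // DETENT_DEGREES))
    # Each detent produces a discrete audio click synchronized with a haptic tap.
    return [("click", "tap")] * crossed

outputs = detent_outputs(0, 37)  # rotation crosses detents at 12, 24, and 36
```

A scrolling input could reuse the same mapping with list positions in place of angles.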
- Embodiments of the present disclosure are described above with reference to block diagrams and operational illustrations of methods and the like.
- the operations described may occur out of the order shown in any of the figures. Additionally, one or more operations may be removed or executed substantially concurrently. For example, two blocks shown in succession may be executed substantially concurrently. Additionally, the blocks may be executed in the reverse order.
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Environmental & Geological Engineering (AREA)
- Human Computer Interaction (AREA)
- Business, Economics & Management (AREA)
- Emergency Management (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
Abstract
Embodiments of the present disclosure provide a system and method for providing an output for an electronic device. In certain embodiments, an alert is output in accordance with a current alert mode, which is selected based on one or more environmental conditions. The environmental conditions may be detected using one or more environmental sensors. The alert can optionally include one or more of: an audio component, a haptic component, and a visual component. One or more of the alert components correspond to an aspect of the environmental condition detected by the one or more environmental sensors.
Description
- This application is a division of U.S. patent application Ser. No. 16/900,440, filed Jun. 12, 2020, which is a continuation of U.S. patent application Ser. No. 16/226,535, filed Dec. 19, 2018, which is a continuation of U.S. patent application Ser. No. 15/595,593, filed May 15, 2017, now U.S. Pat. No. 10,210,743, which is a continuation of U.S. patent application Ser. No. 14/503,339, filed Sep. 30, 2014, now U.S. Pat. No. 9,659,482, which claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional Patent Application No. 62/044,657, filed on Sep. 2, 2014, the contents of which are incorporated by reference as if fully disclosed herein.
- Generally, the present disclosure is directed to selecting and providing an alert level for an electronic device. Specifically, the present disclosure is directed to providing an alert that is selected from a set of three or more alert modes based on one or more environmental conditions associated with the electronic device.
- Electronic devices have become ubiquitous in our daily lives. Certain electronic devices, including cell phones, tablet computers, personal digital assistants, and the like, have become common items in the workplace and at home. Some of these electronic devices include an ability to notify a user of a particular item of interest, such as, for example, an incoming phone call, or may otherwise attempt to gain the user's attention through the use of an alarm or signal.
- In many electronic devices, certain qualities of the notification are either fixed or must be manually adjusted by the user to accommodate different environmental conditions. However, depending on the operating environment of the device, the notification may be undesirable or inappropriate. It is with respect to these and other general considerations that embodiments of the present disclosure have been made. Also, although relatively specific problems have been discussed, it should be understood that the embodiments disclosed herein should not be limited to solving the specific problems identified in the background.
- This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description section. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
- Embodiments of the present disclosure provide a system and method for providing an alert in response to detecting an occurrence of an event. In some embodiments, in response to detecting the occurrence of the event, a response to the event is determined based on a current alert mode selected from a set of three or more alert modes. The selection may be based on the one or more environmental conditions. In accordance with a determination that the current alert mode is a first alert mode, a first alert may be output in response to the event. In accordance with a determination that the current alert mode is a second alert mode, a second alert may be output in response to the event. The second alert may be different from the first alert.
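As a rough illustration of the mode-selection logic summarized above, the following sketch picks one of three alert modes from sensed conditions and returns the component amplitudes of the corresponding alert. All thresholds, mode names, and amplitude values are invented for the example; the disclosure does not specify them:

```python
# Hedged sketch: select a current alert mode from a set of three modes
# based on environmental conditions, then return the alert that
# corresponds to the selected mode. All values are illustrative.

def select_alert_mode(ambient_sound_db, ambient_light_lux, activity_level):
    """Pick one of three example alert modes from sensed conditions."""
    if activity_level > 0.8 or ambient_sound_db > 70:
        return "loud"      # strong audio + strong haptic
    if ambient_light_lux < 10 and ambient_sound_db < 40:
        return "silent"    # haptic and visual components only
    return "normal"

ALERTS = {
    "loud":   {"audio": 1.0, "haptic": 1.0, "visual": 1.0},
    "silent": {"audio": 0.0, "haptic": 0.7, "visual": 0.5},
    "normal": {"audio": 0.5, "haptic": 0.5, "visual": 0.5},
}

def respond_to_event(sound_db, light_lux, activity):
    # Determine the response based on the current alert mode.
    mode = select_alert_mode(sound_db, light_lux, activity)
    return ALERTS[mode]   # component amplitudes for the output alert
```

A real implementation could add further modes or blend component amplitudes continuously rather than discretely.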
- Embodiments of the present disclosure provide a system and method for forgoing an alert in response to detecting a level of user activity or receiving a number of event notifications that is below a threshold. In some embodiments, in response to detecting the event, an output of an alert is forgone in accordance with a determination that an activity level exceeds a threshold. The alert is output in accordance with a determination that the activity level does not exceed the threshold. In some embodiments, in response to detecting the event, an alert is output in accordance with a determination that a number of events that have been detected over a predetermined period exceeds a threshold. The output of an alert is forgone in accordance with a determination that the number of events that have been detected over the predetermined period does not exceed the threshold.
- Embodiments of the present disclosure provide a system and method for providing a modified alert sequence in response to detecting an interaction by the user. In some embodiments, a portion of an alert sequence is output. An interaction with the user is detected during the outputting of the portion of the alert sequence, and in response to detecting the interaction, a modified alert sequence is selected. The modified alert sequence is output using the device.
- Embodiments of the present disclosure provide a system and method for selecting a device to output an alert in response to detecting another device that is in proximity to the electronic device. In some embodiments, in response to detecting an event, an alert-output device is selected in accordance with a determination that a second device is in proximity to the first device, and the alert is output on the alert-output device.
- Embodiments of the present disclosure provide a system and method for providing an audio and haptic output that depends on the speed of a user input. In some embodiments, a first input that is below an input threshold is received on the device. In response to detecting the first input, a first output is produced using the device. The first output includes a haptic component for the first input that is coordinated with an audio component for the first input. A second input is received on the device, and in response to detecting the second input, a second output is produced. In accordance with a determination that the second input is below the input threshold, the second output includes a haptic component for the second input that is coordinated with an audio component for the second input, and in accordance with a determination that the second input is above the input threshold, the second output includes a modified haptic component for the second input.
-
FIGS. 1A-B depict example electronic devices that may be used to provide an alert according to one or more embodiments of the present disclosure. -
FIG. 2 depicts an example electronic device being worn by a user according to one or more embodiments of the present disclosure. -
FIG. 3 depicts an example electronic device being worn and another example electronic device being carried by the user according to one or more embodiments of the present disclosure. -
FIG. 4 depicts an example electronic device in an exemplary operating environment according to one or more embodiments of the present disclosure. -
FIG. 5 depicts a user interacting with an example electronic device according to one or more embodiments of the present disclosure. -
FIG. 6 depicts example user input to an electronic device according to one or more embodiments of the present disclosure. -
FIG. 7A depicts a process for determining a response to an event according to one or more embodiments of the present disclosure. -
FIG. 7B depicts a process for determining whether or not to respond to an event based on user activity according to one or more embodiments of the present disclosure. -
FIG. 7C depicts a process for determining whether or not to respond to an event based on a number of events according to one or more embodiments of the present disclosure. -
FIG. 7D depicts a process for outputting a modified alert sequence according to one or more embodiments of the present disclosure. -
FIG. 7E depicts a process for determining an output device according to one or more embodiments of the present disclosure. -
FIG. 7F depicts a process for producing an audio and haptic feedback in response to a user input according to one or more embodiments of the present disclosure. -
FIGS. 8-9 are block diagrams of an example electronic device that may be used with one or more embodiments of the present disclosure. -
FIG. 10 depicts an example acoustic module of an electronic device that may be used with one or more embodiments of the present disclosure. -
FIGS. 11A-B depict an example haptic actuator of an electronic device that may be used with one or more embodiments of the present disclosure. -
FIG. 12 depicts an example crown with an optical encoder that may be used with one or more embodiments of the present disclosure. -
FIGS. 13-18 depict functional block diagrams of electronic devices in accordance with some embodiments. - As discussed above, embodiments of the present disclosure provide a system and method for producing an alert according to an alert mode that is automatically selected based on one or more environmental conditions. The environmental conditions optionally relate to the ambient conditions in which the electronic device is being operated. In some implementations, the electronic device detects or senses the environmental conditions using one or more sensors associated with an electronic device. The output from the one or more sensors is, optionally, used to determine or estimate certain qualities of the environmental conditions or operating environment of the electronic device, including, for example, noise level, light level, motion level, and the like. Based on the one or more environmental conditions, an alert mode is, optionally, selected from a set of three or more alert modes. In response to detecting the occurrence of an event, the device optionally produces an alert in accordance with the selected alert mode that corresponds to the one or more environmental conditions.
- Each alert mode may define a distinct alert that may include multiple components that provide different types of stimuli to the user. For example, an alert mode may define an audio component, a visual component, and/or a haptic component. Additionally, an alert mode may define a relative timing between components. For example, the alert mode optionally defines a slight delay between an audio component and a haptic component to produce a composite stimulus that is more readily detected by the user in some situations. The components of the alert, including the relative timing of the components, can be varied to provide a composite stimulus that is tailored to a particular scenario or set of environmental conditions. In some cases, the alert mode can be automatically selected based on the one or more environmental conditions that are detected.
- In some embodiments, the environmental sensor is a microphone that is configured to detect an ambient sound level. The alert mode may be selected based on the ambient sound level detected by the sensor. In some cases, the selected alert mode includes an audio component that corresponds to or is appropriate for the ambient sound level detected by the sensor. In some cases, the environmental sensor is a motion sensor that is configured to detect an activity level, which is used to select an alert mode. The alert mode that is selected can have an audio component, a haptic component, and/or a visual component that corresponds to the detected activity level. In some cases, the environmental sensor is an image sensor that is configured to detect an ambient light level, which is used to select an alert mode. In some cases, one or more sensors are configured to detect a current battery level, which can, optionally, be used to select an alert mode that conserves power or reduces peak power usage. For example, by separating the timing of the audio and haptic components of an alert, the peak power output may be reduced. Also, by reducing the amplitude of audio and/or haptic alert components, the peak power output may be reduced.
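The peak-power point in the battery-level example above can be illustrated with simple arithmetic: playing the audio and haptic components simultaneously stacks their power draw, while offsetting one component past the other keeps the peak at the larger of the two. The wattages and durations below are invented for the example:

```python
# Illustrative arithmetic only: power figures and durations are made up.
# Overlapping components stack their draw; offsetting the haptic start
# past the end of the audio component avoids the stacked peak.

def peak_power(audio_w, haptic_w, haptic_delay_ms, audio_duration_ms):
    if haptic_delay_ms < audio_duration_ms:   # components overlap in time
        return audio_w + haptic_w
    return max(audio_w, haptic_w)

overlapped = peak_power(0.8, 1.2, 0, 200)    # simultaneous components
separated  = peak_power(0.8, 1.2, 250, 200)  # haptic delayed past the audio
```

Here the separated schedule lowers the peak from 2.0 W to 1.2 W at the cost of a longer composite alert.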
- In some implementations, the alert is tailored to represent a series of events that are detected over a predetermined time. In some embodiments, a series of closely occurring events results in a single, batched alert instead of triggering a series of individual alerts for each event. For example, a series of text messages may be received over a relatively short time period. Instead of producing a separate alert for each text message, an alert output may be held or forgone for a period of time and then a single, batched output may be produced. A combined or batched alert may be useful, for example, when a large number of events occur over a period of time, or when the time between events is very small. In these cases, producing a single alert may be more effective in capturing the user's attention and may also prevent alert fatigue. For example, if a user receives a large number of alerts over a short time period, or receives a nearly continuous stream of alerts, the user may begin to ignore or disregard the alerts.
- In one specific example, the number of events that occur over a period of time is monitored by the device. If the number of events is less than a threshold amount, the device can, optionally, forgo outputting an alert. However, once the number of events exceeds the threshold, a composite or batched alert can, optionally, be produced or output by the device. Events that are monitored include, without limitation, receiving an e-mail, receiving a phone call, receiving a message, and/or receiving a calendar reminder.
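One possible sketch of the batching behavior described above groups event timestamps that fall within a hold window and emits a single batched alert only when the count in the window exceeds the threshold. The window length and threshold are illustrative values, not from the disclosure:

```python
# Sketch of batching closely spaced events into single alerts.
# HOLD_WINDOW_S and THRESHOLD are illustrative values.

HOLD_WINDOW_S = 60
THRESHOLD = 3

def batch_alerts(event_times):
    """Group sorted event timestamps (seconds) into hold windows; return
    one ("batched", count) alert per window whose count exceeds THRESHOLD."""
    alerts, batch = [], []
    for t in event_times:
        if batch and t - batch[0] > HOLD_WINDOW_S:
            # Window closed: emit a single batched alert if warranted.
            if len(batch) > THRESHOLD:
                alerts.append(("batched", len(batch)))
            batch = []
        batch.append(t)
    if len(batch) > THRESHOLD:
        alerts.append(("batched", len(batch)))
    return alerts
```

Four messages arriving within a minute would thus yield one batched alert rather than four separate ones.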
- In some implementations, an alert is conditionally delayed or forgone while a user is active. If, for example, the user is engaged in exercise or heavy activity, the stimulus provided by an alert may not be readily detected. Thus, in some cases it may be advantageous to monitor or detect a user's activity level and, if an event occurs during a period of high or heavy activity, the alert associated with that event is, optionally, delayed or forgone until the activity is below a threshold level. In some implementations, the activity level is based on the movement of the device, as detected by one or more motion sensors, or using one or more biometric sensors that are configured to detect a user's physiological state, such as a pulse or blood oxygenation.
- In some implementations, an alert is a sequence of alert outputs that are configured to escalate by producing a stimulus or output that increases in intensity over time. In some cases, the escalation sequence or progression of the alert is interrupted and modified due to a user interaction with the device. In some embodiments, an escalating alert sequence is output by the device until an input or other interaction from the user is received (e.g., the user may touch the screen of the device or provide another form of input that is detected by the device). In response to receiving the input, the device may select and output a modified alert sequence. In the case that the original alert sequence included an intensifying stimulus, the modified alert sequence may be non-escalating or have a substantially uniform stimulus.
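A minimal sketch of the escalation behavior above: the sequence escalates step by step until an interaction is detected, after which the remaining outputs hold at a uniform intensity. The intensity values and the hold-at-current-level policy are illustrative assumptions:

```python
# Sketch of an escalating alert sequence cut over to a uniform,
# non-escalating sequence when an interaction arrives partway through.
# Intensity values are illustrative.

ESCALATING = [0.2, 0.4, 0.6, 0.8, 1.0]

def alert_sequence(interaction_at=None):
    """Return output intensities; from the interaction index onward,
    hold the intensity reached there instead of continuing to escalate."""
    out = []
    for i, level in enumerate(ESCALATING):
        if interaction_at is not None and i >= interaction_at:
            out.append(ESCALATING[interaction_at])  # modified, uniform tail
        else:
            out.append(level)
    return out
```

Other modified sequences (e.g., dropping the remaining steps entirely) would fit the same description.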
- In some implementations, the device is configured to detect or determine if another device is in proximity to the user when an event is received or detected. The device can conditionally determine which device is appropriate for outputting an alert associated with the event. In some implementations, the alert is output on only one of the devices that are determined to be in proximity to the user. The appropriate device can be selected based on a number of different criteria. For example, the last device that has been used by the user can be selected. Additionally or alternatively, the device that the user is currently interacting with or is predicted to be most likely to capture the user's attention can be selected to output the alert. This feature may be advantageous in reducing the number of alerts that are output and increasing the likelihood that the alert will capture the user's attention.
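The last-used-device criterion mentioned above can be sketched as a simple selection over the devices found in proximity. The record layout, device names, and timestamps are hypothetical:

```python
# Sketch of selecting a single alert-output device from the devices in
# proximity, using most-recent usage as the criterion. Records are
# illustrative.

def select_output_device(devices):
    """devices: list of dicts with 'name' and 'last_used' (timestamp).
    Return the name of the most recently used device."""
    return max(devices, key=lambda d: d["last_used"])["name"]

chosen = select_output_device([
    {"name": "watch", "last_used": 1000},
    {"name": "phone", "last_used": 1450},
])
```

The same selection could instead weight amount of usage over a period, or current interaction, per the other criteria described.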
- In some implementations, the device is configured to produce a stimulus that provides feedback for a user-action or input to the device. This feature may be advantageous for some user input components, such as electronic sensors, that may have few or no moving parts to provide feedback to the user that an input is being received. For example, when a user scrolls through a list of items using a touch screen, an audio click and/or a haptic tap may indicate the progression through the list. This may be more readily perceived by the user or more satisfying than, for example, the visual scrolling of the items alone. In some cases, the stimulus may be adapted to mimic a sound or haptic response that the user may associate with a more traditional mechanical device. In some embodiments, an audio and/or haptic output corresponds to a user input using, for example, an electronic dial or button.
- For example, a user can, optionally, provide an input that is used to drive a function or task on the device, and a synchronized audio and haptic response is used to provide the user with feedback. In some cases, if the feedback corresponds to the speed of the input, it may be possible to exceed the mechanical response of, for example, a haptic actuator used to produce the feedback. Thus, in some cases, it may be beneficial to monitor or detect the speed of the user input and transition the haptic output from, for example, a synchronous to an asynchronous or continuous output when the speed of the input exceeds a threshold.
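The synchronous-to-asynchronous handoff described above can be sketched as a simple threshold check on input speed. The threshold value and the dictionary encoding of the output are illustrative, not taken from the disclosure:

```python
# Sketch of the speed-threshold handoff: below the threshold each input
# step gets a discrete haptic synchronized with an audio click; above
# it, the actuator cannot keep up per-step, so the output switches to a
# continuous (asynchronous) haptic. Threshold value is illustrative.

SPEED_THRESHOLD = 8.0   # e.g. detents or list items per second

def feedback_for(input_speed):
    if input_speed <= SPEED_THRESHOLD:
        return {"haptic": "discrete", "audio": "click", "synchronized": True}
    return {"haptic": "continuous", "audio": "click", "synchronized": False}
```

The threshold would in practice be tied to the actuator's mechanical response time.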
- The implementations described above may be implemented on an electronic device that is configured to produce one or more forms of output to the user.
FIGS. 1A-B illustrate exemplary electronic devices 100 and 130. In some embodiments, as shown in FIG. 1A , the electronic device 100 is a wearable electronic device. In some embodiments, as shown in FIG. 1B , the electronic device 130 is a mobile phone. Although specific examples have been given, additional electronic devices may be used. For example, the electronic device of the present disclosure can include various types of portable computing devices, including tablet computers, laptop computers, time keeping devices, computerized glasses, navigation devices, sports devices, portable music players, health devices, medical devices, and the like. - As shown in
FIG. 1A , the wearable electronic device 100 includes a display 110. The display 110 can, optionally, be formed from a liquid crystal display (LCD), organic light emitting diode (OLED) display, organic electroluminescence (OEL) display, or other type of display device. The display 110 can, optionally, also include or be integrated with a touch sensor configured to accept touch input from the user over an input area. In some implementations, the input area covers the entire area of the display 110 or a portion of the display 110. In some implementations, the touch sensor is able to detect and measure a location and/or a force of a touch in the input area. The electronic device 130 also includes one or more buttons 140 or components for receiving input from the user. - The
display 110 is configured to present various forms of visual output to the user. For example, the display 110 can, optionally, provide a user interface that outputs information generated or received by the wearable electronic device 100. In some instances, the display 110 presents information corresponding to one or more applications that are executed or stored on the electronic device 100 and/or information related to communications received by the electronic device 100. Such applications can, optionally, include e-mail applications, phone applications, calendaring applications, game applications, time keeping applications, and the like. In some implementations, the display 110 also provides a visual output that corresponds to an alert associated with an event detected by or received by the wearable electronic device 100. Example events include, without limitation, receiving an e-mail message, receiving a phone call, receiving a text message, receiving a calendar reminder, and the like. - As shown in
FIG. 1B , the electronic device 130 can, optionally, be a mobile phone or other such computing device. The electronic device 130 includes a display 150 for providing a visual output generated or received by the electronic device 130, as described above with respect to FIG. 1A , including the output of a visual component of an alert. The display 150 can, optionally, also include or be integrated with a touch sensor configured to detect and measure a location and/or a force of touch input provided by the user. - The wearable
electronic device 100 and the electronic device 130 can, optionally, also include other devices or components for producing output, including, without limitation, a speaker, buzzer, or other device configured to generate an audio output. An audio output can be used as part of an alert produced by the device. As previously mentioned, an alert can, optionally, include an audio component as part of a composite alert that includes multiple forms of stimuli, including audio, visual, and/or haptic components. In some implementations, an audio output is also used to provide feedback to the user that is related to an action or function being performed on the device. In some embodiments, described in more detail below, an audio output corresponds to a user input to provide the user with feedback that the input is being received by the device. - The wearable
electronic device 100 and the electronic device 130 can, optionally, also include other components for producing a visual output, including, for example, a light beacon, a light source, a glowing component, a display, or the like. Components that are configured to produce a visual output can be used to provide a visual component of an alert. In some implementations, the output produced by these components is combined with the visual output of the display 110 or 150. - The wearable
electronic device 100 and the electronic device 130 can, optionally, also include a haptic actuator for producing a haptic output that may be perceived as a stimulus by the user. The haptic output can be used as part of an alert produced by the device. As previously mentioned, the haptic output can, optionally, form part of an alert associated with an event detected or received by the device 100 or 130. - The wearable
electronic device 100 can, optionally, also include a band 120 or a strap that is used to connect or secure the wearable electronic device 100 to a user. In some embodiments, the wearable electronic device 100 includes a lanyard or necklace. In some embodiments, the wearable electronic device 100 is secured to or within another part of a user's body. In these and other embodiments, the strap, band, lanyard, or other securing mechanism can, optionally, include one or more electronic components or sensors in wireless or wired communication with an accessory. For example, the band 120 can, optionally, include a haptic actuator that is configured to produce a haptic output that may be sensed on the wrist of the user. In some embodiments, the band 120 also includes a component for producing an audio and/or visual output, similar to as discussed above with respect to the device 100. In some embodiments, the band 120 includes one or more sensors, an auxiliary battery, a camera, or any other suitable electronic component. - The wearable
electronic device 100 and the electronic device 130 can, optionally, also include one or more sensors for monitoring and detecting environmental conditions. Some example sensor components are described in more detail with respect to FIGS. 8-9 . - Although not shown in
FIGS. 1A-B , the wearable electronic device 100 and the electronic device 130 can, optionally, include a processor, a memory, and other components. These components, as well as other components of an exemplary computing device, are described in more detail below with respect to FIGS. 8-9 . Further, the wearable electronic device 100 and the electronic device 130 can also, optionally, include or be integrated with other components, including, for example, a keyboard or other input mechanism. -
FIG. 2 depicts an example electronic device being worn by a user and subjected to one or more environmental conditions according to one or more embodiments of the present disclosure. FIG. 2 may represent an electronic device 100 subjected to one or more environmental conditions that may be relevant to the user's potential interaction with the device 100, particularly an alert or stimulus produced by the device. For example, as shown in FIG. 2 , the device 100 may be subjected to motion 220 due to movement or activity of the user 210. As shown in FIG. 2 , the motion 220 may include movement in more than one direction and may also include a combination of rotational and translational movement. As described above with respect to FIGS. 1A-B , the device 100 can, optionally, include one or more motion sensors that are configured to produce an output that can be used to compute or determine an activity level of the user 210. In some cases, the activity level of the user, as detected by the one or more motion sensors, is indicative of the ability of the user 210 to perceive certain types of stimuli. In some implementations, an appropriate alert mode is selected that corresponds to the user's activity level. In some cases, the alert mode that is selected has one or more components (e.g., audio, haptic, visual) that correspond to the activity level of the user 210. Additionally, or alternatively, the activity level of the user 210 can conditionally be used to forgo or delay the output of an alert until the user 210 is at rest and may be more likely to perceive the alert. - In addition, the
device 100 may be subjected to a particular type of acoustic environmental condition or conditions. For example, if the user 210 is walking through a crowded area or in a noisy environment, the device 100 may be subjected to loud or high acoustic level environmental conditions. Conversely, the device 100 may be subjected to quiet or low acoustic level environmental conditions if, for example, the user 210 is alone in a room or interior space. As described above with respect to FIGS. 1A-B , the device can, optionally, include a microphone or other acoustic sensor that is configured to produce an output that can be used to compute or determine an ambient sound level surrounding the user 210. In some implementations, the ambient sound level detected by the sensor(s) is indicative of the user's 210 ability to perceive certain types of stimuli. In some implementations, an appropriate alert mode is selected that corresponds to the ambient sound level. In some cases, the alert mode that is selected has one or more components (e.g., audio, haptic, visual) that correspond to the acoustic level detected by the sensor(s). - With respect to
FIG. 2 , the device 100 may also be subjected to ambient lighting conditions, which may be detected using one or more optical sensors, as described above with respect to FIGS. 1A-B . For example, the one or more optical sensors are able to detect low level or dark lighting conditions, which may be consistent with the user 210 being located in a movie theater, presentation, or other quiet area. Similarly, the one or more optical sensors are also able to detect if the device 100 is being subjected to sunlight conditions, which may be consistent with an outdoor setting or open public area. In some implementations, an appropriate alert mode is selected based on the ambient lighting conditions. In some cases, the alert mode that is selected has one or more components (e.g., audio, haptic, visual) that correspond to the light level detected by the sensor(s). - Additionally, the output from one or more types of sensors can be combined to detect an environmental condition or set of conditions. Specifically, in some embodiments, the light sensor(s), acoustic sensor(s), and/or motion sensor(s) are used to estimate or detect one or more environmental conditions. In some circumstances, the activity level of the user can be more accurately determined by using the output of the one or more motion sensors with the output of the acoustic sensor. More specific examples are provided below with respect to
FIGS. 7A-B. -
FIG. 3 depicts an example electronic device being worn and another example electronic device being carried by the user and subjected to one or more environmental conditions according to one or more embodiments of the present disclosure. In some embodiments, multiple devices are located proximate to the user 210 at the same time. As shown in FIG. 3, a wearable electronic device 100 and a mobile phone 130 are located proximate to the user. Additionally, a laptop computer, desktop computer, or other electronic device may be located in the near-immediate vicinity. In some implementations, one or more of the devices (100, 130) are used to determine the environmental conditions surrounding the user 210. In some cases, if more than one device is proximate to the user 210, the devices automatically pair using Bluetooth or a similar wireless communications protocol. - With respect to
FIG. 3, the devices 100, 130 may share sensor output with each other. For example, the motion sensor output of the wearable device 100 can be used in combination with the motion sensor output of the other device 130 to compute or determine a more accurate estimate of the activity level of the user 210. Additionally, in some embodiments, the optical sensor output from each device 100, 130 can be compared to determine, for example, that the device 130 is located in the pocket of a user rather than in a dark room. Similarly, in some embodiments, the output from the acoustic sensors (e.g., microphones) of the respective devices 100, 130 is combined to determine the acoustic conditions surrounding the user 210. In some cases, one output device is selected or designated to output an alert, thereby preventing multiple alerts from being sent to the user 210 at or near the same time. - Additionally, with respect to
FIG. 3, if there are multiple electronic devices proximate to or in the immediate vicinity of the user 210, it may be undesirable to output an alert on each device separately when an event is detected. Thus, in some cases, it may be advantageous to determine or identify a single device for outputting an alert. For example, the device that is most likely to be perceived by the user can conditionally be selected or identified as the output device. Specific examples of this functionality are described below with respect to FIG. 7E. -
FIG. 4 depicts an example electronic device in an exemplary operating environment according to one or more embodiments of the present disclosure. As shown in FIG. 4, the device 130 is placed on a desk, table, or other surface when, for example, the device 130 is not in use. In some implementations, the one or more sensors are used to detect this scenario, which may correspond to a condition where the user is not proximate to the device or may not readily perceive a stimulus or alert output by the device 130. In some embodiments, this scenario or environmental condition is detected using one or more motion sensors, which are used to determine a static activity level. The output from other sensors, including the microphone and the one or more optical sensors, can conditionally also be used to determine that the device 130 is subjected to a static activity level or environmental conditions consistent with a device that is not in use. -
FIG. 5 depicts a user interacting with an example electronic device according to one or more embodiments of the present disclosure. As shown in FIG. 5, the user 210 may interact with the device by, for example, making a selection on a touch-sensitive surface of the device 100. In particular, the user may actively interact with the device by touching or pressing a touch-sensitive display of the device 100. In some cases, the device 100 is able to sense that the user is looking at the display, and therefore, at least passively interacting with the device. Passive interaction may be detected, for example, using one or more optical sensors to detect the position and movement of the user's head. In some cases, the one or more optical sensors are configured to sense the location and movement of the user's eye, which may be consistent with the user 210 reading or watching the display of the device 100. A passive interaction may also be detected, for example, using one or more touch sensors that detect the user's hand position or grip on the device. In some cases, an active mode is selected which corresponds to a scenario or condition in which the user is either actively or passively interacting with the device. Additionally, in some cases, active or passive interaction from the user may be used to interrupt an escalating alert sequence and output a modified alert sequence that is non-escalating or otherwise different. -
FIG. 6 depicts example user input to an electronic device according to one or more embodiments of the present disclosure. As previously discussed, the device 100 can optionally be configured to output a stimulus in response to a user input on a device. For example, as shown in FIG. 6, the user may provide touch input 615 on a touch display 110 or other touch-sensitive surface of the device. As shown in FIG. 6, a two-dimensional scrolling or panning input may be provided by moving the touch along one or more directions on the touch-sensitive surface of the device. In some implementations, an audible audio output, such as a beep or click, is produced as items or objects are indexed on the display 110 in response to the user input 615. In some implementations, a haptic output, such as a tap or bump, is coordinated with the audio output. The audio and haptic output can conditionally be synchronized. - Similarly, in some implementations, an audio and/or haptic output is produced in response to a
rotational user input 612 provided using the crown 610 or knob. In some embodiments, the crown 610 is operatively coupled to a position sensor, such as an optical encoder, that is configured to produce an output signal that corresponds to the rotational input provided by the user. An example crown and position sensor are described in more detail below with respect to FIG. 12. - In some embodiments, an audible click and a haptic tap are output by the device for a predetermined amount of movement of the
crown 610 or knob. For example, the device may conditionally produce an output for every 5 degrees of movement of the crown 610. As explained in more detail below with respect to FIG. 7F, the output may depend, at least in part, on the speed or rate at which the user input (615, 612) is provided. For example, if the speed of the user input (615, 612) exceeds a certain threshold, the device used to produce the haptic output may not be able to keep up. In particular, the response time of the haptic device may be longer than the time between haptic outputs. To help address this limitation, in some cases, the haptic output can be configured to change from a synchronous output to an asynchronous output as the user input exceeds a certain threshold. - Specific example processes for producing an output using a device are described below with respect to
FIGS. 7A-F. In accordance with the following examples, one or more of the devices described above with respect to FIGS. 1A-B can be used. Additionally, the device(s) can optionally include internal components or elements consistent with FIGS. 8-9, described in more detail below. While certain processes and device hardware implementations are provided by way of example, it is not intended that the description be limited to those example embodiments. -
FIG. 7A illustrates an example process 700 for determining a response to an event according to one or more embodiments of the present disclosure. As discussed above, it may be advantageous for a device to produce an output that corresponds to or has been adapted for one or more environmental conditions. In particular, a device can be configured to produce an alert or stimulus that is formulated to capture the attention of the user. However, the effectiveness of the alert may depend, in part, on one or more environmental conditions, which may change over time or with user activity. Thus, it may be beneficial to detect the present state of one or more environmental conditions and select an output having a stimulus that corresponds to the detected environmental condition(s). The operations of process 700 may be performed using, for example, the example devices described above with respect to FIGS. 1A-B. - In
operation 701, an event is detected by the device. In some implementations, the device detects the occurrence of an event by, for example, receiving a notification or a message related to the event. In some embodiments, the device receives a notification or message that the user has received an incoming e-mail message, text message, telephone call, voicemail message, and the like. In some implementations, the notification or message is received from an external device or service via a wired or wireless communication network. Alternatively, in some embodiments, an event is detected by, for example, receiving a notification or message from an application or program that is being executed on the device. For example, a clock alarm, clock timer, calendar scheduler, or similar program may trigger an event that is detected by the device. In other examples, an event is triggered in relation to a wide range of activities, including satisfying a personal health goal, reaching a geographic location, or meeting some other criteria or condition. In some implementations, when the device is a personal health device capable of measuring one or more physiological functions, an event corresponds to a physiological function exceeding a threshold or satisfying a condition. For example, an event can conditionally be triggered in response to reaching a target heart rate, oxygenation level, or similar physiological condition. - In
operation 702, a response to the event is determined. In some implementations, the response to the detected event is determined based on a current alert mode. In some embodiments, the current alert mode is selected based on one or more environmental conditions. In order to assure the relevancy of the selection, in some implementations, the one or more environmental conditions are detected concurrently with the detection of the event and/or the selection of the current alert mode. However, in some implementations, not all of the environmental conditions are present or occurring exactly when the detection and/or selection occurs. - With respect to
operation 702, in some implementations, the current alert mode is selected from a set of three or more alert modes. As previously discussed, in some implementations, an alert includes multiple forms of stimuli, including, for example, audio, haptic, or visual components. In one embodiment, each of the three or more alert modes includes one or more of: an audio component, a haptic component, and a visual component. Additionally, the components (audio, haptic, visual) can conditionally vary in intensity and in form depending on the alert mode. - In some implementations, the alert mode is automatically selected based on the one or more environmental conditions that are detected by the device. In some cases, the current alert mode is selected prior to detecting the occurrence of the event. For example, the relevancy of the alert mode may be checked and the current alert mode may be selected or confirmed according to a regularly repeating interval. Additionally, in some implementations, the current alert mode is selected or confirmed at or near the same time that the event occurs. For example, the occurrence of an event can be used to trigger the selection of the current alert mode.
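As a concrete illustration of the selection step in operation 702, the logic above can be sketched in Python. The mode names, the decibel thresholds, and the component values are illustrative assumptions, not taken from the disclosure:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class AlertMode:
    name: str
    audio_volume: Optional[float]      # None means the mode has no audio component
    haptic_intensity: Optional[float]  # None means the mode has no haptic component
    visual: bool

# A hypothetical set of three or more alert modes (values are invented)
MODES = [
    AlertMode("loud", audio_volume=1.0, haptic_intensity=1.0, visual=True),
    AlertMode("normal", audio_volume=0.5, haptic_intensity=0.5, visual=True),
    AlertMode("quiet", audio_volume=None, haptic_intensity=0.3, visual=True),
]

def select_current_alert_mode(ambient_db: float) -> AlertMode:
    """Select the mode whose components correspond to the detected conditions.

    This could run on a regularly repeating interval, or be triggered by
    the occurrence of an event, as described above.
    """
    if ambient_db > 70.0:
        return MODES[0]
    if ambient_db > 40.0:
        return MODES[1]
    return MODES[2]
```

A "quiet" mode with `audio_volume=None` models an alert mode in which the audio component is eliminated entirely.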
- In some implementations, environmental conditions relate to the physical environment in which the device is being operated or conditions to which the device is subjected. Environmental conditions that are used to select the current alert mode can include, without limitation, acoustic noise, user activity, device motion, device orientation, ambient light, and others. Environmental conditions generally do not include specific alert settings established by a user, such as quiet hours or a silent mode. Additionally, environmental conditions may not, in some cases, include the geographic location of the device.
- In some embodiments, the environmental conditions are monitored and detected using one or more of the sensors associated with the device. Example environmental sensors include, without limitation, accelerometers, gyroscopes, tilt sensors, microphones, light sensors, image sensors, proximity sensors, and the like. Example environmental sensors are described above with respect to
FIGS. 1A-B, above, and FIGS. 8-10, below. It is not necessary that each of the sensors be located on the device. As mentioned previously with respect to FIG. 3, multiple devices that are located in the same vicinity or proximate to each other can be configured to share sensor data via a data link or other communication scheme. - With respect to each of the following examples, one or more environmental sensors can be configured to detect particular environmental conditions, and the output of those sensors is used to select an alert mode having elements or components that correspond to the detected conditions. In some cases, the environmental sensors are used to compute a changed or changing environmental condition and automatically provide for different alert outputs for the same type of event.
- In some embodiments, the environmental sensor is a microphone that is configured to detect ambient acoustic conditions. The microphone can optionally be integrated with the device and configured to record audio signals or input over a sample time period. In some implementations, the collected audio data is stored and further analyzed to compute or determine an ambient acoustic level. In some embodiments, the collected audio data is filtered and processed to remove audio input that may correlate to the user's voice. The audio data can optionally also be processed to determine an average or representative acoustic level over a given period of time. In some instances, the audio data from multiple sample time periods is used to compute or determine the acoustic level.
- In some embodiments, the acoustic level is used to select an appropriate alert mode as the current alert mode, in accordance with
operation 702. In one embodiment, the alert mode that is selected includes an audio component that corresponds to the acoustic level determined using the environmental sensors. For example, if the acoustic level represents a loud or noisy ambient acoustic environmental condition, a first alert mode can conditionally be selected as the current alert mode, the first alert mode having an audio component with an elevated volume or intensity (as compared to other alert modes). Similarly, if the acoustic level represents a condition that is less loud or noisy, a second alert mode can conditionally be selected as the current alert mode, the second alert mode having an audio component with a volume or intensity that is reduced with respect to the audio component of the first alert mode. Additional alert modes may be similarly defined and selected according to an audio component that may correspond to a detected ambient acoustic noise level. - In some implementations, the current alert mode includes or defines another component that corresponds to the detected acoustic level. In some embodiments, a third alert mode is selected as the current alert mode, the third alert mode having a haptic component that corresponds to the acoustic noise level. For example, the intensity or energy of the haptic output may be stronger in accordance with a loud or noisy acoustic level. Similarly, a haptic output may have an intensity or energy that is weaker or reduced in accordance with a quiet or less noisy acoustic level. In some implementations, an alert mode is selected as having a visual component that corresponds to the detected acoustic level. For example, the alert mode may include a visual component, such as a beacon or strobe having an intensity or frequency that corresponds to the detected acoustic level.
In some instances, one or more components (audio, haptic, visual) are used in conjunction with another component to produce an appropriate level of stimulation to the user, depending on the environmental conditions.
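The correspondence between the detected acoustic level and each component (audio volume, haptic energy, strobe frequency) might be modeled as a simple scaling; the decibel range and the per-component formulas below are assumptions made for illustration:

```python
def components_for_acoustic_level(ambient_db: float) -> dict:
    """Scale the audio, haptic, and visual components with the ambient level."""
    # Normalize roughly between a quiet room (~30 dB) and a loud space (~90 dB)
    x = min(max((ambient_db - 30.0) / 60.0, 0.0), 1.0)
    return {
        "audio_volume": 0.2 + 0.8 * x,   # louder surroundings -> louder audio
        "haptic_energy": 0.3 + 0.7 * x,  # stronger vibration when noisy
        "strobe_hz": 1.0 + 3.0 * x,      # faster visual strobe when noisy
    }
```

All three components grow together with the ambient level, so the combined stimulation stays proportionate to the conditions.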
- In one example, a user is wearing a wearable electronic device, in accordance with the embodiments described above with respect to
FIG. 1A. In one scenario, the user and the device are subjected to a noisy environment, such as a gymnasium or workout room. In accordance with some embodiments, the device detects the noisy environmental condition using the microphone, which is used to determine or compute an ambient sound level. In response to a high ambient sound level, the device selects an alert mode having an audio component with an increased volume (example audio component) that corresponds to the high ambient sound level. Additionally or alternatively, in some implementations the device selects an alert mode having an increased haptic vibration (example haptic component) and/or a visual strobe (example visual component). The device then outputs an alert in accordance with the selected alert mode. - In some embodiments, the environmental sensor includes one or more motion sensors that are configured to detect device motion and/or user activity. The one or more motion sensors can optionally be integrated or associated with the device and may be configured to record motion and/or activity over a sample time period. Example motion sensors include, for example, an accelerometer, gyroscope, tilt sensor, and the like, as discussed above with respect to
FIGS. 1A-B. In some implementations, the collected motion data is stored and further analyzed to compute or determine an activity level. In some embodiments, the collected motion data is filtered and processed to determine a discrete number of movements (translational or rotational) over a given period of time. In some cases, the number of movements is used to compute or determine an activity level. Additionally or alternatively, the intensity of the movements over a period of time is used to compute or determine an activity level. - In some cases, the activity level corresponds to or represents the activity of the user. Thus, a high activity level may represent an environmental condition in which the user may be exercising or moving rapidly. Similarly, a low activity level may represent an environmental condition in which the user is at rest or sedentary.
- In some embodiments, the activity level is used to select an appropriate alert mode as the current alert mode, in accordance with
operation 702. In one embodiment, the alert mode that is selected includes an audio component that corresponds to the activity level determined using the environmental sensors. For example, if the activity level represents a highly active environmental condition, a first alert mode may be selected as the current alert mode, the first alert mode having an audio component with an elevated volume or intensity (as compared to other alert modes). Similarly, if the activity level represents a condition that is less active, a second alert mode may be selected as the current alert mode, the second alert mode having an audio component with a volume or intensity that is reduced with respect to the audio component of the first alert mode. Additional alert modes may be similarly defined and selected according to an audio component that may correspond to a detected activity level. As in the previous example, the alert mode that is selected may have other components (e.g., haptic, visual) that also correspond to the detected activity level. - In some embodiments, the environmental sensor includes one or more optical sensors that are configured to detect optical or lighting environmental conditions. The one or more optical sensors can optionally be integrated with the device and configured to record light quantity or lighting conditions over a sample time period. Example optical sensors include, for example, an ALS, an image sensor, a proximity sensor, and the like, as discussed above with respect to
FIGS. 1A-B. In some implementations, the collected optical data is stored and further analyzed to compute or determine an ambient light level. In some embodiments, the collected optical data is filtered and processed to determine an average amount of light over a given period of time, which can conditionally be used to compute or determine the ambient light level. Additionally or alternatively, the intensity of light received over a period of time can be used to compute or determine a light level. - In some cases, the light level corresponds to or represents the setting in which the device is being operated. For example, a bright or high light level may represent an outdoor or public operating environment. In some cases, bright environmental conditions indicate that an alert may be more intense because the user is outdoors. Conversely, a low or dim light level may correspond to or represent an environmental condition in which the user is indoors or in a more private operating environment. For example, a low light level may correspond to a user being located in a movie theater or presentation. In some cases, a low light level may indicate that an alert should be less intense to avoid disrupting indoor activities.
- In some embodiments, the light level is used to select an appropriate alert mode as the current alert mode, in accordance with
operation 702. In one embodiment, the alert mode that is selected includes an audio component that corresponds to the light level determined using the environmental sensors. For example, if the light level represents a brightly lit environmental condition, a first alert mode is selected as the current alert mode, the first alert mode having an audio component with an elevated volume or intensity (as compared to other alert modes). Similarly, if the lighting level represents a condition that is less bright, a second alert mode is selected as the current alert mode, the second alert mode having an audio component with a volume or intensity that is reduced with respect to the audio component of the first alert mode. Additional alert modes can conditionally be similarly defined and/or selected according to an audio component that may correspond to a detected light level. As in the previous examples, the alert mode that is selected can optionally have other components (e.g., haptic, visual) that also correspond to the detected light level. - In some embodiments, the environmental sensor includes a battery power sensor that is configured to detect a current battery level. In some implementations, the battery power sensor includes a circuit integrated into the device that is configured to measure an electrical property of the battery (e.g., voltage, current, impedance) that may be indicative of the remaining battery power. Similar to the examples provided above, an alert mode can conditionally be selected as the current alert mode based on a correspondence between the battery power level and one of the components (audio, haptic, visual) of the alert.
- In one specific example, based on the current battery level, the alert mode is selected based on the power that may be consumed during an alert output. For example, if the battery level is low (e.g., 5%, 10%, or 15% of total battery power), an alert mode is selected that uses less power as compared to some other alert modes. One technique may be to eliminate or reduce the intensity of alert components that consume a large amount of energy. In some implementations, an alert mode having no haptic component is selected based on a low battery level. Additionally or alternatively, the output of the components is staggered or delayed in some alert modes in order to reduce peak power usage.
- In accordance with each of the examples provided above, the alert mode that is selected can optionally include a variety of component combinations. In various alert modes, a component can optionally be eliminated. For example, a first alert mode includes a first haptic component and a first visual component. A second alert mode includes a second haptic component and no visual component. Additionally, one or both of the components can conditionally vary depending on the alert mode. In some embodiments, a first alert mode includes a first audio component and a first haptic component, and a second alert mode includes a second audio component and a second haptic component, where the first audio component and first haptic component are different than the second audio component and the second haptic component, respectively. Additionally, in some implementations, an alert mode includes only a visual component. For example, a first alert mode includes no audio component and no haptic component, and includes only a visual component such as a notification displayed on a display of the device.
- Additionally, as previously discussed, in some implementations an alert mode includes two or more components that are staggered or offset by a delay. In some embodiments, a first alert mode includes a first audio component and a first haptic component offset by a first delay. A second alert mode includes the first audio component and the first haptic component, but offset by a second delay that is different than the first delay. The difference in delay between the alert modes may depend, in part, on the likelihood that the user will be able to perceive a haptic output given certain environmental conditions. In some implementations, the delay between components is increased based on the likelihood that the user is distracted or already receiving a high level of stimulation. In some implementations, the delay between alert components is increased if the activity level and/or ambient acoustic levels are high. In particular, a haptic component can conditionally precede an audio component by a short offset. In some implementations, the haptic component provides a priming stimulus that may increase the likelihood that the audio stimulus will be perceived by the user.
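The haptic-then-audio staggering might be scheduled as below; the base delay and the increments applied for high activity or high ambient noise are invented for illustration:

```python
from typing import List, Tuple

def component_schedule(activity: float, ambient_db: float) -> List[Tuple[float, str]]:
    """Return (offset_seconds, component) pairs for one alert.

    The haptic component fires first as a priming stimulus; the audio
    component follows after a delay that grows when the user is likely
    distracted (high activity) or in a noisy environment.
    """
    delay = 0.1  # base offset between haptic and audio
    if activity > 0.5:
        delay += 0.2
    if ambient_db > 70.0:
        delay += 0.2
    return [(0.0, "haptic"), (delay, "audio")]
```

The returned schedule can be handed to whatever timer or output loop the device uses; only the relative offsets matter here.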
-
FIG. 7B depicts a process 710 for determining whether or not to respond to an event based on user activity according to one or more embodiments of the present disclosure. When a level of user activity is high, it may be difficult for a user to perceive an alert associated with an event. Additionally, even when an alert is perceived by a user engaged in heavy activity, the user may not be as likely to respond to the alert until the activity is complete. Thus, in some implementations, it may be advantageous to delay or forgo the output of an alert until the user has completed an activity or is at rest. The operations of process 710 may be performed using, for example, the example devices described above with respect to FIGS. 1A-B. - In
operation 711, an event is detected by the device. In some implementations, the device detects the occurrence of an event by, for example, receiving a notification or a message related to the event. As in the previous example (operation 701), in some implementations, the device receives a notification or message that the user has received an incoming e-mail message, text message, telephone call, voicemail message, and the like. In some implementations, an event is detected by, for example, receiving a notification or message from an application or program that is being executed on the device. As in the example provided above, an event can conditionally be triggered in relation to a wide range of activities or functions, including satisfying a personal health goal, reaching a geographic location, reaching a target heart rate, reaching an oxygenation level, or satisfaction of other criteria. - In
operation 712, an activity level is determined. In some implementations, one or more sensors are used to detect the motion of the device, which is used to determine an activity level. Similar to the example provided above with respect to FIG. 7A, one or more motion sensors can optionally be integrated into or associated with the device and may be configured to record motion and/or activity over a sample time period. Example motion sensors include, for example, an accelerometer, gyroscope, tilt sensor, and the like, as discussed above with respect to FIGS. 1A-B. In some implementations, the collected motion data is stored and further analyzed to compute or determine an activity level. In some embodiments, the collected motion data is filtered and processed to determine a discrete number of movements (translational or rotational) over a given period of time. In some cases, the number of movements is used to compute or determine an activity level. Additionally or alternatively, the intensity of the movements over a period of time is used to compute or determine an activity level. As discussed in the earlier example, the activity level may correspond to or represent the activity of the user. Thus, a high activity level may represent an environmental condition in which the user may be exercising or moving rapidly. Similarly, a low activity level may represent an environmental condition in which the user is at rest or sedentary. - In operation 713, a determination is made with regard to the activity level exceeding a threshold. The threshold may correspond to the method used to determine the activity level in
operation 712. For example, if the activity level is based on the number of motion events over a period of time, the threshold may similarly represent a threshold number of motion events over a similar period of time. If the activity level is based, in part, on the intensity of the activity, the threshold may also represent a threshold level that is based, at least in part, on the intensity of the activity. In some cases, the threshold is customized based on an average level of user activity or device motion. For example, if a user is more active, then the threshold can optionally be set higher than for a user that is less active. - In
operation 714, in accordance with a determination that an activity level exceeds a threshold, the outputting of an alert is forgone or delayed. In some implementations, if the activity level is high, the output of an alert is delayed until a later time when the activity level may be lower. For example, the activity level can optionally be periodically determined or checked over a predetermined time period. In some instances, the output of the alert is further delayed or forgone as long as the activity level exceeds a low-activity threshold, which can optionally be the same as or different than the original threshold. - In
operation 715, in accordance with a determination that the activity level does not exceed the threshold, the alert is output. In some implementations, the output is provided in accordance with one or more of the other examples provided herein. For example, the alert that is provided can optionally include one or more alert components (audio, haptic, visual) that correspond to the environmental conditions associated with the device. Additionally, the alert that is output can optionally be a fixed alert that is not dependent on one or more environmental conditions. - In some cases, the output of the alert is forgone or delayed until a subsequent criterion is satisfied. As previously mentioned, in some implementations, the output is forgone or delayed until the activity level drops below a low-activity threshold, which can optionally be the same as or different than the original threshold. Additionally, in some implementations, the output is forgone or delayed until a predetermined time period has passed since the first time the output of the alert was forgone. For example, the device can be configured to wait until an activity level drops below a threshold, and if the level does not drop over the predetermined period of time, the alert is output anyway.
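Operations 713 through 715 can be summarized as a single gating decision; the 60-second maximum deferral is an assumed value, standing in for the "predetermined time period" above:

```python
def should_output_alert(activity: float, threshold: float,
                        seconds_deferred: float, max_defer_s: float = 60.0) -> bool:
    """Output when activity has dropped below the threshold, or when the
    alert has already been deferred for the maximum allowed time."""
    return activity <= threshold or seconds_deferred >= max_defer_s
```

A caller would evaluate this periodically, passing how long the alert has already been held, so a persistently active user still receives the alert eventually.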
- In some embodiments, if multiple alert outputs that are associated with multiple events have been forgone due to a high activity level, the alerts are combined into a single alert when a determination is eventually made to output an alert. For example, if multiple events occur during a period of high activity, the user will be notified by a single (combined) alert that represents all of the alerts that were forgone during the period of high activity.
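The delay, timeout, and combine behavior described above can be sketched as follows. This is a minimal illustrative sketch, not the disclosed implementation; the class name, threshold values, and the periodic-check interface are all assumptions made for the example:

```python
from dataclasses import dataclass, field

@dataclass
class AlertScheduler:
    """Illustrative sketch of process 710: forgo or delay alerts while
    the activity level is high, then output a single combined alert."""
    activity_threshold: float        # high-activity threshold of operation 712
    low_activity_threshold: float    # may equal or differ from the original
    max_delay_checks: int = 10       # output anyway after this many checks
    pending: list = field(default_factory=list)

    def on_event(self, event, activity_level):
        """Return alerts to output now; queue the event if activity is high."""
        if activity_level > self.activity_threshold:
            self.pending.append(event)   # forgo/delay the alert
            return []
        return self._flush(event)

    def on_periodic_check(self, activity_level, checks_elapsed):
        """Re-check activity; flush pending alerts once activity drops below
        the low-activity threshold or the predetermined wait period expires."""
        if not self.pending:
            return []
        if (activity_level <= self.low_activity_threshold
                or checks_elapsed >= self.max_delay_checks):
            return self._flush()
        return []

    def _flush(self, event=None):
        events = self.pending + ([event] if event is not None else [])
        self.pending = []
        # multiple forgone alerts are combined into a single alert
        return [("combined", events)] if len(events) > 1 else [("single", events)]
```

For example, two events arriving during a workout (high activity) produce no output, and a single combined alert is emitted at the next periodic check once the user is at rest.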
- One example implementation of
process 710 includes user activity associated with an exercise or workout routine. In this scenario, the device receives or detects one or more events associated with one or more physiological conditions while the user is performing an exercise. When the user is at rest, due to a break between exercise sets or when the workout is complete, the device, in some implementations, automatically detects the reduced activity and outputs the one or more alerts associated with the events that occurred during the exercise. In another example, one or more alerts are delayed or forgone while a user is typing or performing another activity that introduces high-frequency movement near the device. Delaying or forgoing the alert output may increase the user's perception of, for example, a haptic component output that might otherwise be masked by the high-frequency movement. In another example, the decision to forgo the alert is also based on other environmental conditions. For example, the output of an alert, in some implementations, is forgone or delayed while a user is walking in a busy or loud environment. When the user stops walking at, for example, a traffic light, the device automatically outputs the previously forgone alert. -
FIG. 7C depicts a process 720 for determining whether or not to respond to an event based on a number of events according to one or more embodiments of the present disclosure. In some cases, it may be advantageous to group or batch together multiple alerts that are associated with multiple events that may be related or that may occur over a short period of time. In some embodiments, a series of closely occurring events may result in a single, batched alert instead of triggering a series of individual alerts for each event. A combined or batched alert may be useful, for example, when a large number of events occur over a period of time, or when the time between events is very small. In another example, it may be determined that two or more events may be related and, thus, only a single alert is warranted. In these cases, producing a single alert may be more effective in capturing the user's attention and may prevent alert fatigue, as discussed previously. The operations of process 720 may be performed using, for example, the example devices described above with respect to FIGS. 1A-B. - In
operation 721, an event is detected. In some implementations, the device detects the occurrence of an event by, for example, receiving a notification or a message related to the event or receiving a notification or message from an application or program that is being executed on the device. As in the previous examples, in some implementations, the device receives a notification or message that the user has received an incoming e-mail message, text message, telephone call, voicemail message, and the like. As in the examples provided above, an event can optionally be triggered in relation to a wide range of activities or functions, including satisfying a personal health goal, reaching a geographic location, reaching a target heart rate, reaching an oxygenation level, or satisfaction of other criteria. - In
operation 722, the number of events is determined. In some embodiments, the device is configured to wait a predetermined period of time after detecting an event. In some implementations, the number of events (requiring an alert output) that occur over the predetermined amount of time is used to compute the number of events for operation 722. In some implementations, the device is configured to detect and count the number of events received over a predetermined, regularly repeating time period. Thus, in some cases, the number of events includes one or more events that occurred prior to the event detected in operation 721. A variety of other techniques can optionally be used to determine the number of events that are received by the device over a period of time. - With regard to
operation 722, in some implementations, the number of events that is determined depends on the type of event that occurred, a person associated with the event, and/or the content associated with the event. For example, in some implementations, only text messages are counted pursuant to operation 722. Similarly, in some implementations, only events associated with communications (e.g., e-mail, text messages, telephone calls) are counted pursuant to operation 722. Additionally or alternatively, in some implementations, only events that are associated with the same sender or group of senders are counted for operation 722. Similarly, in some implementations, events that share similar content or subject matter are counted for operation 722. - In
operation 723, a determination is made as to whether the number of events exceeds a threshold. In some instances, the threshold is a fixed threshold that is determined by the device or device settings. In some instances, the threshold is configurable by the user. In operation 725, in accordance with a determination that the number of events that have been detected over the predetermined period does not exceed the threshold, the output of the alert is forgone. - In
operation 724, in accordance with a determination that a number of events that have been detected over a predetermined period exceeds a threshold, an alert is output. In some cases, if a previous alert associated with a previous event has been forgone, the alert that is output is based, at least in part, on the previously occurring event. In some cases, the alert includes information indicative of the event and of one or more events occurring prior to the event. In some implementations, the alert includes a visual component that includes a list of the subject lines or senders for one or more e-mail messages that were received during the predefined time period. - With respect to
operation 724, in some implementations, the alert includes an indication of the number of events that have been batched or combined in the alert output. In some implementations, the strength of the alert (e.g., intensity of the audio or haptic component) is based on the number of events that have forgone alerts. For example, combining more events into a single alert may result in a more prolonged or intense alert output. Additionally, in some implementations, the strength of the alert output is based, in part, on the frequency and/or type of events that have had an alert forgone. - In some embodiments of
process 720, events of different types are batched together. For example, if the threshold is 3 messages in 2 minutes, receiving an e-mail, an SMS message, and a phone call within 2 minutes will trigger an alert. In some embodiments, only events of the same type are batched together. For example, if the threshold is 3 messages in 2 minutes, receiving an e-mail, an SMS message, and a phone call within 2 minutes will not trigger an alert, while receiving 3 e-mails within 2 minutes will trigger an alert. In accordance with the description of process 720, a variety of events and criteria may be counted and used to forgo and/or eventually output an alert. -
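The counting and batching logic of process 720 can be sketched as follows. This is an illustrative sketch only; the class name, the `same_type_only` flag, and the use of a rolling time window are assumptions, with the threshold and window mirroring the 3-messages-in-2-minutes example:

```python
from collections import deque

class EventBatcher:
    """Illustrative sketch of process 720: forgo individual alerts and
    emit one batched alert when enough matching events arrive in a
    rolling time window."""

    def __init__(self, threshold, window_seconds, same_type_only=False):
        self.threshold = threshold
        self.window = window_seconds
        self.same_type_only = same_type_only
        self.events = deque()          # (timestamp, event_type) pairs

    def on_event(self, timestamp, event_type):
        """Record an event; return a batched alert once the count within
        the window reaches the threshold, else None (alert forgone)."""
        self.events.append((timestamp, event_type))
        # drop events that fell out of the rolling window
        while self.events and timestamp - self.events[0][0] > self.window:
            self.events.popleft()
        if self.same_type_only:
            count = sum(1 for _, t in self.events if t == event_type)
        else:
            count = len(self.events)
        if count >= self.threshold:
            batched = [t for _, t in self.events
                       if not self.same_type_only or t == event_type]
            self.events.clear()
            return {"count": len(batched), "events": batched}
        return None
```

With `same_type_only=False`, an e-mail, an SMS, and a call within the window produce one combined alert; with `same_type_only=True`, only three events of the same type do.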
FIG. 7D depicts a process 730 for outputting a modified alert sequence according to one or more embodiments of the present disclosure. In some implementations, an alert is configured to escalate by producing a stimulus or output that increases in intensity over time. In some cases, the escalation sequence or progression of the alert is interrupted and modified as a result of a user interaction with the device. In some cases, an interaction from the user may indicate that the user's attention is already focused on the device and that further escalation of the alert sequence may not be necessary. In some embodiments, an escalating alert sequence is output by the device until receiving an input or an interaction from the user. In some embodiments, the device detects passive interaction, such as detecting or estimating when a user is reading content on the display. The operations of process 730 may be performed using, for example, the example devices described above with respect to FIGS. 1A-B. - In
operation 731, a portion of an alert sequence is output using the device. In some cases, the alert sequence includes a predetermined sequence of alert outputs that escalate in intensity over time. In some embodiments, each of the alert outputs includes one or more alert components (audio, haptic, visual) that increase in intensity over time. Additionally or alternatively, in some implementations, the alert escalates by adding components to subsequent outputs to increase the overall intensity of the alert. In some embodiments, a first output includes only a haptic component, a second output includes a haptic and audio component, and a third or subsequent output includes a haptic, audio, and visual component. - With regard to
operation 731, in some implementations, the alert sequence corresponds to the occurrence of a single event. In some implementations, the alert sequence corresponds to the receipt of a message or other communication. The alert sequence can optionally also correspond to an upcoming calendar event, a timing notification, or other type of reminder. The alert sequence can optionally also correspond to a series of activity monitor alerts that provide feedback with regard to the progress of the user toward meeting a fitness goal, such as completing a workout routine. - In
operation 732, an interaction with the user is detected. In order to affect the alert, the interaction with the user is optionally detected while the portion of the alert sequence is being output. As mentioned previously, the interaction with the user may include an active and/or passive interaction. In some implementations, a user interaction indicates that the user is already paying attention to the device and may not need or want an alert to continue to escalate. In some implementations, a user interaction is interpreted as a request to modify the alert sequence to either increase or decrease the intrusiveness of the alert. - With reference to
FIG. 5, the user 210 may actively interact with the device by, for example, contacting a touch-sensitive surface of the device 100. In particular, the user may actively interact with the device by selecting an item displayed on a touch-sensitive display of the device 100. The user 210 may also actively interact with the device by pressing a button, turning a knob, or by providing some other form of user input. In some embodiments, the user may speak a voice command, which may be detected by the microphone of the device and interpreted as user input. - Additionally or alternatively, in some implementations, the device is configured to detect passive interaction with the user. With reference again to
FIG. 5, the device 100 can be configured to use one or more sensors to determine or estimate if the user is looking at the display, and therefore, at least passively interacting with the device. In some implementations, passive interaction is detected, for example, using one or more optical sensors to detect the position and movement of the user's head. In some cases, the one or more optical sensors are configured to sense the location and movement of the user's eye, which may be consistent with the user 210 reading or viewing the display of the device 100. In some implementations, the device is configured to detect the user's grip on the device, which may also indicate that the user is currently viewing or interacting with the device. - Returning to
FIG. 7D, in operation 733, a modified alert sequence is selected and output in response to the interaction with the user. In particular, in some implementations, the original alert sequence output in operation 731 is paused or terminated and a new, modified alert sequence is output instead. In some cases, the modified alert sequence is a continuation of the original alert sequence, but is modified in intensity or intrusiveness. For example, if the original alert sequence includes a series of 10 outputs and the user interaction is detected on the 6th output, the modified alert sequence includes 4 outputs, which replace the 4 remaining outputs of the original series of 10. - With regard to
operation 733, in some implementations, the modified alert sequence is a non-escalating alert sequence. In some instances, the modified alert sequence includes a series of outputs that do not increase in intensity over time. In some cases, the modified alert sequence includes a series of outputs that decrease in intensity over time. In some instances, the modified alert sequence is a silent alert sequence. In some instances, the modified alert sequence produces only a visual component in response to the interaction with the user. - In some embodiments,
process 730 is used to increase or decrease the intrusiveness of an alert sequence. In some implementations, the user interaction of operation 732 includes an input that is a request to reduce the intrusiveness of the portion of the alert sequence output in operation 731. In some implementations, the request to reduce the intrusiveness is input via the touch-sensitive display of the device and results in the output of a modified alert sequence that is less intrusive. In some instances, a less intrusive output includes an output having an audio component that is reduced in volume and/or a haptic component having a shorter or lower energy output. In some implementations, the user interaction of operation 732 includes an input that is a request to increase the intrusiveness of the portion of the alert sequence output in operation 731. In some instances, a user input results in a modified alert sequence having an increased intrusiveness. An output having an increased level of intrusiveness can optionally include, for example, an audio component having increased volume and/or a haptic component having an increased duration or energy output. - In one example, a user receives an alert sequence triggered by an event associated with the user meeting a health-related goal. For example, the alert sequence may be produced in response to the user reaching a target heart rate, as detected by a heart-rate monitor. In some instances, the alert sequence escalates or increases in intensity until the user interacts with the device by, for example, shaking the device, which is perceived by one or more motion sensors integrated with the device. Additionally or alternatively, the user interacts with the device by touching the touch screen, pushing a button, or turning a knob. In response to the user interaction, the alert sequence is interrupted and a modified alert sequence is output. 
In some implementations, the modified alert sequence is a non-escalating sequence, such as a sequence of audio beeps. In some implementations, the modified alert sequence continues as long as the user maintains the target heart rate, or until another event triggers another alert.
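The interrupt-and-replace behavior of process 730 can be sketched as follows. This is an illustrative sketch under stated assumptions: the function name, the interaction being supplied as an output index, and the `modified_factory` callback are all constructs invented for the example:

```python
def run_alert_sequence(escalating_outputs, interaction_at=None,
                       modified_factory=None):
    """Illustrative sketch of process 730: play an escalating sequence of
    alert outputs; if a user interaction is detected partway through,
    replace the remaining outputs with a modified (e.g. non-escalating)
    sequence of the same length."""
    emitted = []
    for i, output in enumerate(escalating_outputs):
        if interaction_at is not None and i == interaction_at:
            remaining = len(escalating_outputs) - i
            # the modified sequence replaces the remaining escalating outputs
            emitted.extend(modified_factory(remaining))
            break
        emitted.append(output)
    return emitted
```

Matching the example of operation 733: with a 10-output sequence and an interaction after the 6th output, the 4 remaining escalating outputs are replaced by 4 modified (for instance, visual-only) outputs.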
-
FIG. 7E depicts a process 740 for determining an output device according to one or more embodiments of the present disclosure. In some scenarios, multiple devices are proximate to a user when an event is detected. It may be undesirable for each of the devices to output an alert in response to the same event. Thus, in some cases, it may be advantageous to select one device to output the alert. The operations of process 740 may be performed using, for example, the example devices described above with respect to FIGS. 1A-B and 3. - In
operation 741, an event is detected. In some implementations, the device detects the occurrence of an event by, for example, receiving a notification or a message related to the event or receiving a notification or message from an application or program that is being executed on the device. As in the previous examples, in some implementations, the device receives a notification or message that the user has received an incoming e-mail message, text message, telephone call, voicemail message, and the like. As in the examples provided above, an event can optionally be triggered in relation to a wide range of activities or functions, including satisfying a personal health goal, reaching a geographic location, reaching a target heart rate, reaching an oxygenation level, or satisfaction of other criteria. - In
operation 742, a determination that a second device is in proximity to the first device is made. In some cases, the determination is made in response to detecting the event. In one embodiment, the first device is a wearable electronic device 100 and the second device is a mobile telephone 130, as depicted in FIG. 3. In some embodiments, one or more devices of another type, such as a notebook computer, a desktop computer, a tablet device, a personal media player device, a television device, or the like, are in proximity to the user. In some implementations, a third device, a fourth device, and other additional devices are also detected in accordance with operation 742. - With regard to
operation 742, in some implementations, a second device (or other additional device) is detected using, for example, a wireless communication signal or beacon. In some embodiments, the first and second devices have been previously paired using, for example, a Bluetooth communication scheme. In some cases, the second device (or other additional device) is detected using an automatic detect and connect process as part of the Bluetooth connection routine. In some implementations, the second device is detected using a beacon or intermittent broadcast signal that is transmitted from the first device. In some implementations, the second device is detected as being connected to a common wireless communication node. For example, the second device can optionally be detected due to a shared WiFi connection with the first device. Alternatively or additionally, one or both of the devices may include location determining software and/or hardware (e.g., global positioning system (GPS), base-station triangulation) that is used to determine if the second device is proximate to the first device. - In
operation 743, in accordance with a determination that more than one device is present, an alert-output device is selected. In some implementations, if there are more than two devices in proximity, the alert-output device is selected from a group of three or more devices. As described above, the devices may include a wearable electronic device, a mobile telephone, a notebook computer, a desktop computer, a tablet device, a personal media player device, a television device, or the like. - With regard to
operation 743, in some implementations, the selection of the alert-output device is performed in accordance with a number of different techniques. In some embodiments, the alert-output device is selected based on a user-provided prioritization. For example, the user may provide or designate an ordered priority of multiple devices that may be used to select an appropriate alert-output device. - In some implementations, the alert-output device is selected based on a usage of one or more of the devices that are in proximity to each other. In some embodiments, the usage includes a time of usage. In some cases, a device that has a time of usage that is most recent is selected as the alert-output device. In some implementations, a time of usage is only used if the most recent time is within a threshold (e.g., the data is not too old to be relevant). In some implementations, the usage includes both a time of usage and an amount of usage. In some implementations, the device having the greatest amount of usage over a predetermined time period is selected as the alert-output device. In some implementations, the usage includes a type of usage. In some implementations, a device having a usage that corresponds to a predetermined usage type is selected as the alert-output device. In some implementations, if the device is being used in accordance with a communications program (e.g., an e-mail program, a text messaging program), the device is selected as the alert-output device.
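The selection heuristics of operation 743 can be sketched as follows. This is a minimal sketch, assuming a particular precedence (user priority, then active communications program, then recency of use); the function name, device dictionary keys, and the default recency limit are all illustrative assumptions rather than disclosed details:

```python
import time

def select_alert_device(devices, now=None, recency_limit=3600,
                        comm_apps=("mail", "messages", "phone")):
    """Illustrative sketch of operation 743: pick one of several
    proximate devices to output an alert."""
    now = time.time() if now is None else now
    # 1. a user-provided prioritization wins outright
    prioritized = [d for d in devices if d.get("user_priority") is not None]
    if prioritized:
        return min(prioritized, key=lambda d: d["user_priority"])
    # 2. a device running a communications program is preferred
    comm = [d for d in devices if d.get("active_app") in comm_apps]
    if comm:
        return comm[0]
    # 3. otherwise the most recently used device, if the usage data
    #    is recent enough to be relevant
    recent = [d for d in devices if now - d.get("last_used", 0) <= recency_limit]
    if recent:
        return max(recent, key=lambda d: d["last_used"])
    # fall back to the first (detecting) device
    return devices[0]
```

The precedence order itself is a design choice; an implementation could equally weight amount of usage over a predetermined period, as the text also contemplates.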
- In
operation 745, if an alert-output device is selected, the alert is output on the alert-output device. In some instances, the alert is output in accordance with one or more other aspects of the present disclosure. In some instances, the alert output includes one or more components (audio, haptic, visual) that correspond to one or more environmental conditions. In some cases, the alert output is fixed or selected by the user. - With regard to
operation 745, in some embodiments, only the alert-output device outputs the alert associated with a particular event. For example, no other device that has been determined to be proximate to the user outputs an alert associated with the event. In some embodiments, all devices that are determined to be proximate to the user are updated in response to the event, but only the alert-output device outputs an alert. For example, an e-mail or message may be loaded or delivered to all of the devices, but the alert associated with the reception of the e-mail is only output on, for example, a wearable electronic device worn by the user. In some cases, the user may perceive the alert on the wearable device and then check the message on a phone, laptop, or tablet. - With regard to
operation 745, in some cases, the alert is relayed to the alert-output device using a device that is not selected as the alert-output device. For example, with reference to FIG. 3, a notification of an incoming message may be received by a user's mobile telephone 130. If the user's wearable electronic device 100 is selected as the alert-output device, the mobile telephone 130 may relay the notification to the wearable device 100 in order to trigger an alert output. Alternatively, the mobile telephone 130 may generate a portion of the alert, which may be relayed and output using the wearable device 100. In some cases, if no interaction with the alert-output device is detected for a predetermined time after the alert is sent, a subsequent alert is output using one or more of the other devices that were not selected as an alert-output device. - In
operation 744, in accordance with a determination that there is not another device present, the alert is output on the first device. In particular, if there are no other devices proximate to the user (or determined to be proximate to the user), the alert is output on the device that detected the event. - In one specific example depicted in
FIG. 3, the user is wearing a wearable electronic device 100 and has a mobile telephone 130 placed in a pocket. In one scenario, the user receives an e-mail, which triggers an event that is detected by the mobile telephone 130. In some implementations, the mobile telephone 130 detects or has detected the proximity of the wearable electronic device 100 by, for example, having previously been paired using a Bluetooth or other wireless connection. In some implementations, the mobile telephone 130 has been in a dormant state for a period of time due to the placement in the user's pocket. Due to the low or non-usage of the mobile telephone 130, the wearable electronic device 100 is selected as the alert-output device. In this scenario, the alert associated with the incoming e-mail is relayed to and output on the wearable electronic device 100. -
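The relay-and-fallback behavior described above (relay the alert to the selected device, then re-alert on the remaining devices if no interaction is detected within a predetermined time) can be sketched as follows. The function name, the `interaction_detected` callback, and the wait period are illustrative assumptions:

```python
def deliver_alert(alert, selected, others, interaction_detected,
                  wait_seconds=30):
    """Illustrative sketch: relay an alert to the selected alert-output
    device; if no interaction is detected within the wait period,
    output a subsequent alert on the non-selected devices."""
    outputs = [(selected, alert)]
    if not interaction_detected(selected, wait_seconds):
        outputs += [(d, alert) for d in others]   # fallback re-alert
    return outputs
```

In the scenario above, the incoming e-mail alert would first be relayed to the wearable device, with the phone re-alerting only if the user never responds.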
FIG. 7F depicts a process 750 for producing audio and haptic feedback in response to a user input according to one or more embodiments of the present disclosure. In some implementations, the device is configured to produce a stimulus that may provide feedback for a user action or input to the device. As previously mentioned, a stimulus or feedback may be useful for some user input components, such as electronic sensors, that may have few or no moving parts to provide feedback to the user that an input is being received. For example, an audio and haptic component may be output in response to a user's interaction with a touch screen or interaction with a rotational dial or button on the device. In some cases, the stimulus may be adapted to mimic a sound or haptic response that the user may associate with a more traditional mechanical device. The operations of process 750 may be performed using, for example, the example devices described above with respect to FIGS. 1A-B and 5. - In
operation 751, a first input is received on the device. In some implementations, the first input is received via, for example, a touch-sensitive surface of the device, a crown or dial, or some other user input device. With reference to FIG. 5, in some cases, a user input is received as a translational panning or scrolling input 615 on the touch-sensitive surface of the display 110. In some implementations, the user input is received as a rotational input 612 provided using the crown 610 of the device 100. - Returning to
FIG. 7F, with respect to operation 751, in some cases the first input is below an input threshold. For example, the speed or rate of the first input may be slower than a threshold. With respect to an input on a touch-sensitive surface, the movement of the touch may be below a threshold rate. Similarly, with respect to an input via a crown or dial, the speed of the rotation may be below a threshold rate. - In
operation 752, in response to detecting the first input, a first output is produced. In some embodiments, the first output includes a haptic component for the first input that is coordinated with an audio component for the first input. In some cases, the haptic component is a tap or bump created by a haptic actuator integrated with the device. In some instances, the audio component includes a beep or click that corresponds with the tap or bump created by the haptic actuator. In some implementations, the haptic component is synchronized with the audio component. In some instances, the haptic component has an output that is simultaneous with or at a fixed timing relationship with respect to the output of the audio component. - With regard to
operation 752, the first output corresponds to the rate or speed of the first input. In some implementations, when a user scrolls through a list of items using a touch screen, a haptic tap and an audio beep or click correspond to the progression through the list caused by the first input. As previously mentioned, this may be more readily perceived by the user or more satisfying than, for example, the visual scrolling of the items alone. In some implementations, a user provides a rotational input via, for example, the crown 610 or knob of the device depicted in FIG. 5. In this case, a haptic tap and an audio click can optionally be output for, for example, every 5 degrees of rotation of the crown or knob. In this way, the user receives feedback on the speed at which the input is being received by the device. - In
operation 753, a second input is received on the device. Similar to operation 751, in some implementations, the second input is received via a touch-sensitive surface, crown, or other input device. In the present example, the second input is provided via the same input device as the first input of operation 751. In some implementations, the second input occurs immediately after the first input or, alternatively, the second input occurs after a delay. - In
operation 754, a determination is made with regard to the second input. In particular, a determination is made regarding whether the second input is above or below an input threshold. In some implementations, the input threshold is set or determined based, in part, on the limitations of the hardware used to provide the feedback output. In some implementations, a haptic actuator has a minimum response time that is an inherent property or physical limitation of the haptic actuator mechanism, which typically includes a moving mass. An example haptic actuator is described in more detail below with respect to FIG. 11. In some embodiments, the minimum response time is due to the time it takes to initiate movement of the mass, produce a haptic output, and stop movement of the mass. If a series of haptic outputs is requested at a rate that exceeds what the minimum response time allows, the actuator may not be able to recover from a previous output before producing the next output in the series. Thus, in some embodiments, there is an upper limit on the rate at which the haptic actuator can provide a series of distinct outputs. In some cases, the input threshold of operation 754 is determined, at least in part, based on the upper limit of the haptic actuator. - In
operation 755, in accordance with a determination that the second input is below the input threshold, the output is similar to the output produced for operation 752. In some implementations, the second output includes a haptic component for the second input that is coordinated with an audio component for the second input. Similar to the previous example, in some implementations, the haptic component is synchronized with the audio component. In some implementations, the haptic component has an output that is simultaneous with or at a fixed timing relationship with respect to the output of the audio component. - In
operation 756, in accordance with a determination that the second input is above the input threshold, the second output includes a modified haptic component for the second input. In some instances, if it is determined that the rate or speed of the input may exceed the capabilities of the haptic actuator, the output is modified. In some embodiments, if the second input is above the input threshold, the haptic component for the second input is asynchronous with respect to the audio component for the second input. In some embodiments, the asynchronous haptic output is a continuous haptic output. In some instances, the second output includes a continuous haptic output but maintains a distinct audio "click" output that corresponds to an amount of input that is provided. In some cases, the continuous haptic output includes inflection points or periods of varying intensity. In some implementations, the inflection points of the haptic output are not synchronized with the audio output. -
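The branch between operations 755 and 756 can be sketched as a simple rate comparison. This is an illustrative sketch only; the function name, the rate-based interface, and the returned descriptors are assumptions, with the actuator's maximum distinct-output rate standing in for the hardware limit discussed in operation 754:

```python
def feedback_for_input(input_rate, actuator_max_rate):
    """Illustrative sketch of operations 752-756: below the actuator's
    maximum distinct-output rate, each input increment gets a haptic tap
    synchronized with an audio click; above it, the haptic component
    becomes a continuous (asynchronous) output while discrete audio
    clicks continue to track the amount of input."""
    if input_rate <= actuator_max_rate:
        return {"haptic": "discrete-tap", "audio": "click",
                "synchronized": True}
    return {"haptic": "continuous", "audio": "click",
            "synchronized": False}
```

For a slow crown rotation the device would emit paired tap-and-click outputs; for a fast rotation the haptic output runs continuously while the audio clicks remain distinct.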
FIG. 8 depicts a block diagram illustrating exemplary components, such as, for example, hardware components of an electronic device 800 according to one or more embodiments of the present disclosure. In certain embodiments, the electronic device 800 is similar to the wearable electronic device 100 described above with respect to FIG. 1A or the electronic device 130 described above with respect to FIG. 1B. Although various components of the device 800 are shown, connections and communication channels between each of the components are omitted for simplicity. - In a basic configuration, the
electronic device 800 may include at least one processor 805 and an associated memory 810. In some embodiments, the memory 810 comprises, but is not limited to, volatile storage such as random access memory, non-volatile storage such as read-only memory, flash memory, or any combination thereof. In some embodiments, the memory includes removable and non-removable memory components, including, for example, magnetic disks, optical disks, or tape. In some embodiments, the memory 810 stores an operating system 812 and one or more program modules 814 suitable for running software applications 816. The operating system 812 can be configured to control the electronic device 800 and/or one or more software applications 816 being executed by the operating system 812. In certain embodiments, various program modules and data files are stored in the system memory 810. The program modules 814 and the processor 805 can be configured to perform processes that include one or more of the operations of methods shown and described with respect to FIGS. 7A-F. - The
electronic device 800 also includes communication connections 808 that facilitate communications with additional computing devices 806. In some implementations, the communication connections 808 include a radio-frequency (RF) transmitter, a receiver, and/or transceiver circuitry, universal serial bus (USB) communications, parallel ports and/or serial ports. - As used herein, the term computer readable media can optionally include computer storage media. Computer storage media can optionally include volatile and nonvolatile media and/or removable and non-removable media implemented in any method or technology for the storage of information. Examples include computer-readable instructions, data structures, or program modules. The
memory 810, which can optionally include the removable and non-removable storage devices, is one example of computer storage media. Computer storage media can optionally include RAM, ROM, electrically erasable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other article of manufacture which can be used to store information and which can be accessed by the electronic device 800. Any such computer storage media can optionally be part of the electronic device 800. -
FIG. 9 depicts additional aspects of the electronic device 800 according to one or more embodiments of the present disclosure. FIG. 9 is a block diagram illustrating the architecture of an electronic device such as electronic device 100 shown and described with respect to FIG. 1A or electronic device 130 shown and described with respect to FIG. 1B. - As shown in
FIG. 9, multiple components are operably connected to the processor 805 and the memory 810 of the device 800. In particular, one or more input components are coupled to the processor 805. In some embodiments, a touch-sensitive device 820, such as a touch sensor or touch screen, is integrated with a surface of the device. In some embodiments, the touch sensor or touch screen includes a capacitive sensor that is configured to detect the location of one or more touches on the surface of the device. Additionally or alternatively, in some embodiments, the device includes a force sensor that is configured to detect and measure the force of a touch on a device. - In the present embodiment, the
device 800 includes one or more buttons 822 and knobs 824 that are configured to accept user input. Additional user input can optionally be provided via a keyboard, mouse, pen or stylus, sound input device, and the like. With reference to the example described above with respect to FIG. 6, the knob 824 can include a crown of a portable electronic device. In some embodiments, the knob 824 or crown is operatively coupled to a position sensor, such as an optical encoder, that is configured to produce an output in response to a rotational input. A more detailed description of a knob/crown having a position sensor is described below with respect to FIG. 12. - In some embodiments, other input to the
device 800 is provided by one or more sensors 830. As previously described with respect to FIGS. 1A-B, an example device includes one or more environmental sensors that are configured to monitor and detect one or more environmental conditions. Example sensors 830 include motion sensors, including accelerometers, gyroscopes, tilt sensors, and the like. In some embodiments, the sensors 830 also include one or more optical sensors, including an image sensor, ALS sensor, proximity sensor, and the like. In some embodiments, the sensors 830 also include a microphone or other audio sensing device. - In some embodiments, the
device 800 includes one or more devices or components for providing output to the user. As shown in FIG. 9, the device includes a display 840 for presenting visual information or output to the user. In some embodiments, the display 840 is formed from a liquid crystal display (LCD), organic light emitting diode (OLED) display, organic electroluminescence (OEL) display, or other type of display device. In some embodiments, the device 800 includes a visual indicator 842, such as a beacon or strobe light, that is configured to provide additional visual output to the user. - In the example of
FIG. 9, the device 800 includes a speaker 844 or other acoustic component. The speaker 844 can be used to produce an audio output in accordance with some aspects of the disclosure. An example speaker component is described below with respect to FIG. 10. In the example of FIG. 9, the device 800 also includes a haptic actuator 846 that is configured to produce a haptic output in accordance with some aspects of the disclosure. An example haptic actuator is described below with respect to FIGS. 11A-B. - In one or more embodiments, data and information generated or captured by the
electronic device 800 is stored locally. Additionally or alternatively, the data can be stored on any number of storage media that can optionally be accessed by the electronic device 800 using the communications connection (808 in FIG. 8), a wired connection, or a wireless connection between the electronic device 800 and a remote computing device 806. Additionally, data and information can be readily transferred between computing devices. -
FIG. 10 depicts an example acoustic module in accordance with some embodiments. As described above, in some embodiments, the device includes one or more devices for transmitting acoustic energy. In particular, embodiments of the device include a speaker for transmitting acoustic energy. FIG. 10 depicts a simplified schematic cross-sectional view of a first embodiment of a device having a speaker 1000. The representation depicted in FIG. 10 is not drawn to scale and does not include all elements of every embodiment of a speaker. The speaker 1000 is representative of speakers or acoustic elements described with respect to one or more embodiments described herein. - In the example depicted in
FIG. 10, the speaker 1000 includes various components for producing and transmitting sound, including a diaphragm 1010, a voice coil 1009, a center magnet 1008, and side magnets/coils 1007. In one implementation, the diaphragm 1010 is configured to produce sound waves or an acoustic signal in response to a stimulus signal in the center magnet 1008. For example, a modulated stimulus signal in the center magnet 1008 causes movement of the voice coil 1009, which is coupled to the diaphragm 1010. Movement of the diaphragm 1010 creates the sound waves, which propagate through the acoustic cavity 1011 of acoustic module 106 and eventually out the acoustic port 1020 to a region external to the device. In some cases, the acoustic cavity 1011 functions as an acoustical resonator having a shape and size that is configured to amplify and/or dampen sound waves produced by movement of the diaphragm 1010. - As shown in
FIG. 10, the speaker 1000 also includes a yoke 1014, a support 1013, a connector element 1012, and a cavity wall 1013. These elements provide the physical support of the speaker elements. Additionally, the connector element 1012 and the cavity wall 1013 together form at least part of the acoustic cavity 1011. The specific structural configuration of FIG. 10 is not intended to be limiting. For example, in alternative embodiments, the acoustic cavity can optionally be formed from additional components or can optionally be formed from a single component. - The
speaker 1000 depicted in FIG. 10 is provided as one example of a type of speaker or acoustic module. In some implementations, the speaker includes different configurations for producing and transmitting sound, including, for example, a vibrating membrane, piezoelectric transducer, vibrating ribbon, or the like. In some implementations, the acoustic module is a microphone acoustic module having one or more elements for converting acoustic energy into an electrical impulse. For example, the acoustic module can optionally alternatively include a piezoelectric microphone element for producing a charge in response to acoustic energy or sound. - As shown in
FIG. 10, an acoustic port 1020 is formed in the case 1021 of the electronic device. In the present example, the acoustic port 1020 includes a first and a second orifice formed through the case 1021 that acoustically couple the acoustic cavity 1011 of the speaker 1000 to the external environment (external to the electronic device). In the present embodiment, the first and second orifices are offset with respect to the acoustic cavity 1011. This configuration may help reduce the direct ingress of liquid 1001 into the acoustic cavity 1011 of the speaker 1000. Also, as shown in FIG. 10, a shield 1021 or umbrella structure is formed between the orifices and the acoustic cavity 1011. As shown in FIG. 10, the speaker 1000 also includes a screen element 1015 disposed at one end of the acoustic cavity 1011, which may also prevent the ingress of liquid or other foreign debris into the acoustic cavity 1011. -
FIGS. 11A-B depict an example haptic actuator in accordance with some embodiments. As described above, some embodiments of the device include one or more haptic modules for providing haptic feedback to the user. In some embodiments, a haptic device is configured to produce a mechanical movement or vibration that is transmitted through the case and/or other component of the device. In some cases, the movement or vibration is transmitted to the skin of the user and perceived as a stimulus or haptic feedback by the user. - The space constraints associated with a wrist-worn device may present unique challenges to integrating a haptic mechanism into wearable electronics. In particular, a haptic mechanism may use a moving mass to create the movement or vibration of the haptic output. The larger the mass that is moved, the easier it may be to create a perceivable stimulus using the haptic mechanism. However, a large moving mass and the supporting mechanism may be difficult to integrate into the compact space of, for example, the case of a wearable electronic wristwatch device.
FIGS. 11A-B depict one example haptic mechanism suitable for use in a wearable electronic device. While the embodiment described with respect to FIGS. 11A-B is provided as one example, the haptic module is not limited to this particular configuration. -
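The drive principle described in the following paragraphs (the sign of the coil current sets the direction of the Lorentz force, and hence of frame travel along the shaft) can be sketched as a toy model. This is an illustration only; the step size and function names are invented, and no electromechanical constants from the disclosure are used.

```python
def lorentz_direction(current):
    """Return +1, -1, or 0 for the direction of frame travel along the shaft,
    determined by the sign of the coil current."""
    return (current > 0) - (current < 0)

def simulate_travel(currents, step=1.0):
    """Accumulate frame displacement (arbitrary units) for a sequence of
    drive currents; an alternating drive moves the mass back and forth."""
    position = 0.0
    for i in currents:
        position += step * lorentz_direction(i)
    return position
```

An alternating drive (equal positive and negative pulses) returns the mass to its starting point, which is why the springs biasing the frame toward mid-travel matter mainly for asymmetric or interrupted drive waveforms.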
FIG. 11A depicts a three-quarters perspective view of a haptic module 1100, with a top, front, and left sidewall of the case 1120 removed to expose internal components. FIG. 11B depicts a cross-sectional perspective view of the haptic module 1100 cut in half to expose the internal components. In this example, a coil 1101 is used to induce movement of a frame 1160, which houses a central magnet array 1110. As shown in FIGS. 11A-B, the movement of the frame 1160 is guided by a shaft 1150 that is fixed with respect to a case 1120. - In the present example, the
coil 1101 is energized by transmitting a current (e.g., from the battery) along a length of a wire that forms the coil 1101. A direction of the current along the wire of the coil 1101 determines a direction of a magnetic field that emanates from the coil 1101. In turn, the direction of the magnetic field determines a direction of movement of the frame 1160 housing the central magnet array 1110. One or more springs can optionally bias the frame 1160 towards the middle region of the travel. In this example, the frame 1160 and central magnet array 1110, through operation of the coil 1101, function as a moving mass, which generates a tap or vibration. The output of the haptic module 1100, created by the moving mass of the frame 1160 and central magnet array 1110, may be perceived as a haptic feedback or stimulus to the user wearing the device. - For example, when the
coil 1101 is energized, the coil 1101 generates a magnetic field. The opposing polarities of the magnets in the magnet array 1110 generate a radial magnetic field that interacts with the magnetic field of the coil 1101. The Lorentz force resulting from the interaction of the magnetic fields causes the frame 1160 to move along the shaft 1150 in a first direction. Reversing current flow through the coil 1101 reverses the Lorentz force. As a result, the magnetic field or force on the central magnet array 1110 is also reversed and the frame 1160 moves in a second direction. Thus, the frame 1160 can optionally move in both directions along the shaft 1150, depending on the direction of current flow through the coil. - As shown in
FIG. 11A, the coil 1101 encircles the central magnet array 1110, which is disposed near the center of the frame 1160. As previously described, the coil 1101 can selectively be energized by transmitting a current along the length of the wire forming the coil 1101, and the direction of the current flow determines the direction of the magnetic flux emanating from the coil 1101 in response to the current. Passing an alternating current through the coil 1101 causes the central magnet array 1110 (and frame 1160) to move back and forth along the shaft 1150. In order to prevent the central magnet array 1110 from being attracted to the shaft 1150, which could increase friction between the two and thereby increase the force necessary to move the central magnet array 1110 and frame 1160, the shaft 1150 can optionally be formed from a non-ferritic material such as tungsten, titanium, stainless steel, or the like. - As depicted in
FIGS. 11A-B, the coil 1101 is positioned within a frame 1160 that holds the central magnet array 1110 but is not affixed to the coil 1101. Rather, an air gap separates the coil 1101 from the central magnet array 1110, and the frame 1160 is free to move with respect to the coil 1101, which is generally stationary. Further, the frame 1160 generally moves with the central magnet array 1110. As illustrated in FIGS. 11A-B, the frame 1160 has an aperture formed therein of sufficient size to contain the coil 1101. Even when the frame and central magnet array are maximally displaced within the case 1120 (e.g., to one end or the other of the shaft 1150), the coil 1101 does not contact any portion of the frame 1160. In the present embodiment, the coil 1101 remains stationary in the case 1120 while the frame 1160 and central magnet array 1110 move, although in other embodiments the coil 1101 moves instead of, or in addition to, the frame and/or central magnet array. - As shown in
FIGS. 11A-B, the central magnet array 1110 is formed from at least two magnets separated by a center interface 1170. The center interface 1170 can optionally be formed from a ferritic or non-ferritic material, depending on the embodiment. A ferritic material for the center interface 1170 may enhance the overall magnetic field generated by the central magnet array 1110, while a non-ferritic material may provide at least a portion of a return path for magnetic flux and thus assist in localizing the flux within the case 1120. In some embodiments, the magnets -
FIG. 12 depicts an example crown with an optical encoder in accordance with some embodiments. The crown and optical encoder of FIG. 12 may correspond to the example crown 610 described above with respect to FIG. 6. In particular, as described above, embodiments of the device include a crown used to accept rotary input from the user, which can be used to control aspects of the device. For example, the crown can be turned by the user to scroll a display or select from a range of values. In some embodiments, the crown can be rotated to move a cursor or other type of selection mechanism from a first displayed location to a second displayed location in order to select an icon or move the selection mechanism between various icons that are output on the display. In a time keeping application, the crown can also be used to adjust the position of watch hands or index digits displayed on the display of the device. The crown can also be used to control the volume of a speaker, the brightness of the display screen, or control other hardware settings. - The embodiments described herein can be used for at least a portion of the crown module integrated into a wearable electronic device. The embodiments are provided as examples and do not necessarily include all of the components or elements used in a particular implementation. Additionally, the crown module is not intended to be limited to the specific examples described below and can vary in some aspects depending on the implementation.
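The encoder decoding and detent behavior described in the following paragraphs can be sketched as follows. This is an illustrative model, not the disclosed implementation: the stripe count is an assumption, the sign convention for clockwise rotation is arbitrary, and the 10-degree detent spacing is one of the example values mentioned in the text.

```python
# Hypothetical number of stripes in the encoding pattern around the shaft.
STRIPES_PER_REV = 72

def decode_rotation(samples):
    """Estimate net rotation in degrees from successive stripe indices.

    `samples` is a sequence of stripe indices recovered from the photodiode
    array. The shortest signed path between consecutive indices (modulo the
    stripe count) gives the step direction; positive is taken as clockwise.
    """
    half = STRIPES_PER_REV // 2
    steps = 0
    for prev, cur in zip(samples, samples[1:]):
        steps += (cur - prev + half) % STRIPES_PER_REV - half
    return steps * 360.0 / STRIPES_PER_REV

def detent_outputs(degrees, detent=10.0):
    """Number of synchronized audio clicks / haptic taps for a rotation,
    one per `detent` degrees of travel."""
    return int(abs(degrees) // detent)
```

The modular shortest-path step handles the wrap-around when the pattern passes its last stripe, which is how direction can be recovered from a repeating pattern.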
- In some embodiments, an optical encoder is used to detect the rotational motion of the crown. More specifically, the example provided below with respect to
FIG. 12 uses an optical encoder to detect rotational movement, rotational direction and/or rotational speed of a component of the electronic device. Once the rotational movement, rotational direction and/or rotational speed have been determined, this information can be used to output or change information and images that are presented on a display or user interface of the electronic device. - As shown in the example embodiment of
FIG. 12, the optical encoder of the present disclosure includes a light source 1270, a photodiode array 1280, and a shaft 1260. However, unlike some traditional optical encoders, the optical encoder of the present disclosure utilizes an encoding pattern 1265 disposed directly on the shaft 1260. As shown in FIG. 12, the encoding pattern 1265 includes a number of light and dark markings or stripes that are axially disposed along the shaft 1260. Each stripe or combination of stripes on the shaft can be used to identify a position of the shaft 1260. Light emitted from the light source 1270 is reflected off of the shaft 1260 and into the photodiode array 1280. The reflected light can be used to determine the movement of the encoding pattern 1265, and thus the movement of the shaft 1260 and the crown 1200. The output from the photodiode array 1280 can be used to determine a position, rotation, rotation direction, and rotation speed of the shaft 1260. Based on the rotation, rotation direction, and/or speed, the encoder output may be used to change information or images that are presented on the display or user interface of the electronic device. - Although a photodiode array is specifically mentioned, embodiments disclosed herein can optionally use various types of sensors that are arranged in various configurations for detecting the movement described herein. In some embodiments, the movement of the
shaft 1260 is detected by an image sensor, a light sensor such as a CMOS light sensor or imager, a photovoltaic cell or system, a photoresistive component, a laser scanner, and the like. - The signals or output of the optical encoder can be used to control various aspects of other components or modules of the device. For example, continuing with the time keeping application example discussed above, the
dial 1240 can be rotated in a clockwise manner in order to advance the displayed time forward. In one implementation, the optical encoder can be used to detect the rotational movement of the dial 1240, the direction of the movement, and the speed at which the dial 1240 is being rotated. Using the output from the optical encoder, the displayed hands of a time keeping application may rotate or otherwise move in accordance with the user-provided rotational input. Additionally, or alternatively, an audio and/or haptic output may be generated in accordance with the rotational movement of the dial 1240. For example, an audio click and/or a haptic tap can be output for every 5 degrees, 10 degrees, or other degree amount of rotation of the dial 1240. - Referring back to
FIG. 12, the crown 1200 is formed from a dial 1240 that is coupled to the shaft 1260. In some cases, the shaft 1260 and dial 1240 are formed as a single piece. Because the shaft 1260 is coupled to, or is otherwise a part of, the dial 1240, when the dial 1240 rotates or moves in a particular direction and at a particular speed, the shaft 1260 also rotates or moves in the same direction and with the same speed. - As shown in
FIG. 12, the shaft 1260 of the optical encoder includes an encoding pattern 1265. As discussed above, the encoding pattern 1265 can be used to determine positional information about the shaft 1260, including rotational movement, angular displacement, and movement speed. As shown in FIG. 12, the encoding pattern 1265 includes an array of light and dark stripes. - Although light stripes and dark stripes are specifically mentioned and shown, the
encoding pattern 1265 can consist of various types of stripes having various shades or colors that provide surface contrasts. For example, the encoding pattern 1265 can include a stripe or marking that has a high reflective surface and another stripe that has a low reflective surface, regardless of the color or shading of the stripes or markings. In another embodiment, a first stripe of the encoding pattern 1265 causes specular reflection while a second stripe of the encoding pattern causes diffuse reflection. When the reflected light is received by the photodiode array 1280, a determination can be made as to the position and movement of the shaft, such as described below. In embodiments where a holographic or diffractive pattern is used, the light from the light source 1270 diffracts from the shaft. Based on the diffracted light, the photodiode array 1280 can determine the position, movement, and direction of movement of the shaft 1260. - In some embodiments, the stripes of the
encoding pattern 1265 extend axially along the shaft 1260. The stripes extend along the entire length of the shaft 1260 or partially along a length of the shaft 1260. In addition, the encoding pattern 1265 is disposed or formed around the entire circumference of the shaft 1260. In some embodiments, the encoding pattern 1265 includes a radial component. In yet other embodiments, the encoding pattern 1265 includes both a radial component and an axial component. - In accordance with some embodiments,
FIG. 13 shows a functional block diagram of an electronic device 1300 configured in accordance with the principles of the various described embodiments. In particular, the electronic device 1300 can be used to perform the process 700 described above with respect to FIG. 7A. The functional blocks of the device are, optionally, implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described embodiments. It is understood by persons of skill in the art that the functional blocks described in FIG. 13 are, optionally, combined or separated into sub-blocks to implement the principles of the various described embodiments. Therefore, the description herein optionally supports any possible combination or separation or further definition of the functional blocks described herein. - As shown in
FIG. 13, an electronic device 1300 includes an event detection unit 1302 configured to detect the occurrence of an event, an alert output unit 1304 configured to output an alert, a sensing unit 1306 configured to detect one or more environmental conditions, and a processing unit 1310 coupled to the event detection unit 1302, the alert output unit 1304, and the sensing unit 1306. In some embodiments, the processing unit 1310 includes a response determining unit 1312 and a selecting unit 1314. - The
processing unit 1310 is configured to, while the device is subject to the one or more environmental conditions, detect the occurrence of an event (e.g., using the event detection unit) and, in response to detecting the occurrence of the event, determine a response to the event (e.g., using the response determining unit) based on a current alert mode selected from a set of three or more alert modes (e.g., using the selecting unit), the selection based on the one or more environmental conditions (e.g., using the sensing unit). Determining the response includes: in accordance with a determination that the current alert mode is a first alert mode, outputting a first alert in response to the event (e.g., using the alert output unit), and in accordance with a determination that the current alert mode is a second alert mode, outputting a second alert in response to the event (e.g., using the alert output unit), wherein the second alert is different from the first alert. - In some embodiments, the current alert mode is automatically selected (e.g., using the selecting unit) based on the one or more environmental conditions prior to detecting the occurrence of the event. The current alert mode is automatically selected using an environmental sensor (e.g., of the sensing unit 1306) that is configured to detect the one or more environmental conditions. In some embodiments, the environmental sensor is a microphone configured to detect an ambient sound level, and the current alert mode that is selected (e.g., using the selecting unit 1314) includes one or more of: a visual component that corresponds to the ambient sound level, an audio component that corresponds to the ambient sound level, and a haptic component that corresponds to the ambient sound level.
In some embodiments, the environmental sensor is a motion sensor configured to detect an activity level, and the current alert mode that is selected (e.g., using the selecting unit 1314) includes one or more of: a visual component that corresponds to the activity level, an audio component that corresponds to the activity level and a haptic component that corresponds to the activity level. In some embodiments, the environmental sensor is an image sensor configured to detect an ambient light level, and the current alert mode that is selected (e.g., using the selecting unit 1314) includes one or more of: a visual component that corresponds to the ambient light level, an audio component that corresponds to the ambient light level, and a haptic component that corresponds to the ambient light level. In some embodiments, the environmental sensor is a battery power sensor configured to detect a current battery level, and the current alert mode that is selected (e.g., using the selecting unit 1314) includes one or more of an audio component and a haptic component, wherein an estimated peak power output of the current alert mode corresponds to the current battery level.
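The sensor-to-mode mapping described above can be sketched as a simple selection function. This is a hypothetical illustration: the thresholds, mode component names, and the decision order are all assumptions, not values from the disclosure, which leaves the mapping implementation-defined.

```python
def select_alert_mode(ambient_db, activity_level):
    """Pick a current alert mode from environmental conditions, before any
    event occurs. `ambient_db` is an ambient sound level from a microphone;
    `activity_level` is a normalized 0..1 value from a motion sensor.
    """
    if ambient_db > 70 or activity_level > 0.8:
        # Loud or highly active environment: strengthen every component.
        return {"audio": "loud", "haptic": "strong", "visual": "on"}
    if ambient_db < 30:
        # Quiet environment: suppress audio and rely on a subtle haptic.
        return {"audio": "none", "haptic": "subtle", "visual": "on"}
    return {"audio": "normal", "haptic": "normal", "visual": "on"}
```

Because the mode is chosen ahead of time, the device only has to look up the pre-selected mode when an event arrives, rather than sampling every sensor on the alert path.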
- In some embodiments, the first alert mode includes a first haptic component and a first visual component, and the second alert mode includes a second haptic component and no visual component. In some embodiments, the first alert mode includes a first audio component and a first haptic component, and the second alert mode includes a second audio component and second haptic component, wherein the first audio and first haptic component are different than the second audio component and the second haptic component, respectively. In some embodiments, the first alert mode includes no audio component and no haptic component. In some embodiments, the first alert mode includes a first audio component and a first haptic component offset by a first delay, and the second alert mode includes the first audio component and the first haptic component offset by a second delay that is different than the first delay.
- In some embodiments, the
processing unit 1310 is further configured to, after selecting the current alert mode, select a subsequent current alert mode (e.g., using the selecting unit 1314) based on a changed environmental condition. In some embodiments, determining the response to the event includes, in accordance with a determination that the current alert mode is a third alert mode (e.g., using the response determining unit 1312), outputting a third alert in response to the event, wherein the third alert is different from the first alert and the second alert. - In accordance with some embodiments,
FIG. 14 shows a functional block diagram of an electronic device 1400 configured in accordance with the principles of the various described embodiments. In particular, the electronic device 1400 can be used to perform the process 710 described above with respect to FIG. 7B. The functional blocks of the device are, optionally, implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described embodiments. It is understood by persons of skill in the art that the functional blocks described in FIG. 14 are, optionally, combined or separated into sub-blocks to implement the principles of the various described embodiments. Therefore, the description herein optionally supports any possible combination or separation or further definition of the functional blocks described herein. - As shown in
FIG. 14, an electronic device 1400 includes an event detection unit 1402 configured to detect the occurrence of an event, an alert output unit 1404 configured to output an alert, a sensing unit 1406 configured to detect an activity level, and a processing unit 1410 coupled to the event detection unit 1402, the alert output unit 1404, and the sensing unit 1406. In some embodiments, the processing unit 1410 includes a threshold determining unit 1412 configured to determine whether an activity level exceeds a threshold. - The
processing unit 1410 is configured to detect an event (e.g., using the event detection unit 1402) and, in response to detecting the event, in accordance with a determination that an activity level exceeds a threshold (e.g., using the threshold determining unit 1412), forgo outputting an alert, and, in accordance with a determination that the activity level does not exceed the threshold (e.g., using the threshold determining unit 1412), output the alert (e.g., using the alert output unit 1404). - In some embodiments, the activity level is determined using a motion sensor of the electronic device (e.g., using the sensing unit 1406). In some embodiments, the activity level is determined using a motion sensor of the electronic device to detect a number of motion events over a predetermined time (e.g., using the sensing unit 1406).
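The forgo-then-deliver behavior described in this section can be sketched as a small state machine. This is an illustrative model only: both thresholds are placeholders (the disclosure leaves the values implementation-defined), and the low-activity threshold is deliberately set lower than the forgo threshold, matching the text's note that the two can differ.

```python
class DeferredAlert:
    """Forgo an alert while the user is active; deliver it once activity
    drops below a separate low-activity threshold (hysteresis)."""

    def __init__(self, threshold=0.7, low_threshold=0.3):
        self.threshold = threshold          # hypothetical forgo threshold
        self.low_threshold = low_threshold  # hypothetical low-activity threshold
        self.pending = False

    def on_event(self, activity_level):
        """Handle a detected event given the current activity level."""
        if activity_level > self.threshold:
            self.pending = True     # forgo output, remember the event
            return None
        return "alert"

    def on_activity_sample(self, activity_level):
        """Periodic activity check; deliver any deferred alert when calm."""
        if self.pending and activity_level < self.low_threshold:
            self.pending = False
            return "alert"          # deliver the previously forgone alert
        return None
```

The gap between the two thresholds keeps a briefly pausing user from being interrupted the instant their activity dips, which is the usual motivation for hysteresis in this kind of gate.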
- In some embodiments, the
processing unit 1410 is further configured to, after forgoing outputting the alert, detect that the activity level has dropped below a low-activity threshold (e.g., using the threshold determining unit 1412) and output the alert (e.g., using the alert output unit 1404). In some embodiments, the low-activity threshold is different than the threshold. In some embodiments, the processing unit 1410 is further configured to, after a predetermined amount of time after forgoing the alert, make a subsequent determination (e.g., using the threshold determining unit 1412) whether the threshold has been exceeded, and, in accordance with the subsequent determination that the activity level exceeds the threshold (e.g., using the threshold determining unit 1412), forgo outputting an alert, and, in accordance with the subsequent determination that the activity level does not exceed the threshold (e.g., using the threshold determining unit 1412), output the alert (e.g., using the alert output unit 1404). - In accordance with some embodiments,
FIG. 15 shows a functional block diagram of an electronic device 1500 configured in accordance with the principles of the various described embodiments. In particular, the electronic device 1500 can be used to perform the process 720 described above with respect to FIG. 7C. The functional blocks of the device are, optionally, implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described embodiments. It is understood by persons of skill in the art that the functional blocks described in FIG. 15 are, optionally, combined or separated into sub-blocks to implement the principles of the various described embodiments. Therefore, the description herein optionally supports any possible combination or separation or further definition of the functional blocks described herein. - As shown in
FIG. 15, an electronic device 1500 includes an event detection unit 1502 configured to detect the occurrence of an event, an alert output unit 1504 configured to output an alert, and a processing unit 1510 coupled to the event detection unit 1502 and the alert output unit 1504. In some embodiments, the processing unit 1510 includes a threshold determining unit 1512 that is configured to determine whether a number of events exceeds a threshold. - The
processing unit 1510 is configured to detect an event (e.g., using the event detection unit 1502). In response to detecting the event, the processing unit 1510 is further configured to, in accordance with a determination that a number of events that have been detected over a predetermined period exceeds a threshold (e.g., using the threshold determining unit 1512), output an alert (e.g., using the alert output unit 1504), and, in accordance with a determination that the number of events that have been detected over the predetermined period does not exceed the threshold (e.g., using the threshold determining unit 1512), forgo outputting the alert. - In some embodiments, the
processing unit 1510 is further configured to detect a subsequent event (e.g., using the event detection unit 1502), and in response to detecting the subsequent event: if an alert associated with a previous event has been forgone, and in accordance with a determination that the number of events that have been detected over the predetermined period exceeds the threshold (e.g., using the threshold determining unit 1512), output the alert (e.g., using the alert output unit 1504), wherein the alert is based, at least in part, on the previous event whose alert was forgone. In some embodiments, the determination as to whether the number of events that have been detected over the predetermined period exceeds the threshold (e.g., using the threshold determining unit 1512) includes counting the event. In some embodiments, the determination as to whether the number of events that have been detected over the predetermined period exceeds the threshold (e.g., using the threshold determining unit 1512) includes counting one or more prior events that were detected within the predetermined period before the event was detected. In some embodiments, the alert includes information indicative of the event and one or more prior events occurring prior to the event. In some embodiments, a strength of the alert (e.g., using the alert output unit 1504) corresponds to the number of detected events. In some embodiments, the strength of the alert corresponds to a frequency and a type of detected events. In some embodiments, the event includes one or more of: receiving an e-mail, receiving a phone call, receiving a message, and receiving a calendar reminder. - In accordance with some embodiments,
FIG. 16 shows a functional block diagram of an electronic device 1600 configured in accordance with the principles of the various described embodiments. In particular, the electronic device 1600 can be used to perform the process 730 described above with respect to FIG. 7D. The functional blocks of the device are, optionally, implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described embodiments. It is understood by persons of skill in the art that the functional blocks described in FIG. 16 are, optionally, combined or separated into sub-blocks to implement the principles of the various described embodiments. Therefore, the description herein optionally supports any possible combination or separation or further definition of the functional blocks described herein. - As shown in
FIG. 16, an electronic device 1600 includes an alert output unit 1602 configured to output an alert, an input unit 1604 configured to receive an interaction from the user, and a processing unit 1610 coupled to the alert output unit 1602 and the input unit 1604. In some embodiments, the processing unit 1610 includes a detection unit 1612 configured to cooperate with the input unit 1604 to detect an interaction received from the user, and a selecting unit 1614 configured to select a modified alert sequence. - The
processing unit 1610 is configured to output a portion of an alert sequence (e.g., using the alert output unit 1602). The alert sequence includes a predetermined sequence of alert outputs. The processing unit 1610 is also configured to detect an interaction from the user (e.g., using the detection unit 1612) during the output of the portion of the alert sequence (e.g., using the alert output unit 1602). In response to detecting the interaction, the processing unit 1610 is further configured to select a modified alert sequence (e.g., using the selecting unit 1614) and output the modified alert sequence (e.g., using the alert output unit 1602). - In some embodiments, the alert sequence includes a series of alert outputs that escalate in intensity over time. In some embodiments, the modified alert sequence is a non-escalating alert sequence. In some embodiments, the alert sequence is a sequence of alerts that correspond to a single event. In some embodiments, the modified alert sequence is a silent alert sequence having no audio component. In some embodiments, the input received at the
input unit 1604 includes a request to reduce an intrusiveness of the portion of the alert sequence, and the modified alert sequence has a reduced intrusiveness. In some embodiments, the input received at the input unit 1604 includes a request to increase an intrusiveness of the portion of the alert sequence, and the modified alert sequence has an increased intrusiveness. - In accordance with some embodiments,
FIG. 17 shows a functional block diagram of an electronic device 1700 configured in accordance with the principles of the various described embodiments. In particular, the electronic device 1700 can be used to perform the process 740 described above with respect to FIG. 7E. The functional blocks of the device are, optionally, implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described embodiments. It is understood by persons of skill in the art that the functional blocks described in FIG. 17 are, optionally, combined or separated into sub-blocks to implement the principles of the various described embodiments. Therefore, the description herein optionally supports any possible combination or separation or further definition of the functional blocks described herein. - As shown in
FIG. 17, an electronic device 1700 includes an event detection unit 1702 configured to detect an event, an alert output unit 1704 configured to output an alert, a communication unit 1706 that is configured to conduct communication between the electronic device and an external device, and a processing unit 1710 coupled to the event detection unit 1702 and the alert output unit 1704. In some embodiments, the processing unit 1710 includes a proximity determining unit 1712 configured to determine if a second device is proximate to the electronic device and a device selection unit 1714 configured to select a device to output an alert. - The
processing unit 1710 is configured to detect an event (e.g., using the event detection unit 1702). In response to detecting the event, the processing unit 1710 is configured to, in accordance with a determination that a second device is in proximity to the first device (e.g., using the proximity determining unit 1712), select an alert-output device (e.g., using the device selection unit 1714), and output the alert on the alert-output device (e.g., using the alert output unit 1704). In some embodiments, the alert is not output on a device that is not selected as the alert-output device. In some embodiments, the alert is relayed to the second device using the first device (e.g., using the communication unit 1706). In some embodiments, the first device is a mobile phone and the second device is a wearable computing device. In some embodiments, a communication channel is established between the second device and the first device using a pairing operation (e.g., using the communication unit 1706). - In some embodiments, at least one additional device is in proximity to the first device, and the alert-output device is selected (e.g., using the device selection unit 1714) from the first device, the second device, and the at least one additional device. In some embodiments, the alert-output device is selected (e.g., using the device selection unit 1714) based on a user-provided prioritization. In some embodiments, if the user does not interact with the alert-output device after the alert is sent, then a second alert is sent using a device that was not selected as the alert-output device. In some embodiments, the first and second devices are updated in response to the detected event, and the alert is only output on the selected alert-output device.
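The routing behavior just described — pick one proximate device by a user-provided prioritization, output the alert only there, and fall back to another device if the user does not interact — can be illustrated with a minimal Python sketch. The disclosure contains no code; `Device`, `route_alert`, and `acknowledged` are illustrative names, not terms from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class Device:
    """Hypothetical stand-in for a phone or wearable near the user."""
    name: str
    priority: int                        # user-provided prioritization (lower = preferred)
    alerts: list = field(default_factory=list)

def route_alert(devices, event, acknowledged=lambda d: False):
    """Select one alert-output device by priority and alert only it;
    if the user does not interact, escalate to the remaining devices."""
    ordered = sorted(devices, key=lambda d: d.priority)
    target = ordered[0]
    target.alerts.append(event)          # alert only on the selected device
    if not acknowledged(target):         # no interaction: send a second alert elsewhere
        for other in ordered[1:]:
            other.alerts.append(event)
    return target
```

A caller would supply a real `acknowledged` predicate (e.g., "did the user dismiss the alert within N seconds"); the stub here always reports no interaction, so the fallback path runs.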
- In some embodiments, the alert-output device is selected (e.g., using the device selection unit 1714) based on a usage of one or more of: the first device and the second device, wherein the usage includes a time of usage, and either the first or second device having a time of usage that is most recent is selected as the alert-output device.
- In some embodiments, the alert-output device is selected (e.g., using the device selection unit 1714) based on a usage of one or more of: the first device and the second device, wherein the usage includes a time of usage and an amount of usage, and either the first or second device having an amount of usage that is greater over a predetermined time period is selected as the alert-output device.
- In some embodiments, the alert-output device is selected (e.g., using the device selection unit 1714) based on a usage of one or more of: the first device and the second device, wherein the usage includes a type of usage, and either the first or second device having a type of usage that corresponds to a predetermined usage type is selected as the alert-output device.
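The usage-based selection criteria above (most recent time of usage, or greatest amount of usage over a predetermined period) can be sketched as two small Python helpers. The function names and the `(timestamp, seconds_used)` record shape are assumptions for illustration only.

```python
import time

def select_by_recency(usage_log):
    """Pick the device whose most recent use is latest.
    usage_log maps a device name to (timestamp, seconds_used) records."""
    return max(usage_log, key=lambda d: max((t for t, _ in usage_log[d]), default=0.0))

def select_by_amount(usage_log, window, now=None):
    """Pick the device with the most accumulated usage within the
    trailing window of `window` seconds."""
    now = time.time() if now is None else now
    return max(usage_log, key=lambda d: sum(s for t, s in usage_log[d] if now - t <= window))
```

Note the two criteria can disagree: a device used briefly a moment ago wins on recency, while a device used heavily earlier in the window wins on amount.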
- In accordance with some embodiments,
FIG. 18 shows a functional block diagram of an electronic device 1800 configured in accordance with the principles of the various described embodiments. In particular, the electronic device 1800 can be used to perform the process 750 described above with respect to FIG. 7F. The functional blocks of the device are, optionally, implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described embodiments. It is understood by persons of skill in the art that the functional blocks described in FIG. 18 are, optionally, combined or separated into sub-blocks to implement the principles of the various described embodiments. Therefore, the description herein optionally supports any possible combination or separation or further definition of the functional blocks described herein. - As shown in
FIG. 18, an electronic device 1800 includes an input unit 1802 configured to receive an input from the user, an alert output unit 1804 configured to output an alert, and a processing unit 1810 coupled to the input unit 1802 and the alert output unit 1804. In some embodiments, the processing unit 1810 includes a detection unit 1812 configured to cooperate with the input unit 1802 to detect a property of the input provided by the user, and a selecting unit 1814 configured to select a modified alert sequence. - The
input unit 1802 is configured to receive a first input on the device, the first input being below an input threshold. In response to detecting the first input (e.g., using the detection unit 1812), the processing unit 1810 is configured to produce a first output (e.g., using the alert output unit 1804). The first output includes a haptic component for the first input that is coordinated with an audio component for the first input. The input unit 1802 is also configured to receive a second input on the device. In response to detecting the second input (e.g., using the detection unit 1812), the processing unit 1810 is configured to produce a second output (e.g., using the alert output unit 1804). The processing unit 1810 is further configured to, in accordance with a determination that the second input is below the input threshold (e.g., using the determining unit 1816), produce a second output (e.g., using the alert output unit 1804) that includes a haptic component for the second input that is coordinated with an audio component for the second input. The processing unit 1810 is further configured to, in accordance with a determination that the second input is above the input threshold (e.g., using the determining unit 1816), produce a second output (e.g., using the alert output unit 1804) that includes a modified haptic component for the second input. - In some embodiments, the haptic component for the first input is synchronized with the audio component for the first input; if the second input is below the input threshold, the haptic component for the second input is synchronized with the audio component for the second input; and if the second input is above the input threshold, the haptic component for the second input is asynchronous with respect to the audio component for the second input.
In some embodiments, the input threshold includes a speed threshold, the synchronous haptic component is a discrete haptic output that corresponds to a discrete audio output, and the asynchronous haptic component is a continuous haptic output.
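The speed-threshold behavior above — discrete haptic clicks synchronized with discrete click sounds for slow input, a continuous haptic with no synchronized audio for fast input — reduces to a simple branch. A minimal Python sketch; the function name, the threshold value, and the detents-per-second unit are illustrative assumptions, not values from the disclosure.

```python
def rotary_feedback(rotation_speed, speed_threshold=5.0):
    """Map input speed (detents/second, illustrative) to a feedback mode.
    At or below the threshold: one discrete haptic per detent, synchronized
    with a discrete click sound. Above it: a continuous haptic, with the
    audio decoupled (asynchronous) from the haptic output."""
    if rotation_speed <= speed_threshold:
        return {"haptic": "discrete", "audio": "click", "synchronized": True}
    return {"haptic": "continuous", "audio": None, "synchronized": False}
```

In a real device the returned mode would drive a haptic actuator and audio channel; here it is just a dictionary so the branch logic is visible.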
- In some embodiments, the
input unit 1802 receives a first input and a second input that are rotation inputs. In some embodiments, the input unit 1802 receives a rotation that is a circular motion on a touch-sensitive region of the electronic device. In some embodiments, the input unit 1802 receives a rotation that is a rotation of a physical knob integrated into the device. In some embodiments, the alert output unit 1804 produces an audio component that includes a series of click sounds that correspond to changes in angular position of the knob. In some embodiments, the input unit 1802 receives first and second inputs that are scrolling inputs for a display of the electronic device. In some embodiments, the alert output unit 1804 produces an audio component that includes a series of click sounds that correspond to movement through a list of items on the display of the electronic device. - Embodiments of the present disclosure are described above with reference to block diagrams and operational illustrations of methods and the like. The operations described may occur out of the order shown in any of the figures. Additionally, one or more operations may be removed or executed substantially concurrently. For example, two blocks shown in succession may be executed substantially concurrently. Additionally, the blocks may be executed in the reverse order.
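The list-scrolling feedback described above — a series of click sounds corresponding to movement through a list of items — amounts to emitting one click per item the selection crosses. A minimal sketch, assuming scroll positions measured in list rows (the function name is hypothetical):

```python
def clicks_for_scroll(positions):
    """Given successive scroll positions (in list rows), return how many
    click sounds to emit at each step: one per row the selection crosses."""
    return [abs(b - a) for a, b in zip(positions, positions[1:])]
```
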
- The description and illustration of one or more embodiments provided in this disclosure are not intended to limit or restrict the scope of the present disclosure as claimed. The embodiments, examples, and details provided in this disclosure are considered sufficient to convey possession and enable others to make and use the best mode of the claimed embodiments. Additionally, the claimed embodiments should not be construed as being limited to any embodiment, example, or detail provided above. Regardless of whether shown and described in combination or separately, the various features, including structural features and methodological features, are intended to be selectively included or omitted to produce an embodiment with a particular set of features. Having been provided with the description and illustration of the present application, one skilled in the art may envision variations, modifications, and alternate embodiments falling within the spirit of the broader aspects of the embodiments described herein that do not depart from the broader scope of the claimed embodiments.
Claims (21)
1-9. (canceled)
10. A method, comprising:
at an electronic device:
detecting an alert condition; and
in response to detecting the alert condition:
in accordance with a determination that ambient light detected by the electronic device is at a first ambient light level, generating a first alert that corresponds to the alert condition, wherein the first alert comprises a first haptic alert component; and
in accordance with a determination that the ambient light detected by the electronic device is at a second ambient light level that is different from the first ambient light level, generating a second alert that corresponds to the alert condition without generating the first haptic alert component.
11. The method of claim 10, wherein the second alert comprises a second haptic alert component that has a lower amplitude than the first haptic alert component.
12. The method of claim 10, wherein:
the first alert further comprises a first audio alert component; and
the second alert comprises a second audio alert component that has a lower amplitude than the first audio alert component.
13. The method of claim 12, wherein:
the second alert comprises a second haptic alert component that has a lower amplitude than the first haptic alert component;
the first haptic alert component overlaps in time with the first audio alert component; and
the second haptic alert component does not overlap in time with the second audio alert component.
14. The method of claim 12, wherein the second audio alert component has a shorter duration than the first audio alert component.
15. The method of claim 10, wherein:
the first alert further comprises a first audio alert component; and
the second alert comprises a second audio alert component that has a different frequency than the first audio alert component.
16. The method of claim 11, wherein:
the first alert further comprises a first audio alert component;
the second alert comprises a second audio alert component;
the first haptic alert component overlaps in time with the first audio alert component; and
the second haptic alert component does not overlap in time with the second audio alert component.
17. An electronic device, comprising:
a haptic output generator;
a battery;
an ambient light sensor;
one or more processors;
memory; and
one or more programs stored in the memory and configured to be executed by the one or more processors, and including instructions for:
detecting an alert condition; and
in response to detecting the alert condition:
in accordance with a determination that ambient light detected by the ambient light sensor is at a first ambient light level, generating a first alert that corresponds to the alert condition, wherein the first alert comprises a first haptic alert component produced by the haptic output generator; and
in accordance with a determination that the ambient light detected by the ambient light sensor is at a second ambient light level that is different from the first ambient light level, generating a second alert that corresponds to the alert condition without generating the first haptic alert component.
18. The electronic device of claim 17, wherein the second alert comprises a second haptic alert component produced by the haptic output generator that has a lower amplitude than the first haptic alert component.
19. The electronic device of claim 17, wherein the second alert comprises a second haptic alert component produced by the haptic output generator that has a shorter duration than the first haptic alert component.
20. The electronic device of claim 17, wherein:
the electronic device further comprises an audio output generator;
the first alert further comprises a first audio alert component produced by the audio output generator; and
the second alert comprises a second audio alert component produced by the audio output generator.
21. The electronic device of claim 20, wherein the second audio alert component has a lower amplitude than the first audio alert component.
22. The electronic device of claim 20, wherein the second audio alert component has a different frequency than the first audio alert component.
23. The electronic device of claim 20, wherein:
the second alert comprises a second haptic alert component produced by the haptic output generator;
the first haptic alert component overlaps in time with the first audio alert component; and
the second haptic alert component does not overlap in time with the second audio alert component.
24. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by an electronic device, cause the electronic device to:
detect an alert condition; and
in response to detecting the alert condition:
in accordance with a determination that ambient light detected by the electronic device is at a first ambient light level, generate a first alert that corresponds to the alert condition, wherein the first alert comprises a first haptic alert component; and
in accordance with a determination that the ambient light detected by the electronic device is at a second ambient light level that is different from the first ambient light level, generate a second alert that corresponds to the alert condition without generating the first haptic alert component.
25. The non-transitory computer readable storage medium of claim 24, wherein the second alert comprises a second haptic alert component that has a lower amplitude than the first haptic alert component.
26. The non-transitory computer readable storage medium of claim 24, wherein the second alert comprises a second haptic alert component that has a different frequency than the first haptic alert component.
27. The non-transitory computer readable storage medium of claim 24, wherein the second alert comprises a second haptic alert component that has a shorter duration than the first haptic alert component.
28. The non-transitory computer readable storage medium of claim 24, wherein:
the first alert further comprises a first audio alert component; and
the second alert further comprises a second audio alert component that has a lower amplitude than the first audio alert component.
29. The non-transitory computer readable storage medium of claim 28, wherein:
the second alert comprises a second haptic alert component;
the first haptic alert component overlaps in time with the first audio alert component; and
the second haptic alert component does not overlap in time with the second audio alert component.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/516,451 US20220058935A1 (en) | 2014-09-02 | 2021-11-01 | Context-Based Alerts for an Electronic Device |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462044657P | 2014-09-02 | 2014-09-02 | |
US14/503,339 US9659482B2 (en) | 2014-09-02 | 2014-09-30 | Context-based alerts for an electronic device |
US15/595,593 US10210743B2 (en) | 2014-09-02 | 2017-05-15 | Context-based alerts for an electronic device |
US16/226,535 US10685553B2 (en) | 2014-09-02 | 2018-12-19 | Context-based alerts for an electronic device |
US16/900,440 US20210035435A1 (en) | 2014-09-02 | 2020-06-12 | Context-Based Alerts for an Electronic Device |
US17/516,451 US20220058935A1 (en) | 2014-09-02 | 2021-11-01 | Context-Based Alerts for an Electronic Device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/900,440 Division US20210035435A1 (en) | 2014-09-02 | 2020-06-12 | Context-Based Alerts for an Electronic Device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220058935A1 true US20220058935A1 (en) | 2022-02-24 |
Family
ID=55403131
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/503,339 Active US9659482B2 (en) | 2014-09-02 | 2014-09-30 | Context-based alerts for an electronic device |
US15/595,593 Active US10210743B2 (en) | 2014-09-02 | 2017-05-15 | Context-based alerts for an electronic device |
US16/226,535 Active US10685553B2 (en) | 2014-09-02 | 2018-12-19 | Context-based alerts for an electronic device |
US16/900,440 Abandoned US20210035435A1 (en) | 2014-09-02 | 2020-06-12 | Context-Based Alerts for an Electronic Device |
US17/516,451 Abandoned US20220058935A1 (en) | 2014-09-02 | 2021-11-01 | Context-Based Alerts for an Electronic Device |
Family Applications Before (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/503,339 Active US9659482B2 (en) | 2014-09-02 | 2014-09-30 | Context-based alerts for an electronic device |
US15/595,593 Active US10210743B2 (en) | 2014-09-02 | 2017-05-15 | Context-based alerts for an electronic device |
US16/226,535 Active US10685553B2 (en) | 2014-09-02 | 2018-12-19 | Context-based alerts for an electronic device |
US16/900,440 Abandoned US20210035435A1 (en) | 2014-09-02 | 2020-06-12 | Context-Based Alerts for an Electronic Device |
Country Status (2)
Country | Link |
---|---|
US (5) | US9659482B2 (en) |
WO (1) | WO2016036672A1 (en) |
Families Citing this family (101)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8487759B2 (en) | 2009-09-30 | 2013-07-16 | Apple Inc. | Self adapting haptic device |
US8717152B2 (en) | 2011-02-11 | 2014-05-06 | Immersion Corporation | Sound to haptic effect conversion system using waveform |
US9715276B2 (en) * | 2012-04-04 | 2017-07-25 | Immersion Corporation | Sound to haptic effect conversion system using multiple actuators |
US9280637B2 (en) | 2012-10-05 | 2016-03-08 | Cerner Innovation, Inc. | Multi-action button for mobile devices |
US10275570B2 (en) | 2012-12-31 | 2019-04-30 | Cerner Innovation, Inc. | Closed loop alert management |
US9185202B2 (en) | 2012-12-31 | 2015-11-10 | Cerner Innovation, Inc. | Alert management utilizing mobile devices |
US9753436B2 (en) | 2013-06-11 | 2017-09-05 | Apple Inc. | Rotary input mechanism for an electronic device |
KR102231031B1 (en) | 2013-08-09 | 2021-03-23 | 애플 인크. | Tactile switch for an electronic device |
US10545657B2 (en) | 2013-09-03 | 2020-01-28 | Apple Inc. | User interface for manipulating user interface objects |
US10503388B2 (en) | 2013-09-03 | 2019-12-10 | Apple Inc. | Crown input for a wearable electronic device |
CN110262711B (en) | 2013-09-03 | 2023-03-03 | 苹果公司 | User interface object manipulation in a user interface |
US11068128B2 (en) | 2013-09-03 | 2021-07-20 | Apple Inc. | User interface object manipulations in a user interface |
CN105683865B (en) | 2013-09-30 | 2018-11-09 | 苹果公司 | Magnetic actuator for haptic response |
US9317118B2 (en) | 2013-10-22 | 2016-04-19 | Apple Inc. | Touch surface for simulating materials |
WO2015076695A1 (en) * | 2013-11-25 | 2015-05-28 | Yandex Llc | System, method and user interface for gesture-based scheduling of computer tasks |
US10276001B2 (en) | 2013-12-10 | 2019-04-30 | Apple Inc. | Band attachment mechanism with haptic response |
US10048802B2 (en) | 2014-02-12 | 2018-08-14 | Apple Inc. | Rejection of false turns of rotary inputs for electronic devices |
WO2015163842A1 (en) | 2014-04-21 | 2015-10-29 | Yknots Industries Llc | Apportionment of forces for multi-touch input devices of electronic devices |
EP3161581A1 (en) | 2014-06-27 | 2017-05-03 | Apple Inc. | Electronic device with rotatable input mechanism for navigating calendar application |
US10190891B1 (en) | 2014-07-16 | 2019-01-29 | Apple Inc. | Optical encoder for detecting rotational and axial movement |
JP6538825B2 (en) | 2014-09-02 | 2019-07-03 | アップル インコーポレイテッドApple Inc. | Semantic framework for variable haptic output |
WO2016036414A1 (en) | 2014-09-02 | 2016-03-10 | Apple Inc. | Button functionality |
TWI676127B (en) | 2014-09-02 | 2019-11-01 | 美商蘋果公司 | Method, system, electronic device and computer-readable storage medium regarding electronic mail user interface |
US9659482B2 (en) | 2014-09-02 | 2017-05-23 | Apple Inc. | Context-based alerts for an electronic device |
US10235014B2 (en) | 2014-09-02 | 2019-03-19 | Apple Inc. | Music user interface |
US10073590B2 (en) | 2014-09-02 | 2018-09-11 | Apple Inc. | Reduced size user interface |
US9830782B2 (en) | 2014-09-02 | 2017-11-28 | Apple Inc. | Haptic notifications |
KR102130259B1 (en) | 2014-09-02 | 2020-07-03 | 애플 인크. | Wearable electronic device |
KR102269797B1 (en) * | 2014-10-08 | 2021-06-28 | 엘지전자 주식회사 | Wearable device |
US20160125721A1 (en) * | 2014-10-29 | 2016-05-05 | Verizon Patent And Licensing Inc. | Alerting users when a user device is dropped |
US10365807B2 (en) | 2015-03-02 | 2019-07-30 | Apple Inc. | Control of system zoom magnification using a rotatable input mechanism |
US10145711B2 (en) | 2015-03-05 | 2018-12-04 | Apple Inc. | Optical encoder with direction-dependent optical properties having an optically anisotropic region to produce a first and a second light distribution |
US10353467B2 (en) | 2015-03-06 | 2019-07-16 | Apple Inc. | Calibration of haptic devices |
WO2016144919A1 (en) | 2015-03-08 | 2016-09-15 | Apple Inc. | Compressible seal for rotatable and translatable input mechanisms |
US10768704B2 (en) | 2015-03-17 | 2020-09-08 | Whirlwind VR, Inc. | System and method for modulating a peripheral device based on an unscripted feed using computer vision |
US10129713B2 (en) * | 2015-03-24 | 2018-11-13 | Htc Corporation | Method for generating notification and a mobile device and system using the same |
AU2016100399B4 (en) | 2015-04-17 | 2017-02-02 | Apple Inc. | Contracting and elongating materials for providing input and output for an electronic device |
US10018966B2 (en) | 2015-04-24 | 2018-07-10 | Apple Inc. | Cover member for an input mechanism of an electronic device |
US9939923B2 (en) * | 2015-06-19 | 2018-04-10 | Microsoft Technology Licensing, Llc | Selecting events based on user input and current context |
US9881486B2 (en) * | 2015-06-26 | 2018-01-30 | International Business Machines Corporation | Wearable device for automatic detection of emergency situations |
JP2017050810A (en) * | 2015-09-04 | 2017-03-09 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | Control method, communication terminal, communication system and program |
US10566888B2 (en) | 2015-09-08 | 2020-02-18 | Apple Inc. | Linear actuators for use in electronic devices |
DE112015006894T5 (en) * | 2015-09-11 | 2018-06-07 | Intel IP Corporation | Wireless network access of portable devices |
US9955021B1 (en) * | 2015-09-18 | 2018-04-24 | 8X8, Inc. | Analysis of call metrics for call direction |
US9811987B2 (en) * | 2015-10-01 | 2017-11-07 | International Business Machines Corporation | Detecting object theft using smart textiles |
US10607728B2 (en) | 2015-10-06 | 2020-03-31 | Cerner Innovation, Inc. | Alert optimizer |
US10037411B2 (en) * | 2015-12-30 | 2018-07-31 | Cerner Innovation, Inc. | Intelligent alert suppression |
JP2017127458A (en) * | 2016-01-20 | 2017-07-27 | セイコーエプソン株式会社 | Athletic performance measuring device |
US9891651B2 (en) | 2016-02-27 | 2018-02-13 | Apple Inc. | Rotatable input mechanism having adjustable output |
US10039080B2 (en) | 2016-03-04 | 2018-07-31 | Apple Inc. | Situationally-aware alerts |
US10268272B2 (en) | 2016-03-31 | 2019-04-23 | Apple Inc. | Dampening mechanical modes of a haptic actuator using a delay |
US10551798B1 (en) | 2016-05-17 | 2020-02-04 | Apple Inc. | Rotatable crown for an electronic device |
US10893113B2 (en) * | 2016-06-06 | 2021-01-12 | International Business Machines Corporation | Generating push notifications |
DK179657B1 (en) | 2016-06-12 | 2019-03-13 | Apple Inc. | Devices, methods and graphical user interfaces for providing haptic feedback |
DK179823B1 (en) * | 2016-06-12 | 2019-07-12 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
US10061399B2 (en) | 2016-07-15 | 2018-08-28 | Apple Inc. | Capacitive gap sensor ring for an input device |
US10019097B2 (en) | 2016-07-25 | 2018-07-10 | Apple Inc. | Force-detecting input structure |
DK179278B1 (en) | 2016-09-06 | 2018-03-26 | Apple Inc | Devices, methods and graphical user interfaces for haptic mixing |
DK201670720A1 (en) | 2016-09-06 | 2018-03-26 | Apple Inc | Devices, Methods, and Graphical User Interfaces for Generating Tactile Outputs |
US10666751B1 (en) | 2016-12-28 | 2020-05-26 | Wells Fargo Bank, N.A. | Notification system and method |
EP3580921A1 (en) * | 2017-02-09 | 2019-12-18 | Sony Mobile Communications Inc. | System and method for controlling notifications in an electronic device according to user status |
JP6838434B2 (en) * | 2017-03-13 | 2021-03-03 | Omron Corporation | Environment sensor |
US10848578B1 (en) | 2017-04-11 | 2020-11-24 | Wells Fargo Bank, N.A. | Systems and methods for content delivery |
US10798180B1 (en) * | 2017-04-11 | 2020-10-06 | Wells Fargo Bank, N.A. | Systems and methods for optimizing information collaboration |
DK201770372A1 (en) | 2017-05-16 | 2019-01-08 | Apple Inc. | Tactile feedback for locked device user interfaces |
KR20230117638A (en) * | 2017-05-16 | 2023-08-08 | 애플 인크. | Image data for enhanced user interactions |
GB2562758B (en) * | 2017-05-24 | 2021-05-12 | Sony Interactive Entertainment Inc | Input device and method |
US10270270B2 (en) | 2017-06-04 | 2019-04-23 | Apple Inc. | Coordinating complementary notifications across related computing devices connected to a wireless charging apparatus |
US10664074B2 (en) * | 2017-06-19 | 2020-05-26 | Apple Inc. | Contact-sensitive crown for an electronic watch |
US10622538B2 (en) | 2017-07-18 | 2020-04-14 | Apple Inc. | Techniques for providing a haptic output and sensing a haptic input using a piezoelectric body |
US10962935B1 (en) | 2017-07-18 | 2021-03-30 | Apple Inc. | Tri-axis force sensor |
US10957445B2 (en) | 2017-10-05 | 2021-03-23 | Hill-Rom Services, Inc. | Caregiver and staff information system |
KR102518400B1 (en) * | 2017-11-22 | 2023-04-06 | Samsung Electronics Co., Ltd. | Method for providing vibration and electronic device for supporting the same |
KR102222133B1 (en) * | 2018-01-12 | 2021-03-03 | NHN Corporation | Mobile terminal and method for management application of the mobile terminal and target advertisement providing system using the same |
US20190373072A1 (en) * | 2018-05-30 | 2019-12-05 | Lenovo (Singapore) Pte. Ltd. | Event notification |
US11360440B2 (en) | 2018-06-25 | 2022-06-14 | Apple Inc. | Crown for an electronic watch |
US10917180B2 (en) * | 2018-07-24 | 2021-02-09 | Comcast Cable Communications, Llc | Controlling vibration output from a computing device |
US11561515B2 (en) | 2018-08-02 | 2023-01-24 | Apple Inc. | Crown for an electronic watch |
CN209560398U (en) | 2018-08-24 | 2019-10-29 | Apple Inc. | Electronic watch |
US11181863B2 (en) | 2018-08-24 | 2021-11-23 | Apple Inc. | Conductive cap for watch crown |
CN209625187U (en) | 2018-08-30 | 2019-11-12 | Apple Inc. | Electronic watch and electronic equipment |
US11194298B2 (en) | 2018-08-30 | 2021-12-07 | Apple Inc. | Crown assembly for an electronic watch |
US11435830B2 (en) | 2018-09-11 | 2022-09-06 | Apple Inc. | Content-based tactile outputs |
DK179888B1 (en) * | 2018-09-11 | 2019-08-27 | Apple Inc. | CONTENT-BASED TACTILE OUTPUTS |
CN110134248B (en) * | 2018-09-11 | 2020-12-11 | 苹果公司 | Content-based haptic output |
US10599223B1 (en) | 2018-09-28 | 2020-03-24 | Apple Inc. | Button providing force sensing and/or haptic output |
US10691211B2 (en) | 2018-09-28 | 2020-06-23 | Apple Inc. | Button providing force sensing and/or haptic output |
US10719130B1 (en) * | 2018-12-27 | 2020-07-21 | Apple Inc. | Haptic actuator including overmolded field member and related methods |
US11194299B1 (en) * | 2019-02-12 | 2021-12-07 | Apple Inc. | Variable frictional feedback device for a digital crown of an electronic watch |
US20200302768A1 (en) * | 2019-03-22 | 2020-09-24 | Eaton Intelligent Power Limited | Locating device |
US10996761B2 (en) | 2019-06-01 | 2021-05-04 | Apple Inc. | User interfaces for non-visual output of time |
US11380470B2 (en) | 2019-09-24 | 2022-07-05 | Apple Inc. | Methods to control force in reluctance actuators based on flux related parameters |
JPWO2021220339A1 (en) * | 2020-04-27 |
US11550268B2 (en) | 2020-06-02 | 2023-01-10 | Apple Inc. | Switch module for electronic crown assembly |
EP4264460A1 (en) | 2021-01-25 | 2023-10-25 | Apple Inc. | Implementation of biometric authentication |
US11977683B2 (en) | 2021-03-12 | 2024-05-07 | Apple Inc. | Modular systems configured to provide localized haptic feedback using inertial actuators |
CN113220144B (en) * | 2021-03-15 | 2022-06-07 | Honor Device Co., Ltd. | Touch control pen |
US12092996B2 (en) | 2021-07-16 | 2024-09-17 | Apple Inc. | Laser-based rotation sensor for a crown of an electronic watch |
US12088681B2 (en) * | 2021-07-24 | 2024-09-10 | VMware LLC | Synchronization of notification actions across multiple enrolled devices |
US11809631B2 (en) | 2021-09-21 | 2023-11-07 | Apple Inc. | Reluctance haptic engine for an electronic device |
US11657699B1 (en) * | 2021-11-15 | 2023-05-23 | Dish Network L.L.C. | Methods and systems for outputting alerts on user interfaces |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030153366A1 (en) * | 2002-01-30 | 2003-08-14 | Nec Corporation | Cellular phone set and incoming call notification control method used therein |
US20060116175A1 (en) * | 2004-11-29 | 2006-06-01 | Cisco Technology, Inc. | Handheld communications device with automatic alert mode selection |
US20110175932A1 (en) * | 2010-01-21 | 2011-07-21 | Tobii Technology Ab | Eye tracker based contextual action |
US20140240122A1 (en) * | 2014-02-27 | 2014-08-28 | Fitbit, Inc. | Notifications on a User Device Based on Activity Detected By an Activity Monitoring Device |
US20150298169A1 (en) * | 2014-04-17 | 2015-10-22 | Lenovo (Singapore) Pte. Ltd. | Actuating vibration element on device based on sensor input |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6477117B1 (en) * | 2000-06-30 | 2002-11-05 | International Business Machines Corporation | Alarm interface for a smart watch |
US6993349B2 (en) * | 2001-07-18 | 2006-01-31 | Kyocera Wireless Corp. | Smart ringer |
US20040127198A1 (en) * | 2002-12-30 | 2004-07-01 | Roskind James A. | Automatically changing a mobile device configuration based on environmental condition |
US7162026B2 (en) * | 2003-07-25 | 2007-01-09 | William J Furnas | Alert muting method through indirect contact for portable devices |
US20060014569A1 (en) * | 2004-07-13 | 2006-01-19 | Broadcom Corporation | Mobile communication device with adaptive audible user notification |
US7136482B2 (en) * | 2004-10-26 | 2006-11-14 | Motorola, Inc. | Progressive alert indications in a communication device |
DE112005003333T5 (en) * | 2005-01-27 | 2007-12-13 | Fujitsu Ltd. | Electronic device, call notification control system and call notification control program |
US7570975B2 (en) * | 2005-10-26 | 2009-08-04 | Motorola, Inc. | Method and apparatus for management of low-battery mobile stations |
CN101018242A (en) * | 2006-02-11 | 2007-08-15 | Hongfujin Precision Industry (Shenzhen) Co., Ltd. | A mobile communication device and method for automatically adjusting the ring mode |
US7747293B2 (en) * | 2006-10-17 | 2010-06-29 | Marvell World Trade Ltd. | Display control for cellular phone |
US8260366B2 (en) * | 2007-09-28 | 2012-09-04 | AT&T Intellectual Property I, L.P. | Automatic setting of an alert mode on a wireless device |
US8836502B2 (en) * | 2007-12-28 | 2014-09-16 | Apple Inc. | Personal media device input and output control based on associated conditions |
US8428053B2 (en) * | 2009-02-26 | 2013-04-23 | Plantronics, Inc. | Presence based telephony call signaling |
US8195194B1 (en) * | 2010-11-02 | 2012-06-05 | Google Inc. | Alarm for mobile communication device |
US9526127B1 (en) * | 2011-11-18 | 2016-12-20 | Google Inc. | Affecting the behavior of a user device based on a user's gaze |
US9168419B2 (en) * | 2012-06-22 | 2015-10-27 | Fitbit, Inc. | Use of gyroscopes in personal fitness tracking devices |
US8823507B1 (en) * | 2012-09-19 | 2014-09-02 | Amazon Technologies, Inc. | Variable notification alerts |
US20140085077A1 (en) * | 2012-09-26 | 2014-03-27 | Aliphcom | Sedentary activity management method and apparatus using data from a data-capable band for managing health and wellness |
US20140253323A1 (en) * | 2013-03-06 | 2014-09-11 | Lifescan, Inc. | Low analyte level alert system from continuous analyte monitor |
US9143898B1 (en) * | 2013-04-22 | 2015-09-22 | Amazon Technologies, Inc. | Automatically selecting alert modes based on location |
KR20150006195A (en) * | 2013-07-08 | 2015-01-16 | LG Electronics Inc. | Wearable device and the method for controlling the same |
US9451100B2 (en) * | 2013-08-28 | 2016-09-20 | Samsung Electronics Co., Ltd. | Method for transmitting notification information and electronic device thereof |
WO2015035098A2 (en) * | 2013-09-04 | 2015-03-12 | Zero360, Inc. | Processing system and method |
WO2015065494A1 (en) * | 2013-11-04 | 2015-05-07 | Bodhi Technology Ventures Llc | Detecting stowing or unstowing of a mobile device |
CN106231997A (en) * | 2014-07-07 | 2016-12-14 | Shenzhen Goodix Technology Co., Ltd. | Intelligent watch |
WO2016017997A1 (en) * | 2014-07-31 | 2016-02-04 | Samsung Electronics Co., Ltd. | Wearable glasses and method of providing content using the same |
US9659482B2 (en) | 2014-09-02 | 2017-05-23 | Apple Inc. | Context-based alerts for an electronic device |
- 2014-09-30: US 14/503,339 → US9659482B2 (Active)
- 2015-08-31: PCT/US2015/047814 → WO2016036672A1 (Application Filing)
- 2017-05-15: US 15/595,593 → US10210743B2 (Active)
- 2018-12-19: US 16/226,535 → US10685553B2 (Active)
- 2020-06-12: US 16/900,440 → US20210035435A1 (Abandoned)
- 2021-11-01: US 17/516,451 → US20220058935A1 (Abandoned)
Also Published As
Publication number | Publication date |
---|---|
US20210035435A1 (en) | 2021-02-04 |
US20190122528A1 (en) | 2019-04-25 |
US10210743B2 (en) | 2019-02-19 |
US20160063850A1 (en) | 2016-03-03 |
WO2016036672A1 (en) | 2016-03-10 |
US9659482B2 (en) | 2017-05-23 |
US20170337804A1 (en) | 2017-11-23 |
US10685553B2 (en) | 2020-06-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220058935A1 (en) | Context-Based Alerts for an Electronic Device | |
US10133351B2 (en) | Providing haptic output based on a determined orientation of an electronic device | |
JP6955603B2 (en) | Providing priming cues for electronic device users | |
US9794402B2 (en) | Updating device behavior based on user behavior | |
AU2017240907B2 (en) | Sharing updatable graphical user interface elements | |
US20200288008A1 (en) | Digital device and method for controlling the same | |
US9075435B1 (en) | Context-aware notifications | |
US20160198319A1 (en) | Method and system for communicatively coupling a wearable computer with one or more non-wearable computers | |
AU2013230800B2 (en) | Portable electronic device and method for controlling operation thereof based on user motion | |
US10091143B2 (en) | Dynamic rule-based notifications | |
WO2017071059A1 (en) | Communication method, apparatus and system for wearable device | |
US20150172441A1 (en) | Communication management for periods of inconvenience on wearable devices | |
CN106462223A (en) | Automatic sending of an electronic message to a caller indicating a called user will return the incoming call in a time frame corresponding to a numerical count of detected user gesture(s) | |
KR20140132232A (en) | Smart watch and method for controlling thereof | |
US10499203B2 (en) | Method for generating notification and a mobile device and system using the same | |
EP3850880B1 (en) | Fall detection - audio looping | |
US20180120930A1 (en) | Use of Body-Area Network (BAN) as a Kinetic User Interface (KUI) | |
CN109361822A (en) | Incoming call reminding method, mobile terminal and computer readable storage medium | |
EP3319304B1 (en) | Terminal control method and accessory device | |
US9197760B2 (en) | Hand activated mode setting | |
JP2018164140A (en) | Portable terminal, program, and event transmitting/receiving system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |