
US20140197963A1 - Portable monitoring devices and methods of operating the same - Google Patents

Portable monitoring devices and methods of operating the same

Info

Publication number
US20140197963A1
Authority
US
United States
Prior art keywords
user
activity
data
implementations
monitoring device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/029,759
Inventor
James Park
Shelten Gee Jao Yuen
Eric Nathan Friedman
Christine Boomer Brumback
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fitbit LLC
Original Assignee
Fitbit LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fitbit LLC
Priority to US14/029,759 (US20140197963A1)
Assigned to FITBIT, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: FRIEDMAN, ERIC NATHAN; BRUMBACK, CHRISTINE BOOMER; PARK, JAMES; YUEN, SHELTEN GEE JAO
Priority to US14/062,717 (US8903671B2)
Publication of US20140197963A1
Assigned to MORGAN STANLEY SENIOR FUNDING, INC., AS COLLATERAL AGENT. SECURITY INTEREST. Assignors: FITBIT, INC.
Assigned to SILICON VALLEY BANK, AS ADMINISTRATIVE AGENT. SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: FITBIT, INC.
Priority to CN201410475447.4A (CN104434315B)
Priority to CN202110530090.5A (CN113367689A)
Priority to CN201710251926.1A (CN107260178B)
Priority to US14/524,909 (US9286789B2)
Assigned to FitStar, Inc. and FITBIT, INC. RELEASE OF SECURITY INTEREST IN PATENTS. Assignors: MORGAN STANLEY SENIOR FUNDING, INC.
Assigned to FitStar, Inc. and FITBIT, INC. TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS. Assignors: SILICON VALLEY BANK, AS ADMINISTRATIVE AGENT
Priority to US15/017,356 (US9600994B2)
Priority to US15/427,638 (US10134256B2)
Priority to US16/167,386 (US11423757B2)
Priority to US17/892,400 (US12002341B2)
Current legal status: Abandoned

Classifications

    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • G08B21/0446 Sensor means for detecting worn on the body to detect changes of posture, e.g. a fall, inclination, acceleration, gait
    • A61B5/0022 Monitoring a patient using a global network, e.g. telephone networks, internet
    • A61B5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/002 Monitoring the patient using a local or closed circuit, e.g. in a room or building
    • A61B5/0082 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence, adapted for particular medical purposes
    • A61B5/01 Measuring temperature of body parts; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
    • A61B5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B5/02055 Simultaneously evaluating both cardiovascular condition and temperature
    • A61B5/021 Measuring pressure in heart or blood vessels
    • A61B5/024 Detecting, measuring or recording pulse rate or heart rate
    • A61B5/1116 Determining posture transitions
    • A61B5/1118 Determining activity level
    • A61B5/112 Gait analysis
    • A61B5/1122 Determining geometric values, e.g. centre of rotation or angular range of movement, of movement trajectories
    • A61B5/1123 Discriminating type of movement, e.g. walking or running
    • A61B5/1455 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue, using optical sensors, e.g. spectral photometrical oximeters
    • A61B5/4806 Sleep evaluation
    • A61B5/6802 Sensor mounted on worn items
    • A61B5/681 Wristwatch-type devices
    • A61B5/7475 User input or interface means, e.g. keyboard, pointing device, joystick
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • G06F16/22 Indexing; Data structures therefor; Storage structures
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/06 Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
    • G08B21/18 Status alarms
    • G08B25/10 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems, characterised by the transmission medium using wireless transmission systems
    • G08B5/22 Visible signalling systems, e.g. personal calling systems, remote indication of seats occupied, using electric transmission; using electromagnetic transmission
    • G08B6/00 Tactile signalling systems, e.g. personal calling systems
    • G08C17/02 Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • G16H40/67 ICT specially adapted for the management or operation of medical equipment or devices, for the operation of medical equipment or devices for remote operation
    • G16Z99/00 Subject matter not provided for in other main groups of this subclass
    • A61B2562/0219 Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches

Definitions

  • the Fitbit Ultra (manufactured by Fitbit Inc. headquartered in San Francisco, Calif.) is a biometric monitoring device that is approximately 2′′ long, 0.75′′ wide, and 0.5′′ deep.
  • the Fitbit Ultra has a pixelated display, battery, sensors, wireless communications capability, power source, and interface button, as well as an integrated clip for attaching the device to a pocket or other portion of clothing, all packaged within this small volume.
  • in one aspect of the disclosed implementations, a device includes one or more motion sensors for sensing motion of the device and providing activity data indicative of the sensed motion.
  • the device also includes one or more processors for monitoring the activity data, and receiving or generating annotation data for annotating the activity data with one or more markers or indicators to define one or more characteristics of an activity session.
  • the device also includes one or more feedback devices for providing feedback, a notice, or an indication to a user based on the monitoring.
  • the device further includes a portable housing that encloses at least portions of the motion sensors, the processors and the feedback devices.
  • the device further includes a memory and the processors are further configured to store the activity data or data derived from the activity data in the memory. In some such implementations, the processors are further configured to store the annotation data in the memory. In some implementations, the processors are further configured to determine one or more activity metrics based on the activity data. In some such implementations, the processors are further configured to determine one or more activity metrics based on the annotation data. In some implementations, the device further includes one or more user input devices included in or on the housing for receiving or sensing user input, and the processors are further configured to receive and interpret the user input received or sensed via the user input devices.
  • the motion sensors themselves also can function as user input devices by sensing a user's touch, tapping, or other physical gesture made on or with the device.
  • the device further includes transmitting and receiving circuitry.
  • the user input is input by the user via an external or remote device and then communicated to the receiving circuitry.
  • the processors are further configured to receive and interpret the user input received from the receiving circuitry.
  • the transmitting and receiving circuitry is configured for wireless communication over a computer network, and the user input is input via a web or mobile application.
  • the device is configured to enable a user to input the annotation data via user input.
  • the user input is input based on a physical interaction with the device that is then interpreted by the processors.
  • the user input is input to an external device that is communicatively-coupled with the device via one or more wired or wireless connections or networks.
  • the device is configured to enable a user to initiate, indicate, or mark a start time or an end time of a user-defined activity session in response to user input received via one of the one or more user interface devices.
  • the device is configured to enable a user to indicate a particular user activity performed during a user-defined activity session in response to user input received via one of the one or more user interface devices.
  • the device is configured to automatically annotate the activity data or to generate the annotation data. In some such implementations, the device is configured to automatically annotate the activity data or to generate the annotation data based on a device state. In some implementations, the device is configured to automatically annotate the activity data or to generate the annotation data based on an analysis of the activity data. In some implementations, the processors are configured to automatically determine the activity data to track or the activity metrics to calculate based on the annotation data.
  • the housing includes a wrist- or arm-band, is configured for physical attachment to or coupling with a wrist- or arm-band, or is configured to be inserted into a wrist- or arm-band.
  • the device further includes one or more of: one or more gyroscopes, one or more physiological sensors, one or more biometric sensors, one or more altitude sensors, one or more temperature sensors, and one or more heart-rate sensors.
  • a method of monitoring one or more activity metrics using a portable monitoring device includes sensing, by one or more motion sensors within the device, motion of the device. The method also includes outputting, by the one or more motion sensors, activity data indicative of the sensed movement. The method also includes receiving, by one or more processors within the device, the activity data. The method additionally includes receiving or generating, by the one or more processors, annotation data. The method further includes annotating the activity data based on the annotation data.
  • the method further includes receiving the annotation data from a user. In some other implementations, the method includes generating the annotation data based on the activity data. In some implementations, the processors are further configured to determine or calculate one or more activity metrics based on the activity data and the annotation data.
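  • A minimal sketch of how annotation data might be attached to monitored activity data, as described above, follows. It is an illustration only; the class and field names (ActivitySample, Annotation, ActivityLog) are assumptions, not structures from the disclosure.

```python
# Illustrative sketch only; names and structures are assumptions, not the
# patent's actual implementation.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ActivitySample:
    timestamp: float          # seconds since epoch
    accel: tuple              # (x, y, z) acceleration, arbitrary units

@dataclass
class Annotation:
    start_time: float                 # marks the start of an activity session
    end_time: Optional[float] = None  # marks the end, once known
    activity: Optional[str] = None    # e.g. "walking", "running", "sleeping"

@dataclass
class ActivityLog:
    samples: List[ActivitySample] = field(default_factory=list)
    annotations: List[Annotation] = field(default_factory=list)

    def annotate(self, annotation: Annotation) -> None:
        """Attach user- or device-generated annotation data to the log."""
        self.annotations.append(annotation)

    def session_samples(self, annotation: Annotation) -> List[ActivitySample]:
        """Return the samples that fall inside an annotated activity session."""
        end = annotation.end_time or float("inf")
        return [s for s in self.samples
                if annotation.start_time <= s.timestamp <= end]
```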
  • in still another aspect of the disclosed implementations, a device includes one or more motion sensors for sensing motion of the device and providing activity data indicative of the sensed motion.
  • the device also includes one or more processors for monitoring the activity data, and for switching among a plurality of activity-tracking modes.
  • the device also includes one or more feedback devices for providing feedback, a notice, or an indication to a user based on the monitoring.
  • the device further includes a portable housing that encloses at least portions of the motion sensors, the processors and the feedback devices.
  • the processors are further configured to determine one or more activity metrics based on the activity data. In some implementations, the processors are further configured to determine the one or more activity metrics based on which of the activity-tracking modes is currently active. In some implementations, the device is configured to enable a user to set or select a particular activity-tracking mode to switch into based on user input. In some implementations, the device is configured to automatically determine or select a particular activity-tracking mode to switch into in response to the activity data. In some implementations, the plurality of activity-tracking modes include one or more activity-specific activity-tracking modes and a sleep-tracking mode.
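  • The sketch below illustrates mode-dependent metric selection of the kind described in this aspect. The mode names and metric lists are assumptions for illustration, not the modes or metrics defined by the disclosure.

```python
# Hypothetical sketch of switching among activity-tracking modes and selecting
# which metrics to compute based on the currently active mode.
ACTIVITY_TRACKING_MODES = {
    "default":  ["steps", "calories"],
    "walking":  ["steps", "distance", "calories"],
    "running":  ["steps", "distance", "pace", "calories"],
    "swimming": ["stroke_count", "lap_count", "distance"],
    "sleep":    ["sleep_duration", "sleep_phases"],
}

class ModeController:
    def __init__(self):
        self.active_mode = "default"

    def switch_mode(self, mode: str) -> None:
        """Switch modes in response to user input or to an automatic
        determination made from the activity data."""
        if mode not in ACTIVITY_TRACKING_MODES:
            raise ValueError(f"unknown activity-tracking mode: {mode}")
        self.active_mode = mode

    def metrics_to_track(self) -> list:
        """The metrics computed depend on which mode is currently active."""
        return ACTIVITY_TRACKING_MODES[self.active_mode]

controller = ModeController()
controller.switch_mode("running")
print(controller.metrics_to_track())  # ['steps', 'distance', 'pace', 'calories']
```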
  • FIG. 1 depicts a block diagram of an example portable monitoring device.
  • FIG. 2 depicts a portable monitoring device that may be inserted into a holder with a belt clip or into a pocket on a wristband.
  • FIG. 3 depicts a portable monitoring device that may be worn on a person's forearm like a wristwatch.
  • FIG. 4 depicts another example of a portable monitoring device that may be worn on a person's forearm.
  • the present disclosure relates generally to portable monitoring devices (also referred to herein as “portable tracking devices” or simply as “devices”), and more particularly, to wearable monitoring devices including wearable biometric monitoring devices.
  • portable monitoring devices capable of monitoring and tracking movements or activities and related data.
  • the portable monitoring device can include one or more motion sensors for detecting movement data or various other biometric, physiological, or environmental sensors for detecting biometric data, physiological data, environmental data, or related data (hereinafter also collectively referred to as “activity data”).
  • the portable monitoring device includes a general or default activity-tracking mode.
  • the default activity-tracking mode is an “annotation mode.”
  • the activity data monitored or tracked (hereinafter “monitored” and “tracked” may be used interchangeably) while in the annotation mode can be annotated or otherwise marked to indicate, specify, or delineate the starting and ending time points, a duration, or other time points of or within an activity session.
  • an “activity session” may generally refer to a user-defined duration of time, or a duration of time associated with a particular activity or time of day, in which the device is monitoring activity data.
  • the activity data monitored while in the default annotation mode also can be annotated or otherwise marked to indicate, specify, or define a specific activity that is being performed by the user during the activity session such as, for example, walking, running, stair climbing, bicycling, swimming, or even sleeping.
  • the user can annotate the activity data prior to, during, or after completion of an associated activity.
  • one or more activity metrics can be determined, calculated, or analyzed based on the activity data.
  • the activity metrics are communicated to the user via a display, lighting, noise, or via vibrational or haptic feedback.
  • one or more achieved goals, progress indicators, alerts, or other activity-based notifications may be communicated to the user based on one or more of the activity metrics.
  • one or more alarms, reminders, or other time-, physiologically-, biometrically-, state-, condition-, or environment-based notifications also can be communicated to the user.
  • Such achieved goals, progress indicators, alerts, or other notifications can be communicated to the user via a display, lighting, noise, or via vibrational or haptic feedback.
  • the portable monitoring device is capable of switching among two or more modes such as two or more activity-tracking modes.
  • the two or more activity-tracking modes include one or more activity-specific activity-tracking modes including, for example, a walking mode, a running mode, a stair-climbing mode, a bicycling mode, a swimming mode, a climbing mode, and a golfing mode, among other example activity-tracking modes configured for other corresponding activities.
  • the two or more activity-tracking modes also can include a sleep-tracking mode.
  • the portable monitoring device itself can determine which activity data to monitor or which activity (or sleep) metrics (hereinafter “sleep metrics” also may generally be referred to as “activity metrics”) to determine, compute, calculate, track or analyze (hereinafter used interchangeably) based on which of the activity-tracking modes is currently active or initiated.
  • an external computing device or a back-end server can request certain activity data from the portable monitoring device based on which of the activity-tracking modes is currently active or initiated.
  • one or both of an external computing device or a back-end server can receive all activity data monitored by the portable monitoring device and subsequently filter or otherwise selectively process certain activity data to determine certain activity (or sleep) metrics based on which of the activity-tracking modes is currently active or initiated.
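  • A short, hedged sketch of the back-end filtering just described: the field names and mode keys are illustrative assumptions, not part of any actual Fitbit server API.

```python
# Sketch of server-side filtering: keep only the activity data relevant to the
# currently active tracking mode before computing metrics.
RELEVANT_FIELDS_BY_MODE = {
    "swimming": {"stroke_count", "lap_count", "timestamp"},
    "sleep":    {"movement_level", "heart_rate", "timestamp"},
}

def filter_for_mode(records: list, active_mode: str) -> list:
    """Selectively retain the fields needed to compute metrics for the mode."""
    keep = RELEVANT_FIELDS_BY_MODE.get(active_mode)
    if keep is None:
        return records  # unknown mode: pass everything through unfiltered
    return [{k: v for k, v in rec.items() if k in keep} for rec in records]
```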
  • FIG. 1 depicts a block diagram of an example portable monitoring device 100 .
  • the portable monitoring device 100 includes one or more sensors 102 .
  • the portable monitoring device 100 also includes a processing unit 104 , a memory 106 , a user interface 108 , and input/output (I/O) interface 110 .
  • the one or more sensors 102 , the processing unit 104 , the memory 106 , the user interface 108 , and the I/O interface 110 are communicatively connected with one or more of one another via one or more communication paths collectively referred to as communication bus 112 .
  • the portable monitoring device 100 further includes a power source 114 , such as, for example, one or more rechargeable or removable batteries.
  • the sensors 102 include one or more motion sensors configured for sensing and outputting movement data indicative of the motion of the portable monitoring device 100 .
  • the motion sensors can include one or more accelerometers for sensing movement data.
  • the portable monitoring device 100 includes one or more accelerometers for sensing acceleration or other movement data in each of, for example, three directions, which may be orthogonal.
  • the sensors 102 additionally can include one or more gyroscopes for sensing rotation data.
  • the portable monitoring device 100 includes one or more gyroscopes for sensing rotation about each of, for example, three axes, which may be orthogonal. As will be described later, movement data and rotation data also can be used to capture user input.
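  • As a rough illustration of sampling three orthogonal acceleration directions and three rotation axes as described above, consider the sketch below; the sensor-reading functions are hypothetical placeholders, not real device APIs.

```python
# Minimal sketch of collecting movement and rotation data; placeholder readers
# stand in for real accelerometer and gyroscope hardware.
import random

def read_accelerometer() -> tuple:
    """Return (ax, ay, az) for three orthogonal directions (placeholder)."""
    return tuple(random.gauss(0.0, 1.0) for _ in range(3))

def read_gyroscope() -> tuple:
    """Return (gx, gy, gz) rotation about three orthogonal axes (placeholder)."""
    return tuple(random.gauss(0.0, 0.1) for _ in range(3))

def sample_motion(n_samples: int, rate_hz: float = 50.0) -> list:
    """Collect movement and rotation data; such data can also be used later to
    capture user input such as taps or other gestures."""
    samples = []
    for i in range(n_samples):
        samples.append({
            "t": i / rate_hz,
            "accel": read_accelerometer(),
            "gyro": read_gyroscope(),
        })
    return samples
```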
  • the sensors 102 additionally can include one or more altimeters (hereinafter also referred to as “altitude sensors”).
  • the portable monitoring device 100 can include a pressure or barometric altimeter.
  • the sensors 102 additionally can include one or more temperature sensors for sensing one or both of a temperature of the environment outside of the user's body or an internal temperature of the user's body.
  • the sensors 102 additionally can include one or more light sensors (for example, photodetectors).
  • the light sensors can be used to detect an ambient light level of the environment for use in, for example, determining a suitable or optimal intensity of a display of the portable monitoring device 100 .
  • Other light sensors can be used to gather other biometric data such as an oxygen level of the user's blood.
  • the sensors 102 also can include one or more pressure or proximity sensors for receiving user input. Such pressure or proximity sensors can be based on mechanical designs or on, for example, capacitive, resistive, or other electrical or optical designs.
  • the portable monitoring device 100 also can be coupled with external sensing devices such as an external heart rate monitor (for example, a chest-strap heart rate monitor) for sensing a user's heart rate.
  • the portable monitoring device 100 also can include or be coupled with other physiological or biometric sensors.
  • the portable monitoring device 100 additionally is configured to sense or monitor one or more other types of biometric data or to measure, calculate, or determine biometric data based on the movement, rotation, or other data described above.
  • Biometric data as used herein, may refer to virtually any data pertaining to the physical characteristics of the human body, and as described above, activity data may also refer to such biometric data.
  • the processing unit 104 can include one or more processors, processing circuits, computing units, computing circuits, controllers, or microcontrollers (hereinafter used interchangeably). Each of the processors can be implemented by a general- or special-purpose processor (or set of processing cores) and can execute sequences of programmed instructions (“code”) to perform tasks and effectuate various operations. Additionally or alternatively, the processing unit 104 can include a custom-built hardware ASIC (application-specific integrated circuit), or can be programmed into a programmable logic device such as an FPGA (field-programmable gate array).
  • the memory 106 stores executable instructions that, when executed by the processing unit 104 , cause the processing unit 104 to control one or more of the sensors 102 , to sample or extract data received from the sensors 102 , to store the received data in the memory 106 , and to retrieve or load data previously stored in the memory 106 .
  • the activity data received from the sensors 102 may be stored in raw format in the memory 106 by the processing unit 104 , or may be pre-processed prior to storage in the memory 106 .
  • the processing unit 104 may store or buffer the most recent 10 minutes of activity data in raw form, and may store data from prior to the ten-minute window as filtered data (for example, at a lower sampling rate or after some form of numerical analysis, such as a moving average) or as converted data: for example, acceleration data may be converted to activity metrics such as “steps taken,” “stairs climbed,” or “distance traveled.”
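  • The following is a minimal sketch of that storage scheme, keeping a recent raw window and reducing older data to moving-average values; the window length, sampling rate, and decimation factor are assumptions for illustration.

```python
# Sketch of a raw ring buffer plus a lower-rate, filtered archive of old data.
from collections import deque

RAW_WINDOW_SECONDS = 10 * 60     # most recent 10 minutes kept raw
RAW_RATE_HZ = 50                 # assumed raw sampling rate
DECIMATION = 50                  # archive one averaged value per second of old data

raw_buffer = deque(maxlen=RAW_WINDOW_SECONDS * RAW_RATE_HZ)
archived = []                    # lower-rate, filtered history

def store_sample(value: float) -> None:
    """Buffer the newest sample; when the raw window is full, archive the
    oldest block as a single moving-average value."""
    if len(raw_buffer) == raw_buffer.maxlen:
        block = [raw_buffer.popleft() for _ in range(DECIMATION)]
        archived.append(sum(block) / len(block))
    raw_buffer.append(value)
```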
  • Activity data from the sensors 102 may be further analyzed to determine if the activity data is indicative of a pre-defined biometric state or condition that is associated with a user input. If such analysis indicates that such data has been collected, the processing unit 104 may then treat such an event as equivalent to a user input. In some implementations, the processing unit 104 also may derive secondary data based on the raw activity data. In some implementations, the processing unit 104 also performs an analysis on the raw activity data received from the sensors 102 or on raw or previously-processed (“post-processed”) activity data retrieved from the memory 106 and initiates various actions based on the analysis.
  • the processing unit 104 may track, determine, compute, calculate or analyze one or more physiological, biometric, activity or sleep metrics (hereinafter collectively referred to as “activity metrics”) based on the raw, pre-processed or secondary activity data (also collectively referred to herein generally as “activity data”).
  • the memory 106 can include any suitable memory architecture.
  • the memory 106 includes different classes of storage devices or units to store different classes of data.
  • the memory 106 includes non-volatile storage media, such as fixed or removable semiconductor-, optical-, or magnetic-based media, for storing executable code (also referred to as “executable instructions”) and related data for enabling the implementations and carrying out the processes described herein.
  • the memory 106 also is configured for storing configuration data or other information for implementing various default or user-defined settings, or for implementing the default or activity-specific activity-tracking modes described herein.
  • the memory 106 also can be configured for storing other configuration data used during the execution of various programs or instruction sets or otherwise used to configure the portable monitoring device 100 .
  • any of the afore-described raw activity data generated by the sensors 102 can be stored in the non-volatile storage media within the memory 106 for later analysis or other use.
  • the activity metrics calculated by the processing unit 104 or received from an external computing device or server also can be stored in the non-volatile storage media within the memory 106 for later analysis, viewing or other use.
  • the memory 106 also includes volatile storage media, such as static or dynamic random-access memory (RAM), for temporarily or non-permanently storing more transient information and other variable data as well as, in some implementations, executable code retrieved from the non-volatile storage media.
  • the volatile storage media may also store any of the afore-described data generated by sensors 102 or data derived from sensed data (for example, including activity- or sleep tracking metrics) for later analysis, later storage in non-volatile media within memory 106 , or for subsequent communication over a wired or wireless connection via I/O interface 110 .
  • the processing unit 104 can additionally or alternatively include its own volatile memory, such as RAM, for loading executable code from non-volatile memory for execution by the processing unit 104 , or for tracking, analyzing, or otherwise processing any of the afore-described data generated by sensors 102 or data derived from sensed data (for example, including activity- or sleep tracking metrics) for later analysis, later storage in non-volatile media within memory 106 , or for subsequent communication over a wired or wireless connection via I/O interface 110 .
  • the processing unit 104 also can be configured to track and determine when the activity data received from the sensors 102 or retrieved from the memory 106 , or the activity metrics generated from such activity data, indicate that a goal has been achieved or a progress point has been reached.
  • a goal can be a specific activity metric such as a distance, a number of steps, an elevation change, or a number of calories burned, among other goals as described in more detail below.
  • the processing unit 104 may then notify the user of the achievement of the goal or progress indicator via the user interface 108 .
  • the processing unit 104 may cause the display to show content marking or celebrating the achievement of the goal.
  • the processing unit 104 may cause one or more lights (for example, LEDs) to light up, flash, change intensity, or otherwise reflect a visual pattern or display that notifies the user of the achievement of the goal. Additionally or alternatively, the processing unit 104 may cause one or more sound-producing devices to alert, beep or otherwise make noise that notifies the user of the achievement of the goal. Additionally or alternatively, the processing unit 104 may cause one or more vibrating devices to vibrate or otherwise provide haptic feedback in the form of one or more vibration patterns and, in some implementations, with differing or varying vibrational characteristics to notify the user of the achievement of particular goals.
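  • A hedged sketch of the goal-checking and feedback dispatch just described follows. The goal values and channel names are illustrative assumptions; in a real device the notify step would drive the display, LEDs, speaker, or vibramotor rather than print.

```python
# Sketch of checking activity metrics against goals and notifying the user
# over one or more feedback devices (placeholders here).
GOALS = {"steps": 10000, "floors": 10, "calories": 2200}

def check_goals(metrics: dict) -> list:
    """Return the names of any goals that the current activity metrics meet."""
    return [name for name, target in GOALS.items()
            if metrics.get(name, 0) >= target]

def notify(goal: str, channels=("display", "led", "sound", "vibration")) -> None:
    """Stand-in for driving the feedback devices."""
    for channel in channels:
        print(f"[{channel}] goal achieved: {goal}")

achieved = check_goals({"steps": 10250, "floors": 4, "calories": 1800})
for goal in achieved:
    notify(goal)
```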
  • user interface 108 collectively refers to and includes one or more user input devices and one or more output devices.
  • the memory 106 also can store executable instructions that, when executed by the processing unit 104 , cause the processing unit 104 to receive and interpret user input received via the user interface 108 , or to output or communicate information to a user via the user interface 108 .
  • the user interface 108 can incorporate one or more types of user interfaces including, for example, visual, auditory, touch, vibration, or combinations thereof.
  • user interface 108 can include one or more buttons in or on a device housing that encloses the processing unit 104 , the memory 106 and other electrical or mechanical components of the portable monitoring device 100 .
  • buttons can be based on mechanical or electrical designs, and may incorporate, for example, one or more pressure sensors, proximity sensors, resistive sensors, capacitive sensors, or optical sensors.
  • the user interface 108 also can include a touchpad or a touchscreen interface, which may be disposed over or integrated with a display, and which can incorporate these or other types of sensors.
  • the afore-described motion sensors, gyroscopes, or other sensors also can be used to detect a physical gesture corresponding to a user input. This allows a user to interact with the device using physical gestures.
  • accelerometers and gyroscopes can be used to detect when a user “taps,” shakes, rotates, flips or makes other “gestures” with the portable monitoring device.
  • the portable monitoring device 100 can include a magnetometer, which may be used to detect the device's orientation with respect to the Earth's magnetic field.
  • Other gestures that may be used to cause the portable monitoring device 100 to perform some action include, but are not limited to, multiple taps, or a specific pattern of taps. For example, a user may tap anywhere on the exterior (for example, the housing) of the portable monitoring device two times within a specific time period to cause the display to show particular content, to annotate activity data, or to change device modes.
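  • The sketch below illustrates one simple way a double tap could be recognized from accelerometer magnitude, of the kind described above; the threshold, sampling rate, and timing window are assumptions, not values from the disclosure.

```python
# Illustrative double-tap detector over acceleration magnitude (in g).
def detect_double_tap(samples, rate_hz=50.0, threshold=2.5, max_gap_s=0.5):
    """Return True if two tap-like spikes occur within max_gap_s seconds.

    A "tap" is counted when a sample rises above `threshold` from below it.
    """
    tap_times = []
    for i in range(1, len(samples)):
        if samples[i] >= threshold and samples[i - 1] < threshold:
            tap_times.append(i / rate_hz)
    return any(t2 - t1 <= max_gap_s for t1, t2 in zip(tap_times, tap_times[1:]))

# Example: two spikes about 0.2 s apart at 50 Hz
trace = [1.0] * 50
trace[10] = 3.0   # first tap
trace[20] = 3.1   # second tap
print(detect_double_tap(trace))  # True
```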
  • the user interface 108 also can include a display on or in the housing that encloses the processing unit 104 and the memory 106 .
  • the display can be configured as an alphanumeric display, transiently-visible display, or dead-front display.
  • the display also can include or be based on any suitable display technology including liquid crystal display (LCD) technology or light-emitting diode (LED) technology among other suitable display technologies.
  • the display can be configured to display various information to a user.
  • a user can input a selection, navigate through a menu, or input other information via a button, a pressure sensor, a proximity sensor, a resistive sensor, a capacitive sensor, an optical sensor, or a touchscreen incorporating these or other types of sensors.
  • the display can show activity data, biometric data, contextual data, environmental data, system or intrinsic condition data, or data derived from activity or other sensed data, one or more activity metrics, one or more sleep metrics, a currently-active activity-tracking mode, one or more menus, one or more settings, one or more alarms or other indicators, a clock, a timer, a “stopwatch,” among other suitable information.
  • the information that is displayed is customizable by the user or, additionally or alternatively, dependent on a current device state or mode of the portable monitoring device 100 .
  • the data displayed in association with each device state or mode may be partitioned into a plurality of different data display pages, and a user may “advance” through the data display pages associated with a given device state or mode by providing input to the biometric monitoring device.
  • data display page may refer to a visual display including text, graphics, and/or indicators, e.g., LEDs or other lights such as are used on the Fitbit Flex, that are arranged to communicate data measured, produced, or received by a portable monitoring device 100 to a user viewing a display of the portable monitoring device.
  • the portable monitoring device 100 may track its device state through a variety of mechanisms and transition through different device states as contextual states, environmental states, or modes change.
  • the device may include and be capable of operating in multiple active modes, multiple active environmental states, multiple active contextual states, or combinations of these, simultaneously. In such a case, the device state may be different for each different combination of environmental states, contextual states, or modes.
  • an annotation data display page may indicate that the portable monitoring device 100 is in annotation mode.
  • information related to the activity being annotated may be displayed.
  • data display pages for various types of activity data or activity metrics may show quantities measured while the portable monitoring device 100 is in the annotation mode.
  • a data display page for “steps taken” may only display a quantity of steps that have been taken while the portable monitoring device 100 is in the annotation mode or in an activity session defined using annotation data (rather than, for example, the quantity of steps taken throughout the entire day, week, month, year or during the lifetime of the device).
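  • A minimal sketch of such a session-scoped “steps taken” page follows: only steps whose timestamps fall inside the annotated session are counted. The function and argument names are illustrative assumptions.

```python
# Count only the steps that occurred within an annotated activity session.
def steps_in_session(step_events, session_start, session_end=None):
    """step_events: iterable of timestamps (one per detected step)."""
    end = session_end if session_end is not None else float("inf")
    return sum(1 for t in step_events if session_start <= t <= end)

all_steps = [10.0, 65.0, 70.5, 71.2, 300.0]        # timestamps in seconds
print(steps_in_session(all_steps, session_start=60.0, session_end=120.0))  # 3
```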
  • the processing unit 104 may decrease the sensitivities of various user input detection mechanisms, especially a touchscreen, (or turn the display or the entire device completely off) to reduce the risk of accidental inputs or to save power. In other device states, it may be desirable to change the user input method based on the limitations of various input mechanisms in various environments.
  • if the portable monitoring device 100 determines that it is in a device state associated with swimming (for example, the portable monitoring device 100 can be configured to independently determine through moisture sensors or pressure sensor data that it is in water), or if the portable monitoring device is actively placed into a swimming mode by the user via the user interface 108 , then, in some implementations, a touchscreen interface or other user interface of the portable monitoring device 100 may be deactivated since it may not function well in water. The wearer may instead interact with the portable monitoring device 100 using physical buttons or other appropriate or suitable input mechanisms, including physical gestures sensed by the device.
  • the portable monitoring device 100 and particularly the user interface 108 , also can include other mechanisms to provide feedback or other information to a user.
  • the user interface 108 can include one or more lights, such as one or more LEDs, in addition to the display for communicating information, such as the achievement of a goal, an alarm, an alert, indicator or other notification, a current state, a current mode, or a power level, to the user.
  • the processing unit 104 can control the intensities, colors, or patterns of flashing of one or more of the LEDs of the user interface 108 based on what information is being communicated.
  • the user interface 108 additionally or alternatively includes one or more speakers or sound-producing devices.
  • the user interface 108 also can include one or more microphones or other audio devices.
  • the user interface 108 includes one or more vibramotors (also referred to herein as “vibrators” or simply as “vibrating devices”) for communicating information with or to the user.
  • the processing unit 104 can utilize the vibramotors to communicate one or more alarms, achieved goals, progress indicators, inactivity indicators, reminders, indications that a timer has expired, or other indicators, feedback or notifications to a user wearing or holding the portable monitoring device 100 .
  • the portable monitoring device 100 can utilize the vibramotors to communicate such information to the user in addition to communicating the same or similar information via the display, the lights, or the sound-producing devices.
  • the portable monitoring device 100 can utilize the vibramotors to communicate such information to the user instead of or in lieu of communicating the same or similar information via the display, the lights, or the sound-producing devices.
  • the vibramotors can cause the portable monitoring device 100 to vibrate to wake the user from sleep while not making noise so as to not wake the user's partner.
  • the vibramotors can cause the portable monitoring device 100 to vibrate to alert the user that the user's goal has been achieved or that a milestone or other progress point en route to the goal has been reached without requiring the user to look at a display or hear an indication output from a speaker.
  • a user can define one or more custom vibration patterns or other vibrational characteristics and assign such differing vibration patterns or other vibrational characteristics to different alarms, goals, or other vibrating indicators so that the user can distinguish among the vibrating indicators to determine what information is being communicated by the portable monitoring device 100 . Additionally or alternatively, in some implementations, a user can select one or more default vibration patterns or other vibrational characteristics stored in the memory 106 and assign such differing vibration patterns or other vibrational characteristics to various vibrating indicators.
  • the user can customize such patterns, characteristics, or settings or make such selections via the user interface 108 , or via an application or program (including a web application, mobile application, or client-side software program) executing on an external computing device (for example, a personal computer, smartphone or multimedia device) communicatively coupled with the portable monitoring device 100 via the I/O interface 110 and one or more wired or wireless connections or networks.
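  • The sketch below shows one way vibration patterns could be assigned to different vibrating indicators so the user can tell them apart, as described above. The pattern values (on/off durations in milliseconds) and indicator names are assumptions for illustration.

```python
# Hypothetical mapping of notifications to vibramotor patterns.
VIBRATION_PATTERNS = {
    "short_double": [100, 100, 100],        # on, off, on
    "long_single":  [600],
    "ramp":         [100, 50, 200, 50, 400],
}

assignments = {
    "silent_alarm":  "ramp",
    "goal_achieved": "short_double",
    "inactivity":    "long_single",
}

def vibrate(indicator: str) -> list:
    """Return the on/off pattern the vibramotor would play for an indicator."""
    return VIBRATION_PATTERNS[assignments[indicator]]

print(vibrate("goal_achieved"))  # [100, 100, 100]
```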
  • one or more of the sensors 102 themselves also can be used to implement at least a portion of the user interface 108 .
  • one or more accelerometers or other motion sensors 102 can be used to detect when a person taps the housing of the portable monitoring device 100 with a finger or other object, and then interpret such data as a user input for the purposes of controlling the portable monitoring device 100 .
  • double-tapping the housing of the portable monitoring device 100 may be recognized by the processing unit 104 as a user input that will cause a display of the portable monitoring device to turn on from an off state or that will cause the portable monitoring device to transition between different monitoring states, sessions, or modes.
  • the tapping may cause the processing unit 104 to switch from a state where the portable monitoring device 100 collects and interprets activity data according to rules established for an “active” person to a state where the portable monitoring device collects and interprets activity data according to rules established for a “sleeping” or “resting” person.
  • tapping the housing of the portable monitoring device 100 may be recognized by the processing unit 104 as a user input that will annotate monitored activity data, such as by, for example, indicating a starting or ending time of an activity session of user-defined duration.
  • the tapping may cause the processing unit 104 to switch from one activity-specific activity-tracking mode to another.
  • tapping may cause the processing unit 104 to switch from a walking mode where the portable monitoring device 100 collects and interprets activity data according to rules established for a “walking” person to a bicycling mode where the portable monitoring device interprets data according to rules established for a bicycle rider.
  • the processing unit 104 may communicate activity data received from the sensors 102 or retrieved from the memory 106 via the I/O interface 110 to an external or remote computing device (for example, a personal computer, smartphone or multimedia device) or to a back-end server over one or more computer networks.
  • the I/O interface 110 includes a transmitter and a receiver (also referred to collectively herein as a “transceiver” or simply as “transmitting and receiving circuitry”) that can transmit the activity data or other information through a wired or wireless connection to one or more external computing devices or to one or more back-end servers (either directly via one or more networks or indirectly via an external computing device that first receives the activity data and subsequently communicates the data via one or more networks to the back-end servers).
  • the memory 106 also can store executable instructions that, when executed by the processing unit 104 , cause the processing unit 104 to transmit and receive information via the I/O interface 110 .
  • the one or more computer networks include one or more local-area networks (LANs), private networks, social networks, or wide-area networks (WANs) including the Internet.
  • the I/O interface 110 can include wireless communication functionality so that when the portable monitoring device 100 comes within range of a wireless base station or access point, or within range of certain equipped external computing devices (for example, a personal computer, smartphone or multimedia device), certain activity data or other data is automatically synced or uploaded to the external computing device or back-end server for further analysis, processing, viewing, or storing.
  • the wireless communication functionality of I/O interface 110 may be provided or enabled via one or more communications technologies known in the art such as, for example, Wi-Fi, Bluetooth, RFID, Near-Field Communications (NFC), Zigbee, Ant, optical data transmission, among others. Additionally or alternatively, the I/O interface 110 also can include wired-communication capability, such as, for example, a Universal Serial Bus (USB) interface.
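  • As a hedged sketch of the automatic sync described above, the snippet below uploads buffered activity data to a back-end service when a link is available. The endpoint URL and payload shape are placeholders and assumptions, not Fitbit's actual API.

```python
# Sketch of uploading pending activity records to a back-end server.
import json
import urllib.request

SYNC_URL = "https://example.invalid/api/activity"   # placeholder endpoint

def sync(pending_records: list) -> bool:
    """Attempt to upload pending records; return True on success so the
    caller can clear its local buffer."""
    body = json.dumps({"records": pending_records}).encode("utf-8")
    request = urllib.request.Request(
        SYNC_URL, data=body, headers={"Content-Type": "application/json"})
    try:
        with urllib.request.urlopen(request, timeout=10) as response:
            return response.status == 200
    except OSError:
        return False   # no connectivity or base station in range; retry later
```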
  • one or more back-end servers or computing systems can support a web-based application (“web application”), web site, web page or web portal (hereinafter “web application,” “web page,” “web site,” and “web portal” may be used interchangeably) enabling a user to remotely interact with the portable monitoring device 100 , or to interact with or view the activity data or activity metrics calculated based on the activity data, via any computing device (for example, a personal computer, smartphone or multimedia device) capable of supporting a web browser or other web client suitable for use in rendering the web page or web-based application.
  • the data can be stored at an Internet-viewable or Internet-accessible source such as a web site (for example, www.Fitbit.com) permitting the activity data, or data or activity metrics derived or calculated therefrom, to be viewed, for example, using a web browser or network-based application.
  • a web application, web page, web site or web portal may refer to any structured document or user interface made available for viewing on a client device (for example, a personal computer, smartphone or multimedia device) over any of one or more of the described networks or other suitable networks or communication links.
  • the processing unit 104 may calculate the user's step count based on activity data received from one or more sensors 102 .
  • the processing unit 104 may temporarily store the activity data and calculated step count in the memory 106 .
  • the processing unit 104 may then transmit the step count, or raw or pre-processed activity data representative of the user's step count, via I/O interface 110 to an account on a web service (for example, www.fitbit.com), an external computing device such as a personal computer or a mobile phone (especially a smartphone), or to a health station where the data may be stored, further-processed, and visualized by the user or friends of the user.
  • the activity metrics that can be tracked, determined, calculated, or analyzed by the processing unit 104, or by an external computing device or back-end server based on activity data transmitted from portable monitoring device 100, include one or more of, for example: energy expenditure (for example, calories burned), distance traveled, steps taken, stairs or floors climbed or descended, elevation gained or lost (e.g., based on an altimeter or global positioning satellite (GPS) device), pace, maximum speed, location, direction, heading, ambulatory speed, rotation or distance traveled, swimming stroke count, swimming lap count, swimming distance, bicycle distance, bicycle speed, heart rate, heart rate variability, heart rate recovery, blood pressure, blood glucose, blood oxygen level, skin conduction, skin or body temperature, electromyography data, electroencephalography data, weight, body fat, caloric intake, nutritional intake from food, medication intake, sleep periods, sleep phases, sleep quality, sleep duration, pH levels, hydration levels, and respiration rate.
  • the processing unit 104 also tracks, determines, or calculates metrics related to the environment around the user such as, for example, one or more of: barometric pressure, temperature, humidity, rain/snow conditions, wind speed, other weather conditions, light exposure (ambient light), ultraviolet (UV) light exposure, time or duration spent in darkness, pollen count, air quality, noise exposure, radiation exposure, and magnetic field.
  • the user may input his height, weight, or stride in a user profile on a fitness-tracking website and such information may then be communicated to the portable monitoring device 100 via the I/O interface 110 and used to evaluate, in conjunction with activity data measured by the sensors 102 , the distance traveled or calories burned by the user.
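  • For illustration only, such a calculation might combine profile data with a sensed step count as in the sketch below; the 0.53 kcal per kg per km walking coefficient is an assumed round figure, not a value taken from the disclosure:

        # Hypothetical sketch: distance and calorie estimates from steps plus profile data.
        def estimate_distance_m(step_count, stride_m):
            return step_count * stride_m

        def estimate_calories_kcal(distance_km, weight_kg, kcal_per_kg_km=0.53):
            # The coefficient is an illustrative walking value, not from the disclosure.
            return distance_km * weight_kg * kcal_per_kg_km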
  • a general listing of potential types of sensors 102 and activity data types is shown below in Table 1. This listing is not exclusive, and sensors other than those listed may be used. Moreover, the data that is potentially derivable from the listed sensors may also be derived, either in whole or in part, from other sensors. For example, an evaluation of stairs climbed may involve evaluating altimeter data to determine altitude change, clock data to determine how quickly the altitude changed, and accelerometer data to determine whether the biometric monitoring device is being worn by a person who is walking (as opposed to standing still).
  • Table 1 (excerpt), listing sensor types and the data potentially derived from them:
  • Strain Gauge Sensors: weight; Body Mass Index (BMI) (in conjunction with user-supplied height and gender information, for example). The strain gauges may be located in a device remote from the biometric monitoring device, e.g., a Fitbit Aria™ scale, and communicate weight-related data to the biometric monitoring device, either directly or via a shared account over the Internet.
  • Bioelectrical Impedance Sensors: body fat percentage (such sensors may be included in a remote device, such as the Aria™ scale).
  • Respiration Rate Sensors: respiration rate; sleep apnea detection.
  • Blood Pressure Sensors: systolic blood pressure, diastolic blood pressure.
  • Heart Rate Sensors: heart rate.
  • Blood Glucose Sensors: blood glucose levels.
  • Moisture Sensors: moisture levels; whether user is swimming, showering, bathing, etc.
  • biometric data may be calculated or estimated by the portable monitoring device 100 without direct reference to data obtained from the sensors 102 .
  • for example, a person's basal metabolic rate, which is a measure of the “default” caloric expenditure that a person experiences throughout the day while at rest (in other words, simply to provide energy for basic bodily functions such as breathing, circulating blood, etc.), may be estimated based on information entered by the user via an application or program (including a web application, mobile application, or client-side software program) executing on an external computing device (for example, a personal computer, smartphone or multimedia device).
  • Such user-entered data may be used, in conjunction with data from an internal clock indicating the time of day, to determine how many calories have been expended by a person thus far in the day to provide energy for basic bodily functions.
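  • As a hedged example of such a calculation, the sketch below prorates a basal metabolic rate over the elapsed portion of the day; the Mifflin-St Jeor equation is used only as one common BMR formula and is not specified by the disclosure:

        # Hypothetical sketch: basal calories expended so far today from user-entered data.
        from datetime import datetime

        def bmr_kcal_per_day(weight_kg, height_cm, age_years, is_male):
            # Mifflin-St Jeor equation, used here purely for illustration.
            return 10 * weight_kg + 6.25 * height_cm - 5 * age_years + (5 if is_male else -161)

        def basal_calories_so_far(bmr, now=None):
            now = now or datetime.now()
            elapsed_hours = now.hour + now.minute / 60.0
            return bmr * elapsed_hours / 24.0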
  • the portable monitoring device 100 includes a default activity-tracking mode also referred to herein as an “annotation” mode.
  • the activity data monitored while in the default annotation mode can be annotated or otherwise marked to indicate, specify, or delineate the starting and ending time points or other time points of and within an activity session.
  • an “activity session” may generally refer to a user-defined duration of time, or a duration of time associated with a particular activity or time of day, in which the device is monitoring activity data.
  • the activity data monitored while in the annotation mode also can be annotated or otherwise marked to indicate, specify, or define a specific activity that is being performed by the user during the activity session such as, for example, walking, running, stair climbing, bicycling, swimming, or even sleeping.
  • the user can annotate the activity data prior to, during, or after completion of an associated activity.
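  • A minimal sketch of how such annotation data might be represented, assuming hypothetical Annotation and SessionLog structures keyed by timestamps with an optional activity label:

        # Hypothetical sketch: annotations as lightweight markers referencing activity data by timestamp.
        from dataclasses import dataclass, field
        from typing import List, Optional

        @dataclass
        class Annotation:
            start_ts: float                  # session start (seconds since epoch)
            end_ts: Optional[float] = None   # session end; None while the session is open
            activity: Optional[str] = None   # e.g., "walking", "swimming", "sleeping"

        @dataclass
        class SessionLog:
            annotations: List[Annotation] = field(default_factory=list)

            def begin(self, ts, activity=None):
                self.annotations.append(Annotation(start_ts=ts, activity=activity))

            def end(self, ts):
                if self.annotations and self.annotations[-1].end_ts is None:
                    self.annotations[-1].end_ts = ts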
  • a user can annotate an activity session via physical interactions with the portable monitoring device 100 itself.
  • the user can annotate the activity data using, for example, any of the components described above that may be included within user interface 108 .
  • the user can annotate the activity session via an external or remote computer (for example, a personal computer, a smartphone, or a multimedia device).
  • one or both of the portable monitoring device 100 and a coupled external computing device also can communicate with one or more back-end servers as described above.
  • the portable monitoring device or external computing device can transmit the annotations (also referred to herein as “annotation data”), the activity data, as well as information about the portable monitoring device or the user, to the servers for storage and, in some implementations, for additional processing or analysis.
  • the portable monitoring device 100, and in particular the processing unit 104, is configured to use the sensors 102 to monitor the same type of activity data in the same way regardless of the activity being performed or in which the user is currently engaged. That is, in some implementations, regardless of what activity the user is engaging in, be it walking, running, stair climbing, bicycling, swimming, or even sleeping, the same sensors are used to sense movements or other sensed activity data in the same way.
  • in implementations in which the processing unit 104 is configured to determine, calculate or analyze one or more activity metrics, the processing unit itself can determine which activity metrics to determine, calculate or analyze based on the annotation data received for the activity session.
  • the portable monitoring device 100 can automatically annotate one or more activity sessions.
  • the processing unit 104 can analyze the activity data from the sensors 102 dynamically (for example, substantially in real time) and automatically determine a starting point, an ending point, or other time points for which to record timestamps or store markers or digital flags in the memory 106 to annotate the activity data monitored in an activity session.
  • the processing unit can analyze activity data retrieved from the memory 106 to automatically annotate the stored activity data.
  • the processing unit 104 can transmit the activity data via I/O interface 110 to one or both of an external computing device (for example, a personal computer, a smartphone or a multimedia device) or a back-end server (either directly over one or more wired or wireless networks or indirectly by way of an external computing device, such as a personal computer, a smartphone or a multimedia device, in conjunction with one or more wired or wireless networks) that then automatically annotates the received activity data.
  • the annotation data can be stored with the corresponding activity data; that is, together with the activity data in the same locations within the memory 106 .
  • the annotation data can be stored separately from the activity data within the memory 106 but linked to the activity data by way of, for example, one or more tables and timestamps.
  • the portable monitoring device 100 may record biometric data that indicates that the wearer was largely stationary and horizontal during the time that the biometric monitoring device was in the annotation mode. This, in combination with the time of day that the annotated biometric data was collected, may cause the portable monitoring device to automatically annotate such data as a “sleeping” activity.
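  • As an illustrative sketch, such automatic sleep annotation might reduce to a simple rule over motion, orientation, and time of day; the thresholds below are placeholder assumptions, not values from the disclosure:

        # Hypothetical sketch: rule-based "sleeping" annotation from low motion,
        # horizontal orientation, and night-time hours.
        def auto_annotate_sleep(mean_motion, fraction_horizontal, hour_of_day,
                                motion_thresh=0.05, horiz_thresh=0.9):
            night = hour_of_day >= 21 or hour_of_day < 7
            if night and mean_motion < motion_thresh and fraction_horizontal > horiz_thresh:
                return "sleeping"
            return None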
  • a wearer of the biometric monitoring device may, alternatively, indicate that the annotated biometric data is associated with a particular activity, e.g., by entering a label or other identifier of the activity in association with the annotated data after the biometric data is exported from the portable monitoring device to one or more back-end servers via a website, web application, mobile application, or other application, or by inputting such a label or other identifier into an external computing device (for example, a smartphone, multimedia device, or personal computer) that is paired with the portable monitoring device and within communication range of the portable monitoring device, and particularly the I/O interface 110.
  • the portable monitoring device 100 may automatically detect or determine when the user is attempting to go to sleep, entering sleep, is asleep, or is awoken from a period of sleep.
  • the portable monitoring device 100 may employ physiological, motion or other sensors to acquire activity data.
  • the processing unit 104 then correlates a combination of one or more of: motion, heart rate, heart rate variability, respiration rate, galvanic skin response, or skin or body temperature sensing to detect or determine if the user is attempting to go to sleep, entering sleep, is asleep or is awoken from a period of sleep.
  • the portable monitoring device 100 may, for example, acquire physiological data (such as of the type and in the manner as described herein) or determine physiological conditions of the user (such as of the type and in the manner as described herein). For example, a decrease or cessation of user motion combined with a reduction in user heart rate and/or a change in heart rate variability may indicate that the user has fallen asleep. Subsequent changes in heart rate variability and galvanic skin response may be used to determine transitions of the user's sleep state between two or more stages of sleep (for example, into lighter and/or deeper stages of sleep). Motion by the user and/or an elevated heart rate and/or a change in heart rate variability may be used to determine that the user has awoken.
  • Real-time, windowed, or batch processing may be used to determine the transitions between wake, sleep, and sleep stages, as well as between other activity stages. For instance, a decrease in heart rate may be measured in a time window where the heart rate is elevated at the start of the window and reduced in the middle (and/or end) of the window.
  • the awake and sleep stages may be classified by a hidden Markov model using changes in motion signal (e.g., decreasing intensity), heart rate, heart rate variability, skin temperature, galvanic skin response, and/or ambient light levels.
  • the transition points may be determined through a changepoint algorithm (e.g., Bayesian changepoint analysis).
  • the transition between awake and sleep may be determined by observing periods where the user's heart rate decreases over a predetermined time duration by at least a certain threshold but within a predetermined margin of the user's resting heart rate (that is observed as, for instance, the minimum heart rate of the user while sleeping). Similarly, the transition between sleep and awake may be determined by observing an increase in the user's heart rate above a predetermined threshold of the user's resting heart rate.
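  • A hedged sketch of the heart-rate-threshold approach described above, with assumed window, drop, and margin values:

        # Hypothetical sketch: sleep onset detected as a sustained heart-rate drop toward
        # the user's resting heart rate within a sliding window; wake detected as a rise
        # above a margin over resting heart rate. All numeric values are assumptions.
        def detect_sleep_onset(hr_window, resting_hr, min_drop_bpm=8, margin_bpm=10):
            if len(hr_window) < 2:
                return False
            start_hr = hr_window[0]
            later_hr = min(hr_window[len(hr_window) // 2:])   # middle/end of the window
            dropped_enough = (start_hr - later_hr) >= min_drop_bpm
            near_resting = later_hr <= resting_hr + margin_bpm
            return dropped_enough and near_resting

        def detect_wake(current_hr, resting_hr, wake_margin_bpm=15):
            return current_hr > resting_hr + wake_margin_bpm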
  • a back-end server determines which activity metrics to calculate or analyze based on annotation data generated by the server or another server and stored in one or both of the servers, annotation data received from an external computing device, or annotation data also received from the portable monitoring device 100 . Additionally, the servers also can determine which activity metrics to calculate or analyze based on an analysis of the tracked activity data. In some such implementations, the portable monitoring device 100 may not track, determine, calculate or analyze any activity metrics at all; rather, the portable monitoring device may monitor the sensed activity data and subsequently store or transmit the activity data for later analysis and processing by an external computing device or back-end servers.
  • one or more output mechanisms may be used alone or in any combination with each other or another method of communication to communicate that the user has met, achieved, or made progress towards one or more of the following goals: the traversal of a certain distance; the achievement of a certain mile (or other lap) pace; the achievement of a certain speed; the achievement of a certain elevation gain; the achievement of a certain number of steps; the achievement of a certain maximum or average heart rate; the completion of a certain number of swimming strokes or laps in a pool.
  • the data used to determine whether or not a goal is achieved or whether the condition for an alert has been met may be acquired from the portable monitoring device 100 or another device.
  • the portable monitoring device 100 itself may determine whether the criteria for an alert, goal, or notification has been met.
  • additionally or alternatively, a computing device in communication with the device (e.g., a server and/or a mobile phone) may determine whether the criteria for an alert, goal, or notification has been met.
  • the device may communicate with the user when a goal has been met.
  • the criteria for meeting this goal may be based on physiological, contextual, and environmental sensors on a first device, and/or other sensor data from one or more secondary devices.
  • the goal may be set by the user or may be set by the device itself and/or another computing device in communication with the device (e.g. a server).
  • the portable monitoring device 100 may vibrate to notify the user. For example, the portable monitoring device 100 may detect (or be informed) that the wearer has exceeded a predefined goal or achievement threshold, for example, 10,000 steps taken in one day, and may, responsive to such an event, vibrate to alert or congratulate the user.
  • the display may turn on and present data about the goal that the user reached (for example, what goal was reached, whether the goal was previously reached one or more times on a different day, week, month, or year, or how long it took to reach the goal).
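  • For illustration, a step-goal check of this kind might be sketched as follows, assuming hypothetical vibrator and display interfaces and a 10,000-step daily goal:

        # Hypothetical sketch: vibrate and show a message when a daily step goal is reached.
        DAILY_STEP_GOAL = 10000

        def check_step_goal(step_count, already_notified, vibrator, display):
            if step_count >= DAILY_STEP_GOAL and not already_notified:
                vibrator.pulse(pattern=[200, 100, 200])   # ms on/off/on; illustrative pattern
                display.show(f"Goal reached: {step_count} steps today")
                return True                               # remember that the user was notified
            return already_notified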
  • the color and/or intensity of one or more LEDs may serve as notifications that the user is winning or losing against a friend in a competition in, for example, step count.
  • the biometric monitoring device may be a wrist-mounted device that may vibrate or emit audio feedback to notify the user of an incoming email, text message, or other alert.
  • the display of the biometric monitoring device may be turned on and a data display page relating data relevant to the alert may be presented to the user.
  • the biometric monitoring device may present increasingly noticeable feedback methods based on the importance or urgency of the alert.
  • a high priority alert may include audio, vibration, and/or visual feedback, whereas a low priority alert may only include visual feedback.
  • the criteria to distinguish a high priority alert from lower-priority alerts may be defined by the user.
  • a high-priority alert may be triggered if an email message or text is sent with a particular priority, e.g., “urgent,” if an email message or text is sent from a particular person, e.g., a person that the user has identified as being high-priority, if a meeting notification or reminder is received or occurs, if a certain goal is achieved, or if a dangerous health condition, such as a high heart rate, is detected.
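  • A minimal sketch of priority-based feedback selection along these lines, with the classification rules and device interfaces assumed for illustration:

        # Hypothetical sketch: choose feedback channels by alert priority; the rules and
        # the HIGH_PRIORITY_SENDERS setting stand in for user-defined criteria.
        HIGH_PRIORITY_SENDERS = {"boss@example.com"}

        def classify_alert(alert):
            if alert.get("flag") == "urgent" or alert.get("sender") in HIGH_PRIORITY_SENDERS:
                return "high"
            if alert.get("type") in ("meeting_reminder", "goal_achieved", "health_warning"):
                return "high"
            return "low"

        def notify(alert, vibrator, speaker, display):
            display.show(alert.get("summary", ""))        # visual feedback for every alert
            if classify_alert(alert) == "high":
                vibrator.pulse(pattern=[300, 100, 300])   # add haptic feedback
                speaker.beep()                            # and audio feedback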
  • the portable monitoring device 100 may operate within or according to a plurality of modes.
  • various modes may include: a general or default activity-tracking mode such as the annotation mode described above, a timer mode, a stopwatch mode, a clock/time/watch mode, a sleep-monitoring (or “sleep-tracking”) mode, a work mode, a home mode, a commute mode, as well as one or more activity-specific activity-tracking modes for tracking user activities such as biking, swimming, walking, running, stair-climbing, rock climbing, weight-lifting, treadmill exercise, and elliptical machine exercise.
  • the portable monitoring device 100 also enables a user to annotate activity data monitored in one or more modes including one or more activity-specific activity-tracking modes as described above.
  • the processing unit 104 may automatically determine or select a mode for the device to operate in based on a plurality of signals, data or other information. For example, the processing unit may automatically select a mode based on one or more activity metrics (for example, a step count, stair or floor count, or a number of calories burned) or, additionally or alternatively, based on one or more of: contextual or environmental data (for example, time of day, GPS or other determined or entered location or position data, ambient light brightness, temperature, or humidity); physiological or other person-centric data (for example, heart rate, body temperature, hydration level, or blood oxygen level); or system condition data (for example, in response to a low battery or low memory); or based on one or more user-defined conditions being met.
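  • As a hedged sketch, automatic mode selection of this kind might weigh such signals with simple rules; the signal names and thresholds below are assumptions rather than logic taken from the disclosure:

        # Hypothetical sketch: pick an operating mode from activity, contextual,
        # physiological, and system-condition signals.
        def select_mode(signals):
            if signals.get("battery_fraction", 1.0) < 0.05:
                return "low_power"
            hour = signals.get("hour_of_day", 12)
            if signals.get("motion_level", 0.0) < 0.05 and (hour >= 22 or hour < 6):
                return "sleep_tracking"
            if signals.get("cadence_steps_per_min", 0) > 140:
                return "running"
            if signals.get("location") == "work" and 9 <= hour < 17:
                return "work"
            return "annotation"   # fall back to the general or default activity-tracking mode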
  • the portable monitoring device itself can determine which activity data to monitor, or, additionally or alternatively, which activity (or sleep) metrics (hereinafter “sleep metrics” also may generally be referred to as “activity metrics”) to determine, calculate or analyze, based on which of the activity-tracking or other modes is currently active or initiated.
  • one or both of an external computing device or a back-end server can request certain activity data from the portable monitoring device based on which of the activity-tracking modes is currently active or initiated.
  • one or both of an external computing device or a back-end server can receive all activity data monitored by the portable monitoring device and subsequently filter or otherwise selectively process certain activity data to determine, calculate or analyze certain activity metrics based on which of the activity-tracking modes is currently active or initiated.
  • a user can select which of the modes is currently active or initiated via the user interface 108 , or via an application or program (including a web application, mobile application, or client-side software program) executing on an external computing device (for example, a personal computer, smartphone or multimedia device) communicatively coupled with the portable monitoring device 100 via the I/O interface 110 and one or more wired or wireless connections or networks.
  • a user may select the mode of the portable monitoring device 100 using an application on a smartphone that sends the mode selection to a server.
  • the server sends the mode selection to an external computing device that then sends the mode selection to the portable monitoring device 100 via the I/O interface 110 .
  • in other cases, the smartphone application or the server may send the mode selection directly to the portable monitoring device 100.
  • a user also can select which activity metrics to track while in each of the corresponding activity-tracking modes.
  • the portable monitoring device 100 also can be configured to automatically switch among two or more activity-tracking or other modes.
  • the processing unit 104 can analyze the activity data from the sensors 102 and automatically determine a most suitable, appropriate, or optimal activity-tracking or other mode to switch into based on the analysis of the activity data dynamically in substantially real-time.
  • the processing unit 104 can transmit the activity data via I/O interface 110 through a wired or wireless connection to one or both of an external computing device or back-end server that then analyzes the activity data, determines the most suitable, appropriate, or optimal activity-tracking or other mode to switch into, and subsequently transmits one or more instructions to the portable monitoring device 100 that, when executed by the processing unit 104 , cause the processing unit 104 (in conjunction with one or more other components described above) to switch into the determined mode.
  • the portable monitoring device 100 includes an alarm clock function intended to wake the wearer or user from sleep or otherwise alert the user.
  • the portable monitoring device 100 acts as a wrist-mounted vibrating alarm to silently wake the user from sleep.
  • the portable monitoring device also can be configured to track the user's sleep quality, waking periods, sleep latency, sleep efficiency, sleep stages (e.g., deep sleep vs REM), or other sleep-related metrics through one or a combination of heart rate, heart rate variability, galvanic skin response, motion sensing (e.g., accelerometer, gyroscope, magnetometer), and skin temperature.
  • the user may specify a desired alarm time or window of time (e.g. set alarm to go off between 7 and 8 am).
  • the processing unit 104 uses one or more of the sleep metrics to determine an optimal time within the alarm window to wake the user.
  • when the alarm triggers, the user may cause it to hibernate, snooze, or turn off by slapping or tapping the device (which is detected, for example, via motion sensor(s), a pressure/force sensor and/or capacitive touch sensor in the device).
  • the portable monitoring device 100 can be configured to attempt to arouse the user at an optimum point in the sleep cycle by starting a small vibration at a specific user sleep stage or time prior to the alarm setting. It may progressively increase the intensity or noticeability of the vibration as the user progresses toward wakefulness or toward the alarm setting.
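  • A minimal sketch of such a windowed, sleep-stage-aware alarm with progressively increasing vibration, using assumed stage labels and ramp parameters:

        # Hypothetical sketch: wake the user within the requested window at a light sleep
        # stage, ramping vibration intensity toward the hard deadline.
        def should_start_alarm(now, window_start, window_end, sleep_stage):
            if now >= window_end:
                return True                    # hard deadline: always wake by the window end
            return now >= window_start and sleep_stage in ("light", "rem")

        def vibration_intensity(now, alarm_start, ramp_seconds=120, max_level=1.0):
            elapsed = max(0.0, now - alarm_start)
            return min(max_level, 0.2 + 0.8 * elapsed / ramp_seconds)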
  • the wearer or user may have the ability to set one or more daily, periodic, or other recurring alarms.
  • the alarm function can be configured to “snooze,” i.e., temporarily stop the alarm for a short period of time, typically minutes, and then have the alarm re-trigger.
  • Fitbit manufactures a variety of extremely compact portable monitoring devices, including biometric tracking units, that each incorporate a suite of sensors, a battery, a display, a power-charging interface, and one or more wireless communications interfaces.
  • the portable monitoring devices also incorporate a vibramotor and/or a button.
  • such devices have been offered in housings measuring, for example, approximately 2″ long, 0.75″ wide, and 0.5″ thick (Fitbit Ultra™); approximately 1.9″ in length, 0.75″ wide, and 0.375″ thick (Fitbit One™); approximately 1.4″ long, 1.1″ wide, and 0.375″ thick (Fitbit Zip™); and approximately 1.3″ in length, 0.5″ wide, and 0.25″ thick (Fitbit Flex™).
  • housings of other sizes may be used in other implementations of biometric monitoring devices; the above list is merely intended to illustrate the small size of many such biometric monitoring devices.
  • the Fitbit Flex, due to its smaller size, uses discrete light-emitting diode (LED) indicators, e.g., 5 LEDs arranged in a row, to convey information visually.
  • Each of the above-listed Fitbit devices also has an input mechanism that allows a user to affect some aspect of the device's operation.
  • the Fitbit Ultra and Fitbit One each include a discrete pushbutton that allows a user to affect how the device operates.
  • the Fitbit Zip and Fitbit Flex do not have a discrete pushbutton but are instead each configured to detect, using their biometric sensors, when the user taps the housing of the device; such events are construed by the processor or processors of such devices as signaling a user input, i.e., acting as the input mechanism.
  • the portable monitoring device 100 can be configured to remain in an “always on” state to allow it to continually collect activity data throughout the day and night.
  • because the sensors 102 and processing unit 104 of the portable monitoring device must generally remain powered to some degree in order to collect the activity data, it can be advantageous to implement power-saving features elsewhere in the device, such as by, for example, causing a display to automatically turn off after a period of time.
  • the Fitbit UltraTM is an example of a portable monitoring device that includes a data display that is typically turned off to save power unless the device is being interacted with by the user. A typical user interaction may be provided by, for example, pressing a button on the device.
  • a housing of the portable monitoring device 100 itself is designed or configured such that it may be inserted into, and removed from, a plurality of compatible cases, housings, or holders (hereinafter “cases,” “housings,” and “holders” may be used interchangeably).
  • the portable monitoring device 100 is configured for removable insertion into a wristband or armband that can be worn on a person's wrist, forearm or upper arm.
  • the portable monitoring device is additionally or alternatively configured for removable insertion into a belt-clip case or configured for coupling with a clip that can be attached to a person's belt or clothing.
  • the term “wristband” may refer to a band that is designed to fully or partially encircle a person's forearm near the wrist joint.
  • the band can be continuous, for example, without any “breaks”; that is, it may stretch to fit over a person's hand or have an expanding portion similar to a dress watchband.
  • the band can be discontinuous, for example, having a clasp or other connection enabling a user to close the band similar to a watchband.
  • the band can simply be “open,” for example, having a C-shape that clasps the wearer's wrist.
  • a portable monitoring device that is inserted, combined, or otherwise coupled with a separate removable case or some other structure enabling it to be worn or easily carried by or attached to a person or his clothing may be referred to as a “portable monitoring system.”
  • FIG. 2 depicts a monitoring device similar in shape to a Fitbit One, which may be inserted into a holder with a belt clip or into a pocket on a wristband.
  • Portable monitoring device 200 has a housing 202 that contains the electronics associated with the biometric monitoring device 200 .
  • a button 204 and a display 206 may be accessible/visible through the housing 202 .
  • FIG. 3 depicts a portable monitoring device that may be worn on a person's forearm like a wristwatch, much like a Fitbit Flex.
  • Portable monitoring device 300 has a housing 302 that contains the electronics associated with the biometric monitoring device 300 .
  • a button 304 and a display 306 may be accessible/visible through the housing 302 .
  • a wristband 308 may be integrated with the housing 302 .
  • FIG. 4 depicts another example of a portable monitoring device that may be worn on a person's forearm like a wristwatch, although with a bigger display than the portable monitoring device of FIG. 3 .
  • Portable monitoring device 400 has a housing 402 that contains the electronics associated with the portable monitoring device 400 .
  • a button 404 and a display 406 may be accessible/visible through the housing 402 .
  • a wristband 408 may be integrated with the housing 402 .
  • the present disclosure is neither limited to any single aspect nor implementation, nor to any single combination and/or permutation of such aspects and/or implementations. Moreover, each of the aspects of the present disclosure, and/or implementations thereof, may be employed alone or in combination with one or more of the other aspects and/or implementations thereof. For the sake of brevity, many of those permutations and combinations will not be discussed and/or illustrated separately herein.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Physiology (AREA)
  • Cardiology (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Emergency Management (AREA)
  • Pulmonology (AREA)
  • Theoretical Computer Science (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • General Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Gerontology & Geriatric Medicine (AREA)
  • Electromagnetism (AREA)
  • Vascular Medicine (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Geometry (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Optics & Photonics (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

In one aspect of the disclosed implementations, a device includes one or more motion sensors for sensing motion of the device and providing activity data indicative of the sensed motion. The device also includes one or more processors for monitoring the activity data, and receiving or generating annotation data for annotating the activity data with one or more markers or indicators to define one or more characteristics of an activity session. The device also includes one or more feedback devices for providing feedback, a notice, or an indication to a user based on the monitoring. The device further includes a portable housing that encloses at least portions of the motion sensors, the processors and the feedback devices.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of priority under 35 U.S.C. §119(e) to U.S. Provisional Application No. 61/752,826 (Attorney Docket No. FTBTP002P2), filed 15 Jan. 2013, and titled “PORTABLE MONITORING DEVICES AND METHODS OF OPERATING SAME,” and to U.S. Provisional Patent Application No. 61/830,600 (Attorney Docket No. FTBTP002X1P), filed 3 Jun. 2013, and titled “PORTABLE MONITORING DEVICES AND METHODS OF OPERATING SAME,” both of which are hereby incorporated by reference in their entireties.
  • BACKGROUND
  • Increasing consumer interest in personal health has resulted in the development of a variety of personal health monitoring devices. Such devices have tended to be complicated to use or typically designed for use with only one activity: for example, running or bicycling, but not both. Relatively recent advances in the miniaturization of sensors, power sources, and other electronics or components have enabled personal health monitoring devices to be offered in smaller sizes, form factors, or shapes than were previously feasible or industrially practical. For example, the Fitbit Ultra (manufactured by Fitbit Inc. headquartered in San Francisco, Calif.) is a biometric monitoring device that is approximately 2″ long, 0.75″ wide, and 0.5″ deep. The Fitbit Ultra has a pixelated display, battery, sensors, wireless communications capability, power source, and interface button, as well as an integrated clip for attaching the device to a pocket or other portion of clothing, all packaged within this small volume.
  • SUMMARY
  • Details of one or more implementations of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages will become apparent from the description, the drawings, and the claims. Note that the relative dimensions of the following figures may not be drawn to scale unless specifically indicated as being scaled drawings.
  • In one aspect of the disclosed implementations, a device includes one or more motion sensors for sensing motion of the device and providing activity data indicative of the sensed motion. The device also includes one or more processors for monitoring the activity data, and receiving or generating annotation data for annotating the activity data with one or more markers or indicators to define one or more characteristics of an activity session. The device also includes one or more feedback devices for providing feedback, a notice, or an indication to a user based on the monitoring. The device further includes a portable housing that encloses at least portions of the motion sensors, the processors and the feedback devices.
  • In some implementations, the device further includes a memory and the processors are further configured to store the activity data or data derived from the activity data in the memory. In some such implementations, the processors are further configured to store the annotation data in the memory. In some implementations, the processors are further configured to determine one or more activity metrics based on the activity data. In some such implementations, the processors are further configured to determine one or more activity metrics based on the annotation data. In some implementations, the device further includes one or more user input devices included in or on the housing for receiving or sensing user input, and the processors are further configured to receive and interpret the user input received or sensed via the user input devices. In some such implementations, the motion sensors themselves also can function as user input devices by sensing a user's touch, tapping, or other physical gesture made on or with the device. In some implementations, the device further includes transmitting and receiving circuitry. In some such implementations, the user input is input by the user via an external or remote device and then communicated to the receiving circuitry. In some such implementations, the processors are further configured to receive and interpret the user input received from the receiving circuitry. In some such implementations, the transmitting and receiving circuitry is configured for wireless communication over a computer network, and the user input is input via a web or mobile application.
  • In some implementations, the device is configured to enable a user to input the annotation data via user input. In some such implementations, the user input is input based on a physical interaction with the device that is then interpreted by the processors. In some other implementations, the user input is input to an external device that is communicatively-coupled with the device via one or more wired or wireless connections or networks. In some implementations, the device is configured to enable a user to initiate, indicate, or mark a start time or an end time of a user-defined activity session in response to user input received via one of the one or more user interface devices. In some implementations, the device is configured to enable a user to indicate a particular user activity performed during a user-defined activity session in response to user input received via one of the one or more user interface devices.
  • In some implementations, the device is configured to automatically annotate the activity data or to generate the annotation data. In some such implementations, the device is configured to automatically annotate the activity data or to generate the annotation data based on a device state. In some implementations, the device is configured to automatically annotate the activity data or to generate the annotation data based on an analysis of the activity data. In some implementations, the processors are configured to automatically determine the activity data to track or the activity metrics to calculate based on the annotation data.
  • In some implementations, the housing includes a wrist- or arm-band, is configured for physical attachment to or coupling with a wrist- or arm-band, or is configured to be inserted into a wrist- or arm-band. In some implementations, the device further includes one or more of: one or more gyroscopes, one or more physiological sensors, one or more biometric sensors, one or more altitude sensors, one or more temperature sensors, and one or more heart-rate sensors.
  • In another aspect of the disclosed implementations, a method of monitoring one or more activity metrics using a portable monitoring device is described. In some implementations, the method includes sensing, by one or more motion sensors within the device, motion of the device. The method also includes outputting, by the one or more motion sensors, activity data indicative of the sensed movement. The method also includes receiving, by one or more processors within the device, the activity data. The method additionally includes receiving or generating, by the one or more processors, annotation data. The method further includes annotating the activity data based on the annotation data.
  • In some implementations, the method further includes receiving the annotation data from a user. In some other implementations, the method includes generating the annotation data based on the activity data. In some implementations, the processors are further configured to determine or calculate one or more activity metrics based on the activity data and the annotation data.
  • In still another aspect of the disclosed implementations, a device includes one or more motion sensors for sensing motion of the device and providing activity data indicative of the sensed motion. The device also includes one or more processors for monitoring the activity data, and for switching among a plurality of activity-tracking modes. The device also includes one or more feedback devices for providing feedback, a notice, or an indication to a user based on the monitoring. The device further includes a portable housing that encloses at least portions of the motion sensors, the processors and the feedback devices.
  • In some implementations, the processors are further configured to determine one or more activity metrics based on the activity data. In some implementations, the processors are further configured to determine the one or more activity metrics based on which of the activity-tracking modes is currently active. In some implementations, the device is configured to enable a user to set or select a particular activity-tracking mode to switch into based on user input. In some implementations, the device is configured to automatically determine or select a particular activity-tracking mode to switch into in response to the activity data. In some implementations, the plurality of activity-tracking modes include one or more activity-specific activity-tracking modes and a sleep-tracking mode.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The various implementations disclosed herein are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which like reference numerals may refer to similar elements.
  • FIG. 1 depicts a block diagram of an example portable monitoring device.
  • FIG. 2 depicts a portable monitoring device that may be inserted into a holder with a belt clip or into a pocket on a wristband.
  • FIG. 3 depicts a portable monitoring device that may be worn on a person's forearm like a wristwatch.
  • FIG. 4 depicts another example of a portable monitoring device that may be worn on a person's forearm.
  • DETAILED DESCRIPTION
  • The present disclosure relates generally to portable monitoring devices (also referred to herein as “portable tracking devices” or simply as “devices”), and more particularly, to wearable monitoring devices including wearable biometric monitoring devices. Various implementations relate to a portable monitoring device capable of monitoring and tracking movements or activities and related data. For example, the portable monitoring device can include one or more motion sensors for detecting movement data or various other biometric, physiological, or environmental sensors for detecting biometric data, physiological data, environmental data, or related data (hereinafter also collectively referred to as “activity data”). In some example implementations, the portable monitoring device includes a general or default activity-tracking mode. In some such implementations, the default activity-tracking mode is an “annotation mode.” In some such implementations, the activity data monitored or tracked (hereinafter “monitored” and “tracked” may be used interchangeably) while in the annotation mode can be annotated or otherwise marked to indicate, specify, or delineate the starting and ending time points, a duration, or other time points of or within an activity session.
  • For purposes of this disclosure, an “activity session” may generally refer to a user-defined duration of time, or a duration of time associated with a particular activity or time of day, in which the device is monitoring activity data. In some implementations, the activity data monitored while in the default annotation mode also can be annotated or otherwise marked to indicate, specify, or define a specific activity that is being performed by the user during the activity session such as, for example, walking, running, stair climbing, bicycling, swimming, or even sleeping. In various implementations, the user can annotate the activity data prior to, during, or after completion of an associated activity. In various implementations, one or more activity metrics can be determined, calculated, or analyzed based on the activity data. In some such implementations, the activity metrics are communicated to the user via a display, lighting, noise, or via vibrational or haptic feedback. In some implementations, one or more achieved goals, progress indicators, alerts, or other activity-based notifications may be communicated to the user based on one or more of the activity metrics. Additionally or alternatively, in some implementations, one or more alarms, reminders, or other time-, physiologically-, biometrically-, state-, condition-, or environment-based notifications also can be communicated to the user. Such achieved goals, progress indicators, alerts, or other notifications can be communicated to the user via a display, lighting, noise, or via vibrational or haptic feedback.
  • In some other implementations, the portable monitoring device is capable of switching among two or more modes such as two or more activity-tracking modes. In some such implementations, the two or more activity-tracking modes include one or more activity-specific activity-tracking modes including, for example, a walking mode, a running mode, a stair-climbing mode, a bicycling mode, a swimming mode, a climbing mode, and a golfing mode, among other example activity-tracking modes configured for other corresponding activities. In some implementations, the two or more activity-tracking modes also can include a sleep-tracking mode. In some implementations, the portable monitoring device itself can determine which activity data to monitor or which activity (or sleep) metrics (hereinafter “sleep metrics” also may generally be referred to as “activity metrics”) to determine, compute, calculate, track or analyze (hereinafter used interchangeably) based on which of the activity-tracking modes is currently active or initiated. Additionally or alternatively, in some implementations, one or both of an external computing device or a back-end server can request certain activity data from the portable monitoring device based on which of the activity-tracking modes is currently active or initiated. Additionally or alternatively, in some implementations, one or both of an external computing device or a back-end server can receive all activity data monitored by the portable monitoring device and subsequently filter or otherwise selectively process certain activity data to determine certain activity (or sleep) metrics based on which of the activity-tracking modes is currently active or initiated.
  • FIG. 1 depicts a block diagram of an example portable monitoring device 100. The portable monitoring device 100 includes one or more sensors 102. The portable monitoring device 100 also includes a processing unit 104, a memory 106, a user interface 108, and an input/output (I/O) interface 110. The one or more sensors 102, the processing unit 104, the memory 106, the user interface 108, and the I/O interface 110 are communicatively connected with one or more of one another via one or more communication paths collectively referred to as communication bus 112. The portable monitoring device 100 further includes a power source 114, such as, for example, one or more rechargeable or removable batteries.
  • The sensors 102 include one or more motion sensors configured for sensing and outputting movement data indicative of the motion of the portable monitoring device 100. For example, the motion sensors can include one or more accelerometers for sensing movement data. In some implementations, the portable monitoring device 100 includes one or more accelerometers for sensing acceleration or other movement data in each of, for example, three directions, which may be orthogonal. The sensors 102 additionally can include one or more gyroscopes for sensing rotation data. In some implementations, the portable monitoring device 100 includes one or more gyroscopes for sensing rotation about each of, for example, three axes, which may be orthogonal. As will be described later, movement data and rotation data also can be used to capture user input. The sensors 102 additionally can include one or more altimeters (hereinafter also referred to as “altitude sensors”). For example, the portable monitoring device 100 can include a pressure or barometric altimeter. The sensors 102 additionally can include one or more temperature sensors for sensing one or both of a temperature of the environment outside of the user's body or an internal temperature of the user's body. The sensors 102 additionally can include one or more light sensors (for example, photodetectors). For example, the light sensors can be used to detect an ambient light level of the environment for use in, for example, determining a suitable or optimal intensity of a display of the portable monitoring device 100. Other light sensors can be used to gather other biometric data such as an oxygen level of the user's blood. The sensors 102 also can include one or more pressure or proximity sensors for receiving user input. Such pressure or proximity sensors can be based on mechanical designs or on, for example, capacitive, resistive, or other electrical or optical designs. The portable monitoring device 100 also can be coupled with external sensing devices such as an external heart rate monitor (for example, a chest-strap heart rate monitor) for sensing a user's heart rate. The portable monitoring device 100 also can include or be coupled with other physiological or biometric sensors. In some implementations, the portable monitoring device 100 additionally is configured to sense or monitor one or more other types of biometric data or to measure, calculate, or determine biometric data based on the movement, rotation, or other data described above. Biometric data, as used herein, may refer to virtually any data pertaining to the physical characteristics of the human body, and as described above, activity data may also refer to such biometric data.
  • The processing unit 104 can include one or more processors, processing circuits, computing units, computing circuits, controllers, or microcontrollers (hereinafter used interchangeably). Each of the processors can be implemented by a general- or special-purpose processor (or set of processing cores) and can execute sequences of programmed instructions (“code”) to perform tasks and effectuate various operations. Additionally or alternatively, the processing unit 104 can include a custom-built hardware ASIC (application-specific integrated circuit), or can be programmed into a programmable logic device such as an FPGA (field-programmable gate array). In some implementations, the memory 106 stores executable instructions that, when executed by the processing unit 104, cause the processing unit 104 to control one or more of the sensors 102, to sample or extract data received from the sensors 102, to store the received data in the memory 106, and to retrieve or load data previously stored in the memory 106. The activity data received from the sensors 102 may be stored in raw format in the memory 106 by the processing unit 104, or may be pre-processed prior to storage in the memory 106. For example, the processing unit 104 may store or buffer the most recent 10 minutes of activity data in raw form, but may then store data from prior to the ten-minute window as filtered data, for example, with a lower sampling rate or with some form of numerical analysis, such as a moving average, performed, or as converted data: for example, acceleration data may be converted to activity metrics such as “steps taken,” “stairs climbed,” or “distance traveled.”
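  • As an illustrative sketch of such buffering, older samples might be reduced to block averages while a recent raw window is retained; the 25 Hz sampling rate and one-second decimation below are assumptions, not values from the disclosure:

        # Hypothetical sketch: keep the most recent raw samples; summarize older samples.
        from collections import deque

        RAW_WINDOW_SAMPLES = 10 * 60 * 25    # e.g., 10 minutes at an assumed 25 Hz
        DECIMATION = 25                      # keep one averaged value per second of old data

        raw_buffer = deque(maxlen=RAW_WINDOW_SAMPLES)
        summary_log = []                     # lower-rate history (e.g., block averages)

        def ingest(sample):
            if len(raw_buffer) == raw_buffer.maxlen:
                # The oldest raw data is about to leave the window; store a block average.
                block = [raw_buffer.popleft() for _ in range(DECIMATION)]
                summary_log.append(sum(block) / len(block))
            raw_buffer.append(sample)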
  • Activity data from the sensors 102, including raw data or post-processed data, may be further analyzed to determine if the activity data is indicative of a pre-defined biometric state or condition that is associated with a user input. If such analysis indicates that such data has been collected, the processing unit 104 may then treat such an event as equivalent to a user input. In some implementations, the processing unit 104 also may derive secondary data based on the raw activity data. In some implementations, the processing unit 104 also performs an analysis on the raw activity data received from the sensors 102 or on raw or previously-processed (“post-processed”) activity data retrieved from the memory 106 and initiates various actions based on the analysis. For example, the processing unit 104 may track, determine, compute, calculate or analyze one or more physiological, biometric, activity or sleep metrics (hereinafter collectively referred to as “activity metrics”) based on the raw, pre-processed or secondary activity data (also collectively referred to herein generally as “activity data”).
  • The memory 106 can include any suitable memory architecture. In some implementations, the memory 106 includes different classes of storage devices or units to store different classes of data. In some implementations, the memory 106 includes non-volatile storage media, such as fixed or removable semiconductor-, optical-, or magnetic-based media, for storing executable code (also referred to as “executable instructions”) and related data for enabling the implementations and carrying out the processes described herein. In some implementations, the memory 106 also is configured for storing configuration data or other information for implementing various default or user-defined settings, or for implementing the default or activity-specific activity-tracking modes described herein. The memory 106 also can be configured for storing other configuration data used during the execution of various programs or instruction sets or otherwise used to configure the portable monitoring device 100. Additionally, any of the afore-described raw activity data generated by the sensors 102, as well as pre-processed or derived data, can be stored in the non-volatile storage media within the memory 106 for later analysis or other use. Additionally, in some implementations, the activity metrics calculated by the processing unit 104 or received from an external computing device or server also can be stored in the non-volatile storage media within the memory 106 for later analysis, viewing or other use. In some implementations, the memory 106 also includes volatile storage media, such as static or dynamic random-access memory (RAM), for temporarily or non-permanently storing more transient information and other variable data as well as, in some implementations, executable code retrieved from the non-volatile storage media. The volatile storage media may also store any of the afore-described data generated by sensors 102 or data derived from sensed data (for example, including activity- or sleep tracking metrics) for later analysis, later storage in non-volatile media within memory 106, or for subsequent communication over a wired or wireless connection via I/O interface 110. The processing unit 104 can additionally or alternatively include its own volatile memory, such as RAM, for loading executable code from non-volatile memory for execution by the processing unit 104, or for tracking, analyzing, or otherwise processing any of the afore-described data generated by sensors 102 or data derived from sensed data (for example, including activity- or sleep tracking metrics) for later analysis, later storage in non-volatile media within memory 106, or for subsequent communication over a wired or wireless connection via I/O interface 110.
  • As will be described in more detail below, the processing unit 104 also can be configured to track and determine when the activity data received from the sensors 102 or retrieved from the memory 106, or the activity metrics generated from such activity data, indicate that a goal has been achieved or a progress point has been reached. For example, such a goal can be a specific activity metrics such as a distance, a number of steps, an elevation change, or number of calories burned, among other goals as described in more detail below. The processing unit 104 may then notify the user of the achievement of the goal or progress indicator via the user interface 108. For example, the processing unit 104 may cause a display to show content on the display marking or celebrating the achievement of the goal. Additionally or alternatively, the processing unit 104 may cause one or more lights (for example, LEDs) to light up, flash, change intensity, or otherwise reflect a visual pattern or display that notifies the user of the achievement of the goal. Additionally or alternatively, the processing unit 104 may cause one or more sound-producing devices to alert, beep or otherwise make noise that notifies the user of the achievement of the goal. Additionally or alternatively, the processing unit 104 may cause one or more vibrating devices to vibrate or otherwise provide haptic feedback in the form of one or more vibration patterns and, in some implementations, with differing or varying vibrational characteristics to notify the user of the achievement of particular goals.
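  • For illustration only, a minimal sketch of such goal-checking logic is given below in Python. The metric names, goal values, and the printed notification are assumptions made for the example; an actual device would instead drive its display, lights, speaker, or vibramotor as described above.

      # Illustrative goal check: compare computed activity metrics against
      # user-defined goals and report any goals newly achieved. The metric
      # and goal names ("steps", "floors") are assumed example values.
      def goals_achieved(metrics, goals, already_notified):
          """Return the names of goals newly achieved by the current metrics."""
          achieved = []
          for name, target in goals.items():
              if metrics.get(name, 0) >= target and name not in already_notified:
                  achieved.append(name)
          return achieved

      # Example usage: the steps goal has been met, the floors goal has not.
      newly = goals_achieved({"steps": 10250, "floors": 8},
                             {"steps": 10000, "floors": 10},
                             already_notified=set())
      for name in newly:
          # A real device would trigger the display, LEDs, sound, or vibration.
          print(f"Goal reached: {name}")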
  • In some implementations, user interface 108 collectively refers to and includes one or more user input devices and one or more output devices. The memory 106 also can store executable instructions that, when executed by the processing unit 104, cause the processing unit 104 to receive and interpret user input received via the user interface 108, or to output or communicate information to a user via the user interface 108. In various implementations, the user interface 108 can incorporate one or more types of user interfaces including, for example, visual, auditory, touch, vibration, or combinations thereof. For example, user interface 108 can include one or more buttons in or on a device housing that encloses the processing unit 104, the memory 106 and other electrical or mechanical components of the portable monitoring device 100. The buttons can be based on mechanical designs and electrical designs, and may incorporate, for example, one or more pressure sensors, proximity sensors, resistive sensors, capacitive sensors, or optical sensors. The user interface 108 also can include a touchpad or a touchscreen interface, which may be disposed over or integrated with a display, and which can incorporate these or other types of sensors.
  • In some implementations, the afore-described motion sensors, gyroscopes, or other sensors also can be used to detect a physical gesture corresponding to a user input. This allows a user to interact with the device using physical gestures. For example, accelerometers and gyroscopes can be used to detect when a user “taps,” shakes, rotates, flips or makes other “gestures” with the portable monitoring device. As another example, the portable monitoring device 100 can include a magnetometer, which may be used to detect the device's orientation with respect to the Earth's magnetic field. Other gestures that may be used to cause the portable monitoring device 100 to perform some action include, but are not limited to, multiple taps, or a specific pattern of taps. For example, a user may tap anywhere on the exterior (for example, the housing) of the portable monitoring device two times within a specific time period to cause the display to show particular content, to annotate activity data, or to change device modes.
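  • A simple way to recognize such a double-tap from accelerometer data is sketched below. This is a minimal illustration assuming a fixed spike threshold and pairing window; the specific values are not taken from this disclosure.

      # Illustrative double-tap detector over accelerometer magnitude samples.
      # The 2.5 g spike threshold and 0.5 s pairing window are assumed values.
      def detect_double_tap(samples, threshold=2.5, window=0.5):
          """samples: list of (timestamp_seconds, acceleration_magnitude_in_g).
          Returns True if two tap-like spikes occur within `window` seconds."""
          tap_times = [t for t, a in samples if a >= threshold]
          for first, second in zip(tap_times, tap_times[1:]):
              if second - first <= window:
                  return True
          return False

      # Two spikes 0.3 s apart are interpreted as a double-tap user input.
      stream = [(0.0, 1.0), (0.1, 2.8), (0.4, 2.9), (0.6, 1.1)]
      print(detect_double_tap(stream))  # True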
  • As just described, the user interface 108 also can include a display on or in the housing that encloses the processing unit 104 and the memory 106. In various implementations, the display can be configured as an alphanumeric display, transiently-visible display, or dead-front display. The display also can include or be based on any suitable display technology including liquid crystal display (LCD) technology or light-emitting diode (LED) technology among other suitable display technologies. The display can be configured to display various information to a user. In some implementations, a user can input a selection, navigate through a menu, or input other information via a button, a pressure sensor, a proximity sensor, a resistive sensor, a capacitive sensor, an optical sensor, or a touchscreen incorporating these or other types of sensors.
  • In various implementations, the display can show activity data, biometric data, contextual data, environmental data, system or intrinsic condition data, or data derived from activity or other sensed data, one or more activity metrics, one or more sleep metrics, a currently-active activity-tracking mode, one or more menus, one or more settings, one or more alarms or other indicators, a clock, a timer, a “stopwatch,” among other suitable information. In some implementations, the information that is displayed is customizable by the user or, additionally or alternatively, dependent on a current device state or mode of the portable monitoring device 100. For example, as a consequence of limited display space (to keep the portable monitoring device as small, portable or wearable as possible without sacrificing functionality or ease of use), the data displayed in association with each device state or mode may be partitioned into a plurality of different data display pages, and a user may “advance” through the data display pages associated with a given device state or mode by providing input to the biometric monitoring device.
  • The term “data display page” as used herein may refer to a visual display including text, graphics, and/or indicators, e.g., LEDs or other lights such as are used on the Fitbit Flex, that are arranged to communicate data measured, produced, or received by a portable monitoring device 100 to a user viewing a display of the portable monitoring device. In order to more dynamically change the display or the notifications provided to a user, the portable monitoring device 100 may track its device state through a variety of mechanisms and transition through different device states as contextual states, environmental states, or modes change. In some implementations, the device may include and be capable of operating in multiple active modes, multiple active environmental states, multiple active contextual states, or combinations of these, simultaneously. In such a case, the device state may be different for each different combination of environmental states, contextual states, or modes.
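  • The paging behavior described above can be sketched as follows. The device states and page names are hypothetical and are chosen only to illustrate how a user input advances through the data display pages associated with the current device state.

      # Illustrative mapping of device states to data display pages, with a
      # cursor that advances on each user input (button press or tap).
      DISPLAY_PAGES = {
          "default": ["clock", "steps", "distance", "calories", "floors"],
          "sleep_tracking": ["clock", "sleep_duration"],
          "swim": ["lap_count", "stroke_count", "elapsed_time"],
      }

      class DisplayPager:
          def __init__(self, device_state="default"):
              self.device_state = device_state
              self.index = 0

          def current_page(self):
              return DISPLAY_PAGES[self.device_state][self.index]

          def advance(self):
              """Called when the user provides input to change pages."""
              pages = DISPLAY_PAGES[self.device_state]
              self.index = (self.index + 1) % len(pages)
              return self.current_page()

      pager = DisplayPager("swim")
      print(pager.current_page())  # lap_count
      print(pager.advance())       # stroke_count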
  • In implementations that include an annotation mode, an annotation data display page may indicate that the portable monitoring device 100 is in annotation mode. When the portable monitoring device 100 is in the annotation mode (or said differently, when the annotation mode is active or initiated), information related to the activity being annotated may be displayed. For example, data display pages for various types of activity data or activity metrics may show quantities measured while the portable monitoring device 100 is in the annotation mode. For example, while operating in the annotation mode, a data display page for “steps taken” may only display a quantity of steps that have been taken while the portable monitoring device 100 is in the annotation mode or in an activity session defined using annotation data (rather than, for example, the quantity of steps taken throughout the entire day, week, month, year or during the lifetime of the device).
  • In an example implementation, if the portable monitoring device 100 is in a device state associated with the wearer being asleep (for example, an annotated sleep-tracking state or a sleep-tracking mode), it may be less likely for the wearer to input information into or otherwise interact with the portable monitoring device. Thus, in some implementations, the processing unit 104 may decrease the sensitivities of various user input detection mechanisms, especially a touchscreen (or turn the display or the entire device completely off), to reduce the risk of accidental inputs or to save power. In other device states, it may be desirable to change the user input method based on the limitations of various input mechanisms in various environments. For example, if the portable monitoring device 100 determines that it is in a device state associated with swimming (for example, the portable monitoring device 100 can be configured to independently determine through moisture sensors or pressure sensor data that it is in water), or if the portable monitoring device is actively placed into a swimming mode by the user via the user interface 108, then, in some implementations, a touchscreen interface or other user interface of the portable monitoring device 100 may be deactivated since it may not function well in water. The wearer may instead interact with the portable monitoring device 100 using physical buttons or other appropriate or suitable input mechanisms, including physical gestures sensed by the device.
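  • A minimal sketch of such state-dependent input configuration follows. The state names and the particular settings chosen for each state are assumptions for illustration, not prescribed behavior.

      # Illustrative selection of input-mechanism settings by device state.
      def input_config(device_state):
          if device_state == "sleep":
              # Lower touch sensitivity and turn the display off to avoid
              # accidental inputs and save power while the wearer sleeps.
              return {"touchscreen": "low_sensitivity", "display": "off",
                      "buttons": "enabled"}
          if device_state == "swim":
              # A touchscreen may not work well in water, so rely on
              # physical buttons and tap gestures instead.
              return {"touchscreen": "disabled", "display": "on",
                      "buttons": "enabled"}
          return {"touchscreen": "normal", "display": "on", "buttons": "enabled"}

      print(input_config("swim"))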
  • In addition to a display, the portable monitoring device 100, and particularly the user interface 108, also can include other mechanisms to provide feedback or other information to a user. For example, the user interface 108 can include one or more lights, such as one or more LEDs, in addition to the display for communicating information, such as the achievement of a goal, an alarm, an alert, indicator or other notification, a current state, a current mode, or a power level, to the user. For example, the processing unit 104 can control the intensities, colors, or patterns of flashing of one or more of the LEDs of the user interface 108 based on what information is being communicated. In some implementations, the user interface 108 additionally or alternatively includes one or more speakers or sound-producing devices. The user interface 108 also can include one or more microphones or other audio devices.
  • In some implementations, the user interface 108 includes one or more vibramotors (also referred to herein as “vibrators” or simply as “vibrating devices”) for communicating information with or to the user. For example, the processing unit 104 can utilize the vibramotors to communicate one or more alarms, achieved goals, progress indicators, inactivity indicators, reminders, indications that a timer has expired, or other indicators, feedback or notifications to a user wearing or holding the portable monitoring device 100. In some such implementations, the portable monitoring device 100 can utilize the vibramotors to communicate such information to the user in addition to communicating the same or similar information via the display, the lights, or the sound-producing devices. In some other such implementations, the portable monitoring device 100 can utilize the vibramotors to communicate such information to the user instead of or in lieu of communicating the same or similar information via the display, the lights, or the sound-producing devices. For example, in the case of an alarm, the vibramotors can cause the portable monitoring device 100 to vibrate to wake the user from sleep while not making noise so as to not wake the user's partner. As another example, in the case of a goal-achievement or progress indicator, the vibramotors can cause the portable monitoring device 100 to vibrate to alert the user that the user's goal has been achieved or that a milestone or other progress point en route to the goal has been reached without requiring the user to look at a display or hear an indication output from a speaker. In some implementations, a user can define one or more custom vibration patterns or other vibrational characteristics and assign such differing vibration patterns or other vibrational characteristics to different alarms, goals, or other vibrating indicators so that the user can distinguish among the vibrating indicators to determine what information is being communicated by the portable monitoring device 100. Additionally or alternatively, in some implementations, a user can select one or more default vibration patterns or other vibrational characteristics stored in the memory 106 and assign such differing vibration patterns or other vibrational characteristics to various vibrating indicators. In various implementations, the user can customize such patterns, characteristics, or settings or make such selections via the user interface 108, or via an application or program (including a web application, mobile application, or client-side software program) executing on an external computing device (for example, a personal computer, smartphone or multimedia device) communicatively coupled with the portable monitoring device 100 via the I/O interface 110 and one or more wired or wireless connections or networks.
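  • The assignment of vibration patterns to indicators might be represented as in the sketch below, where each pattern is a sequence of on/off pulse durations in milliseconds; the pattern shapes and indicator names are assumed examples rather than features of any particular implementation.

      # Illustrative vibration patterns and user assignments. Each pattern is
      # a list of (on_ms, off_ms) pulses played by the vibramotor.
      DEFAULT_PATTERNS = {
          "short_double": [(100, 100), (100, 0)],
          "long_single": [(600, 0)],
          "ramp": [(100, 100), (200, 100), (400, 0)],
      }

      user_assignments = {
          "wake_alarm": "ramp",            # silent alarm that builds in intensity
          "goal_achieved": "short_double",
          "inactivity_reminder": "long_single",
      }

      def pattern_for(indicator):
          """Return the pulse sequence to play for a given indicator."""
          return DEFAULT_PATTERNS[user_assignments[indicator]]

      print(pattern_for("goal_achieved"))  # [(100, 100), (100, 0)]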
  • In some implementations, as described above, one or more of the sensors 102 themselves also can be used to implement at least a portion of the user interface 108. For example, one or more accelerometers or other motion sensors 102 can be used to detect when a person taps the housing of the portable monitoring device 100 with a finger or other object, and then interpret such data as a user input for the purposes of controlling the portable monitoring device 100. For example, double-tapping the housing of the portable monitoring device 100 may be recognized by the processing unit 104 as a user input that will cause a display of the portable monitoring device to turn on from an off state or that will cause the portable monitoring device to transition between different monitoring states, sessions, or modes. For example, in an implementation in which the portable monitoring device includes a single annotation or other general activity-tracking mode, the tapping may cause the processing unit 104 to switch from a state where the portable monitoring device 100 collects and interprets activity data according to rules established for an “active” person to a state where the portable monitoring device collects and interprets activity data according to rules established for a “sleeping” or “resting” person. As another example, tapping the housing of the portable monitoring device 100 may be recognized by the processing unit 104 as a user input that will annotate monitored activity data, such as by, for example, indicating a starting or ending time of an activity session of user-defined duration. In some other implementations, such as in implementations in which the portable monitoring device 100 includes two or more activity-specific activity-tracking modes, the tapping may cause the processing unit 104 to switch from one activity-specific activity-tracking mode to another. For example, tapping may cause the processing unit 104 to switch from a walking mode where the portable monitoring device 100 collects and interprets activity data according to rules established for a “walking” person to a bicycling mode where the portable monitoring device interprets data according to rules established for a bicycle rider.
  • In some implementations, the processing unit 104 may communicate activity data received from the sensors 102 or retrieved from the memory 106 via the I/O interface 110 to an external or remote computing device (for example, a personal computer, smartphone or multimedia device) or to a back-end server over one or more computer networks. In some implementations, the I/O interface 110 includes a transmitter and a receiver (also referred to collectively herein as a “transceiver” or simply as “transmitting and receiving circuitry”) that can transmit the activity data or other information through a wired or wireless connection to one or more external computing devices or to one or more back-end servers (either directly via one or more networks or indirectly via an external computing device that first receives the activity data and subsequently communicates the data via one or more networks to the back-end servers). For example, the memory 106 also can store executable instructions that, when executed by the processing unit 104, cause the processing unit 104 to transmit and receive information via the I/O interface 110. In some implementations, the one or more computer networks include one or more local-area networks (LANs), private networks, social networks, or wide-area networks (WANs) including the Internet. The I/O interface 110 can include wireless communication functionality so that when the portable monitoring device 100 comes within range of a wireless base station or access point, or within range of certain equipped external computing devices (for example, a personal computer, smartphone or multimedia device), certain activity data or other data is automatically synced or uploaded to the external computing device or back-end server for further analysis, processing, viewing, or storing. In various implementations, the wireless communication functionality of I/O interface 110 may be provided or enabled via one or more communications technologies known in the art such as, for example, Wi-Fi, Bluetooth, RFID, Near-Field Communications (NFC), Zigbee, Ant, optical data transmission, among others. Additionally or alternatively, the I/O interface 110 also can include wired-communication capability, such as, for example, a Universal Serial Bus (USB) interface.
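  • The automatic syncing just described can be sketched as a simple queue that is flushed whenever a sync target is reachable. The function names in_range and upload are placeholders for whatever radio and network facilities a given implementation provides.

      # Illustrative auto-sync step: if a paired device or access point is in
      # range, upload any unsynced activity records; otherwise keep them queued.
      def sync_if_in_range(pending_records, in_range, upload):
          """Returns the records that remain unsynced after this attempt."""
          if not in_range():
              return pending_records
          if upload(pending_records):
              return []           # everything synced; clear the local queue
          return pending_records  # transmission failed; retry later

      remaining = sync_if_in_range(
          pending_records=[{"t": 1, "steps": 120}],
          in_range=lambda: True,
          upload=lambda batch: True,
      )
      print(remaining)  # []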
  • In some implementations, one or more back-end servers or computing systems can support a web-based application (“web application”), web site, web page or web portal (hereinafter “web application,” “web page,” “web site,” and “web portal” may be used interchangeably) enabling a user to remotely interact with the portable monitoring device 100, or to interact with or view the activity data or activity metrics calculated based on the activity data, via any computing device (for example, a personal computer, smartphone or multimedia device) capable of supporting a web browser or other web client suitable for use in rendering the web page or web-based application. For example, in some implementations, the data can be stored at an Internet-viewable or Internet-accessible source such as a web site (for example, www.Fitbit.com) permitting the activity data, or data or activity metrics derived or calculated therefrom, to be viewed, for example, using a web browser or network-based application. Hereinafter, reference to a web application, web page, web site or web portal may refer to any structured document or user interface made available for viewing on a client device (for example, a personal computer, smartphone or multimedia device) over any of one or more of the described networks or other suitable networks or communication links.
  • For example, while the user is wearing a portable monitoring device 100, the processing unit 104 may calculate the user's step count based on activity data received from one or more sensors 102. The processing unit 104 may temporarily store the activity data and calculated step count in the memory 106. The processing unit 104 may then transmit the step count, or raw or pre-processed activity data representative of the user's step count, via I/O interface 110 to an account on a web service (for example, www.fitbit.com), an external computing device such as a personal computer or a mobile phone (especially a smartphone), or to a health station where the data may be stored, further-processed, and visualized by the user or friends of the user.
  • Other implementations relating to the use of short range wireless communication are described in U.S. patent application Ser. No. 13/785,904, titled “Near Field Communication System, and Method of Operating Same” filed Mar. 5, 2013 which is hereby incorporated herein by reference in its entirety.
  • In various implementations, the activity metrics that can be tracked, determined, calculated or analyzed by the processing unit 104, or by an external computing device or back-end server based on activity data transmitted from portable monitoring device 100, include one or more of, for example: energy expenditure (for example, calories burned), distance traveled, steps taken, stairs or floors climbed or descended, elevation gained or lost (e.g., based on an altimeter or global positioning satellite (GPS) device), pace, maximum speed, location, direction, heading, ambulatory speed, rotation or distance traveled, swimming stroke count, swimming lap count, swimming distance, bicycle distance, bicycle speed, heart rate, heart rate variability, heart rate recovery, blood pressure, blood glucose, blood oxygen level, skin conduction, skin or body temperature, electromyography data, electroencephalography data, weight, body fat, caloric intake, nutritional intake from food, medication intake, sleep periods, sleep phases, sleep quality, sleep duration, pH levels, hydration levels, and respiration rate. In some implementations, the processing unit 104 also tracks, determines, or calculates metrics related to the environment around the user such as, for example, one or more of: barometric pressure, temperature, humidity, rain/snow conditions, wind speed, other weather conditions, light exposure (ambient light), ultraviolet (UV) light exposure, time or duration spent in darkness, pollen count, air quality, noise exposure, radiation exposure, and magnetic field. Some of the data used to calculate one or more of the metrics just described may be provided to the portable monitoring device from an external source. For example, the user may input his height, weight, or stride in a user profile on a fitness-tracking website and such information may then be communicated to the portable monitoring device 100 via the I/O interface 110 and used to evaluate, in conjunction with activity data measured by the sensors 102, the distance traveled or calories burned by the user.
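  • As a rough illustration of that last example, the sketch below derives distance and an energy estimate from a step count together with user-profile data. The stride length and the calories-per-kilogram-per-kilometer figure are assumed example numbers, not values specified by this disclosure.

      # Illustrative derivation of distance and calories from steps plus
      # user-supplied stride length and weight (all values are assumptions).
      def derived_metrics(steps, stride_m=0.78, weight_kg=70.0):
          distance_km = steps * stride_m / 1000.0
          # Rough walking estimate: energy scales with body weight and distance.
          calories = 0.53 * weight_kg * distance_km
          return {"distance_km": round(distance_km, 2),
                  "calories": round(calories, 1)}

      print(derived_metrics(steps=10000))
      # {'distance_km': 7.8, 'calories': 289.4}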
  • A general listing of potential types of sensors 102 and activity data types is shown below in Table 1. This listing is not exclusive, and sensors other than those listed may be used. Moreover, the data that is potentially derivable from the listed sensors may also be derived, either in whole or in part, from other sensors. For example, an evaluation of stairs climbed may involve evaluating altimeter data to determine altitude change, clock data to determine how quickly the altitude changed, and accelerometer data to determine whether the biometric monitoring device is being worn by a person who is walking (as opposed to standing still); a sketch of such an evaluation follows Table 1.
  • TABLE 1
    Sensor Type | Activity Data | Potentially-Derivable Activity Data
    Accelerometers | Accelerations experienced at location worn | Rotation, translation, velocity/speed, distance traveled, steps taken, elevation gained, fall indications, calories burned (in combination with data such as user weight, stride, etc.)
    Gyroscopes | Angular orientation and/or rotation | Rotation, orientation
    Altimeters | Barometric pressure | Altitude change, flights of stairs climbed, local pressure changes, submersion in liquid
    Pulse Oximeters | Blood oxygen saturation (SpO2), heart rate, blood volume | Heart rate variability, stress levels, active heart rate, resting heart rate, sleeping heart rate, sedentary heart rate, cardiac arrhythmia, cardiac arrest, pulse transit time, heart rate recovery time, blood volume
    Galvanic Skin Response Sensors | Electrical conductance of skin | Perspiration, stress levels, exertion/arousal levels
    Global Positioning System (GPS) | Location, elevation | Distance traveled, velocity/speed
    Electromyographic Sensors | Electrical pulses | Muscle tension/extension
    Audio Sensors | Local environmental sound levels | Laugh detection, breathing detection, snoring detection, respiration type (snoring, breathing, labored breathing, gasping), voice detection, typing detection
    Photo/Light Sensors | Ambient light intensity, ambient light wavelength | Day/night, sleep, UV exposure, TV watching, indoor v. outdoor environment
    Temperature Sensors | Temperature | Body temperature, ambient environment temperature
    Strain Gauge Sensors | Weight (the strain gauges may be located in a device remote from the biometric monitoring device, e.g., a Fitbit Aria™ scale, and communicate weight-related data to the biometric monitoring device, either directly or via a shared account over the Internet) | Body Mass Index (BMI) (in conjunction with user-supplied height and gender information, for example)
    Bioelectrical Impedance Sensors | Body fat percentage (may be included in remote device, such as Aria™ scale) | (none listed)
    Respiration Rate Sensors | Respiration rate | Sleep apnea detection
    Blood Pressure Sensors | Systolic blood pressure, diastolic blood pressure | (none listed)
    Heart Rate Sensors | Heart rate | (none listed)
    Blood Glucose Sensors | Blood glucose levels | (none listed)
    Moisture Sensors | Moisture levels | Whether user is swimming, showering, bathing, etc.
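  • As one illustration of deriving data from multiple sensors, the sketch below estimates floors climbed by combining altimeter, clock, and accelerometer-derived information, as described before Table 1. The per-floor height, the climb-rate bound, and the walking check are assumed values chosen only for the example.

      # Illustrative "floors climbed" evaluation. samples: (timestamp_s,
      # altitude_m, is_walking) tuples; is_walking would come from
      # accelerometer analysis. Altitude gained counts only while walking
      # and while the climb rate is plausible for stairs (not an elevator).
      def floors_climbed(samples, floor_height_m=3.0, max_rate_m_per_s=0.7):
          gained = 0.0
          for (t0, a0, _), (t1, a1, walking) in zip(samples, samples[1:]):
              rise, dt = a1 - a0, t1 - t0
              if walking and rise > 0 and dt > 0 and rise / dt <= max_rate_m_per_s:
                  gained += rise
          return int(gained // floor_height_m)

      climb = [(0, 100.0, True), (10, 103.2, True), (20, 106.4, True)]
      print(floors_climbed(climb))  # 2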
  • In addition to the above, some biometric data may be calculated or estimated by the portable monitoring device 100 without direct reference to data obtained from the sensors 102. For example, a person's basal metabolic rate, which is a measure of the “default” caloric expenditure that a person experiences throughout the day while at rest (in other words, simply to provide energy for basic bodily functions such as breathing, circulating blood, etc.), may be calculated based on data entered by the user via the user interface 108, or via an application or program (including a web application, mobile application, or client-side software program) executing on an external computing device (for example, a personal computer, smartphone or multimedia device) communicatively coupled with the portable monitoring device 100 via the I/O interface 110 and one or more wired or wireless connections or networks. Such user-entered data may be used, in conjunction with data from an internal clock indicating the time of day, to determine how many calories have been expended by a person thus far in the day to provide energy for basic bodily functions.
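  • A minimal sketch of such a calculation is shown below. The Mifflin-St Jeor equation is used here only as a familiar stand-in; this disclosure does not specify which basal-rate formula, if any, a given implementation applies, and the profile values are example inputs.

      # Illustrative basal-calorie bookkeeping from user-entered profile data
      # and the time of day (the formula and inputs are assumptions).
      def basal_metabolic_rate(weight_kg, height_cm, age, sex):
          """Daily resting energy expenditure in kcal (Mifflin-St Jeor)."""
          base = 10 * weight_kg + 6.25 * height_cm - 5 * age
          return base + (5 if sex == "male" else -161)

      def basal_calories_so_far(bmr_per_day, hours_since_midnight):
          """Portion of the daily basal expenditure already spent today."""
          return bmr_per_day * hours_since_midnight / 24.0

      bmr = basal_metabolic_rate(weight_kg=70, height_cm=175, age=35, sex="male")
      print(round(bmr))                            # ~1624 kcal per day
      print(round(basal_calories_so_far(bmr, 9)))  # basal kcal expended by 9:00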
  • As described above, in some example implementations, the portable monitoring device 100, and particularly processing unit 104, includes a default activity-tracking mode also referred to herein as an “annotation” mode. In some such implementations, the activity data monitored while in the default annotation mode can be annotated or otherwise marked to indicate, specify, or delineate the starting and ending time points or other time points of and within an activity session. Again, for purposes of this disclosure, an “activity session” may generally refer to a user-defined duration of time, or a duration of time associated with a particular activity or time of day, in which the device is monitoring activity data. In some implementations, the activity data monitored while in the annotation mode also can be annotated or otherwise marked to indicate, specify, or define a specific activity that is being performed by the user during the activity session such as, for example, walking, running, stair climbing, bicycling, swimming, or even sleeping. In various implementations, the user can annotate the activity data prior to, during, or after completion of an associated activity.
  • In some implementations, a user can annotate an activity session via physical interactions with the portable monitoring device 100, itself. For example, the user can annotate the activity data using, for example, any of the components described above that may be included within user interface 108. Additionally or alternatively, the user can annotate the activity session via an external or remote computer (for example, a personal computer, a smartphone, or a multimedia device). In some such implementations, one or both of the portable monitoring device 100 and a coupled external computing device also can communicate with one or more back-end servers as described above. In some such implementations, the portable monitoring device or external computing device can transmit the annotations (also referred to herein as “annotation data”), the activity data, as well as information about the portable monitoring device or the user, to the servers for storage and, in some implementations, for additional processing or analysis.
  • In some such implementations, the portable monitoring device 100, and particularly the processing unit 104, is configured to use the sensors 102 to monitor the same type of activity data in the same way regardless of the activity being performed or in which the user is currently engaged. That is, in some implementations, regardless of what activity the user is engaging in, be it walking, running, stair climbing, bicycling, swimming, or even sleeping, the same sensors are used to sense movements or other sensed activity data in the same way. In some implementations in which the processing unit 104 is configured to determine, calculate or analyze one or more activity metrics, the processing unit, itself, can determine which activity metrics to determine, calculate or analyze based on the annotation data received for the activity session.
  • In some implementations, the portable monitoring device 100 can automatically annotate one or more activity sessions. In some such implementations, the processing unit 104 can analyze the activity data from the sensors 102 dynamically (for example, substantially in real time) and automatically determine a starting point, an ending point, or other time points for which to record timestamps or store markers or digital flags in the memory 106 to annotate the activity data monitored in an activity session. In some other implementations, the processing unit can analyze activity data retrieved from the memory 106 to automatically annotate the stored activity data. In still other implementations, the processing unit 104 can transmit the activity data via I/O interface 110 to one or both of an external computing device (for example, a personal computer, a smartphone or a multimedia device) or a back-end server (either directly over one or more wired or wireless networks or indirectly by way of an external computing device, such as a personal computer, a smartphone or a multimedia device, in conjunction with one or more wired or wireless networks) that then automatically annotates the received activity data. In some of the aforementioned implementations, the annotation data can be stored with the corresponding activity data; that is, together with the activity data in the same locations within the memory 106. In some other implementations, the annotation data can be stored separately from the activity data within the memory 106 but linked to the activity data by way of, for example, one or more tables and timestamps.
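  • For illustration, automatic annotation of a walking session from a per-minute step count might look like the following sketch; the activity threshold and the idle gap that closes a session are assumed values.

      # Illustrative automatic annotation: mark a session start when sustained
      # activity begins and a session end after a period of inactivity.
      def auto_annotate(minute_steps, active_threshold=60, idle_minutes=5):
          """minute_steps: steps counted in each successive minute.
          Returns (start_minute, end_minute) pairs for detected sessions."""
          sessions, start, idle = [], None, 0
          for minute, steps in enumerate(minute_steps):
              if steps >= active_threshold:
                  if start is None:
                      start = minute            # annotate the session start
                  idle = 0
              elif start is not None:
                  idle += 1
                  if idle >= idle_minutes:      # annotate the session end
                      sessions.append((start, minute - idle))
                      start, idle = None, 0
          if start is not None:
              sessions.append((start, len(minute_steps) - 1))
          return sessions

      walk = [0, 0, 80, 95, 100, 90, 0, 0, 0, 0, 0, 0]
      print(auto_annotate(walk))  # [(2, 5)]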
  • In an example implementation, if the portable monitoring device 100 is placed in an annotation mode prior to the wearer going to sleep and then taken out of the annotation mode after the wearer wakes up, e.g., via user interactions or based on sensed biometric or other activity data, the portable monitoring device 100 may record biometric data that indicates that the wearer was largely stationary and horizontal during the time that the biometric monitoring device was in the annotation mode. This, in combination with the time of day that the annotated biometric data was collected, may cause the portable monitoring device to automatically annotate such data as a "sleeping" activity. A wearer of the biometric monitoring device may, alternatively, indicate that the annotated biometric data is associated with a particular activity, e.g., by entering a label or other identifier of the activity in association with the annotated data after the biometric data is exported from the portable monitoring device to one or more back-end servers via a website, web application, mobile application, or other application or by inputting such a label or other identifier into an external computing device (for example, a smartphone, multimedia device, or personal computer) that is paired with the portable monitoring device and within communication range of the portable monitoring device, and particularly the I/O interface 110.
  • In some other example implementations, the portable monitoring device 100 may automatically detect or determine when the user is attempting to go to sleep, entering sleep, is asleep, or is awoken from a period of sleep. In some such implementations, the portable monitoring device 100 may employ physiological, motion or other sensors to acquire activity data. In some such implementations, the processing unit 104 then correlates a combination of one or more of: motion, heart rate, heart rate variability, respiration rate, galvanic skin response, or skin or body temperature sensing to detect or determine if the user is attempting to go to sleep, entering sleep, is asleep or is awoken from a period of sleep. In response, the portable monitoring device 100 may, for example, acquire physiological data (such as of the type and in the manner as described herein) or determine physiological conditions of the user (such as of the type and in the manner as described herein). For example, a decrease or cessation of user motion combined with a reduction in user heart rate and/or a change in heart rate variability may indicate that the user has fallen asleep. Subsequent changes in heart rate variability and galvanic skin response may be used to determine transitions of the user's sleep state between two or more stages of sleep (for example, into lighter and/or deeper stages of sleep). Motion by the user and/or an elevated heart rate and/or a change in heart rate variability may be used to determine that the user has awoken.
  • Real-time, windowed, or batch processing may be used to determine the transitions between wake, sleep, and sleep stages, as well as transitions between other activity stages. For instance, a decrease in heart rate may be measured in a time window where the heart rate is elevated at the start of the window and reduced in the middle (and/or end) of the window. The awake and sleep stages may be classified by a hidden Markov model using changes in motion signal (e.g., decreasing intensity), heart rate, heart rate variability, skin temperature, galvanic skin response, and/or ambient light levels. The transition points may be determined through a changepoint algorithm (e.g., Bayesian changepoint analysis). The transition between awake and sleep may be determined by observing periods where the user's heart rate decreases over a predetermined time duration by at least a certain threshold but within a predetermined margin of the user's resting heart rate (which is observed as, for instance, the minimum heart rate of the user while sleeping). Similarly, the transition between sleep and awake may be determined by observing an increase in the user's heart rate above a predetermined threshold of the user's resting heart rate.
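  • A much-simplified, threshold-only version of the sleep/wake transition logic is sketched below; the margins are assumed values, and the hidden Markov model and changepoint analysis mentioned above are not shown.

      # Illustrative sleep/wake transition detection from per-minute heart-rate
      # samples relative to a resting heart rate (thresholds are assumptions).
      def sleep_transitions(heart_rates, resting_hr, sleep_margin=5, wake_margin=15):
          """Marks 'fell_asleep' when HR drops to within sleep_margin of the
          resting rate, and 'woke_up' when HR rises wake_margin above it."""
          events, asleep = [], False
          for minute, hr in enumerate(heart_rates):
              if not asleep and hr <= resting_hr + sleep_margin:
                  events.append((minute, "fell_asleep"))
                  asleep = True
              elif asleep and hr >= resting_hr + wake_margin:
                  events.append((minute, "woke_up"))
                  asleep = False
          return events

      hr = [72, 70, 66, 58, 55, 54, 55, 56, 74, 78]
      print(sleep_transitions(hr, resting_hr=52))
      # [(4, 'fell_asleep'), (8, 'woke_up')]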
  • In some implementations, a back-end server determines which activity metrics to calculate or analyze based on annotation data generated by the server or another server and stored in one or both of the servers, annotation data received from an external computing device, or annotation data also received from the portable monitoring device 100. Additionally, the servers also can determine which activity metrics to calculate or analyze based on an analysis of the tracked activity data. In some such implementations, the portable monitoring device 100 may not track, determine, calculate or analyze any activity metrics at all; rather, the portable monitoring device may monitor the sensed activity data and subsequently store or transmit the activity data for later analysis and processing by an external computing device or back-end servers.
  • As described above, one or more output mechanisms (visual, auditory, or motion/vibration) may be used alone or in any combination with each other or another method of communication to communicate any one of or a plurality of the following information notifications: that a user needs to wake up at a certain time (e.g., an alarm); that a user should wake up as they are in a certain sleep phase (e.g., a smart alarm); that a user should go to sleep as it is a certain time; that a user should wake up as they are in a certain sleep phase or stage and in a preselected or previously-user-defined time window bounded by the earliest and latest time that the user wants to wake up; that an email, text or other communication was received; that the user has been inactive for a certain period of time (such a notification function may integrate with other applications like, for instance, a meeting calendar or sleep tracking application to block out, reduce, or adjust the behavior of the inactivity alert); that the user has been active for a certain period of time; that the user has an appointment or calendar event (e.g., a reminder); or that the user has reached a certain activity metric or combination of activity metrics. Also as described above, one or more output mechanisms (visual, auditory, or motion/vibration) may be used alone or in any combination with each other or another method of communication to communicate that the user has met or achieved or made progress towards one or more of the following goals: the traversal of a certain distance; the achievement of a certain mile (or other lap) pace; the achievement of a certain speed; the achievement of a certain elevation gain; the achievement of a certain number of steps; the achievement of a certain maximum or average heart rate; the completion of a certain number of swimming strokes or laps in a pool.
  • These examples are provided for illustration and are not intended to limit the scope of information that may be communicated by the device (for example, to the user). As described above, the data used to determine whether or not a goal is achieved or whether the condition for an alert has been met may be acquired from the portable monitoring device 100 or another device. The portable monitoring device 100 itself may determine whether the criteria for an alert, goal, or notification have been met. Alternatively, a computing device in communication with the device (e.g., a server and/or a mobile phone) may determine when the alert should occur. In view of this disclosure, other information that the device may communicate to the user can be envisioned by one of ordinary skill in the art. For example, the device may communicate with the user when a goal has been met. The criteria for meeting this goal may be based on physiological, contextual, and environmental sensors on a first device, and/or other sensor data from one or more secondary devices. The goal may be set by the user or may be set by the device itself and/or another computing device in communication with the device (e.g., a server).
  • In one example implementation, upon detecting or determining that the user has reached a biometric or activity goal, the portable monitoring device 100 may vibrate to notify the user. For example, the portable monitoring device 100 may detect (or be informed) that the wearer has exceeded a predefined goal or achievement threshold, for example, 10,000 steps taken in one day, and may, responsive to such an event, vibrate to alert or congratulate the user. In some such implementations, if the user then presses a button, the display may turn on and present data about the goal that the user reached (for example, what goal was reached, whether the goal was previously reached one or more times on a different day, week, month, or year, or how long it took to reach the goal). In another example, the color and/or intensity of one or more LEDs may serve as notifications that the user is winning or losing against a friend in a competition in, for example, step count. In yet another example, the biometric monitoring device may be a wrist-mounted device that may vibrate or emit audio feedback to notify the user of an incoming email, text message, or other alert. In some such implementations, if the user then moves his or her wrist in a gesture similar to checking a watch, the display of the biometric monitoring device may be turned on and a data display page relating data relevant to the alert may be presented to the user. In yet another example, the biometric monitoring device may present increasingly noticeable feedback methods based on the importance or urgency of the alert. For example, a high priority alert may include audio, vibration, and/or visual feedback, whereas a low priority alert may only include visual feedback. The criteria to distinguish a high priority alert from lower-priority alerts may be defined by the user. For example, a high-priority alert may be triggered if an email message or text is sent with a particular priority, e.g., "urgent," if an email message or text is sent from a particular person, e.g., a person that the user has identified as being high-priority, if a meeting notification or reminder is received or occurs, if a certain goal is achieved, or if a dangerous health condition, such as a high heart rate, is detected.
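  • The priority-based selection of feedback channels might be sketched as follows; the priority rules, sender list, and channel combinations are assumptions made for the example.

      # Illustrative choice of feedback channels based on alert priority.
      HIGH_PRIORITY_SENDERS = {"boss@example.com"}  # hypothetical user setting

      def alert_channels(alert):
          """alert: dict with 'kind', 'priority', and optionally 'sender'."""
          high = (alert.get("priority") == "urgent"
                  or alert.get("sender") in HIGH_PRIORITY_SENDERS
                  or alert.get("kind") == "health_warning")
          if high:
              return ["vibration", "audio", "display"]
          return ["display"]  # low-priority alerts stay unobtrusive

      print(alert_channels({"kind": "email", "priority": "urgent"}))
      print(alert_channels({"kind": "email", "priority": "normal",
                            "sender": "newsletter@example.com"}))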
  • As described above, in some other implementations, the portable monitoring device 100 may operate within or according to a plurality of modes. For example, various modes may include: a general or default activity-tracking mode such as the annotation mode described above, a timer mode, a stopwatch mode, a clock/time/watch mode, a sleep-monitoring (or “sleep-tracking”) mode, a work mode, a home mode, a commute mode, as well as one or more activity-specific activity-tracking modes for tracking user activities such as biking, swimming, walking, running, stair-climbing, rock climbing, weight-lifting, treadmill exercise, and elliptical machine exercise. In some multi-mode implementations, the portable monitoring device 100 also enables a user to annotate activity data monitored in one or more modes including one or more activity-specific activity-tracking modes as described above.
  • The processing unit 104 may automatically determine or select a mode for the device to operate in based on a plurality of signals, data or other information. For example, the processing unit may automatically select a mode based on one or more activity metrics (for example, a step count, stair or floor count, or a number of calories burned) or, additionally or alternatively, based on one or more of: contextual or environmental data (for example, time of day, GPS or other determined or entered location or position data, ambient light brightness, temperature, or humidity); physiological or other person-centric data (for example, heart rate, body temperature, hydration level, or blood oxygen level); or system condition data (for example, in response to a low battery or low memory); or based on one or more user-defined conditions being met.
  • In some implementations, the portable monitoring device itself can determine which activity data to monitor, or, additionally or alternatively, which activity (or sleep) metrics (hereinafter “sleep metrics” also may generally be referred to as “activity metrics”) to determine, calculate or analyze, based on which of the activity-tracking or other modes is currently active or initiated. Additionally or alternatively, in some implementations, one or both of an external computing device or a back-end server can request certain activity data from the portable monitoring device based on which of the activity-tracking modes is currently active or initiated. Additionally or alternatively, in some implementations, one or both of an external computing device or a back-end server can receive all activity data monitored by the portable monitoring device and subsequently filter or otherwise selectively process certain activity data to determine, calculate or analyze certain activity metrics based on which of the activity-tracking modes is currently active or initiated.
  • In some multi-mode implementations, a user can select which of the modes is currently active or initiated via the user interface 108, or via an application or program (including a web application, mobile application, or client-side software program) executing on an external computing device (for example, a personal computer, smartphone or multimedia device) communicatively coupled with the portable monitoring device 100 via the I/O interface 110 and one or more wired or wireless connections or networks. For example, a user may select the mode of the portable monitoring device 100 using an application on a smartphone that sends the mode selection to a server. The server, in turn, sends the mode selection to an external computing device that then sends the mode selection to the portable monitoring device 100 via the I/O interface 110. Alternatively, the smart phone application (or the server) may send the mode selection directly to the portable monitoring device 100.
  • In some implementations, a user also can select which activity metrics to track while in each of the corresponding activity-tracking modes. As described above, in some implementations, the portable monitoring device 100 also can be configured to automatically switch among two or more activity-tracking or other modes. In some such implementations, the processing unit 104 can analyze the activity data from the sensors 102 and automatically determine a most suitable, appropriate, or optimal activity-tracking or other mode to switch into based on the analysis of the activity data dynamically in substantially real-time. In some other such implementations, the processing unit 104 can transmit the activity data via I/O interface 110 through a wired or wireless connection to one or both of an external computing device or back-end server that then analyzes the activity data, determines the most suitable, appropriate, or optimal activity-tracking or other mode to switch into, and subsequently transmits one or more instructions to the portable monitoring device 100 that, when executed by the processing unit 104, cause the processing unit 104 (in conjunction with one or more other components described above) to switch into the determined mode.
  • In some implementations, the portable monitoring device 100 includes an alarm clock function intended to wake the wearer or user from sleep or otherwise alert the user. In some such implementations, the portable monitoring device 100 acts as a wrist-mounted vibrating alarm to silently wake the user from sleep. The portable monitoring device also can be configured to track the user's sleep quality, waking periods, sleep latency, sleep efficiency, sleep stages (e.g., deep sleep vs REM), or other sleep-related metrics through one or a combination of heart rate, heart rate variability, galvanic skin response, motion sensing (e.g., accelerometer, gyroscope, magnetometer), and skin temperature. In some implementations, the user may specify a desired alarm time or window of time (e.g. set alarm to go off between 7 and 8 am). In some such implementations, the processing unit 104 uses one or more of the sleep metrics to determine an optimal time within the alarm window to wake the user. In some implementations, when the vibrating alarm is active, the user may cause it to hibernate, snooze, or turn off by slapping or tapping the device (which is detected, for example, via motion sensor(s), a pressure/force sensor and/or capacitive touch sensor in the device). In one specific implementation, the portable monitoring device 100 can be configured to attempt to arouse the user at an optimum point in the sleep cycle by starting a small vibration at a specific user sleep stage or time prior to the alarm setting. It may progressively increase the intensity or noticeability of the vibration as the user progresses toward wakefulness or toward the alarm setting. Similar to the way a conventional alarm clock functions, the wearer or user may have the ability to set one or more daily, periodic, or other recurring alarms. Additionally, the alarm function can be configured to “snooze,” i.e., temporarily stop the alarm for a short period of time, typically minutes, and then have the alarm re-trigger.
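  • One way to express the smart-alarm decision is sketched below: inside the user-defined window the device wakes the user at a light sleep stage if possible, and otherwise ramps the vibration toward the latest allowed time. The stage names, the window handling, and the intensity ramp are assumptions for this illustration.

      # Illustrative smart-alarm decision. Times are minutes since midnight.
      def smart_alarm_action(now_min, window_start_min, window_end_min, sleep_stage):
          if now_min < window_start_min:
              return "wait"
          if now_min >= window_end_min:
              return "vibrate_full"        # latest allowed wake time reached
          if sleep_stage in ("light", "rem"):
              return "vibrate_gentle"      # opportune moment to wake the user
          # Deep sleep inside the window: increase intensity as the end nears.
          progress = (now_min - window_start_min) / (window_end_min - window_start_min)
          return f"vibrate_ramp:{round(progress, 2)}"

      print(smart_alarm_action(7 * 60 + 20, 7 * 60, 8 * 60, "light"))  # vibrate_gentle
      print(smart_alarm_action(7 * 60 + 45, 7 * 60, 8 * 60, "deep"))   # vibrate_ramp:0.75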
  • As a result of the small size of many portable monitoring devices, many such monitoring devices have limited space to accommodate various user interface components. For example, Fitbit manufactures a variety of extremely compact portable monitoring devices, including biometric tracking units, that each incorporate a suite of sensors, a battery, a display, a power-charging interface, and one or more wireless communications interfaces. In some such examples, the portable monitoring devices also incorporate a vibramotor and/or a button. These components may be housed, for example, within housings measuring approximately 2″ long, 0.75″ wide, and 0.5″ thick (Fitbit Ultra™); approximately 1.9″ in length, 0.75″ wide, and 0.375″ thick (Fitbit One™); approximately 1.4″ long, 1.1″ wide, and 0.375″ thick (Fitbit Zip™); and approximately 1.3″ in length, 0.5″ wide, and 0.25″ thick (Fitbit Flex™). Of course, housings of other sizes may be used in other implementations of biometric monitoring devices; the above list is merely intended to illustrate the small size of many such biometric monitoring devices.
  • Despite the small sizes of the above-listed Fitbit devices, each includes a display of some type—the Fitbit Ultra, Fitbit One, and Fitbit Zip, for example, all include small pixelated display screens capable of outputting text, numbers, and graphics. The Fitbit Flex, due to its smaller size, uses discrete light-emitting diode (LED) indicators, e.g., 5 LEDs arranged in a row, to convey information visually. Each of the above-listed Fitbit devices also has an input mechanism that allows a user to affect some aspect of the device's operation. For example, the Fitbit Ultra and Fitbit One each include a discrete pushbutton that allows a user to affect how the device operates. The Fitbit Zip and Fitbit Flex, by contrast, do not have a discrete pushbutton but are instead each configured to detect, using their biometric sensors, when the user taps the housing of the device; such events are construed by the processor or processors of such devices as signaling a user input, i.e., acting as the input mechanism.
  • One component of the portable monitoring device 100 that may be limited in size or performance is the power source 114, for example, a rechargeable, removable, or replaceable battery, capacitor, etc. In some implementations, the portable monitoring device 100 can be configured to remain in an “always on” state to allow it to continually collect activity data throughout the day and night. Given that the sensors 102 and processing unit 104 of the portable monitoring device must generally remain powered to some degree in order to collect the activity data, it can be advantageous to implement power-saving features elsewhere in the device, such as by, for example, causing a display to automatically turn off after a period of time. The Fitbit Ultra™ is an example of a portable monitoring device that includes a data display that is typically turned off to save power unless the device is being interacted with by the user. A typical user interaction may be provided by, for example, pressing a button on the device.
  • In some implementations, a housing of the portable monitoring device 100 itself is designed or configured such that it may be inserted into, and removed from, a plurality of compatible cases, housings, or holders (hereinafter "cases," "housings," and "holders" may be used interchangeably). For example, in some implementations, the portable monitoring device 100 is configured for removable insertion into a wristband or armband that can be worn on a person's wrist, forearm or upper arm. In some implementations, the portable monitoring device is additionally or alternatively configured for removable insertion into a belt-clip case or configured for coupling with a clip that can be attached to a person's belt or clothing. As used herein, the term "wristband" may refer to a band that is designed to fully or partially encircle a person's forearm near the wrist joint. The band can be continuous, for example, without any "breaks"; that is, it may stretch to fit over a person's hand or have an expanding portion similar to a dress watchband. Alternatively, the band can be discontinuous, for example, having a clasp or other connection enabling a user to close the band similar to a watchband. In still other implementations, the band can simply be "open," for example, having a C-shape that clasps the wearer's wrist. Hereinafter, a portable monitoring device that is inserted, combined, or otherwise coupled with a separate removable case or some other structure enabling it to be worn or easily carried by or attached to a person or his clothing may be referred to as a "portable monitoring system."
  • As mentioned above, various implementations of portable monitoring devices described herein may have shapes and sizes adapted for coupling to the body or clothing of a user (e.g., secured to, worn, borne by, etc.). Various examples of such portable monitoring devices are shown in FIGS. 2, 3, and 4. FIG. 2 depicts a monitoring device similar in shape to a Fitbit One, which may be inserted into a holder with a belt clip or into a pocket on a wristband. Portable monitoring device 200 has a housing 202 that contains the electronics associated with the biometric monitoring device 200. A button 204 and a display 206 may be accessible/visible through the housing 202. FIG. 3 depicts a portable monitoring device that may be worn on a person's forearm like a wristwatch, much like a Fitbit Flex. Portable monitoring device 300 has a housing 302 that contains the electronics associated with the biometric monitoring device 300. A button 304 and a display 306 may be accessible/visible through the housing 302. A wristband 308 may be integrated with the housing 302. FIG. 4 depicts another example of a portable monitoring device that may be worn on a person's forearm like a wristwatch, although with a bigger display than the portable monitoring device of FIG. 3. Portable monitoring device 400 has a housing 402 that contains the electronics associated with the portable monitoring device 400. A button 404 and a display 406 may be accessible/visible through the housing 402. A wristband 408 may be integrated with the housing 402.
  • Further embodiments and implementations of portable monitoring devices can be found in U.S. patent application Ser. No. 13/156,304, titled “Portable Biometric Monitoring Devices and Methods of Operating Same” filed Jun. 8, 2011 which is hereby incorporated by reference in its entirety.
  • Unless the context (where the term “context” is used per its typical, general definition) of this disclosure clearly requires otherwise, throughout the description and the claims, the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is to say, in a sense of “including, but not limited to.” Words using the singular or plural number also generally include the plural or singular number respectively. Additionally, the words “herein,” “hereunder,” “above,” “below,” and words of similar import refer to this application as a whole and not to any particular portions of this application. When the word “or” is used in reference to a list of two or more items, that word covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list. The term “implementation” refers to implementations of techniques and methods described herein, as well as to physical objects that embody the structures and/or incorporate the techniques and/or methods described herein.
  • There are many concepts and implementations described and illustrated herein. While certain features, attributes and advantages of the implementations discussed herein have been described and illustrated, many others, as well as different and/or similar implementations, features, attributes and advantages, are apparent from the description and illustrations. As such, the above implementations are merely exemplary and are not intended to be exhaustive or to limit the disclosure to the precise forms, techniques, materials and/or configurations disclosed. Many modifications and variations are possible in light of this disclosure. Other implementations may be utilized and operational changes may be made without departing from the scope of the present disclosure. As such, the scope of the disclosure is not limited solely to the description above because the description of the above implementations has been presented for the purposes of illustration and description.
  • Importantly, the present disclosure is neither limited to any single aspect nor implementation, nor to any single combination and/or permutation of such aspects and/or implementations. Moreover, each of the aspects of the present disclosure, and/or implementations thereof, may be employed alone or in combination with one or more of the other aspects and/or implementations thereof. For the sake of brevity, many of those permutations and combinations will not be discussed and/or illustrated separately herein.

Claims (2)

1. A device comprising:
one or more motion sensors for sensing motion of the device and providing activity data indicative of the sensed motion;
one or more processors for:
monitoring the activity data; and
receiving or generating annotation data for annotating the activity data with one or more markers or indicators to define one or more characteristics of an activity session;
one or more feedback devices for providing feedback, a notice, or an indication to a user based on the monitoring; and
a portable housing that encloses at least portions of the motion sensors, the processors and the feedback devices.
2-30. (canceled)
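For readers less familiar with claim language, the following is a minimal, non-normative sketch of the kind of data flow claim 1 recites: activity data from one or more motion sensors is monitored, annotation data adds markers that define characteristics of an activity session, and a feedback notice is provided to the user based on the monitoring. The code is illustrative only; names such as ActivitySession, Marker, monitor, and annotate are hypothetical and do not appear in the claims or specification.

    # Illustrative sketch only -- not the claimed implementation.
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Marker:
        """Annotation marker defining one characteristic of an activity session."""
        timestamp: float   # seconds since session start (hypothetical unit)
        label: str         # e.g. "session_start", "goal_reached", "session_end"

    @dataclass
    class ActivitySession:
        """Holds activity data from the motion sensors plus annotation data."""
        samples: List[float] = field(default_factory=list)
        markers: List[Marker] = field(default_factory=list)

        def annotate(self, timestamp: float, label: str) -> None:
            # Receive or generate annotation data for the activity data.
            self.markers.append(Marker(timestamp, label))

        def monitor(self, timestamp: float, sample: float, goal: float) -> Optional[str]:
            # Monitor incoming activity data; return a feedback notice when warranted.
            self.samples.append(sample)
            if sum(self.samples) >= goal:
                self.annotate(timestamp, "goal_reached")
                return "Goal reached - provide feedback to the user."
            return None

    if __name__ == "__main__":
        session = ActivitySession()
        session.annotate(0.0, "session_start")
        for t, magnitude in enumerate([0.4, 0.9, 1.3, 2.1], start=1):
            notice = session.monitor(float(t), magnitude, goal=4.0)
            if notice:
                print(notice)   # stand-in for a feedback device (display, vibration, etc.)
        session.annotate(5.0, "session_end")

In this sketch the feedback device is stood in for by a printed notice; a device as described above would instead drive, for example, a display or haptic element enclosed in the portable housing.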
US14/029,759 2013-01-15 2013-09-17 Portable monitoring devices and methods of operating the same Abandoned US20140197963A1 (en)

Priority Applications (10)

Application Number Priority Date Filing Date Title
US14/029,759 US20140197963A1 (en) 2013-01-15 2013-09-17 Portable monitoring devices and methods of operating the same
US14/062,717 US8903671B2 (en) 2013-01-15 2013-10-24 Portable monitoring devices and methods of operating the same
CN201410475447.4A CN104434315B (en) 2013-01-15 2014-09-17 Portable Monitoring Devices and Methods of Operating Same
CN202110530090.5A CN113367689A (en) 2013-01-15 2014-09-17 Portable monitoring device and method of operating the same
CN201710251926.1A CN107260178B (en) 2013-01-15 2014-09-17 Portable monitoring device and method of operating the same
US14/524,909 US9286789B2 (en) 2013-01-15 2014-10-27 Portable monitoring devices and methods of operating the same
US15/017,356 US9600994B2 (en) 2013-01-15 2016-02-05 Portable monitoring devices and methods of operating the same
US15/427,638 US10134256B2 (en) 2013-01-15 2017-02-08 Portable monitoring devices and methods of operating the same
US16/167,386 US11423757B2 (en) 2013-01-15 2018-10-22 Portable monitoring devices and methods of operating the same
US17/892,400 US12002341B2 (en) 2013-01-15 2022-08-22 Portable monitoring devices and methods of operating the same

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201361752826P 2013-01-15 2013-01-15
US201361830600P 2013-06-03 2013-06-03
US14/029,759 US20140197963A1 (en) 2013-01-15 2013-09-17 Portable monitoring devices and methods of operating the same

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/062,717 Continuation US8903671B2 (en) 2013-01-15 2013-10-24 Portable monitoring devices and methods of operating the same

Publications (1)

Publication Number Publication Date
US20140197963A1 true US20140197963A1 (en) 2014-07-17

Family

ID=51164718

Family Applications (9)

Application Number Title Priority Date Filing Date
US14/029,759 Abandoned US20140197963A1 (en) 2013-01-15 2013-09-17 Portable monitoring devices and methods of operating the same
US14/029,760 Active US9098991B2 (en) 2013-01-15 2013-09-17 Portable monitoring devices and methods of operating the same
US14/062,717 Active US8903671B2 (en) 2013-01-15 2013-10-24 Portable monitoring devices and methods of operating the same
US14/524,909 Active 2033-12-09 US9286789B2 (en) 2013-01-15 2014-10-27 Portable monitoring devices and methods of operating the same
US14/749,386 Active US9773396B2 (en) 2013-01-15 2015-06-24 Portable monitoring devices and methods of operating the same
US15/017,356 Active US9600994B2 (en) 2013-01-15 2016-02-05 Portable monitoring devices and methods of operating the same
US15/427,638 Active 2034-01-01 US10134256B2 (en) 2013-01-15 2017-02-08 Portable monitoring devices and methods of operating the same
US16/167,386 Active 2036-05-20 US11423757B2 (en) 2013-01-15 2018-10-22 Portable monitoring devices and methods of operating the same
US17/892,400 Active US12002341B2 (en) 2013-01-15 2022-08-22 Portable monitoring devices and methods of operating the same

Country Status (2)

Country Link
US (9) US20140197963A1 (en)
CN (4) CN107260178B (en)

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104167079A (en) * 2014-08-01 2014-11-26 青岛歌尔声学科技有限公司 Rescue method and device applied to multiple scenes
US8903671B2 (en) 2013-01-15 2014-12-02 Fitbit, Inc. Portable monitoring devices and methods of operating the same
US20150305655A1 (en) * 2014-04-25 2015-10-29 Speedo International Limited Activity Monitors
US9582072B2 (en) 2013-09-17 2017-02-28 Medibotics Llc Motion recognition clothing [TM] with flexible electromagnetic, light, or sonic energy pathways
US9588582B2 (en) 2013-09-17 2017-03-07 Medibotics Llc Motion recognition clothing (TM) with two different sets of tubes spanning a body joint
US20170092111A1 (en) * 2015-09-30 2017-03-30 Xiaomi Inc. Method and device for processing abnormality notification from a smart device
US20170135593A1 (en) * 2015-11-13 2017-05-18 Acme Portable Corp. Wearable device which diagnoses personal cardiac health condition by monitoring and analyzing heartbeat and the method thereof
US10130277B2 (en) 2014-01-28 2018-11-20 Medibotics Llc Willpower glasses (TM)—a wearable food consumption monitor
US10321873B2 (en) 2013-09-17 2019-06-18 Medibotics Llc Smart clothing for ambulatory human motion capture
WO2019235817A1 (en) * 2018-06-07 2019-12-12 Samsung Electronics Co., Ltd. Electronic device for providing exercise information using biometric information and operating method thereof
US10535243B2 (en) 2016-10-28 2020-01-14 HBH Development LLC Target behavior monitoring system
US10602965B2 (en) 2013-09-17 2020-03-31 Medibotics Wearable deformable conductive sensors for human motion capture including trans-joint pitch, yaw, and roll
US10716510B2 (en) 2013-09-17 2020-07-21 Medibotics Smart clothing with converging/diverging bend or stretch sensors for measuring body motion or configuration
US10772559B2 (en) 2012-06-14 2020-09-15 Medibotics Llc Wearable food consumption monitor
US10796549B2 (en) 2014-02-27 2020-10-06 Fitbit, Inc. Notifications on a user device based on activity detected by an activity monitoring device
US10792461B2 (en) 2014-06-05 2020-10-06 Eight Sleep, Inc. Methods and systems for gathering and analyzing human biological signals
US11666284B2 (en) 2018-01-09 2023-06-06 Eight Sleep Inc. Systems and methods for detecting a biological signal of a user of an article of furniture
US11904103B2 (en) 2018-01-19 2024-02-20 Eight Sleep Inc. Sleep pod
US11990019B2 (en) 2014-02-27 2024-05-21 Fitbit, Inc. Notifications on a user device based on activity detected by an activity monitoring device

Families Citing this family (205)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8487759B2 (en) 2009-09-30 2013-07-16 Apple Inc. Self adapting haptic device
US10013058B2 (en) 2010-09-21 2018-07-03 Apple Inc. Touch-based user interface with haptic feedback
US9310909B2 (en) 2010-09-30 2016-04-12 Fitbit, Inc. Methods, systems and devices for physical contact activated display and navigation
US10120446B2 (en) 2010-11-19 2018-11-06 Apple Inc. Haptic input device
US20140210640A1 (en) * 2011-06-10 2014-07-31 Aliphcom Data-capable band management in an integrated application and network communication data environment
JP5348192B2 (en) * 2011-07-11 2013-11-20 日本電気株式会社 Work support system, terminal, method and program
US9339691B2 (en) 2012-01-05 2016-05-17 Icon Health & Fitness, Inc. System and method for controlling an exercise device
US9178509B2 (en) 2012-09-28 2015-11-03 Apple Inc. Ultra low travel keyboard
US10244986B2 (en) 2013-01-23 2019-04-02 Avery Dennison Corporation Wireless sensor patches and methods of manufacturing
JP6098216B2 (en) * 2013-02-20 2017-03-22 株式会社デンソー Timer reminder
WO2014132965A1 (en) * 2013-02-27 2014-09-04 Necカシオモバイルコミュニケーションズ株式会社 Portable electronic device, control method therefor, and program
US9254409B2 (en) 2013-03-14 2016-02-09 Icon Health & Fitness, Inc. Strength training apparatus with flywheel and related methods
USD741855S1 (en) * 2013-05-16 2015-10-27 Samsung Electronics Co., Ltd. Smart watch
USD750623S1 (en) * 2013-05-16 2016-03-01 Samsung Electronics Co., Ltd. Smart watch
US9652040B2 (en) 2013-08-08 2017-05-16 Apple Inc. Sculpted waveforms with no or reduced unforced response
US10563981B1 (en) * 2013-08-22 2020-02-18 Moov Inc. Automated motion data processing
JP6309728B2 (en) * 2013-09-18 2018-04-11 シャープ株式会社 Band type radio
US9779592B1 (en) 2013-09-26 2017-10-03 Apple Inc. Geared haptic feedback element
WO2015047356A1 (en) * 2013-09-27 2015-04-02 Bodhi Technology Ventures Llc Band with haptic actuators
WO2015047343A1 (en) 2013-09-27 2015-04-02 Honessa Development Laboratories Llc Polarized magnetic actuators for haptic response
US10126817B2 (en) 2013-09-29 2018-11-13 Apple Inc. Devices and methods for creating haptic effects
CN105578923B (en) 2013-09-29 2019-03-08 苹果公司 Can connecting component identification method, apparatus and system
US10236760B2 (en) 2013-09-30 2019-03-19 Apple Inc. Magnetic actuators for haptic response
US9063164B1 (en) * 2013-10-02 2015-06-23 Fitbit, Inc. Collaborative activity-data acquisition
US9588591B2 (en) * 2013-10-10 2017-03-07 Google Technology Holdings, LLC Primary device that interfaces with a secondary device based on gesture commands
JP5942956B2 (en) 2013-10-11 2016-06-29 セイコーエプソン株式会社 MEASUREMENT INFORMATION MANAGEMENT SYSTEM, INFORMATION DEVICE, MEASUREMENT INFORMATION MANAGEMENT METHOD, AND MEASUREMENT INFORMATION MANAGEMENT PROGRAM
JP2015073826A (en) * 2013-10-11 2015-04-20 セイコーエプソン株式会社 Biological information measuring instrument
US11033238B2 (en) 2013-10-16 2021-06-15 University of Central Oklahoma Intelligent apparatus for guidance and data capture during physical repositioning of a patient on a sleep platform
US10182766B2 (en) * 2013-10-16 2019-01-22 University of Central Oklahoma Intelligent apparatus for patient guidance and data capture during physical therapy and wheelchair usage
US9317118B2 (en) 2013-10-22 2016-04-19 Apple Inc. Touch surface for simulating materials
US9092743B2 (en) * 2013-10-23 2015-07-28 Stenomics, Inc. Machine learning system for assessing heart valves and surrounding cardiovascular tracts
EP3060107A1 (en) 2013-10-23 2016-08-31 Quanttus, Inc. Consumer biometric devices
US12080421B2 (en) 2013-12-04 2024-09-03 Apple Inc. Wellness aggregator
US20160019360A1 (en) 2013-12-04 2016-01-21 Apple Inc. Wellness aggregator
WO2015088491A1 (en) 2013-12-10 2015-06-18 Bodhi Technology Ventures Llc Band attachment mechanism with haptic response
JP6289892B2 (en) * 2013-12-13 2018-03-07 セイコーインスツル株式会社 Electronic device, data processing method, and data processing program
JP6372077B2 (en) * 2013-12-25 2018-08-15 セイコーエプソン株式会社 Biological information measuring device and method for controlling biological information measuring device
WO2015100429A1 (en) 2013-12-26 2015-07-02 Icon Health & Fitness, Inc. Magnetic resistance mechanism in a cable machine
US10417900B2 (en) 2013-12-26 2019-09-17 Intel Corporation Techniques for detecting sensor inputs on a wearable wireless device
US20150190077A1 (en) * 2014-01-07 2015-07-09 Samsung Electronics Co., Ltd. Electronic device and photoplethysmography method
US10600291B2 (en) 2014-01-13 2020-03-24 Alexis Ander Kashar System and method for alerting a user
US10274908B2 (en) 2014-01-13 2019-04-30 Barbara Ander System and method for alerting a user
US9852656B2 (en) * 2014-01-13 2017-12-26 Barbara Ander Alarm monitoring system
US10254836B2 (en) * 2014-02-21 2019-04-09 Immersion Corporation Haptic power consumption management
US9594443B2 (en) * 2014-02-26 2017-03-14 Lenovo (Singapore) Pte. Ltd. Wearable device authentication and operation
CN103823562B (en) * 2014-02-28 2017-07-11 惠州Tcl移动通信有限公司 Method, system and wearable device that automatically prompting user is slept
EP3153093B1 (en) 2014-02-28 2019-04-03 Valencell, Inc. Method and apparatus for generating assessments using physical activity and biometric parameters
KR102348942B1 (en) * 2014-02-28 2022-01-11 삼성전자 주식회사 Displaying Method Of Health Information And Electronic Device Supporting The Same
WO2015138339A1 (en) 2014-03-10 2015-09-17 Icon Health & Fitness, Inc. Pressure sensor to quantify work
US20160367172A1 (en) * 2014-03-12 2016-12-22 Seiko Epson Corporation Biological information measuring device and method of controlling biological information measuring device
WO2015163842A1 (en) 2014-04-21 2015-10-29 Yknots Industries Llc Apportionment of forces for multi-touch input devices of electronic devices
USD750624S1 (en) * 2014-04-23 2016-03-01 Leapfrog Enterprises, Inc. Activity bracelet
JP6453558B2 (en) * 2014-05-22 2019-01-16 京セラ株式会社 Electronic device, electronic device control method, electronic device control program, and electronic device control system
DE102015209639A1 (en) 2014-06-03 2015-12-03 Apple Inc. Linear actuator
WO2015191445A1 (en) 2014-06-09 2015-12-17 Icon Health & Fitness, Inc. Cable system incorporated into a treadmill
WO2015195965A1 (en) 2014-06-20 2015-12-23 Icon Health & Fitness, Inc. Post workout massage device
AU2015279545B2 (en) 2014-06-27 2018-02-22 Apple Inc. Manipulation of calendar application in device with touch screen
US10229192B2 (en) * 2014-07-14 2019-03-12 Under Armour, Inc. Hierarchical de-duplication techniques for tracking fitness metrics
US9858328B2 (en) 2014-07-17 2018-01-02 Verily Life Sciences, LLC Data tagging
EP4439231A2 (en) 2014-07-21 2024-10-02 Apple Inc. Remote user interface
US20160034661A1 (en) * 2014-07-29 2016-02-04 Shailesh Dinkar Govande Accessing content based on a health assessment
US9538921B2 (en) * 2014-07-30 2017-01-10 Valencell, Inc. Physiological monitoring devices with adjustable signal analysis and interrogation power and monitoring methods using same
US9814986B2 (en) * 2014-07-30 2017-11-14 Hasbro, Inc. Multi sourced point accumulation interactive game
US9746901B2 (en) 2014-07-31 2017-08-29 Google Technology Holdings LLC User interface adaptation based on detected user location
EP3742272B1 (en) 2014-08-02 2022-09-14 Apple Inc. Context-specific user interfaces
US9659159B2 (en) * 2014-08-14 2017-05-23 Sleep Data Services, Llc Sleep data chain of custody
US10452253B2 (en) 2014-08-15 2019-10-22 Apple Inc. Weather user interface
US20180227735A1 (en) 2014-08-25 2018-08-09 Phyziio, Inc. Proximity-Based Attribution of Rewards
EP4050467A1 (en) 2014-09-02 2022-08-31 Apple Inc. Phone user interface
WO2016036671A2 (en) 2014-09-02 2016-03-10 Apple Inc. Haptic notifications
US9526430B2 (en) 2014-09-02 2016-12-27 Apple Inc. Method and system to estimate day-long calorie expenditure based on posture
EP4462246A2 (en) 2014-09-02 2024-11-13 Apple Inc. User interface for receiving user input
KR102096146B1 (en) 2014-09-02 2020-04-28 애플 인크. Semantic framework for variable haptic output
US9952675B2 (en) 2014-09-23 2018-04-24 Fitbit, Inc. Methods, systems, and apparatuses to display visibility changes responsive to user gestures
US10667571B2 (en) 2014-10-17 2020-06-02 Guardhat, Inc. Condition responsive indication assembly and method
US10383384B2 (en) 2014-10-17 2019-08-20 Guardhat, Inc. Electrical connection for suspension band attachment slot of a hard hat
JP6790825B2 (en) * 2014-10-22 2020-11-25 ソニー株式会社 Information processing equipment, information processing methods, and programs
US9529396B2 (en) * 2014-11-28 2016-12-27 Asia Vital Components Co., Ltd. Heat dissipation structure of intelligent wearable device
US20160163181A1 (en) * 2014-12-08 2016-06-09 Intel Corporation Wearable alram clock
EP3238192B1 (en) * 2014-12-22 2021-02-17 Koninklijke Philips N.V. Method and device for providing an alarm
CN104580403B (en) * 2014-12-24 2017-03-01 腾讯科技(深圳)有限公司 A kind of data statistical approach and its system, user terminal, application server
US9641991B2 (en) 2015-01-06 2017-05-02 Fitbit, Inc. Systems and methods for determining a user context by correlating acceleration data from multiple devices
CN105847525A (en) * 2015-01-16 2016-08-10 阿里巴巴集团控股有限公司 Method, apparatus and system for controlling information reminding function of mobile terminal
JP2016131733A (en) * 2015-01-20 2016-07-25 セイコーエプソン株式会社 Biological information measurement apparatus
US10408482B2 (en) * 2015-02-11 2019-09-10 Bitfinder, Inc. Managing environmental conditions
US10485452B2 (en) * 2015-02-25 2019-11-26 Leonardo Y. Orellano Fall detection systems and methods
US10391361B2 (en) 2015-02-27 2019-08-27 Icon Health & Fitness, Inc. Simulating real-world terrain on an exercise device
US10244948B2 (en) 2015-03-06 2019-04-02 Apple Inc. Statistical heart rate monitoring for estimating calorie expenditure
US10353467B2 (en) 2015-03-06 2019-07-16 Apple Inc. Calibration of haptic devices
US10055121B2 (en) 2015-03-07 2018-08-21 Apple Inc. Activity based thresholds and feedbacks
US9792409B2 (en) 2015-03-13 2017-10-17 Kathryn A. Wernow Communicative water bottle and system thereof
USD862277S1 (en) 2015-03-16 2019-10-08 Fitbit, Inc. Set of bands for a fitness tracker
WO2016164485A1 (en) * 2015-04-08 2016-10-13 Amiigo, Inc. Dynamic adjustment of sampling rate based on a state of the user
US20160299978A1 (en) * 2015-04-13 2016-10-13 Google Inc. Device dependent search experience
AU2016100399B4 (en) 2015-04-17 2017-02-02 Apple Inc. Contracting and elongating materials for providing input and output for an electronic device
US20180148144A1 (en) * 2015-06-02 2018-05-31 Ecocraft Systems Pty Ltd Personal Safety Device
CN106310636B (en) * 2015-06-30 2018-10-12 昆达电脑科技(昆山)有限公司 Object wearing device and its vibration control method
US10098594B2 (en) * 2015-07-02 2018-10-16 Hisense Ltd Portable monitoring device, system and method for monitoring an individual
US10566888B2 (en) 2015-09-08 2020-02-18 Apple Inc. Linear actuators for use in electronic devices
US10699594B2 (en) 2015-09-16 2020-06-30 Apple Inc. Calculating an estimate of wind resistance experienced by a cyclist
US10620232B2 (en) 2015-09-22 2020-04-14 Apple Inc. Detecting controllers in vehicles using wearable devices
GB201516978D0 (en) * 2015-09-25 2015-11-11 Mclaren Applied Technologies Ltd Device control
JP2017086524A (en) * 2015-11-11 2017-05-25 セイコーエプソン株式会社 Fatigue degree control device, fatigue degree control system and fatigue degree determination method
KR102468820B1 (en) * 2015-11-17 2022-11-21 삼성전자주식회사 Device For Providing Health Management Service and Method Thereof
EP3173905B1 (en) * 2015-11-24 2019-06-19 Polar Electro Oy Enhancing controlling of haptic output
CN105611443B (en) * 2015-12-29 2019-07-19 歌尔股份有限公司 A kind of control method of earphone, control system and earphone
WO2017113387A1 (en) * 2015-12-31 2017-07-06 深圳市洛书和科技发展有限公司 Sensor capturing platform
CN105688382A (en) * 2016-01-21 2016-06-22 苏州盛世十月软件技术有限公司 Intelligent swimming armlet
US9870533B2 (en) * 2016-01-27 2018-01-16 Striiv, Inc. Autonomous decision logic for a wearable device
US10420514B2 (en) 2016-02-25 2019-09-24 Samsung Electronics Co., Ltd. Detection of chronotropic incompetence
US11164596B2 (en) 2016-02-25 2021-11-02 Samsung Electronics Co., Ltd. Sensor assisted evaluation of health and rehabilitation
US10172517B2 (en) 2016-02-25 2019-01-08 Samsung Electronics Co., Ltd Image-analysis for assessing heart failure
US10362998B2 (en) 2016-02-25 2019-07-30 Samsung Electronics Co., Ltd. Sensor-based detection of changes in health and ventilation threshold
US11042916B1 (en) 2016-02-29 2021-06-22 Canary Medical Inc. Computer-based marketplace for information
US10039080B2 (en) 2016-03-04 2018-07-31 Apple Inc. Situationally-aware alerts
US10272317B2 (en) 2016-03-18 2019-04-30 Icon Health & Fitness, Inc. Lighted pace feature in a treadmill
US10493349B2 (en) 2016-03-18 2019-12-03 Icon Health & Fitness, Inc. Display on exercise device
US10625137B2 (en) 2016-03-18 2020-04-21 Icon Health & Fitness, Inc. Coordinated displays in an exercise device
US10694994B2 (en) 2016-03-22 2020-06-30 Apple Inc. Techniques for jointly calibrating load and aerobic capacity
US10268272B2 (en) 2016-03-31 2019-04-23 Apple Inc. Dampening mechanical modes of a haptic actuator using a delay
JP6686656B2 (en) * 2016-04-14 2020-04-22 セイコーエプソン株式会社 Positioning control method and positioning device
US10687707B2 (en) 2016-06-07 2020-06-23 Apple Inc. Detecting activity by a wheelchair user
DK201770423A1 (en) 2016-06-11 2018-01-15 Apple Inc Activity and workout updates
DK179489B1 (en) 2016-06-12 2019-01-04 Apple Inc. Devices, methods and graphical user interfaces for providing haptic feedback
DK179823B1 (en) 2016-06-12 2019-07-12 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
US11903727B2 (en) * 2016-06-29 2024-02-20 Koninklijke Philips N.V. Method and device for health devices and wearable/implantable devices
CN106097654B (en) * 2016-07-27 2018-09-04 歌尔股份有限公司 A kind of fall detection method and wearable falling detection device
US10709933B2 (en) 2016-08-17 2020-07-14 Apple Inc. Pose and heart rate energy expenditure for yoga
US10687752B2 (en) 2016-08-29 2020-06-23 Apple Inc. Detecting unmeasurable loads using heart rate and work rate
US10617912B2 (en) 2016-08-31 2020-04-14 Apple Inc. Systems and methods of swimming calorimetry
US11896368B2 (en) 2016-08-31 2024-02-13 Apple Inc. Systems and methods for determining swimming metrics
KR102252269B1 (en) 2016-08-31 2021-05-14 애플 인크. Swimming analysis system and method
US10512406B2 (en) 2016-09-01 2019-12-24 Apple Inc. Systems and methods for determining an intensity level of an exercise using photoplethysmogram (PPG)
DK201670720A1 (en) 2016-09-06 2018-03-26 Apple Inc Devices, Methods, and Graphical User Interfaces for Generating Tactile Outputs
DK179278B1 (en) 2016-09-06 2018-03-26 Apple Inc Devices, methods and graphical user interfaces for haptic mixing
WO2018057667A1 (en) 2016-09-20 2018-03-29 Paradromics, Inc. Systems and methods for detecting corrupt or inaccurate sensory representations
US10736543B2 (en) 2016-09-22 2020-08-11 Apple Inc. Workout monitor interface
US10860199B2 (en) 2016-09-23 2020-12-08 Apple Inc. Dynamically adjusting touch hysteresis based on contextual data
US10671705B2 (en) 2016-09-28 2020-06-02 Icon Health & Fitness, Inc. Customizing recipe recommendations
US11032855B2 (en) 2016-10-18 2021-06-08 Dexcom, Inc. System and method for communication of analyte data
EP4241680A3 (en) 2016-10-18 2023-11-15 DexCom, Inc. System and method for communication of analyte data
US20180116607A1 (en) * 2016-10-28 2018-05-03 Garmin Switzerland Gmbh Wearable monitoring device
US10062262B2 (en) 2016-11-16 2018-08-28 The Nielsen Company (Us), Llc People metering enhanced with light projection prompting for audience measurement
CN106802551A (en) * 2017-01-03 2017-06-06 青岛海信移动通信技术股份有限公司 Intelligent wearable device control method and intelligent wearable device
JP7057790B2 (en) * 2017-01-04 2022-04-20 インターリンク エレクトロニクス,インコーポレイテッド Multimodal sensing for power tool user interface
WO2018127043A1 (en) * 2017-01-06 2018-07-12 Auxilia Limited Portable electronic device for data collection, and related charging station and system
CN106887115B (en) * 2017-01-20 2019-05-10 安徽大学 Old people falling monitoring device and falling risk assessment method
US10938767B2 (en) * 2017-03-14 2021-03-02 Google Llc Outputting reengagement alerts by a computing device
CN106980373B (en) * 2017-03-29 2023-10-20 苏州攀特电陶科技股份有限公司 Mobile terminal device, haptic feedback and audio control method and system
CN107124458B (en) * 2017-04-27 2020-01-31 大连云动力科技有限公司 Intelligent sensing equipment and sensing system
US10845955B2 (en) 2017-05-15 2020-11-24 Apple Inc. Displaying a scrollable list of affordances associated with physical activities
US10699247B2 (en) 2017-05-16 2020-06-30 Under Armour, Inc. Systems and methods for providing health task notifications
DK201770372A1 (en) 2017-05-16 2019-01-08 Apple Inc. Tactile feedback for locked device user interfaces
US11051720B2 (en) 2017-06-01 2021-07-06 Apple Inc. Fitness tracking for constrained-arm usage
US10987006B2 (en) 2017-06-02 2021-04-27 Apple Inc. Wearable computer with fitness machine connectivity for improved activity monitoring using caloric expenditure models
US10814167B2 (en) * 2017-06-02 2020-10-27 Apple Inc. Wearable computer with fitness machine connectivity for improved activity monitoring
US10874313B2 (en) * 2017-06-04 2020-12-29 Apple Inc. Heartrate tracking techniques
US10622538B2 (en) 2017-07-18 2020-04-14 Apple Inc. Techniques for providing a haptic output and sensing a haptic input using a piezoelectric body
US10542930B1 (en) 2017-07-25 2020-01-28 BlueOwl, LLC Audio assessment for analyzing sleep trends using machine learning techniques
CN107764280B (en) * 2017-10-20 2021-11-26 郭寒松 Multi-mode accurate step counting method and device
KR102418120B1 (en) * 2017-11-01 2022-07-07 삼성전자 주식회사 Electronic device comprising a plurality of light emitting unit and a plurality of light receiving unit
US10751559B2 (en) * 2018-02-22 2020-08-25 Elio Constanza Fitness training system and method
DK180246B1 (en) 2018-03-12 2020-09-11 Apple Inc User interfaces for health monitoring
DK201870380A1 (en) 2018-05-07 2020-01-29 Apple Inc. Displaying user interfaces associated with physical activities
US11317833B2 (en) 2018-05-07 2022-05-03 Apple Inc. Displaying user interfaces associated with physical activities
KR102715493B1 (en) * 2018-05-07 2024-10-11 애플 인크. Displaying user interfaces associated with physical activities
USD902203S1 (en) 2018-07-13 2020-11-17 Fitbit, Inc. Smart watch with curved body
US10599223B1 (en) 2018-09-28 2020-03-24 Apple Inc. Button providing force sensing and/or haptic output
US10691211B2 (en) 2018-09-28 2020-06-23 Apple Inc. Button providing force sensing and/or haptic output
EP3650960A1 (en) * 2018-11-07 2020-05-13 Tissot S.A. Method for broadcasting a message by a watch
US10475323B1 (en) 2019-01-09 2019-11-12 MedHab, LLC Network hub for an alert reporting system
CN111768591A (en) * 2019-03-30 2020-10-13 深圳市歆歌电子科技有限公司 Quick help calling method for intelligent wearable equipment
DK201970532A1 (en) 2019-05-06 2021-05-03 Apple Inc Activity trends and workouts
US11228835B2 (en) 2019-06-01 2022-01-18 Apple Inc. User interfaces for managing audio exposure
US11209957B2 (en) 2019-06-01 2021-12-28 Apple Inc. User interfaces for cycle tracking
US11152100B2 (en) 2019-06-01 2021-10-19 Apple Inc. Health application user interfaces
US11234077B2 (en) 2019-06-01 2022-01-25 Apple Inc. User interfaces for managing audio exposure
US11315544B2 (en) 2019-06-25 2022-04-26 International Business Machines Corporation Cognitive modification of verbal communications from an interactive computing device
US12002588B2 (en) 2019-07-17 2024-06-04 Apple Inc. Health event logging and coaching user interfaces
US11937904B2 (en) 2019-09-09 2024-03-26 Apple Inc. Detecting the end of cardio machine activities on a wearable device
CN114286975A (en) 2019-09-09 2022-04-05 苹果公司 Research user interface
WO2021051083A1 (en) * 2019-09-12 2021-03-18 Brett Johnson Methods and systems for sports and cognitive training
IT201900016142A1 (en) * 2019-09-12 2021-03-12 St Microelectronics Srl DOUBLE VALIDATION STEP DETECTION SYSTEM AND METHOD
US11380470B2 (en) 2019-09-24 2022-07-05 Apple Inc. Methods to control force in reluctance actuators based on flux related parameters
US12109453B2 (en) 2019-09-27 2024-10-08 Apple Inc. Detecting outdoor walking workouts on a wearable device
CN110680306B (en) * 2019-10-29 2020-11-24 歌尔科技有限公司 ECG (electrocardiogram) electrocardio measurement mode switching method and device, wearable equipment and storage medium
US11170623B2 (en) * 2019-10-29 2021-11-09 Cheryl Spencer Portable hazard communicator device
US11478606B1 (en) 2020-01-08 2022-10-25 New Heights Energy, LLC Wearable devices and methods for providing therapy to a user and/or for measuring physiological parameters of the user
WO2021204036A1 (en) * 2020-04-10 2021-10-14 华为技术有限公司 Sleep risk monitoring method, electronic device and storage medium
DK181037B1 (en) 2020-06-02 2022-10-10 Apple Inc User interfaces for health applications
WO2021257546A1 (en) * 2020-06-15 2021-12-23 Strongarm Technologies, Inc. Methods and apparatus for actions, activities and tasks classifications based on machine learning techniques
US11698710B2 (en) 2020-08-31 2023-07-11 Apple Inc. User interfaces for logging user activities
US20220071563A1 (en) * 2020-09-08 2022-03-10 LEDO Network, Inc. Wearable health monitoring system
US20220071558A1 (en) * 2020-09-08 2022-03-10 LEDO Network, Inc. System, device, and method for wireless health monitoring
CN112288999A (en) * 2020-10-30 2021-01-29 中电万维信息技术有限责任公司 Old people safety monitoring system and using method thereof
US11977683B2 (en) 2021-03-12 2024-05-07 Apple Inc. Modular systems configured to provide localized haptic feedback using inertial actuators
US20220346239A1 (en) * 2021-04-23 2022-10-27 Advanced Semiconductor Engineering, Inc. Electronic device and method of manufacturing the same
US11657701B2 (en) * 2021-08-03 2023-05-23 Toyota Motor North America, Inc. Systems and methods for emergency alert and call regarding driver condition
WO2023027731A1 (en) * 2021-08-27 2023-03-02 Hewlett-Packard Development Company, L.P. Dead front panel for electronic device
US11809631B2 (en) 2021-09-21 2023-11-07 Apple Inc. Reluctance haptic engine for an electronic device
GB2619337A (en) * 2022-06-01 2023-12-06 Prevayl Innovations Ltd A wearable article, an electronics module for a wearable article and a method performed by a controller for an electronics module for a wearable article
US11977729B2 (en) 2022-06-05 2024-05-07 Apple Inc. Physical activity information user interfaces
US11896871B2 (en) 2022-06-05 2024-02-13 Apple Inc. User interfaces for physical activity information
WO2024133988A1 (en) * 2022-12-21 2024-06-27 Universidad De Alicante Device for controlling the intensity of physical activity
CN117281491A (en) * 2023-11-03 2023-12-26 中国科学院苏州生物医学工程技术研究所 Multi-mode physiological signal synchronous acquisition system and method based on Internet of things

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070049836A1 (en) * 2004-12-07 2007-03-01 Yu-Yu Chen Electronic wristwatch-type exercise signal detecting apparatus
US7334472B2 (en) * 2004-07-24 2008-02-26 Samsung Electronics Co., Ltd. Apparatus and method for measuring quantity of physical exercise using acceleration sensor
US20100245078A1 (en) * 2009-03-26 2010-09-30 Wellcore Corporation Wearable Motion Sensing Device
US20100295684A1 (en) * 2009-05-21 2010-11-25 Silverplus, Inc. Personal health management device
US20130310896A1 (en) * 2007-08-31 2013-11-21 Cardiac Pacemakers, Inc. Wireless patient communicator for use in a life critical network
US8903671B2 (en) * 2013-01-15 2014-12-02 Fitbit, Inc. Portable monitoring devices and methods of operating the same

Family Cites Families (199)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4938228A (en) 1989-02-15 1990-07-03 Righter William H Wrist worn heart rate monitor
US5339294A (en) 1993-11-10 1994-08-16 Rodgers Nicholas A Watch with light means
US5612931A (en) 1994-07-07 1997-03-18 Casio Computer Co., Ltd. Switch device and electronic instruments equipped with the switch device
US5738104A (en) 1995-11-08 1998-04-14 Salutron, Inc. EKG based heart rate monitor
WO1997037588A1 (en) 1996-04-08 1997-10-16 Seiko Epson Corporation Motion prescription support device
JP3738507B2 (en) 1996-12-11 2006-01-25 カシオ計算機株式会社 Lighting device
US6209011B1 (en) 1997-05-08 2001-03-27 Microsoft Corporation Handheld computing device with external notification system
US5978923A (en) 1997-08-07 1999-11-02 Toshiba America Information Systems, Inc. Method and apparatus for a computer power management function including selective sleep states
JP4416846B2 (en) 1997-08-22 2010-02-17 ソニー株式会社 Computer-readable recording medium recording menu control data, and menu control method and apparatus
US6300947B1 (en) 1998-07-06 2001-10-09 International Business Machines Corporation Display screen and window size related web page adaptation system
US6888927B1 (en) 1998-12-28 2005-05-03 Nortel Networks Limited Graphical message notification
US6416471B1 (en) * 1999-04-15 2002-07-09 Nexan Limited Portable remote patient telemonitoring system
US7458014B1 (en) 1999-12-07 2008-11-25 Microsoft Corporation Computer user interface architecture wherein both content and user interface are composed of documents with links
US7155729B1 (en) 2000-03-28 2006-12-26 Microsoft Corporation Method and system for displaying transient notifications
US6487906B1 (en) * 2000-09-18 2002-12-03 Advantedge Systems Inc Flexible film sensor system for monitoring body motion
US7458080B2 (en) 2000-12-19 2008-11-25 Microsoft Corporation System and method for optimizing user notifications for small computer devices
AU2002255568B8 (en) * 2001-02-20 2014-01-09 Adidas Ag Modular personal network systems and methods
US7302634B2 (en) 2001-03-14 2007-11-27 Microsoft Corporation Schema-based services for identity-based data access
US20020161644A1 (en) 2001-03-31 2002-10-31 George Duffield Cooperative incentive and promotion system and method for use on a computer networking system
US6583369B2 (en) 2001-04-10 2003-06-24 Sunbeam Products, Inc. Scale with a transiently visible display
EP1256316A1 (en) 2001-05-07 2002-11-13 Move2Health B.V. Portable device comprising an acceleration sensor and method of generating instructions or advice
US7913185B1 (en) 2001-10-25 2011-03-22 Adobe Systems Incorporated Graphical insertion of JavaScript pop-up menus
WO2003083603A2 (en) 2002-03-29 2003-10-09 Oracle International Corporation Methods and systems for non-interrupting notifications
TW595140B (en) * 2002-04-22 2004-06-21 Cognio Inc System and method for spectrum management of a shared frequency band
US7496631B2 (en) 2002-08-27 2009-02-24 Aol Llc Delivery of an electronic communication using a lifespan
DK1551282T3 (en) 2002-10-09 2016-02-22 Bodymedia Inc DEVICE FOR RECEIVING, RECEIVING, DETERMINING AND DISPLAYING PHYSIOLOGICAL AND CONTEXTUAL INFORMATION ON A HUMAN
US20040127198A1 (en) 2002-12-30 2004-07-01 Roskind James A. Automatically changing a mobile device configuration based on environmental condition
CN1860486A (en) * 2003-08-01 2006-11-08 乔治亚州大学研究基金会 Methods, systems, and apparatus for monitoring within-day energy balance deviation
EP1670547B1 (en) 2003-08-18 2008-11-12 Cardiac Pacemakers, Inc. Patient monitoring system
JP3954012B2 (en) 2003-12-01 2007-08-08 三菱電機株式会社 Rotating electrical machine rotor
US8096960B2 (en) * 2004-01-09 2012-01-17 Loree Iv Leonor F Easy wake device
US8403865B2 (en) 2004-02-05 2013-03-26 Earlysense Ltd. Prediction and monitoring of clinical episodes
US7618260B2 (en) 2004-02-27 2009-11-17 Daniel Simon R Wearable modular interface strap
US20050245793A1 (en) 2004-04-14 2005-11-03 Hilton Theodore C Personal wellness monitor system and process
US8620083B2 (en) 2004-12-03 2013-12-31 Google Inc. Method and system for character recognition
US20050250551A1 (en) 2004-05-10 2005-11-10 Nokia Corporation Notification about an event
US20080319330A1 (en) 2004-07-02 2008-12-25 Suunto Oy Transmitter and receiver for observing periodical events
US8109858B2 (en) 2004-07-28 2012-02-07 William G Redmann Device and method for exercise prescription, detection of successful performance, and provision of reward therefore
US20060028429A1 (en) 2004-08-09 2006-02-09 International Business Machines Corporation Controlling devices' behaviors via changes in their relative locations and positions
US8172761B1 (en) 2004-09-28 2012-05-08 Impact Sports Technologies, Inc. Monitoring device with an accelerometer, method and system
US20060090139A1 (en) 2004-10-21 2006-04-27 Microsoft Corporation Collection view objects for displaying data collection items in user interface elements
US7793361B2 (en) 2004-11-12 2010-09-14 Nike, Inc. Article of apparel incorporating a separable electronic device
US7254516B2 (en) 2004-12-17 2007-08-07 Nike, Inc. Multi-sensor monitoring of athletic performance
CA2592042C (en) * 2004-12-22 2014-12-16 Oessur Hf Systems and methods for processing limb motion
US7559877B2 (en) 2005-03-24 2009-07-14 Walkstyles, Inc. Interactive exercise device and system
US20060242590A1 (en) 2005-04-21 2006-10-26 Microsoft Corporation Simple content format for auxiliary display devices
US20060247504A1 (en) * 2005-04-29 2006-11-02 Honeywell International, Inc. Residential monitoring system for selected parameters
CN101283371A (en) 2005-05-06 2008-10-08 游戏改进有限公司 Techniques for awarding random rewards in a reward program
US20060293041A1 (en) 2005-06-24 2006-12-28 Sony Ericsson Mobile Communications Ab Reward based interface for a wireless communications device
US7534206B1 (en) 2005-09-19 2009-05-19 Garmin Ltd. Navigation-assisted fitness and dieting device
DE602005014641D1 (en) 2005-10-03 2009-07-09 St Microelectronics Srl Pedometer device and step detection method by algorithm for self-adaptive calculation of acceleration limits
AU2005229658A1 (en) * 2005-11-02 2007-05-17 Calibre Global Pty Ltd Method for billable timekeeping
US20070118043A1 (en) 2005-11-23 2007-05-24 Microsoft Corporation Algorithms for computing heart rate and movement speed of a user from sensor data
ATE462480T1 (en) 2005-12-16 2010-04-15 2Peak Ag A DYNAMIC ADAPTABLE ENDURANCE TRAINING PROGRAM
US8469805B2 (en) 2006-01-20 2013-06-25 Microsoft Corporation Tiered achievement system
US7770118B2 (en) 2006-02-13 2010-08-03 Research In Motion Limited Navigation tool with audible feedback on a handheld communication device having a full alphabetic keyboard
US20070219059A1 (en) 2006-03-17 2007-09-20 Schwartz Mark H Method and system for continuous monitoring and training of exercise
US7539533B2 (en) 2006-05-16 2009-05-26 Bao Tran Mesh network monitoring appliance
JP2009538720A (en) 2006-06-01 2009-11-12 ビアンカメッド リミテッド Apparatus, system, and method for monitoring physiological signs
US20080155455A1 (en) 2006-08-15 2008-06-26 Swaminathan Balasubramanian Notification of state transition of an out-of-focus application with clustering
US7662065B1 (en) 2006-09-01 2010-02-16 Dp Technologies, Inc. Method and apparatus to provide daily goals in accordance with historical data
EP1897598A1 (en) 2006-09-06 2008-03-12 Nederlandse Organisatie voor toegepast- natuurwetenschappelijk onderzoek TNO System for training optimisation
US8600347B2 (en) 2006-09-14 2013-12-03 Futurewei Technologies, Inc. Idle mode notification
US8128410B2 (en) 2006-09-29 2012-03-06 Nike, Inc. Multi-mode acceleration-based athleticism measurement system
US7811201B1 (en) 2006-12-22 2010-10-12 Cingular Wireless Ii, Llc Fitness applications of a wireless device
US7868757B2 (en) 2006-12-29 2011-01-11 Nokia Corporation Method for the monitoring of sleep using an electronic device
US20100185064A1 (en) 2007-01-05 2010-07-22 Jadran Bandic Skin analysis methods
WO2008089084A2 (en) 2007-01-12 2008-07-24 Healthhonors Corporation Behavior modification with intermittent reward
US7946960B2 (en) 2007-02-05 2011-05-24 Smartsports, Inc. System and method for predicting athletic ability
US8162804B2 (en) 2007-02-14 2012-04-24 Nike, Inc. Collection and display of athletic information
JP4898514B2 (en) 2007-03-26 2012-03-14 セイコーインスツル株式会社 Pedometer
US7914419B2 (en) * 2007-05-29 2011-03-29 Microsoft Corporation Physical activity manager
US7752279B2 (en) * 2007-05-29 2010-07-06 Research In Motion Limited System for facilitating thread-based message prioritization
US7647196B2 (en) 2007-08-08 2010-01-12 Dp Technologies, Inc. Human activity monitoring device with distance calculation
ITBO20070701A1 (en) 2007-10-19 2009-04-20 Technogym Spa DEVICE FOR ANALYSIS AND MONITORING OF THE PHYSICAL ACTIVITY OF A USER.
US9311581B2 (en) 2007-10-19 2016-04-12 Daniel W. Shinn System and method for tracking the fulfillment status of requirements for completing an objective
US8600457B2 (en) * 2007-11-30 2013-12-03 Microsoft Corporation Sleep mode for mobile communication device
EP2227771A2 (en) 2007-12-07 2010-09-15 Nike International Ltd. Cardiovascular miles
US8836502B2 (en) 2007-12-28 2014-09-16 Apple Inc. Personal media device input and output control based on associated conditions
US8344998B2 (en) 2008-02-01 2013-01-01 Wimm Labs, Inc. Gesture-based power management of a wearable portable electronic device with display
WO2009111472A2 (en) * 2008-03-03 2009-09-11 Nike, Inc. Interactive athletic equipment system
US7877226B2 (en) 2008-03-03 2011-01-25 Idt Technology Limited Apparatus and method for counting exercise repetitions
WO2009129402A1 (en) * 2008-04-16 2009-10-22 Nike International, Ltd. Athletic performance user interface for mobile device
CA2665847C (en) 2008-05-11 2014-04-22 Research In Motion Limited Electronic device and method providing activation of an improved bedtime mode of operation
US8555201B2 (en) 2008-06-05 2013-10-08 Qualcomm Incorporated Wireless communication device having deterministic control of foreground access of the user interface
US8135392B2 (en) 2008-06-06 2012-03-13 Apple Inc. Managing notification service connections and displaying icon badges
US9357052B2 (en) 2008-06-09 2016-05-31 Immersion Corporation Developing a notification framework for electronic device events
US20090320047A1 (en) 2008-06-23 2009-12-24 Ingboo Inc. Event Bundling
JP4644274B2 (en) 2008-07-29 2011-03-02 京セラ株式会社 Portable device, step count method, and gravity direction detection method
US8306508B1 (en) 2008-08-21 2012-11-06 Sprint Communications Company L.P. Motion-based event notification
JP2010088886A (en) * 2008-10-03 2010-04-22 Adidas Ag Program products, methods, and systems for providing location-aware fitness monitoring services
US7980997B2 (en) 2008-10-23 2011-07-19 University Of Southern California System for encouraging a user to perform substantial physical activity
US8647287B2 (en) 2008-12-07 2014-02-11 Andrew Greenberg Wireless synchronized movement monitoring apparatus and system
US8331992B2 (en) 2008-12-19 2012-12-11 Verizon Patent And Licensing Inc. Interactive locked state mobile communication device
US20100182518A1 (en) * 2009-01-16 2010-07-22 Kirmse Noel J System and method for a display system
WO2010090867A2 (en) 2009-01-21 2010-08-12 SwimSense, LLC Multi-state performance monitoring system
US8704767B2 (en) 2009-01-29 2014-04-22 Microsoft Corporation Environmental gesture recognition
WO2010087203A1 (en) 2009-02-02 2010-08-05 パナソニック株式会社 Information display device
US8441356B1 (en) 2009-02-16 2013-05-14 Handhold Adaptive, LLC Methods for remote assistance of disabled persons
CN101843492A (en) * 2009-03-26 2010-09-29 北京超思电子技术有限责任公司 Medical device with vibration alarm apparatus
US9141087B2 (en) 2009-04-26 2015-09-22 Nike, Inc. Athletic watch
WO2010126821A1 (en) 2009-04-26 2010-11-04 Nike International, Ltd. Athletic watch
US8446398B2 (en) 2009-06-16 2013-05-21 Intel Corporation Power conservation for mobile device displays
CH701440A2 (en) 2009-07-03 2011-01-14 Comme Le Temps Sa Wrist touch screen and method for displaying on a watch with touch screen.
KR20110004094A (en) 2009-07-07 2011-01-13 삼성전자주식회사 Document display apparatus and method of displaying document
CN101779953A (en) * 2009-08-26 2010-07-21 深圳市天歌通信有限公司 Health mobile phone monitor
JP5695052B2 (en) * 2009-09-04 2015-04-01 ナイキ イノベイト セー. フェー. How to monitor and track athletic activity
US20110063440A1 (en) * 2009-09-11 2011-03-17 Neustaedter Carman G Time shifted video communications
US8613689B2 (en) 2010-09-23 2013-12-24 Precor Incorporated Universal exercise guidance system
US7955219B2 (en) 2009-10-02 2011-06-07 Precor Incorporated Exercise community system
US9005028B2 (en) 2009-10-20 2015-04-14 Sony Computer Entertainment America Llc Video game competition notifications
WO2011051955A2 (en) * 2009-11-02 2011-05-05 Jonathan Bentwich Computerized system or device and method for diagnosis and treatment of human, physical and planetary conditions
US9176542B2 (en) 2009-11-06 2015-11-03 Sony Corporation Accelerometer-based touchscreen user interface
US20120137367A1 (en) * 2009-11-06 2012-05-31 Cataphora, Inc. Continuous anomaly detection based on behavior modeling and heterogeneous information analysis
CN105286843A (en) * 2009-12-09 2016-02-03 耐克创新有限合伙公司 Athletic performance monitoring system utilizing heart rate information
US20110152696A1 (en) 2009-12-22 2011-06-23 Hall Ryan Laboratories, Inc. Audible biofeedback heart rate monitor with virtual coach
WO2011090722A2 (en) * 2009-12-29 2011-07-28 Advanced Brain Monitoring, Inc. Systems and methods for assessing team dynamics and effectiveness
US8852118B2 (en) * 2010-01-11 2014-10-07 Ethicon Endo-Surgery, Inc. Telemetry device with software user input features
US20120303319A1 (en) 2010-02-02 2012-11-29 Nokia Corporation Pedometer device and step detection method
CN101785675B (en) * 2010-03-04 2014-01-15 重庆理工大学 Movement monitoring device and monitoring method thereof
JP5617299B2 (en) 2010-03-25 2014-11-05 オムロンヘルスケア株式会社 Activity meter, control program, and activity type identification method
EP2378406B1 (en) 2010-04-13 2018-08-22 LG Electronics Inc. Mobile terminal and method of controlling operation of the mobile terminal
US20130043988A1 (en) * 2010-04-28 2013-02-21 James E. Bruno Pillowcase alarm clock
US20110267196A1 (en) * 2010-05-03 2011-11-03 Julia Hu System and method for providing sleep quality feedback
TWI455742B (en) * 2010-06-28 2014-10-11 Nike Innovate Cv Apparatus and method for monitoring and tracking athletic activity and computer readable media implementing the same
FI20105796A0 (en) 2010-07-12 2010-07-12 Polar Electro Oy Analysis of a physiological condition for a cardio exercise
US10039970B2 (en) * 2010-07-14 2018-08-07 Adidas Ag Location-aware fitness monitoring methods, systems, and program products, and applications thereof
JP5713595B2 (en) 2010-07-16 2015-05-07 オムロンヘルスケア株式会社 Motion detection device and control method of motion detection device
US9248340B2 (en) 2010-08-09 2016-02-02 Nike, Inc. Monitoring fitness using a mobile device
US9940682B2 (en) 2010-08-11 2018-04-10 Nike, Inc. Athletic activity user experience and environment
US20120060123A1 (en) 2010-09-03 2012-03-08 Hugh Smith Systems and methods for deterministic control of instant-on mobile devices with touch screens
US20120059664A1 (en) * 2010-09-07 2012-03-08 Emil Markov Georgiev System and method for management of personal health and wellness
US8585606B2 (en) 2010-09-23 2013-11-19 QinetiQ North America, Inc. Physiological status monitoring system
US9167991B2 (en) 2010-09-30 2015-10-27 Fitbit, Inc. Portable monitoring devices and methods of operating same
US20120108215A1 (en) * 2010-10-29 2012-05-03 Nader Kameli Remote notification device
US8974349B2 (en) * 2010-11-01 2015-03-10 Nike, Inc. Wearable device assembly having athletic functionality
US9011292B2 (en) 2010-11-01 2015-04-21 Nike, Inc. Wearable device assembly having athletic functionality
US8814754B2 (en) 2010-11-01 2014-08-26 Nike, Inc. Wearable device having athletic functionality
US8599014B2 (en) 2010-11-05 2013-12-03 Nokia Corporation Method and apparatus for managing notifications
US9081889B2 (en) * 2010-11-10 2015-07-14 Apple Inc. Supporting the monitoring of a physical activity
US8595529B2 (en) 2010-12-16 2013-11-26 Qualcomm Incorporated Efficient power management and optimized event notification in multi-processor computing devices
CN102068261A (en) * 2011-01-21 2011-05-25 上海弘周电子科技有限公司 Safety monitor
CN202027572U (en) * 2011-01-24 2011-11-09 无锡微感科技有限公司 Dynamic recording and analyzing device for electrocardiogram and movement
US8787006B2 (en) * 2011-01-31 2014-07-22 Apple Inc. Wrist-worn electronic device and methods therefor
CN102682041B (en) * 2011-03-18 2014-06-04 日电(中国)有限公司 User behavior identification equipment and method
GB2489399A (en) 2011-03-21 2012-10-03 Sony Corp Suppression of further notifications during initial notification handling and automatic recording of parallel video/audio upon notification display
US8483665B2 (en) 2011-03-31 2013-07-09 Matthew R. Kissinger Mobile device featuring sensor responsive re-notifications
US8751592B2 (en) 2011-11-04 2014-06-10 Facebook, Inc. Controlling notification based on power expense and social factors
US20120258433A1 (en) * 2011-04-05 2012-10-11 Adidas Ag Fitness Monitoring Methods, Systems, And Program Products, And Applications Thereof
US9402550B2 (en) 2011-04-29 2016-08-02 Cybertronics, Inc. Dynamic heart rate threshold for neurological event detection
CN202282004U (en) 2011-06-02 2012-06-20 上海巨浪信息科技有限公司 Mobile health management system based on context awareness and activity analysis
US9383820B2 (en) * 2011-06-03 2016-07-05 Apple Inc. Custom vibration patterns
CN107506249B (en) 2011-06-05 2021-02-12 苹果公司 System and method for displaying notifications received from multiple applications
CN102198003B (en) * 2011-06-07 2014-08-13 嘉兴恒怡科技有限公司 Limb movement detection and evaluation network system and method
CA2814681A1 (en) * 2011-06-10 2012-12-13 Aliphcom Wearable device and platform for sensory input
US20120316896A1 (en) 2011-06-10 2012-12-13 Aliphcom Personal advisor system using data-capable band
US20120326873A1 (en) * 2011-06-10 2012-12-27 Aliphcom Activity attainment method and apparatus for a wellness application using data from a data-capable band
US10108783B2 (en) 2011-07-05 2018-10-23 Saudi Arabian Oil Company Systems, computer medium and computer-implemented methods for monitoring health of employees using mobile devices
US9192326B2 (en) * 2011-07-13 2015-11-24 Dp Technologies, Inc. Sleep monitoring system
US20130017891A1 (en) 2011-07-15 2013-01-17 FunGoPlay LLC Systems and methods for providing virtual incentives for real-world activities
EP2551784A1 (en) 2011-07-28 2013-01-30 Roche Diagnostics GmbH Method of controlling the display of a dataset
US8621026B2 (en) 2011-09-11 2013-12-31 Microsoft Corporation Batching notifications to optimize for battery life
US9177247B2 (en) * 2011-09-23 2015-11-03 Fujitsu Limited Partitioning medical binary decision diagrams for analysis optimization
US20130078958A1 (en) 2011-09-23 2013-03-28 Research In Motion Limited System and method for managing transient notifications using sensors
WO2013040674A1 (en) 2011-09-23 2013-03-28 Research In Motion Limited System and method for managing transient notifications using sensors
US8918665B2 (en) 2011-09-23 2014-12-23 Wing Kong Low Operating input device in low power mode with auxiliary sensor calibrated to main sensor
US10330491B2 (en) 2011-10-10 2019-06-25 Texas Instruments Incorporated Robust step detection using low cost MEMS accelerometer in mobile applications, and processing methods, apparatus and systems
US20130122928A1 (en) 2011-10-28 2013-05-16 Mark Oliver Pfluger Systems and methods for identifying and acting upon states and state changes
US20130325491A1 (en) * 2011-11-04 2013-12-05 Wee Talk Tracker Pro, LLC. Therapy Tracking And Management System
US8541745B2 (en) 2011-11-16 2013-09-24 Motorola Mobility Llc Methods and devices for clothing detection about a wearable electronic device
US9087454B2 (en) 2011-11-29 2015-07-21 At Peak Resources, Llc Interactive training method and system for developing peak user performance
US9734304B2 (en) * 2011-12-02 2017-08-15 Lumiradx Uk Ltd Versatile sensors with data fusion functionality
WO2013086728A1 (en) * 2011-12-15 2013-06-20 北京英福生科技有限公司 Excise reminding device and system
EP2608041B1 (en) 2011-12-23 2015-05-20 Deutsche Telekom AG Monitoring user activity on smart mobile devices
EP2800611A4 (en) 2012-01-06 2015-12-16 Icon Health & Fitness Inc Exercise device with communication linkage for connection with external computing device
US9367085B2 (en) 2012-01-26 2016-06-14 Google Technology Holdings LLC Portable electronic device and method for controlling operation thereof taking into account which limb possesses the electronic device
US8904410B2 (en) 2012-01-31 2014-12-02 MCube Inc. Methods and apparatus for mobile device event detection
US8947239B1 (en) 2012-03-05 2015-02-03 Fitbit, Inc. Near field communication system, and method of operating same
US9189062B2 (en) 2012-03-07 2015-11-17 Google Technology Holdings LLC Portable electronic device and method for controlling operation thereof based on user motion
US8755877B2 (en) 2012-03-12 2014-06-17 Texas Instruments Incoporated Real time QRS detection using adaptive threshold
US9208662B2 (en) 2012-03-16 2015-12-08 Qualcomm Incorporated Methods and devices for selectively controlling and varying surface texture and/or force levels on a mobile device using haptic feedback
US9710056B2 (en) 2012-03-21 2017-07-18 Google Inc. Methods and systems for correlating movement of a device with state changes of the device
US20130271355A1 (en) * 2012-04-13 2013-10-17 Nokia Corporation Multi-segment wearable accessory
US20130290879A1 (en) 2012-04-30 2013-10-31 Research In Motion Tat Ab Displaying notification messages and messages on a portable electronic device
WO2013163943A1 (en) 2012-05-03 2013-11-07 Made in Sense Limited Wristband having user interface and method of using thereof
US10473689B2 (en) 2012-05-07 2019-11-12 Google Llc Pedometer in a low-power device
US9173052B2 (en) * 2012-05-08 2015-10-27 ConnecteDevice Limited Bluetooth low energy watch with event indicators and activation
KR101713451B1 (en) 2012-06-04 2017-03-07 나이키 이노베이트 씨.브이. Fitness training system with energy expenditure calculation that uses multiple sensor inputs
US20140180595A1 (en) 2012-12-26 2014-06-26 Fitbit, Inc. Device state dependent user interface management
EP2895050B8 (en) 2012-09-11 2018-12-19 L.I.F.E. Corporation S.A. Wearable communication platform
US8630741B1 (en) 2012-09-30 2014-01-14 Nest Labs, Inc. Automated presence detection and presence-related control within an intelligent controller
US20140099614A1 (en) 2012-10-08 2014-04-10 Lark Technologies, Inc. Method for delivering behavior change directives to a user
CN102930490A (en) * 2012-10-31 2013-02-13 代万辉 Intelligent terminal based health management system
US11372536B2 (en) 2012-11-20 2022-06-28 Samsung Electronics Company, Ltd. Transition and interaction model for wearable electronic device
CN103049538B (en) * 2012-12-25 2015-10-21 华中科技大学 The action message syndication search of a kind of position-based service and interactive approach and system
CN103269407A (en) * 2013-06-04 2013-08-28 北京邮电大学 Mobile phone and method of novel alarm based on brain wave detection
KR102163915B1 (en) 2013-09-02 2020-10-12 엘지전자 주식회사 Smart watch and method for controlling thereof
US20150091812A1 (en) 2013-09-30 2015-04-02 Kobo Inc. Controlling a computing device using a tap sequence as user input
US8734296B1 (en) 2013-10-02 2014-05-27 Fitbit, Inc. Biometric sensing device having adaptive data threshold, a performance goal, and a goal celebration display

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10772559B2 (en) 2012-06-14 2020-09-15 Medibotics Llc Wearable food consumption monitor
US10134256B2 (en) 2013-01-15 2018-11-20 Fitbit, Inc. Portable monitoring devices and methods of operating the same
US9773396B2 (en) 2013-01-15 2017-09-26 Fitbit, Inc. Portable monitoring devices and methods of operating the same
US11423757B2 (en) 2013-01-15 2022-08-23 Fitbit, Inc. Portable monitoring devices and methods of operating the same
US9286789B2 (en) 2013-01-15 2016-03-15 Fitbit, Inc. Portable monitoring devices and methods of operating the same
US8903671B2 (en) 2013-01-15 2014-12-02 Fitbit, Inc. Portable monitoring devices and methods of operating the same
US9098991B2 (en) 2013-01-15 2015-08-04 Fitbit, Inc. Portable monitoring devices and methods of operating the same
US9600994B2 (en) 2013-01-15 2017-03-21 Fitbit, Inc. Portable monitoring devices and methods of operating the same
US12002341B2 (en) 2013-01-15 2024-06-04 Fitbit, Inc. Portable monitoring devices and methods of operating the same
US10716510B2 (en) 2013-09-17 2020-07-21 Medibotics Smart clothing with converging/diverging bend or stretch sensors for measuring body motion or configuration
US10602965B2 (en) 2013-09-17 2020-03-31 Medibotics Wearable deformable conductive sensors for human motion capture including trans-joint pitch, yaw, and roll
US10321873B2 (en) 2013-09-17 2019-06-18 Medibotics Llc Smart clothing for ambulatory human motion capture
US10234934B2 (en) 2013-09-17 2019-03-19 Medibotics Llc Sensor array spanning multiple radial quadrants to measure body joint movement
US9582072B2 (en) 2013-09-17 2017-02-28 Medibotics Llc Motion recognition clothing [TM] with flexible electromagnetic, light, or sonic energy pathways
US9588582B2 (en) 2013-09-17 2017-03-07 Medibotics Llc Motion recognition clothing (TM) with two different sets of tubes spanning a body joint
US10130277B2 (en) 2014-01-28 2018-11-20 Medibotics Llc Willpower glasses (TM)—a wearable food consumption monitor
US11990019B2 (en) 2014-02-27 2024-05-21 Fitbit, Inc. Notifications on a user device based on activity detected by an activity monitoring device
US10796549B2 (en) 2014-02-27 2020-10-06 Fitbit, Inc. Notifications on a user device based on activity detected by an activity monitoring device
US20150305655A1 (en) * 2014-04-25 2015-10-29 Speedo International Limited Activity Monitors
US10201292B2 (en) * 2014-04-25 2019-02-12 Speedo International Limited Activity monitors
US12053591B2 (en) 2014-06-05 2024-08-06 Eight Sleep Inc. Methods and systems for gathering and analyzing human biological signals
US10792461B2 (en) 2014-06-05 2020-10-06 Eight Sleep, Inc. Methods and systems for gathering and analyzing human biological signals
CN104167079A (en) * 2014-08-01 2014-11-26 青岛歌尔声学科技有限公司 Rescue method and device applied to multiple scenes
US20170092111A1 (en) * 2015-09-30 2017-03-30 Xiaomi Inc. Method and device for processing abnormality notification from a smart device
US9934673B2 (en) * 2015-09-30 2018-04-03 Xiaomi Inc. Method and device for processing abnormality notification from a smart device
US10028672B2 (en) * 2015-11-13 2018-07-24 Acme Portable Corp. Wearable device which diagnoses personal cardiac health condition by monitoring and analyzing heartbeat and the method thereof
US20170135593A1 (en) * 2015-11-13 2017-05-18 Acme Portable Corp. Wearable device which diagnoses personal cardiac health condition by monitoring and analyzing heartbeat and the method thereof
US10535243B2 (en) 2016-10-28 2020-01-14 HBH Development LLC Target behavior monitoring system
US11666284B2 (en) 2018-01-09 2023-06-06 Eight Sleep Inc. Systems and methods for detecting a biological signal of a user of an article of furniture
US11904103B2 (en) 2018-01-19 2024-02-20 Eight Sleep Inc. Sleep pod
WO2019235817A1 (en) * 2018-06-07 2019-12-12 Samsung Electronics Co., Ltd. Electronic device for providing exercise information using biometric information and operating method thereof
KR102564269B1 (en) * 2018-06-07 2023-08-07 Samsung Electronics Co., Ltd. Electronic apparatus for providing exercise information using biometric information and operating method thereof
AU2019283484B2 (en) * 2018-06-07 2022-08-04 Samsung Electronics Co., Ltd. Electronic device for providing exercise information using biometric information and operating method thereof
US11311777B2 (en) 2018-06-07 2022-04-26 Samsung Electronics Co., Ltd. Electronic device for providing exercise information using biometric information and operating method thereof
KR20190138969A (en) * 2018-06-07 2019-12-17 Samsung Electronics Co., Ltd. Electronic apparatus for providing exercise information using biometric information and operating method thereof
CN110575150A (en) * 2018-06-07 2019-12-17 三星电子株式会社 Electronic device for providing exercise information using biometric information and method of operating the same

Also Published As

Publication number Publication date
CN104434314A (en) 2015-03-25
US20140197965A1 (en) 2014-07-17
CN104434315B (en) 2017-05-24
CN104434314B (en) 2018-01-09
US9286789B2 (en) 2016-03-15
US20140197946A1 (en) 2014-07-17
US9098991B2 (en) 2015-08-04
CN107260178A (en) 2017-10-20
CN104434315A (en) 2015-03-25
US12002341B2 (en) 2024-06-04
US20170143239A1 (en) 2017-05-25
US9773396B2 (en) 2017-09-26
US20150042471A1 (en) 2015-02-12
US20190057593A1 (en) 2019-02-21
US20160267764A1 (en) 2016-09-15
US9600994B2 (en) 2017-03-21
US10134256B2 (en) 2018-11-20
US20150294554A1 (en) 2015-10-15
US11423757B2 (en) 2022-08-23
CN107260178B (en) 2021-05-18
US8903671B2 (en) 2014-12-02
CN113367689A (en) 2021-09-10
US20230066299A1 (en) 2023-03-02

Similar Documents

Publication Publication Date Title
US12002341B2 (en) Portable monitoring devices and methods of operating the same
US8784271B2 (en) Biometric monitoring device with contextually- or environmentally-dependent display
US11521474B2 (en) Notifications on a user device based on activity detected by an activity monitoring device
US9017221B2 (en) Delayed goal celebration
US11990019B2 (en) Notifications on a user device based on activity detected by an activity monitoring device
CN107928629B (en) Portable monitoring device and method of operating the same

Legal Events

Date Code Title Description
AS Assignment

Owner name: FITBIT, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PARK, JAMES;YUEN, SHELTEN GEE JAO;FRIEDMAN, ERIC NATHAN;AND OTHERS;SIGNING DATES FROM 20130917 TO 20130925;REEL/FRAME:031460/0068

AS Assignment

Owner name: MORGAN STANLEY SENIOR FUNDING, INC., AS COLLATERAL AGENT, NEW YORK

Free format text: SECURITY INTEREST;ASSIGNOR:FITBIT, INC.;REEL/FRAME:033532/0027

Effective date: 20140813

AS Assignment

Owner name: SILICON VALLEY BANK, AS ADMINISTRATIVE AGENT, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:FITBIT, INC.;REEL/FRAME:033546/0670

Effective date: 20140813

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: FITSTAR, INC., CALIFORNIA

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:MORGAN STANLEY SENIOR FUNDING, INC.;REEL/FRAME:037269/0076

Effective date: 20151210

Owner name: FITBIT, INC., CALIFORNIA

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:MORGAN STANLEY SENIOR FUNDING, INC.;REEL/FRAME:037269/0076

Effective date: 20151210

AS Assignment

Owner name: FITBIT, INC., CALIFORNIA

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:SILICON VALLEY BANK, AS ADMINISTRATIVE AGENT;REEL/FRAME:037300/0497

Effective date: 20151210

Owner name: FITSTAR, INC., CALIFORNIA

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:SILICON VALLEY BANK, AS ADMINISTRATIVE AGENT;REEL/FRAME:037300/0497

Effective date: 20151210