US20110021317A1 - System and method for displaying anonymously annotated physical exercise data

Info

Publication number
US20110021317A1
Related identifiers: US12/673,793, US67379308A, US 2011/0021317 A1
Authority
US
United States
Prior art keywords
data
person
physical
physical exercise
processing unit
Prior art date 2007-08-24
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/673,793
Inventor
Gerd Lanfermann
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips Electronics NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date 2008-08-22
Publication date
Application filed by Koninklijke Philips Electronics NV
Assigned to KONINKLIJKE PHILIPS ELECTRONICS N.V. (assignment of assignors' interest; see document for details). Assignors: LANFERMANN, GERD
Publication of US20110021317A1

Classifications

    • A61B5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/1127 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb, using a particular sensing technique using markers
    • A61B5/222 Ergometry, e.g. by using bicycle type apparatus, combined with detection or measurement of physiological parameters, e.g. heart rate
    • A63B24/0003 Analysing the course of a movement or motion sequences during an exercise or training sequence, e.g. swing for golf or tennis
    • A63B69/36 Training appliances or apparatus for golf
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • G09B19/0038 Teaching of repetitive work cycles or sequences of movements: sports
    • G16H20/30 ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G16H30/40 ICT specially adapted for processing medical images, e.g. editing
    • A61B2562/0219 Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A61B5/1114 Tracking parts of the body
    • A61B5/744 Displaying an avatar, e.g. an animated cartoon character
    • A63B2024/0012 Comparing movements or motion sequences with a registered reference
    • A63B2024/0096 Controls for exercising apparatus using performance related parameters for controlling electronic or video games or avatars
    • A63B2071/0638 Displaying moving images of recorded environment, e.g. virtual environment
    • A63B2071/0647 Visualisation of executed movements
    • A63B2071/065 Visualisation of specific exercise parameters
    • A63B2220/05 Image processing for measuring physical parameters
    • A63B2220/40 Acceleration
    • A63B2220/803 Motion sensors
    • A63B2220/806 Video cameras
    • A63B2220/833 Sensors arranged on the exercise apparatus or sports implement
    • A63B2225/15 Sport apparatus with identification means that can be read by electronic means
    • A63B2225/20 Sport apparatus with means for remote communication, e.g. internet or the like
    • A63B2225/50 Wireless data transmission, e.g. by radio transmitters or telemetry
    • A63B2230/04 Measuring physiological parameters of the user: heartbeat characteristics, e.g. ECG, blood pressure modulations
    • A63B2230/202 Measuring physiological parameters of the user: blood glucose
    • A63B2230/207 Measuring physiological parameters of the user: P-O2, i.e. partial O2 value
    • A63B69/00 Training appliances or apparatus for special sports
    • G16H40/67 ICT specially adapted for the remote operation of medical equipment or devices
    • G16H50/50 ICT specially adapted for the simulation or modelling of medical disorders

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Public Health (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Primary Health Care (AREA)
  • Physiology (AREA)
  • Business, Economics & Management (AREA)
  • Epidemiology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Cardiology (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to a method for displaying anonymously annotated physical exercise data to a person undertaking exercises. Physical exercise data gathered from the person is annotated at a physically separate annotation unit. At the location of the person, visual recordings of the person undertaking exercises are displayed to the person together with synchronized annotation information. A system for performing the method comprises a physical data processing unit (1), a display device (2), at least one posture recording device (3, 3′), a visual recording device (4), a data storage unit (5) and a physically separate annotation unit (6) in connection with the physical data processing unit (1), the connection being via an interconnected computer network (7).

Description

    BACKGROUND OF THE INVENTION
  • The present invention relates to a system and method for displaying anonymously annotated physical exercise data to a person undertaking exercises.
  • Home rehabilitation exercises for persons suffering from a medical condition like a stroke or home training exercises for persons wishing to improve body motions like a golf swing can be recorded via sensors. The exercises can also be evaluated by a professional such as a physiotherapist or a golf instructor in order to give the person a direct feedback.
  • If the professional performing the review is not present during the exercise, video camera recordings could be sent to him. These recordings could be reviewed intuitively by the professional and the commented recordings could be understood intuitively by the person undertaking the exercise. However, these recordings, especially when sent away to a remote professional, could breach the privacy of the person. Furthermore, a completely automatic processing of such recorded images to provide meaningful feedback is a demanding task.
  • Alternatively, the sole transmission of data from the sensors would not violate the privacy of the person. In this respect, U.S. Pat. No. 6,817,979 B2 relates to a system and method which provide for interacting with a virtual physiological model of a user with the use of a mobile communication device. Physiological data associated with the user is acquired from the user. The physiological data is transmitted to the mobile communication device, preferably with the use of a wireless communication protocol. The methodology further involves using the mobile communication device to communicate the physiological data to a network server. The physiological data is integrated into the virtual physiological model of the user. The user can access data and depictions of the user developed from the physiological data.
  • By way of example, a user can create an avatar representative of the current physical state of the user. The user can adjust the avatar to change the appearance of the avatar to a more desired appearance. For example, the anatomical dimensions of the avatar can be changed to reflect desired waist, chest, upper arms and thigh dimensions. Given differences between the desired avatar features and present avatar features, various training, diet and related fitness recommendations can be developed to establish a training regimen most suited to help the user achieve the desired fitness goals. Physiological data is subsequently acquired and applied to the user's avatar, and compared to the desired avatar's data to determine if the training regimen is effective in achieving the desired fitness goals.
  • However, presenting interpreted sensor signals to the user in such a frontend generally leads to difficulties: it is hard for the user to relate to an abstract rendering of an artificial screen character.
  • Despite these efforts, there still exists a need in the art for a system and a method for displaying anonymously annotated physical exercise data to a person undertaking exercises.
  • SUMMARY OF THE INVENTION
  • To achieve this and other objects the present invention is directed to a method for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising the steps of:
  • a) gathering physical exercise data from a person undertaking exercises;
    b) synchronously gathering visual recordings of the person undertaking exercises;
    c) transmitting the physical exercise data to a physically separate annotation unit;
    d) based on the physical exercise data, annotating the physical exercise data at the physically separate annotation unit;
    e) transmitting the annotation information to a display and processing unit for review of the person undertaking exercises;
    f) displaying the visual recordings of the person undertaking exercises together with synchronized annotation information to the person.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Before the invention is described in detail, it is to be understood that this invention is not limited to the particular component parts of the devices described or process steps of the methods described as such devices and methods may vary. It is also to be understood that the terminology used herein is for purposes of describing particular embodiments only, and is not intended to be limiting. It must be noted that, as used in the specification and the appended claims, the singular forms “a,” “an” and “the” include singular and/or plural referents unless the context clearly dictates otherwise.
  • In the context of the present invention, the term anonymously annotated data denotes data where a third person performing the annotation has no knowledge about the identity of the person whose data he is annotating. In particular, the data does not allow for a recognition of the person. One way of achieving the anonymization is by assigning identification numbers to the data. Physical exercise data is data relating to movements or other exercises of a person.
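  • For illustration only, a minimal Python sketch (not part of the patent) of such an anonymization step: the identity of the person is replaced by a randomly assigned identification number before any data leaves the home system, and the mapping back to the person is kept only locally. All names and the data layout are assumptions.

    import uuid

    def anonymize_session(exercise_samples, person_name):
        # Replace the identity by an opaque session ID; the reviewer only ever sees this ID.
        session_id = uuid.uuid4().hex
        local_mapping = {session_id: person_name}   # stays at the person's site, is never transmitted
        outgoing = {"session_id": session_id, "samples": exercise_samples}
        return outgoing, local_mapping

    outgoing, mapping = anonymize_session([{"t": 0.0, "acc": (0.1, 9.8, 0.2)}], "example person")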
  • The first two steps of the method describe how two different sets of information about the exercise of the person are gathered. Firstly, physical exercise data is gathered, for example by continuously monitoring sensor signals from the person. At the same time, visual recordings are gathered, for example by using a digital video camera. By synchronously gathering this data it is ensured that later on, a certain portion of the video stream can be attributed to a certain portion of the sensor signal stream and vice versa.
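  • As a sketch of this synchronization (illustrative Python with assumed field names): both streams are stamped against the same monotonic clock, so a time in one stream identifies the corresponding portion of the other.

    import time

    t0 = time.monotonic()
    sensor_stream, video_stream = [], []

    def stamp(stream, payload):
        # Every record carries the elapsed time since the common start of the exercise.
        stream.append({"t": time.monotonic() - t0, "data": payload})

    stamp(sensor_stream, {"acc": (0.0, 9.8, 0.1)})       # one sensor sample
    stamp(video_stream, {"frame": "frame_0001.jpg"})     # one captured video frame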
  • As the visual recordings and the physical exercise data are separate entities, the physical exercise data can then be transmitted to a physically separate annotation unit. The physical separation of the annotation unit provides for an anonymization of the data. At the annotation unit the physical exercise data can be processed into representations of the exercise for review by a third person. The physical exercise data can then be annotated. This includes automatic processing of the data, for example by detecting deviations from motion templates. Furthermore, the third person can include comments and suggestions to provide helpful feedback to the person performing the exercise. Afterwards, the annotation information is transmitted to a display and processing unit at the site of the person performing the exercise. Here, the annotation information is joined with the visual recordings. The recordings of the person undertaking exercises are then displayed to the person together with the synchronized annotation information. The synchronization provides for displaying the annotation at the correct time so the person can directly understand what has caught the attention of the reviewer or the automatic reviewing system.
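  • The automatic part of the annotation mentioned above could, for example, compare the recorded motion against a template. The following hedged sketch (assumed data layout, illustrative tolerance) emits an annotation wherever a joint-angle trace deviates too far from the reference.

    def auto_annotate(recorded, template, tolerance_deg=15.0):
        # recorded / template: lists of (time_s, angle_deg) sampled on the same time grid.
        annotations = []
        for (t, measured), (_, expected) in zip(recorded, template):
            if abs(measured - expected) > tolerance_deg:
                annotations.append({"t": t, "text": f"{measured - expected:+.0f} deg off template"})
        return annotations

    template = [(i * 0.5, 10.0 * i) for i in range(10)]                            # idealised arm raise
    recorded = [(i * 0.5, 10.0 * i - (20.0 if i > 6 else 0.0)) for i in range(10)]
    print(auto_annotate(recorded, template))                                       # flags the final seconds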
  • In summary, with the method according to the present invention an exercise of a person can be reviewed anonymously and feedback can be given to the person. The anonymization allows for the sharing of professional resources, making the reviewing process more efficient. At the same time, when the person receives the feedback it is very clearly shown to him, via the visual recordings, which part of the exercise has prompted the feedback.
  • In one embodiment of the invention, at the physically separate annotation unit in step d) an avatar is calculated based on the physical exercise data. For the purposes of this invention, the term ‘avatar’ shall denote a computer-generated abstract rendering which represents the posture or motions of a person. In simple cases, the avatar may be a stick figure. In more sophisticated cases, the avatar may represent additional information like the pulse rate, the amount of sweating, muscle fatigue and the like. An advantage of using an avatar representation is that the avatar can be rotated on the screen of the annotation unit while representing the exercise. This enables the reviewer to choose the best viewing angle for assessing the exercise.
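  • A stick-figure avatar of this kind can be reduced to line segments between estimated joint positions; the sketch below (joint names and coordinates are invented for illustration) shows the idea.

    BONES = [("shoulder", "elbow"), ("elbow", "wrist"), ("hip", "knee"), ("knee", "ankle")]

    def stick_figure(joints):
        # joints: mapping joint name -> (x, y); returns the drawable segments of the avatar.
        return [(joints[a], joints[b]) for a, b in BONES if a in joints and b in joints]

    print(stick_figure({"shoulder": (0.0, 1.5), "elbow": (0.3, 1.2), "wrist": (0.6, 1.0)}))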
  • In a further embodiment of the invention, step f) additionally comprises calculating an avatar and displaying the avatar synchronously with the visual recordings and the annotations to the person. The person will then see the visual recording of his exercise, the annotations and the avatar. This is advantageous because the avatar may depict the motions of the person more clearly if they are obscured in the visual recording by baggy clothing or if they have not been captured correctly on camera. Again, the avatar may be rotated to achieve the best viewing perspective. Another option is to provide multiple viewing angles with one or more avatars.
  • In a further embodiment of the invention, transmitting the physical exercise data in step c) and transmitting the annotation information in step e) are undertaken via an interconnected computer network, preferably the internet. This allows a remotely located person to perform the review and the annotation. Suitable protocols include those of the TCP/IP protocol family.
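  • A minimal transmission sketch under assumed conditions (the host name, port and length-prefixed JSON wire format are inventions for illustration, not part of the patent): the anonymized exercise data is sent to the remote annotation unit over a plain TCP connection.

    import json, socket

    def send_exercise_data(payload, host="annotation.example.org", port=9000):
        # Serialise the anonymized session and push it over a TCP connection with a length prefix.
        message = json.dumps(payload).encode("utf-8")
        with socket.create_connection((host, port), timeout=10) as conn:
            conn.sendall(len(message).to_bytes(4, "big"))
            conn.sendall(message)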
  • In a further embodiment of the invention, the physical exercise data from the person is selected from the group comprising motion data, posture data, electromyographic data, pulse rate, blood pressure, blood oxygen content, blood sugar content, severity of perspiration and/or respiratory rate. Some of these data types, such as motion and posture data, relate to the exercise itself. Other data types relate to the overall condition or physical fitness of the person. Knowledge about these can give valuable insight into the effectiveness of rehabilitation or training measures. For example, it may be inferred whether the person is in the supercompensation phase after a training stimulus.
  • In a further embodiment of the invention, the annotation information is selected from the group comprising visual information, audio signals and/or speech recordings. Visual information can be in the form of markings, such as arrows pointing out a specific issue, that are inserted into the images of the avatar. Additionally, small video clips can be inserted to show the correct execution of the exercise. Other visual information can be written comments or graphs showing statistics of data like electromyographic data, pulse rate, blood pressure, blood oxygen content, blood sugar content, severity of perspiration and/or the respiratory rate. This makes it possible to assess the situation of the person performing the exercise at a glance. Audio signals can be simple beeps when a movement is not performed correctly. Recorded speech comments can be added by the reviewer when this is the simplest way of explaining an exercise.
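  • One possible data structure for a single piece of annotation information of the kinds listed above (the class and field names are assumptions, not taken from the patent): a timestamp within the exercise plus a typed payload.

    from dataclasses import dataclass
    from typing import Literal

    @dataclass
    class Annotation:
        time_s: float                                              # position within the exercise
        kind: Literal["marker", "text", "clip", "beep", "speech"]
        payload: str                                               # e.g. marker position, comment text, file name

    feedback = [Annotation(260.0, "marker", "arrow at right elbow"),
                Annotation(260.0, "speech", "right_arm_comment.ogg")]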
  • The present invention is further directed towards a system for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising:
      • a physical data processing unit;
      • a display device in communication with the physical data processing unit;
      • at least one posture recording device assigned to the person undertaking exercises and in communication with the physical data processing unit;
      • a visual recording device in communication with the physical data processing unit;
      • a data storage unit for storing and retrieving data from the physical data processing unit and the visual recording device; the data storage means being in communication with the physical data processing unit;
      • a physically separate annotation unit in connection with the physical data processing unit, the connection being via an interconnected computer network.
  • In one embodiment of the invention, the at least one posture recording device comprises a motion sensor on the person undertaking exercises, the sensor being selected from the group comprising acceleration sensors, inertia sensors and/or gravity sensors. The motion sensors can be worn on the body of the person at selected locations like the upper arm, lower arm, upper leg, lower leg or torso. They can be commercially available, highly integrated solid-state sensors. The transmission of the sensor signals to the posture assessment unit can be undertaken via wire, wirelessly or in a body area network using the electrical conductivity of the human skin. After calculation of the person's posture, the result can be presented in the form of an avatar.
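  • As a hedged illustration of what such body-worn sensors yield (a simplifying assumption: the limb is momentarily at rest): the static gravity components measured by a two-axis accelerometer already give the tilt of the limb segment relative to the vertical.

    import math

    def tilt_deg(a_along_limb, a_across_limb):
        # Tilt of the limb segment relative to the vertical, from static acceleration components.
        return math.degrees(math.atan2(a_across_limb, a_along_limb))

    print(round(tilt_deg(9.81, 0.0)))   # limb aligned with gravity -> 0 degrees
    print(round(tilt_deg(0.0, 9.81)))   # limb horizontal           -> 90 degrees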
  • In a further embodiment of the invention, the at least one posture recording device comprises an optical mark on the person undertaking exercises. The posture recording device then employs an optical tracking system for tracking the at least one optical mark. Based on the signals of the optical tracking system, a representation of the person's posture is then calculated. The optical marks can be borne on the body of the person at selected locations like the upper arm, lower arm, upper leg, lower leg or torso. The tracking of the marks can be effected with a single camera or a multitude of cameras. When a stereo camera is used, three-dimensional posture and movement data is generated. After image processing and calculation of the person's posture, the result can be presented in the form of an avatar.
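  • A deliberately simplified sketch of tracking one optical mark in a single grayscale frame (a real system would use a camera SDK or a computer-vision library; the data here is synthetic): the marker position is estimated as the centroid of pixels brighter than a threshold.

    def marker_centroid(frame, threshold=200):
        # frame: 2D list of grayscale values; returns the (row, col) centroid of bright pixels, or None.
        hits = [(r, c) for r, row in enumerate(frame) for c, v in enumerate(row) if v >= threshold]
        if not hits:
            return None
        return (sum(r for r, _ in hits) / len(hits), sum(c for _, c in hits) / len(hits))

    frame = [[0] * 5 for _ in range(5)]
    frame[2][3] = frame[2][4] = 255              # two bright pixels of a reflective mark
    print(marker_centroid(frame))                # -> (2.0, 3.5)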
  • It is also possible to combine several posture monitoring principles. For example, a combination of motion sensors and optical tracking may provide complementary data to better calculate the posture of the person.
  • A further aspect of the present invention is the use of a system according to the present invention for displaying anonymously annotated physical exercise data to a person undertaking exercises.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention will become more readily understood with reference to the following drawings, wherein
  • FIG. 1 shows a system according to the present invention
  • FIG. 2 shows a synchronous overlay of visual recordings and an avatar representing physical exercise data
  • FIG. 3 shows a flowchart of a method according to the present invention
  • FIG. 4 shows modules for performing a method according to the present invention.
  • DETAILED DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows a system according to the present invention for displaying anonymously annotated physical exercise data to a person undertaking exercises. As posture recording devices, the person has motion sensors 3 situated on his thighs and his ankles. Furthermore, optical marks 3′ are located on the wrist and the torso. Constituting physical exercise data, the signals of the motion sensors 3 are transmitted wirelessly to the physical data processing unit 1, where the raw sensor signals are processed into motion and posture data. A video camera 4 records the motions of the person. Furthermore, the physical data processing unit 1 performs optical tracking operations on the video stream of the camera 4 to identify the position and the movement of the optical marks 3′. This is also processed into motion and posture data and complements the data obtained from the motion sensors 3.
  • The raw or processed sensor signals and the positional information from the optical marks 3′ are stored in a data storage unit 5. Furthermore, the video stream of the person performing the exercise is also stored there. The data in the data storage unit 5 is stored together with information about the time of recording. This makes it possible to correlate or synchronize the information, for example to know which position, as indicated by the posture recording devices 3, 3′, corresponds to which frame of a video clip of the person performing the exercise.
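  • For illustration (assuming, say, video stored at 25 frames per second): the common recording time makes this correlation a simple nearest-timestamp lookup.

    import bisect

    frame_times = [i / 25.0 for i in range(3000)]        # timestamps of the stored video frames

    def nearest_frame_time(sensor_time_s):
        # Find the stored frame whose timestamp is closest to the given sensor timestamp.
        i = bisect.bisect_left(frame_times, sensor_time_s)
        candidates = frame_times[max(i - 1, 0):i + 1] or [frame_times[-1]]
        return min(candidates, key=lambda t: abs(t - sensor_time_s))

    print(nearest_frame_time(4.333))                     # -> 4.32, the closest stored frame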
  • Using an interconnected computer network such as the internet 7, the physical data processing unit 1 transmits the processed signals of the motion sensors 3 and the positional information from the optical marks 3′ to a physically separate annotation unit 6. Temporal information is also transmitted. This annotation unit then calculates a visual representation, such as an avatar, from the received physical data. A physical therapist views the motion of the visual representation on his terminal 8 and comments on sequences, thus performing the annotation. The annotation, together with the time within the exercise at which the annotation has been made, is transmitted back to the physical data processing unit 1 at the location of the person undertaking exercises. Again, the transmission is achieved over an interconnected computer network such as the internet 7.
  • The physical data processing unit 1 then accesses the data storage unit 5 and retrieves the recorded data and video clips from the particular exercise that has been annotated. A movie sequence is generated for viewing by the person and displayed on display 2. In this case, the video stream of the person and an avatar calculated from the recorded data are shown simultaneously. At the appropriate time, the comments of the physical therapist are also displayed or voiced to the person.
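  • A sketch of this playback-side join (the window size and field names are assumptions): while the stored movie sequence is replayed, only the comments whose exercise time lies near the current playback position are shown or voiced.

    def active_annotations(annotations, playback_time_s, window_s=2.0):
        # Select the comments that refer to the moment currently being replayed.
        return [a for a in annotations if abs(a["t"] - playback_time_s) <= window_s]

    comments = [{"t": 260.0, "text": "Keep the right elbow straight"}]
    print(active_annotations(comments, 259.2))           # shown shortly before 4:20 minutes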
  • FIG. 2 shows a synchronous overlay of visual recordings and an avatar representing physical exercise data. A person has been performing an exercise. Physical data representing his motions has been recorded and used for calculation of an avatar representation. The avatar's motion has been time-resolved and split into a stream of individual frames 20. Likewise, the person's movements have been recorded by a video camera. This video image sequence has also been time-resolved and split into a stream of individual frames 21. As the physical exercise data and the visual recordings have been gathered simultaneously, one common time line can be assigned to them. The time line in FIG. 2 beneath the frame streams arbitrarily begins at 4:16 minutes and ends at 4:21 minutes.
  • In the exercise of FIG. 2, the person starts with both of his arms stretched and lowered. In the images, the left arm is kept stretched and raised along the coronal plane until the hand is above the person's head. The arm is kept in this position while the same movement is supposed to be performed with the right arm. At 4:20 minutes, the person is not able to keep his right arm outstretched in the horizontal position. The arm is bent at the elbow. This makes it much easier to lift the arm, so at this point no therapeutic benefit is gained. A physical therapist remotely reviewing the avatar frames 20 can then single out the frame at 4:20 minutes and add a visual or verbal comment. This comment, together with the information that it is to be shown at 4:20 minutes into the exercise, is transmitted to the person for later review. At the person's location the annotation can then be combined with the visual recordings 21 so that the person can relate more directly to the exercise and contemplate his errors in performing it.
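  • To make the FIG. 2 example concrete, a hedged sketch of how the bent elbow could also be flagged automatically from posture data (the joint coordinates are invented for illustration): the shoulder-elbow-wrist angle falls well below 180 degrees once the arm is no longer outstretched.

    import math

    def elbow_angle_deg(shoulder, elbow, wrist):
        # Angle at the elbow between the upper arm and the forearm, in degrees.
        def vec(a, b): return (b[0] - a[0], b[1] - a[1])
        u, v = vec(elbow, shoulder), vec(elbow, wrist)
        cos_a = (u[0] * v[0] + u[1] * v[1]) / (math.hypot(*u) * math.hypot(*v))
        return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))

    print(round(elbow_angle_deg((0.0, 1.5), (0.3, 1.5), (0.6, 1.5))))   # outstretched arm -> 180
    print(round(elbow_angle_deg((0.0, 1.5), (0.3, 1.5), (0.4, 1.8))))   # bent at the elbow -> ~108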
  • FIG. 3 shows a flowchart of a method according to the present invention. The first step 30 is to record the exercise a person is performing both visually, using a camera, and via posture data, using sensors. The visual recordings are stored 31 and the posture recordings are transmitted to an annotation system 32. Using the annotation system, a reviewer examines the posture recordings and adds his comments and markers 33. These annotations are transmitted back to the patient system 34, wherein ‘patient’ denotes the person performing an exercise. On the patient side, the stored visual recordings are retrieved 35 and combined with the annotations 36 in order to give the person comprehensive feedback that still does not compromise his anonymity.
  • FIG. 4 shows modules for performing a method according to the present invention, complementing the depiction of the system in FIG. 1. A sensor receiver 40 receives signals from motion sensors or information from the tracking of optical marks. This sensor receiver 40 communicates its data to a movement transmission module 41. Synchronously with the sensor receiver 40, a camera 42 captures a video sequence of the person performing exercises. These video sequences are stored in a storage facility 43. The movement transmission module 41 transmits its data to a remotely located movement receiver 45; the remote location is symbolized by the barrier 44 separating the two sub-groups of modules.
  • The movement receiving module 45 passes the data on to a movement annotator 46, where the data is transformed into processable form and annotated by a reviewer. The annotation, together with information on the temporal position of the annotation within the exercise, is passed on to an annotation transmission module 47. This annotation transmission module 47 transmits the information to an annotation receiver 48 located in the sub-group of modules assigned to the person performing the exercise. The annotation information reaches a processing and overlay module 49, which accesses video sequences from the storage module 43 and combines the sequences with the annotation so that the annotation is present at the appropriate time of the video sequence. Finally, via a rendering module 50, the overlaid video sequence is displayed to the person who has performed the exercise.
  • To provide a comprehensive disclosure without unduly lengthening the specification, the applicant hereby incorporates by reference each of the patents and patent applications referenced above.
  • The particular combinations of elements and features in the above detailed embodiments are exemplary only; the interchanging and substitution of these teachings with other teachings in this and the patents/applications incorporated by reference are also expressly contemplated. As those skilled in the art will recognize, variations, modifications, and other implementations of what is described herein can occur to those of ordinary skill in the art without departing from the spirit and the scope of the invention as claimed. Accordingly, the foregoing description is by way of example only and is not intended as limiting. The invention's scope is defined in the following claims and the equivalents thereto. Furthermore, reference signs used in the description and claims do not limit the scope of the invention as claimed.

Claims (10)

1. A method for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising the steps of:
a) gathering physical exercise data from a person undertaking exercises;
b) synchronously gathering visual recordings of the person undertaking exercises;
c) transmitting the physical exercise data to a physically separate annotation unit;
d) based on the physical exercise data, annotating the physical exercise data at the physically separate annotation unit;
e) transmitting the annotation information to a display and processing unit for review by the person undertaking exercises;
f) displaying the visual recordings of the person undertaking exercises together with synchronized annotation information to the person.
2. Method according to claim 1, wherein at the physically separate annotation unit in step d) an avatar is calculated based on the physical exercise data.
3. Method according to claim 1, wherein step f) additionally comprises calculating an avatar and displaying the avatar synchronously with the visual recordings and the annotations to the person.
4. Method according to claim 1, wherein transmitting the physical exercise data in step c) and transmitting the annotation information in step e) is undertaken via an interconnected computer network, preferably the internet.
5. Method according to claim 1, wherein the physical exercise data from the person is selected from the group comprising motion data, posture data, electromyographic data, pulse rate, blood pressure, blood oxygen content, blood sugar content, severity of perspiration and/or respiratory rate.
6. Method according to claim 1, wherein the annotation information is selected from the group comprising visual information, audio signals and/or speech recordings.
7. A system for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising:
a physical data processing unit (1);
a display device (2) in communication with the physical data processing unit (1);
at least one posture recording device (3, 3′) assigned to the person undertaking exercises and in communication with the physical data processing unit (1);
a visual recording device (4) in communication with the physical data processing unit (1);
a data storage unit (5) for storing and retrieving data from the physical data processing unit (1) and the visual recording device (4), the data storage unit (5) being in communication with the physical data processing unit (1);
a physically separate annotation unit (6) in connection with the physical data processing unit (1), the connection being via an interconnected computer network (7).
8. System according to claim 7, wherein the at least one posture recording device (3, 3′) comprises a motion sensor (3) on the person undertaking exercises, the sensor being selected from the group comprising acceleration sensors, inertia sensors and/or gravity sensors.
9. System according to claim 7, wherein the at least one posture recording device (3, 3′) comprises an optical mark (3′) on the person undertaking exercises.
10. Use of a system according to claim 7 for displaying anonymously annotated physical exercise data to a person undertaking exercises.
US12/673,793 2007-08-24 2008-08-22 System and method for displaying anonymously annotated physical exercise data Abandoned US20110021317A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP07114912.4 2007-08-24
EP07114912 2007-08-24
PCT/IB2008/053386 WO2009027917A1 (en) 2007-08-24 2008-08-22 System and method for displaying anonymously annotated physical exercise data

Publications (1)

Publication Number Publication Date
US20110021317A1 true US20110021317A1 (en) 2011-01-27

Family

ID=40122948

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/673,793 Abandoned US20110021317A1 (en) 2007-08-24 2008-08-22 System and method for displaying anonymously annotated physical exercise data

Country Status (5)

Country Link
US (1) US20110021317A1 (en)
EP (1) EP2185071A1 (en)
JP (1) JP2010536459A (en)
CN (1) CN101784230A (en)
WO (1) WO2009027917A1 (en)

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9549585B2 (en) 2008-06-13 2017-01-24 Nike, Inc. Footwear having sensor system
EP3087858B1 (en) 2008-06-13 2021-04-28 NIKE Innovate C.V. Footwear having sensor system
US10070680B2 (en) 2008-06-13 2018-09-11 Nike, Inc. Footwear having sensor system
CA2817573C (en) 2010-11-10 2018-07-10 Nike International Ltd. Systems and methods for time-based athletic activity measurement and display
US9381420B2 (en) 2011-02-17 2016-07-05 Nike, Inc. Workout user experience
KR101668841B1 (en) * 2011-02-17 2016-10-25 나이키 이노베이트 씨.브이. Tracking of user performance metrics during a workout session
CA2827684C (en) 2011-02-17 2016-09-27 Nike International Ltd. Footwear having sensor system
US9192816B2 (en) 2011-02-17 2015-11-24 Nike, Inc. Footwear having sensor system
CN102440774A (en) * 2011-09-01 2012-05-09 东南大学 Remote measurement module for related physiological information in rehabilitation training process
ITGE20120011A1 (en) * 2012-01-27 2013-07-28 Paybay Networks S R L PATIENT REHABILITATION SYSTEM
US11071344B2 (en) 2012-02-22 2021-07-27 Nike, Inc. Motorized shoe with gesture control
US11684111B2 (en) 2012-02-22 2023-06-27 Nike, Inc. Motorized shoe with gesture control
US20130213147A1 (en) 2012-02-22 2013-08-22 Nike, Inc. Footwear Having Sensor System
US9743861B2 (en) 2013-02-01 2017-08-29 Nike, Inc. System and method for analyzing athletic activity
US10926133B2 (en) 2013-02-01 2021-02-23 Nike, Inc. System and method for analyzing athletic activity
US11006690B2 (en) 2013-02-01 2021-05-18 Nike, Inc. System and method for analyzing athletic activity
US9279734B2 (en) 2013-03-15 2016-03-08 Nike, Inc. System and method for analyzing athletic activity
JP2014199613A (en) * 2013-03-29 2014-10-23 株式会社コナミデジタルエンタテインメント Application control program, application control method, and application control device
US20150133820A1 (en) * 2013-11-13 2015-05-14 Motorika Limited Virtual reality based rehabilitation apparatuses and methods
WO2015110298A1 (en) * 2014-01-24 2015-07-30 Icura Aps System and method for mapping moving body parts
CN105641900B (en) * 2015-12-28 2019-07-26 联想(北京)有限公司 A kind of respiratory state based reminding method and electronic equipment and system
CN105615852A (en) * 2016-03-17 2016-06-01 北京永数网络科技有限公司 Blood pressure detection system and method
EP3833454B1 (en) * 2018-08-07 2024-10-09 Interactive Strength, Inc. Interactive exercise machine system with mirror display

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0816986B1 (en) * 1996-07-03 2006-09-06 Hitachi, Ltd. System for recognizing motions

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5679004A (en) * 1995-12-07 1997-10-21 Movit, Inc. Myoelectric feedback system
US6244987B1 (en) * 1996-11-25 2001-06-12 Mitsubishi Denki Kabushiki Kaisha Physical exercise system having a virtual reality environment controlled by a user's movement
US20060247070A1 (en) * 2001-06-11 2006-11-02 Recognition Insight, Llc Swing position recognition and reinforcement
US20030054327A1 (en) * 2001-09-20 2003-03-20 Evensen Mark H. Repetitive motion feedback system and method of practicing a repetitive motion
US20040002634A1 (en) * 2002-06-28 2004-01-01 Nokia Corporation System and method for interacting with a user's virtual physiological model via a mobile terminal
US6817979B2 (en) * 2002-06-28 2004-11-16 Nokia Corporation System and method for interacting with a user's virtual physiological model via a mobile terminal
US20060025229A1 (en) * 2003-12-19 2006-02-02 Satayan Mahajan Motion tracking and analysis apparatus and method and system implementations thereof
US20060166737A1 (en) * 2005-01-26 2006-07-27 Bentley Kinetics, Inc. Method and system for athletic motion analysis and instruction
US20060183980A1 (en) * 2005-02-14 2006-08-17 Chang-Ming Yang Mental and physical health status monitoring, analyze and automatic follow up methods and its application on clothing
US20080191864A1 (en) * 2005-03-31 2008-08-14 Ronen Wolfson Interactive Surface and Display System
US20090299232A1 (en) * 2006-07-12 2009-12-03 Koninklijke Philips Electronics N.V. Health management device
US20110072457A1 (en) * 2007-08-22 2011-03-24 Koninklijke Philips Electronics N.V. System and method for displaying selected information to a person undertaking exercises

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8500604B2 (en) * 2009-10-17 2013-08-06 Robert Bosch Gmbh Wearable system for monitoring strength training
US20110092337A1 (en) * 2009-10-17 2011-04-21 Robert Bosch Gmbh Wearable system for monitoring strength training
US10886016B2 (en) 2010-09-29 2021-01-05 Dacadoo Ag Automated health data acquisition, processing and communication system
US20230338778A1 (en) * 2011-01-26 2023-10-26 Flow-Motion Research And Development Ltd. Method and system for monitoring and feed-backing on execution of physical exercise routines
US11417420B2 (en) 2011-05-16 2022-08-16 Dacadoo Ag Optical data capture of exercise data in furtherance of a health score computation
US10546103B2 (en) 2011-05-16 2020-01-28 Dacadoo Ag Optical data capture of exercise data in furtherance of a health score computation
US20130178960A1 (en) * 2012-01-10 2013-07-11 University Of Washington Through Its Center For Commercialization Systems and methods for remote monitoring of exercise performance metrics
US11990233B2 (en) 2012-10-09 2024-05-21 Kc Holdings I Personalized avatar responsive to user physical state and context
US11490864B2 (en) 2012-10-09 2022-11-08 Kc Holdings I Personalized avatar responsive to user physical state and context
US11107579B2 (en) * 2012-10-09 2021-08-31 Kc Holdings I Personalized avatar responsive to user physical state and context
US9656119B2 (en) 2012-12-27 2017-05-23 Casio Computer Co., Ltd. Exercise information display system, exercise information display method, and computer-readable storage medium having exercise information display program stored thereon
US10484437B2 (en) * 2015-01-21 2019-11-19 Logmein, Inc. Remote support service with two-way smart whiteboard
US20160346612A1 (en) * 2015-05-29 2016-12-01 Nike, Inc. Enhancing Exercise Through Augmented Reality
WO2017055080A1 (en) * 2015-09-28 2017-04-06 Koninklijke Philips N.V. System and method for supporting physical exercises
US20200261763A1 (en) * 2016-01-12 2020-08-20 Samsung Electronics Co., Ltd. Display device and control method therefor
US11020628B2 (en) * 2016-01-12 2021-06-01 Samsung Electronics Co., Ltd. Display device and control method therefor
EP3489959A1 (en) * 2017-11-24 2019-05-29 Toyota Jidosha Kabushiki Kaisha Medical data communication apparatus, server, medical data communication method and medical data communication program
US11507689B2 (en) 2017-11-24 2022-11-22 Toyota Jidosha Kabushiki Kaisha Medical data communication apparatus, server, medical data communication method and medical data communication program
US20200107750A1 (en) * 2018-10-03 2020-04-09 Surge Motion Inc. Method and system for assessing human movements
US12142375B2 (en) 2022-06-09 2024-11-12 Kc Holdings I Personalized avatar responsive to user physical state and context

Also Published As

Publication number Publication date
CN101784230A (en) 2010-07-21
JP2010536459A (en) 2010-12-02
WO2009027917A1 (en) 2009-03-05
EP2185071A1 (en) 2010-05-19

Similar Documents

Publication Publication Date Title
US20110021317A1 (en) System and method for displaying anonymously annotated physical exercise data
KR100772497B1 (en) Golf clinic system and application method thereof
US11069144B2 (en) Systems and methods for augmented reality body movement guidance and measurement
US9892655B2 (en) Method to provide feedback to a physical therapy patient or athlete
CN108289613B (en) System, method and computer program product for physiological monitoring
JP4594157B2 (en) Exercise support system, user terminal device thereof, and exercise support program
CA2844651C (en) Systems, apparatus and methods for non-invasive motion tracking to augment patient administered physical rehabilitation
US20170136296A1 (en) System and method for physical rehabilitation and motion training
US20140172460A1 (en) System, Method, and Computer Program Product for Digitally Recorded Musculoskeletal Diagnosis and Treatment
WO2022193425A1 (en) Exercise data display method and system
US9248361B1 (en) Motion capture and analysis systems for use in training athletes
US20220105389A1 (en) System and Method for Providing Guided Augmented Reality Physical Therapy in a Telemedicine Platform
JP2009542397A (en) Health management device
US11682157B2 (en) Motion-based online interactive platform
JP2019118783A (en) Remote rehabilitation analysis device and method thereof
US12009083B2 (en) Remote physical therapy and assessment of patients
US12067324B2 (en) Virtual and augmented reality telecommunication platforms
CN115115810A (en) Multi-person collaborative focus positioning and enhanced display method based on spatial posture capture
KR20140082449A (en) Health and rehabilitation apparatus based on natural interaction
US20240198177A1 (en) Exercise instruction and feedback systems and methods
AU2021107210A4 (en) System, method and virtual reality device for assessing compliance of body movement
JP2022552785A (en) Quantified movement feedback system
JP7353605B2 (en) Inhalation motion estimation device, computer program, and inhalation motion estimation method
US20240135617A1 (en) Online interactive platform with motion detection
US20240215922A1 (en) Patient positioning adaptive guidance system

Legal Events

Date Code Title Description
AS Assignment

Owner name: KONINKLIJKE PHILIPS ELECTRONICS N V, NETHERLANDS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LANDERMANN, GERD;REEL/FRAME:023944/0371

Effective date: 20091109

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION