WO2015107737A1 - Information processing apparatus, information processing method, and program - Google Patents
Information processing apparatus, information processing method, and program
- Publication number
- WO2015107737A1 (PCT/JP2014/077597)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- mounting position
- information processing
- setting
- sensor
- processing apparatus
- Prior art date
Classifications
- G06N7/01—Probabilistic graphical models, e.g. probabilistic networks
- A61B5/1038—Measuring plantar pressure during gait
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1123—Discriminating type of movement, e.g. walking or running
- A61B5/113—Measuring movement of the body or parts thereof occurring during breathing
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/6803—Head-worn items, e.g. helmets, masks, headphones or goggles
- A61B5/681—Wristwatch-type devices
- A61B5/6814—Head
- A61B5/6822—Neck
- A61B5/6823—Trunk, e.g. chest, back, abdomen, hip
- A61B5/6824—Arm or wrist
- A61B5/6825—Hand
- A61B5/6829—Foot or ankle
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- G06F1/1626—Portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
- G06F1/163—Wearable computers, e.g. on a belt
- G06F1/1694—Integrated I/O peripherals being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
- G06F3/0346—Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/038—Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
- G06N5/02—Knowledge representation; Symbolic representation
- G06V40/23—Recognition of whole body movements, e.g. for sport training
- A61B2562/0219—Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
- A61B2562/06—Arrangements of multiple sensors of different types
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/1118—Determining activity level
- A61B5/7405—Notification to user or communication with user or patient using sound
- A61B5/742—Notification to user or communication with user or patient using visual displays
- A61B5/7455—Notification to user or communication with user or patient by tactile indication, e.g. vibration or electrical stimulation
- A61B5/7465—Arrangements for interactive communication between patient and care services, e.g. by using a telephone network
- G16H50/20—ICT specially adapted for computer-aided medical diagnosis, e.g. based on medical expert systems
Definitions
- The present disclosure relates to an information processing apparatus, an information processing method, and a program.
- Wearable devices that a user can wear and use, such as wristwatch-type devices, have come into use alongside portable devices such as smartphones.
- An information processing apparatus is provided that includes a behavior recognition mode setting unit that sets a behavior recognition mode based on mounting position information of a setting target device, a behavior recognition unit that recognizes the user's behavior based on the set behavior recognition mode and the detection value of a sensor corresponding to the setting target device, and a process control unit that controls execution of a process corresponding to the recognized behavior.
- An information processing method executed by an information processing apparatus is also provided, including a step of setting a behavior recognition mode based on mounting position information of a setting target device, a step of recognizing the user's behavior based on the set behavior recognition mode and the detection value of a sensor corresponding to the setting target device, and a step of controlling execution of a process corresponding to the recognized behavior.
- A program causing a computer to execute these steps is likewise provided.
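Read together, the claims describe a single pipeline: set a mode from the mounting position, recognize behavior under that mode, then control a process. A minimal sketch in Python, with all names and placeholder logic invented for illustration (the patent defines the units, not this code):

```python
class InformationProcessingApparatus:
    """Illustrative sketch of the claimed three-unit pipeline."""

    def set_mode(self, mounting_position: str) -> str:
        # Behavior recognition mode setting unit: choose a mode from the
        # mounting position information of the setting target device.
        return {"wrist": "wrist_mode", "chin": "chin_mode"}.get(
            mounting_position, "default_mode")

    def recognize(self, mode: str, detection_values: list[float]) -> str:
        # Behavior recognition unit: recognize the user's behavior from the
        # set mode and the sensor detection values (placeholder rule).
        threshold = 0.05 if mode == "chin_mode" else 1.0  # finer for chin
        peak = max((abs(v) for v in detection_values), default=0.0)
        return "active" if peak > threshold else "resting"

    def control(self, behavior: str) -> None:
        # Process control unit: control execution of the process
        # corresponding to the recognized behavior.
        print(f"execute process for: {behavior}")

    def run(self, mounting_position: str, detection_values: list[float]) -> None:
        mode = self.set_mode(mounting_position)
        behavior = self.recognize(mode, detection_values)
        self.control(behavior)
```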
- Depending on where it is worn on the user's body, the device on which the sensor is mounted (or the device to which an external sensor is attached; the same applies hereinafter) may recognize the user's behavior with degraded accuracy, or may fail to recognize the behavior at all.
- FIGS. 1A to 1H are explanatory views illustrating the information processing method according to the present embodiment.
- FIGS. 1A to 1H show examples of sensor detection values, by mounting position and user behavior, for a device equipped with an acceleration sensor and a gyro sensor.
- FIG. 1A shows an example of the detection value when the device on which the sensor is mounted is worn on the user's head; A in FIG. 1A shows the detection value while the user is swimming.
- FIG. 1B shows an example of the detection value when the device on which the sensor is mounted is worn on the user's chin; A in FIG. 1B shows the detection value while the user is speaking, and B in FIG. 1B shows the detection value while the user is chewing.
- FIG. 1C shows an example of the detection value when the device equipped with the sensor is worn on the user's neck; A in FIG. 1C shows the detection value while the user is squatting, and B in FIG. 1C shows the detection value while the user is doing push-ups.
- FIG. 1D shows an example of the detection value when the device on which the sensor is mounted is worn on the user's neck; A in FIG. 1D shows the detection value while the user is doing abdominal exercises, and B in FIG. 1D shows the detection value while the user is doing back exercises.
- FIG. 1E shows an example of a detection value when the device equipped with the sensor is worn on the chest of the user, and shows the detection value when the user is breathing.
- FIG. 1F shows an example of the detection value for the device on which the sensor is mounted; A in FIG. 1F shows the detection value while the user is playing soccer, and B in FIG. 1F shows the detection value while the user is riding a bicycle.
- FIG. 1G shows an example of a detection value when the device on which the sensor is mounted is worn on the user's finger, and shows the detection value when the user is typing.
- FIG. 1H shows an example of the detection value when the device on which the sensor is mounted is worn on the user's wrist; A in FIG. 1H shows the detection value during a tennis swing, and B in FIG. 1H shows the detection value during a baseball swing.
- As shown in FIGS. 1A to 1H, the detection values obtained differ with the mounting position and the user's behavior.
- For example, the acceleration sensor detection value during swimming shown in A of FIG. 1A and the acceleration sensor detection value during squatting shown in A of FIG. 1C are similar. Therefore, if the user's behavior is recognized simply from the sensor detection value, it may be impossible to determine whether the user is swimming or squatting, or an erroneous determination result may be obtained.
- Further, when the device on which the sensor is mounted is worn on the user's chin, the detection value is, as shown in FIG. 1B, much smaller than when the device is worn at other positions. That is, when the device is worn on the chin, the user's behavior must be recognized from minute vibrations. Therefore, to recognize the behavior of the user wearing the device with higher accuracy, it is desirable to switch settings such as the sensor resolution according to the position at which the device is worn on the user.
- The information processing apparatus according to the present embodiment therefore performs, as processes related to the information processing method according to the present embodiment, the following (1) behavior recognition mode setting process, (2) behavior recognition process, and (3) execution control process.
- By performing these processes, the user's behavior can be recognized with higher accuracy, and a process corresponding to the recognized behavior can be controlled.
- (1) Behavior recognition mode setting process: The information processing apparatus according to the present embodiment sets a behavior recognition mode for the setting target device based on the mounting position information.
- the mounting position information according to the present embodiment is data indicating the mounting position where the setting target device is mounted on the user.
- The mounting position information according to the present embodiment may be data that directly indicates the mounting position, such as the head or neck (for example, data representing the mounting position as a character string), or data that indirectly indicates the mounting position (for example, an ID indicating the mounting position).
- The mounting position information according to the present embodiment is generated, for example, when the information processing apparatus according to the present embodiment performs the mounting position recognition process described later in (4).
- In this case, the information processing apparatus performs the processes related to the information processing method according to the present embodiment using the mounting position information it has generated.
- Alternatively, the mounting position information may be generated by an external device that performs a process similar to the mounting position recognition process described later in (4).
- In that case, the information processing apparatus acquires the mounting position information from the external device via, for example, a communication unit (described later) or a connected external communication device, and performs the processes related to the information processing method using the acquired information.
- Here, the setting target device according to the present embodiment is the device for which the behavior recognition mode is set.
- Examples of the setting target device according to the present embodiment include a device on which a sensor used for recognizing the user's behavior is mounted, and a device to which an external sensor for recognizing the user's behavior is attached.
- the setting target device according to the present embodiment may be the information processing device according to the present embodiment, or may be an external device of the information processing device according to the present embodiment.
- Specifically, the setting target device may be a “portable device such as a smartphone, a mobile phone, or a tablet device” or a “wearable device”.
- In the following, the case where the setting target device according to the present embodiment is a wearable device is mainly described as an example.
- FIG. 2 is an explanatory diagram illustrating an example of a setting target device according to the present embodiment, and illustrates an example of a wearable device when the setting target device according to the present embodiment is a wearable device.
- A to E shown in FIG. 2 each show an example of a wearable device.
- the wearable device according to the present embodiment is not limited to the example shown below.
- Head-mounted device (A in FIG. 2): For example, HMD (Head Mounted Display), imaging device, etc.
- Eyewear-type device (B in FIG. 2): For example, HMD, glasses-type device, etc.
- Neck-mounted device (C in FIG. 2): For example, imaging device, headset, necklace-type device, data logger, etc.
- Wrist/arm-mounted device (D in FIG. 2): For example, watch-type device, data logger, bracelet-type device, wristband-type device, etc.
- Hand/finger-mounted device (E1 in FIG. 2): For example, glove-type device, ring-type device, etc.
- Outerwear/pocket-mounted device (E2 in FIG. 2): For example, belt-type device, clip/magnet-type device, data logger, etc.
- Ankle/foot-mounted device (E3 in FIG. 2): For example, anklet-type device, data logger, etc.
- Each wearable device has a sensor S used for behavior recognition.
- The sensor S used for behavior recognition may be a sensor built into the wearable device (a sensor the wearable device itself includes) or an external sensor connected to the wearable device.
- Hereinafter, a sensor used for behavior recognition that corresponds to a setting target device, such as the sensor S shown in FIG. 2, may be referred to as a "sensor corresponding to the setting target device".
- Examples of the sensor according to the present embodiment include an acceleration sensor, a GPS (Global Positioning System) device, a gyro sensor, an atmospheric pressure sensor, a proximity sensor, and a biological sensor.
- The sensor according to the present embodiment is not limited to these; it may be any sensor that can be used in processes related to recognizing the user's behavior, such as the behavior recognition process according to the present embodiment.
- The behavior recognition mode is used to determine the user's behavior state, and indicates a single setting related to behavior recognition or a combination of multiple such settings.
- Examples of the behavior recognition mode include one or a combination of "settings related to the sensor" and "settings related to the processing for behavior recognition".
- Examples of the settings related to the sensor according to the present embodiment include one or both of the setting of the type of sensor used for behavior recognition and the setting of parameters of the sensor used for behavior recognition (for example, sampling settings and sensing mode settings).
- Examples of the setting of the sensor type include selecting which sensors to operate (which may include powering off sensors that are not to be operated).
- Examples of the sensor parameter settings include any settings related to sensor operation and to the output of sensor detection values, such as sampling settings and sensing mode settings.
- Examples of the settings related to the processing for behavior recognition include one or more of the setting of the type of feature amount extracted from the detection values of the sensor corresponding to the setting target device, the setting of the algorithm used for the behavior recognition processing, and the setting of the model data used for that processing. A rough sketch of such a combined mode follows.
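As a rough illustration of what such a combined mode could look like in code (a sketch only; every field name and value below is an assumption, not taken from the patent):

```python
from dataclasses import dataclass, field

@dataclass
class SensorSettings:
    """Settings related to the sensor (hypothetical fields)."""
    active_sensors: list[str] = field(
        default_factory=lambda: ["accelerometer"])  # sensors to operate
    sampling_rate_hz: int = 50      # sampling setting
    sensing_mode: str = "normal"    # e.g. "normal" or "high_resolution"

@dataclass
class RecognitionSettings:
    """Settings related to the processing for behavior recognition."""
    feature_types: list[str] = field(
        default_factory=lambda: ["mean", "variance"])  # feature amount types
    algorithm_id: str = "pattern_matching"             # recognition algorithm
    model_data_id: str = "generic_model_v1"            # model data

@dataclass
class BehaviorRecognitionMode:
    """One behavior recognition mode: a combination of both setting kinds."""
    sensor: SensorSettings = field(default_factory=SensorSettings)
    recognition: RecognitionSettings = field(default_factory=RecognitionSettings)
```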
- By referring to the mounting position information, the information processing apparatus according to the present embodiment can recognize the position at which the setting target device is worn on the user.
- The information processing apparatus according to the present embodiment then sets, for the setting target device, a behavior recognition mode corresponding to the recognized mounting position.
- the information processing apparatus for example, is recognized by the table (or database) in which the mounting position and the action recognition mode to be set are associated with each other and the mounting position recognized based on the mounting position information.
- the action recognition mode corresponding to the position is specified.
- the information processing apparatus which concerns on this embodiment sets the specified action recognition mode with respect to the apparatus of setting object.
- the information processing apparatus when there are a plurality of action recognition modes corresponding to the wearing position, stores the history of the set action recognition mode, the time, and the position of the setting target apparatus corresponding to the wearing position information. It is also possible to specify the action recognition mode corresponding to the mounting position using one or more of them.
- the information processing apparatus when there are a plurality of behavior recognition modes corresponding to the mounting position, the information processing apparatus according to the present embodiment visually sets the behavior recognition mode that is a candidate to be set to the user of the setting target device.
- the behavior recognition mode that is presented auditorily and selected by the user may be specified as the behavior recognition mode corresponding to the wearing position.
- Examples of the behavior recognition mode corresponding to a mounting position include modes tailored to particular user behaviors, such as those illustrated in FIGS. 1A to 1H.
- The behavior recognition mode corresponding to a mounting position in the present embodiment is not limited to these examples. A hypothetical position-to-mode lookup is sketched below.
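A hypothetical version of the lookup described above; the positions, mode contents, and the high-resolution choice for the chin (motivated by the minute-vibration discussion earlier) are all invented for illustration:

```python
# Hypothetical table associating mounting positions with the behavior
# recognition mode to set.
MODE_TABLE = {
    "wrist": {"sensors": ["accelerometer", "gyroscope"],
              "sensing_mode": "normal",
              "model": "wrist_model_v1"},
    "chin":  {"sensors": ["accelerometer"],
              # Minute vibrations at the chin call for a finer setting.
              "sensing_mode": "high_resolution",
              "model": "chin_model_v1"},
}
DEFAULT_MODE = {"sensors": ["accelerometer"],
                "sensing_mode": "normal",
                "model": "generic_model"}

def mode_for_position(mounting_position: str) -> dict:
    """Specify the behavior recognition mode for the recognized mounting
    position, falling back to a default when the position is unknown."""
    return MODE_TABLE.get(mounting_position, DEFAULT_MODE)
```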
- More specifically, the information processing apparatus according to the present embodiment performs, for example, the following processes (1-1) and (1-2) as processes related to setting the behavior recognition mode.
- (1-1) Settings related to the sensor: The information processing apparatus according to the present embodiment performs settings related to the sensor corresponding to the setting target device based on the mounting position information.
- When the setting target device is an external device of the information processing apparatus according to the present embodiment, the information processing apparatus performs the sensor settings by, for example, transmitting data indicating the settings to that device.
- Here, the information processing apparatus according to the present embodiment transmits the data indicating the sensor settings via, for example, a communication unit (described later) of the information processing apparatus or a connected external communication device.
- Examples of the data indicating the sensor settings according to the present embodiment include one or both of the following; the data is not limited to these examples, and may be any data (or data group) capable of controlling the operation of the sensor:
- data indicating the types of sensors to enable (for example, sensor IDs)
- data indicating sensor parameters
- The data indicating the sensor settings may also include, for example, an instruction to execute the settings.
- When the setting target device is the information processing apparatus itself, the information processing apparatus performs the sensor settings on its own sensor (an example of a sensor corresponding to the setting target device) or on a connected external sensor (another example of a sensor corresponding to the setting target device).
- (1-2) Settings related to the processing for behavior recognition: The information processing apparatus according to the present embodiment performs these settings by, for example, recording data indicating the setting target device and data indicating the processing settings corresponding to the mounting position recognized from the mounting position information in association with each other in a table, database, or the like.
- Examples of the data indicating the setting target device according to the present embodiment include a device ID.
- Examples of the data indicating the settings related to the processing for behavior recognition include the following; the data is not limited to these examples:
- Any data (or data group) capable of controlling the processing related to behavior recognition may be used, for example:
- data indicating the type of feature amount (for example, an ID indicating the feature amount)
- data indicating the algorithm used for the behavior recognition processing (for example, program data or an ID indicating the algorithm)
- data indicating the model data used for the behavior recognition processing (for example, the model data itself or an ID indicating the model data)
- The information processing apparatus according to the present embodiment then refers to, for example, the table described above and uses the processing settings corresponding to the setting target device when performing the behavior recognition process described later in (2).
- When the processing related to behavior recognition is performed by an external device, the information processing apparatus according to the present embodiment may cause the external device to perform it by transmitting data indicating the processing settings corresponding to the mounting position recognized from the mounting position information.
- Here, the information processing apparatus according to the present embodiment transmits the data via, for example, a communication unit (described later) or a connected external communication device.
- The transmitted data may include an instruction to execute the settings related to behavior recognition.
- By performing the process of the first example shown in (1-1) and the process of the second example shown in (1-2), the information processing apparatus according to the present embodiment sets, for the setting target device, a behavior recognition mode based on the mounting position information.
- (2) Behavior recognition process: The information processing apparatus according to the present embodiment recognizes the user's behavior based on the set behavior recognition mode and the detection value of the sensor corresponding to the setting target device.
- For example, the information processing apparatus recognizes the user's behavior by pattern matching between a feature amount extracted from the sensor detection value according to the set behavior recognition mode and the feature amounts corresponding to candidate behaviors.
- However, the behavior recognition process according to the present embodiment is not limited to the above.
- The information processing apparatus according to the present embodiment may recognize the user's behavior using any technique that can do so from the sensor detection value, such as threshold-based processing. A minimal sketch of the pattern-matching approach follows.
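A minimal sketch of the pattern-matching idea, assuming nearest-neighbor matching on a small feature vector (the patent does not specify the features or distance measure; all values here are invented):

```python
import math

# Hypothetical reference feature vectors for candidate behaviors under the
# currently set behavior recognition mode.
CANDIDATE_FEATURES = {
    "swimming":  [0.8, 0.3, 0.5],
    "squatting": [0.7, 0.4, 0.1],
    "typing":    [0.1, 0.1, 0.9],
}

def extract_features(detection_values: list[float]) -> list[float]:
    """Placeholder features: mean, variance, and peak of the detection
    values; the real feature types are part of the mode's settings."""
    n = len(detection_values)
    mean = sum(detection_values) / n
    var = sum((v - mean) ** 2 for v in detection_values) / n
    peak = max(abs(v) for v in detection_values)
    return [mean, var, peak]

def recognize_behavior(detection_values: list[float]) -> str:
    """Return the candidate behavior whose features are closest."""
    feats = extract_features(detection_values)
    return min(CANDIDATE_FEATURES,
               key=lambda b: math.dist(feats, CANDIDATE_FEATURES[b]))
```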
- Examples of the user behaviors recognized in the behavior recognition process according to the present embodiment include the behaviors corresponding to FIGS. 1A to 1H described above, such as swimming.
- The information processing apparatus according to the present embodiment can also recognize the same user behavior at a plurality of mounting positions.
- Examples of user behaviors that can be recognized at a plurality of mounting positions include "vehicle recognition", which recognizes that the user is on a vehicle such as a bus or train.
- When the same behavior is recognized at a plurality of mounting positions, the recognition processing may differ for each mounting position. For example, by providing a behavior recognition model and a dictionary for each mounting position, the processing for recognizing the user's behavior can be changed per position.
- Alternatively, a predetermined process set in advance as the behavior recognition process can be performed regardless of the mounting position.
- For example, as noted above, the acceleration sensor detection value during swimming shown in A of FIG. 1A and the acceleration sensor detection value during squatting shown in A of FIG. 1C are similar detection values from the acceleration sensor (an example of a sensor corresponding to the setting target device).
- However, because the information processing apparatus according to the present embodiment sets the behavior recognition mode based on the mounting position information in process (1) (the behavior recognition mode setting process), recognition accuracy can be increased by settings corresponding to the mounting position.
- Also, because the behavior recognition mode is set based on the mounting position information in process (1) (the behavior recognition mode setting process), the sensor resolution can be switched according to the mounting position. The information processing apparatus according to the present embodiment can therefore recognize the user's behavior with higher accuracy based on the detection value of the sensor corresponding to the setting target device.
- (3) Execution control process: The information processing apparatus according to the present embodiment controls execution of a process corresponding to the recognized user behavior. The information processing apparatus may also control execution of a process corresponding to both the mounting position indicated by the mounting position information and the recognized behavior.
- Hereinafter, the process controlled by the execution control process according to the present embodiment is referred to as the "process corresponding to the behavior".
- Specifically, the information processing apparatus according to the present embodiment specifies the process corresponding to the behavior based on, for example, a table (or database) that associates user behaviors with the processes to be controlled, together with the behavior recognized in process (2) (the behavior recognition process).
- It is also possible to specify the process corresponding to the behavior based on a table (or database) that associates mounting positions, user behaviors, and processes to be controlled, together with the mounting position indicated by the mounting position information and the behavior recognized in process (2). A sketch of such a lookup follows.
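A sketch of the two-key lookup and the resulting control, with invented table entries; sending the processing command is stubbed out, since the patent leaves the transport to a communication unit:

```python
# Hypothetical (mounting position, behavior) -> process table.
PROCESS_TABLE = {
    ("wrist", "tennis_swing"): "start_video_capture",
    ("chest", "breathing"):    "share_breathing_pace",
    ("ankle", "cycling"):      "start_training_menu",
}

def control_execution(position: str, behavior: str, device_id: str) -> None:
    """Specify the process corresponding to the behavior and issue a
    processing command to the device that should execute it."""
    process = PROCESS_TABLE.get((position, behavior))
    if process is None:
        return  # no process associated with this combination
    # Stand-in for transmitting a processing command to the executing
    # device (an external wearable device, or the apparatus itself).
    print(f"-> {device_id}: execute {process}")
```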
- When the device that executes the process corresponding to the behavior is an external device of the information processing apparatus according to the present embodiment, the information processing apparatus transmits a processing command for performing the specified process to that external device.
- The external device that receives the processing command then executes the process corresponding to the behavior in accordance with the command.
- The information processing apparatus according to the present embodiment may also transmit data related to the specified process (for example, an application used for executing the process, or processing parameters) to the external device.
- Here, the processing command is transmitted via, for example, a communication unit (described later) of the information processing apparatus or a connected external communication device.
- When the device that executes the process corresponding to the behavior is the information processing apparatus itself, the information processing apparatus executes the specified process.
- In this way, the information processing apparatus according to the present embodiment controls execution of the process corresponding to the behavior either by causing an external device to perform it or by performing it itself.
- Hereinafter, the execution control process is described mainly for the case where the device that executes the process corresponding to the behavior is a wearable device (an example of a setting target device), and mainly for the case where execution is controlled according to both the mounting position indicated by the mounting position information and the recognized behavior, using specific examples. Needless to say, the execution control process according to the present embodiment is not limited to the examples shown below.
- For example, the information processing apparatus according to the present embodiment conveys information to the user by causing the wearable device worn by the user to output sound.
- For instance, the information processing apparatus may activate, on the wearable device worn by the user, an application that reads out the training repetition count in the voice of a specific celebrity or character.
- An application that shares the user's breathing pace with friends may include, for example, a function that adds the breathing pace to the user's activity status and a function that controls an avatar accordingly (for example, increasing the avatar's sweat when breathing is labored while running, and flushing its cheeks when breathing is labored at rest).
- Such an application may also have a function of sharing data with devices running the same application within the wearable device's communication range.
- AR (Augmented Reality)
- The process related to sleep determination may be performed by the information processing apparatus according to the present embodiment, or an external device may perform it and the information processing apparatus may use the result of the sleep determination performed by the external device.
- The sleep apnea syndrome confirmation application has a function of detecting symptoms of sleep apnea syndrome using, for example, both the sleep determination result and the respiration determination result.
- The application may also have a function of, for example, warning registered users, such as the person or their family, when a symptom of sleep apnea syndrome is detected.
- A process in which a wearable device worn on the ankle presents the kick speed and the strength of impact with sound, for example "spa" when weak or "zudon" when strong.
- When cycling is recognized, the information processing apparatus according to the present embodiment specifies, for example, a process related to a cycling function or a process related to a training function as the process corresponding to the behavior.
- The information processing apparatus according to the present embodiment determines which process to execute based on, for example, the cadence, and then causes the wearable device worn by the user to execute the process related to the cycling function or the process related to the training function.
- Examples of the process related to the training function according to the present embodiment include one or both of a process of automatically generating a training menu and a process of giving voice instructions on pace distribution and course.
- Examples of the process corresponding to the behavior also include a process of giving the user feedback, for example by voice (including music), vibration, text, or light.
- As one example, feedback prompting a break may be given when the user has continued typing for a certain time or longer.
- When a swing is recognized, the information processing apparatus according to the present embodiment specifies, for example, a process of capturing a moving image as the process corresponding to the behavior, and may further specify a process of editing the captured moving image.
- Specifically, the information processing apparatus causes an imaging device associated with the wearable device worn by the user to capture the moving image, and causes an image processing device associated with the wearable device to edit the captured moving image.
- the imaging apparatus according to the present embodiment is arranged at a position where a user wearing the wearable apparatus can be imaged, for example, using a tripod. Note that the imaging device associated with the wearable device and the image processing device associated with the wearable device may be the same device.
- Examples of the image capture process include starting moving-image capture with a swing as the trigger, and stopping capture when no swing has been detected for a predetermined time. Capturing images in this way saves energy compared with capturing continuously.
- Examples of the process of editing the captured moving image according to the present embodiment include a process of automatically generating a digest image from the captured moving image, triggered by the end of imaging.
- Examples of the process of notifying the user of the route according to the present embodiment include auditory feedback by outputting sound from a speaker and tactile feedback by vibrating a vibrator or the like.
- The process of notifying the user of the route according to the present embodiment is not limited to these; it may use any UI (User Interface) that can effectively notify the user even from within a pants pocket.
- The information processing apparatus according to the present embodiment performs, for example, the processes of the first example shown in (a) through the eighth example shown in (h) as the execution control process according to the present embodiment; however, the execution control process is not limited to these.
- As described above, the information processing apparatus according to the present embodiment can recognize the same user behavior at a plurality of mounting positions, for example that the user is on a vehicle such as a bus or train.
- When vehicle recognition occurs in process (2) (the behavior recognition process), the information processing apparatus specifies, as the process corresponding to the behavior, a process that restricts how the user is notified (an example of the predetermined process mentioned above), and causes the wearable device or the like to execute it.
- Examples of the process of restricting the notification method according to the present embodiment include a process of restricting auditory notification by voice.
- As described above, the information processing apparatus according to the present embodiment performs, as processes related to the information processing method according to the present embodiment, process (1) (the behavior recognition mode setting process), process (2) (the behavior recognition process), and process (3) (the execution control process).
- In process (1), the behavior recognition mode is set based on the mounting position information, and in process (2), the user's behavior is recognized based on the set mode. That is, the information processing apparatus according to the present embodiment can recognize the user's behavior using a behavior recognition mode corresponding to the mounting position, and can therefore recognize the behavior with higher accuracy from the detection value of the sensor corresponding to the setting target device.
- In process (3) (the execution control process), the information processing apparatus controls execution of the process corresponding to the behavior recognized in process (2) (the behavior recognition process).
- By performing processes (1) through (3), the information processing apparatus according to the present embodiment can thus recognize the user's behavior with higher accuracy and control the process corresponding to the recognized behavior.
- Note that the processes related to the information processing method according to the present embodiment are not limited to processes (1) through (3).
- For example, the information processing apparatus according to the present embodiment can further perform (4) a mounting position recognition process that recognizes the position at which the setting target device is worn on the user.
- When the mounting position recognition process is performed, the information processing apparatus according to the present embodiment sets the behavior recognition mode in process (1) based on mounting position information indicating the mounting position recognized in that process.
- Likewise, in process (3) (the execution control process), the information processing apparatus controls execution based on mounting position information indicating the mounting position recognized in the mounting position recognition process.
- (4) Mounting position recognition process: The mounting position recognition process according to the present embodiment is described below in more detail. As noted above, this process may instead be performed by an external device of the information processing apparatus according to the present embodiment.
- In the following, the setting target device according to the present embodiment is described as an external device of the information processing apparatus, but the setting target device may also be the information processing apparatus itself.
- (4-1) First example of the mounting position recognition process: The information processing apparatus according to the present embodiment recognizes the mounting position based on the detection value of the sensor corresponding to the setting target device and conditions corresponding to the positions where that sensor can be mounted.
- Here, a condition corresponding to a mountable position is, for example, a constraint on the sensor detection value, such as the posture or speed expected at that position.
- Because such constraints differ for each mountable position, the information processing apparatus according to the present embodiment can recognize the mounting position from the detection value of the sensor corresponding to the setting target device by taking the constraints into account.
- FIG. 3 is an explanatory diagram for explaining a first example of the mounting position recognition process according to the present embodiment.
- FIG. 3 shows an example of a mounting position recognition process when a setting target device having a sensor used for action recognition is mounted on an attachment A provided at a certain mountable position.
- First, the information processing apparatus according to the present embodiment acquires a sensor log indicating the detection value of the sensor from the setting target device (S100).
- Next, the information processing apparatus according to the present embodiment determines whether the wearing time on attachment A satisfies a condition (S102); for example, the condition is determined to be satisfied when the time since acquisition of the sensor log started is equal to or greater than (or exceeds) a predetermined threshold.
- If the condition is not determined to be satisfied in step S102, the information processing apparatus according to the present embodiment repeats the processing from step S100.
- If the condition is satisfied, the information processing apparatus according to the present embodiment calculates the time distribution of the step count of the user wearing the setting target device (S104), the time distribution of the average acceleration in each of the X-, Y-, and Z-axis directions (S106), and XY-Attitude (the attitude of the sensor's X and Y axes), YZ-Attitude (the Y and Z axes), and ZX-Attitude (the Z and X axes) (S108).
- FIGS. 4 and 5 are explanatory diagrams for the first example of the mounting position recognition process according to the present embodiment, showing the case where the setting target device is worn on the user's wrist.
- XY-Attitude, YZ-Attitude, and ZX-Attitude are calculated by Equations 1 to 3 below. As shown in FIGS. 4 and 5, the gravity direction (the "Gravity" direction) relates to the X-axis direction ("X") and the Y-axis direction ("Y"), and the acceleration in the X-, Y-, and Z-axis ("Z") directions depends on the horizontal inclination angle ("θ" (theta) shown in FIG. 4).
- XY-Attitude = arctan(Y-mean / X-mean) ... (Equation 1)
- YZ-Attitude = arctan(Z-mean / Y-mean) ... (Equation 2)
- ZX-Attitude = arctan(X-mean / Z-mean) ... (Equation 3)
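Under the reconstruction of Equations 1 to 3 above, the attitude computation of step S108 might look as follows; atan2 is used instead of plain arctan to preserve quadrant information, which is an assumption rather than something the patent states:

```python
import math

def attitudes(x_mean: float, y_mean: float, z_mean: float) -> dict[str, float]:
    """Compute XY-, YZ-, and ZX-Attitude (in radians) from the average
    accelerations along the sensor's X, Y, and Z axes (step S108)."""
    return {
        "XY-Attitude": math.atan2(y_mean, x_mean),  # Equation 1
        "YZ-Attitude": math.atan2(z_mean, y_mean),  # Equation 2
        "ZX-Attitude": math.atan2(x_mean, z_mean),  # Equation 3
    }
```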
- The information processing apparatus according to the present embodiment then recognizes the mounting position (S110 to S118) based on the calculation results of steps S104 to S108, which are derived from the detection value of the sensor corresponding to the setting target device, and the conditions corresponding to the mountable positions ("mounting position X", "mounting position Y", etc. shown in FIG. 3).
- FIGS. 6 and 7 are explanatory diagrams for explaining the first example of the mounting position recognition process according to the present embodiment.
- FIG. 6 shows an example of the calculation result of steps S104 to S108 based on the detection value of the sensor corresponding to the setting target device when the setting target device is worn on the wrist part of the user.
- FIG. 7 shows an example of the calculation results of steps S104 to S108 based on the detection value of the sensor corresponding to the setting target device when the setting target device is worn on the user's waist.
- A1, A2, B1, B2, C1, and C2 shown in FIG. 6 and A, B, and C shown in FIG. 7 are examples of the calculation results of steps S104 to S108 based on the detection values of the sensor corresponding to the setting target device while the user performs a walking motion.
- The information processing apparatus recognizes the mounting position by performing threshold determinations corresponding to the positions where the sensor corresponding to the setting target apparatus can be mounted, as illustrated in, for example, steps S110 and S114 of FIG. 3.
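- As an illustration of such a threshold determination, the check for a single mountable position might look like the sketch below. It mirrors the constraint style of the pseudocode given later in the original description (if (steps > threshold) then (e1 = TRUE), and so on); all concrete threshold values here are assumptions, not values from the disclosure.

```python
def matches_wrist_conditions(steps: int, xy_attitude: float, z_mean: float) -> bool:
    """Threshold determination for one mountable position (e.g. the wrist).
    Each flag corresponds to one constraint; the position matches only
    if all constraints hold. Thresholds are illustrative."""
    e1 = steps > 100                    # enough walking was observed
    e2 = -0.5 < xy_attitude < 0.5       # posture (attitude) constraint
    e3 = -2.0 < z_mean < 2.0            # mean-acceleration constraint
    return e1 and e2 and e3
```

- A determination like this is run for each mountable position ("mounting position X", "mounting position Y", and so on), and a position whose conditions all hold is recognized as the mounting position.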
- When none of the conditions corresponding to the mountable positions is satisfied, the information processing apparatus according to the present embodiment may recognize that the mounting position is unknown, as illustrated in step S118. When the mounting position is recognized as unknown, the information processing apparatus performs preset default processing in, for example, the process (1) (behavior recognition mode setting process) and the process (3) (execution control process).
- In the second example, the information processing apparatus recognizes the mounting position based on the detection value of the sensor corresponding to the setting target apparatus and on the output of a reference device that serves as a reference for mounting position recognition.
- Examples of the reference device include a sensor used for action recognition corresponding to the setting target device, such as a barometric pressure sensor.
- When the reference device according to the present embodiment is a sensor used for action recognition corresponding to the setting target device, the output of the reference device is the detection value of that sensor.
- FIG. 8 is an explanatory diagram for explaining a second example of the mounting position recognition processing according to the present embodiment.
- FIG. 8 shows an example of a table used for mounting position recognition in a case where the reference device is a barometric pressure sensor attached to the user's waist and the sensors used for action recognition corresponding to the setting target devices mountable on each part of the user also include barometric pressure sensors.
- In this case, the information processing apparatus subtracts the detection value of the barometric pressure sensor attached to the waist (the output of the reference device) from the detection value of the sensor corresponding to the setting target apparatus, identifies which mounting position in the table of FIG. 8 the difference corresponds to, and recognizes the identified position as the mounting position.
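- A minimal sketch of this subtraction-and-lookup, assuming purely illustrative pressure-difference bands (the actual table of FIG. 8 is not reproduced here):

```python
# Hypothetical table keyed by the difference between the detection value of
# the device's barometric pressure sensor and the waist reference reading.
# Bands are (lower, upper) in hPa; a position higher than the waist sees
# lower pressure, so its band is negative.
PRESSURE_BANDS = {
    "head":  (-0.20, -0.08),
    "chest": (-0.08, -0.02),
    "waist": (-0.02,  0.02),
    "ankle": ( 0.04,  0.15),
}

def recognize_by_pressure(device_hpa: float, waist_ref_hpa: float):
    diff = device_hpa - waist_ref_hpa
    for position, (lo, hi) in PRESSURE_BANDS.items():
        if lo <= diff <= hi:
            return position
    return None  # no band matched: mounting position unknown
```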
- the reference device according to the present embodiment is not limited to a sensor used for action recognition corresponding to a setting target device.
- The reference device according to the present embodiment may also be a device other than a sensor used for action recognition corresponding to the setting target device, such as an audio output device (for example, a speaker).
- In this case, the output of the reference device is an audio signal output from the audio output device, and the sensors used for action recognition corresponding to the setting target device include an audio input device such as a microphone.
- The information processing apparatus according to the present embodiment specifies, for example, the phase difference between the audio signal output from the audio output device serving as the reference device and the audio signal detected by the audio input device. The information processing apparatus then recognizes the mounting position corresponding to the specified phase difference using, for example, a table in which phase differences are associated with mounting positions, similar to the table of FIG. 8.
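- One way to obtain such a phase difference is to cross-correlate the emitted reference signal with the captured microphone signal; the lag of the correlation peak can then be looked up in the phase-difference table. The sketch below is an assumed implementation, not taken from the disclosure:

```python
import numpy as np

def lag_in_samples(emitted: np.ndarray, captured: np.ndarray) -> int:
    """Return the lag (in samples) at which the captured microphone signal
    best aligns with the emitted reference signal."""
    corr = np.correlate(captured, emitted, mode="full")
    return int(np.argmax(corr)) - (len(emitted) - 1)
```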
- When the reference device according to the present embodiment is a device other than a sensor used for action recognition corresponding to the setting target device, the reference device and that sensor are not limited to devices that handle audio signals. For example, they may be any devices capable of recognizing the mounting position using the phase difference of a signal.
- In the third example, the information processing apparatus estimates the mounting position based on the "estimation result of the user's behavior estimated from the detection values of the sensor corresponding to the setting target apparatus", and recognizes the estimated mounting position as the mounting position.
- FIG. 9 is an explanatory diagram for explaining a third example of the mounting position recognition processing according to the present embodiment.
- First, the information processing apparatus sets initial mounting position probabilities (S200).
- FIG. 10 is an explanatory diagram for explaining a third example of the mounting position recognition process according to the present embodiment.
- A of FIG. 10 shows an example of the initial mounting position probabilities, and B of FIG. 10 shows an example of the mounting position probabilities updated as a result of the process of step S204 in FIG. 9, described later.
- The information processing apparatus sets the initial mounting position probabilities by reading, for example, data in which mounting position probabilities such as those illustrated in A of FIG. 10 are set, from a storage unit (described later), a connected external recording medium, or the like. A of FIG. 10 shows an example in which the mounting probability is a constant value at every mounting position, but the mounting probabilities may instead be set from a biased probability distribution that makes use of the user's habits and the like.
- An example of such a biased probability distribution is "increasing the probability of the wrist because the user always wears the setting target device on the wrist".
- the information processing apparatus determines whether or not the mounting position estimation end condition is satisfied (S202).
- The information processing apparatus determines that the mounting position estimation end condition is satisfied when, for example, the bias of the mounting probabilities becomes sufficiently large. More specifically, the information processing apparatus according to the present embodiment determines that the end condition is satisfied when, for example, the mounting probability at some mounting position is equal to or greater than a set threshold, or exceeds that threshold.
- If it is determined in step S202 that the mounting position estimation end condition is satisfied, the information processing apparatus according to the present embodiment ends the mounting position recognition process according to the third example.
- If the end condition is not determined to be satisfied, the information processing apparatus estimates the user's behavior based on the detection values of the sensor corresponding to the setting target apparatus (S204).
- For example, the information processing apparatus multiplies the probability distribution of the mounting position probabilities illustrated in FIG. 10 by values indicating the likelihood of each behavior obtained from the detection values of the sensor corresponding to the setting target apparatus, and estimates the behavior with the larger product as the user's behavior.
- The processing related to behavior estimation may be skipped for mounting positions whose mounting probability is equal to or lower than a set probability, or lower than that probability.
- Suppose, for example, that the user performs an action for which the recognizer for soccer at the ankle outputs a likelihood of 50 [%] and the recognizer for a swing at the wrist outputs a likelihood of 60 [%]. The behavior is then estimated as follows, according to the probability distribution of the mounting position probabilities.
- When the probability distribution of the mounting position probabilities is A of FIG. 10: from "60 [%] × 12.5 > 50 [%] × 12.5", the user's behavior is estimated to be a swing.
- When the probability distribution of the mounting position probabilities is B of FIG. 10: from "60 [%] × 20 < 50 [%] × 30", the user's behavior is estimated to be soccer.
- Next, the information processing apparatus estimates the mounting position based on, for example, the behavior estimation result (S206). For example, the information processing apparatus according to the present embodiment updates the mounting position probabilities illustrated in FIG. 10 and estimates the position with the highest mounting probability as the mounting position.
- For example, when the user's behavior is estimated in step S204 to be soccer with a certainty of 80 [%], the information processing apparatus according to the present embodiment increases the mounting probability of the ankle in the mounting position probability distribution shown in FIG. 10.
- The information processing apparatus according to the present embodiment may increase the mounting probability by a predetermined amount, or may vary the amount of the increase according to the certainty.
- Conversely, when the estimated behavior is inconsistent with a given position, for example "a squat while the sensor corresponding to the setting target device is attached to the neck", the information processing apparatus according to this embodiment reduces the mounting probability of the neck.
- The information processing apparatus according to the present embodiment may decrease the mounting probability by a predetermined amount, or may vary the amount of the decrease according to the certainty (or a combination of certainties).
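- The bookkeeping of steps S204 to S206 can be sketched as a Bayes-style update: the mounting position probabilities act as a prior, each position's recognizer supplies a likelihood, and the product is renormalized. The names and the exact update rule below are illustrative assumptions, not the disclosed implementation.

```python
def update_mounting_probs(probs: dict, likelihoods: dict) -> dict:
    """probs: mounting position -> current mounting probability (sums to 1).
    likelihoods: mounting position -> certainty reported by that position's
    behavior recognizer (e.g. 0.8 for 'soccer at the ankle')."""
    posterior = {pos: p * likelihoods.get(pos, 0.0) for pos, p in probs.items()}
    total = sum(posterior.values())
    if total == 0.0:
        return probs  # no usable evidence; keep the current distribution
    return {pos: v / total for pos, v in posterior.items()}

# Example mirroring the text: the ankle recognizer reports soccer with 80%.
probs = {"wrist": 0.20, "ankle": 0.30, "neck": 0.50}
probs = update_mounting_probs(probs, {"ankle": 0.8, "wrist": 0.1, "neck": 0.1})
estimated_position = max(probs, key=probs.get)  # "ankle"
```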
- the information processing apparatus determines whether or not the mounting position probability reset condition is satisfied (S208).
- the mounting position probability reset condition is a condition for resetting the mounting position probability when the mounting position is changed.
- the information processing apparatus determines that the mounting position probability reset condition is satisfied, for example, when a signal indicating that the setting target apparatus is removed from the attachment is detected.
- The information processing apparatus may also determine that the mounting position probability reset condition is satisfied when, for example, it is detected that a behavior recognizer corresponding to a mounting position with a low mounting probability outputs an extremely high likelihood.
- If it is determined in step S208 that the mounting position probability reset condition is satisfied, the information processing apparatus according to the present embodiment performs the processing from step S200; if not, it performs the processing from step S202.
- Through the processing above, the information processing apparatus can gradually narrow down the mounting position candidates as behaviors recognized with high certainty continue.
- In the fourth example, the information processing apparatus recognizes the mounting position based on an operation signal generated by a user operation that designates the mounting position.
- Examples of the operation signal include an operation signal transmitted from an operation unit (described later) included in the information processing apparatus according to the present embodiment, and an operation signal transmitted from an external operation device such as a remote controller and received by a communication unit (described later) or by a connected external communication device.
- Examples of the user operation for specifying the mounting position include an operation in which the user specifies the mounting position by pressing a button or the like, a gesture operation in which the user performs a gesture indicating the mounting position, a voice operation in which the user speaks the mounting position, and any other operation capable of specifying the mounting position.
- the information processing apparatus recognizes the mounting position based on the detection value of the sensor corresponding to the setting target apparatus.
- Here, the sensor corresponding to the setting target device is a sensor included in the setting target device or in an external device connected to the setting target device.
- Examples of the sensor according to the fifth example of the mounting position recognition process include a button, an illuminance sensor, a proximity sensor, and an atmospheric pressure sensor.
- The sensor according to the fifth example of the mounting position recognition process may be included among the sensors used for behavior recognition corresponding to the setting target device, or may be a sensor separate from them.
- FIG. 11 is an explanatory diagram for explaining a fifth example of the mounting position recognition process according to the present embodiment.
- A of FIG. 11 shows an example of a sensor unit corresponding to the setting target device according to the fifth example of the mounting position recognition process. S1 to S4 shown in A of FIG. 11 are examples of the sensors according to the fifth example.
- B of FIG. 11 shows an attachment to which the sensor unit of A of FIG. 11 can be attached. P shown in B of FIG. 11 indicates protrusions that can correspond to each of S1 to S4 shown in A of FIG. 11.
- C and D of FIG. 11 show examples in which the protrusions P shown in B of FIG. 11 correspond to the sensors shown in A of FIG. 11.
- the correspondence relationship between the sensor shown in FIG. 11A and the protrusion P shown in FIG. 11B varies depending on the mounting position, for example.
- Here, saying that a protrusion P shown in B of FIG. 11 corresponds to a sensor shown in A of FIG. 11 means, for example, that the protrusion P presses a button (an example of the sensor), or that the protrusion P blocks an illuminance sensor, a proximity sensor, or an atmospheric pressure sensor (examples of the sensor).
- When the protrusions P correspond to the sensors S1 and S2, for example, an electric signal corresponding to the pressing is transmitted from a button (an example of the sensor), or the detection values of an illuminance sensor, a proximity sensor, or an atmospheric pressure sensor change.
- The information processing apparatus recognizes the mounting position from the position of the button (an example of the sensor) from which the electric signal is transmitted, or from the position at which the detection values of the illuminance sensor, proximity sensor, or atmospheric pressure sensor (examples of the sensor) change.
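- Because the protrusion-to-sensor correspondence differs per mounting position, the recognition itself reduces to a lookup from the set of activated sensors. A hypothetical mapping is sketched below; the actual correspondences depend on the attachments:

```python
# Hypothetical mapping from the set of activated sensors (S1..S4)
# to a mounting position.
ACTIVATION_PATTERNS = {
    frozenset({"S1", "S2"}): "wrist",
    frozenset({"S3", "S4"}): "ankle",
    frozenset({"S1", "S4"}): "waist",
}

def recognize_by_activation(activated_sensors: set):
    """Return the mounting position for the activated sensors,
    or None if the pattern is unknown."""
    return ACTIVATION_PATTERNS.get(frozenset(activated_sensors))
```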
- FIG. 11 shows an example in which the protrusions P are provided on the attachment side, but the configuration may be reversed, with the sensors (examples of the sensor corresponding to the setting target device) provided on the attachment side and the protrusions P provided on the setting target device side.
- In the sixth example, the information processing apparatus recognizes the mounting position by, for example, pattern recognition using model data learned in advance from sensor detection values at each mountable position together with the detection values of the sensor corresponding to the setting target apparatus.
- The information processing apparatus can also recognize the mounting position based on the detection values of the sensor by using, for example, the technique described in Japanese Patent Application Laid-Open No. 2006-340903 filed by the applicant of the present application.
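- As one sketch of such pattern recognition (the cited technique itself is not reproduced here), a nearest-centroid classifier over feature vectors learned in advance for each mountable position could look like the following; the feature layout and values are assumptions:

```python
import numpy as np

# Hypothetical model data: one mean feature vector per mountable position,
# e.g. [step rate, XY-Attitude, YZ-Attitude, ZX-Attitude], learned in advance.
MODEL_DATA = {
    "wrist": np.array([1.8, 0.1, 1.2, -0.3]),
    "waist": np.array([1.8, 0.9, 0.2,  0.4]),
    "ankle": np.array([1.8, 1.4, 0.7,  0.1]),
}

def recognize_by_model(features: np.ndarray) -> str:
    """Return the mounting position whose learned centroid is closest
    to the observed feature vector."""
    return min(MODEL_DATA, key=lambda p: np.linalg.norm(features - MODEL_DATA[p]))
```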
- FIG. 12 is a block diagram illustrating an example of the configuration of the information processing apparatus 100 according to the present embodiment.
- the information processing apparatus 100 includes a communication unit 102, a detection unit 104, and a control unit 106, for example.
- The information processing apparatus 100 may also include, for example, a ROM (Read Only Memory, not shown), a RAM (Random Access Memory, not shown), a storage unit (not shown), an operation unit (not shown) that the user can operate, and a display unit (not shown) that displays various screens on a display screen.
- the information processing apparatus 100 connects the above constituent elements by, for example, a bus as a data transmission path.
- a ROM (not shown) stores control data such as a program used by the control unit 106 and calculation parameters.
- a RAM (not shown) temporarily stores a program executed by the control unit 106.
- The storage unit (not shown) is storage means included in the information processing apparatus 100, and stores various data, such as data related to the information processing method according to the present embodiment (for example, the table shown in FIG. 8 and the data indicating the probability distribution of the mounting position probabilities shown in FIG. 10) and applications.
- examples of the storage unit (not shown) include a magnetic recording medium such as a hard disk, and a non-volatile memory such as a flash memory. Further, the storage unit (not shown) may be detachable from the information processing apparatus 100.
- Examples of the operation unit (not shown) include an operation input device described later, and examples of the display unit (not shown) include a display device described later.
- FIG. 13 is an explanatory diagram illustrating an example of a hardware configuration of the information processing apparatus 100 according to the present embodiment.
- the information processing apparatus 100 includes, for example, an MPU 150, a ROM 152, a RAM 154, a recording medium 156, an input / output interface 158, an operation input device 160, a display device 162, a communication interface 164, and a sensor 166.
- the information processing apparatus 100 connects each component with a bus 168 as a data transmission path, for example.
- the MPU 150 includes, for example, a processor configured by an arithmetic circuit such as an MPU (Micro Processing Unit), various processing circuits, and the like, and functions as the control unit 106 that controls the information processing apparatus 100 as a whole.
- the MPU 150 serves as, for example, a mounting position recognition unit 110, a behavior recognition mode setting unit 112, a feature extraction unit 114, a behavior recognition unit 116, and a processing control unit 118, which will be described later.
- the ROM 152 stores programs used by the MPU 150, control data such as calculation parameters, and the like.
- the RAM 154 temporarily stores a program executed by the MPU 150, for example.
- the recording medium 156 functions as a storage unit (not shown), and stores various data such as data related to the information processing method according to the present embodiment such as the table shown in FIG. 8 and applications.
- examples of the recording medium 156 include a magnetic recording medium such as a hard disk and a non-volatile memory such as a flash memory. Further, the recording medium 156 may be detachable from the information processing apparatus 100.
- the input / output interface 158 connects, for example, the operation input device 160 and the display device 162.
- the operation input device 160 functions as an operation unit (not shown)
- the display device 162 functions as a display unit (not shown).
- Examples of the input / output interface 158 include a USB (Universal Serial Bus) terminal, a DVI (Digital Visual Interface) terminal, an HDMI (High-Definition Multimedia Interface) (registered trademark) terminal, and various processing circuits.
- the operation input device 160 is provided on the information processing apparatus 100, for example, and is connected to the input / output interface 158 inside the information processing apparatus 100.
- Examples of the operation input device 160 include a button, a direction key, a rotary selector such as a jog dial, or a combination thereof.
- the display device 162 is provided on the information processing apparatus 100, for example, and is connected to the input / output interface 158 inside the information processing apparatus 100.
- Examples of the display device 162 include a liquid crystal display (Liquid Crystal Display), an organic EL display (Organic Electro-Luminescence Display, or an OLED display (Organic Light Emitting Diode Display)), and the like.
- It goes without saying that the input / output interface 158 can also be connected to external devices of the information processing apparatus 100, such as an external operation input device (for example, a keyboard or a mouse), an external display device, or an external sensor.
- the display device 162 may be a device capable of display and user operation, such as a touch screen.
- The communication interface 164 is communication means included in the information processing apparatus 100, and functions as the communication unit 102 for communicating, wirelessly or by wire, with an external device such as an external setting target device via a network (or directly). Examples of the communication interface 164 include a communication antenna and an RF (Radio Frequency) circuit (wireless communication), an IEEE 802.15.1 port and a transmission / reception circuit (wireless communication), an IEEE 802.11 port and a transmission / reception circuit (wireless communication), and a LAN (Local Area Network) terminal and a transmission / reception circuit (wired communication). Examples of the network according to the present embodiment include a wired network such as a LAN or a WAN (Wide Area Network), a wireless network such as a wireless LAN (WLAN: Wireless Local Area Network) or a wireless WAN (WWAN: Wireless Wide Area Network) via a base station, and networks using a communication protocol such as TCP / IP (Transmission Control Protocol / Internet Protocol).
- the sensor 166 is a sensor used for action recognition provided in the information processing apparatus 100 and functions as the detection unit 104.
- examples of the sensor 166 include an arbitrary sensor that can be used for processing related to user action recognition, such as an acceleration sensor, a GPS device, a gyro sensor, an atmospheric pressure sensor, a proximity sensor, and a biological sensor.
- the sensor 166 may be a sensor group having a plurality of sensors.
- The sensor 166 may also serve as the sensor related to the process (4) (mounting position recognition process) described above (the sensor according to the fifth example of the mounting position recognition process).
- the information processing apparatus 100 performs processing related to the information processing method according to the present embodiment, for example, with the configuration illustrated in FIG. 13. Note that the hardware configuration of the information processing apparatus 100 according to the present embodiment is not limited to the configuration illustrated in FIG. 13.
- For example, the information processing apparatus 100 need not include the sensor 166 when the setting target apparatus is an external device, or when the information processing apparatus 100 is itself the setting target apparatus and an external sensor having the same function as the sensor 166 is connected. The information processing apparatus 100 also need not include the communication interface 164 when it communicates with external apparatuses via, for example, a connected external communication device. Further, the information processing apparatus 100 can be configured without the recording medium 156, the operation input device 160, and the display device 162.
- the communication unit 102 is a communication unit included in the information processing apparatus 100, and communicates with an external device such as an external setting target device wirelessly or via a network (or directly).
- the communication of the communication unit 102 is controlled by the control unit 106, for example.
- examples of the communication unit 102 include a communication antenna and an RF circuit, a LAN terminal, and a transmission / reception circuit, but the configuration of the communication unit 102 is not limited to the above.
- the communication unit 102 can take a configuration corresponding to an arbitrary standard capable of performing communication, such as a USB terminal and a transmission / reception circuit, or an arbitrary configuration capable of communicating with an external device via a network.
- the detection unit 104 includes a sensor used for action recognition provided in the information processing apparatus 100 and outputs a detection value.
- As the detection unit 104, for example, any sensor that can be used for processing related to user behavior recognition, such as an acceleration sensor or a GPS device, can be used. The detection unit 104 may also be configured as a sensor group having a plurality of sensors.
- the sensor included in the detection unit 104 may serve as a sensor related to the above-described process (4) (mounting position recognition process) (a sensor according to the fifth example of the mounting position recognition process).
- the control unit 106 is composed of, for example, an MPU and plays a role of controlling the entire information processing apparatus 100.
- The control unit 106 includes, for example, a mounting position recognition unit 110, a behavior recognition mode setting unit 112, a feature extraction unit 114, a behavior recognition unit 116, and a processing control unit 118, and plays a leading role in the processing related to the information processing method according to the present embodiment.
- the control unit 106 may further include, for example, a communication control unit (not shown) that controls communication in the communication unit 102.
- a communication control unit (not shown) controls transmission / reception of various information. Note that the functions of the communication control unit (not shown) may be performed by other components such as the communication unit 102.
- the mounting position recognition unit 110 plays a leading role in performing the process (4) (mounting position recognition processing), and recognizes the mounting position.
- The mounting position recognition unit 110 recognizes the mounting position by performing, for example, any of the mounting position recognition processes from the first example shown in (4-1) above through the sixth example shown in (4-6) above.
- The information processing apparatus according to the present embodiment may further include a mounting position estimation unit (not shown) that plays the role of performing the process of step S204 in FIG. 9 and estimating the mounting position.
- The behavior recognition mode setting unit 112 plays a leading role in the process (1) (behavior recognition mode setting process), and sets the behavior recognition mode based on the mounting position information of the setting target device. For example, the behavior recognition mode setting unit 112 sets the behavior recognition mode by performing the process related to behavior recognition mode setting according to the first example shown in (1-1) above or the second example shown in (1-2) above.
- the feature extraction unit 114 extracts, for example, a feature amount corresponding to the type of feature amount used for behavior recognition corresponding to the set behavior recognition mode from the detection result of the detection unit 104. Note that the information processing apparatus according to the present embodiment may be configured without the feature extraction unit 114.
- The behavior recognition unit 116 plays a leading role in the process (2) (behavior recognition process), and recognizes the user's behavior based on the set behavior recognition mode and the detection values of the sensor corresponding to the setting target device, such as the detection values of the detection unit 104.
- the behavior recognition unit 116 recognizes a predetermined behavior based on, for example, an algorithm or model data corresponding to the set behavior recognition mode and the feature amount extracted by the feature extraction unit 114.
- When the setting target device is an external device, the behavior recognition unit 116 recognizes a predetermined behavior based on, for example, the algorithm or model data corresponding to the set behavior recognition mode and the detection values of the sensor corresponding to the setting target device.
- the process control unit 118 plays a role of leading the process (3) (execution control process), and controls the execution of the process corresponding to the user action recognized by the action recognition unit 116.
- the process control unit 118 can also control execution of processing corresponding to the mounting position indicated by the mounting position information and the user's behavior recognized by the behavior recognition unit 116, for example.
- the process control unit 118 performs, for example, the process according to the first example shown in (a) to the process according to the eighth example shown in (h).
- the control unit 106 includes, for example, a mounting position recognition unit 110, a behavior recognition mode setting unit 112, a feature extraction unit 114, a behavior recognition unit 116, and a processing control unit 118, and thus relates to the information processing method according to the present embodiment. The process is led.
- The information processing apparatus 100 has, for example, the configuration shown in FIG. 12, and performs the processes related to the information processing method according to the present embodiment (for example, the process (1) (behavior recognition mode setting process) through the process (4) (mounting position recognition process)).
- the information processing apparatus 100 can recognize the user's action with higher accuracy by using the configuration shown in FIG. 12, for example, and can control processing according to the recognized user's action.
- the information processing apparatus 100 can exhibit the effects exhibited by performing the processing related to the information processing method according to the present embodiment as described above, for example.
- The information processing apparatus according to the present embodiment may be configured without the mounting position recognition unit 110. Even in such a configuration, the information processing apparatus can perform the process (1) (behavior recognition mode setting process) through the process (3) (execution control process). Therefore, even without the mounting position recognition unit 110, the information processing apparatus according to the present embodiment can recognize the user's behavior with higher accuracy and control processing according to the recognized behavior.
- The information processing apparatus according to the present embodiment can also include one or more of the mounting position recognition unit 110, the behavior recognition mode setting unit 112, the feature extraction unit 114, the behavior recognition unit 116, and the processing control unit 118 shown in FIG. 12 separately from the control unit 106 (for example, realized by separate processing circuits).
- When the device to be set is an external device, the information processing apparatus according to this embodiment need not include the detection unit 104. Likewise, when the information processing apparatus is itself the setting target apparatus and an external sensor having the same function as the detection unit 104 is connected, the detection unit 104 may be omitted. The communication unit 102 may also be omitted when communication with external devices is performed via an external communication device having the same function and configuration as the communication unit 102.
- An information processing apparatus has been described above as the present embodiment, but the present embodiment is not limited to such a form. The present embodiment can be applied to various portable devices, such as communication devices (for example, mobile phones and smartphones), tablet-type devices, video / music playback devices (or video / music recording / playback devices), game machines, computers such as notebook PCs (Personal Computer), and wearable devices, and also to various devices that are not easy to carry, such as computers like servers and desktop PCs.
- The present embodiment can also be applied to, for example, a processing IC (Integrated Circuit) that can be incorporated into the devices above.
- An information processing system having an information processing apparatus and one or more setting target devices can also be realized, for example, as a cloud-computing-type information processing system.
- (Program related to the information processing apparatus) A program for causing a computer to function as the information processing apparatus according to the present embodiment (for example, a program capable of executing the processes related to the information processing method according to the present embodiment, such as "the process (1) (behavior recognition mode setting process) through the process (3) (execution control process)" or "the process (1) (behavior recognition mode setting process) through the process (4) (mounting position recognition process)") is executed by a processor or the like in the computer, thereby making it possible to recognize the user's behavior with higher accuracy and to control processing according to the recognized behavior. In addition, when such a program is executed by a processor or the like in the computer, the effects produced by the processing related to the information processing method according to the present embodiment described above can be obtained.
- Although it has been described above that a program (computer program) for causing a computer to function as the information processing apparatus according to the present embodiment is provided, the present embodiment can further provide a recording medium in which the program is stored.
- (1) An information processing apparatus including: a behavior recognition mode setting unit that sets a behavior recognition mode based on mounting position information of a setting target device; a behavior recognition unit that recognizes a user's behavior based on the set behavior recognition mode and a detection value of a sensor corresponding to the setting target device; and a processing control unit that controls execution of processing corresponding to the recognized behavior of the user.
- (2) The information processing apparatus according to (1), wherein the behavior recognition mode includes settings related to the sensor, and the behavior recognition mode setting unit performs, as the setting of the behavior recognition mode, settings related to the sensor based on the mounting position information for the sensor corresponding to the setting target device.
- (3) The information processing apparatus according to (2), wherein the settings related to the sensor include one or both of a setting of a sensor type and a setting of a parameter of the sensor.
- (4) The information processing apparatus according to any one of (1) to (3), wherein the behavior recognition mode includes settings related to processing for behavior recognition, and the behavior recognition mode setting unit performs, as the setting of the behavior recognition mode, settings related to the processing for behavior recognition based on the mounting position information.
- (5) The information processing apparatus according to (4), wherein the settings related to the processing for behavior recognition include one or more of: a setting of the type of feature amount used for behavior recognition from among the detection values of the sensor corresponding to the setting target device; a setting of an algorithm used for the processing for behavior recognition; and a setting of model data used for the processing for behavior recognition.
- (6) The information processing apparatus according to any one of (1) to (5), further including a mounting position recognition unit that recognizes a mounting position at which the setting target device is worn by the user, wherein the behavior recognition mode setting unit sets the behavior recognition mode based on the mounting position information indicating the mounting position recognized by the mounting position recognition unit, and the processing control unit controls execution of processing based on the mounting position information indicating the mounting position recognized by the mounting position recognition unit.
- (7) The information processing apparatus according to (6), wherein the mounting position recognition unit recognizes the mounting position based on a detection value of the sensor corresponding to the setting target device and a condition corresponding to each position where the sensor can be mounted.
- (8) The information processing apparatus according to (6), wherein the mounting position recognition unit recognizes the mounting position based on a detection value of the sensor corresponding to the setting target device and an output of a reference device serving as a reference for recognition of the mounting position.
- (9) The information processing apparatus according to (6), wherein the mounting position recognition unit estimates the mounting position based on an estimation result of the user's behavior estimated from a detection value of the sensor corresponding to the setting target device, and recognizes the estimated mounting position as the mounting position.
- (10) The information processing apparatus according to (6), wherein the mounting position recognition unit recognizes the mounting position based on an operation signal generated by a user operation that designates the mounting position.
- (11) The information processing apparatus according to (6), wherein the mounting position recognition unit recognizes the mounting position based on a detection value of the sensor corresponding to the setting target device.
- (12) The information processing apparatus according to (6), wherein the mounting position recognition unit recognizes the mounting position based on a detection value of the sensor corresponding to the setting target device and model data learned in advance for each position where the sensor can be mounted.
- (13) The information processing apparatus according to any one of (1) to (12), further including a detection unit including the sensor corresponding to the setting target device, wherein the behavior recognition unit recognizes the user's behavior based on a detection value of the detection unit.
- (14) An information processing method executed by an information processing apparatus, the method including: a step of setting a behavior recognition mode based on mounting position information of a setting target device; a step of recognizing a user's behavior based on the set behavior recognition mode and a detection value of a sensor corresponding to the setting target device; and a step of controlling execution of processing corresponding to the recognized behavior of the user.
- (15) A program for causing a computer to execute: a step of setting a behavior recognition mode based on mounting position information of a setting target device; a step of recognizing a user's behavior based on the set behavior recognition mode and a detection value of a sensor corresponding to the setting target device; and a step of controlling execution of processing corresponding to the recognized behavior of the user.
Abstract
Description
1. Information processing method according to the present embodiment
2. Information processing apparatus according to the present embodiment
3. Program according to the present embodiment
Before describing the configuration of the information processing apparatus according to the present embodiment, the information processing method according to the present embodiment is described first. In the following, the information processing method according to the present embodiment is described taking as an example the case where the processing related to the method is performed by the information processing apparatus according to the present embodiment.
The information processing apparatus according to the present embodiment sets, for the setting target device, a behavior recognition mode based on the mounting position information.
- Head-mounted device (A of FIG. 2): for example, an HMD (Head Mounted Display) or an imaging device
- Eyewear-type device (B of FIG. 2): for example, an HMD or a glasses-type device
- Neck-worn device (C of FIG. 2): for example, an imaging device, a headset, a necklace-type device, or a data logger
- Wrist/arm-worn device (D of FIG. 2): for example, a watch-type device, a data logger, a bracelet-type device, or a wristband-type device
- Hand/finger-worn device (E1 of E of FIG. 2): for example, a glove-type device or a ring-type device
- Waist/jacket/pocket-worn device (E2 of E of FIG. 2): for example, a belt-type device, a clip/magnet-type device, or a data logger
- Ankle/foot-worn device (E3 of E of FIG. 2): for example, an anklet-type device or a data logger
- Mounting position "head": swimming recognition (stroke style, number of turns, etc.), nod recognition
- Mounting position "jaw": recognition of whether the user is speaking, chewing-count recognition, food-type recognition
- Mounting position "neck": training recognition (squats, push-ups, etc.)
- Mounting position "chest": recognition of whether the user is breathing (body movement)
- Mounting position "ankle": soccer recognition (kick motion, etc.), bicycle cadence recognition
- Mounting position "finger": typing recognition
- Mounting position "wrist": swing recognition (baseball, tennis, golf, etc.)
- Mounting position "trouser pocket": vehicle recognition (train, bus, etc.)
The information processing apparatus according to the present embodiment performs, for the sensor corresponding to the setting target device, settings related to the sensor based on the mounting position information.
- Data indicating the types of sensors to be enabled (for example, sensor IDs)
- Data indicating sensor parameters
The information processing apparatus according to the present embodiment performs settings related to the processing for behavior recognition based on the mounting position information; examples of such setting data are listed below, followed by an illustrative payload.
- Data indicating the type of feature amount (for example, an ID indicating the feature amount)
- Data indicating the algorithm used for the behavior recognition processing (for example, program data or an ID indicating the algorithm)
- Data indicating the model data used for the behavior recognition processing (for example, the model data itself or an ID indicating the model data)
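For illustration, the sensor settings and the behavior-recognition settings carried by one behavior recognition mode could be bundled into a payload like the following Python sketch; every field name here is hypothetical and not from the disclosure:

```python
# Hypothetical behavior-recognition-mode payload for mounting position "wrist".
WRIST_MODE = {
    "enabled_sensors": ["acceleration"],      # data indicating sensors to enable
    "sensor_params": {"sampling_hz": 50},     # data indicating sensor parameters
    "features": ["mean", "variance"],         # type of feature amounts to use
    "algorithm_id": "swing_recognizer_v1",    # algorithm for behavior recognition
    "model_data_id": "swing_model_wrist",     # model data for behavior recognition
}
```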
The information processing apparatus according to the present embodiment recognizes the user's behavior based on the set behavior recognition mode and the detection values of the sensor corresponding to the setting target device.
The information processing apparatus according to the present embodiment controls execution of processing corresponding to the recognized behavior of the user. The information processing apparatus may also control execution of processing corresponding to both the mounting position indicated by the mounting position information and the recognized behavior. In the following, the processing controlled by the execution control process according to the present embodiment is referred to as the "processing corresponding to the behavior".
When training recognition such as squats or push-ups is performed in the process (2) (behavior recognition process) above, the information processing apparatus according to the present embodiment specifies, as the processing corresponding to the behavior, for example "processing for launching an application that gives encouragement by voice", and causes the wearable device worn by the user to output the voice.
When whether the user is breathing is recognized in the process (2) (behavior recognition process) above, the information processing apparatus specifies, for example, "processing for launching an application that makes it possible to share one's breathing pace with friends", and causes the wearable device worn by the user to launch that application.
When whether the user is breathing during sleep is recognized in the process (2) (behavior recognition process) above, the information processing apparatus specifies, for example, "processing for launching an application for checking for sleep apnea syndrome", and causes the wearable device worn by the user to launch that application.
When soccer recognition is performed in the process (2) (behavior recognition process) above, the information processing apparatus specifies, for example, "processing related to a soccer-enhancement function", and causes the wearable device worn by the user to execute that processing.
- Processing in which the degree of fatigue is estimated from the distance run and the manner of movement, and the wearable device worn on the ankle outputs light of a color corresponding to the estimation result: the output light can be used, for example, as a guide for the coach's decisions or as a reference for the players' strategy.
- Processing in which the wearable device worn on the ankle presents the kick speed and the strength of impact as sounds (for example, a light "swish" for a weak kick or a heavy "thud" for a strong kick): presenting such information as sound augments the information available during a match and can make watching the game more enjoyable.
When bicycle cadence recognition is performed in the process (2) (behavior recognition process) above, the information processing apparatus specifies, for example, "processing related to a cycling function or processing related to a training function". The information processing apparatus determines which processing to specify based on, for example, the cadence, and causes the wearable device worn by the user to execute the specified processing.
When typing recognition is performed in the process (2) (behavior recognition process) above, the information processing apparatus specifies, for example, "processing for giving feedback to the user (for example, feedback by audio (including music), vibration, text, or light)", and causes the wearable device worn by the user to execute that processing.
When swing recognition for tennis, golf, baseball, or the like is performed in the process (2) (behavior recognition process) above, the information processing apparatus specifies, for example, "processing for capturing a moving image". The information processing apparatus may further specify "processing for editing the captured moving image" as the processing corresponding to the behavior.
When cycling or walking is recognized in the process (2) (behavior recognition process) above, the information processing apparatus specifies, for example, "processing for notifying the user of the route", and causes the device worn indirectly in the trouser pocket to execute that processing.
For example, the information processing apparatus according to the present embodiment can further perform a mounting position recognition process that recognizes the mounting position at which the setting target device is worn by the user.
The information processing apparatus according to the present embodiment recognizes the mounting position based on the detection values of the sensor corresponding to the setting target device and conditions corresponding to the positions where that sensor can be mounted.
XY-Attitude = arctan(Y-mean / |X-mean|)   (Equation 1)
YZ-Attitude = arctan(Z-mean / |Y-mean|)   (Equation 2)
ZX-Attitude = arctan(X-mean / |Z-mean|)   (Equation 3)

- if (steps > threshold) then (e1 = TRUE)
- if (th_min < XY-attitude < th_max) then (e2 = TRUE)
- if (th_min < Z-mean < th_max) then (e3 = TRUE)
- if (e1 × e2 × e3 == 1) then TRUE else FALSE
The information processing apparatus according to the present embodiment recognizes the mounting position based on the detection values of the sensor corresponding to the setting target device and the output of a reference device serving as a reference for mounting position recognition.
The information processing apparatus according to the present embodiment estimates the mounting position based on the "estimation result of the user's behavior estimated from the detection values of the sensor corresponding to the setting target device", and recognizes the estimated mounting position as the mounting position.
- When the probability distribution of the mounting position probabilities is A of FIG. 10: from "60 [%] × 12.5 > 50 [%] × 12.5", the user's behavior is estimated to be a swing.
- When the probability distribution of the mounting position probabilities is B of FIG. 10: from "60 [%] × 20 < 50 [%] × 30", the user's behavior is estimated to be soccer.
The information processing apparatus according to the present embodiment recognizes the mounting position based on an operation signal generated by a user operation that designates the mounting position.
The information processing apparatus according to the present embodiment recognizes the mounting position based on the detection values of a sensor corresponding to the setting target device.
The information processing apparatus according to the present embodiment recognizes the mounting position based on the detection values of the sensor corresponding to the setting target device and model data learned in advance for each position where the sensor can be mounted.
Next, an example of the configuration of the information processing apparatus according to the present embodiment, which is capable of performing the processing related to the information processing method described above, is described.
FIG. 13 is an explanatory diagram showing an example of the hardware configuration of the information processing apparatus 100 according to the present embodiment. The information processing apparatus 100 includes, for example, an MPU 150, a ROM 152, a RAM 154, a recording medium 156, an input / output interface 158, an operation input device 160, a display device 162, a communication interface 164, and a sensor 166. The information processing apparatus 100 connects these components by, for example, a bus 168 as a data transmission path.
[i] Program related to the information processing apparatus
A program for causing a computer to function as the information processing apparatus according to the present embodiment (for example, a program capable of executing the processing related to the information processing method according to the present embodiment, such as "the process (1) (behavior recognition mode setting process) through the process (3) (execution control process)" or "the process (1) (behavior recognition mode setting process) through the process (4) (mounting position recognition process)") is executed by a processor or the like in the computer, whereby the user's behavior can be recognized with higher accuracy and processing corresponding to the recognized behavior can be controlled. Furthermore, when such a program is executed by a processor or the like in the computer, the effects produced by the processing related to the information processing method according to the present embodiment described above can be obtained.
102 communication unit
104 detection unit
106 control unit
110 mounting position recognition unit
112 behavior recognition mode setting unit
114 feature extraction unit
116 behavior recognition unit
118 processing control unit
Claims (15)
- An information processing apparatus comprising: a behavior recognition mode setting unit that sets a behavior recognition mode based on mounting position information of a setting target device; a behavior recognition unit that recognizes a user's behavior based on the set behavior recognition mode and a detection value of a sensor corresponding to the setting target device; and a processing control unit that controls execution of processing corresponding to the recognized behavior of the user.
- The information processing apparatus according to claim 1, wherein the behavior recognition mode includes settings related to the sensor, and the behavior recognition mode setting unit performs, as the setting of the behavior recognition mode, settings related to the sensor based on the mounting position information for the sensor corresponding to the setting target device.
- The information processing apparatus according to claim 2, wherein the settings related to the sensor include one or both of a setting of a sensor type and a setting of a parameter of the sensor.
- The information processing apparatus according to claim 1, wherein the behavior recognition mode includes settings related to processing for behavior recognition, and the behavior recognition mode setting unit performs, as the setting of the behavior recognition mode, settings related to the processing for behavior recognition based on the mounting position information.
- The information processing apparatus according to claim 4, wherein the settings related to the processing for behavior recognition include one or more of: a setting of the type of feature amount used for behavior recognition from among the detection values of the sensor corresponding to the setting target device; a setting of an algorithm used for the processing for behavior recognition; and a setting of model data used for the processing for behavior recognition.
- The information processing apparatus according to claim 1, further comprising a mounting position recognition unit that recognizes a mounting position at which the setting target device is worn by the user, wherein the behavior recognition mode setting unit sets the behavior recognition mode based on the mounting position information indicating the mounting position recognized by the mounting position recognition unit, and the processing control unit controls execution of processing based on the mounting position information indicating the mounting position recognized by the mounting position recognition unit.
- The information processing apparatus according to claim 6, wherein the mounting position recognition unit recognizes the mounting position based on a detection value of the sensor corresponding to the setting target device and a condition corresponding to each position where the sensor can be mounted.
- The information processing apparatus according to claim 6, wherein the mounting position recognition unit recognizes the mounting position based on a detection value of the sensor corresponding to the setting target device and an output of a reference device serving as a reference for recognition of the mounting position.
- The information processing apparatus according to claim 6, wherein the mounting position recognition unit estimates the mounting position based on an estimation result of the user's behavior estimated from a detection value of the sensor corresponding to the setting target device, and recognizes the estimated mounting position as the mounting position.
- The information processing apparatus according to claim 6, wherein the mounting position recognition unit recognizes the mounting position based on an operation signal generated by a user operation that designates the mounting position.
- The information processing apparatus according to claim 6, wherein the mounting position recognition unit recognizes the mounting position based on a detection value of the sensor corresponding to the setting target device.
- The information processing apparatus according to claim 6, wherein the mounting position recognition unit recognizes the mounting position based on a detection value of the sensor corresponding to the setting target device and model data learned in advance for each position where the sensor can be mounted.
- The information processing apparatus according to claim 1, further comprising a detection unit including the sensor corresponding to the setting target device, wherein the behavior recognition unit recognizes the user's behavior based on a detection value of the detection unit.
- An information processing method executed by an information processing apparatus, the method comprising: a step of setting a behavior recognition mode based on mounting position information of a setting target device; a step of recognizing a user's behavior based on the set behavior recognition mode and a detection value of a sensor corresponding to the setting target device; and a step of controlling execution of processing corresponding to the recognized behavior of the user.
- A program for causing a computer to execute: a step of setting a behavior recognition mode based on mounting position information of a setting target device; a step of recognizing a user's behavior based on the set behavior recognition mode and a detection value of a sensor corresponding to the setting target device; and a step of controlling execution of processing corresponding to the recognized behavior of the user.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/110,440 US10592812B2 (en) | 2014-01-20 | 2014-10-16 | Information processing apparatus and information processing method |
JP2015557708A JP6508061B2 (ja) | 2014-01-20 | 2014-10-16 | 情報処理装置、情報処理方法、およびプログラム |
EP14878498.6A EP3098688A4 (en) | 2014-01-20 | 2014-10-16 | Information processing device, information processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-007920 | 2014-01-20 | ||
JP2014007920 | 2014-01-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015107737A1 true WO2015107737A1 (ja) | 2015-07-23 |
Family
ID=53542647
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/077597 WO2015107737A1 (ja) | 2014-01-20 | 2014-10-16 | 情報処理装置、情報処理方法、およびプログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US10592812B2 (ja) |
EP (1) | EP3098688A4 (ja) |
JP (1) | JP6508061B2 (ja) |
WO (1) | WO2015107737A1 (ja) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016105166A1 (en) * | 2014-12-26 | 2016-06-30 | Samsung Electronics Co., Ltd. | Device and method of controlling wearable device |
US10555021B2 (en) * | 2015-08-31 | 2020-02-04 | Orcam Technologies Ltd. | Systems and methods for selecting content based on a user's behavior |
US11040263B2 (en) * | 2015-09-29 | 2021-06-22 | Sony Corporation | Sensing system, sensor device, and sensor fixture |
- CN107145834B * | 2017-04-12 | 2020-06-30 | Zhejiang University of Technology | An adaptive behavior recognition method based on physical attributes |
- JP6525181B1 (ja) * | 2018-05-27 | 2019-06-05 | 株式会社アジラ | Behavior estimation device |
- CN110896495A (zh) * | 2019-11-19 | 2020-03-20 | Beijing Bytedance Network Technology Co., Ltd. | View adjustment method and apparatus for a target device, electronic device, and medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2004184351A * | 2002-12-06 | 2004-07-02 | Toshiba Corp | Motion information measurement system and motion information measurement method |
- JP2006192276A * | 2005-01-14 | 2006-07-27 | Samsung Electronics Co Ltd | Method and apparatus for monitoring activity patterns |
- JP2006340903A | 2005-06-09 | 2006-12-21 | Sony Corp | Behavior recognition apparatus, method, and program |
- US20090326406A1 * | 2008-06-26 | 2009-12-31 | Microsoft Corporation | Wearable electromyography-based controllers for human-computer interface |
- JP2010134802A | 2008-12-05 | 2010-06-17 | Sony Corp | Information processing apparatus and information processing method |
- JP2012522561A * | 2009-04-03 | 2012-09-27 | Koninklijke Philips Electronics N.V. | Method and system for detecting a fall of a user |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7918808B2 (en) * | 2000-09-20 | 2011-04-05 | Simmons John C | Assistive clothing |
US8187182B2 (en) * | 2008-08-29 | 2012-05-29 | Dp Technologies, Inc. | Sensor fusion for activity identification |
EP2437696B1 (en) * | 2009-06-05 | 2019-04-03 | Advanced Brain Monitoring, Inc. | Systems and methods for controlling position |
EP2892421A1 (en) * | 2012-09-04 | 2015-07-15 | Whoop, Inc. | Systems, devices and methods for continuous heart rate monitoring and interpretation |
- JP6466420B2 (ja) * | 2013-05-31 | 2019-02-06 | President and Fellows of Harvard College | Soft exosuit for assistance with human motion |
- 2014-10-16 WO PCT/JP2014/077597 patent/WO2015107737A1/ja active Application Filing
- 2014-10-16 JP JP2015557708A patent/JP6508061B2/ja active Active
- 2014-10-16 EP EP14878498.6A patent/EP3098688A4/en not_active Ceased
- 2014-10-16 US US15/110,440 patent/US10592812B2/en active Active
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2019076373A (ja) * | 2017-10-24 | 2019-05-23 | Aisin Seiki Co., Ltd. | Information processing device |
- JP7006128B2 (ja) | 2017-10-24 | 2022-01-24 | Aisin Corporation | Information processing device |
- JP2019101811A (ja) * | 2017-12-04 | 2019-06-24 | Fujitsu Limited | Processing program, processing method, processing device, display program, display method, and display control device |
- CN111433831A (zh) * | 2017-12-27 | 2020-07-17 | Sony Corporation | Information processing device, information processing method, and program |
- US11508344B2 (en) | 2017-12-27 | 2022-11-22 | Sony Corporation | Information processing device, information processing method and program |
- JP2020150360A (ja) * | 2019-03-12 | 2020-09-17 | Panasonic i-PRO Sensing Solutions Co., Ltd. | Wearable camera and video data generation method |
- CN116649959A (zh) * | 2023-05-31 | 2023-08-29 | 北京欧应科技有限公司 | Monitoring system, method for determining the positioning of a wearable device, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
JP6508061B2 (ja) | 2019-05-08 |
US20160335557A1 (en) | 2016-11-17 |
US10592812B2 (en) | 2020-03-17 |
JPWO2015107737A1 (ja) | 2017-03-23 |
EP3098688A1 (en) | 2016-11-30 |
EP3098688A4 (en) | 2017-07-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- WO2015107737A1 (ja) | Information processing apparatus, information processing method, and program | |
US11861073B2 (en) | Gesture recognition | |
KR102403212B1 (ko) | 제거가능한 모듈을 갖는 운동 밴드 | |
KR101830558B1 (ko) | 목표 동기부여를 제공하도록 구성되는 피트니스 디바이스 | |
US11113515B2 (en) | Information processing device and information processing method | |
EP3304953B1 (en) | Transmitting athletic data using non-connected state of discovery signal | |
JP6354461B2 (ja) | フィードバック提供方法、システム、および解析装置 | |
US10310836B2 (en) | Athletic activity data device firmware update | |
- JP2015205072A (ja) | Information processing apparatus, information processing method, and computer program |
US10313868B2 (en) | Athletic data aggregation and display system | |
US10758801B1 (en) | Method and system for proper kicking technique | |
- JP6471694B2 (ja) | Information processing apparatus, information processing method, program, and information processing system |
Legal Events
Code | Title | Description
---|---|---
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14878498; Country of ref document: EP; Kind code of ref document: A1
ENP | Entry into the national phase | Ref document number: 2015557708; Country of ref document: JP; Kind code of ref document: A
WWE | Wipo information: entry into national phase | Ref document number: 15110440; Country of ref document: US
REEP | Request for entry into the european phase | Ref document number: 2014878498; Country of ref document: EP
WWE | Wipo information: entry into national phase | Ref document number: 2014878498; Country of ref document: EP
NENP | Non-entry into the national phase | Ref country code: DE