WO2022195661A1 - Emotion estimation device and emotion estimation method - Google Patents
Emotion estimation device and emotion estimation method
- Publication number: WO2022195661A1 (PCT application PCT/JP2021/010340)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- emotion
- subjects
- acquisition unit
- biometric information
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0075—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence by spectroscopy, i.e. measuring spectra, e.g. Raman spectroscopy, infrared absorption spectroscopy
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/01—Measuring temperature of body parts ; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/05—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
- A61B5/053—Measuring electrical impedance or conductance of a portion of the body
- A61B5/0531—Measuring skin impedance
- A61B5/0533—Measuring galvanic skin response
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/318—Heart-related electrical modalities, e.g. electrocardiography [ECG]
- A61B5/332—Portable devices specially adapted therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/318—Heart-related electrical modalities, e.g. electrocardiography [ECG]
- A61B5/346—Analysis of electrocardiograms
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7271—Specific aspects of physiological measurement analysis
- A61B5/7278—Artificial waveform generation or derivation, e.g. synthesizing signals from measured signals
Definitions
- the present disclosure relates to an emotion estimation device and an emotion estimation method for estimating a person's comfort.
- Patent Document 1 describes an emotion determination device that estimates the brain activity of a subject based on acquired facial skin temperature data or facial blood circulation volume data, and determines the subject's emotion based on the estimated brain activity.
- Specifically, the emotion determination device has an emotion storage unit that stores estimation results of brain activity in association with emotion data representing human emotions.
- The emotion determination device determines the subject's emotion by extracting, from the emotion storage unit, the emotion data associated with the brain activity estimated based on the facial skin temperature data or facial blood circulation volume data. It should be noted that the brain activity estimation results stored in the emotion storage unit are associated with the emotion data based on brain activity data collected while a specific stimulus was given to a person for a period of time.
- That is, the emotion determination device collects brain activity data after putting a person into one of the states of joy, anger, sorrow, or pleasure by some method, and thereby associates the emotion data with the brain activity estimation results.
- In other words, in the technique of Patent Document 1, a database is constructed that associates emotion information indicating the emotion after a change with biometric information, such as brain activity estimation results, obtained after that change in emotion.
- However, because the database used for estimating emotions is obtained from experiments in which specific stimuli are given to people, or experiments in which specific environmental factors are varied, there is a possibility that emotions cannot be estimated with high accuracy due to influences such as differences between the experimental environment and the daily environment, or individual differences.
- the present disclosure has been made to solve the above problems, and aims to provide an emotion estimation device and an emotion estimation method that can improve the estimation accuracy of estimating a subject's emotion.
- An emotion estimating device according to the present disclosure is an emotion estimating device that estimates the emotions of a plurality of subjects present in an estimation space, and includes: a preliminary biological information acquisition unit that acquires and analyzes first biological information indicating the state of the body of each of the plurality of subjects present in a preparation space; a preliminary emotion information acquisition unit that acquires emotion information indicating the emotion of each of the plurality of subjects present in the preparation space; a relationship building unit that builds related information in which the first biometric information and the emotion information corresponding to the first biometric information are associated with each other; an estimated biometric information acquisition unit that acquires and analyzes second biometric information indicating the state of the body of any one of the plurality of subjects present in the estimation space; and an estimating unit that extracts the emotion information associated in the related information with the first biometric information equal to the second biometric information, and estimates the emotion of that subject present in the estimation space based on the extracted emotion information.
- An emotion estimation method according to the present disclosure is an emotion estimation method executed by an emotion estimation device that estimates the emotions of a plurality of subjects present in an estimation space, and includes: a preparation biometric information acquisition step of acquiring and analyzing first biometric information indicating the state of the body of each of the plurality of subjects present in a preparation space; an emotion information acquisition step of acquiring emotion information indicating the emotion of each of the plurality of subjects present in the preparation space; an association construction step of constructing related information in which the first biometric information and the emotion information corresponding to the first biometric information are associated with each other; an estimation biometric information acquisition step of acquiring and analyzing second biometric information indicating the state of the body of any one of the plurality of subjects present in the estimation space; and an estimation step of extracting the emotion information associated in the related information with the first biometric information equal to the second biometric information, and estimating the emotion of that subject present in the estimation space based on the extracted emotion information.
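Treating the biometric and emotion information as simple numeric values, the claimed steps can be outlined as in the following sketch. The function names are hypothetical, and the use of a least-squares line as the related information is an assumption for illustration, not the patented implementation.

```python
def emotion_estimation_method(prep_bio, prep_emotion, second_bio):
    """Hypothetical outline of the claimed steps: build related
    information from preparation-space samples, then estimate an
    emotion from second biometric information."""
    n = len(prep_bio)
    mean_b = sum(prep_bio) / n
    mean_e = sum(prep_emotion) / n
    # Association construction step: related information as a
    # least-squares formula, emotion ~ slope * biometric + intercept.
    slope = (sum((b - mean_b) * (e - mean_e)
                 for b, e in zip(prep_bio, prep_emotion))
             / sum((b - mean_b) ** 2 for b in prep_bio))
    intercept = mean_e - slope * mean_b
    # Estimation step: substitute the analyzed second biometric value
    # into the formula to obtain the estimated emotion value.
    return slope * second_bio + intercept
```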
- FIG. 1 is a diagram illustrating functional blocks of an emotion estimation device according to Embodiment 1;
- FIG. 2 is a schematic diagram illustrating a hardware configuration of an emotion estimation device according to Embodiment 1;
- FIG. 4 is a flowchart illustrating emotion estimation processing by the emotion estimation device according to Embodiment 1;
- FIG. 10 is a flowchart illustrating emotion estimation processing according to Embodiment 2;
- FIG. 14 is a flowchart illustrating emotion estimation processing according to Embodiment 3;
- Emotion estimation apparatus 100 is an apparatus that estimates the emotion of a subject present in a space such as a specific room, for example a room in an office building, or another indoor space, based on the subject's biometric information described later. Below, such a space, that is, the specific room or indoor space, may be referred to as an estimated space.
- FIG. 1 is a diagram illustrating functional blocks of an emotion estimation device according to Embodiment 1.
- the emotion estimation device 100 includes a storage unit 1 , a preliminary biometric information acquisition unit 2 , a preliminary emotion information acquisition unit 3 , a relation building unit 4 , an estimated biometric information acquisition unit 5 , an estimation unit 6 and an output unit 7 .
- the storage unit 1 stores related information for the emotion estimation device 100 to estimate the subject's emotion.
- Emotion estimation apparatus 100 refers to the related information when estimating the subject's emotion.
- The preparation biological information acquisition unit 2 includes one or more sensors. Below, these sensors may be referred to as preparation sensors.
- the preparation biometric information acquisition unit 2 acquires first biometric information as a sample from each of a plurality of subjects present in the preparation space using one or more preparation sensors.
- a preparation space is a space, such as a test room, for acquiring first biological information from a plurality of subjects.
- the preparation space may be the same space as the estimated space described above, a space whose purpose of being in the room is the same as that of the estimated space, or a space similar to the estimated space.
- The space similar to the estimated space may be, for example, a space in which the temperature difference from the estimated space is equal to or less than a predetermined temperature difference threshold, or a space in which the humidity difference from the estimated space is equal to or less than a predetermined humidity difference threshold.
- Alternatively, the similar space may be a space in which the illuminance difference from the estimated space is equal to or less than a predetermined illuminance difference threshold, or a space in which the area difference from the estimated space is equal to or less than a predetermined area difference threshold. Note that the similar spaces described above are examples, and the estimated space and similar spaces are not limited to those described above.
- the preparation space described above is an example, and the preparation space is not limited to the one described above.
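As a concrete reading of these threshold comparisons, a "similar space" check might look like the following sketch; the condition keys and threshold values are hypothetical examples, not values from the patent.

```python
def is_similar_space(preparation, estimation, thresholds):
    """Return True when every measured condition of the preparation
    space differs from the estimated space by no more than its
    predetermined threshold. Keys are hypothetical examples."""
    return all(abs(preparation[k] - estimation[k]) <= thresholds[k]
               for k in thresholds)

# Example: temperature within 2.0 degC and relative humidity within 5 %.
example_thresholds = {"temp_c": 2.0, "humidity_pct": 5.0}
```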
- By using such a preparation space, the emotion estimation device 100 can acquire the first biometric information without imposing a particular burden on each subject.
- If the preparation space is not the same space as the estimated space, is not used for the same purpose as the estimated space, or is not similar to the estimated space, the emotion estimation device 100 can acquire the first biological information more easily.
- the first biological information is information that serves as an index of the state of the human body.
- the first biometric information is indicated by a numerical value, and has various values depending on the state of the human body.
- The first biological information is, for example, information indicating at least one of body surface temperature, heartbeat, pulse wave, electrodermal response, temporal change in body surface temperature, the difference in the surface temperatures of a plurality of parts of the body, and an index value obtained by frequency analysis of the heartbeat.
- The preparatory biological information acquisition unit 2 in Embodiment 1 acquires, from the one or more preparation sensors, first biological information indicating at least one of, for example, body surface temperature, heart rate, pulse wave, and electrodermal response.
- The preliminary biometric information acquisition unit 2 may acquire the first biometric information from one part of the human body by any one type of detection method.
- Alternatively, the preliminary biometric information acquisition unit 2 may acquire one type of first biometric information, or each of two or more types of first biometric information, from one part of the human body by each of two or more arbitrary types of detection methods.
- Alternatively, the preliminary biometric information acquisition unit 2 may acquire one type of first biometric information, or each of two or more types of first biometric information, from a plurality of parts of the human body by each of two or more arbitrary types of detection methods.
- The preliminary biological information acquisition unit 2 in Embodiment 1 analyzes the first biological information acquired by the one or more preparation sensors. Specifically, from the acquired first biological information, the preliminary biological information acquisition unit 2 generates first biometric information indicating at least one of the amount of change in body surface temperature over time, the difference in the surface temperatures of a plurality of parts of the body, an index value obtained by frequency analysis of the heartbeat, and the like. For example, when the first biological information acquired by the one or more preparation sensors is body surface temperature, the preliminary biological information acquisition unit 2 may calculate the difference in the surface temperatures of a plurality of parts of the body. Alternatively, when the acquired first biometric information is data related to the heartbeat, such as electrocardiographic data, the preliminary biological information acquisition unit 2 may perform frequency analysis to obtain an index value indicating the state of the heartbeat.
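For instance, the two analyses mentioned above, the temporal change in body surface temperature and a frequency-analysis index of the heartbeat, could be computed roughly as follows. This is a sketch assuming RR-interval input in seconds; the band limits (a common LF/HF convention) and helper names are illustrative, not taken from the patent.

```python
import numpy as np

def lf_hf_ratio(rr_intervals_s, fs=4.0):
    """Illustrative heartbeat frequency analysis: LF/HF power ratio
    computed from a series of RR intervals (seconds). Hypothetical
    helper, not the patent's actual algorithm."""
    rr = np.asarray(rr_intervals_s, dtype=float)
    t = np.cumsum(rr)                        # beat times
    grid = np.arange(t[0], t[-1], 1.0 / fs)  # even sampling grid
    rr_even = np.interp(grid, t, rr)         # evenly resampled RR series
    rr_even -= rr_even.mean()                # remove the DC component
    spectrum = np.abs(np.fft.rfft(rr_even)) ** 2
    freqs = np.fft.rfftfreq(rr_even.size, d=1.0 / fs)
    lf = spectrum[(freqs >= 0.04) & (freqs < 0.15)].sum()
    hf = spectrum[(freqs >= 0.15) & (freqs < 0.40)].sum()
    return lf / hf

def temperature_change(temps_c):
    """Temporal change in body surface temperature as simple first
    differences between consecutive samples."""
    return np.diff(np.asarray(temps_c, dtype=float))
```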
- the prepared emotion information acquisition unit 3 acquires, from each of the plurality of subjects present in the preparation space, emotion information indicating the emotions of each of the plurality of subjects.
- the emotion information is information indicating the emotion such as the degree of comfort, the degree of exhilaration, or the degree of sedation by numerical values.
- Prepared emotion information acquisition unit 3 includes an input device such as a keyboard, touch panel, button, or mouse for each of a plurality of subjects to input emotion information to emotion estimation device 100 .
- The preparation emotion information acquisition unit 3 in Embodiment 1 acquires emotion information from each of the plurality of subjects present in the preparation space based on the experience sampling method.
- Specifically, the prepared emotion information acquisition unit 3 acquires, from each of the plurality of subjects present in the preparation space, emotion information indicating the emotion of each of the plurality of subjects at each of a plurality of points in time within a predetermined period.
- More specifically, the prepared emotion information acquisition unit 3 acquires emotion information at each of a plurality of points in time during a period of, for example, two days or longer, based on the experience sampling method.
- For example, the prepared emotion information acquisition unit 3 acquires emotion information at one or more predetermined times, or at random times, on each day within the period. It is desirable that the prepared emotion information acquisition unit 3 acquire emotion information over a period of five days or more. The reason is that the events that occur often differ between weekdays or between days of the week, and acquiring emotion information over a longer period allows the prepared emotion information acquisition unit 3 to evenly acquire emotion information indicating the mental and physical states caused by those events.
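An experience-sampling prompt schedule of this kind, with random times on each of five or more days, might be generated as in the sketch below; the 09:00-21:00 window and the prompt count are assumed values, not from the patent.

```python
import random
from datetime import date, datetime, time, timedelta

def sampling_schedule(start, days=5, prompts_per_day=3, seed=0):
    """Generate random emotion-report prompt times, a few per day,
    over `days` consecutive days (hypothetical helper). Prompts fall
    between 09:00 and 21:00."""
    rng = random.Random(seed)
    schedule = []
    for d in range(days):
        day = start + timedelta(days=d)
        for _ in range(prompts_per_day):
            minute = rng.randrange(9 * 60, 21 * 60)  # minutes after midnight
            schedule.append(datetime.combine(day, time(minute // 60, minute % 60)))
    return sorted(schedule)
```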
- Therefore, the recall bias of each subject is reduced in the emotion information acquired at each of the plurality of time points.
- In other words, the emotion information acquired at each point in time accurately indicates the emotion of each subject as affected by the events that occurred at that point in time.
- In addition, the collection of emotion information acquired at each of the plurality of time points by the experience sampling method is not biased information indicating only specific emotions caused by specific events in daily life, but information indicating the wide variety of emotions of each subject caused by a wide variety of events.
- the relationship construction unit 4 constructs related information that associates the first biological information analyzed by the preliminary biological information acquisition unit 2 with the emotion information corresponding to the first biological information.
- The emotion information corresponding to certain first biometric information is information indicating an emotion, such as the degree of comfort, the degree of exhilaration, or the degree of sedation, obtained using the emotion information that the prepared emotion information acquisition unit 3 acquired from each subject at the same point in time as when the preliminary biometric information acquisition unit 2 acquired that first biometric information from each subject by the one or more preparation sensors.
- the term "the same time as a certain time point” refers to a time point within a predetermined period of time before or after the certain time point.
- the predetermined period of time refers to a period of time, obtained in advance through experiments or the like, during which a person's emotional state and physical state do not change.
- The emotion information corresponding to the first biometric information may be the emotion information acquired at the same time as the first biometric information, may be obtained by regression analysis, or may be obtained by averaging the values of the emotion information for each value of the first biometric information.
- For example, suppose the first biological information is the amount of change in body surface temperature over time.
- When first biometric information indicating 0.5 [°C] is acquired from a plurality of subjects, the values of the emotion information acquired from the plurality of subjects at the same points in time at which the 0.5 [°C] was acquired, or the average of those values, is an example of "the emotion information corresponding to the first biometric information indicating 0.5 [°C]".
- Alternatively, when the relationship between the value of the first biometric information and the value of the emotion information is expressed as a mathematical formula by regression analysis using the values of the first biometric information and the values of the emotion information acquired from the plurality of subjects, the value of the emotion information obtained by substituting 0.5 [°C] into the formula is also an example of "the emotion information corresponding to the first biometric information indicating 0.5 [°C]".
- In general, the emotion information corresponding to certain first biometric information indicates what the emotions of the plurality of subjects tended to be at the points in time when that first biometric information, indicating their physical state, was acquired.
- For example, if emotion information with a numerical value indicating a high degree of comfort is acquired from the plurality of subjects at all of the points in time when first biometric information indicating the above 0.5 [°C] is acquired from them, then "the emotion information corresponding to the first biometric information indicating 0.5 [°C]" is a numerical value indicating a high degree of comfort.
- This high degree of comfort can be considered to generally indicate the emotions of the plurality of subjects at the times when the first biometric information indicating 0.5 [°C] was acquired.
- the related information may be, for example, a mathematical formula based on the above-described regression analysis that associates the first biometric information with the emotion information corresponding to the first biometric information.
- the related information may be information indicating the value of the first biometric information and the value of the emotion information corresponding to the first biometric information, for example, in tabular form. Note that the related information is not limited to the above-mentioned information as long as it is information that associates the first biometric information with the emotion information corresponding to the first biometric information.
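In the tabular form mentioned above, building the related information from preparation-space samples could be sketched as follows. This is illustrative only: each distinct first-biometric value is mapped to the average of the emotion values observed with it, one of the correspondence methods described earlier; the function name and data layout are assumptions.

```python
import numpy as np

def build_related_table(first_bio, emotion_values):
    """Hypothetical tabular related information: map each distinct
    first biometric value to the mean of the emotion values acquired
    at the same points in time."""
    first_bio = np.asarray(first_bio, dtype=float)
    emotion_values = np.asarray(emotion_values, dtype=float)
    return {float(b): float(emotion_values[first_bio == b].mean())
            for b in np.unique(first_bio)}
```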
- The estimated biological information acquisition unit 5 includes one or more sensors. Below, these sensors may be referred to as estimation sensors. The one or more estimation sensors may be the same as the one or more preparation sensors described above.
- the estimated biometric information acquisition unit 5 acquires second biometric information from one of the above-described plurality of subjects existing in the estimation space by using the one or more estimation sensors. It should be noted that, as described above, the estimation space refers to a space for emotion estimation device 100 to estimate the emotion of each of the plurality of subjects. The estimated space may be the same space as the preparation space, or may be a different space.
- the second biological information is information that serves as an index of the state of the human body. The second biometric information is indicated by a numerical value, and has various values depending on the state of the human body.
- The second biological information is, for example, information indicating at least one of body surface temperature, heartbeat, pulse wave, electrodermal response, temporal change in body surface temperature, the difference in the surface temperatures of a plurality of parts of the body, and an index value obtained by frequency analysis of the heartbeat. The estimated biological information acquisition unit 5 according to Embodiment 1 acquires, from the one or more estimation sensors, second biological information indicating at least one of, for example, body surface temperature, heart rate, pulse wave, and electrodermal response.
- The estimated biometric information acquisition unit 5 may acquire biometric information from one part of the human body by any one type of detection method. Alternatively, the estimated biometric information acquisition unit 5 may acquire one type of biometric information, or each of two or more types of biometric information, from one part of the human body, or from a plurality of parts of the human body, by each of two or more arbitrary types of detection methods.
- The estimated biological information acquisition unit 5 in Embodiment 1 analyzes the second biological information acquired by the one or more estimation sensors. Specifically, from the acquired second biological information, the estimated biological information acquisition unit 5 generates second biometric information indicating at least one of the amount of change in body surface temperature over time, the difference in the surface temperatures of a plurality of parts of the body, an index value obtained by frequency analysis of the heartbeat, and the like.
- Here, the second biometric information generated by the estimated biometric information acquisition unit 5 is the same type of information as the first biometric information included in the related information. For example, when the first biometric information included in the related information is the amount of temporal change in body surface temperature, the second biometric information generated by the estimated biometric information acquisition unit 5 is also the amount of temporal change in body surface temperature.
- the estimation unit 6 uses the related information to extract emotion information associated with the first biometric information that is the same as the second biometric information analyzed by the estimated biometric information acquisition unit 5 .
- When the related information is information indicating the value of the first biometric information and the value of the emotion information corresponding to the first biometric information in, for example, tabular form, and the related information contains no first biometric information equal to the second biometric information, the estimation unit 6 may extract the emotion information associated with the first biometric information that has the smallest difference from the second biometric information.
- the estimation unit 6 estimates the emotion indicated by the extracted emotion information as the emotion of the subject from whom the second biometric information is obtained.
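The extraction described above, taking the emotion information tied to the first biometric value equal to, or failing that closest to, the second biometric value, might look like this sketch; the table layout is an assumption carried over from the tabular form of the related information.

```python
def estimate_emotion(related_table, second_bio):
    """Hypothetical estimation step: return the emotion value whose
    first biometric value equals the second biometric value, or, if
    none is equal, the one with the smallest difference from it."""
    if second_bio in related_table:
        return related_table[second_bio]
    nearest = min(related_table, key=lambda b: abs(b - second_bio))
    return related_table[nearest]
```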
- the output unit 7 communicates with external equipment.
- external devices include terminals such as personal computers, workstations, tablet terminals, and smart phones.
- Other examples of such external devices include display devices such as liquid crystal displays, CRTs (Cathode Ray Tubes), and organic EL (Electro Luminescence) displays.
- the external equipment includes, for example, an environment adjustment device such as an air conditioner or an air cleaner that adjusts the environment of the estimated space.
- the external device includes an audio output device including a speaker.
- the estimation unit 6 controls the output unit 7 to transmit the extracted emotional information to the terminal.
- the estimation unit 6 controls the output unit 7 to transmit the emotional information and the display instruction information instructing display of the emotional information to the display device.
- When the external device is an environment adjustment device, the estimation unit 6 controls the output unit 7 so as to transmit the emotion information to the environment adjustment device, together with an instruction to cause the environment adjustment device to perform an operation according to the emotion information. Note that an operation of the environment adjustment device according to the emotion information refers to an operation that improves the emotional state of the subject in the estimated space.
- When the external device is an audio output device, the estimation unit 6 controls the output unit 7 to transmit the emotion information, together with an instruction to output the emotion information by voice, to the audio output device.
- FIG. 2 is a schematic diagram illustrating the hardware configuration of the emotion estimation device according to Embodiment 1.
- The emotion estimation device 100 includes, for example, a processor 11, a memory 12, a storage device 13, a first input interface circuit 14, a sensor 15, a second input interface circuit 16, an input device 17, and an output interface circuit 18.
- Processor 11 , memory 12 , storage device 13 , first input interface circuit 14 , second input interface circuit 16 and output interface circuit 18 are connected by bus 19 .
- the processor 11 includes, for example, a CPU (Central Processing Unit) or an MPU (Micro Processing Unit).
- Examples of the memory 12 include ROM (Read Only Memory) and RAM (Random Access Memory).
- Examples of the storage device 13 include HDD (Hard Disk Drive) and SSD (Solid State Drive).
- Examples of the sensor 15 include infrared sensors and wearable sensors.
- the input device 17 may be, for example, a keyboard or a touch panel.
- the first input interface circuit 14 is an input interface circuit that mediates between the processor 11 and the sensor 15 .
- the second input interface circuit 16 is an input interface circuit that mediates between the processor 11 and the input device 17 .
- the function of the storage unit 1 can be realized by the storage device 13.
- Each function of the relation construction unit 4 and the estimation unit 6 can be realized by the processor 11 reading and executing various programs, such as an emotion estimation program for estimating emotions, stored in the memory 12 or the storage device 13.
- the first biometric information acquisition function by the preliminary biometric information acquisition unit 2 and the second biometric information acquisition function by the estimated biometric information acquisition unit 5 can be realized by the first input interface circuit 14 and the sensor 15 .
- The analysis function for the first biometric information by the preliminary biometric information acquisition unit 2 and the analysis function for the second biometric information by the estimated biometric information acquisition unit 5 can likewise be realized by the processor 11 reading and executing various programs, such as the emotion estimation program, stored in the memory 12 or the storage device 13.
- The function of the prepared emotion information acquisition unit 3 acquiring emotion information at each of a plurality of time points can be realized by the processor 11, the second input interface circuit 16, and the input device 17. It should be noted that the function of holding the emotion information, acquired by the prepared emotion information acquisition unit 3 from each of the plurality of subjects at each of the plurality of time points, until the relationship construction unit 4 generates the related information can be realized by the memory 12 or the storage device 13.
- Similarly, the function of holding the first biometric information, acquired and analyzed by the preliminary biometric information acquisition unit 2 from each of the plurality of subjects at each of the plurality of time points, until the relationship construction unit 4 generates the related information can be realized by the memory 12 or the storage device 13.
- the function of the output unit 7 can be realized by the output interface circuit 18.
- When the external device is a terminal, the output interface circuit 18 is a communication interface circuit.
- When the external device is a display device or an audio output device, the output interface circuit 18 is an output interface circuit that mediates between the device and the processor 11.
- The emotion estimation device 100 may include multiple first input interface circuits 14 and multiple sensors 15. With this, the emotion estimation device 100 can simultaneously acquire the first biometric information and the second biometric information. The emotion estimation device 100 may likewise include multiple second input interface circuits 16 and multiple input devices 17. Thereby, the emotion estimation device 100 can simultaneously acquire emotion information from each of the plurality of subjects when the subjects are at separate positions in the preparation space. All or part of the functions of the emotion estimation device 100 may be realized by dedicated hardware.
- In step S1, the prepared emotion information acquisition unit 3 determines whether or not the current time is the timing to acquire emotion information. Hereinafter, this timing is referred to as the acquisition timing. If the current time is not the acquisition timing (step S1: NO), the prepared emotion information acquisition unit 3 returns the emotion estimation process to step S1.
- If the current time is the acquisition timing (step S1: YES), the prepared emotion information acquisition unit 3 acquires emotion information from each of the plurality of subjects in step S2.
- In step S3, the preliminary biometric information acquisition unit 2 analyzes the first biometric information acquired from each subject at the same point in time as the emotion information.
- The preliminary biometric information acquisition unit 2 may acquire the first biometric information from each subject at the same time as the emotion information is acquired, or may acquire the first biometric information continuously from each subject. If the first biometric information is continuously acquired from each subject, the preliminary biometric information acquisition unit 2 may analyze the continuously acquired first biometric information.
- In step S4, the prepared emotion information acquisition unit 3 determines whether or not the period has ended. If the period has not ended (step S4: NO), the prepared emotion information acquisition unit 3 returns the emotion estimation process to step S1.
- If the period has ended (step S4: YES), in step S5 the relationship construction unit 4 uses the emotion information and the first biometric information acquired over the period to construct related information in which the first biometric information is associated with the emotion information corresponding to the first biometric information.
- In step S6, the relationship construction unit 4 stores the constructed related information in the storage unit 1.
- In step S7, the estimated biometric information acquisition unit 5 acquires the second biometric information of any one of the plurality of subjects.
- In step S8, the estimated biometric information acquisition unit 5 analyzes the acquired second biometric information.
- In step S9, the estimation unit 6 refers to the related information and extracts the emotion information associated with the first biometric information that is the same as the analyzed second biometric information.
- the estimation unit 6 estimates the emotion indicated by the extracted emotion information as the subject's emotion.
- In step S10, the estimation unit 6 instructs the output unit 7 to output the extracted emotion information to an external device.
- the output unit 7 outputs the emotion information to an external device according to the instruction from the estimation unit 6 .
- After step S10, the emotion estimation process ends. When the emotion estimation process is executed again, the emotion estimation device 100 executes the process from step S7.
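The loop of steps S1 through S10 amounts to a two-phase procedure: associate biometric values with emotion information over the preparation period, then look up an emotion for new biometric input. The following is a minimal, hypothetical sketch of that flow; the names, the single-valued biometric representation, and the majority-vote association are illustrative assumptions, not the patent's implementation.

```python
# Minimal sketch of the two-phase flow of steps S1 to S10, assuming the
# analyzed biometric information reduces to one discretized value.
# All names and data structures are illustrative, not from the patent.

def build_related_info(samples):
    """Associate each first-biometric value with the emotion information
    observed at the same time point (steps S1 to S6)."""
    related = {}
    for biometric, emotion in samples:
        related.setdefault(biometric, []).append(emotion)
    return related

def estimate_emotion(related, second_biometric):
    """Extract the emotion information associated with the first biometric
    information equal to the second biometric information (steps S7 to S9)."""
    emotions = related.get(second_biometric)
    if emotions is None:
        return None  # no matching first biometric information
    # The most frequent reported emotion stands in for the association.
    return max(set(emotions), key=emotions.count)

# Preparation period: (biometric value, reported emotion) pairs.
samples = [(0.5, "comfortable"), (0.5, "comfortable"), (-0.3, "uncomfortable")]
related = build_related_info(samples)
print(estimate_emotion(related, 0.5))  # comfortable
```

A dictionary lookup keeps the sketch simple; a practical system would tolerate near-equal biometric values rather than requiring exact equality.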
- The emotion estimation device 100 includes the preliminary biometric information acquisition unit 2, the prepared emotion information acquisition unit 3, the relationship construction unit 4, the estimated biometric information acquisition unit 5, and the estimation unit 6.
- the preparation biometric information acquisition unit 2 acquires and analyzes first biometric information indicating the physical condition of each of a plurality of subjects present in the preparation space.
- the prepared emotion information acquisition unit 3 acquires emotion information indicating the emotions of each of a plurality of subjects present in the prepared space.
- the relationship building unit 4 builds related information in which the first biometric information and the emotion information corresponding to the first biometric information are associated with each other.
- The estimated biometric information acquisition unit 5 acquires and analyzes second biometric information indicating the physical state of one of the plurality of subjects present in the estimation space.
- the estimating unit 6 extracts emotion information associated with the first biological information equal to the second biological information in the related information constructed by the relation building unit 4 . Then, the estimating unit 6 estimates the emotion of any of the subjects present in the estimation space based on the extracted emotion information. This improves the accuracy of emotion estimation.
- The relationship construction unit 4 in the first embodiment constructs the related information using the emotion information acquired from each of the plurality of subjects by the prepared emotion information acquisition unit 3 and the first biometric information acquired from each of the plurality of subjects by the preliminary biometric information acquisition unit 2 at the same point in time as the emotion information.
- Here, the same point in time as the emotion information refers to any time point within a predetermined time before or after the time point at which the prepared emotion information acquisition unit 3 acquires the emotion information from each of the plurality of subjects.
- the emotion information and the first biometric information acquired immediately before the acquisition of the emotion information are reflected in the related information, improving the accuracy of the related information. Therefore, the emotion estimation accuracy using the related information by the emotion estimation device 100 is improved.
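The notion of a time point "within a predetermined time before or after" the emotion acquisition can be sketched as a simple window match. The window length and the data layout below are illustrative assumptions.

```python
# Illustrative sketch of treating "the same point in time" as any biometric
# sample within a predetermined window around the emotion acquisition time.

WINDOW_SECONDS = 60.0  # hypothetical predetermined time

def samples_at_same_time(biometric_samples, emotion_time):
    """Return biometric values whose timestamp lies within the window
    before or after the time the emotion information was acquired."""
    return [value for t, value in biometric_samples
            if abs(t - emotion_time) <= WINDOW_SECONDS]

# (timestamp in seconds, biometric value) pairs from one subject.
biometric_samples = [(0.0, 0.1), (30.0, 0.5), (200.0, -0.2)]
print(samples_at_same_time(biometric_samples, 10.0))  # [0.1, 0.5]
```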
- The prepared emotion information acquisition unit 3 acquires, from each of the plurality of subjects present in the preparation space, emotion information indicating the emotion of each subject at each of a plurality of time points within a predetermined period.
- The relationship construction unit 4 constructs the related information using the first biometric information acquired from each of the plurality of subjects by the preliminary biometric information acquisition unit 2 at each of the plurality of time points and the emotion information acquired from each of the plurality of subjects by the prepared emotion information acquisition unit 3 at each of the plurality of time points. This reduces the recall bias of the emotion information acquired by the prepared emotion information acquisition unit 3.
- As a result, the correlation between the first biometric information and the emotion information corresponding to it in the related information does not depend on specific events such as a change in climate, physical condition, or mood. The related information is therefore less affected by such events and is more accurate, which improves the accuracy of emotion estimation by the estimation unit 6.
- the above period in Embodiment 1 is a period of 5 days or more.
- the prepared emotion information acquisition unit 3 acquires emotion information from each of a plurality of subjects at one or more predetermined times or at one or more random times on each day.
- As a result, the related information does not depend on the weather, physical condition, mood, or the like on any particular day, day of the week, or season. The related information is therefore less affected by various events, and its accuracy is improved, which improves the accuracy of emotion estimation by the estimation unit 6.
- the emotion estimation device 100 further includes an output unit 7 that communicates with an external device.
- the estimation unit 6 controls the output unit 7 to transmit the extracted emotion information to the external device.
- When the external device is a terminal, a display device, or an audio output device, the user of that device can know the emotion of the subject.
- When the external device is an environment adjustment device, the environment adjustment device performs an action according to the emotion of the subject, for example to improve the comfort level of the subject or to improve the intellectual productivity correlated with the comfort level of the subject.
- Embodiment 2 The emotion estimation device 100 according to Embodiment 2 constructs, for each subject, related information that associates the first biometric information with the emotion information corresponding to the first biometric information. By using this related information, the emotion estimation device 100 estimates the emotion of each subject more accurately.
- Embodiment 2 will be described below.
- the same reference numerals as in the first embodiment are given to the same constituent elements as in the first embodiment.
- descriptions of the same contents as those of the first embodiment will be omitted unless there are special circumstances.
- the functional blocks included in emotion estimation device 100 according to Embodiment 2 are the same as the functional blocks in emotion estimation device 100 according to Embodiment 1 illustrated in FIG.
- the preparatory biological information acquisition unit 2 in Embodiment 2 includes sensors such as wearable sensors assigned to each of a plurality of subjects. Sensor identification information for identifying the sensor is assigned to the sensor assigned to each subject.
- the preliminary biometric information acquisition unit 2 in Embodiment 2 includes a camera and a sensor that recognizes each face of a plurality of subjects. Based on these, the preliminary biometric information acquiring unit 2 identifies each subject. It should be noted that the preparatory biometric information acquiring unit 2 is not limited to the above as long as it can identify each subject.
- the prepared emotion information acquisition unit 3 in Embodiment 2 includes, for example, a terminal assigned to each of a plurality of subjects. An ID (Identifier) for identifying the terminal is assigned to the terminal. Alternatively, the prepared emotion information acquisition unit 3 in Embodiment 2 acquires identification information for identifying each subject together with emotion information from each subject.
- Hereinafter, the ID of each subject's terminal, the above identification information of each subject, the sensor identification information of the sensor assigned to each subject, and the face data of each subject are collectively referred to as identification information.
- The relationship construction unit 4 generates related information in which the identification information of each subject, the first biometric information indicating the physical condition of each subject, and the emotion information indicating the emotion of each subject and corresponding to the first biometric information are associated with one another.
- the emotional information corresponding to the first biometric information indicating the physical condition of each subject may be emotional information obtained from each subject at the same time as the first biometric information.
- Alternatively, it may be information obtained by regression analysis or an average calculation process using the emotion information obtained from each subject at the same time as the first biometric information.
- For example, suppose the first biometric information is the amount of change in body surface temperature over time, and suppose the emotion information acquired from a certain subject at the same time as the first biometric information indicating 0.5 [°C] indicates a high degree of comfort.
- In this case, the numerical value indicating the high degree of comfort, or the average of such numerical values, is an example of "emotion information corresponding to the first biometric information indicating a temporal change of 0.5 [°C] in the body surface temperature of the certain subject".
- The emotion information corresponding to the first biometric information indicating the physical condition of each subject indicates a generalized emotion of that subject. For example, if emotion information with a numerical value indicating a high degree of comfort is obtained from the certain subject at every time point at which the first biometric information indicating 0.5 [°C] is obtained, then the emotion information corresponding to the first biometric information indicating the physical condition of the subject, that is, a temporal change in body surface temperature of 0.5 [°C], indicates a high degree of comfort for the subject.
- That is, the high degree of comfort is considered to generally indicate the emotion of the certain subject at the times when the first biometric information indicating 0.5 [°C] was obtained.
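The average calculation process described above can be sketched as follows. The record layout, subject identifiers, and comfort scores are hypothetical, and a regression analysis could be substituted for the averaging.

```python
# Sketch of the "average calculation process": for each subject and each
# first-biometric value, the comfort scores reported at matching time points
# are averaged into one generalized emotion value. Names are hypothetical.
from collections import defaultdict

def generalized_emotion(records):
    """records: (subject_id, biometric_value, comfort_score) tuples.
    Returns a mapping from (subject_id, biometric_value) to mean score."""
    sums = defaultdict(lambda: [0.0, 0])
    for subject, biometric, score in records:
        entry = sums[(subject, biometric)]
        entry[0] += score
        entry[1] += 1
    return {key: total / count for key, (total, count) in sums.items()}

records = [("A", 0.5, 80), ("A", 0.5, 90), ("A", -0.3, 20)]
print(generalized_emotion(records)[("A", 0.5)])  # 85.0
```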
- The estimation unit 6 extracts, from the related information, the emotion information associated with the identification information of each subject and with the first biometric information equal to the second biometric information indicating the physical condition of that subject acquired by the estimated biometric information acquisition unit 5. Then, the estimation unit 6 estimates the emotion indicated by the extracted emotion information as the emotion of that subject.
- FIG. 4 is a flowchart illustrating emotion estimation processing according to the second embodiment. Since the processing in step S21 is the same as the processing in step S1, description thereof will be omitted. If the current time is not the acquisition timing in step S21 (step S21: NO), the prepared emotion information acquiring section 3 returns the emotion estimation process to step S21. In step S21, if the current time is the acquisition timing (step S21: YES), the prepared emotion information acquisition unit 3 acquires identification information and emotion information from a plurality of subjects in step S22. In step S23, the preliminary biometric information acquiring unit 2 analyzes the first biometric information acquired from each subject at the same time as the emotion information is acquired from each subject.
- the preparation biometric information acquiring unit 2 acquires the identification information such as the sensor identification information or the face data for identifying each subject together with the first biometric information from each subject.
- The preliminary biometric information acquisition unit 2 may acquire the first biometric information from each subject at the same time as the emotion information is acquired, or may continuously acquire the first biometric information from each subject. If the first biometric information is continuously acquired from each subject, the preliminary biometric information acquisition unit 2 may analyze the continuously acquired first biometric information.
- In step S24, the prepared emotion information acquisition unit 3 determines whether or not the period has ended. If the period has not ended (step S24: NO), the prepared emotion information acquisition unit 3 returns the emotion estimation process to step S21.
- If the period has ended (step S24: YES), in step S25 the relationship construction unit 4 constructs related information in which the identification information of each subject, the first biometric information indicating the physical condition of each subject, and the emotion information indicating the emotion of each subject and corresponding to the first biometric information are associated. Since the processing in step S26 is the same as the processing in step S6, description thereof is omitted.
- In step S27, the estimated biometric information acquisition unit 5 acquires the second biometric information from any one of the plurality of subjects.
- the estimated biometric information acquisition unit 5 acquires the identification information such as the sensor identification information or the face data for identifying one of the subjects, together with the second biometric information. Since the process of step S28 is the same as the process of step S8, description thereof is omitted.
- In step S29, the estimation unit 6 refers to the related information and extracts the emotion information associated with the identification information of the subject and with the first biometric information equal to the second biometric information after the analysis.
- the estimation unit 6 estimates the emotion indicated by the extracted emotion information as the subject's emotion. Since the processing in step S30 is the same as the processing in step S10, description thereof is omitted. After the process of step S30, the emotion estimation process ends. When the emotion estimation process is executed again, emotion estimation apparatus 100 executes the process from step S27.
- the preliminary biometric information acquisition unit 2 and the preliminary emotion information acquisition unit 3 in Embodiment 2 acquire identification information for identifying each of the plurality of subjects.
- The relationship construction unit 4 constructs related information that associates the identification information of each subject, the first biometric information indicating the physical state of each subject, and the emotion information indicating the emotion of each subject and corresponding to the first biometric information indicating the physical state of each subject.
- the related information indicates the correlation between the physical condition and emotion for each subject. Therefore, the estimation unit 6 can estimate the emotion of each subject by using the related information. Therefore, the emotion estimating apparatus 100 can highly accurately estimate the emotion of each target person, even if each target person has different characteristics such as being sensitive to heat or being sensitive to cold.
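One way to picture the per-subject related information of Embodiment 2 is a lookup keyed on both the identification information and the first biometric value, so that the same biometric value can map to different emotions for different subjects, such as one sensitive to heat and one sensitive to cold. The identifiers and values below are purely illustrative.

```python
# Sketch of Embodiment 2's per-subject related information: the key combines
# the subject's identification information with the first biometric value.
# All identifiers and stored emotions are illustrative assumptions.

related = {
    ("sensor-01", 0.5): "comfortable",    # e.g. a subject sensitive to cold
    ("sensor-02", 0.5): "uncomfortable",  # e.g. a subject sensitive to heat
}

def estimate(identification, second_biometric):
    """Extract the emotion associated with this subject's identification
    information and a first biometric value equal to the second one."""
    return related.get((identification, second_biometric))

print(estimate("sensor-01", 0.5))  # comfortable
print(estimate("sensor-02", 0.5))  # uncomfortable
```

The point of the keyed lookup is that an identical biometric reading yields different estimated emotions depending on which subject it came from.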
- Embodiment 3 The emotion estimation device 100 according to Embodiment 3 acquires emotion information based on the evaluation grid method and performs cluster analysis to classify the subjects when constructing the related information.
- Embodiment 3 will be described below.
- the same reference numerals as those in the first and second embodiments are attached to the same components as those in the first and second embodiments.
- descriptions of the same contents as those of the first and second embodiments will be omitted unless there are special circumstances.
- the functional blocks included in emotion estimation device 100 according to Embodiment 3 are the same as the functional blocks in emotion estimation device 100 according to Embodiments 1 and 2 illustrated in FIG.
- the prepared emotion information acquisition unit 3 in Embodiment 3 acquires emotion information from each of a plurality of subjects based on the evaluation grid method.
- Specifically, the prepared emotion information acquisition unit 3 acquires the comfort level, which serves as an index of comfort, as the emotion information based on the scoring method.
- the prepared emotion information acquisition unit 3 acquires factor information indicating factors of the comfort level together with the comfort level based on the free description method.
- The relationship construction unit 4 performs cluster analysis using the comfort level and the factor information acquired by the prepared emotion information acquisition unit 3. Specifically, the relationship construction unit 4 classifies each subject into one of a plurality of types using the factor information from each subject. For example, the relationship construction unit 4 classifies a subject whose main factor of comfort is a thermal factor as a "thermal type". In addition, the relationship construction unit 4 classifies a subject whose main factor of comfort is an internal factor, such as mental or physical condition, as an "internal type". Furthermore, the relationship construction unit 4 classifies a subject whose comfort level is affected by a variety of factors as a "balanced type". Which factors influence the comfort level of each subject is determined from the factor information acquired from that subject.
- For example, when a predetermined ratio or more of the factor information acquired from a certain subject indicates thermal factors, the relationship construction unit 4 classifies the certain subject as the above-mentioned "thermal type".
- the predetermined ratio is 70% or 80% in the third embodiment, but may be 50% or 60%.
- the number of types, the names of types, and the like are not limited to those described above, and can be set as appropriate.
- The relationship construction unit 4 classifies each subject into a type based on the factor information from each subject, as described above, and may also analyze what each subject finds comfortable or unpleasant, and to what degree, based on the comfort level from each subject. For example, if the factor information from a certain subject indicates coldness, and the comfort level acquired from the certain subject together with the factor information is low, the relationship construction unit 4 can analyze that the certain subject is a "sensitive-to-cold type". The relationship construction unit 4 may then classify the subject as, for example, both a "thermal type" and a "sensitive-to-cold type", performing a more detailed classification than when only the factor information is used.
- The relationship construction unit 4 may also classify each subject by type using not only the factor information from each subject but also the biometric information from each subject. For example, when the biometric information from a certain subject indicates a low body temperature, and the factor information obtained from the certain subject at the same time as the biometric information indicates coldness, the relationship construction unit 4 may analyze the certain subject as a "sensitive-to-cold type".
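The type classification described above can be sketched as a rule over the ratio of factor reports. The factor labels, the threshold, and the type names are assumptions for illustration, and the cluster analysis described in the text could replace this simple rule.

```python
# Illustrative classification: a subject whose factor information indicates
# thermal factors at or above a predetermined ratio is a "thermal type",
# internal factors give an "internal type", and otherwise the subject is a
# "balanced type". Labels and threshold are hypothetical.

PREDETERMINED_RATIO = 0.7  # e.g. 70%, per the description above

def classify(factor_reports):
    """factor_reports: list of factor labels reported by one subject."""
    n = len(factor_reports)
    if n and factor_reports.count("thermal") / n >= PREDETERMINED_RATIO:
        return "thermal type"
    if n and factor_reports.count("internal") / n >= PREDETERMINED_RATIO:
        return "internal type"
    return "balanced type"

print(classify(["thermal", "thermal", "thermal", "internal"]))  # thermal type
print(classify(["thermal", "internal", "mood"]))                # balanced type
```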
- The relationship construction unit 4 in Embodiment 3 constructs related information by associating type information indicating a type, the first biometric information indicating the physical state of one or more subjects of that type, and the emotion information corresponding to the first biometric information indicating the physical state of the one or more subjects.
- The emotion information corresponding to the first biometric information indicating the physical state of one or more subjects of the type may be emotion information obtained from the one or more subjects at the same time as the first biometric information. Alternatively, it may be information obtained by regression analysis or an average calculation process using the emotion information obtained from the one or more subjects at the same time point as the first biometric information.
- Emotional information corresponding to the first biometric information indicating the physical state of the one or more subjects of the relevant type generalizes the emotions of the one or more subjects of the relevant type.
- For example, suppose the first biometric information is the amount of temporal change in body surface temperature, and suppose emotion information with numerical values indicating a high degree of comfort was acquired from five subjects of a certain type at the same time point as the first biometric information indicating 0.5 [°C].
- In this case, the numerical value indicating the high degree of comfort, or the average of such numerical values, is an example of "emotion information corresponding to the first biometric information indicating a temporal change of 0.5 [°C] in the body surface temperature of the five subjects of the certain type", that is, "emotion information corresponding to the biometric information indicating the physical condition of the five subjects of the certain type".
- Alternatively, the value of emotion information obtained by substituting 0.5 [°C] into the regression formula is an example of "emotion information corresponding to the biometric information indicating the physical state of the five subjects of the certain type".
- That is, the high degree of comfort is considered to generally indicate the emotions of the five subjects of the certain type at the time when the first biometric information indicating 0.5 [°C] was obtained.
- Alternatively, the relationship construction unit 4 in Embodiment 3 may construct related information in which the identification information of each subject, the type information indicating the type of each subject, the first biometric information indicating the physical state of each subject, and the emotion information corresponding to the first biometric information indicating the physical condition of each subject are associated with one another.
- The estimation unit 6 extracts the emotion information associated with the first biometric information equal to the second biometric information acquired by the estimated biometric information acquisition unit 5. In the third embodiment, it is assumed that the type information is uniquely determined by the value of the first biometric information.
- the estimating unit 6 may extract the type information associated with the first biometric information that is the same as the second biometric information, and estimate the type of the subject from whom the second biometric information was obtained.
- When the related information includes identification information, the estimation unit 6 extracts the emotion information associated with the identification information acquired by the estimated biometric information acquisition unit 5 and with the first biometric information equal to the second biometric information acquired by the estimated biometric information acquisition unit 5.
- The estimation unit 6 may also extract the identification information and the type information associated with the first biometric information equal to the second biometric information, and estimate the type of the subject from whom the second biometric information is obtained.
- When the estimation unit 6 extracts type information, it instructs the output unit 7 to output the extracted type information to an external device.
- the output unit 7 outputs the type information to an external device.
- When the external device is the display device described above, the type information acquired from the output unit 7 is displayed.
- When the external device is the audio output device described above, the type information acquired from the output unit 7 is output as audio.
- When the external device is the terminal described above, the type information acquired from the output unit 7 is displayed or output by voice. Thereby, the subject can know his or her own type.
- The subject can also know his or her own type by referring to the storage unit 1 using a terminal connected to the emotion estimation device 100.
- If the subject knows his or her own type, the subject may input type information indicating the type to an input device such as a keyboard or touch panel (not shown) when the second biometric information is acquired. Then, the estimation unit 6 may extract the emotion information associated with the identification information and the type information of the subject, or with the type information alone, and estimate the emotion of the subject.
- FIG. 5 is a flowchart illustrating emotion estimation processing according to the third embodiment.
- the case where the preliminary biometric information acquisition unit 2 and the preliminary emotion information acquisition unit 3 acquire identification information for identifying each subject will be described as an example. Since each process in steps S41 to S44 is the same as each process in steps S21 to S24, description thereof will be omitted.
- If the period has ended in step S44 (step S44: YES), the relationship construction unit 4 performs cluster analysis in step S45 to classify each subject into one of a plurality of types.
- In step S46, the relationship construction unit 4 constructs related information that associates the type information, the first biometric information indicating the physical state of one or more subjects of the type indicated by the type information, and the emotion information indicating the emotions of the one or more subjects and corresponding to the first biometric information.
- Alternatively, in step S46, the relationship construction unit 4 constructs related information that associates the identification information of each subject, the type information indicating the type of each subject, the first biometric information indicating the physical condition of each subject, and the emotion information indicating the emotion of each subject and corresponding to the first biometric information.
- Since the processing in step S47 is the same as the processing in step S6, description thereof is omitted. Since the processing in step S48 is the same as the processing in step S7 or step S27, description thereof is omitted. Since the processing in step S49 is the same as the processing in step S8, description thereof is omitted.
- In step S50, the estimation unit 6 extracts the emotion information and the type information associated with the first biometric information that is the same as the second biometric information analyzed in step S49.
- Alternatively, the estimation unit 6 extracts the emotion information and the type information associated with the identification information identifying the subject from whom the second biometric information was obtained and with the first biometric information equal to the second biometric information.
- In step S51, the estimation unit 6 instructs the output unit 7 to output the extracted emotion information and type information to an external device.
- After step S51, the emotion estimation process ends. When the emotion estimation process is executed again, the emotion estimation device 100 executes the process from step S48.
- The prepared emotion information acquisition unit 3 in Embodiment 3 acquires from each of the plurality of subjects, as emotion information and based on the evaluation grid method, a comfort level indicating the comfort of each subject, and also acquires factor information indicating the factors of the comfort level of each subject.
- the prepared emotion information acquisition unit 3 acquires the degree of comfort based on the scoring method, and acquires the factor information based on the free description method.
- The relationship construction unit 4 performs cluster analysis using the factor information from each of the plurality of subjects, and classifies each of the plurality of subjects into one of a plurality of types.
- The relationship construction unit 4 then constructs related information that associates the type information indicating a type, the first biometric information indicating the physical state of one or more subjects classified into that type, and the emotion information that indicates the emotions of the one or more subjects and corresponds to the first biometric information indicating their physical state.
- The estimation unit 6 extracts, from the related information, the emotion information and the type information associated with the first biometric information that is equal to the second biometric information indicating the physical state of the subject.
- The estimation unit 6 estimates the emotion of the subject from the extracted emotion information, and estimates the type of the subject from the extracted type information.
- The related information thus indicates the correlation between the first biometric information and the emotion information that is peculiar to each type, such as the "thermal type" or the "internal type." Therefore, the estimation unit 6 can estimate the emotion of the subject with high accuracy by using the second biometric information according to the type of the subject.
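As a concrete illustration of this type-aware lookup, the sketch below holds the related information as (type, first-biometric value, emotion) records and returns the emotion and type stored with the first biometric value closest to the newly measured second biometric value. The record contents, the single feature used, and the nearest-match rule are illustrative assumptions, not details taken from the patent.

```python
# Hypothetical sketch of the type-aware estimation step. Each record of the
# related information associates a type label, a first-biometric feature value
# (here a cheek-forehead temperature difference, degC), and an emotion label.

def estimate(related_info, second_bio):
    """Return (emotion, type) stored with the closest first-biometric value."""
    best = min(related_info, key=lambda rec: abs(rec[1] - second_bio))
    type_label, _, emotion = best
    return emotion, type_label

related_info = [
    ("thermal", 0.8, "uncomfortable"),
    ("thermal", 0.1, "comfortable"),
    ("internal", 0.5, "neutral"),
]

print(estimate(related_info, 0.75))  # -> ('uncomfortable', 'thermal')
```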
- The preparation emotion information acquisition unit 3 in Embodiment 3 acquires, as the emotion information, a comfort level indicating the degree of comfort of each of the plurality of subjects from each of the plurality of subjects based on the evaluation grid method, and also acquires factor information indicating the factors of the comfort level of each of the plurality of subjects.
- The preparation emotion information acquisition unit 3 acquires the comfort level based on a scoring method, and acquires the factor information based on a free-description method.
- The preparation biometric information acquisition unit 2 and the preparation emotion information acquisition unit 3 acquire identification information for identifying each of the plurality of subjects.
- The relationship construction unit 4 performs cluster analysis using the factor information from each of the plurality of subjects, and classifies each of the plurality of subjects into one of a plurality of types.
- The relationship construction unit 4 constructs related information that associates the identification information of each of the plurality of subjects, the type information indicating the type of each of the plurality of subjects, the first biometric information indicating the physical state of each of the plurality of subjects, and the emotion information that indicates the emotion of each of the plurality of subjects and corresponds to the first biometric information indicating each subject's physical state.
- The estimation unit 6 extracts the emotion information and the type information associated, in the related information, with the identification information of the subject and with the first biometric information equal to the second biometric information.
- The estimation unit 6 estimates the emotion of the subject from the extracted emotion information, and estimates the type of the subject from the extracted type information.
- The related information indicates the correlation between physical state and emotion for each individual subject. The estimation unit 6 can therefore estimate the emotion of each subject by using the related information, and the emotion estimation apparatus 100 can estimate the emotion of each subject with high accuracy according to the subject's type.
- The relationship construction unit 4 in Embodiment 3 may also perform the cluster analysis using, in addition to the factor information from each subject, at least one of the first biometric information from each subject and the comfort level from each subject, and classify each subject into one of the plurality of types. This allows the relationship construction unit 4 to classify each subject by type in finer detail, which further improves the accuracy of the related information and, in turn, the accuracy of emotion estimation by the emotion estimation device 100.
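As an illustration of such a classification step, the sketch below runs a tiny one-dimensional k-means over a single per-subject feature (for example, a comfort score) and assigns each subject a type index. Real factor information would be multi-dimensional, and the choice of k-means here is an assumption; the code only shows the shape of the step.

```python
# Minimal 1-D k-means as a stand-in for the cluster analysis that classifies
# subjects into types. `values` holds one illustrative feature per subject.

def kmeans_1d(values, k=2, iters=20):
    # Spread the initial centers across the sorted feature values.
    centers = sorted(values)[:: max(1, len(values) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            i = min(range(k), key=lambda c: abs(v - centers[c]))
            clusters[i].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    labels = [min(range(k), key=lambda c: abs(v - centers[c])) for v in values]
    return centers, labels

# Four subjects' comfort-related feature values (illustrative).
_, types = kmeans_1d([0.1, 0.2, 0.9, 1.0], k=2)
print(types)  # -> [0, 0, 1, 1]
```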
- The emotion estimation device 100 further includes an output unit 7 that communicates with an external device.
- The estimation unit 6 controls the output unit 7 to transmit the extracted emotion information and type information to the external device.
- This allows a user of an external device, such as a terminal, a display device, or an audio output device, to know the emotion and type of any subject.
- If the user is one of the subjects, that subject can also learn his or her own type.
- Embodiment 4. The emotion estimation apparatus 100 according to Embodiment 4 estimates the subject's emotion with even higher accuracy.
- Embodiment 4 is described below. Components in Embodiment 4 that are the same as those in Embodiments 1 to 3 are given the same reference numerals. Descriptions of contents of Embodiment 4 that are the same as those of Embodiments 1 to 3 are omitted unless there are special circumstances.
- The functional blocks included in the emotion estimation device 100 according to Embodiment 4 are the same as those of the emotion estimation device 100 according to Embodiments 1 to 3 illustrated in FIG. 1. Note that the preparation emotion information acquisition unit 3 in Embodiment 4 acquires emotion information in the same manner as in Embodiment 1.
- The preparation biometric information acquisition unit 2 in Embodiment 4 acquires first biometric information including at least one of the face temperature, electrocardiographic data, electrodermal response data, and the like of each of the plurality of subjects.
- The preparation biometric information acquisition unit 2 detects the face temperature using, for example, an infrared sensor.
- The preparation biometric information acquisition unit 2 acquires the electrocardiographic data or the electrodermal response data using, for example, a wearable sensor.
- Based on the acquired first biometric information, the preparation biometric information acquisition unit 2 calculates, for each of the plurality of subjects, first biometric information including at least one of the difference from the face temperature at rest, the temperature difference between the cheek and the forehead, the value of an electrocardiographic index, and a value indicating the electrodermal response. When calculating the difference from the resting face temperature, the preparation biometric information acquisition unit 2 is assumed to have acquired the resting face temperature in advance.
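The two temperature features just described are simple differences once the raw temperatures are available. The sketch below is illustrative: the sensor readings and the dictionary keys are assumptions, not names from the patent.

```python
# Sketch of the face-temperature features in Embodiment 4: the change from the
# subject's resting face temperature and the cheek-forehead temperature
# difference (both in degrees Celsius).

def face_temp_features(face_temp, resting_temp, cheek_temp, forehead_temp):
    return {
        "delta_from_rest": face_temp - resting_temp,
        "cheek_forehead_diff": cheek_temp - forehead_temp,
    }

feats = face_temp_features(34.8, 34.2, 35.1, 34.3)
print(feats)
```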
- The estimation biometric information acquisition unit 5 acquires second biometric information including at least one of the face temperature, electrocardiographic data, electrodermal response data, and the like of any one of the plurality of subjects.
- The estimation biometric information acquisition unit 5 detects the face temperature using, for example, an infrared sensor.
- The estimation biometric information acquisition unit 5 acquires the electrocardiographic data or the electrodermal response data using, for example, a wearable sensor.
- Based on the acquired second biometric information, the estimation biometric information acquisition unit 5 calculates second biometric information including at least one of the difference from the subject's face temperature at rest, the temperature difference between the cheek and the forehead, the value of an electrocardiographic index, and a value indicating the electrodermal response of the subject.
- As the electrocardiographic index, the power in the high-frequency band, the low-frequency band, or the very-low-frequency band, or the standard deviation of heartbeat intervals, is preferable because these values are strongly correlated with emotions.
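One of the listed indices, the standard deviation of heartbeat (RR) intervals, can be sketched directly; the frequency-band indices would additionally require a spectral estimate and are omitted. The RR values below are illustrative, and the population standard deviation is used for simplicity.

```python
# SDNN sketch: standard deviation of successive RR intervals (milliseconds),
# one electrocardiographic index named in the text.

def sdnn(rr_ms):
    mean = sum(rr_ms) / len(rr_ms)
    return (sum((r - mean) ** 2 for r in rr_ms) / len(rr_ms)) ** 0.5

rr = [812, 790, 805, 841, 779]  # illustrative RR intervals
print(round(sdnn(rr), 1))  # -> 21.2
```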
- As the value indicating the electrodermal response, the amount of perspiration, the response amplitude, the frequency of appearance, the response time, the recovery time, or the like is preferable, since these values are strongly correlated with emotions.
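As a sketch of how such values could be derived from a raw skin-conductance trace, the code below counts excursions above a baseline-plus-threshold (appearance frequency) and tracks their peak amplitude. The baseline, threshold, and trace are illustrative assumptions; real electrodermal processing is considerably more involved.

```python
# Toy extraction of two electrodermal-response values: how many responses
# appear above a threshold, and the largest response amplitude over baseline.

def scr_features(conductance, baseline, threshold=0.05):
    peaks = []
    above = False
    for v in conductance:
        amp = v - baseline
        if amp > threshold and not above:
            above = True
            peaks.append(amp)                # a new response starts
        elif amp > threshold:
            peaks[-1] = max(peaks[-1], amp)  # still in the same response
        else:
            above = False
    return {"count": len(peaks), "max_amplitude": max(peaks) if peaks else 0.0}

trace = [2.00, 2.01, 2.10, 2.20, 2.05, 2.00, 2.12, 2.03]  # microsiemens
print(scr_features(trace, baseline=2.00)["count"])  # -> 2
```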
- The emotion estimation process by the emotion estimation apparatus 100 in Embodiment 4 is the same as the emotion estimation process according to Embodiments 1 to 3 described with reference to FIGS. 3 to 5, except for the following points.
- The preparation biometric information acquisition unit 2 acquires first biometric information including the face temperature of each of the plurality of subjects.
- The preparation biometric information acquisition unit 2 uses the face temperature of each subject to calculate first biometric information including the difference between each subject's face temperature and that subject's face temperature at rest, or the temperature difference between each subject's cheek and forehead.
- The relationship construction unit 4 constructs related information by associating the first biometric information calculated by the preparation biometric information acquisition unit 2 with the emotion information corresponding to that first biometric information.
- The estimation biometric information acquisition unit 5 acquires second biometric information including the face temperature of one of the plurality of subjects.
- The estimation biometric information acquisition unit 5 uses the face temperature of that subject to calculate second biometric information including the difference between the subject's face temperature and the subject's face temperature at rest, or the temperature difference between the subject's cheek and forehead.
- The estimation unit 6 extracts the emotion information associated with the first biometric information that is equal to the second biometric information calculated by the estimation biometric information acquisition unit 5, and estimates the emotion of the subject from the extracted emotion information.
- The difference from the resting face temperature and the temperature difference between the cheek and forehead are strongly correlated with emotions. Therefore, using the difference or the temperature difference as the first biometric information and the second biometric information improves the emotion estimation accuracy of the emotion estimation device 100.
- The preparation biometric information acquisition unit 2 in Embodiment 4 acquires first biometric information including electrocardiographic data for each of the plurality of subjects.
- The preparation biometric information acquisition unit 2 uses the electrocardiographic data of each subject to calculate first biometric information including the value of an electrocardiographic index for each subject.
- The relationship construction unit 4 constructs related information by associating the first biometric information calculated by the preparation biometric information acquisition unit 2 with the emotion information corresponding to that first biometric information.
- The estimation biometric information acquisition unit 5 acquires second biometric information including the electrocardiographic data of one of the plurality of subjects.
- The estimation biometric information acquisition unit 5 uses the electrocardiographic data of that subject to calculate second biometric information including the value of the electrocardiographic index for the subject.
- The estimation unit 6 extracts the emotion information associated with the first biometric information that is equal to the second biometric information calculated by the estimation biometric information acquisition unit 5, and estimates the emotion of the subject from the extracted emotion information.
- Electrocardiographic indices are strongly correlated with emotions. Therefore, using an electrocardiographic index as the first biometric information and the second biometric information improves the estimation accuracy of the emotion estimation device 100.
- The preparation biometric information acquisition unit 2 in Embodiment 4 acquires first biometric information including electrodermal response data for each of the plurality of subjects.
- The preparation biometric information acquisition unit 2 uses the electrodermal response data of each subject to calculate first biometric information including a value indicating the electrodermal response of each subject.
- The relationship construction unit 4 constructs related information by associating the first biometric information calculated by the preparation biometric information acquisition unit 2 with the emotion information corresponding to that first biometric information.
- The estimation biometric information acquisition unit 5 acquires second biometric information including the electrodermal response data of one of the plurality of subjects.
- The estimation biometric information acquisition unit 5 uses the electrodermal response data of that subject to calculate second biometric information including a value indicating the subject's electrodermal response.
- The estimation unit 6 extracts the emotion information associated with the first biometric information that is equal to the second biometric information calculated by the estimation biometric information acquisition unit 5, and estimates the emotion of the subject from the extracted emotion information. Electrodermal responses are strongly correlated with emotions. Therefore, using values indicating the electrodermal response as the first biometric information and the second biometric information improves the estimation accuracy of the emotion estimation device 100.
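Combining such per-subject feature values with the identification information of the earlier embodiments gives the following lookup sketch. The subject identifiers, stored feature values, and emotion labels are illustrative assumptions, not data from the patent.

```python
# Per-subject related information: identification info maps to stored
# (first-biometric value, emotion) pairs; estimation picks the closest match.

related = {
    "subject-01": [(0.10, "relaxed"), (0.45, "stressed")],
    "subject-02": [(0.20, "relaxed"), (0.60, "stressed")],
}

def estimate_for(subject_id, second_bio):
    _, emotion = min(related[subject_id], key=lambda e: abs(e[0] - second_bio))
    return emotion

print(estimate_for("subject-01", 0.40))  # -> stressed
```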
- The preparation biometric information acquisition unit 2 and the estimation biometric information acquisition unit 5 each have a wearable sensor. This allows the preparation biometric information acquisition unit 2 to routinely acquire the first biometric information from each of the plurality of subjects, so the related information shows in detail the correlation between physical state and emotion in each subject's daily life.
- The estimation biometric information acquisition unit 5 can likewise acquire the second biometric information during the daily life of any of the subjects. As a result, the emotion estimation device 100 can accurately estimate the subject's everyday emotions.
- 1 storage unit, 2 preparation biometric information acquisition unit, 3 preparation emotion information acquisition unit, 4 relationship construction unit, 5 estimation biometric information acquisition unit, 6 estimation unit, 7 output unit, 11 processor, 12 memory, 13 storage device, 14 first input interface circuit, 15 sensor, 16 second input interface circuit, 17 input device, 18 output interface circuit, 19 bus, 100 emotion estimation device.
Description
The emotion estimation device 100 according to Embodiment 1 estimates the emotion of a subject in a specific indoor space, such as a room in an office building, based on the subject's biometric information described later. In the following, this specific indoor space may be referred to as the estimation space. FIG. 1 is a diagram illustrating functional blocks of the emotion estimation device according to Embodiment 1. The emotion estimation device 100 includes a storage unit 1, a preparation biometric information acquisition unit 2, a preparation emotion information acquisition unit 3, a relationship construction unit 4, an estimation biometric information acquisition unit 5, an estimation unit 6, and an output unit 7.
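The functional blocks listed above can be sketched structurally as follows. The class and method names are illustrative paraphrases, not taken from the patent, and a single scalar feature stands in for the first and second biometric information.

```python
# Structural sketch of the emotion estimation device: related information is
# built from (first biometric, emotion) pairs, and estimation returns the
# emotion tied to the stored value closest to the second biometric value.

class EmotionEstimationDevice:
    def __init__(self):
        self.related_info = []  # storage unit: (first_bio, emotion) pairs

    def build_relation(self, first_bio, emotion):
        # relationship construction unit
        self.related_info.append((first_bio, emotion))

    def estimate(self, second_bio):
        # estimation unit
        _, emotion = min(self.related_info,
                         key=lambda rec: abs(rec[0] - second_bio))
        return emotion

dev = EmotionEstimationDevice()
dev.build_relation(0.2, "comfortable")
dev.build_relation(0.8, "uncomfortable")
print(dev.estimate(0.75))  # -> uncomfortable
```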
The emotion estimation device 100 according to Embodiment 2 constructs, for each subject, related information that associates the first biometric information with the emotion information corresponding to that first biometric information. By using this related information, the emotion estimation device 100 estimates each subject's emotion with higher accuracy. Embodiment 2 is described below. Components in Embodiment 2 that are the same as those in Embodiment 1 are given the same reference numerals as in Embodiment 1. Descriptions of contents of Embodiment 2 that are the same as those of Embodiment 1 are omitted unless there are special circumstances.
The emotion estimation device 100 according to Embodiment 3 acquires emotion information based on the evaluation grid method and classifies the subjects by cluster analysis when constructing the information used for estimation. Embodiment 3 is described below. Components in Embodiment 3 that are the same as those in Embodiments 1 and 2 are given the same reference numerals as in Embodiments 1 and 2. Descriptions of contents of Embodiment 3 that are the same as those of Embodiments 1 and 2 are omitted unless there are special circumstances.
The emotion estimation device 100 according to Embodiment 4 estimates the subject's emotion with even higher accuracy. Embodiment 4 is described below. Components in Embodiment 4 that are the same as those in Embodiments 1 to 3 are given the same reference numerals as in Embodiments 1 to 3. Descriptions of contents of Embodiment 4 that are the same as those of Embodiments 1 to 3 are omitted unless there are special circumstances.
Claims (19)
- An emotion estimation device that estimates the emotions of a plurality of subjects present in an estimation space, the device comprising: a preparation biometric information acquisition unit that acquires and analyzes first biometric information indicating the physical state of each of the plurality of subjects present in a preparation space; a preparation emotion information acquisition unit that acquires emotion information indicating the emotion of each of the plurality of subjects present in the preparation space; a relationship construction unit that constructs related information in which the first biometric information and the emotion information corresponding to the first biometric information are associated with each other; an estimation biometric information acquisition unit that acquires and analyzes second biometric information indicating the physical state of any one of the plurality of subjects, the one subject being present in the estimation space; and an estimation unit that extracts the emotion information associated, in the related information, with the first biometric information equal to the second biometric information, and estimates, from the extracted emotion information, the emotion of the one subject present in the estimation space.
- The emotion estimation device according to claim 1, wherein the relationship construction unit constructs the related information using the emotion information that the preparation emotion information acquisition unit acquired from each of the plurality of subjects, and the first biometric information that the preparation biometric information acquisition unit acquired from each of the plurality of subjects within a predetermined time before or after the time point at which the preparation emotion information acquisition unit acquired the emotion information from that subject.
- The emotion estimation device according to claim 1 or 2, wherein the preparation emotion information acquisition unit acquires, from each of the plurality of subjects present in the preparation space, at each of a plurality of time points within a predetermined period, the emotion information indicating the emotion of each subject at that time point, and the relationship construction unit constructs the related information using the first biometric information that the preparation biometric information acquisition unit acquired from each of the plurality of subjects at each of the plurality of time points and the emotion information that the preparation emotion information acquisition unit acquired from each of the plurality of subjects at each of the plurality of time points.
- The emotion estimation device according to claim 3, wherein the period is two days or longer, and the preparation emotion information acquisition unit acquires the emotion information from each of the plurality of subjects at one or more predetermined times, or one or more random times, on each day.
- The emotion estimation device according to claim 4, wherein the period is five days or longer.
- The emotion estimation device according to any one of claims 1 to 5, further comprising an output unit that communicates with an external device, wherein the estimation unit controls the output unit to transmit the extracted emotion information to the external device.
- The emotion estimation device according to any one of claims 1 to 5, wherein the preparation biometric information acquisition unit and the preparation emotion information acquisition unit acquire identification information for identifying each of the plurality of subjects; the relationship construction unit constructs the related information in which the identification information of each of the plurality of subjects, the first biometric information indicating the physical state of each of the plurality of subjects, and the emotion information that indicates the emotion of each of the plurality of subjects and corresponds to the first biometric information indicating the physical state of each of the plurality of subjects are associated; and the estimation unit extracts the emotion information associated, in the related information, with the identification information of the one subject and with the first biometric information equal to the second biometric information, and estimates the emotion of the one subject from the extracted emotion information.
- The emotion estimation device according to any one of claims 1 to 5, wherein the preparation emotion information acquisition unit acquires, based on the evaluation grid method, from each of the plurality of subjects, as the emotion information, a comfort level indicating the degree of comfort of each of the plurality of subjects, and acquires factor information indicating the factors of the comfort level of each of the plurality of subjects.
- The emotion estimation device according to claim 8, wherein the preparation emotion information acquisition unit acquires the comfort level based on a scoring method and acquires the factor information based on a free-description method.
- The emotion estimation device according to claim 8 or 9, wherein the relationship construction unit performs cluster analysis using the factor information from each of the plurality of subjects, classifies each of the plurality of subjects into one of a plurality of types, and constructs the related information in which type information indicating the type, first biometric information indicating the physical state of one or more subjects classified into the type among the plurality of subjects, and emotion information that indicates the emotions of the one or more subjects and corresponds to the first biometric information indicating the physical state of the one or more subjects are associated; and the estimation unit extracts, in the related information, the emotion information and the type information associated with the first biometric information equal to the second biometric information indicating the physical state of the one subject, estimates the emotion of the one subject from the extracted emotion information, and estimates the type of the one subject from the extracted type information.
- The emotion estimation device according to claim 8 or 9, wherein the preparation biometric information acquisition unit and the preparation emotion information acquisition unit acquire identification information for identifying each of the plurality of subjects; the relationship construction unit performs cluster analysis using the factor information from each of the plurality of subjects, classifies each of the plurality of subjects into one of a plurality of types, and constructs the related information in which the identification information of each of the plurality of subjects, type information indicating the type of each of the plurality of subjects, the first biometric information indicating the physical state of each of the plurality of subjects, and the emotion information that indicates the emotion of each of the plurality of subjects and corresponds to the first biometric information indicating the physical state of each of the plurality of subjects are associated; and the estimation unit extracts, in the related information, the emotion information and the type information associated with the identification information of the one subject and with the first biometric information equal to the second biometric information, estimates the emotion of the one subject from the extracted emotion information, and estimates the type of the one subject from the extracted type information.
- The emotion estimation device according to claim 10 or 11, wherein the relationship construction unit performs the cluster analysis using, together with the factor information from each of the plurality of subjects, at least one of the first biometric information from each of the plurality of subjects and the comfort level from each of the plurality of subjects, and classifies each of the plurality of subjects into one of the plurality of types.
- The emotion estimation device according to claim 11 or 12, further comprising an output unit that communicates with an external device, wherein the estimation unit controls the output unit to transmit the extracted emotion information and type information to the external device.
- The emotion estimation device according to any one of claims 1 to 13, wherein the preparation biometric information acquisition unit acquires the first biometric information including the face temperature of each of the plurality of subjects, and uses the face temperature of each of the plurality of subjects to calculate the first biometric information including the difference between the face temperature of each subject and the face temperature of that subject at rest, or the temperature difference between the cheek and forehead of each of the plurality of subjects; the relationship construction unit constructs the related information in which the first biometric information calculated by the preparation biometric information acquisition unit and the emotion information corresponding to the first biometric information calculated by the preparation biometric information acquisition unit are associated; the estimation biometric information acquisition unit acquires the second biometric information including the face temperature of the one subject, and uses the face temperature of the one subject to calculate the second biometric information including the difference between the face temperature of the one subject and the face temperature of the one subject at rest, or the temperature difference between the cheek and forehead of the one subject; and the estimation unit extracts the emotion information associated with the first biometric information equal to the second biometric information calculated by the estimation biometric information acquisition unit, and estimates the emotion of the one subject from the extracted emotion information.
- The emotion estimation device according to claim 14, wherein the preparation biometric information acquisition unit and the estimation biometric information acquisition unit each have an infrared sensor, and the infrared sensor detects the face temperature.
- The emotion estimation device according to any one of claims 1 to 13, wherein the preparation biometric information acquisition unit acquires the first biometric information including electrocardiographic data for each of the plurality of subjects, and uses the electrocardiographic data of each of the plurality of subjects to calculate the first biometric information including the value of an electrocardiographic index for each of the plurality of subjects; the relationship construction unit constructs the related information in which the first biometric information calculated by the preparation biometric information acquisition unit and the emotion information corresponding to the first biometric information calculated by the preparation biometric information acquisition unit are associated; the estimation biometric information acquisition unit acquires the second biometric information including the electrocardiographic data of the one subject, and uses the electrocardiographic data of the one subject to calculate the second biometric information including the value of the electrocardiographic index for the one subject; and the estimation unit extracts the emotion information associated with the first biometric information equal to the second biometric information calculated by the estimation biometric information acquisition unit, and estimates the emotion of the one subject from the extracted emotion information.
- The emotion estimation device according to any one of claims 1 to 13, wherein the preparation biometric information acquisition unit acquires the first biometric information including electrodermal response data for each of the plurality of subjects, and uses the electrodermal response data of each of the plurality of subjects to calculate the first biometric information including a value indicating the electrodermal response of each of the plurality of subjects; the relationship construction unit constructs the related information in which the first biometric information calculated by the preparation biometric information acquisition unit and the emotion information corresponding to the first biometric information calculated by the preparation biometric information acquisition unit are associated; the estimation biometric information acquisition unit acquires the second biometric information including the electrodermal response data of the one subject, and uses the electrodermal response data of the one subject to calculate the second biometric information including a value indicating the electrodermal response of the one subject; and the estimation unit extracts the emotion information associated with the first biometric information equal to the second biometric information calculated by the estimation biometric information acquisition unit, and estimates the emotion of the one subject from the extracted emotion information.
- The emotion estimation device according to claim 16 or 17, wherein the preparation biometric information acquisition unit and the estimation biometric information acquisition unit each have a wearable sensor.
- An emotion estimation method executed by an emotion estimation device that estimates the emotions of a plurality of subjects present in an estimation space, the method comprising: a preparation biometric information acquisition step of acquiring and analyzing first biometric information indicating the physical state of each of the plurality of subjects present in a preparation space; a preparation emotion information acquisition step of acquiring emotion information indicating the emotion of each of the plurality of subjects present in the preparation space; a relationship construction step of constructing related information in which the first biometric information and the emotion information corresponding to the first biometric information are associated with each other; an estimation biometric information acquisition step of acquiring and analyzing second biometric information indicating the physical state of any one of the plurality of subjects, the one subject being present in the estimation space; and an estimation step of extracting the emotion information associated, in the related information, with the first biometric information equal to the second biometric information, and estimating, from the extracted emotion information, the emotion of the one subject present in the estimation space.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/278,003 US12097031B2 (en) | 2021-03-15 | 2021-03-15 | Emotion estimation apparatus and emotion estimation method |
PCT/JP2021/010340 WO2022195661A1 (ja) | 2021-03-15 | 2021-03-15 | 感情推定装置および感情推定方法 |
CN202180095436.1A CN116963667B (zh) | 2021-03-15 | 2021-03-15 | 情绪推测装置以及情绪推测方法 |
JP2021550162A JP6976498B1 (ja) | 2021-03-15 | 2021-03-15 | 感情推定装置および感情推定方法 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/010340 WO2022195661A1 (ja) | 2021-03-15 | 2021-03-15 | 感情推定装置および感情推定方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022195661A1 true WO2022195661A1 (ja) | 2022-09-22 |
Family
ID=78815494
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/010340 WO2022195661A1 (ja) | 2021-03-15 | 2021-03-15 | 感情推定装置および感情推定方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US12097031B2 (ja) |
JP (1) | JP6976498B1 (ja) |
CN (1) | CN116963667B (ja) |
WO (1) | WO2022195661A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018043061A1 (ja) * | 2016-09-01 | 2018-03-08 | 株式会社ワコム | 座標入力処理装置、感情推定装置、感情推定システム及び感情推定用データベースの構築装置 |
JP2018187044A (ja) * | 2017-05-02 | 2018-11-29 | 国立大学法人東京工業大学 | 感情推定装置、感情推定方法およびコンピュータプログラム |
JP2019126657A (ja) * | 2018-01-26 | 2019-08-01 | 富士ゼロックス株式会社 | 検出装置、及び検出プログラム |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101840644B1 (ko) * | 2011-05-31 | 2018-03-22 | 한국전자통신연구원 | 감성인지 기반 보디가드 시스템, 감성인지 디바이스, 영상 및 센서 제어 장치, 신변 보호 관리 장치 및 그 제어 방법 |
JP6096857B1 (ja) | 2015-10-15 | 2017-03-15 | ダイキン工業株式会社 | 感情判定装置 |
US10709338B2 (en) | 2015-10-15 | 2020-07-14 | Daikin Industries, Ltd. | Useful information presentation device |
US20200060597A1 (en) * | 2016-12-14 | 2020-02-27 | Mitsubishi Electric Corporation | State estimation device |
JP6839818B2 (ja) * | 2017-05-17 | 2021-03-10 | パナソニックIpマネジメント株式会社 | コンテンツ提供方法、コンテンツ提供装置及びコンテンツ提供プログラム |
CN107679067A (zh) * | 2017-08-04 | 2018-02-09 | 平安科技(深圳)有限公司 | 信息推荐方法及移动终端 |
JP6906197B2 (ja) * | 2017-09-29 | 2021-07-21 | パナソニックIpマネジメント株式会社 | 情報処理方法、情報処理装置及び情報処理プログラム |
CN108095740B (zh) * | 2017-12-20 | 2021-06-22 | 姜涵予 | 一种用户情绪评估方法和装置 |
CN108154398B (zh) * | 2017-12-27 | 2021-01-12 | Oppo广东移动通信有限公司 | 信息显示方法、装置、终端及存储介质 |
EP4000520A1 (en) * | 2020-11-12 | 2022-05-25 | Koninklijke Philips N.V. | Method and system for sensor signals dependent dialog generation during a medical imaging process |
- 2021
- 2021-03-15 CN CN202180095436.1A patent/CN116963667B/zh active Active
- 2021-03-15 US US18/278,003 patent/US12097031B2/en active Active
- 2021-03-15 JP JP2021550162A patent/JP6976498B1/ja active Active
- 2021-03-15 WO PCT/JP2021/010340 patent/WO2022195661A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
US12097031B2 (en) | 2024-09-24 |
CN116963667B (zh) | 2025-01-28 |
US20240032835A1 (en) | 2024-02-01 |
JPWO2022195661A1 (ja) | 2022-09-22 |
CN116963667A (zh) | 2023-10-27 |
JP6976498B1 (ja) | 2021-12-08 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2021550162 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21931417 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18278003 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202180095436.1 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21931417 Country of ref document: EP Kind code of ref document: A1 |