WO2025017760A1 - Pulse wave estimation device and pulse wave estimation method - Google Patents
Pulse wave estimation device and pulse wave estimation method
- Publication number
- WO2025017760A1 (application PCT/JP2023/025975)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pulse wave
- unit
- measurement area
- subject
- measurement
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
Description
- This disclosure relates to a pulse wave estimation device and a pulse wave estimation method.
- The present disclosure has been made to solve the above-mentioned problems, and aims to provide a pulse wave estimation device that prevents a decrease in the accuracy of estimating a person's pulse wave when, for example, the skin area of the person from which the pulse wave is to be estimated is not captured in the captured image, or a so-called shadow falls on that skin area.
- The pulse wave estimation device includes: an image acquisition unit that acquires an image of a person frame by frame; a skin area detection unit that detects the person's skin area from the image; a measurement area setting unit that sets, in an area on the image corresponding to the skin area, measurement areas that can be used to extract a pulse wave source signal, i.e., a signal that indicates a luminance change and contains a pulse wave component of the person; a face direction estimation unit that estimates the person's face direction frame by frame based on the image; a luminance signal extraction unit that sets, from the measurement areas set by the measurement area setting unit, a usage measurement area to be used to extract the pulse wave source signal, and extracts a time-series pulse wave source signal based on the luminance change in the set usage measurement area; a luminance signal selection unit that, taking into account the face direction of the person estimated by the face direction estimation unit, selects a time-series estimated pulse wave source signal to be used for estimating the person's pulse wave from the time-series pulse wave source signals extracted by the luminance signal extraction unit during a pulse wave estimation target period; and a pulse wave estimation unit that estimates the person's pulse wave based on the selected estimated pulse wave source signal.
- According to the present disclosure, it is possible to prevent a decrease in the accuracy of estimating a person's pulse wave caused by, for example, the skin area of the person from which the pulse wave is to be estimated not being captured in the captured image, or by a shadow falling on that skin area.
- FIG. 1 is a diagram illustrating an example of the configuration of a pulse wave estimation device according to a first embodiment.
- FIG. 2A and 2B are diagrams for explaining an example of the contents of reference measurement area information in the first embodiment.
- FIG. 3A and 3B are diagrams for explaining an example of the relationship between the position and orientation of the imaging device and the position and orientation of an assumed subject in the first embodiment.
- FIG. 4A and 4B are diagrams for explaining an example of a set region determined by the measurement region setting unit in the first embodiment.
- FIG. 5A and 5B are diagrams for explaining another example of the relationship between the position and orientation of the imaging device and the position and orientation of an assumed subject in the first embodiment.
- FIG. 6A and 6B are diagrams for explaining another example of the set region determined by the measurement region setting unit in the first embodiment.
- FIG. 7A, 7B, and 7C are diagrams for explaining an example of a method for setting a measurement region by the measurement region setting unit in the pulse wave estimation device according to embodiment 1.
- FIG. 8 is a diagram showing an example of a histogram indicating the distribution ratio of the face direction in the yaw direction of a subject corresponding to each frame of a captured image during a pulse wave estimation period, calculated by a luminance signal selection unit in the first embodiment.
- FIG. 9 is a flowchart for explaining the operation of the pulse wave estimation device according to the first embodiment.
- FIG. 10 is a flowchart for explaining details of the process of step ST5 in FIG. 9.
- FIG. 11 is a flowchart for explaining details of the process of step ST6 in FIG. 9.
- FIG. 18 is a diagram for explaining an example of a weighting coefficient for each measurement region that is set by the weight setting unit based on the face direction of the subject estimated by the face direction estimation unit on a frame-by-frame basis in the third embodiment.
- FIG. 19 is a flowchart for explaining the operation of a pulse wave estimation device according to the third embodiment.
- FIG. 20 is a flowchart for explaining details of step ST5a in FIG. 19.
- FIG. 21 is a flowchart for explaining details of step ST6b in FIG. 19.
- the skin region may be a region other than the subject's face.
- the skin region may be a region corresponding to a part belonging to the face, such as the subject's eyes, eyebrows, nose, mouth, forehead, cheeks, or chin.
- the skin region may also be a region corresponding to a body part other than the face, such as the subject's head, shoulders, hands, neck, or feet.
- the skin region may be a plurality of regions.
- the pulse wave estimation device 1 estimates the subject's pulse wave from a series of frames Im(k-Tp+1) to Im(k) for each specific number of frames Tp, and outputs a pulse wave estimation result P(t) which is information indicating the estimated pulse wave (hereinafter referred to as "pulse wave information"). Specifically, the pulse wave estimation device 1 estimates the subject's pulse wave from a luminance signal based on the luminance change of the subject's skin area in the series of frames Im(k-Tp+1) to Im(k).
- the series of frames Im(k-Tp+1) to Im(k) are assumed to be captured images acquired by the imaging device 2 during a preset period (hereinafter referred to as the "pulse wave estimation target period").
- the pulse wave estimation target period is set, for example, by an administrator or the like as the period during which one pulse wave estimation is performed.
- t indicates the output number assigned to each specific number of frames Tp.
- the pulse wave estimation result given at the timing following pulse wave estimation result P(t) is pulse wave estimation result P(t+1).
- Frame number k and output number t are integers equal to or greater than 1.
- Frame number Tp is an integer equal to or greater than 2.
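- The frame and output numbering above can be sketched as a sliding-window loop. This is a minimal illustration only: `estimate_pulse_wave` is a hypothetical placeholder (here simply the mean), not the estimation method of this disclosure.

```python
# Minimal sketch of the windowed numbering: every Tp frames, one
# pulse wave estimation result P(t) is produced from the series of
# frames Im(k-Tp+1) to Im(k).

def estimate_pulse_wave(window):
    # Placeholder for the real estimator; returns one value per window.
    return sum(window) / len(window)

def run_estimation(frames, tp):
    """frames: per-frame luminance samples Im(1)..Im(n); tp: window length Tp (>= 2)."""
    results = {}  # output number t -> pulse wave estimation result P(t)
    t = 1
    for k in range(tp, len(frames) + 1, tp):
        window = frames[k - tp:k]  # frames Im(k-Tp+1) to Im(k)
        results[t] = estimate_pulse_wave(window)
        t += 1
    return results

P = run_estimation([1.0, 2.0, 3.0, 4.0, 5.0, 6.0], tp=3)
# P(1) is computed from frames 1..3, P(2) from frames 4..6.
```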
- an event may occur in which the subject's skin area is not sufficiently secured on the captured image captured by the imaging device 2.
- An event in which the subject's skin area is not sufficiently secured on the captured image may include, for example, an event in which the subject's skin area is not captured on the captured image captured by the imaging device 2, or a so-called shadow drop occurs on the skin area.
- For example, if the imaging device 2 captures an image of the subject from the front and the subject's face is turned 90 degrees to the left or right from the optical axis of the imaging device 2, the right or left side of the subject's face may not be captured or may be shadowed.
- Pulse wave estimation device 1 considers the subject's facial orientation to select a luminance signal extracted from a region from which it is assumed that a luminance signal that sufficiently contains the subject's pulse wave component can be extracted as the luminance signal to be used to estimate the subject's pulse wave. Pulse wave estimation device 1 then estimates the subject's pulse wave from the luminance signal selected in consideration of the subject's facial orientation.
- the number of subjects included in the captured image may be one or more.
- the number of subjects included in the captured image will be described as one.
- the pulse wave estimation result P(t) is output from the pulse wave estimation device 1 to, for example, an arousal level estimation device that estimates the arousal level of the person, or an abnormality detection device that detects an abnormality in the person's physical condition. Note that the arousal level estimation device and the abnormality detection device are not shown in FIG. 1.
- the arousal level estimation device estimates a decrease in the arousal level of the subject based on the pulse wave estimation result P(t) output from the pulse wave estimation device 1. For example, the arousal level estimation device estimates that the arousal level of the subject is decreasing when the pulse rate of the subject tends to decrease slowly.
- When the arousal level estimation device estimates that the arousal level of the subject is decreasing, it warns, for example, the subject or people around the subject of the decrease in the arousal level of the subject.
- the abnormality detection device detects an abnormality in the subject's physical condition based on the pulse wave estimation result P(t) output from the pulse wave estimation device 1.
- the abnormality in the subject's physical condition may be, for example, epilepsy or heart disease. For example, if the subject's pulse wave is rising rapidly, the abnormality detection device detects that the subject is in an abnormal physical condition. If the abnormality detection device detects an abnormality in the subject's physical condition, it warns, for example, the subject or people around the subject that the subject is in an abnormal physical condition.
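- The downstream logic described above (a slowly decreasing pulse rate suggesting lowered arousal, a rapid rise suggesting a possible abnormality) might be sketched as follows. The least-squares slope and the threshold values are illustrative assumptions, not values from this disclosure.

```python
# Sketch of trend-based classification of a pulse rate series.
# slow_drop and rapid_rise thresholds are assumed example values.

def slope(values):
    """Least-squares slope of equally spaced samples."""
    n = len(values)
    mx = (n - 1) / 2
    my = sum(values) / n
    num = sum((x - mx) * (y - my) for x, y in enumerate(values))
    den = sum((x - mx) ** 2 for x in range(n))
    return num / den

def classify(pulse_rates, slow_drop=-0.5, rapid_rise=5.0):
    s = slope(pulse_rates)
    if s <= slow_drop:
        return "arousal decreasing"   # slow downward trend
    if s >= rapid_rise:
        return "possible abnormality" # rapid upward trend
    return "normal"
```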
- the imaging device 2 includes an imaging section (not shown) and an illumination section (not shown).
- the illumination section is configured, for example, with an LED (Light Emitting Diode).
- the illumination section irradiates light onto an imaging range of the imaging section.
- the imaging section images the imaging range irradiated with light by the illumination section emitting light.
- the imaging device 2 may include one illumination section or multiple illumination sections.
- the image capturing device 2 is installed so as to be able to capture an image of the subject's skin area. That is, in this embodiment, the image capturing device 2 is installed so as to be able to capture an image of the driver's skin area.
- The pulse wave estimation device 1, the imaging device 2, the arousal level estimation device (not shown), and the abnormality detection device (not shown) are assumed to be mounted on a vehicle (not shown), and the subject is the driver of the vehicle. In other words, the pulse wave estimation device 1 estimates the pulse wave of the vehicle driver.
- the pulse wave estimation device 1 includes an image acquisition unit 11, a skin area detection unit 12, a face direction estimation unit 13, a measurement area setting unit 14, a luminance signal extraction unit 15, a luminance signal selection unit 16, a pulse wave estimation unit 17, and an output unit 18.
- the captured image acquisition unit 11 acquires a captured image of the subject.
- the captured image acquisition unit 11 acquires a captured image of the driver of the vehicle captured by the imaging device 2.
- the captured image acquisition unit 11 outputs the acquired captured image to the skin area detection unit 12 and the face direction estimation unit 13 .
- the skin area detection unit 12 detects the skin area of the subject from a frame Im(k) included in the captured image acquired by the captured image acquisition unit 11.
- the skin area detection unit 12 may detect the skin area using a known means.
- the skin area detection unit 12 may detect the skin area using a cascade type face detector using Haar-like features.
- the skin area detection unit 12 generates skin area information S(k) indicating the detected skin area.
- the skin region information S(k) may include information indicating whether a skin region has been detected and information indicating the position and size of the detected skin region on the captured image.
- the skin region is represented by a rectangular region on the captured image, and the skin region information S(k) includes information indicating the position and size of the rectangular region on the captured image.
- the skin region information S(k) indicates, for example, whether the subject's face is detected, the center coordinates Fc (Fcx, Fcy) of the rectangle surrounding the subject's face on the captured image, and the width Fcw and height Fch of the rectangle.
- the presence or absence of detection of the subject's face is represented, for example, by "1" if it is detected, and "0" if it is not detected.
- the center coordinates of the rectangle surrounding the face are expressed in the coordinate system of the frame Im(k).
- the skin region detection unit 12 outputs the generated skin region information S(k) to the measurement region setting unit 14 .
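- As a rough sketch, the skin region information S(k) described above could be assembled as follows. The detection step itself (for example, an OpenCV Haar cascade via `cv2.CascadeClassifier`) is stubbed out as a plain rectangle, and the field names are assumptions for illustration.

```python
# Sketch of building skin region information S(k) from a face detection
# result: a detection flag, the center coordinates Fc (Fcx, Fcy) of the
# rectangle surrounding the face, and the rectangle's width Fcw and
# height Fch, all in the coordinate system of frame Im(k).

def make_skin_area_info(rect):
    """rect: top-left-based (x, y, w, h) rectangle on the frame, or None."""
    if rect is None:
        return {"detected": 0}  # "0" when no face is detected
    x, y, w, h = rect
    return {
        "detected": 1,                 # "1" when a face is detected
        "Fc": (x + w / 2, y + h / 2),  # center coordinates Fc (Fcx, Fcy)
        "Fcw": w,                      # width of the rectangle
        "Fch": h,                      # height of the rectangle
    }

S_k = make_skin_area_info((100, 50, 80, 80))
```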
- the face direction estimating section 13 estimates the face direction of the subject on a frame-by-frame basis based on the frame Im(k) of the captured image acquired by the captured image acquiring section 11 .
- the face direction of the subject estimated by the face direction estimation unit 13 is an angle calculated based on the front position of the subject, i.e., the driver in this case, as a reference (0 degrees), regardless of the installation position of the imaging device 2.
- "front” does not necessarily mean strictly front, but includes approximately front.
- the face direction of the subject estimated by the face direction estimation unit 13 based on the front of the subject is also referred to as "subject-based face direction.”
- the face direction estimating unit 13 can estimate the face direction of the subject in three directions: yaw [deg], pitch [deg], and roll [deg].
- the face direction estimation unit 13 may estimate the face direction of the subject, i.e., the subject-based face direction, using a method that uses a publicly known, trained model (hereinafter referred to as a "machine learning model”) such as Hope-Net.
- the face direction estimating unit 13 generates information (hereinafter referred to as "face direction information”) F(k) relating to the estimated face direction of the subject (face direction based on the subject).
- the face direction information F(k) is information in which an estimated subject-reference face direction is associated with a frame Im(k) of a captured image from which the face direction is estimated.
- the face direction estimating portion 13 outputs the generated face direction information F(k) to the luminance signal extracting portion 15 .
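- A minimal sketch of the face direction information F(k): the estimated subject-based face direction (yaw, pitch, roll in degrees) associated with the frame it was estimated from. The head-pose model itself (e.g. a trained model such as Hope-Net) is stubbed out, and the field names are assumptions.

```python
# Sketch of pairing an estimated subject-based face direction with the
# frame number of the captured image it was estimated from.

def estimate_face_direction(frame):
    # Placeholder for a trained head-pose model; returns degrees,
    # with 0 degrees meaning the subject faces front.
    return {"yaw": 0.0, "pitch": 0.0, "roll": 0.0}

def make_face_direction_info(k, frame):
    """Build F(k) for frame number k."""
    direction = estimate_face_direction(frame)
    return {"frame": k, **direction}

F_k = make_face_direction_info(1, frame=None)
```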
- Based on frame Im(k) of the captured image acquired by the image acquisition unit 11 and the skin area information S(k) output by the skin area detection unit 12, the measurement area setting unit 14 sets, in an image area on frame Im(k) corresponding to the skin area indicated by the skin area information S(k), a plurality of measurement areas that can be used to extract a luminance signal indicating a change in luminance, the luminance signal including a pulse wave component of the subject.
- the luminance signal that indicates the luminance change and is extracted from a plurality of measurement regions is also referred to as a "pulse wave source signal.”
- Which area of the image area corresponding to the skin area is to be set as the measurement area is determined based on the range of the facial direction of the subject for which the pulse wave is to be estimated (hereinafter referred to as the "target facial direction range") and the expected positional relationship between the imaging device 2 and the subject.
- the target facial direction range is appropriately determined by an administrator or the like.
- the administrator or the like determines the target facial direction range depending on the purpose of the pulse wave estimation device 1, etc.
- the administrator or the like stores information indicating the target facial direction range in a location that can be referenced by the measurement area setting unit 14, such as a memory unit not shown.
- an administrator or the like generates information (hereinafter referred to as "reference measurement area information") that defines multiple measurement areas from which a pulse wave source signal containing the subject's pulse wave component can be extracted when the imaging device 2 images the subject at a reference position (hereinafter referred to as the "reference position") and orientation (hereinafter referred to as the "reference orientation”), and stores the information in a location that can be referenced by the pulse wave estimation device 1, such as a memory unit.
- the reference measurement area information is information in a table format in which the subject's facial orientation in the captured image corresponds to information indicating the measurement area when the imaging device 2 captures the subject at a reference position and reference orientation.
- the information indicating the measurement area is, for example, a number that can identify the measurement area (hereinafter referred to as a "measurement area number").
- a measurement area number is assigned to each measurement area.
- the subject is assumed to be a subject of standard build facing forward in a standard assumed position (hereinafter referred to as "assumed subject").
- the assumed subject is, for example, a driver of standard build who sits facing forward in a driver's seat set in a standard position with a good posture.
- The facial orientation of the subject (assumed subject) in the captured image is an angle calculated with the front of the imaging device 2 as the reference (0 degrees).
- The facial orientation of the subject (assumed subject) relative to the front of the imaging device 2, as defined in the reference measurement area information, is also referred to as the "camera-based facial orientation."
- Information indicating the positional relationship between the actually installed imaging device 2 and the assumed subject is hereinafter referred to as "layout information."
- The administrator or the like generates the layout information taking into account the position of the assumed subject, and stores it in the storage unit or the like.
- the information indicating the target face direction range, the reference measurement area information, and the layout information are generated in advance by an administrator or the like and stored in a storage unit or the like.
- the measurement area setting unit 14 sets a plurality of measurement areas based on the layout information and information indicating the target face direction range, and on the plurality of measurement areas defined in the reference measurement area information. Specifically, the measurement area setting unit 14 corrects the camera-based face orientation defined in the reference measurement area information to a subject-based face orientation based on the layout information, and sets multiple measurement areas based on the multiple measurement areas defined in the reference measurement area information based on the target face orientation range.
- the measurement area setting unit 14 calculates the difference between the position and orientation of the image capture device 2 and the reference position and orientation based on the layout information, and offsets the facial orientation (camera-based facial orientation) of the assumed subject in the reference measurement area information based on the calculated difference, so that the facial orientation (camera-based facial orientation) of the assumed subject in the frame Im(k) of the captured image captured at the reference position and reference orientation of the image capture device 2 assumed in the reference measurement area information matches the facial orientation (subject-based facial orientation) of the assumed subject in the frame Im(k) of the captured image captured at the current position and orientation of the image capture device 2. Then, the measurement area setting unit 14 determines the multiple measurement areas to be set from the multiple measurement areas defined in the reference measurement area information based on the target facial orientation range.
- the measurement area setting unit 14 determines the multiple measurement areas to be set based on the layout information, the information indicating the target face direction range, and the reference measurement area information. Note that in the following specific example, for the sake of convenience, it is assumed that the subject changes the face direction only in the yaw direction.
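- The offset step described above can be sketched as shifting the camera-based face orientations that key the reference measurement area information by the yaw difference between the actual and reference camera layouts, so that subsequent lookups can use the subject-based face orientation. The table contents and the offset value below are illustrative assumptions.

```python
# Sketch of correcting camera-based face orientations to subject-based
# ones: every yaw key in the reference measurement area information is
# shifted by the layout-derived yaw offset [deg].

def offset_reference_table(reference_table, yaw_offset_deg):
    """reference_table: {camera-based yaw [deg] -> measurement area numbers}."""
    return {yaw + yaw_offset_deg: areas
            for yaw, areas in reference_table.items()}

reference = {-30: [13, 14], 0: [1, 2, 13, 14], 30: [1, 2]}
# Camera mounted 10 degrees off the reference orientation (assumed value):
subject_based = offset_reference_table(reference, yaw_offset_deg=10)
```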
- Fig. 2 is a diagram for explaining an example of the contents of the reference measurement area information in embodiment 1.
- Fig. 2 is a diagram for explaining an example of the contents of the reference measurement area information in a case where the face direction of the subject in the yaw direction from "-30 degrees to +30 degrees" is associated with information indicating the measurement area.
- the face direction of the subject in the yaw direction is expressed as "0 degrees" when the subject's face is facing forward, and the more the face is turned to the left from the front, the more negative the angle, and the more the face is turned to the right from the front, the more positive the angle.
- the reference measurement area information is set with information indicating all 24 measurement areas with measurement area numbers (1) to (24).
- the measurement area number is associated with the subject's face direction (camera-based face direction), but information indicating the face direction is omitted in Fig. 2A.
- FIG. 2A shows all 24 measurement regions set in the reference measurement region information so that their positional relationships on the captured image can be seen.
- The measurement areas with measurement area numbers (1) to (12) indicate measurement areas set in the skin area corresponding to the left cheek. Also, of the 24 measurement areas set in the reference measurement area information shown in Fig. 2A, the measurement areas with measurement area numbers (13) to (24) indicate measurement areas set in the skin area corresponding to the right cheek.
- information indicating each measurement area is provided with information capable of identifying which area of the skin area the measurement area is set in.
- information indicating the four vertices of the measurement area is set as information capable of identifying which area of the skin area the measurement area is set in.
- the measurement area is assumed to be a quadrilateral.
- the information indicating the four vertices is represented, for example, by landmarks of facial organs such as the outer corner and inner corner of the eye, the nose and the mouth, or auxiliary landmarks.
- Information indicating the four vertices of a measurement area is set, for example, as information identifying a particular facial organ landmark, or as the index of an auxiliary landmark located on the line segment between two specified facial organ landmarks. Details of the facial organ landmarks and auxiliary landmarks will be described later.
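- How a vertex specified relative to landmarks might be resolved to image coordinates can be sketched as below: an auxiliary landmark is an evenly spaced point on the line segment between two facial organ landmarks. The landmark coordinates and the subdivision count are illustrative assumptions.

```python
# Sketch of resolving an auxiliary landmark: the index-th of the evenly
# spaced points on the segment between two facial organ landmarks.

def auxiliary_landmark(p0, p1, index, divisions):
    """index-th of (divisions - 1) interior points between p0 and p1."""
    t = index / divisions
    return (p0[0] + (p1[0] - p0[0]) * t,
            p0[1] + (p1[1] - p0[1]) * t)

outer_eye_corner = (120.0, 80.0)  # assumed example landmark positions
mouth_corner = (140.0, 160.0)
# 1st of 3 auxiliary points on the segment between the two landmarks:
vertex = auxiliary_landmark(outer_eye_corner, mouth_corner, index=1, divisions=4)
```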
- Figure 2B is a diagram showing frames (indicated by "I” in Figure 2B) of the image of the intended subject captured by the imaging device 2 corresponding to the measurement area (indicated by "M” in Figure 2B) determined by an administrator, etc., for every 10 degrees of the intended subject's facial direction in the yaw direction.
- The administrator or the like sets the measurement area at intervals of 10 degrees in the yaw direction of the assumed subject's face. Note that this is merely an example, and the administrator or the like can set the measurement area in appropriate units, such as in units of 1 degree. In Fig. 2B, the possible range of the subject's facial orientation in the yaw direction is "-30 degrees to +30 degrees." This is determined by an administrator or the like based on the resolution of the imaging device 2, etc.
- the range in which the imaging device 2 can image the intended subject changes depending on the facial orientation of the intended subject.
- the size of the measurement area changes, the amount of pulse wave components included in the signal indicating the luminance change extracted from the measurement area also changes.
- the administrator or the like sets in advance a measurement area from which a luminance signal containing a pulse wave component sufficient to estimate the pulse wave of the intended subject is extracted when the imaging device 2 images the intended subject from a reference position and in a reference orientation, according to the facial orientation of the intended subject (the facial orientation based on the camera).
- the reference position of the imaging device 2 is specifically a position in real space where the horizontal and vertical coordinates indicating the position of the imaging device 2 are the same as the horizontal and vertical coordinates indicating the position of the assumed subject.
- the position of the subject is indicated by the center position of the subject's face.
- the real space is represented by three-dimensional coordinate axes, with the x-axis being an axis parallel to the vehicle width direction of the vehicle, the y-axis being an axis parallel to the vehicle height direction of the vehicle, and the z-axis being an axis parallel to the vehicle length direction of the vehicle, in other words, the traveling direction of the vehicle.
- parallel is not limited to being strictly parallel, but includes being approximately parallel. That is, the x-coordinate and the y-coordinate of the coordinates indicating the reference position of the imaging device 2 are the same as the x-coordinate and the y-coordinate of the coordinates indicating the position of the assumed subject, respectively.
- the measurement area is an area that can be used to extract a pulse wave source signal that indicates a change in luminance.
- the administrator or the like decides that when the imaging device 2 images the intended subject at the reference position and reference orientation, if the subject's facial orientation in the yaw direction is "-30 degrees," then measurement areas numbered (13) to (24) will be set.
- the administrator or the like determines the measurement area based on the size of the measurement area, for example, whether the size of the measurement area is equal to or larger than a predetermined threshold.
- this is merely one example.
- the administrator or the like may determine the measurement area based on the brightness of the measurement area, for example, whether the brightness of the measurement area is equal to or larger than a predetermined threshold.
- the administrator or the like may determine the measurement area from which it is assumed that a pulse wave original signal including the subject's pulse wave component can be extracted, depending on the subject's facial orientation in the yaw direction.
- For other face directions of the assumed subject, the administrator or the like likewise determines measurement areas of sufficient size as the measurement areas for those face directions, in the same manner as when the assumed subject's face direction in the yaw direction is "-30 degrees." As a result, the administrator or the like sets a total of 24 measurement areas, excluding overlaps, as described with reference to FIG. 2A, for the assumed subject's face direction in the yaw direction of "-30 degrees to +30 degrees."
- the administrator or the like when the administrator or the like sets the measurement area according to the facial orientation in the yaw direction of the intended subject, the administrator or the like generates reference measurement area information and stores it in a storage unit or the like.
- For example, the administrator or the like generates reference measurement area information in which measurement area numbers (13) to (24) are associated with a subject facial orientation in the yaw direction of "-30 degrees"; measurement area numbers (1), (7), (14) to (18), and (20) to (24) with "less than -20 degrees and greater than -30 degrees"; measurement area numbers (1) to (2), (7) to (8), (15) to (18), and (21) to (24) with "less than -10 degrees and greater than -20 degrees"; and measurement area numbers (1) to (3), (7) to (9), (16) to (18), and (22) to (24) with "greater than -10 degrees and less than +10 degrees."
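- The yaw-range-to-area-number mapping just described might be looked up as follows. The handling of the range boundaries, and the completion of the last (truncated) row of area numbers, are assumptions for illustration.

```python
# Sketch of looking up measurement area numbers in the reference
# measurement area information by the yaw-direction face orientation.
# Boundary handling (open vs. closed intervals) is an assumed choice.

def areas_for_yaw(yaw):
    """Measurement area numbers for a yaw-direction face orientation [deg]."""
    if yaw == -30:
        return list(range(13, 25))                      # (13) to (24)
    if -30 < yaw < -20:
        return [1, 7, *range(14, 19), *range(20, 25)]   # (1), (7), (14)-(18), (20)-(24)
    if -20 <= yaw < -10:
        return [1, 2, 7, 8, *range(15, 19), *range(21, 25)]
    if -10 <= yaw < 10:
        return [1, 2, 3, 7, 8, 9, *range(16, 19), *range(22, 25)]
    return []  # outside the target face direction range
```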
- the measurement area setting unit 14 determines the measurement area to be set based on the layout information, information indicating the target face direction range, and reference measurement area information, and then sets the determined multiple measurement areas in the image area on frame Im(k) corresponding to the skin area indicated by the skin area information S(k) based on the reference measurement area information, frame Im(k) of the captured image acquired by the captured image acquisition unit 11, and skin area information S(k) output by the skin area detection unit 12.
- the measurement area setting unit 14 sets the measurement area corresponding to the target face direction range in the reference measurement area information as the measurement area to be set in the image area on frame Im(k) corresponding to the skin area indicated by the skin area information S(k).
- the measurement area setting unit 14 offsets the yaw direction facial orientation (camera-based facial orientation) of the intended subject defined in the reference measurement area information based on the layout information, in other words, aligns the yaw direction facial orientation (subject-based facial orientation) of the intended subject when imaged at the current position and orientation of the imaging device 2 relative to the intended subject with the yaw direction facial orientation (camera-based facial orientation) of the intended subject imaged at the reference position and reference angle of the imaging device 2, and then determines the measurement area to be set.
- the measurement area setting unit 14 determines the measurement area by offsetting the face direction of the assumed subject defined in the reference measurement area information. It is assumed that the contents of the reference measurement area information are as described with reference to FIG. 2B.
- the relationship between the position and orientation of the imaging device 2 and the position and orientation of the assumed subject is as shown in Fig. 3.
- the assumed subject is a subject facing forward.
- "R" indicates the imaging range of the imaging device 2.
- the vertical coordinate indicating the position of the imaging device 2 is the same as the vertical coordinate indicating the position of the assumed subject (shown as "D" in FIG. 3A)
- the horizontal coordinate indicating the position of the imaging device 2 is the same as the horizontal coordinate indicating the position of the assumed subject (shown as "D" in FIG. 3B).
- the x coordinate and the y coordinate indicating the position of the imaging device 2 are the same as the x coordinate and the y coordinate indicating the position of the assumed subject.
- the optical axis of the imaging device 2 and the straight line indicating the front direction of the assumed subject overlap. There is no difference in the relationship between the front direction of the assumed subject and the direction of the optical axis of the imaging device 2. More specifically, the yaw angle formed by the optical axis of the imaging device 2 and the straight line parallel to the straight line indicating the front direction of the assumed subject is 0 degrees.
- the current position and orientation of the imaging device 2 with respect to the assumed subject are the reference position and reference orientation. Also, for example, it is assumed that the target face direction range is currently determined to be "-20 degrees to +20 degrees.”
- the measurement area setting unit 14 determines, based on the layout information, that there is no deviation of the position and orientation of the imaging device 2 from the reference position and reference orientation.
- the measurement area setting unit 14 determines that there is no need to offset the face orientation of the intended subject (face orientation based on the camera) defined in the reference measurement area information.
- the measurement area setting unit 14 determines, in the reference measurement area information, the measurement area indicated by the measurement area number associated with the face direction in the yaw direction of "-20 degrees to +20 degrees" as the measurement area to be set.
- the measurement area corresponding to "-20 degrees to +20 degrees" is determined to be the measurement area to be set (see Figures 4A and 4B. Information regarding measurement areas that have not been determined is shown shaded).
- the measurement area setting unit 14 determines all 20 measurement areas, namely, measurement area numbers (1) to (5), (7) to (11), (14) to (18), and (20) to (24), as the measurement areas to be set in the image area corresponding to the skin area.
- in Fig. 5, "R" indicates the imaging range of the imaging device 2.
- the vertical coordinate indicating the position of the imaging device 2 is the same as the vertical coordinate indicating the position of the assumed subject (shown as "D” in Fig. 5A), but as shown in Fig. 5B, in the real space, the horizontal coordinate indicating the position of the imaging device 2 is different from the horizontal coordinate indicating the position of the assumed subject (shown as "D” in Fig. 5B).
- the imaging device 2 captures an image of the assumed subject facing forward from a position at a yaw angle of "+20 degrees" to the right of the assumed subject, and the angle between the optical axis of the imaging device 2 and a line parallel to the line indicating the assumed subject's forward direction is "+20 degrees" in relation to the forward direction of the assumed subject.
- in other words, this is equivalent to the imaging device 2 capturing an image of the assumed subject from the front with the assumed subject facing right by a yaw angle of 20 degrees.
- the administrator or the like has now determined the target face direction range to be "-10 degrees to +10 degrees.”
- the measurement area setting unit 14 determines that the position and orientation of the imaging device 2 are displaced from the reference position and reference orientation based on the layout information.
- the measurement area setting unit 14 needs to offset the facial orientation of the assumed subject defined in the reference measurement area information (the facial orientation based on the camera) so that the positional relationship with the assumed subject when the imaging device 2 images the assumed subject from the front coincides with the positional relationship assumed in the reference measurement area information.
- the measurement area setting unit 14 offsets the facial orientation in the yaw direction of the assumed subject defined in the reference measurement area information by "+20 degrees".
- the facial orientation in the yaw direction of the assumed subject defined in the reference measurement area information can be regarded as "-10 degrees to +50 degrees” when aligned with the position and orientation of the actual imaging device 2.
- the facial orientation based on the subject can be regarded as "-10 degrees to +50 degrees”.
- the measurement area setting unit 14 determines the measurement area corresponding to the target face direction range "-10 degrees to +10 degrees” as the measurement area to be set.
- the measurement area corresponding to the face direction in the yaw direction "-30 degrees to -10 degrees" in the reference measurement area information is determined as the measurement area to be set (see Figs. 6A and 6B.
- the measurement area setting unit 14 determines, based on the reference measurement area information, a total of 16 measurement areas, that is, measurement area numbers (1) to (2), (7) to (8), and (13) to (24), as the measurement areas to be set in the image area corresponding to the skin area.
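For illustration only (this sketch is not part of the embodiment; it encodes only the negative-side yaw bins spelled out for FIG. 2B, and names such as `areas_to_set` are assumptions), the offset-and-look-up procedure described above can be sketched as:

```python
# Reference measurement area information: camera-based yaw bins (degrees)
# -> measurement area numbers. Only the negative-side bins described in the
# text are encoded; the "-30 degrees" row and the (-30, -20) row are merged
# into one bin, and the positive side (which would mirror these) is omitted.
REFERENCE_AREAS = [
    ((-30, -20), {1, 7} | set(range(13, 25))),
    ((-20, -10), {1, 2, 7, 8} | set(range(15, 19)) | set(range(21, 25))),
    ((-10, 10),  {1, 2, 3, 7, 8, 9} | set(range(16, 19)) | set(range(22, 25))),
]

def areas_to_set(target_range, camera_yaw_offset, reference=REFERENCE_AREAS):
    """Convert the subject-based target face direction range into camera-based
    yaw by subtracting the camera's yaw deviation from the reference
    orientation, then collect every measurement area whose reference bin
    overlaps the converted range."""
    lo = target_range[0] - camera_yaw_offset
    hi = target_range[1] - camera_yaw_offset
    selected = set()
    for (bin_lo, bin_hi), areas in reference:
        # Strict overlap: a shared endpoint alone does not select a bin.
        if lo < bin_hi and hi > bin_lo:
            selected |= areas
    return sorted(selected)

# Fig. 5 example: camera displaced by +20 degrees, target range -10..+10
# degrees -> camera-based -30..-10 degrees -> 16 measurement areas.
print(areas_to_set((-10, 10), 20))
```

With a camera yaw offset of 0 degrees, the same function reduces to the direct lookup of the Fig. 3 case, where no offsetting is needed.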
- in the above description, it is assumed that the subject changes his or her facial orientation only in the yaw direction.
- the reference measurement area information is information in which the facial orientation in the yaw direction is associated with information indicating the measurement area, but the reference measurement area information is also generated for the pitch and roll directions in the same manner as for the yaw direction.
- the measurement area setting unit 14 determines the measurement area to be set by offsetting the facial orientation of the assumed subject defined in the reference measurement area information (the facial orientation based on the camera) based on the deviation in the pitch or roll direction from the reference position and reference orientation of the position and orientation of the imaging device 2 with respect to the assumed subject, in the same manner as for the yaw direction described above.
- the measurement area setting unit 14 determines the multiple measurement areas to be set based on the multiple measurement areas defined in the reference measurement area information based on the layout information and information indicating the target face direction range, and then, based on the frame Im(k) of the captured image acquired by the captured image acquisition unit 11 and the skin area information S(k) output by the skin area detection unit 12, sets the multiple measurement areas determined based on the layout information in the image area on frame Im(k) corresponding to the skin area indicated by the skin area information S(k).
- the measurement region setting unit 14 may acquire the captured image acquired by the captured image acquisition unit 11 via the skin region detection unit 12 .
- FIGS. 7A, 7B, and 7C are diagrams for explaining an example of a method for setting a measurement region by the measurement region setting unit 14 in the pulse wave estimation device 1 according to embodiment 1.
- an example of how the measurement region setting unit 14 sets the measurement regions ri(k) will now be described.
- the measurement region setting unit 14 detects Ln (a positive integer) landmarks of facial organs, such as the inner and outer corners of the eyes, the nose, and the mouth, in the skin region sr indicated by the skin region information S(k).
- the landmarks are indicated by circles.
- the measurement region setting unit 14 sets a vector storing the coordinate values of the detected landmarks as L(k).
- the measurement region setting unit 14 may detect face parts using a known method such as a model called a Constrained Local Model (CLM).
- the measurement area setting unit 14 sets the vertex coordinates of the quadrangle of the measurement area ri(k) based on the detected landmark. For example, the measurement area setting unit 14 sets the vertex coordinates of a quadrangle as shown in FIG. 7C, and sets Rn measurement areas ri(k).
- the measurement area setting unit 14 selects a landmark LA1 on the face outline and a landmark LA2 on the nose.
- the measurement area setting unit 14 first selects the landmark LA2 on the nose, and then selects the landmark LA1 on the face outline that is closest to the landmark LA2 on the nose. Then, the measurement area setting unit 14 sets auxiliary landmarks a1, a2, and a3 so as to divide the line segment between the landmark LA1 and the landmark LA2 into four equal parts.
- the measurement area setting unit 14 selects a landmark LB1 on the facial contour and a landmark LB2 on the nose.
- the measurement area setting unit 14 also sets auxiliary landmarks b1, b2, and b3 so as to divide the line segment between the landmarks LB1 and LB2 into four equal parts.
- the landmarks LB1 and LB2 may be selected, for example, from the landmarks on the facial contour or the nose adjacent to the landmarks LA1 and LA2, respectively.
- the measurement area setting unit 14 sets a quadrilateral area surrounded by the auxiliary landmarks a1, b1, b2, and a2 as one measurement area R1.
- the coordinates of the auxiliary landmarks a1, b1, b2, and a2 are set as the vertex coordinates of the measurement area R1.
- similarly, the measurement area setting unit 14 sets one measurement area R2 surrounded by the auxiliary landmarks a2, b2, b3, and a3, and sets the coordinates of these auxiliary landmarks as the vertex coordinates of the measurement area R2.
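As an illustrative sketch only (the landmark coordinate values below are hypothetical, not from the embodiment), the auxiliary landmarks and the vertex coordinates of the measurement areas R1 and R2 can be computed as follows:

```python
def quarter_points(p, q):
    """Auxiliary landmarks dividing the segment from p to q into four equal parts."""
    return [(p[0] + (q[0] - p[0]) * t / 4.0,
             p[1] + (q[1] - p[1]) * t / 4.0) for t in (1, 2, 3)]

# Hypothetical landmark coordinates (x, y) in pixels: LA1/LB1 on the face
# contour, LA2/LB2 on the nose.
LA1, LA2 = (100.0, 200.0), (180.0, 200.0)
LB1, LB2 = (100.0, 240.0), (180.0, 240.0)

a1, a2, a3 = quarter_points(LA1, LA2)
b1, b2, b3 = quarter_points(LB1, LB2)

# Quadrilateral vertex coordinates of the two measurement areas.
R1 = (a1, b1, b2, a2)
R2 = (a2, b2, b3, a3)
```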
- the measurement area setting unit 14 determines the measurement area ri(k) based on the layout information and information indicating the target face direction range, and based on multiple measurement areas defined in the reference measurement area information, and sets the determined measurement area ri(k).
- the measurement area setting unit 14 sets the measurement area ri(k) determined based on the layout information, the information indicating the target face direction range, and the reference measurement area information, in the manner described with reference to Fig. 7.
- the measurement area setting unit 14 can specify which points are the landmarks or auxiliary landmarks that are the vertices of the measurement area ri(k) to be set.
- the measurement area setting unit 14 can similarly set the measurement area and the vertex coordinates of the measurement area ri(k) for, for example, other portions of the cheek and the skin area sr in a portion corresponding to the chin.
- the measurement area setting unit 14 can also set the measurement area ri(k) in a portion of the subject's skin area sr that corresponds to the forehead or the tip of the nose.
- the measurement area setting unit 14 may set the measurement area ri(k) using a method other than CLM.
- the measurement area setting unit 14 may set the measurement area ri(k) using a tracking technique such as the Kanade-Lucas-Tomasi (KLT) tracker.
- the measurement area setting unit 14 may detect the coordinates of face organ points by CLM for the skin area of the first frame Im(1) of the series of frames Im(k-Tp+1) to Im(k), and may track the face organ points by the KLT tracker from the skin area of the next frame Im(2) onwards, and calculate the face organ points for the skin area of each frame Im(k).
- the measurement area setting unit 14 may execute CLM once every few frames and perform reset processing such as resetting the coordinate positions of the face organ points.
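A minimal sketch of this scheduling (the detector and tracker are stand-in callables, and the reset interval of 30 frames is an assumed value, not taken from the embodiment):

```python
def track_landmarks(frames, detect_clm, track_klt, reset_interval=30):
    """Run the (comparatively costly) CLM detection on the first frame and
    again every `reset_interval` frames as reset processing; in between,
    propagate the facial organ points with a KLT-style tracker."""
    landmarks = None
    history = []
    for k, frame in enumerate(frames):
        if k % reset_interval == 0:
            landmarks = detect_clm(frame)            # full re-detection (reset)
        else:
            landmarks = track_klt(frame, landmarks)  # frame-to-frame tracking
        history.append(landmarks)
    return history

# Toy stand-ins: "detection" records only the current frame index,
# "tracking" appends it to the previous result.
hist = track_landmarks(list(range(5)),
                       detect_clm=lambda f: [f],
                       track_klt=lambda f, lm: lm + [f],
                       reset_interval=3)
```

In a real pipeline the stand-ins would be replaced by an actual CLM fitter and a KLT tracker such as OpenCV's pyramidal Lucas-Kanade optical flow.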
- when the measurement region setting unit 14 sets a plurality of measurement regions ri(k), it generates measurement region information R(k) indicating the set measurement regions ri(k).
- the measurement region ri(k) is a quadrangle, and the position and size of the measurement region ri(k) are defined as the coordinate values of the four vertices of the quadrangle on the captured image.
- the measurement region setting unit 14 outputs the generated measurement region information R(k) to the luminance signal extraction unit 15 .
- the luminance signal extraction unit 15 sets, from among the measurement areas ri(k) set by the measurement area setting unit 14, the measurement areas ri(k) to be used for extracting a pulse wave source signal (hereinafter referred to as the "used measurement areas").
- the luminance signal extraction unit 15 extracts a pulse wave source signal indicating a luminance change during a pulse wave estimation target period, in other words, a period corresponding to the number of frames Tp, from each of the set use measurement areas ri(k) among the multiple measurement areas ri(k) on the frame Im(k) indicated by the measurement area information R(k).
- the pulse wave source signal is a signal that is the source of the pulse wave.
- the pulse wave estimation device 1 estimates the subject's pulse wave using the pulse wave source signal.
- the subject's pulse wave is estimated by the pulse wave estimation unit 17.
- the pulse wave estimation unit 17 will be described in detail later.
- the luminance signal extraction unit 15 may acquire the captured image acquired by the captured image acquisition unit 11 via the skin region detection unit 12 and the measurement region setting unit 14 .
- the luminance signal extraction unit 15 sets the used measurement area ri(k) based on, for example, the face direction information F(k) output from the face direction estimation unit 13 and the used measurement area setting information.
- the used measurement area setting information is information in which the face direction of the subject is associated with the measurement area number. For example, an administrator or the like generates the used measurement area setting information in advance and stores it in a location such as a storage unit that can be referenced by the pulse wave estimation device 1.
- the luminance signal extraction unit 15 compares a frame Im(k) of the captured image acquired by the captured image acquisition unit 11 with a frame Im(k) of the captured image included in the face direction information F(k) to determine the face direction of the subject corresponding to the frame Im(k), more specifically, the face direction of the subject estimated by the face direction estimation unit 13 based on the frame Im(k).
- the luminance signal extraction unit 15 may determine the face direction of the subject by comparing the identification numbers of the frames Im(k) of the captured image. An identification number is assigned to the frame Im(k) of the captured image.
- the imaging device 2 assigns an identification number to each frame Im(k) and outputs the captured image.
- the luminance signal extraction unit 15 refers to the information for setting the used measurement area, and sets the measurement area ri(k) having the measurement area number corresponding to the determined face direction as the used measurement area ri(k).
- the luminance signal extraction unit 15 extracts a pulse wave source signal during a pulse wave estimation target period from each of the set usage measurement regions ri(k) on the frame Im(k) of the captured image acquired by the captured image acquisition unit 11 .
- for example, assume that the face direction of the subject in the yaw direction indicated by the face direction information F(k) including frame Im(k) of the captured image is "-10 degrees."
- also assume that, in the used measurement area setting information, the face direction of the subject in the yaw direction of "-10 degrees" is associated with measurement area numbers (1) to (2), (7) to (8), (15) to (18), and (21) to (24).
- the luminance signal extraction unit 15 sets the measurement areas ri(k) with measurement area numbers (1) to (2), (7) to (8), (15) to (18), and (21) to (24) among the measurement areas ri(k) set in the frame Im(k) of the captured image acquired by the captured image acquisition unit 11 as the used measurement areas ri(k).
- the luminance signal extraction unit 15 extracts a pulse wave source signal indicating a luminance change during the pulse wave estimation target period from each of the used measurement areas ri(k) having measurement area numbers (1) to (2), (7) to (8), (15) to (18), and (21) to (24). After extracting the pulse wave source signal, the luminance signal extraction unit 15 generates pulse wave source signal information W(t) indicating the extracted pulse wave source signal.
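As a sketch of this lookup (the mapping below encodes only the "-10 degrees" entry from the example above; names such as `used_measurement_areas` are assumptions):

```python
# Used measurement area setting information: subject's yaw face direction
# (degrees) -> measurement area numbers. Only the "-10 degrees" entry from
# the example in the text is encoded here.
USED_AREA_SETTING = {
    -10: [1, 2, 7, 8, 15, 16, 17, 18, 21, 22, 23, 24],
}

def used_measurement_areas(estimated_yaw, areas_set_on_frame):
    """Pick, from the measurement areas actually set on frame Im(k), those
    whose numbers the setting information associates with the estimated
    face direction."""
    return [a for a in USED_AREA_SETTING.get(estimated_yaw, [])
            if a in areas_set_on_frame]

# All 24 areas set on the frame; face direction estimated as -10 degrees.
used = used_measurement_areas(-10, set(range(1, 25)))
```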
- the pulse wave original signal information W(t) includes information indicative of the pulse wave original signal wi(t) extracted from the used measurement region ri(k).
- the pulse wave source signal wi(t) is time-series data for the past Tp frames and is extracted, for example, based on the frames Im(k-Tp+1), Im(k-Tp+2), ..., Im(k) for the past Tp frames and the measurement region information R(k-Tp+1), R(k-Tp+2), ..., R(k).
- the luminance feature amount is a value calculated based on the luminance value on the frame Im(j) of the captured image for each used measurement area ri(j).
- the luminance feature amount is the average or variance of the luminance values of the pixels included in the used measurement area ri(j). In the first embodiment, as an example, the luminance feature amount is the average of the luminance values of the pixels included in the used measurement area ri(j).
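To make the luminance feature concrete (the pixel values are hypothetical; frames are plain 2-D lists of luminance values, and the area interior is given as a pixel list rather than as quadrilateral vertices):

```python
def mean_luminance(frame, area_pixels):
    """Luminance feature of one used measurement area on one frame: the
    average of the luminance values of the pixels inside the area
    (the choice made in embodiment 1)."""
    values = [frame[y][x] for (x, y) in area_pixels]
    return sum(values) / len(values)

def pulse_wave_source_signal(frames, area_pixels):
    """wi(t): the time series of the luminance feature over the past Tp frames."""
    return [mean_luminance(f, area_pixels) for f in frames]

# Two tiny 2x2 "frames" and an area covering the top row of pixels.
frames = [[[10, 20], [30, 40]],
          [[20, 30], [40, 50]]]
signal = pulse_wave_source_signal(frames, [(0, 0), (1, 0)])
```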
- the luminance signal extraction unit 15 generates pulse wave original signal information W(t) indicating the pulse wave original signal wi(t) in each used measurement region ri(k).
- the pulse wave original signal information W(t) includes the pulse wave original signal wi(t) in each used measurement region ri(k), information indicating which used measurement region ri(k) the pulse wave original signal wi(t) was extracted from, each frame Im(k) of the captured image including the used measurement region ri(k) from which the pulse wave original signal wi(t) was extracted, and an identification number of each frame Im(k) of the captured image in the time series.
- the luminance signal extracting section 15 outputs the generated pulse wave source signal information W(t) to the luminance signal selecting section 16 .
- the luminance signal selection unit 16 selects a time-series pulse wave source signal wi(t) to be used for estimating the subject's pulse wave (hereinafter referred to as the "estimated pulse wave source signal") wi(t) from the time-series pulse wave source signals wi(t) extracted by the luminance signal extraction unit 15 during the pulse wave estimation target period, based on the pulse wave source signal information W(t) output from the luminance signal extraction unit 15 and taking into consideration the subject's facial direction estimated by the facial direction estimation unit 13.
- the luminance signal selection unit 16 calculates the distribution ratio of the subject's facial direction corresponding to each frame Im(k) of the captured image estimated by the facial direction estimation unit 13 during the pulse wave estimation target period.
- the luminance signal selection unit 16 selects, as the time series estimation pulse wave source signal wi(t), a time series pulse wave source signal wi(t) extracted from a commonly-set usage measurement region ri(k) in each frame Im(k) of the captured image from which the subject's facial direction, the frequency of appearance of which is equal to or greater than a preset threshold (hereinafter referred to as the "facial direction determination threshold”), is estimated.
- the face direction estimation unit 13 stores the face direction information F(k) in a time series in a storage unit or the like.
- the captured image is provided with information on the capture date and time of the captured image.
- the luminance signal selection unit 16 can calculate the distribution ratio of the subject's face direction corresponding to each frame Im(k) of the captured image estimated by the face direction estimation unit 13 during the pulse wave estimation target period.
- FIG. 8 is a diagram showing an example of a histogram indicating the distribution ratio of the subject's facial direction corresponding to each frame Im(k) of the captured image, estimated by the facial direction estimation unit 13 during the pulse wave estimation target period, calculated by the luminance signal selection unit 16 in embodiment 1. Note that here, it is assumed that the subject changes his or her facial direction only in the yaw direction. In other words, the distribution ratio of the subject's facial direction shown in FIG. 8 is the distribution ratio of the subject's facial direction in the yaw direction.
- the luminance signal selection unit 16 selects, as the time series estimation pulse wave source signal wi(t), the time series pulse wave source signal wi(t) extracted from the commonly determined usage measurement region ri(k) in each frame Im(k) of the captured image from which the subject's facial orientation in the yaw direction is estimated to be "-5 degrees to +5 degrees.”
- the used measurement area ri(k) determined in common in each frame Im(k) of the captured image from which the subject's facial orientation in the yaw direction was estimated to be "-5 degrees to +5 degrees" means the measurement area ri(k) determined as the used measurement area ri(k) by the luminance signal extraction unit 15 in each frame Im(k) of the captured image from which the subject's facial orientation in the yaw direction was estimated to be "-5 degrees to +5 degrees."
- the luminance signal selection unit 16 for example, based on the above-mentioned information for setting the used measurement area, identifies the used measurement area ri(k) that is commonly determined for each frame Im(k) of the captured image from which the subject's facial orientation in the yaw direction is estimated to be "-5 degrees to +5 degrees.”
- the luminance signal selection unit 16 selects the time-series pulse wave source signal wi(t) extracted from the identified used measurement region ri(k) as the time-series estimated pulse wave source signal wi(t).
- the luminance signal selector 16 then outputs pulse wave source signal information W(t) including the selected estimated pulse wave source signal wi(t) (hereinafter referred to as “selected pulse wave source signal information”) to the pulse wave estimator 17 .
- the distribution ratio of the subject's face direction is determined in increments of 5 degrees, but this is merely an example.
- the distribution ratio of the subject's face direction may be determined in increments of 1 degree. Basically, it is sufficient that the resolution of the face direction defined in the reference measurement area information match the resolution of the distribution ratio of the subject's face direction.
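The distribution-ratio selection can be sketched as follows (the bin width, the threshold value, and the rounding-based binning are assumptions; the embodiment only requires that the bin resolution match the reference measurement area information):

```python
from collections import Counter

def face_direction_distribution(yaw_per_frame, bin_width=5):
    """Distribution ratio of the subject's yaw face direction over the pulse
    wave estimation target period, in bins of `bin_width` degrees."""
    n = len(yaw_per_frame)
    counts = Counter(int(round(y / bin_width)) * bin_width for y in yaw_per_frame)
    return {b: c / n for b, c in counts.items()}

def frames_to_use(yaw_per_frame, threshold, bin_width=5):
    """Indices of the frames whose face direction falls in a bin whose
    appearance frequency is at or above the face direction determination
    threshold; the pulse wave source signals from the measurement areas set
    in common across these frames become the estimated pulse wave source
    signals."""
    dist = face_direction_distribution(yaw_per_frame, bin_width)
    keep = {b for b, ratio in dist.items() if ratio >= threshold}
    return [k for k, y in enumerate(yaw_per_frame)
            if int(round(y / bin_width)) * bin_width in keep]

# Mostly frontal frames plus one glance toward +30 degrees.
selected = frames_to_use([0, 1, -1, 0, 2, 30], threshold=0.5)
```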
- the pulse wave estimating section 17 estimates the subject's pulse wave based on the selected pulse wave original signal information W(t) output from the luminance signal selecting section 16 .
- the pulse wave estimation unit 17 may estimate the subject's pulse wave using a known method for estimating the subject's pulse wave based on the pulse wave source signal.
- pulse wave estimation unit 17 first generates a signal (hereinafter referred to as a "separated signal") indicating a plurality of signal components (hereinafter referred to as "principal components”) based on estimated pulse wave source signals wi(t).
- pulse wave estimation unit 17 analyzes the plurality of principal components using a general signal separation technique such as principal component analysis (PCA) or independent component analysis (ICA), and generates a separated signal indicating the analyzed plurality of principal components.
- pulse wave estimation unit 17 separates components that appear to be pulse wave components and components that appear to be noise components from the plurality of estimated pulse wave source signals wi(t).
- the pulse wave estimation unit 17 reconstructs an estimated pulse wave original signal wi(t) for each used measurement region ri(k) based on the generated separated signals, more specifically, on separated signal information Sep(t) relating to the generated separated signals indicating the generated principal components.
- Each separated signal included in the separated signal information Sep(t) includes an estimated pulse wave source signal wi(t) for each used measurement region ri(k).
- Pulse wave estimation unit 17 can restore an estimated pulse wave source signal wi(t) for each used measurement region ri(k) from the multiple separated signals included in the separated signal information Sep(t).
- when the pulse wave estimation unit 17 restores the estimated pulse wave source signal wi(t) for each used measurement region ri(k), it generates restored estimated pulse wave source signal information RW(t) indicating the restored estimated pulse wave source signal wi(t) for each used measurement region ri(k).
- the restored estimated pulse wave original signal information RW(t) includes the restored estimated pulse wave original signal wi(t) for each used measurement region ri(k).
- the pulse wave estimator 17 estimates the subject's pulse wave based on the generated restored estimated pulse wave original signal information RW(t). For example, the pulse wave estimation unit 17 calculates the S/N ratio of the restored estimated pulse wave source signal for each used measurement region ri(k). The pulse wave estimation unit 17 weights the restored estimated pulse wave source signal for each used measurement region ri(k) based on the calculated S/N ratio, and then calculates composite estimated pulse wave signal information D(t) by adding up the restored estimated pulse wave source signals corresponding to each used measurement region ri(k). That is, the pulse wave estimation unit 17 calculates one composite estimated pulse wave signal information D(t) for all used measurement regions ri(k).
- the composite estimated pulse wave signal information D(t) is assumed to be a signal that resembles a pulse wave component with noise components removed.
- Pulse wave estimator 17 then performs a Fourier transform on composite estimated pulse wave signal information D(t) to calculate, as the pulse rate, a peak frequency in the frequency power spectrum within a predetermined frequency range.
- the predetermined frequency range is set taking into account the range of human heart rates.
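The weighting-and-peak-search step can be sketched in outline (the 0.7-3.0 Hz band, i.e. 42-180 beats per minute, is an assumed stand-in for "the range of human heart rates," and a plain DFT is used for clarity rather than an FFT):

```python
import cmath

def pulse_rate_from_signals(signals, snrs, fps, f_lo=0.7, f_hi=3.0):
    """Weight each restored estimated pulse wave source signal by its S/N
    ratio, sum them into one composite signal D(t), and return the peak
    frequency of the power spectrum inside [f_lo, f_hi] as beats per minute."""
    n = len(signals[0])
    total = sum(snrs)
    # Composite estimated pulse wave signal D(t): S/N-weighted sum.
    d = [sum(w * s[t] for w, s in zip(snrs, signals)) / total for t in range(n)]
    mean = sum(d) / n
    d = [v - mean for v in d]                 # remove the DC component
    best_f, best_p = None, -1.0
    for m in range(1, n // 2 + 1):            # DFT bins up to the Nyquist limit
        f = m * fps / n
        if f_lo <= f <= f_hi:
            coef = sum(d[t] * cmath.exp(-2j * cmath.pi * m * t / n)
                       for t in range(n))
            power = abs(coef) ** 2
            if power > best_p:
                best_f, best_p = f, power
    return best_f * 60.0                      # pulse rate in beats per minute
```

With, say, a 10-second window at 30 fps, a clean 1.2 Hz pulse component falls exactly on a DFT bin and is reported as 72 beats per minute.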
- Pulse wave estimating section 17 outputs pulse wave estimation result P(t), which is pulse wave information indicating the estimated pulse wave, to output section 18.
- the pulse wave information may be, for example, time series data of the subject's pulse wave estimated by pulse wave estimation unit 17, the subject's pulse rate, or the subject's pulse interval.
- the output unit 18 outputs the pulse wave estimation result P(t) output from the pulse wave estimating unit 17 to, for example, an arousal level estimation device or an abnormality detection device.
- the function of output unit 18 may be provided in pulse wave estimation unit 17. When the function of output unit 18 is provided in pulse wave estimation unit 17, output unit 18 is not an essential component of pulse wave estimation device 1.
- FIG. 9 is a flowchart for explaining the operation of pulse wave estimation device 1 according to the first embodiment. For example, when the power supply of the vehicle is turned on, pulse wave estimation device 1 repeats the process shown in the flowchart of FIG. 9 until the power supply of the vehicle is turned off.
- the captured image acquisition unit 11 acquires a captured image of a subject (step ST1).
- the captured image acquisition unit 11 outputs the acquired captured image to the skin area detection unit 12 .
- the skin region detection unit 12 detects the skin region of the subject from the frame Im(k) included in the captured image acquired by the captured image acquisition unit 11 in step ST1 (step ST2).
- the skin area detection unit 12 generates skin area information S(k) indicating the detected skin area.
- the skin region detection unit 12 outputs the generated skin region information S(k) to the measurement region setting unit 14 .
- the measurement area setting unit 14 determines, based on the layout information and information indicating the target face direction range, the multiple measurement areas ri(k) to be set from among the multiple measurement areas defined in the reference measurement area information, and sets the determined multiple measurement areas ri(k) in the image area on the frame Im(k) corresponding to the skin area indicated by the skin area information S(k), based on the frame Im(k) of the captured image acquired by the captured image acquisition unit 11 and the skin area information S(k) output by the skin area detection unit 12 (step ST3).
- when the measurement region setting unit 14 sets a plurality of measurement regions ri(k), it generates measurement region information R(k) indicating the set plurality of measurement regions ri(k).
- the measurement region setting unit 14 outputs the generated measurement region information R(k) to the luminance signal extraction unit 15 .
- the luminance signal extraction unit 15 extracts a time-series luminance signal indicating a change in luminance during the pulse wave estimation target period, in other words, a time-series pulse wave original signal wi(t), from each of the usage measurement areas ri(k) among the multiple measurement areas ri(k) on the frame Im(k) indicated by the measurement area information R(k), based on the frame Im(k) of the captured image acquired by the captured image acquisition unit 11 in step ST1 and the measurement area information R(k) output from the measurement area setting unit 14 in step ST3 (step ST5).
- the luminance signal extractor 15 generates pulse wave source signal information W(t) indicating the extracted time-series pulse wave source signal wi(t).
- the luminance signal extracting section 15 outputs the generated pulse wave source signal information W(t) to the luminance signal selecting section 16 .
- the luminance signal selection unit 16 selects a time-series luminance signal, in other words, a time-series estimated pulse wave source signal wi(t), based on the pulse wave source signal information W(t) output from the luminance signal extraction unit 15 in step ST5 and taking into consideration the face direction of the subject estimated by the face direction estimation unit 13 in step ST4 (step ST6).
- the luminance signal selection section 16 outputs the selected pulse wave source signal information W(t) to the pulse wave estimation section 17 .
- the pulse wave estimating section 17 estimates the subject's pulse wave based on the selected pulse wave original signal information W(t) output from the luminance signal selecting section 16 in step ST6 (step ST7).
- Pulse wave estimating section 17 outputs pulse wave estimation result P(t), which is pulse wave information indicating the estimated pulse wave, to output section 18.
- the output unit 18 outputs the pulse wave estimation result P(t) output from the pulse wave estimating unit 17 to, for example, an arousal level estimation device or an abnormality detection device.
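The description does not specify how the pulse wave estimating section 17 derives the pulse wave from the selected time-series luminance signal. A common approach in camera-based pulse measurement is to take the dominant frequency of the signal within the plausible heart-rate band; the following is a minimal sketch under that assumption (the band limits 0.75–4 Hz and the plain-DFT approach are illustrative, not from the description):

```python
import math

def estimate_pulse_rate(signal, fs, f_lo=0.75, f_hi=4.0):
    """Return the dominant frequency (Hz) of `signal` within the
    plausible heart-rate band, via a plain DFT over that band.

    signal -- list of luminance samples (the selected w_i(t))
    fs     -- sampling rate in frames per second
    """
    n = len(signal)
    mean = sum(signal) / n
    centered = [s - mean for s in signal]   # remove the DC component
    best_f, best_p = 0.0, -1.0
    for k in range(1, n // 2 + 1):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            re = sum(centered[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(centered[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            p = re * re + im * im
            if p > best_p:
                best_f, best_p = f, p
    return best_f
```

Multiplying the result by 60 would give an estimated pulse rate in beats per minute.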
- step ST4 is performed after the processing of step ST3, but this is merely one example.
- the order of the processing of step ST4 and the processing of step ST3 may be reversed, or the processing of steps ST2 to ST3 and the processing of step ST4 may be performed in parallel.
- the processing of step ST4 may be performed after the processing of step ST1 and before the processing of step ST5 is performed.
- FIG. 10 is a flowchart for explaining the details of the process of step ST5 in FIG. 9.
- the luminance signal extraction unit 15 compares frame Im(k) of the captured image acquired by the captured image acquisition unit 11 in step ST1 of FIG. 9 with frame Im(k) of the captured image included in the facial direction information F(k) output in step ST4 of FIG. 9, and determines the facial direction of the subject corresponding to that frame Im(k). Then, the luminance signal extraction unit 15 refers to the information for setting the used measurement area, and sets the measurement area ri(k) having the measurement area number corresponding to the determined face direction as the used measurement area ri(k) (step ST501).
- the luminance signal extraction unit 15 extracts a time-series pulse wave original signal wi(t) during the pulse wave estimation target period from each of the used measurement areas ri(k) on the frame Im(k) based on the frame Im(k) of the captured image acquired by the captured image acquisition unit 11 in step ST1 of FIG. 9 and the measurement area information R(k) output from the measurement area setting unit 14 in step ST3 of FIG. 9 (step ST502).
- when the luminance signal extraction unit 15 extracts the time-series pulse wave source signal wi(t), it generates pulse wave source signal information W(t) indicating the extracted time-series pulse wave source signal wi(t).
- the luminance signal extraction unit 15 outputs the generated pulse wave source signal information W(t) to the luminance signal selection unit 16.
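Step ST501 amounts to a lookup from the estimated face direction to the measurement area numbers usable for that direction. A hypothetical sketch of this lookup (the direction labels and area numbers are illustrative, not from the description):

```python
# Hypothetical model of the "information for setting the used measurement
# area": each face direction maps to the measurement area numbers that
# remain visible for that direction.

USED_AREA_TABLE = {
    "front": [1, 2, 3, 4],
    "left":  [1, 2],       # part of the face may be hidden when turned
    "right": [3, 4],
}

def select_used_areas(face_direction, measurement_areas):
    """Pick the used measurement areas r_i(k) for one frame Im(k).

    face_direction    -- direction estimated for this frame (e.g. "left")
    measurement_areas -- dict {area_no: rect} set for this frame
    """
    usable = USED_AREA_TABLE.get(face_direction, [])
    return {no: rect for no, rect in measurement_areas.items() if no in usable}
```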
- FIG. 11 is a flowchart for explaining the details of the process of step ST6 in FIG. 9.
- the luminance signal selection section 16 calculates the distribution ratio of the subject's face direction corresponding to each frame Im(k) of the captured image estimated by the face direction estimation section 13 during the pulse wave estimation period (step ST601).
- the luminance signal selection unit 16 selects, as a time-series estimation pulse wave source signal wi(t), a time-series pulse wave source signal wi(t) extracted from a commonly-set usage measurement area ri(k) in each frame Im(k) of the captured image from which the subject's facial direction, the frequency of occurrence of which is equal to or greater than the facial direction determination threshold, is estimated (step ST602).
- the luminance signal selection unit 16 outputs the selected pulse wave source signal information W(t) to the pulse wave estimation unit 17.
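The selection of steps ST601–ST602 can be sketched as follows: compute the distribution ratio of per-frame face directions, then keep the directions whose occurrence frequency meets the face direction determination threshold. The direction labels and the threshold value are illustrative assumptions:

```python
from collections import Counter

def select_by_face_direction(directions, threshold=0.1):
    """Sketch of steps ST601/ST602: compute the distribution ratio of the
    face directions estimated per frame, and return the directions whose
    occurrence frequency is at or above the determination threshold.

    directions -- one estimated face direction per frame Im(k)
    threshold  -- face direction determination threshold (ratio, 0..1);
                  the value 0.1 is an illustrative assumption
    """
    total = len(directions)
    ratios = {d: c / total for d, c in Counter(directions).items()}
    kept = {d for d, r in ratios.items() if r >= threshold}
    return ratios, kept
```

The signals actually chosen would then be those extracted from the used measurement areas commonly set in the frames whose estimated direction is in the kept set.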
- pulse wave estimation device 1 detects the subject's skin area from the captured image, and sets a measurement area ri(k) that can be used to extract a pulse wave original signal wi(t) that indicates a luminance change and that contains the subject's pulse wave component, in the area corresponding to the skin area on the captured image. Specifically, pulse wave estimation device 1 determines a measurement area ri(k) to be set in the area corresponding to the skin area on the captured image based on information indicating the target face direction range, layout information, and reference measurement area information, and sets the determined measurement area ri(k).
- Pulse wave estimation device 1 sets a usage measurement area ri(k) from within measurement area ri(k), extracts a time-series pulse wave original signal wi(t) based on luminance changes in the usage measurement area ri(k), and then selects a time-series estimation pulse wave original signal wi(t) to be used for estimating the subject's pulse wave from the time-series pulse wave original signals wi(t) extracted during the pulse wave estimation period, taking into consideration the subject's facial direction estimated on a frame-by-frame basis based on the captured images.
- the pulse wave estimation device 1 estimates the subject's pulse wave based on the selected time-series estimation pulse wave original signal wi(t).
- when extracting the time-series pulse wave original signal wi(t) and selecting the time-series estimation pulse wave original signal wi(t), the pulse wave estimation device 1 sets, from among the set measurement regions ri(k), the measurement region ri(k) corresponding to the subject's facial orientation estimated based on the captured image in which the measurement region ri(k) is set, as the used measurement region ri(k).
- the pulse wave estimation device 1 then calculates the distribution ratio of the subject's facial orientation estimated for each frame during the pulse wave estimation period, and selects, as the time-series estimation pulse wave original signal wi(t), the time-series pulse wave original signal wi(t) extracted from the used measurement regions ri(k) commonly set in the captured images from which a facial orientation of the subject whose occurrence frequency is equal to or exceeds the facial orientation determination threshold is estimated.
- the pulse wave estimation device 1 can extract a luminance signal containing sufficient pulse wave components to estimate the subject's pulse wave, in other words, a pulse wave source signal wi(t), from the skin area on the captured image.
- the pulse wave estimation device 1 can prevent a decrease in the accuracy of estimating the subject's pulse wave due to the skin area of the person whose pulse wave is to be estimated, i.e., the subject, not being captured in the captured image, or the skin area being shadowed.
- the processing circuit 101 may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
- when the processing circuit is the processor 104, the functions of the captured image acquisition unit 11, skin area detection unit 12, face direction estimation unit 13, measurement area setting unit 14, luminance signal extraction unit 15, luminance signal selection unit 16, pulse wave estimation unit 17, and output unit 18 are realized by software, firmware, or a combination of software and firmware.
- the software or firmware is described as a program and stored in the memory 105.
- the processor 104 executes the functions of the captured image acquisition unit 11, skin area detection unit 12, face direction estimation unit 13, measurement area setting unit 14, luminance signal extraction unit 15, luminance signal selection unit 16, pulse wave estimation unit 17, and output unit 18 by reading and executing the program stored in the memory 105.
- the pulse wave estimation device 1 includes a memory 105 for storing a program that, when executed by the processor 104, results in the execution of steps ST1 to ST7 in FIG. 9 described above.
- the program stored in memory 105 can also be said to cause a computer to execute the processing procedures or methods of the image acquisition unit 11, the skin area detection unit 12, the face direction estimation unit 13, the measurement area setting unit 14, the luminance signal extraction unit 15, the luminance signal selection unit 16, the pulse wave estimation unit 17, and the output unit 18.
- the memory 105 is, for example, a non-volatile or volatile semiconductor memory such as a RAM, a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read-Only Memory), or a magnetic disk, a flexible disk, an optical disk, a compact disk, a mini disk, a DVD (Digital Versatile Disc), etc.
- the functions of the captured image acquisition unit 11, the skin area detection unit 12, the face direction estimation unit 13, the measurement area setting unit 14, the luminance signal extraction unit 15, the luminance signal selection unit 16, the pulse wave estimation unit 17, and the output unit 18 may be partially realized by dedicated hardware and partially realized by software or firmware.
- the functions of the captured image acquisition unit 11 and the output unit 18 may be realized by a processing circuit 101 as dedicated hardware, and the functions of the skin area detection unit 12, the face direction estimation unit 13, the measurement area setting unit 14, the luminance signal extraction unit 15, the luminance signal selection unit 16, and the pulse wave estimation unit 17 may be realized by the processor 104 reading and executing a program stored in the memory 105.
- the storage unit (not shown) is configured by, for example, the memory 105.
- the pulse wave estimation device 1 also includes an input interface device 102 and an output interface device 103 that perform wired or wireless communication with devices such as the imaging device 2.
- the subject is the driver of the vehicle, but this is merely one example.
- the subject may be a passenger other than the driver of the vehicle.
- the pulse wave estimation device 1 is an in-vehicle device, and the image acquisition unit 11, the skin area detection unit 12, the face direction estimation unit 13, the measurement area setting unit 14, the luminance signal extraction unit 15, the luminance signal selection unit 16, the pulse wave estimation unit 17, and the output unit 18 are provided in the in-vehicle device.
- some of the image acquisition unit 11, skin area detection unit 12, face direction estimation unit 13, measurement area setting unit 14, luminance signal extraction unit 15, luminance signal selection unit 16, pulse wave estimation unit 17, and output unit 18 may be mounted on an in-vehicle device of a vehicle, and the others may be provided in a server connected to the in-vehicle device via a network, so that a system is formed by the in-vehicle device and the server.
- the image acquisition unit 11, skin area detection unit 12, face direction estimation unit 13, measurement area setting unit 14, luminance signal extraction unit 15, luminance signal selection unit 16, pulse wave estimation unit 17, and output unit 18 may all be provided in the server.
- the pulse wave estimation device 1 is not limited to an in-vehicle device mounted on a vehicle, but may be applied to a moving body other than a vehicle or a home appliance, for example.
- the subject is not limited to a vehicle occupant, but may be various people.
- pulse wave estimation device 1 may be mounted on a television installed in a living room of a house. In this case, the subject is a user such as a resident of the house. Pulse wave estimation device 1 estimates the pulse wave of the user based on an image captured by imaging device 2 mounted on the television.
- the measurement area setting unit 14 determines the measurement area ri(k) to be set in the area corresponding to the skin area on the captured image based on information indicating the target face direction range, layout information, and reference measurement area information.
- if the target face direction range is the range of all face directions that the subject can take within the range that the imaging device 2 can capture, and it is assumed that the imaging device 2 always captures the subject in a reference position and reference direction, the measurement area setting unit 14 does not need to take the target face direction range and the layout information into account.
- the pulse wave estimation device 1 includes a captured image acquisition unit 11 that acquires an image of a person (subject) on a frame-by-frame basis, a skin area detection unit 12 that detects the skin area of the person from the captured image, a measurement area setting unit 14 that sets, in an area corresponding to the skin area on the captured image, a measurement area ri(k) that can be used to extract a pulse wave original signal wi(t) that indicates a luminance change and contains a pulse wave component of the person, and a face direction estimation unit 13 that estimates the face direction of the person on a frame-by-frame basis based on the captured image.
- the apparatus is further configured to include a luminance signal extraction unit 15 that sets, from among the measurement areas ri(k) set by the measurement area setting unit 14, a used measurement area ri(k) used to extract the pulse wave original signal wi(t) and extracts the pulse wave original signal wi(t) based on a luminance change in the set used measurement area ri(k), a luminance signal selection unit 16 that selects a time-series pulse wave original signal wi(t) to be used for estimating the pulse wave of the person from among the time-series pulse wave original signals wi(t) extracted by the luminance signal extraction unit 15 during a pulse wave estimation target period, taking into account the face direction of the person estimated by the face direction estimation unit 13, and a pulse wave estimation unit 17 that estimates the pulse wave of the person based on the time-series pulse wave original signal wi(t) selected by the luminance signal selection unit 16. Therefore, the pulse wave estimation device 1 can prevent a decrease in the estimation accuracy of the subject's pulse wave due to the skin area of the person to be estimated, i.e., the subject, not being captured in the captured image, or the skin area being shadowed.
- the luminance signal extraction unit 15 sets, as a used measurement area ri(k), the measurement area ri(k) set by the measurement area setting unit 14 that corresponds to the facial direction of the person (subject) estimated by the facial direction estimation unit 13 based on the captured image in which the measurement area ri(k) is set, and extracts the pulse wave original signal wi(t) from it; the luminance signal selection unit 16 calculates the distribution ratio of the facial direction of the person estimated by the facial direction estimation unit 13 for each frame during the pulse wave estimation target period, and selects, as the time-series estimation pulse wave original signal wi(t), the time-series pulse wave original signal wi(t) extracted from the used measurement areas ri(k) set in common in the captured images from which a facial direction of the person whose appearance frequency is equal to or greater than the facial direction determination threshold is estimated.
- the pulse wave estimation device 1 can prevent a decrease in the accuracy of estimating the subject's pulse wave due to, for example, the skin area of the person whose pulse wave is to be estimated, i.e., the subject, not being captured in the captured image, or the occurrence of so-called shadowing on the skin area.
- in the first embodiment, the pulse wave estimation device selects a time-series estimated pulse wave source signal based on the distribution ratio of the subject's facial direction for each frame.
- in the second embodiment, a time-series estimated pulse wave source signal is selected by a method different from that of the first embodiment.
- the pulse wave estimation device is mounted on a vehicle and the subject is the driver of the vehicle.
- FIG. 13 is a diagram showing an example of the configuration of a pulse wave estimation device 1a according to the second embodiment.
- the same components as those of the pulse wave estimation device 1 described in the first embodiment with reference to FIG. 1 are denoted by the same reference numerals, and duplicated explanations will be omitted.
- the specific operation of luminance signal selection section 16a differs from the specific operation of luminance signal selection section 16 in pulse wave estimation device 1 according to the first embodiment.
- the luminance signal selection unit 16a selects a time-series estimated pulse wave source signal wi(t) to be used to estimate the subject's pulse wave from among the time-series pulse wave source signals wi(t) extracted by the luminance signal extraction unit 15 during the pulse wave estimation target period, based on the pulse wave source signal information W(t) output from the luminance signal extraction unit 15 and taking into consideration the subject's facial direction estimated by the facial direction estimation unit 13.
- the luminance signal selection unit 16a calculates the distribution ratio of the used measurement regions ri(k) from which the luminance signal extraction unit 15 extracted the pulse wave original signal wi(t) during the pulse wave estimation target period. The luminance signal selection unit 16a then selects the time-series pulse wave original signal wi(t) extracted from the used measurement regions ri(k) whose occurrence frequency is equal to or greater than a preset threshold value (hereinafter referred to as the "region determination threshold value") as the time-series estimated pulse wave original signal wi(t).
- the luminance signal selection unit 16a stores the pulse wave original signal information W(t) in chronological order in a storage unit (not shown). Based on the pulse wave original signal information W(t) stored in the storage unit, the luminance signal selection unit 16a can calculate the distribution ratio of the used measurement regions ri(k) from which the luminance signal extraction unit 15 extracted the pulse wave original signal wi(t) during the pulse wave estimation target period.
- FIG. 14 is a diagram showing an example of a histogram, calculated by the luminance signal selection unit 16a in the second embodiment, showing the distribution ratio of the used measurement regions ri(k) from which the luminance signal extraction unit 15 extracted the pulse wave source signal wi(t) during the pulse wave estimation target period.
- the luminance signal extraction unit 15 sets all 42 measurement regions ri(k) with measurement region numbers (1) to (42) as the used measurement regions ri(k) during the pulse wave estimation target period, and extracts the pulse wave source signal wi(t).
- the frequency of the used measurement areas ri(k) of the measurement area numbers (1) to (26) is equal to or higher than the area determination threshold value.
- the luminance signal selection unit 16a selects the time-series pulse wave source signal wi(t) extracted from the used measurement area ri(k) having measurement area numbers (1) to (26) as the time-series estimated pulse wave source signal wi(t).
- the luminance signal selector 16a outputs pulse wave source signal information W(t) including the selected time series estimation pulse wave source signal wi(t) to the pulse wave estimator 17 as selected pulse wave source signal information W(t).
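The second embodiment's selection (distribution ratio of used measurement areas, thresholded by the region determination threshold) might be sketched as follows; the threshold value is an illustrative assumption:

```python
from collections import Counter

def select_by_region_frequency(used_area_log, threshold=0.5):
    """Second-embodiment sketch (steps ST611/ST612): compute the
    distribution ratio of the used measurement areas across the pulse
    wave estimation target period, and keep the area numbers whose
    occurrence frequency is at or above the region determination
    threshold.

    used_area_log -- list, per frame, of the area numbers that served as
                     used measurement areas r_i(k) in that frame
    threshold     -- region determination threshold (ratio of frames);
                     0.5 is an illustrative assumption
    """
    n_frames = len(used_area_log)
    counts = Counter(no for frame_areas in used_area_log for no in frame_areas)
    return sorted(no for no, c in counts.items() if c / n_frames >= threshold)
```

The time-series pulse wave original signals wi(t) extracted from the returned area numbers would then be the selected estimation signals.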
- FIG. 15 is a flowchart for explaining the operation of pulse wave estimation device 1a according to the second embodiment. For example, when the power supply of the vehicle is turned on, pulse wave estimation device 1a repeats the process shown in the flowchart of FIG. 15 until the power supply of the vehicle is turned off.
- steps ST1 to ST5 and ST7 performed by pulse wave estimation device 1a are similar to the specific operations of steps ST1 to ST5 and ST7 already explained in embodiment 1 using the flowchart in FIG. 9, so the same step numbers are used and duplicate explanations are omitted.
- the luminance signal selection unit 16a selects a time-series estimated pulse wave source signal wi(t) based on the pulse wave source signal information W(t) output from the luminance signal extraction unit 15 in step ST5 and taking into consideration the face direction of the subject estimated by the face direction estimation unit 13 in step ST4 (step ST6a).
- the luminance signal selection unit 16a outputs the selected pulse wave source signal information W(t) to the pulse wave estimation unit 17.
- step ST4 is performed after the processing of step ST3, but this is merely one example.
- the order of the processing of step ST4 and the processing of step ST3 may be reversed, or the processing of steps ST2 to ST3 and the processing of step ST4 may be performed in parallel.
- the processing of step ST4 may be performed after the processing of step ST1 and before the processing of step ST5 is performed.
- FIG. 16 is a flowchart for explaining the details of step ST6a in FIG. 15.
- the luminance signal selector 16a calculates the distribution ratio of the used measurement region ri(k) from which the luminance signal extractor 15 extracted the pulse wave source signal wi(t) during the pulse wave estimation period (step ST611).
- the luminance signal selection unit 16a selects the time-series pulse wave source signal wi(t) extracted from the usage measurement area ri(k) whose occurrence frequency is equal to or greater than the area determination threshold as the time-series estimated pulse wave source signal wi(t) (step ST612).
- the luminance signal selection unit 16a outputs the selected pulse wave source signal information W(t) to the pulse wave estimation unit 17.
- pulse wave estimation device 1a detects the subject's skin area from the captured image, and sets a measurement area ri(k) that can be used to extract a pulse wave original signal wi(t) that indicates a luminance change and that contains the subject's pulse wave component, in the area corresponding to the skin area on the captured image. Specifically, pulse wave estimation device 1a determines a measurement area ri(k) to be set in the area corresponding to the skin area on the captured image based on information indicating the target face direction range, layout information, and reference measurement area information, and sets the determined measurement area ri(k).
- the pulse wave estimation device 1a sets a usage measurement area ri(k) from within the measurement area ri(k), extracts a time-series pulse wave source signal based on luminance changes in the usage measurement area ri(k), and then selects a time-series estimation pulse wave source signal wi(t) to be used for estimating the subject's pulse wave from the time-series pulse wave source signals wi(t) extracted during the pulse wave estimation period, taking into consideration the subject's facial direction estimated on a frame-by-frame basis based on the captured images.
- the pulse wave estimation device 1a estimates the subject's pulse wave based on the selected time-series estimation pulse wave original signal wi(t).
- pulse wave estimation device 1a calculates the distribution proportion of the used measurement region ri(k) from which the pulse wave original signal wi(t) was extracted during the pulse wave estimation period, and selects the time-series pulse wave original signal wi(t) extracted from the used measurement region ri(k) whose occurrence frequency is equal to or greater than the region determination threshold as the time-series estimated pulse wave original signal wi(t).
- pulse wave estimation device 1a can extract a luminance signal containing sufficient pulse wave components to estimate the subject's pulse wave, in other words, a pulse wave source signal wi(t), from the skin area on the captured image.
- pulse wave estimation device 1a can prevent a decrease in the accuracy of estimating the subject's pulse wave due to the skin area of the person whose pulse wave is to be estimated, i.e., the subject, not being captured in the captured image, or the skin area being shadowed.
- pulse wave estimation device 1a includes processing circuit 101 for extracting, from the skin area on the captured image, a luminance signal containing a pulse wave component sufficient to estimate a person's pulse wave, and for performing control to estimate the subject's pulse wave from the extracted luminance signal.
- the processing circuit 101 reads out and executes the programs stored in the memory 105, thereby executing the functions of the captured image acquisition unit 11, the skin area detection unit 12, the face direction estimation unit 13, the measurement area setting unit 14, the luminance signal extraction unit 15, the luminance signal selection unit 16a, the pulse wave estimation unit 17, and the output unit 18. That is, the pulse wave estimation device 1a includes a memory 105 for storing a program that, when executed by the processing circuit 101, results in the execution of steps ST1 to ST7 in FIG. 15 described above.
- the programs stored in the memory 105 cause a computer to execute the procedures or methods of the processing of the captured image acquisition unit 11, the skin area detection unit 12, the face direction estimation unit 13, the measurement area setting unit 14, the luminance signal extraction unit 15, the luminance signal selection unit 16a, the pulse wave estimation unit 17, and the output unit 18.
- the storage unit (not shown) is configured by, for example, the memory 105.
- the pulse wave estimation device 1a also includes an input interface device 102 and an output interface device 103 that perform wired or wireless communication with devices such as the imaging device 2.
- the subject is the driver of the vehicle, but this is merely one example.
- the subject may be a passenger other than the driver of the vehicle.
- pulse wave estimation device 1a is an in-vehicle device, and image acquisition unit 11, skin area detection unit 12, face direction estimation unit 13, measurement area setting unit 14, luminance signal extraction unit 15, luminance signal selection unit 16a, pulse wave estimation unit 17, and output unit 18 are provided in the in-vehicle device.
- some of the image acquisition unit 11, skin area detection unit 12, face direction estimation unit 13, measurement area setting unit 14, luminance signal extraction unit 15, luminance signal selection unit 16a, pulse wave estimation unit 17, and output unit 18 may be mounted on an in-vehicle device of a vehicle, and the others may be provided in a server connected to the in-vehicle device via a network, so that a system is formed by the in-vehicle device and the server.
- the image acquisition unit 11, skin area detection unit 12, face direction estimation unit 13, measurement area setting unit 14, luminance signal extraction unit 15, luminance signal selection unit 16a, pulse wave estimation unit 17, and output unit 18 may all be provided in the server.
- the pulse wave estimation device 1a according to the second embodiment described above is not limited to an in-vehicle device mounted on a vehicle, but can also be applied to, for example, moving objects other than vehicles or home appliances. Furthermore, the subjects are not limited to vehicle occupants, but can be various types of people.
- the measurement area setting unit 14 in the pulse wave estimation device 1a determines the measurement area ri(k) to be set in the area corresponding to the skin area on the captured image based on the information indicating the target face direction range, the layout information, and the reference measurement area information, but this is merely an example.
- if the target face direction range is the range of all face directions that the subject can take within the range that the imaging device 2 can capture, and it is assumed that the imaging device 2 always captures the subject at a reference position and reference direction, the measurement area setting unit 14 does not need to consider the target face direction range and the layout information.
- an administrator or the like may determine in advance a measurement region ri(k) that can be used to extract a pulse wave original signal wi(t) containing the subject's pulse wave component based on an image captured by the imaging device 2 of the intended subject, and generate information indicating the determined measurement region ri(k) as reference measurement region information and store it in a memory unit or the like.
- the measurement area setting unit 14 does not take into account the information indicating the target face direction range and the layout information, and simply sets the measurement area ri(k) defined in the reference measurement area information to an area corresponding to the skin area on the captured image.
- the pulse wave estimation device 1a includes a captured image acquisition unit 11 that acquires an image of a person (subject) on a frame-by-frame basis, a skin area detection unit 12 that detects the skin area of the person from the captured image, a measurement area setting unit 14 that sets, in an area corresponding to the skin area on the captured image, a measurement area ri(k) that can be used to extract a pulse wave original signal wi(t) that indicates a luminance change and contains a pulse wave component of the person, and a face direction estimation unit 13 that estimates the face direction of the person on a frame-by-frame basis based on the captured image.
- the apparatus is further configured to include a luminance signal extraction unit 15 that sets, from among the measurement areas ri(k) set by the measurement area setting unit 14, a used measurement area ri(k) used to extract the pulse wave original signal wi(t) and extracts the pulse wave original signal wi(t) based on a luminance change in the set used measurement area ri(k), a luminance signal selection unit 16a that selects a time-series pulse wave original signal wi(t) to be used for estimating the pulse wave of the person from among the time-series pulse wave original signals wi(t) extracted by the luminance signal extraction unit 15 during a pulse wave estimation target period, taking into account the face direction of the person estimated by the face direction estimation unit 13, and a pulse wave estimation unit 17 that estimates the pulse wave of the person based on the time-series pulse wave original signal wi(t) selected by the luminance signal selection unit 16a.
- the pulse wave estimation device 1a can prevent a decrease in the estimation accuracy of the subject's pulse wave due to the skin area of the person to be estimated, i.e., the subject, not being captured on the captured image, or the skin area being overshadowed.
- luminance signal extraction unit 15 extracts pulse wave original signal wi(t) from measurement area ri(k) set by measurement area setting unit 14, which corresponds to the facial direction of the person (subject) estimated by facial direction estimation unit 13 based on the captured image in which measurement area ri(k) is set, as a used measurement area ri(k).
- Luminance signal selection unit 16a calculates the distribution proportion of used measurement area ri(k) from which luminance signal extraction unit 15 extracted pulse wave original signal wi(t) during the pulse wave estimation target period, and selects the time-series pulse wave original signal wi(t) extracted from the used measurement area ri(k) whose occurrence frequency is equal to or greater than the area determination threshold as the time-series estimated pulse wave original signal wi(t). Therefore, the pulse wave estimation device 1a can prevent a decrease in the accuracy of estimating the subject's pulse wave due to, for example, the skin area of the person whose pulse wave is to be estimated, i.e., the subject, not being captured in the captured image, or the occurrence of so-called shadowing on the skin area.
- In Embodiment 3, an embodiment in which a time-series estimation pulse wave source signal is selected by a method different from those of the first and second embodiments will be described.
- the pulse wave estimation device is mounted on a vehicle and the subject is the driver of the vehicle.
- FIG. 17 is a diagram showing an example of the configuration of a pulse wave estimation device 1b according to the third embodiment.
- the same components as those of the pulse wave estimation device 1 described in the first embodiment with reference to FIG. 1 are denoted by the same reference numerals, and duplicated explanations will be omitted.
- The pulse wave estimation device 1b according to the third embodiment differs from the pulse wave estimation device 1 according to the first embodiment in that a weight setting unit 19 is provided.
- the specific operations of luminance signal extraction unit 15a and luminance signal selection unit 16b differ from the specific operations of luminance signal extraction unit 15 and luminance signal selection unit 16 in pulse wave estimation device 1 according to embodiment 1, respectively.
- the weight setting unit 19 sets a weighting coefficient for each measurement area ri(k) set by the measurement area setting unit 14 in the captured image in which the face direction of the subject is estimated, based on the face direction of the subject estimated by the face direction estimation unit 13 on a frame-by-frame basis.
- the face direction estimation unit 13 outputs the face direction information F(k) to the weight setting unit 19.
- the measurement region setting unit 14 outputs the generated measurement region information R(k) to the luminance signal extraction unit 15 and the weight setting unit 19.
- Figure 18 is a diagram for explaining an example of the weighting coefficient for each measurement area ri(k) that is set by the weight setting unit 19 based on the face direction of the subject estimated by the face direction estimation unit 13 on a frame-by-frame basis in Embodiment 3.
- The weight setting unit 19 sets the weighting coefficient based on the face direction of the subject, for example, so that the weight of a measurement area ri(k) closer to the imaging device 2 is larger.
- the weight setting unit 19 sets a weighting coefficient for each measurement area ri(k) so that "the weight of the measurement area ri(k) indicated by W1 > the weight of the measurement area ri(k) indicated by W2 > the weight of the measurement area ri(k) indicated by W3".
- Based on the layout information and the face direction information F(k), the weight setting unit 19 can determine the positional relationship between each measurement area ri(k) and the imaging device 2. In other words, the weight setting unit 19 can identify the measurement areas ri(k) close to the imaging device 2 and assign them correspondingly larger weighting coefficients.
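As a hedged illustration of how larger weights might be assigned to measurement areas that face the imaging device more directly (W1 > W2 > W3), under assumed angle thresholds and weight values that do not come from the publication:

```python
# Illustrative sketch of the weighting rule: combining an assumed camera
# azimuth (from the layout information) with the estimated face direction,
# areas whose surface faces the imaging device more directly receive a
# larger weight. The bucketing thresholds and weight values are assumptions.

W1, W2, W3 = 1.0, 0.5, 0.1  # hypothetical weight levels, W1 > W2 > W3

def area_weight(face_yaw_deg, area_offset_deg, camera_yaw_deg=0.0):
    """Angle between the area's outward direction and the camera axis;
    a smaller angle means the area is seen more frontally by the camera."""
    angle = abs((face_yaw_deg + area_offset_deg) - camera_yaw_deg)
    if angle <= 20.0:
        return W1
    if angle <= 45.0:
        return W2
    return W3

# face turned 30 deg to the right: the left-cheek area (-40 deg offset)
# ends up facing the camera most directly
weights = {
    "left_cheek": area_weight(30.0, -40.0),    # |30 - 40| = 10 -> W1
    "forehead": area_weight(30.0, 0.0),        # 30 -> W2
    "right_cheek": area_weight(30.0, 40.0),    # 70 -> W3
}
```

The bucketed weights reproduce the ordering "W1 > W2 > W3" of Figure 18 for areas progressively turned away from the camera.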
- the weight setting unit 19 outputs information on the weight coefficient set for each measurement region ri(k) (hereinafter referred to as "weight information") to the luminance signal selection unit 16b.
- The weighting information is information in which each frame Im(k) of the captured image, the identification number of that frame Im(k), and the weighting coefficient set for each measurement area ri(k) in that frame are associated with each other.
- The luminance signal extraction unit 15a sets a used measurement area ri(k) from among the measurement areas ri(k) set by the measurement area setting unit 14.
- the luminance signal extraction unit 15a sets the measurement region ri(k) set by the measurement region setting unit 14 as the usage measurement region ri(k).
- the luminance signal extraction unit 15a extracts a pulse wave original signal wi(t) indicating a luminance change during a pulse wave estimation target period, in other words, a period corresponding to the number of frames Tp, from each of the set usage measurement areas ri(k) among the multiple measurement areas ri(k) on frame Im(k) indicated by the measurement area information R(k).
- The method by which the luminance signal extraction unit 15a extracts the pulse wave source signal wi(t) indicating the change in luminance during the pulse wave estimation target period from each of the set used measurement areas ri(k) is the same as the method of the luminance signal extraction unit 15 already described in Embodiment 1, and a duplicated description is therefore omitted.
- the luminance signal extraction unit 15a generates pulse wave original signal information W(t) indicating the pulse wave original signal wi(t) in each used measurement region ri(k), and outputs the generated pulse wave original signal information W(t) to the luminance signal selection unit 16b.
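The extraction step itself is inherited from Embodiment 1 and not restated here. A common way such a per-area luminance signal is obtained in practice, assumed here purely for illustration and not taken from this publication, is to average the pixel luminance inside each used measurement area for every frame:

```python
import numpy as np

# Assumed illustration: one sample of the pulse wave source signal wi(t)
# per frame is produced by taking the spatial mean of pixel luminance
# inside the used measurement area on that frame.

def extract_pulse_source_signal(frames, roi):
    """frames: iterable of 2-D luminance arrays (one per frame Im(k)).
    roi: (top, bottom, left, right) bounds of a used measurement area.
    Returns the time-series pulse wave source signal wi(t)."""
    t, b, l, r = roi
    return np.array([f[t:b, l:r].mean() for f in frames])

# two tiny 4x4 "frames" whose measurement area has known mean luminance
f0 = np.full((4, 4), 100.0)
f1 = np.full((4, 4), 104.0)
w = extract_pulse_source_signal([f0, f1], (0, 2, 0, 2))
```

Spatial averaging suppresses per-pixel sensor noise while preserving the small frame-to-frame luminance change that carries the pulse component.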
- the luminance signal selection unit 16b selects a time-series estimated pulse wave source signal wi(t) to be used to estimate the subject's pulse wave from among the time-series pulse wave source signals wi(t) extracted by the luminance signal extraction unit 15a during the pulse wave estimation target period, based on the pulse wave source signal information W(t) output from the luminance signal extraction unit 15a and taking into consideration the subject's facial direction estimated by the facial direction estimation unit 13.
- the luminance signal selection unit 16b optimizes the pulse wave original signal wi(t) extracted by the luminance signal extraction unit 15a based on the weighting coefficient set by the weighting unit 19 for the used measurement region ri(k) from which the pulse wave original signal wi(t) was extracted.
- the luminance signal selection unit 16b can identify the weighting coefficient set by the weighting unit 19 for the used measurement region ri(k) from which the pulse wave original signal wi(t) was extracted by matching the pulse wave original signal information W(t) with the weighting information using, for example, the identification number of the frame Im(k) of the captured image and the used measurement region ri(k) as keys.
- The luminance signal selection unit 16b performs the optimization by, for example, correcting the pulse wave source signal wi(t) to be smaller in accordance with the weighting coefficient. How much the pulse wave source signal wi(t) is to be reduced for a given weighting coefficient is predetermined.
- Alternatively, the luminance signal selection unit 16b may perform the optimization by making the pulse wave source signal wi(t) blank (setting it to zero).
- the weighting coefficient at which the pulse wave source signal wi(t) is made blank is determined in advance.
- the luminance signal selection unit 16b selects the time-series pulse wave source signal wi(t) after optimization as the time-series estimated pulse wave source signal wi(t).
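If the "optimization" above is read as scaling each extracted signal by its area's weighting coefficient and blanking signals whose coefficient falls at or below a predetermined cutoff, it can be sketched as follows. The cutoff and the coefficient values are illustrative assumptions:

```python
import numpy as np

# Sketch of the optimization step, under the assumption that it amounts to
# (a) correcting each signal to be smaller in proportion to its weighting
# coefficient and (b) blanking (zeroing) signals whose coefficient is at
# or below a predetermined cutoff. The cutoff value is hypothetical.

BLANK_CUTOFF = 0.1  # assumed: coefficients at/below this are blanked

def optimize_signals(signals, weights, cutoff=BLANK_CUTOFF):
    """signals: dict area_id -> np.ndarray wi(t).
    weights: dict area_id -> weighting coefficient for that area."""
    out = {}
    for a, s in signals.items():
        wgt = weights[a]
        if wgt <= cutoff:
            out[a] = np.zeros_like(s)   # make the signal blank
        else:
            out[a] = wgt * s            # correct the signal to be smaller
    return out

sigs = {0: np.array([2.0, 4.0]), 1: np.array([3.0, 5.0])}
opt = optimize_signals(sigs, {0: 0.5, 1: 0.1})
```

The optimized time series (scaled for area 0, blanked for area 1) would then be selected as the estimation pulse wave source signals.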
- The luminance signal selection unit 16b outputs the selected pulse wave source signal information W(t) to the pulse wave estimation unit 17.
- FIG. 19 is a flowchart for explaining the operation of a pulse wave estimation device 1b according to the third embodiment. For example, when the vehicle power is turned on, pulse wave estimation device 1b repeats the process shown in the flowchart of FIG. 19 until the vehicle power is turned off.
- steps ST1 to ST4 and step ST7 performed by pulse wave estimation device 1b are similar to the specific operations of steps ST1 to ST4 and step ST7 already explained in embodiment 1 using the flowchart in FIG. 9, so the same step numbers are used and duplicate explanations are omitted.
- Based on the face direction of the subject estimated by the face direction estimation unit 13 on a frame-by-frame basis in step ST4, the weight setting unit 19 sets a weighting coefficient for each measurement area ri(k) set by the measurement area setting unit 14 in step ST3 in the captured image in which the face direction of the subject has been estimated (step ST41). The weight setting unit 19 then outputs the weight information to the luminance signal selection unit 16b.
- the luminance signal extraction unit 15a extracts a luminance signal indicating a change in luminance during the pulse wave estimation target period, in other words, a pulse wave original signal wi(t), from each of the usage measurement regions ri(k) among the multiple measurement regions ri(k) on the frame Im(k) indicated by the measurement region information R(k), based on the frame Im(k) of the captured image acquired by the captured image acquisition unit 11 in step ST1 and the measurement region information R(k) output from the measurement region setting unit 14 in step ST3 (step ST5a).
- The luminance signal extraction unit 15a generates pulse wave source signal information W(t) indicating the extracted pulse wave source signal wi(t).
- The luminance signal extraction unit 15a outputs the generated pulse wave source signal information W(t) to the luminance signal selection unit 16b.
- The luminance signal selection unit 16b selects an estimation pulse wave source signal wi(t) based on the pulse wave source signal information W(t) output from the luminance signal extraction unit 15a in step ST5a and taking into consideration the face direction of the subject estimated by the face direction estimation unit 13 in step ST4 (step ST6b).
- the luminance signal selector 16 b outputs the selected pulse wave source signal information W(t) to the pulse wave estimator 17 .
- In the above description, the processes of steps ST4 and ST41 are performed after the process of step ST3, but this is merely one example.
- the order of the processes of steps ST4 and ST3 may be reversed, or the processes of steps ST2 to ST3 and step ST4 may be performed in parallel. It is sufficient that the process of step ST4 is performed after the process of step ST1 and before the process of step ST41 is performed.
- FIG. 20 is a flowchart for explaining the details of step ST5a in FIG. 19.
- The luminance signal extraction unit 15a sets the measurement area ri(k) set by the measurement area setting unit 14 in step ST3 of FIG. 19 as the used measurement area ri(k) (step ST521). Then, based on frame Im(k) of the captured image acquired by the captured image acquisition unit 11 in step ST1 of FIG. 19 and the measurement area information R(k) output from the measurement area setting unit 14 in step ST3 of FIG. 19, the luminance signal extraction unit 15a extracts a time-series pulse wave source signal wi(t) during the pulse wave estimation target period from each of the used measurement areas ri(k) on frame Im(k) (step ST522).
- When the luminance signal extraction unit 15a extracts the pulse wave source signal wi(t), it generates pulse wave source signal information W(t) indicating the extracted time-series pulse wave source signal wi(t). The luminance signal extraction unit 15a outputs the generated pulse wave source signal information W(t) to the luminance signal selection unit 16b.
- FIG. 21 is a flowchart for explaining the details of step ST6b in FIG. 19.
- the luminance signal selection unit 16b optimizes the time-series pulse wave source signal wi(t) extracted by the luminance signal extraction unit 15a during the pulse wave estimation period based on the weighting coefficient set by the weight setting unit 19 for the used measurement region ri(k) from which the time-series pulse wave source signal wi(t) was extracted (step ST621).
- the luminance signal selection section 16b selects the optimized time-series pulse wave source signal wi(t) as an estimation time-series pulse wave source signal wi(t) (step ST622).
- The luminance signal selection unit 16b outputs the selected pulse wave source signal information W(t) to the pulse wave estimation unit 17.
- pulse wave estimation device 1b sets a measurement region ri(k) that can be used to extract a pulse wave original signal wi(t) that indicates a luminance change and that contains a subject's pulse wave component, in a region corresponding to the skin region on the captured image.
- The pulse wave estimation device 1b determines a measurement area ri(k) to be set in the area corresponding to the skin area on the captured image based on information indicating the target face direction range, the layout information, and the reference measurement area information, and sets the determined measurement area ri(k).
- Pulse wave estimation device 1b sets a usage measurement area ri(k) from within measurement area ri(k), extracts a time-series pulse wave original signal wi(t) based on luminance changes in the usage measurement area ri(k), and then selects a time-series estimation pulse wave original signal wi(t) to be used for estimating the subject's pulse wave from the time-series pulse wave original signals wi(t) extracted during the pulse wave estimation period, taking into consideration the subject's facial direction estimated on a frame-by-frame basis based on the captured images.
- the pulse wave estimation device 1b estimates the subject's pulse wave based on the selected time-series estimation pulse wave original signal wi(t).
- The pulse wave estimation device 1b includes a weight setting unit 19 that sets a weighting coefficient for each measurement area ri(k) set by the measurement area setting unit 14 in a captured image in which the face direction of a person (subject) has been estimated, based on the face direction of the person estimated on a frame-by-frame basis by the face direction estimation unit 13, and the luminance signal extraction unit 15a extracts a pulse wave source signal wi(t) using the measurement area ri(k) set by the measurement area setting unit 14 as the used measurement area ri(k).
- Luminance signal selection unit 16b selects, as a time-series estimated pulse wave original signal wi(t), a time-series estimated pulse wave original signal wi(t) obtained after optimizing the pulse wave original signal wi(t) extracted by luminance signal extraction unit 15a based on the weighting coefficient set by weight setting unit 19 for the used measurement region ri(k) from which the pulse wave original signal wi(t) was extracted.
- This allows pulse wave estimation device 1b to extract a luminance signal containing sufficient pulse wave components to estimate a person's pulse wave, in other words, a pulse wave source signal wi(t), from the skin area of the captured image.
- pulse wave estimation device 1b can prevent a decrease in the accuracy of estimating the subject's pulse wave due to the skin area of the person whose pulse wave is to be estimated, i.e., the subject, not being captured in the captured image, or the skin area being shadowed.
- pulse wave estimation device 1b includes processing circuit 101 for extracting, from the skin area on a captured image, a luminance signal containing a pulse wave component sufficient to estimate a person's pulse wave, and for performing control to estimate the subject's pulse wave from the extracted luminance signal.
- the processing circuit 101 reads out and executes the programs stored in the memory 105, thereby executing the functions of the captured image acquisition unit 11, the skin area detection unit 12, the face direction estimation unit 13, the measurement area setting unit 14, the luminance signal extraction unit 15a, the luminance signal selection unit 16b, the pulse wave estimation unit 17, the output unit 18, and the weight setting unit 19. That is, the pulse wave estimation device 1b includes a memory 105 for storing a program that, when executed by the processing circuit 101, results in the execution of steps ST1 to ST7 in FIG. 19 described above.
- the programs stored in the memory 105 cause a computer to execute the procedures or methods of the processing of the captured image acquisition unit 11, the skin area detection unit 12, the face direction estimation unit 13, the measurement area setting unit 14, the luminance signal extraction unit 15a, the luminance signal selection unit 16b, the pulse wave estimation unit 17, the output unit 18, and the weight setting unit 19.
- The storage unit (not shown) is configured by, for example, the memory 105.
- The pulse wave estimation device 1b also includes an input interface device 102 and an output interface device 103 that perform wired or wireless communication with devices such as the imaging device 2.
- the subject is the driver of the vehicle, but this is merely one example.
- the subject may be a passenger other than the driver of the vehicle.
- pulse wave estimation device 1b is an in-vehicle device, and image acquisition unit 11, skin area detection unit 12, face direction estimation unit 13, measurement area setting unit 14, luminance signal extraction unit 15a, luminance signal selection unit 16b, pulse wave estimation unit 17, output unit 18, and weight setting unit 19 are provided in the in-vehicle device.
- some of the image acquisition unit 11, skin area detection unit 12, face direction estimation unit 13, measurement area setting unit 14, luminance signal extraction unit 15a, luminance signal selection unit 16b, pulse wave estimation unit 17, output unit 18, and weight setting unit 19 may be mounted on an in-vehicle device of a vehicle, and the others may be provided in a server connected to the in-vehicle device via a network, so that a system is formed by the in-vehicle device and the server.
- the image acquisition unit 11, skin area detection unit 12, face direction estimation unit 13, measurement area setting unit 14, luminance signal extraction unit 15a, luminance signal selection unit 16b, pulse wave estimation unit 17, output unit 18, and weight setting unit 19 may all be provided in the server.
- the pulse wave estimation device 1b according to the third embodiment described above is not limited to an in-vehicle device mounted on a vehicle, but can also be applied to, for example, moving objects other than vehicles or home appliances. Furthermore, the subjects are not limited to vehicle occupants, but can be various types of people.
- the measurement area setting unit 14 determines the measurement area ri(k) to be set in the area corresponding to the skin area on the captured image based on the information indicating the target face direction range, the layout information, and the reference measurement area information, but this is merely an example.
- The target face direction range is the range of all face directions that the subject can take within the range that the imaging device 2 can capture, and it is assumed that the imaging device 2 always captures the subject from the reference position and in the reference direction.
- the measurement area setting unit 14 does not need to take into account the target face direction range and the layout information.
- an administrator or the like may determine in advance a measurement region ri(k) that can be used to extract a pulse wave original signal wi(t) containing the subject's pulse wave component based on an image captured by the imaging device 2 of the intended subject, and generate information indicating the determined measurement region ri(k) as reference measurement region information and store it in a memory unit or the like.
- the measurement area setting unit 14 does not take into account the information indicating the target face direction range and the layout information, and simply sets the measurement area ri(k) defined in the reference measurement area information to an area corresponding to the skin area on the captured image.
- As described above, the pulse wave estimation device 1b includes an image acquisition unit 11 that acquires an image of a person (subject) on a frame-by-frame basis, a skin area detection unit 12 that detects the skin area of the person from the captured image, a measurement area setting unit 14 that sets, in an area corresponding to the skin area on the captured image, a measurement area ri(k) that can be used to extract a pulse wave source signal wi(t) that indicates a luminance change and contains a pulse wave component of the person, and a face direction estimation unit 13 that estimates the face direction of the person on a frame-by-frame basis based on the captured image.
- The apparatus is further configured to include a luminance signal extraction unit 15a that sets, from among the measurement areas ri(k) set by the measurement area setting unit 14, a used measurement area ri(k) from which to extract the pulse wave source signal wi(t) and extracts a time-series pulse wave source signal wi(t) based on a luminance change in the set used measurement area ri(k), a luminance signal selection unit 16b that selects a time-series estimation pulse wave source signal wi(t) to be used for estimating the pulse wave of the person from among the time-series pulse wave source signals wi(t) extracted by the luminance signal extraction unit 15a during the pulse wave estimation target period, taking into account the face direction of the person estimated by the face direction estimation unit 13, and a pulse wave estimation unit 17 that estimates the pulse wave of the person based on the time-series estimation pulse wave source signal wi(t) selected by the luminance signal selection unit 16b.
- the pulse wave estimation device 1b can prevent a decrease in the estimation accuracy of the subject's pulse wave due to the skin area of the person to be estimated, i.e., the subject, not being captured on the captured image, or the skin area being overshadowed.
- The pulse wave estimation device 1b includes a weight setting unit 19 that sets a weighting coefficient for each measurement area ri(k) set by the measurement area setting unit 14 in a captured image in which the face direction of a person (subject) has been estimated, based on the face direction of the person estimated on a frame-by-frame basis by the face direction estimation unit 13, and the luminance signal extraction unit 15a extracts a pulse wave source signal wi(t) using the measurement area set by the measurement area setting unit 14 as the used measurement area ri(k).
- Luminance signal selection unit 16b selects, as a time-series estimated pulse wave original signal wi(t), a time-series estimated pulse wave original signal wi(t) obtained by optimizing the pulse wave original signal wi(t) extracted by luminance signal extraction unit 15a based on the weighting coefficient set by weight setting unit 19 for the used measurement region ri(k) from which the pulse wave original signal wi(t) was extracted. Therefore, pulse wave estimation device 1b can prevent a decrease in the accuracy of estimating the subject's pulse wave due to, for example, the skin area of the person whose pulse wave is to be estimated, i.e., the subject, not being captured in the captured image, or the occurrence of so-called shadowing on the skin area.
- pulse wave estimation unit 17 analyzes a plurality of principal components using a general signal separation technique such as PCA or ICA, generates a separated signal indicating the analyzed plurality of principal components, and estimates the subject's pulse wave based on the separated signal.
- pulse wave estimation unit 17 may estimate the subject's pulse wave using other methods.
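As a minimal sketch of the PCA step mentioned above, under the assumed details (not specified in this publication) that the selected signals are stacked, mean-centered, and the first principal component is taken as a separated signal:

```python
import numpy as np

# Assumed PCA sketch: stack the selected time-series signals, remove each
# row's mean, and take the dominant principal component as a separated
# signal from which the pulse wave can be estimated. How the pulse
# component is chosen among components is an assumption here.

def first_principal_component(signals):
    """signals: array of shape (n_areas, n_samples). Returns the scores of
    the dominant principal component as a 1-D time series."""
    x = signals - signals.mean(axis=1, keepdims=True)
    # SVD of the centered data: rows of vt are principal component scores
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    return s[0] * vt[0]

t = np.arange(128)
common = np.sin(2 * np.pi * 1.2 * t / 30.0)   # shared pulse-like component
obs = np.vstack([1.0 * common, 0.8 * common])  # two areas, same waveform
pc1 = first_principal_component(obs)
```

Because both simulated areas carry the same underlying waveform, the first principal component recovers it (up to sign and scale).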
- The pulse wave estimation unit 17 may perform a Fourier transform on the selected pulse wave source signal information W(t) and calculate the peak frequency in the frequency power spectrum as the pulse rate.
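The Fourier-transform variant above can be sketched as follows; the camera frame rate and the pulse-band limits are assumptions introduced for illustration:

```python
import numpy as np

# Sketch of the Fourier-transform alternative: take the power spectrum of
# the selected signal and report the peak frequency, restricted here (as
# an assumption) to a plausible human pulse band, converted to beats per
# minute. The frame rate and band limits are illustrative.

def pulse_rate_bpm(signal, fs, band=(0.7, 3.0)):
    """signal: 1-D time series; fs: sampling rate in Hz.
    Returns the frequency of the power-spectrum peak within `band`, in bpm."""
    x = np.asarray(signal) - np.mean(signal)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    peak = freqs[mask][np.argmax(power[mask])]
    return 60.0 * peak

fs = 30.0                         # assumed camera frame rate (Hz)
t = np.arange(300) / fs           # 10 seconds of samples
sig = np.sin(2 * np.pi * 1.2 * t)  # 1.2 Hz component -> 72 bpm
bpm = pulse_rate_bpm(sig, fs)
```

Restricting the search to a physiological band keeps illumination drift and other out-of-band components from being mistaken for the pulse peak.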
- the measurement area setting unit 14 sets a plurality of measurement areas ri(k), and the luminance signal extraction units 15 and 15a extract a time-series pulse wave source signal wi(t) from the plurality of used measurement areas ri(k).
- the time-series pulse wave source signal wi(t) may be extracted from one used measurement region ri(k).
- the pulse wave estimation unit 17 estimates the subject's pulse wave using a method that does not use a general signal separation technique such as PCA or ICA (for example, a method of performing a Fourier transform on the above-mentioned selected pulse wave source signal information W(t)).
- the pulse wave estimation device can prevent a situation in which, during noise removal, even the pulse wave signal contained in the luminance signal of the subject's skin area is deemed a noise component and removed, resulting in the inability to extract the luminance signal of the subject's skin area that should be used to estimate the subject's pulse wave.
- 1, 1a, 1b: pulse wave estimation device, 11: image acquisition unit, 12: skin area detection unit, 13: face direction estimation unit, 14: measurement area setting unit, 15, 15a: luminance signal extraction unit, 16, 16a, 16b: luminance signal selection unit, 17: pulse wave estimation unit, 18: output unit, 19: weight setting unit, 2: imaging device, 101: processing circuit, 102: input interface device, 103: output interface device, 104: processor, 105: memory.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biomedical Technology (AREA)
- Molecular Biology (AREA)
- Physiology (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Engineering & Computer Science (AREA)
- Cardiology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Physics & Mathematics (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
The pulse wave estimation device according to the invention comprises: a captured image acquisition unit (11) that acquires a captured image; a skin area detection unit (12) that detects a skin area of a person from the captured image; a measurement area setting unit (14) that sets, in an area corresponding to the skin area on the captured image, a measurement area that can be used to extract a pulse wave source signal containing a pulse wave component of the person; a face direction estimation unit (13) that estimates the direction of the person's face based on the captured image; a luminance signal extraction unit (15, 15a) that sets a used measurement area and extracts time-series pulse wave source signals based on a luminance change in the set used measurement area; a luminance signal selection unit (16, 16a, 16b) that selects a time-series estimation pulse wave source signal to be used for estimating the person's pulse wave while taking into account the direction of the person's face estimated by the face direction estimation unit (13); and a pulse wave estimation unit (17) that estimates the person's pulse wave based on the time-series estimation pulse wave source signal selected by the luminance signal selection unit (16, 16a, 16b).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2023/025975 WO2025017760A1 (fr) | 2023-07-14 | 2023-07-14 | Dispositif d'inférence d'onde de pouls et procédé d'inférence d'onde de pouls |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2023/025975 WO2025017760A1 (fr) | 2023-07-14 | 2023-07-14 | Dispositif d'inférence d'onde de pouls et procédé d'inférence d'onde de pouls |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2025017760A1 true WO2025017760A1 (fr) | 2025-01-23 |
Family
ID=94281645
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/025975 WO2025017760A1 (fr) | 2023-07-14 | 2023-07-14 | Dispositif d'inférence d'onde de pouls et procédé d'inférence d'onde de pouls |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2025017760A1 (fr) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014198201A (ja) * | 2013-03-29 | 2014-10-23 | 富士通株式会社 | 脈波検出プログラム、脈波検出方法および脈波検出装置 |
JP2018189720A (ja) * | 2017-04-28 | 2018-11-29 | パナソニックIpマネジメント株式会社 | 情報出力制御装置、情報出力制御方法、情報出力システム、およびプログラム |
JP2020039480A (ja) * | 2018-09-07 | 2020-03-19 | 株式会社エクォス・リサーチ | 脈波検出装置、車両装置、及び脈波検出プログラム |
WO2020054122A1 (fr) * | 2018-09-10 | 2020-03-19 | 三菱電機株式会社 | Dispositif de traitement d'informations, programme et procédé de traitement d'informations |
JP2020162873A (ja) * | 2019-03-29 | 2020-10-08 | 株式会社エクォス・リサーチ | 脈波検出装置、及び脈波検出プログラム |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014198201A (ja) * | 2013-03-29 | 2014-10-23 | 富士通株式会社 | 脈波検出プログラム、脈波検出方法および脈波検出装置 |
JP2018189720A (ja) * | 2017-04-28 | 2018-11-29 | パナソニックIpマネジメント株式会社 | 情報出力制御装置、情報出力制御方法、情報出力システム、およびプログラム |
JP2020039480A (ja) * | 2018-09-07 | 2020-03-19 | 株式会社エクォス・リサーチ | 脈波検出装置、車両装置、及び脈波検出プログラム |
WO2020054122A1 (fr) * | 2018-09-10 | 2020-03-19 | 三菱電機株式会社 | Dispositif de traitement d'informations, programme et procédé de traitement d'informations |
JP2020162873A (ja) * | 2019-03-29 | 2020-10-08 | 株式会社エクォス・リサーチ | 脈波検出装置、及び脈波検出プログラム |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4728432B2 (ja) | 顔姿勢推定装置、顔姿勢推定方法、及び、顔姿勢推定プログラム | |
JP6695503B2 (ja) | 車両の運転者の状態を監視するための方法及びシステム | |
JP6973258B2 (ja) | 画像解析装置、方法およびプログラム | |
JP4585471B2 (ja) | 特徴点検出装置及びその方法 | |
KR102460665B1 (ko) | 응시 거리를 결정하는 방법 및 디바이스 | |
JP6919619B2 (ja) | 画像解析装置、方法およびプログラム | |
JP6822482B2 (ja) | 視線推定装置、視線推定方法及びプログラム記録媒体 | |
US20210056291A1 (en) | Method for analysis of an intrinsic facial feature of a face | |
KR20170092533A (ko) | 얼굴 포즈 교정 방법 및 장치 | |
JP6633462B2 (ja) | 情報処理装置および情報処理方法 | |
JP2013156680A (ja) | フェーストラッキング方法、フェーストラッカおよび車両 | |
JP2019185556A (ja) | 画像解析装置、方法およびプログラム | |
JP2009064395A (ja) | ポインティングデバイス、操作者の注視位置とカーソルの位置との誤差の補正をコンピュータに実行させるためのプログラムおよびそのプログラムを記録したコンピュータ読み取り可能な記録媒体 | |
JP2018101212A (ja) | 車載器および顔正面度算出方法 | |
WO2025017760A1 (fr) | Dispositif d'inférence d'onde de pouls et procédé d'inférence d'onde de pouls | |
CN114266691A (zh) | 过滤方法、过滤程序和过滤装置 | |
CN116348909A (zh) | 姿势检测装置、姿势检测方法及睡相判定方法 | |
Zhu et al. | 3D face pose tracking from an uncalibrated monocular camera | |
WO2020255645A1 (fr) | Dispositif de mise à jour de données tridimensionnelles, dispositif d'estimation d'orientation de visage, procédé de mise à jour de données tridimensionnelles et support d'enregistrement lisible par ordinateur | |
JP7558451B2 (ja) | 脈波推定装置、状態推定装置、及び、脈波推定方法 | |
KR102074977B1 (ko) | 전자 장치 및 그의 제어 방법 | |
JP7630062B2 (ja) | 脈波推定装置、状態推定装置、及び、脈波推定方法 | |
JP7542779B2 (ja) | 脈波推定装置及び脈波推定方法 | |
JP2021043914A (ja) | 画像処理装置、画像処理方法、及び画像処理プログラム | |
WO2023195148A1 (fr) | Dispositif d'estimation d'onde de pouls, dispositif d'estimation d'état et procédé d'estimation d'onde de pouls |