
US10803294B2 - Driver monitoring system - Google Patents

Driver monitoring system

Info

Publication number
US10803294B2
US10803294B2 (application US16/167,786; also published as US201816167786A)
Authority
US
United States
Prior art keywords
driver
zero
angle
point
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US16/167,786
Other versions
US20190138790A1 (en)
Inventor
Takeshi Matsumura
Jun Sato
Kenichiroh Hara
Junya Kasugai
Koichiro Yamauchi
Kazuhiro Tomosue
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toyota Motor Corp
Original Assignee
Toyota Motor Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed. https://patents.darts-ip.com/?family=66179137&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=US10803294(B2) "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Toyota Motor Corp
Assigned to TOYOTA JIDOSHA KABUSHIKI KAISHA reassignment TOYOTA JIDOSHA KABUSHIKI KAISHA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HARA, KENICHIROH, KASUGAI, JUNYA, MATSUMURA, TAKESHI, SATO, JUN, Tomosue, Kazuhiro, YAMAUCHI, KOICHIRO
Publication of US20190138790A1
Application granted
Publication of US10803294B2

Classifications

    • G06K9/00248
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • G06V40/165 Detection; Localisation; Normalisation using facial parts and geometric relationships
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • G06K9/00369
    • G06K9/00845
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597 Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/103 Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W2040/0818 Inactivity or incapacity of driver
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W2040/0881 Seat occupation; Driver or passenger presence
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/143 Alarm means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403 Image sensing, e.g. optical camera
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2520/00 Input parameters relating to overall vehicle dynamics
    • B60W2520/10 Longitudinal speed
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00 Input parameters relating to occupants
    • B60W2540/043 Identity of occupants
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00 Input parameters relating to occupants
    • B60W2540/225 Direction of gaze
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00 Input parameters relating to infrastructure
    • B60W2552/30 Road curve radius
    • G06K9/00208
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/64 Three-dimensional objects
    • G06V20/647 Three-dimensional objects by matching two-dimensional images to three-dimensional objects

Definitions

  • the present invention relates to a driver monitoring system.
  • Driver monitoring systems provided with a camera that captures a face image of a driver are known.
  • In such systems, the direction of the face of the driver is detected based on the face image captured by the camera, and it is determined, based on the detected face direction, whether or not the driver is inattentive while driving (for example, Patent Literature 1).
  • Patent Literature 1 Japanese Patent Publication (Kokai) No. 2000-326757
  • In such determination, the direction of the face of the driver when facing forward is taken as a reference. When the current angle of the face direction relative to this reference is large, it is determined that the driver is driving inattentively.
  • However, the direction of the face with respect to the driver monitoring camera when the driver is facing forward varies according to the installation position of the driver monitoring camera and the physique, seated posture, etc., of the driver.
  • Therefore, if a predetermined facial orientation with respect to the driver monitoring camera is taken as the face orientation when facing forward, the state in which the driver is facing forward may not be accurately recognized. In this case, it may be determined that the driver is driving inattentively even though this is not the case. Conversely, it may be determined that the driver is not driving inattentively despite the driver driving inattentively.
  • The present invention aims to solve the above problem by providing a driver monitoring system that can accurately recognize the direction of the face when the driver is facing forward.
  • FIG. 1 is a schematic block diagram showing the configuration of the driver monitoring system.
  • FIG. 2 is a schematic side view partially showing the inside of a vehicle equipped with the driver monitoring system.
  • FIG. 3 is a view schematically showing the head of a driver.
  • FIG. 4 is a view showing feature points used to recognize the facial orientation angle of the driver.
  • FIG. 5 is a flowchart showing the control routine for determining the execution timing of zero-point setting.
  • FIG. 6 is a flowchart showing the control routine for setting the zero-point of the facial orientation angle of the driver.
  • FIG. 7 is a flowchart showing the control routine for setting the zero-point of the facial orientation angle of the driver according to a second embodiment.
  • FIG. 1 is a schematic block diagram showing the configuration of the driver monitoring system 1 according to the present embodiment.
  • FIG. 2 is a schematic side view partially showing the inside of the vehicle 60 equipped with the driver monitoring system 1 according to the present embodiment.
  • the driver monitoring system 1 is installed in the vehicle 60 and monitors the state of the driver of the vehicle 60 . As shown in FIG. 1 , the driver monitoring system 1 is provided with a driver monitoring camera 11 (image capturing device) which captures a face image of the driver, a vehicle velocity sensor 12 which detects the velocity of the vehicle 60 , and a seat pressure sensor 13 which detects the pressure applied to the seat surface of the driver's seat.
  • the vehicle 60 comprises a steering wheel 62 attached via a steering column 61 , a rear-view mirror 63 arranged in front of and above the driver, and a driver's seat 64 on which the driver sits.
  • the driver monitoring camera 11 is provided on the upper part of the steering column 61 and is arranged facing the driver so as to be able to capture the driver, specifically the face and part of the upper body of the driver.
  • the steering column 61 is basically arranged so as to be positioned lower than the position of the face of the driver.
  • the image capture direction of the driver monitoring camera 11 is directed upwards from the horizontal by a predetermined angle θ (e.g., 10 to 20°).
  • As long as the driver monitoring camera 11 can capture the driver of the vehicle 60, it may be provided in a position different from the upper part of the steering column 61.
  • the driver monitoring camera 11 may be provided on the steering wheel 62 , the rear-view mirror 63 , the instrument panel, the instrument panel housing, etc., of the vehicle 60 .
  • the driver monitoring camera 11 is provided with a camera and a light source.
  • the camera may be, for example, a CMOS (complementary metal oxide semiconductor) camera or a CCD (charge coupled device) camera, and the light source may be an LED (light-emitting diode).
  • the light source is preferably a near-infrared LED, and the camera is also preferably capable of detecting near-infrared rays.
  • the light source may be two near-infrared LEDs arranged on both sides of the camera.
  • a filter such as a visible light cut-off filter may be provided on the camera.
  • the driver monitoring camera 11 is connected, by wire or wirelessly, to an ECU 50 , which is described later. Thus, the image captured by the driver monitoring camera 11 is transmitted to the ECU 50 as image data.
  • the vehicle velocity sensor 12 is a sensor for detecting the velocity of the vehicle 60 .
  • the vehicle velocity sensor 12 is, for example, attached to the housing (not shown) of the drive shaft of the vehicle to detect the rotation speed of the drive shaft of the vehicle, and detects the velocity of the vehicle 60 based on the detected rotation speed of the drive shaft.
  • the vehicle velocity sensor 12 is connected to the ECU 50 so that the output signal of the vehicle velocity sensor 12 is transmitted to the ECU 50 . Note that, as long as the vehicle velocity sensor 12 is capable of detecting the velocity of the vehicle 60 , the vehicle velocity sensor 12 may be a different type of sensor attached to another part of the vehicle.
  • the seat pressure sensor 13 is arranged under the seat surface of the driver's seat 64 .
  • the seat pressure sensor 13 is a pressure sensor for detecting the pressure applied to the seat surface of the driver's seat 64 by the driver.
  • The seat surface of the driver's seat 64 is divided into a plurality of parts, and the seat pressure sensor 13 can detect the pressure applied to each divided part.
  • the seat pressure sensor 13 can detect the pressure distribution applied to the seat surface of the driver's seat 64 .
  • the seat pressure sensor 13 is connected to the ECU 50 so that the output signal from the seat pressure sensor 13 is transmitted to the ECU 50 .
  • Note that the seat pressure sensor 13 need not necessarily be able to detect the pressure distribution applied to the seat surface of the driver's seat 64.
  • the driver monitoring system 1 is further provided with a surrounding environment detection device 20 , a GPS receiver 31 which receives a signal from GPS satellites, and a map database 32 related to map information. All of these components are connected to the ECU 50 .
  • the surrounding environment detection device 20 detects the environmental situation surrounding the vehicle 60 (hereinafter referred to as the “vehicle surrounding situation”).
  • the vehicle surrounding situation includes not only the road conditions around the vehicle 60 , but also obstacles, weather, etc., around the vehicle 60 .
  • the surrounding environment detection device 20 includes LIDAR (laser imaging detection and ranging) 21 , a millimeter-wave radar sensor 22 , and an external camera 23 .
  • the LIDAR 21 is a device which uses laser light to detect the road and obstacles surrounding a host vehicle.
  • the LIDAR 21 detects the status of the road and obstacles in the entire circumference of the vehicle 60 , i.e., the vehicle surrounding situation as a three-dimensional image, and transmits the three-dimensional image to the ECU 50 .
  • the millimeter-wave radar sensor 22 is a device which uses radio waves to detect obstacles surrounding a host vehicle at a greater distance than the LIDAR 21.
  • the millimeter-wave radar sensor 22 transmits such measurement results to the ECU 50 as the vehicle surrounding situation.
  • the external camera 23 is a device which photographs the area in front of the vehicle 60 .
  • the external camera 23 detects road information or the like of the area in front of the vehicle 60 , by performing image processing on the captured image of the area in front of the vehicle 60 .
  • the external camera 23 transmits the detected information to the ECU 50 .
  • the GPS receiver 31 is a device for detecting the current position of the vehicle 60 (e.g., the longitude and latitude of the vehicle 60 ) by receiving signals from three or more GPS satellites.
  • the GPS receiver 31 transmits the detected current position information of the vehicle 60 to the ECU 50 .
  • the map database 32 is a database related to map information.
  • the map database 32 is stored in, for example, an HDD (Hard Disk Drive) mounted on the vehicle 60 .
  • the map information includes information such as, for example, information on the locations of roads, information on the shapes of roads (e.g., the types of curves and straight lines, the curvature of curves, etc.), position information on intersections and branch points, and road types.
  • the driver monitoring system 1 is further provided with a human machine interface (HMI) 40 and an electronic control unit (ECU) 50.
  • the HMI 40 is an interface for performing information input and output between the driver or vehicle occupant and the driver monitoring system 1 .
  • the HMI 40 is provided with an information provision device for providing various information to the driver, specifically, a display for displaying character information and image information, and a speaker for outputting sound. Additionally, the HMI 40 is provided with a microphone for recognizing the voice of the driver, and operation buttons or a touch panel for the input operation of the driver.
  • the HMI 40 is connected, by wire or wirelessly, to the ECU 50 .
  • the information input by the driver or the like is transmitted from the HMI 40 to the ECU 50 .
  • the information to be provided to the driver by the information provision device of the HMI 40 is transmitted from the ECU 50 to the HMI 40 .
  • For example, when inattentive driving is detected, an inattention warning command is transmitted from the state diagnosis unit 53 to the HMI 40.
  • Upon receiving this command, the HMI 40, for example, generates a warning sound using the speaker and displays on the display a warning indicating that attention should be given.
  • In this way, the HMI 40 functions as an alarm unit that alerts the driver when an abnormality such as inattentiveness of the driver occurs.
  • the ECU 50 functions as a control device performing various controls, and is a microcomputer comprising a central processing unit (CPU), read-only memory (ROM), random access memory (RAM), an input port, and an output port, all of which are connected to each other through bidirectional buses.
  • In the present embodiment, a single ECU 50 is provided; alternatively, a plurality of ECUs, connected to each other via buses conforming to CAN or the like, may be provided for each function.
  • the ECU 50 comprises a facial orientation recognition unit 51 , a zero-point setting unit 52 , a state diagnosis unit 53 , a road condition detection unit 54 , and a driver change detection unit 55 .
  • the facial orientation recognition unit 51 recognizes the facial orientation angle of the driver, based on the face image of the driver captured by the driver monitoring camera 11 .
  • the direction of the face of the driver is represented by a pitch angle θx, a yaw angle θy, and a roll angle θz (collectively referred to as the "facial orientation angle").
  • The pitch angle θx, the yaw angle θy, and the roll angle θz are described briefly below.
  • FIG. 3 is a view schematically showing the face of the driver.
  • the direction of the face of the driver can be represented by angles about three mutually orthogonal axes X, Y, and Z.
  • The X axis is the axis extending in the left and right directions of the face of the driver; it extends horizontally, perpendicular to the direction of travel of the vehicle 60.
  • The angle θx of the face of the driver around the X axis is the pitch angle.
  • The pitch angle θx changes when the driver faces upwards or downwards.
  • The Y axis is the axis extending in the upward and downward directions of the face of the driver, i.e., the axis extending in the vertical direction.
  • The angle θy of the face of the driver around the Y axis is the yaw angle.
  • The yaw angle θy changes when the driver turns his/her head to the left or right.
  • The Z axis is the axis extending in the forward and backward directions of the face of the driver, i.e., the axis extending in the direction of travel of the vehicle 60.
  • The angle θz of the face of the driver around the Z axis is the roll angle.
  • The roll angle θz changes when the driver inclines his/her head.
  • FIG. 4 is a view showing feature points used to recognize the facial orientation angle of the driver.
  • When recognizing the face direction of the driver, the facial orientation recognition unit 51 first extracts feature points of the face from the face image.
  • the facial orientation recognition unit 51 extracts the outer ends of the eyes and the tip of the nose as feature points from the face image of the driver captured by the driver monitoring camera 11.
  • the facial orientation recognition unit 51 extracts, as shown in FIG. 4 , the outer end L of the left eye (the left-side end), the outer end R of the right eye (the right-side end), and the tip N of the nose.
  • In addition, the foot of the perpendicular from point N to line segment LR (the line segment connecting the outer ends of the left and right eyes) is extracted as feature point C.
  • The facial orientation recognition unit 51 recognizes the facial orientation angle of the driver based on the feature points extracted in this way.
  • Regarding the pitch angle θx, the ratio of the length of line segment CN to the length of line segment LR (hereinafter referred to as the "CN/LR ratio") becomes smaller as the direction of the face of the driver departs further from the state of facing the driver monitoring camera 11.
  • For example, the CN/LR ratio decreases as the face of the driver turns upwards. Therefore, in the present embodiment, the facial orientation recognition unit 51 detects the pitch angle θx of the face of the driver based on the CN/LR ratio.
  • The pitch angle θx at this time is provisionally calculated using, as a provisional reference, for example, the direction in which the face of the driver faces the driver monitoring camera 11 in the upward and downward directions (pitch angle direction), i.e., the direction in which the CN/LR ratio is maximum.
  • In this direction, the pitch angle of the face of the driver is provisionally assumed to be 0°.
  • The provisional pitch angle calculated in this way based on the provisional reference is referred to as the provisional pitch angle θx′.
  • Note that the provisional reference for calculating the provisional pitch angle need not necessarily be the direction in which the face of the driver faces the driver monitoring camera 11, but can be set to any direction.
  • Regarding the yaw angle, when the driver is facing the driver monitoring camera 11, the ratio of the length of line segment RC to the length of line segment LC (hereinafter referred to as the "RC/LC ratio") is approximately 1.
  • When the driver turns his/her face to the left or right, the RC/LC ratio deviates from 1 (for example, becoming smaller than 1).
  • Therefore, the facial orientation recognition unit 51 calculates the yaw angle θy based on the RC/LC ratio.
  • The yaw angle θy at this time is provisionally calculated using, for example, the direction in which the RC/LC ratio becomes 1 as a provisional reference.
  • In this direction, the yaw angle of the face of the driver is provisionally assumed to be 0°.
  • The provisional yaw angle calculated in this way based on the provisional reference is referred to as the provisional yaw angle θy′.
  • Note that the provisional reference for calculating the provisional yaw angle need not necessarily be the direction in which the RC/LC ratio becomes 1, but can be set to any direction.
  • The facial orientation recognition unit 51 calculates the roll angle θz of the driver based on the angle of line segment RL with respect to the horizontal line of the driver monitoring camera 11.
  • The roll angle θz at this time is provisionally calculated using, for example, the direction in which the angle of line segment RL is 0° as a provisional reference.
  • In this direction, the roll angle of the face of the driver is provisionally assumed to be 0°.
  • The provisional roll angle calculated in this way based on the provisional reference is referred to as the provisional roll angle θz′.
  • Note that the provisional reference for calculating the provisional roll angle need not necessarily be the direction in which the angle of line segment RL is 0°, but can be set to any direction.
  • the facial orientation recognition unit 51 extracts three feature points based on the face image captured by the driver monitoring camera 11 , and recognizes the facial orientation angle of the driver based on these feature points.
  • the facial orientation recognition unit 51 may be configured so as to recognize the facial orientation angle of the driver by another method.
  • the facial orientation recognition unit 51 may be configured so as to recognize the facial orientation angle of the driver based on, for example, a greater number of feature points, or may be configured to fit (match) the face image to a 3D model and recognize the facial orientation angle using this 3D model.
  • In this way, the provisional pitch angle θx′, provisional yaw angle θy′, and provisional roll angle θz′ are calculated by the facial orientation recognition unit 51 based on the provisional references.
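The three-feature-point geometry described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the exact mappings from the CN/LR and RC/LC ratios to angles are not specified in the text, so the formulas, the function name `provisional_angles`, and the `max_cn_lr` calibration constant are all assumptions.

```python
import math

def provisional_angles(L, R, N, max_cn_lr=0.5):
    """Sketch of provisional facial orientation angles from three feature
    points: outer eye corners L, R and nose tip N (image coordinates).
    max_cn_lr is an assumed calibration constant: the CN/LR ratio observed
    when the face squarely faces the camera (the provisional reference)."""
    # Feature point C: foot of the perpendicular from N onto segment LR.
    lr = (R[0] - L[0], R[1] - L[1])
    lr_len = math.hypot(lr[0], lr[1])
    t = ((N[0] - L[0]) * lr[0] + (N[1] - L[1]) * lr[1]) / lr_len**2
    C = (L[0] + t * lr[0], L[1] + t * lr[1])

    # Provisional pitch: the CN/LR ratio shrinks as the face tilts away
    # from the camera, so map it back through an inverse cosine.
    cn_lr = math.hypot(N[0] - C[0], N[1] - C[1]) / lr_len
    pitch = math.degrees(math.acos(min(1.0, cn_lr / max_cn_lr)))

    # Provisional yaw: RC/LC is ~1 when facing the camera; the asymmetry
    # of C along segment LR indicates a left/right turn (illustrative mapping).
    lc = math.hypot(C[0] - L[0], C[1] - L[1])
    rc = math.hypot(C[0] - R[0], C[1] - R[1])
    yaw = math.degrees(math.atan((rc - lc) / lr_len))

    # Provisional roll: angle of segment RL against the camera horizontal.
    roll = math.degrees(math.atan2(lr[1], lr[0]))
    return pitch, yaw, roll
```

For a symmetric, level face (L = (-1, 0), R = (1, 0), N = (0, 1)) all three provisional angles come out 0°, matching the provisional references described above.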
  • the zero-point setting unit 52 sets the zero-point of the facial orientation angle.
  • The zero-point is set so that the pitch angle θx, the yaw angle θy, and the roll angle θz are all 0° when the driver is facing forward.
  • Specifically, the zero-point setting unit 52 sets the zero-point of the pitch angle so that the pitch angle θx when the driver is facing forward is 0°.
  • When the driver is facing forward, the provisional pitch angle θx′ takes a certain value; the zero-point of the pitch angle is set by treating the pitch angle θx at that value as 0°. For example, if the provisional pitch angle θx′ is 15° when the driver is facing forward, the zero-point of the pitch angle of the face of the driver is set at the state in which the provisional pitch angle θx′ is 15°.
  • Similarly, the zero-point setting unit 52 sets the zero-point of the yaw angle so that the yaw angle θy when the driver is facing forward is 0°, and sets the zero-point of the roll angle so that the roll angle θz when the driver is facing forward is 0°.
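The zero-point setting amounts to storing the provisional angles observed while the driver is judged to be facing forward and reporting subsequent angles relative to them. A hedged sketch (the class and method names are illustrative, not from the patent):

```python
class ZeroPoint:
    """Stores the provisional facial orientation angles (pitch, yaw, roll)
    observed while the driver faces forward; corrected angles are then
    reported relative to this zero-point."""
    def __init__(self):
        self.zero = (0.0, 0.0, 0.0)

    def set_zero_point(self, provisional):
        # e.g. a provisional pitch of 15 deg while facing forward is
        # recorded so that this state reads as pitch = 0 afterwards.
        self.zero = tuple(provisional)

    def corrected(self, provisional):
        # Current facial orientation angle relative to the zero-point.
        return tuple(p - z for p, z in zip(provisional, self.zero))
```

After `set_zero_point((15.0, 0.0, 0.0))`, a later reading of `(25.0, 0.0, 0.0)` is reported as a pitch of 10° relative to the forward-facing state.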
  • The state diagnosis unit 53 diagnoses the state of the driver based on the current facial orientation angle with respect to the zero-point set by the zero-point setting unit 52.
  • As described above, the zero-point is set so that the pitch angle θx, the yaw angle θy, and the roll angle θz are all 0° when the driver is facing forward.
  • Therefore, for example, when the current facial orientation angle relative to the zero-point remains large, the state diagnosis unit 53 determines that the driver is driving inattentively.
  • Likewise, when the facial orientation angle with respect to the zero-point satisfies a predetermined abnormality condition, the state diagnosis unit 53 determines that an abnormality has occurred in the driver.
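As a rough illustration of the diagnosis step, an inattention check might flag the driver when the corrected yaw angle stays beyond a threshold over a window of samples. The 30° threshold and the all-samples rule are assumptions; the patent only states that diagnosis uses the current facial orientation angle relative to the zero-point.

```python
def is_inattentive(corrected_yaw_deg, threshold_deg=30.0):
    """Flag inattentive driving when every recent corrected yaw sample
    (degrees, relative to the zero-point) exceeds the threshold.
    Threshold and windowing rule are illustrative assumptions."""
    return all(abs(y) > threshold_deg for y in corrected_yaw_deg)
```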
  • the road condition detection unit 54 detects the condition of the road on which the vehicle 60 is travelling.
  • the road condition detection unit 54 may detect the road condition based on the vehicle surrounding situation detected by the surrounding environment detection device 20 or may detect the road condition based on the current position detected by the GPS receiver 31 and the map information of the map database 32 .
  • The road conditions detected by the road condition detection unit 54 include, for example, whether the road on which the vehicle 60 is travelling is straight or curved, the radius of curvature if it is a curve, and the inclination angle of the road on which the vehicle 60 is travelling.
  • the driver change detection unit 55 detects a change in the posture of the driver or a change of driver.
  • The driver change detection unit 55 detects a change in driver based on, for example, the pressure detected by the seat pressure sensor 13. Specifically, when the pressure detected by the seat pressure sensor 13 temporarily becomes zero and then returns to a certain value, the driver change detection unit 55 determines that the driver has changed or that the posture of the driver has changed. Furthermore, when the pressure distribution detected by the seat pressure sensor 13 changes without the pressure becoming zero, the driver change detection unit 55 determines that the posture of the driver has changed.
  • the driver change detection unit 55 may detect a change in driver based not on the seat pressure sensor 13 , but rather on the output of another detection device.
  • the driver change detection unit 55 may detect a change in driver based on the image captured by the driver monitoring camera 11 .
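The seat-pressure rule above can be sketched as a small classifier over successive pressure readings. The function name, the return labels, and the use of a simple boolean for "distribution changed" are illustrative assumptions:

```python
def classify_seat_event(pressure_history, distribution_changed):
    """Sketch of the driver change detection rule: if seat pressure
    dropped to zero and then returned, the driver changed or the posture
    changed; if pressure stayed nonzero but its distribution over the
    seat surface changed, only the posture changed."""
    went_zero = 0 in pressure_history[:-1]
    seated_now = pressure_history[-1] > 0
    if went_zero and seated_now:
        return "driver_changed_or_posture_changed"
    if seated_now and distribution_changed:
        return "posture_changed"
    return "no_change"
```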
  • The zero-point at which the above-described facial orientation angle is zero can also be set to a predetermined fixed value.
  • For example, a provisional reference (fixed value) for calculating the provisional pitch angle θx′ can be set in advance as the zero-point of the pitch angle θx,
  • and a provisional reference (fixed value) for calculating the provisional yaw angle θy′ can be set in advance as the zero-point of the yaw angle θy.
  • the position of the face of the driver with respect to the driver monitoring camera 11 differs between when the driver is driving while leaning forward and when the driver is sitting far back in the driver's seat 65 .
  • thus, the direction of the face of the driver with respect to the driver monitoring camera 11 when the driver is facing forward also differs between the two cases.
  • the position of the face of the driver with respect to the driver monitoring camera 11 differs for a driver having a high sitting height and a driver having a low sitting height.
  • the direction of the face of the driver with respect to the driver monitoring camera 11 when the driver is facing forward differs between the two.
  • the direction of the face with respect to the driver monitoring camera varies according to the installation position of the driver monitoring camera, and the physique, the seated posture, etc., of the driver.
  • the CN/LR ratio, the RC/LC ratio, and the angle of line segment RL when the driver is facing forward differ depending on the composition of the face of the driver.
  • thus, the provisional pitch angle θx′, the provisional yaw angle θy′, and the provisional roll angle θz′ when the driver faces forward take different values from driver to driver.
  • the state diagnosis unit 53 may determine the pitch angle, yaw angle, etc., of the face of the driver, based on a case in which the driver is not facing forward. As a result, the state diagnosis unit 53 may determine that the driver is driving inattentively, etc., when the driver is not driving inattentively, or conversely, may determine that the driver is not driving inattentively, etc., when the driver is driving inattentively.
  • the zero-point setting unit 52 sets the zero-point of the pitch angle, yaw angle, and roll angle of the face of the driver (i.e., the facial orientation angle), based on the facial orientation angle of the driver recognized by the facial orientation recognition unit 51 when zero-point setting conditions have been established.
  • the zero-point setting conditions include that the velocity of the vehicle is higher than a predetermined reference velocity greater than 0.
  • the zero-point setting unit 52 sets the zero-point, based on the face direction of the driver recognized by the facial orientation recognition unit 51 when the velocity of the vehicle is higher than the predetermined reference velocity greater than 0.
  • the reference velocity should be a velocity at which the frequency at which the driver is inattentive is reduced, for example, greater than or equal to 20 km/h, preferably greater than or equal to 40 km/h, and more preferably greater than or equal to 60 km/h.
  • the zero-point setting conditions include that the road condition detection unit 54 detects that the vehicle 60 is travelling on a straight road having no inclination.
  • the zero-point setting unit 52 sets the zero-point, based on the face direction of the driver recognized by the facial orientation recognition unit 51 when the vehicle 60 is travelling on a straight road having no inclination.
  • roads with no inclination include, in addition to roads that are not inclined at all with respect to the horizontal within the range that the driver can see, roads that are not substantially inclined within that range, for example, roads having an inclination angle of less than or equal to 0.5° with respect to the horizontal.
  • a straight road means a road extending linearly in the range visible to the driver.
  • the zero-points of the pitch angle, yaw angle, and roll angle of the face of the driver are set based on the provisional pitch angle θx′, the provisional yaw angle θy′, and the provisional roll angle θz′, respectively, recognized by the facial orientation recognition unit 51 when the zero-point setting conditions have been established.
  • the zero-point setting unit 52 acquires the face direction of the driver, i.e., the provisional pitch angle θx′, the provisional yaw angle θy′, and the provisional roll angle θz′, continuously over a predetermined period.
  • the average value of the provisional pitch angles θx′, the average value of the provisional yaw angles θy′, and the average value of the provisional roll angles θz′, acquired in this way over the predetermined period, are set as the zero-points of the pitch angle, the yaw angle, and the roll angle, respectively.
  • the predetermined period is a period long enough to avoid errors in the setting of the zero-point, for example, from 10 seconds to 5 minutes, preferably from 30 seconds to 3 minutes, and more preferably from 1 minute to 2 minutes.
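The per-axis averaging described above can be sketched as follows; this is a minimal illustration, and the sample representation is an assumption.

```python
# Set the zero-point of each facial orientation angle as the average of
# the provisional angles sampled over the predetermined period.

def set_zero_point(samples):
    """samples: list of (pitch, yaw, roll) provisional angles in degrees,
    acquired while the zero-point setting conditions held."""
    n = len(samples)
    pitch_zero = sum(s[0] for s in samples) / n
    yaw_zero = sum(s[1] for s in samples) / n
    roll_zero = sum(s[2] for s in samples) / n
    return (pitch_zero, yaw_zero, roll_zero)
```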
  • the setting of the zero-point is performed at the following timings.
  • the following timings are also one of the zero-point setting conditions.
  • the first timing may be when the setting of the zero-point of the facial orientation angle by the zero-point setting unit 52 has not yet been performed after the ignition switch of the vehicle 60 has been changed from “OFF” to “ON”.
  • it is highly likely that the driver or the posture of the driver changes when the ignition switch is changed to “ON”. Thus, by performing the setting of the zero-point at this timing, the zero-point can be matched to the new driver or the new posture of the driver.
  • the second timing may include when the time elapsed since the setting of the zero-point was last performed is longer than a predetermined period, or when the distance travelled since the setting of the zero-point was last performed is longer than a predetermined distance.
  • in general, it is highly likely that the posture of the driver will change after having driven for a long time or a long distance. Thus, by performing the setting of the zero-point at this timing, the zero-point can be matched with the posture of the driver after the change.
  • the third timing may include when the driver change detection unit 55 detects a change in the driver (i.e., a change in driver or a change in the posture of the driver, etc.). As described above, it is highly likely that when the driver or the posture thereof change, the zero-point will also change. Thus, by performing setting of the zero-point at this timing, the zero-point can be matched with the new driver or the new posture.
  • when the vehicle 60 is stopped, the driver is often not looking forward and is inattentive. Thus, if the setting of the zero-point of the facial orientation angle of the driver is performed when the vehicle 60 is stopped, there is a risk that the facial orientation angle of the driver when the driver is not facing forward will be set as the zero-point. Conversely, in the driver monitoring system 1 of the present embodiment, the setting of the zero-point is performed based on the facial orientation angle of the driver recognized by the facial orientation recognition unit 51 when the velocity of the vehicle is equal to or greater than the reference velocity. As a result, according to the driver monitoring system 1 of the present embodiment, the zero-point can be set to the facial orientation angle of the driver when the driver is facing forward.
  • when the front of the vehicle 60 is inclined upward or downward, the driver often faces upward or downward toward the direction of inclination. Thus, if the vehicle 60 is travelling on an inclined road, the face of the driver may not be facing forward (in particular, in the pitch angle direction).
  • conversely, in the driver monitoring system 1 of the present embodiment, the setting of the zero-point is performed based on the facial orientation angle of the driver recognized by the facial orientation recognition unit 51 when the vehicle 60 is travelling on a road having no inclination.
  • as a result, a more accurate zero-point of the facial orientation angle of the driver (in particular, the pitch angle) can be set.
  • similarly, when the vehicle 60 is travelling on a curved road, the face of the driver may not be facing forward (in particular, in the yaw angle direction).
  • the setting of the zero-point is performed based on the facial orientation angle of the driver recognized by the facial orientation recognition unit 51 when the vehicle 60 is travelling on a straight road.
  • the average values of the provisional pitch angle θx′, etc., acquired over a predetermined period are set as the zero-points of the pitch angle, etc., instead of the provisional pitch angle θx′, etc., acquired at an instantaneous timing.
  • the zero-point setting unit 52 sets the zero-point regarding all of the pitch angle, the yaw angle, and the roll angle of the face of the driver based on the facial orientation angle of the driver when the zero-point setting conditions have been established.
  • the zero-point setting unit 52 may set the zero-point of the facial orientation angle including at least any one of the pitch angle, the yaw angle, and the roll angle of the face of the driver.
  • the zero-point setting conditions include that the vehicle 60 is travelling on a straight road having no inclination.
  • such conditions need not necessarily be included as zero-point setting conditions.
  • the zero-point setting conditions may include only that the vehicle 60 is travelling on a road having no inclination or that the vehicle 60 is travelling on a straight road.
  • the setting of the zero-point is performed based on the average values of the provisional pitch angle θx′, etc., acquired over a predetermined period.
  • the setting of the zero-point may instead be performed based on the provisional pitch angle θx′, etc., acquired at a single time point, or based on the peaks of the frequency distributions of the provisional pitch angle θx′, etc., acquired over a predetermined period.
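The frequency-distribution alternative mentioned above could be sketched as follows; the 1° bin width and the function name are assumptions for illustration.

```python
from collections import Counter

# Take the zero-point from the peak (most frequent bin) of the frequency
# distribution of provisional angles, rather than from their average.

def zero_point_from_histogram(angles, bin_width=1.0):
    """Return the center of the most frequent bin of provisional angles (degrees)."""
    bins = Counter(int(a // bin_width) for a in angles)
    peak_bin, _ = bins.most_common(1)[0]
    return peak_bin * bin_width + bin_width / 2.0
```

Compared with a plain average, the histogram peak is less affected by brief glances away from the road during the sampling period.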
  • the zero-point setting conditions may include that the absolute value of the current yaw angle be within a predetermined angle (e.g., 15°), which is determined in advance.
  • when the driver is facing sideways, the facial orientation recognition unit 51 may not be able to accurately calculate the provisional pitch angle θx′. Therefore, if the setting of the zero-point is performed in a state in which the driver is facing sideways, there is a risk that the zero-point of the pitch angle will not be set correctly.
  • by performing the setting of the zero-point only when the absolute value of the current yaw angle is within the predetermined angle, which is determined in advance, the setting of the zero-point of the pitch angle can be performed accurately.
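The yaw-angle gate above amounts to a single comparison. The 15° limit follows the example given earlier; treating the bound as inclusive is an assumption.

```python
# Allow the pitch zero-point to be set only while the driver is not
# facing sideways, i.e., while |yaw| is within a predetermined angle.

def may_set_pitch_zero_point(current_yaw_deg, limit_deg=15.0):
    return abs(current_yaw_deg) <= limit_deg
```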
  • FIG. 5 is a flowchart showing the control routine for determining the execution timing of the setting of the zero-point.
  • the illustrated control routine is repeatedly executed at constant time intervals.
  • in step S11, it is determined whether or not the ignition switch of the vehicle 60 has been switched from “OFF” to “ON”. When it is determined that the ignition switch has been switched to “ON”, the routine proceeds to step S14. When it is determined that the ignition switch has not been switched to “ON”, the routine proceeds to step S12.
  • in step S12, it is determined whether or not the elapsed time T from the previous performance of the setting of the zero-point is equal to or greater than a predetermined reference time Tref. When the elapsed time T is equal to or greater than the reference time Tref, the routine proceeds to step S14. Conversely, when the elapsed time T is less than the reference time Tref, the routine proceeds to step S13.
  • in step S13, it is determined whether or not a change in the driver, i.e., a change in the posture of the driver or a change of driver, has been detected by the driver change detection unit 55. When such a change has been detected, the routine proceeds to step S14. Conversely, when no change has been detected, the control routine ends.
  • in step S14, the execution flag Fr of the setting of the zero-point is set to “ON”, and the control routine ends.
  • the execution flag Fr is set to “ON” when the setting of the zero-point is to be performed and is set to “OFF” otherwise.
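The FIG. 5 timing routine (steps S11 to S14) can be sketched as a single function; the argument names are assumptions for illustration.

```python
# One pass of the FIG. 5 routine: returns the new state of the execution
# flag Fr. The flag is raised on ignition-on (S11), when the elapsed time
# since the last setting reaches the reference time Tref (S12), or when a
# driver change is detected (S13); otherwise it is left unchanged.

def update_execution_flag(ignition_switched_on, elapsed_time, t_ref,
                          driver_change_detected, flag):
    if ignition_switched_on:            # S11 -> S14
        return True
    if elapsed_time >= t_ref:           # S12 -> S14
        return True
    if driver_change_detected:          # S13 -> S14
        return True
    return flag                         # no condition met: flag unchanged
```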
  • FIG. 6 is a flowchart showing the control routine for setting the zero-point of the facial orientation angle of the driver.
  • the illustrated control routine is repeatedly executed at constant time intervals.
  • in step S21, it is determined whether or not the execution flag Fr of the setting of the zero-point has been set to “ON”. If the execution flag Fr has been set to “ON” in step S14 of FIG. 5, it is determined in step S21 that the execution flag Fr has been set to “ON”, and the routine proceeds to step S22.
  • in step S22, it is determined whether or not the velocity V of the vehicle 60 detected by the vehicle velocity sensor 12 is greater than or equal to a predetermined reference velocity Vref. When the velocity V is less than the reference velocity Vref, the control routine ends. Conversely, when the velocity V is greater than or equal to the reference velocity Vref, the routine proceeds to step S23.
  • in step S23, it is determined whether or not it has been detected by the road condition detection unit 54 that the vehicle 60 is travelling on a straight road having no inclination. When this has not been detected, the control routine ends. Conversely, when it has been detected, the routine proceeds to step S24.
  • in step S24, the provisional facial orientation angle θ′ (i.e., the provisional pitch angle θx′, the provisional yaw angle θy′, and the provisional roll angle θz′) is acquired.
  • in step S25, the value obtained by adding 1 to the current value of a counter CT is set as the new value of the counter CT.
  • the counter CT is a counter indicating the elapsed time since the acquisition of the provisional facial orientation angle θ′ began.
  • in step S26, it is determined whether or not the counter CT is greater than or equal to a reference value CTref. When it is determined in step S26 that the counter CT is less than the reference value CTref, the control routine ends. Conversely, when it is determined in step S26 that the counter CT is greater than or equal to the reference value CTref, the routine proceeds to step S27.
  • in step S27, the zero-point of the facial orientation angle of the driver is set (updated) based on the provisional facial orientation angle θ′ acquired a plurality of times in step S24. Specifically, the average value of the provisional facial orientation angles θ′ is set as the zero-point of the facial orientation angle θ.
  • in step S28, the execution flag Fr is reset to “OFF”, and the control routine ends.
  • conversely, when it is determined in step S21 that the execution flag Fr has been set to “OFF”, the routine proceeds to step S29.
  • in step S29, the counter CT is reset to zero and the control routine ends.
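Steps S21 to S29 of the FIG. 6 routine can be sketched as follows. This is a hypothetical illustration: all names are assumptions, a single angle stands in for the (pitch, yaw, roll) triple, and the counter CT is incremented once per acquired sample rather than tracking elapsed time.

```python
# Sketch of the FIG. 6 zero-point routine: while the execution flag Fr is
# on and the driving conditions hold, provisional angles accumulate; once
# enough have been collected, their average becomes the new zero-point.

class ZeroPointSetter:
    def __init__(self, v_ref, ct_ref):
        self.v_ref = v_ref          # reference velocity Vref
        self.ct_ref = ct_ref        # reference count CTref
        self.samples = []
        self.flag = False           # execution flag Fr
        self.zero_point = None

    def step(self, velocity, on_flat_straight_road, provisional_angle):
        if not self.flag:                           # S21 "OFF" -> S29
            self.samples.clear()                    # reset counter CT
            return
        if velocity < self.v_ref:                   # S22: routine ends
            return
        if not on_flat_straight_road:               # S23: routine ends
            return
        self.samples.append(provisional_angle)      # S24, S25
        if len(self.samples) >= self.ct_ref:        # S26
            # S27: average of acquired provisional angles becomes the zero-point.
            self.zero_point = sum(self.samples) / len(self.samples)
            self.flag = False                       # S28: reset Fr to "OFF"
            self.samples.clear()
```

For example, with `ct_ref=3`, three calls to `step()` at sufficient velocity on a flat straight road produce an averaged zero-point and clear the flag.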
  • next, the driver monitoring system 1 according to a second embodiment will be described with reference to FIG. 7.
  • the configuration and control of the driver monitoring system according to the second embodiment is substantially the same as the configuration and control of the driver monitoring system according to the first embodiment.
  • the points of difference from the driver monitoring system of the first embodiment will be mainly described below.
  • the provisional facial orientation angle θ′ when the driver is facing forward essentially falls within a predetermined range (reference angle range).
  • for example, the provisional yaw angle θy′ when the driver is facing forward essentially falls within the range of about −10° to about 10°.
  • thus, in the second embodiment, when the average value of the provisional yaw angles θy′ calculated for the setting of the zero-point (i.e., the average value of the provisional yaw angles θy′ during the above-mentioned period) is not within the reference angle range, the setting of the zero-point is prohibited.
  • similarly, when the average value of the provisional pitch angles θx′ calculated for the setting of the zero-point is not within the reference angle range, or when the average value of the provisional roll angles θz′ calculated for the setting of the zero-point is not within the reference angle range, the setting of the zero-point is prohibited.
  • in other words, when the provisional facial orientation angle θ′ calculated for the setting of the zero-point is not within the reference angle range, which is set in advance, the setting of the zero-point of the facial orientation angle is prohibited.
  • the reference angle range is set to a range which the facial orientation angle may normally take when the driver is facing forward.
  • FIG. 7 is a flowchart showing the control routine for the setting of the zero-point of the facial orientation angle of the driver in the second embodiment.
  • the illustrated control routine is repeatedly executed at constant time intervals. Note that steps S31 to S33, S35 to S37, and S39 to S41 in FIG. 7 are similar to steps S21 to S29 in FIG. 6, respectively, and thus an explanation thereof has been omitted below.
  • in step S34, it is determined whether or not prohibition conditions have been established.
  • the prohibition conditions are established, for example, when the driver is operating the switches of the vehicle 60.
  • when it is determined that the prohibition conditions have not been established, the routine proceeds to step S35. Conversely, when it is determined that the prohibition conditions have been established, the control routine ends.
  • in step S38, it is determined whether or not the average value θ′ave of the provisional facial orientation angles (i.e., the provisional pitch angle θx′, the provisional yaw angle θy′, and the provisional roll angle θz′), which have been acquired a plurality of times in step S35, is within the reference angle range (θ′1 to θ′2), which is set in advance. Specifically, it is determined whether or not, for example, the average value of the provisional pitch angles θx′ is within the reference angle range between the first provisional pitch angle θx′1 and the second provisional pitch angle θx′2.
  • when it is determined in step S38 that the average value θ′ave of the provisional facial orientation angles is not within the reference angle range, which is set in advance, the control routine ends. Conversely, when it is determined that the average value θ′ave of the provisional facial orientation angles is within the reference angle range, which is set in advance, the routine proceeds to step S39 and the setting of the zero-point is performed.
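The step S38 range check can be sketched as follows; representing the averaged angles and the per-axis reference ranges as dictionaries is an assumption for illustration.

```python
# S38: the averaged provisional angles must each lie within a preset
# reference angle range before the zero-point may be updated.

def within_reference_range(avg_angles, ranges):
    """avg_angles: dict axis -> averaged provisional angle (degrees).
    ranges: dict axis -> (low, high) reference angle range (degrees)."""
    return all(ranges[axis][0] <= avg_angles[axis] <= ranges[axis][1]
               for axis in avg_angles)
```

When this check fails (e.g., the averaged yaw suggests the driver was looking sideways for the whole sampling period), the setting of the zero-point is simply skipped.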


Abstract

A driver monitoring system 1 includes: an image capturing device 11 which captures a face image of the driver; and a control device 50 to which the face image captured by the image capturing device is transmitted. The control device includes: a facial orientation recognition unit which recognizes the facial orientation angle of the driver based on the face image captured by the image capturing device; a zero-point setting unit which sets a zero-point of a pitch angle, yaw angle, and roll angle of the face of the driver based on the facial orientation angle recognized by the facial orientation recognition unit, when zero-point setting conditions, including that the velocity of the vehicle is equal to or greater than a predetermined velocity, are satisfied; and a state diagnosis unit that diagnoses a state of the driver based on the current facial orientation angle with respect to the zero-point set by the zero-point setting unit.

Description

TECHNICAL FIELD
The present invention relates to a driver monitoring system.
BACKGROUND ART
Driver monitoring systems provided with a camera that captures a face image of a driver have been known. In such driver monitoring systems, for example, the direction of the face of the driver is detected based on the face image captured by the camera, and it is determined whether or not the driver is inattentive during driving, based on the detected direction of the face of the driver (for example, Patent Literature 1).
CITATION LIST Patent Literature
[Patent Literature 1] Japanese Patent Publication (Kokai) No. 2000-326757
SUMMARY OF INVENTION Technical Problem
In order to determine whether or not the driver is inattentive during driving based on the direction of the face of the driver, the direction of the face of the driver when facing forward is taken as a reference. When the current angle of the direction of the face relative to the reference direction is large, it is determined that the driver is inattentive while driving.
However, the direction of the face with respect to the driver monitoring camera when the driver is facing forward varies in accordance with the installation position of the driver monitoring camera, and the physique, seated posture, etc., of the driver. Thus, if a predetermined facial orientation with respect to the driver monitoring camera is taken as the face orientation when facing forward, there is a possibility that the direction of the face when the driver is facing forward cannot be accurately recognized. In this case, it may be determined that the driver is driving inattentively even though this is not the case. Conversely, it may be determined that the driver is not driving inattentively, despite the driver driving inattentively.
The present invention aims to solve the above problem by providing a driver monitoring system which can accurately recognize the direction of the face when the driver is facing forward.
Advantageous Effect of Invention
According to the present invention, provided is a driver monitoring system which can accurately recognize the direction of the face when the driver is facing forward.
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1 is a schematic block diagram showing the configuration of the driver monitoring system.
FIG. 2 is a schematic side view partially showing the inside of a vehicle equipped with the driver monitoring system.
FIG. 3 is a view schematically showing the head of a driver.
FIG. 4 is a view showing feature points used to recognize the facial orientation angle of the driver.
FIG. 5 is a flowchart showing the control routine for determining the execution timing of zero-point setting.
FIG. 6 is a flowchart showing the control routine for setting the zero-point of the facial orientation angle of the driver.
FIG. 7 is a flowchart showing the control routine for setting the zero-point of the facial orientation angle of the driver according to a second embodiment.
DESCRIPTION OF EMBODIMENTS
The embodiments of the present invention will be described in detail below with reference to the drawings. Note that, in the description below, the same constituent elements are assigned the same reference numerals.
First Embodiment Configuration of the Device
The configuration of the driver monitoring system 1 according to an embodiment will be described with reference to FIGS. 1 and 2. FIG. 1 is a schematic block diagram showing the configuration of the driver monitoring system 1 according to the present embodiment. FIG. 2 is a schematic side view partially showing the inside of the vehicle 60 equipped with the driver monitoring system 1 according to the present embodiment.
The driver monitoring system 1 is installed in the vehicle 60 and monitors the state of the driver of the vehicle 60. As shown in FIG. 1, the driver monitoring system 1 is provided with a driver monitoring camera 11 (image capturing device) which captures a face image of the driver, a vehicle velocity sensor 12 which detects the velocity of the vehicle 60, and a seat pressure sensor 13 which detects the pressure applied to the seat surface of the driver's seat.
First, the driver monitoring camera 11 will be described with reference to FIG. 2. As shown in FIG. 2, the vehicle 60 comprises a steering wheel 62 attached via a steering column 61, a rear-view mirror 63 arranged in front of and above the driver, and a driver's seat 64 on which the driver sits.
In the present embodiment, the driver monitoring camera 11 is provided on the upper part of the steering column 61 and is arranged facing the driver so as to be able to capture the driver, specifically the face and part of the upper body of the driver. In particular, the steering column 61 is basically arranged so as to be positioned lower than the position of the face of the driver. Thus, the image capture direction of the driver monitoring camera 11 is directed upwards from the horizontal by a predetermined angle α (e.g., 10 to 20°).
Note that, as long as the driver monitoring camera 11 can capture the driver of the vehicle 60, it may be provided in a position different from the upper part of the steering column 61. For example, the driver monitoring camera 11 may be provided on the steering wheel 62, the rear-view mirror 63, the instrument panel, the instrument panel housing, etc., of the vehicle 60.
The driver monitoring camera 11 is provided with a camera and a light source. The camera may be, for example, a CMOS (complementary metal oxide semiconductor) camera or a CCD (charge coupled device) camera, and the light source may be an LED (light-emitting diode). Furthermore, in order to be able to capture the face of the driver without causing discomfort to the driver, even in low light such as at night, the light source is preferably a near-infrared LED, and the camera is also preferably capable of detecting near-infrared rays. For example, the light source may be two near-infrared LEDs arranged on both sides of the camera. Furthermore, a filter such as a visible light cut-off filter may be provided on the camera.
The driver monitoring camera 11 is connected, by wire or wirelessly, to an ECU 50, which is described later. Thus, the image captured by the driver monitoring camera 11 is transmitted to the ECU 50 as image data.
The vehicle velocity sensor 12 is a sensor for detecting the velocity of the vehicle 60. The vehicle velocity sensor 12 is, for example, attached to the housing (not shown) of the drive shaft of the vehicle to detect the rotation speed of the drive shaft of the vehicle, and detects the velocity of the vehicle 60 based on the detected rotation speed of the drive shaft. The vehicle velocity sensor 12 is connected to the ECU 50 so that the output signal of the vehicle velocity sensor 12 is transmitted to the ECU 50. Note that, as long as the vehicle velocity sensor 12 is capable of detecting the velocity of the vehicle 60, the vehicle velocity sensor 12 may be a different type of sensor attached to another part of the vehicle.
The seat pressure sensor 13 is arranged under the seat surface of the driver's seat 64. The seat pressure sensor 13 is a pressure sensor for detecting the pressure applied to the seat surface of the driver's seat 64 by the driver. In particular, in the present embodiment, when the driver's seat 64 is virtually divided into a plurality of seat surfaces, the seat pressure sensor 13 can detect the pressure applied to each divided part. In other words, the seat pressure sensor 13 can detect the pressure distribution applied to the seat surface of the driver's seat 64. The seat pressure sensor 13 is connected to the ECU 50 so that the output signal from the seat pressure sensor 13 is transmitted to the ECU 50.
Note that, as long as the seat pressure sensor 13 is capable of detecting at least the pressure applied to the entire seat surface of the driver's seat 64, the seat pressure sensor 13 may not be able to detect the pressure distribution applied to the seat surface of the driver's seat 64.
Furthermore, the driver monitoring system 1 is further provided with a surrounding environment detection device 20, a GPS receiver 31 which receives a signal from GPS satellites, and a map database 32 related to map information. All of these components are connected to the ECU 50.
The surrounding environment detection device 20 detects the environmental situation surrounding the vehicle 60 (hereinafter referred to as the “vehicle surrounding situation”). The vehicle surrounding situation includes not only the road conditions around the vehicle 60, but also obstacles, weather, etc., around the vehicle 60. In the present embodiment, the surrounding environment detection device 20 includes LIDAR (laser imaging detection and ranging) 21, a millimeter-wave radar sensor 22, and an external camera 23.
The LIDAR 21 is a device which uses laser light to detect the road and obstacles surrounding a host vehicle. The LIDAR 21 detects the status of the road and obstacles in the entire circumference of the vehicle 60, i.e., the vehicle surrounding situation as a three-dimensional image, and transmits the three-dimensional image to the ECU 50. The millimeter-wave radar sensor 22 is a device which detects obstacles surrounding a host vehicle at a distance greater than the LIDAR 21, using radio waves. The millimeter-wave radar sensor 22 transmits such measurement results to the ECU 50 as the vehicle surrounding situation. The external camera 23 is a device which photographs the area in front of the vehicle 60. The external camera 23 detects road information or the like of the area in front of the vehicle 60, by performing image processing on the captured image of the area in front of the vehicle 60. The external camera 23 transmits the detected information to the ECU 50.
The GPS receiver 31 is a device for detecting the current position of the vehicle 60 (e.g., the longitude and latitude of the vehicle 60) by receiving signals from three or more GPS satellites. The GPS receiver 31 transmits the detected current position information of the vehicle 60 to the ECU 50.
The map database 32 is a database related to map information. The map database 32 is stored in, for example, an HDD (Hard Disk Drive) mounted on the vehicle 60. The map information includes information such as, for example, information on the locations of roads, information on the shapes of roads (e.g., the types of curves and straight lines, the curvature of curves, etc.), position information on intersections and branch points, and road types.
The driver monitoring system 1 is further provided with a human machine interface (HMI) 40 and an electronic control unit (ECU) 50.
The HMI 40 is an interface for performing information input and output between the driver or vehicle occupant and the driver monitoring system 1. The HMI 40 is provided with an information provision device for providing various information to the driver, specifically, a display for displaying character information and image information, and a speaker for outputting sound. Additionally, the HMI 40 is provided with a microphone for recognizing the voice of the driver, and operation buttons or a touch panel for the input operation of the driver.
The HMI 40 is connected, by wire or wirelessly, to the ECU 50. Thus, the information input by the driver or the like is transmitted from the HMI 40 to the ECU 50. Furthermore, the information to be provided to the driver by the information provision device of the HMI 40 is transmitted from the ECU 50 to the HMI 40. For example, when it is determined by the state diagnosis unit 53 of the ECU 50, which is described later, that the driver is driving inattentively, an inattention warning command is transmitted from the state diagnosis unit 53 to the HMI 40. When such an inattention warning command is transmitted, the HMI 40, for example, generates a warning sound using the speaker, and displays on the display a warning indicating that attention should be given. Thus, the HMI 40 functions as an alarm unit that alerts the driver to abnormalities when an anomaly such as inattentiveness of the driver occurs.
The ECU 50 functions as a control device performing various controls, and is a microcomputer comprising a central processing unit (CPU), read-only memory (ROM), random access memory (RAM), an input port, and an output port, all of which are connected to each other through bidirectional buses. In the present embodiment, a single ECU 50 is provided. However, a plurality of ECUs, connected to each other via buses conforming to CAN or the like, may be provided for each function. As shown in FIG. 1, in the present embodiment, the ECU 50 comprises a facial orientation recognition unit 51, a zero-point setting unit 52, a state diagnosis unit 53, a road condition detection unit 54, and a driver change detection unit 55.
The facial orientation recognition unit 51 recognizes the facial orientation angle of the driver, based on the face image of the driver captured by the driver monitoring camera 11. In the present embodiment, the direction of the face of the driver is represented by a pitch angle θx, a yaw angle θy, and a roll angle θz (collectively referred to as the “facial orientation angle”). The pitch angle θx, the yaw angle θy, and the roll angle θz will be described briefly below.
FIG. 3 is a view schematically showing the face of the driver. As shown in FIG. 3, the direction of the face of the driver can be represented by angles about three mutually orthogonal axes X, Y, and Z. In the drawing, the X axis is the axis extending in the left and right directions of the face of the driver, and is the axis extending perpendicular to the direction of travel of the vehicle 60 and perpendicular to the horizontal direction. The angle θx of the face of the driver around the X axis represents the pitch angle. Thus, the pitch angle θx of the driver changes when the driver faces upwards or downwards.
In the drawing, the Y axis is the axis extending in the upwards and downwards directions of the face of the driver, and is the axis extending in the vertical direction. The angle θy of the face of the driver around the Y axis represents the yaw angle. Thus, the yaw angle θy of the driver changes when the driver turns his/her head to the left or right.
In the drawing, the Z axis is the axis extending in the forwards and backwards directions of the face of the driver, and is the axis extending in the direction of travel of the vehicle 60. The angle θz of the face of the driver around the Z axis represents the roll angle. Thus, the roll angle θz of the driver changes when the driver inclines his/her head.
Next, the specific method for recognizing the facial orientation angle of the driver by the facial orientation recognition unit 51 will be described with reference to FIG. 4. FIG. 4 is a view showing feature points used to recognize the facial orientation angle of the driver.
The facial orientation recognition unit 51, when recognizing the face direction of the driver, first extracts feature points of the face from the face image. In the present embodiment, the facial orientation recognition unit 51 extracts the outer ends of the eyes and the tip of the nose as feature points from the face image of the driver captured by the driver monitoring camera 11. Specifically, the facial orientation recognition unit 51 extracts, as shown in FIG. 4, the outer end L of the left eye (the left-side end), the outer end R of the right eye (the right-side end), and the tip N of the nose. As shown in FIG. 4, the intersection of line segment LR (the line segment connecting the outer ends of the left and right eyes) with the line passing through point N perpendicular to line segment LR is extracted as feature point C.
The facial orientation recognition unit 51 recognizes the facial orientation angle of the driver, based on the feature points extracted in this way. First, regarding the pitch angle θx, the ratio of the length of line segment CN to the length of line segment LR (hereinafter referred to as the “CN/LR ratio”) becomes smaller as the direction of the face of the driver deviates further from the state of facing the driver monitoring camera 11. Thus, when the driver monitoring camera 11 is inclined at a certain angle from the horizontal, as in the present embodiment, the CN/LR ratio decreases as the face of the driver turns upwards. Therefore, in the present embodiment, the facial orientation recognition unit 51 detects the pitch angle θx of the face of the driver based on the CN/LR ratio.
Furthermore, the pitch angle θx at this time is provisionally calculated, using, for example, the direction in which the face of the driver faces the driver monitoring camera 11 in the upward and downward directions (pitch angle direction) as a provisional reference (i.e., the direction in which the CN/LR ratio is maximum). Thus, when the face of the driver faces the driver monitoring camera 11 in the pitch angle direction, the pitch angle is provisionally calculated assuming that the pitch angle of the face of the driver is 0°. Below, the provisional pitch angle calculated in this way based on the provisional reference, is referred to as the provisional pitch angle θx′. Note that the provisional reference for calculating the provisional pitch angle need not necessarily be the direction in which the face of the driver faces the driver monitoring camera 11, but can be set to any direction.
Further, regarding the yaw angle θy, in the state in which the face of the driver faces towards the driver monitoring camera 11, the ratio of line segment LC to line segment RC (hereinafter referred to as the “RC/LC ratio”) is approximately 1. When the driver's face turns from this state to the right, since line segment RC becomes short, the RC/LC ratio becomes smaller than 1. Further, when the driver's face turns from this state to the left, since line segment LC becomes short, the RC/LC ratio becomes greater than 1. Therefore, in the present embodiment, the facial orientation recognition unit 51 calculates the yaw angle θy based on the RC/LC ratio.
Furthermore, the yaw angle θy at this time is provisionally calculated, using, for example, the direction in which the RC/LC ratio becomes 1 as a provisional reference. Thus, when the RC/LC ratio becomes 1, the yaw angle is provisionally calculated assuming that the yaw angle of the face of the driver is 0°. Below, the provisional yaw angle calculated in this way based on the provisional reference is referred to as the provisional yaw angle θy′. Note that the provisional reference for calculating the provisional yaw angle need not necessarily be the direction in which the RC/LC ratio becomes 1, but can be set to any direction.
Further, regarding the roll angle θz, when the direction of the face of the driver is tilted, the angle of line segment RL (hereinafter referred to as “the angle of line segment RL”) with respect to the horizontal line of the driver monitoring camera 11, changes. Therefore, in the present embodiment, the facial orientation recognition unit 51 calculates the roll angle θz of the driver, based on the angle of line segment RL with respect to the horizontal line of the driver monitoring camera 11.
Furthermore, the roll angle θz at this time is provisionally calculated, using, for example, the direction in which the angle of line segment RL is 0° as a provisional reference. Thus, when the angle of line segment RL is 0°, the roll angle is provisionally calculated assuming that the roll angle of the face of the driver is 0°. Below, the provisional roll angle calculated in this way based on the provisional reference is referred to as the provisional roll angle θz′. Note that the provisional reference for calculating the provisional roll angle need not necessarily be the direction in which the angle of line segment RL is 0°, but can be set to any direction.
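As an illustrative sketch of the ratio-based calculations above, the following Python function derives provisional pitch, yaw, and roll estimates from the three feature points L, R, and N. The particular mappings from each ratio to an angle (the assumed frontal CN/LR ratio of 0.6 and the logarithmic yaw mapping) are hypothetical calibration choices for illustration only, not part of the described embodiment:

```python
import math

def provisional_angles(L, R, N):
    """Sketch of the ratio-based provisional angle estimates.

    L, R, N are (x, y) pixel coordinates of the outer ends of the left
    and right eyes and the tip of the nose.  The mappings from each
    ratio to an angle are hypothetical; a real system would calibrate
    these relationships.
    """
    lx, ly = L
    rx, ry = R
    nx, ny = N
    # Feature point C: foot of the perpendicular from N onto segment LR.
    dx, dy = rx - lx, ry - ly
    t = ((nx - lx) * dx + (ny - ly) * dy) / (dx * dx + dy * dy)
    cx, cy = lx + t * dx, ly + t * dy
    lr = math.hypot(dx, dy)                  # length of segment LR
    cn = math.hypot(nx - cx, ny - cy)        # length of segment CN
    lc = math.hypot(cx - lx, cy - ly)        # length of segment LC
    rc = math.hypot(rx - cx, ry - cy)        # length of segment RC
    # Pitch: the CN/LR ratio shrinks as the face tilts away from the
    # camera; 0.6 is an assumed frontal CN/LR ratio.
    pitch = math.degrees(math.acos(min(1.0, (cn / lr) / 0.6)))
    # Yaw: RC/LC is ~1 facing the camera, <1 turned right, >1 turned left.
    yaw = math.degrees(math.atan(math.log(rc / lc)))  # hypothetical mapping
    # Roll: angle of segment RL relative to the camera's horizontal line.
    roll = math.degrees(math.atan2(ry - ly, rx - lx))
    return pitch, yaw, roll
```

For a symmetric, level face (e.g., L=(0, 0), R=(100, 0), N=(50, 60)), all three provisional angles come out as 0°, matching the provisional references described above.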
Note that in the example shown in FIG. 4, the facial orientation recognition unit 51 extracts three feature points based on the face image captured by the driver monitoring camera 11, and recognizes the facial orientation angle of the driver based on these feature points. However, the facial orientation recognition unit 51 may be configured so as to recognize the facial orientation angle of the driver by another method. Thus, the facial orientation recognition unit 51 may be configured so as to recognize the facial orientation angle of the driver based on, for example, a greater number of feature points, or may be configured to fit (match) the face image to a 3D model and recognize the facial orientation angle using this 3D model. In either case, the provisional pitch angle θx′, provisional yaw angle θy′, and provisional roll angle θz′ (collectively referred to as the “provisional facial orientation angle θ′”) are calculated by the facial orientation recognition unit 51 based on the provisional reference.
The zero-point setting unit 52 sets the zero-point of the facial orientation angle. The zero-point is set so that the pitch angle θx, the yaw angle θy, and the roll angle θz are all 0° when the driver is facing forward.
For example, when considering the pitch angle direction, the provisional pitch angle θx′ does not necessarily become 0° when the driver is facing forward. Therefore, the zero-point setting unit 52 sets the zero-point of the pitch angle so that the pitch angle θx when the driver is facing forward is 0°. Thus, when the provisional pitch angle θx′ is a certain angle, setting of the zero-point of the pitch angle is performed assuming that the pitch angle θx is 0°. For example, if the provisional pitch angle θx′ is 15° when the driver is facing forward, the zero-point of the pitch angle of the face of the driver is set in a state in which the provisional pitch angle θx′ is 15°.
The same is true for the yaw angle direction and the roll angle direction. Thus, the zero-point setting unit 52 sets the zero-point of the yaw angle such that the yaw angle θy when the driver is facing forward is 0° and sets the zero-point of the roll angle so that the roll angle θz when the driver is facing forward is 0°.
The state diagnosis unit 53 diagnoses the state of the driver, based on the current facial orientation angle with respect to the zero-point set by the zero-point setting unit 52. As described above, the zero-point is set so that the pitch angle θx, the yaw angle θy, and the roll angle θz are all 0° when the driver is facing forward. Thus, for example, during operation of the vehicle 60, when the absolute values of the pitch angle θx and the yaw angle θy with respect to zero-point are greater than a predetermined reference value over a predetermined time or longer, the state diagnosis unit 53 determines that the driver is driving inattentively. Furthermore, during operation of the vehicle 60, when the roll angle θz with respect to the zero-point is greater than a predetermined reference value over a predetermined time or longer, the state diagnosis unit 53 determines that an abnormality has occurred in the driver.
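The diagnosis logic of the state diagnosis unit 53 can be sketched as follows: the current facial orientation angle is taken relative to the zero-point, and a determination is made only when the deviation persists. The threshold values and the sampling-based persistence check are assumed for illustration; the embodiment only specifies "a predetermined reference value" and "a predetermined time":

```python
# Sketch of the state diagnosis: angles are measured relative to the
# zero-point, and a determination is made only when the deviation
# persists.  The constants below are assumed, not from the embodiment.

INATTENTION_DEG = 30.0    # assumed reference value for |pitch| / |yaw|
ABNORMAL_ROLL_DEG = 30.0  # assumed reference value for |roll|
HOLD_SAMPLES = 20         # assumed "predetermined time", in samples

class StateDiagnosis:
    def __init__(self, zero):
        self.zero = zero              # zero-point (pitch0, yaw0, roll0)
        self.inattentive_count = 0
        self.abnormal_count = 0

    def update(self, provisional):
        # Current facial orientation angle relative to the zero-point.
        pitch, yaw, roll = (p - z for p, z in zip(provisional, self.zero))
        if abs(pitch) > INATTENTION_DEG or abs(yaw) > INATTENTION_DEG:
            self.inattentive_count += 1
        else:
            self.inattentive_count = 0
        if abs(roll) > ABNORMAL_ROLL_DEG:
            self.abnormal_count += 1
        else:
            self.abnormal_count = 0
        return {
            "inattentive": self.inattentive_count >= HOLD_SAMPLES,
            "abnormal": self.abnormal_count >= HOLD_SAMPLES,
        }
```

The persistence counters reflect the "over a predetermined time or longer" condition: a single glance away does not trigger a determination, while a sustained deviation does.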
The road condition detection unit 54 detects the condition of the road on which the vehicle 60 is travelling. The road condition detection unit 54 may detect the road condition based on the vehicle surrounding situation detected by the surrounding environment detection device 20, or may detect the road condition based on the current position detected by the GPS receiver 31 and the map information of the map database 32. The road conditions detected by the road condition detection unit 54 include, for example, whether or not the road on which the vehicle 60 is travelling is a straight road and, if it is curved, the radius of curvature of the road, and additionally, the inclination angle of the road on which the vehicle 60 is travelling.
The driver change detection unit 55 detects a change in the posture of the driver or a change of driver. The driver change detection unit 55 detects a change in driver based on, for example, the pressure of the seat pressure sensor 13. Specifically, for example, when the pressure detected by the seat pressure sensor 13 temporarily becomes zero and then becomes a certain value again, the driver change detection unit 55 determines that the driver has changed or that the posture of the driver has changed. Furthermore, when the distribution in pressure detected by the seat pressure sensor 13 changes without becoming zero, the driver change detection unit 55 determines that the posture of the driver has changed.
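The seat-pressure-based detection described above can be sketched as follows. The classification works on a short history of pressure samples; the near-zero threshold and the distribution-shift criterion (simplified here to a large shift in the scalar reading) are assumed for illustration:

```python
def classify_seat_event(pressure_history, eps=1.0):
    """Sketch of driver-change detection from seat pressure samples.

    pressure_history: chronological pressure readings from the seat
    pressure sensor.  Returns 'driver_or_posture_change' if the pressure
    temporarily dropped to ~zero and then returned to a certain value,
    'posture_change' if the reading shifted without dropping to zero, or
    None.  The eps threshold and the shift criterion are assumed.
    """
    # Did the pressure temporarily become (near) zero mid-history?
    dropped = any(p < eps for p in pressure_history[1:-1])
    if dropped and pressure_history[-1] >= eps:
        return "driver_or_posture_change"
    # Pressure changed substantially without ever dropping to zero.
    if abs(pressure_history[-1] - pressure_history[0]) > 10 * eps:
        return "posture_change"
    return None
```

A real implementation would compare the full pressure distribution across the seat surface rather than a single scalar, as the embodiment describes.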
Note that the driver change detection unit 55 may detect a change in driver based not on the seat pressure sensor 13, but rather on the output of another detection device. Thus, for example, the driver change detection unit 55 may detect a change in driver based on the image captured by the driver monitoring camera 11.
Problems with Zero-Point Setting
The zero-point, in which the above-described facial orientation angle is zero, can be set to a predetermined fixed value. For example, a provisional reference (fixed value) for calculating the provisional pitch angle θx′ can be set in advance as the zero-point of the pitch angle θx, and a provisional reference (fixed value) for calculating the provisional yaw angle θy′ can be set in advance as the zero point of the yaw angle θy.
However, if the zero-point of the pitch angle θx or the zero-point of the yaw angle θy is set as a fixed value, it may not be possible to accurately recognize when the driver is facing forward.
For example, the position of the face of the driver with respect to the driver monitoring camera 11 differs between when the driver is driving while leaning forward and when the driver is sitting far back in the driver's seat 65. As a result, the direction of the face of the driver with respect to the driver monitoring camera 11 when the driver is facing forward differs between these two cases. Furthermore, the position of the face of the driver with respect to the driver monitoring camera 11 differs between a driver having a high sitting height and a driver having a low sitting height, so the direction of the face of the driver with respect to the driver monitoring camera 11 when the driver is facing forward also differs between the two. Thus, the direction of the face with respect to the driver monitoring camera varies according to the installation position of the driver monitoring camera and the physique, seated posture, etc., of the driver. Furthermore, the CN/LR ratio, the RC/LC ratio, and the angle of line segment RL when the driver is facing forward differ depending on the facial structure of the driver. Thus, the provisional pitch angle θx′, the provisional yaw angle θy′, and the provisional roll angle θz′ when the driver faces forward take different values for different drivers.
If it cannot be accurately recognized when the driver is facing forward, the state diagnosis unit 53 may determine the pitch angle, yaw angle, etc., of the face of the driver, based on a case in which the driver is not facing forward. As a result, the state diagnosis unit 53 may determine that the driver is driving inattentively, etc., when the driver is not driving inattentively, or conversely, may determine that the driver is not driving inattentively, etc., when the driver is driving inattentively.
Setting of the Zero-Point
In the present embodiment, the zero-point setting unit 52 sets the zero-point of the pitch angle, yaw angle, and roll angle of the face of the driver (i.e., the facial orientation angle), based on the facial orientation angle of the driver recognized by the facial orientation recognition unit 51 when zero-point setting conditions have been established.
In the present embodiment, the zero-point setting conditions include that the velocity of the vehicle is higher than a predetermined reference velocity greater than 0. Thus, the zero-point setting unit 52 sets the zero-point, based on the face direction of the driver recognized by the facial orientation recognition unit 51 when the velocity of the vehicle is higher than the predetermined reference velocity greater than 0. It is preferable that the reference velocity be a velocity at which the frequency at which the driver is inattentive is reduced, for example, greater than or equal to 20 km/h, preferably greater than or equal to 40 km/h, and more preferably greater than or equal to 60 km/h.
Furthermore, in the present embodiment, the zero-point setting conditions include that the road condition detection unit 54 detects that the vehicle 60 is travelling on a straight road having no inclination. Thus, the zero-point setting unit 52 sets the zero-point, based on the face direction of the driver recognized by the facial orientation recognition unit 51 when the vehicle 60 is travelling on a straight road having no inclination.
Roads with no inclination include, in addition to roads that are not inclined at all with respect to the horizontal within the range that the driver can see, roads that are not substantially inclined within that range, for example, roads having an inclination angle of less than or equal to 0.5° with respect to the horizontal. Furthermore, a straight road means a road extending linearly within the range visible to the driver.
Specifically, in the present embodiment, the zero-point of the pitch angle, yaw angle, and roll angle of the face of the driver are set based on the provisional pitch angle θx′, the provisional yaw angle θy′, and the provisional roll angle θz′, respectively, recognized by the facial orientation recognition unit 51 when the zero-point setting conditions have been established.
In particular, in the present embodiment, the zero-point setting unit 52 acquires the face direction of the driver, i.e., the provisional pitch angle θx′, the provisional yaw angle θy′, and the provisional roll angle θz′, continuously over a predetermined period. The average value of the provisional pitch angles θx′, the average value of the provisional yaw angles θy′, and the average value of the provisional roll angles θz′, acquired in this way over the predetermined period, are set as the zero-point of the pitch angle, the yaw angle, and the roll angle, respectively. The predetermined period is a period long enough to avoid errors in the setting of the zero-point, for example, from 10 seconds to 5 minutes, preferably from 30 seconds to 3 minutes, and more preferably from 1 minute to 2 minutes.
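The averaging described above can be sketched in a few lines. The function takes the provisional angle tuples collected while the zero-point setting conditions held and averages each component independently:

```python
from statistics import mean

def set_zero_point(samples):
    """Average each provisional angle over the acquisition period.

    samples: list of (pitch', yaw', roll') tuples collected while the
    zero-point setting conditions held.  Averaging over the period damps
    moments when the driver briefly looks away (e.g., at an obstacle),
    so a short glance does not corrupt the zero-point.
    """
    pitches, yaws, rolls = zip(*samples)
    return (mean(pitches), mean(yaws), mean(rolls))
```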
Furthermore, in the present embodiment, the setting of the zero-point is performed at the following timings. Thus, the following timings are also one of the zero-point setting conditions.
The first timing may be when the setting of the zero-point of the facial orientation angle by the zero-point setting unit 52 has not yet been performed after the ignition switch of the vehicle 60 has been changed from “OFF” to “ON”. When the ignition switch is changed to “ON”, it is highly likely that the driver and the posture of the driver have changed since the vehicle was last operated, and when the driver and the posture thereof have changed, it is highly likely that zero-point has also changed accordingly. Thus, by setting the zero-point at this timing, the zero-point can be matched to the new driver or the new posture of the driver.
The second timing may include when the time elapsed since the setting of the zero-point was last performed is longer than a predetermined period or when the distance travelled since the setting of the zero-point was last performed is longer than a predetermined distance. In general, it is highly likely that the posture of the driver will change after having driven for a long time or a long distance. Thus, by setting the zero-point at such timing, the zero-point can be matched with the posture after the change in the driver.
The third timing may include when the driver change detection unit 55 detects a change in the driver (i.e., a change in driver or a change in the posture of the driver, etc.). As described above, it is highly likely that when the driver or the posture thereof change, the zero-point will also change. Thus, by performing setting of the zero-point at this timing, the zero-point can be matched with the new driver or the new posture.
Mode of Operation and Effect
Next, the mode of operation and effect of the driver monitoring system 1 according to the present embodiment will be described.
When the vehicle 60 is stopped, the driver is often not looking forward and is inattentive. Thus, if setting of the zero-point of the facial orientation angle of the driver is performed when the vehicle 60 is stopped, there is a risk that the facial orientation angle of the driver when the driver is not facing forward will be set as the zero-point. Conversely, in the driver monitoring system 1 of the present embodiment, the setting of the zero-point is performed, based on the facial orientation angle of the driver recognized by the facial orientation recognition unit 51 when the velocity of the vehicle is equal to or greater than the reference velocity. As a result, according to the driver monitoring system 1 of the present embodiment, the zero-point can be set as the facial orientation angle of the driver when the driver is facing forward.
Furthermore, when the front of the vehicle 60 is upwardly or downwardly inclined, the driver often faces upwards or downwards toward the direction of inclination. Thus, if the vehicle 60 is travelling on an inclined road, the face of the driver may not be facing forward (in particular, in the pitch angle direction). Conversely, in the driver monitoring system 1 of the present embodiment, the setting of the zero-point is performed based on the facial orientation angle of the driver recognized by the facial orientation recognition unit 51 when the vehicle 60 is travelling on a road having no inclination. As a result, according to the driver monitoring system 1 of the present embodiment, a more accurate zero-point of the facial orientation angle of the driver (in particular, the pitch angle) can be set.
Additionally, when the road in front of the vehicle 60 is curved, the driver is likely to face the turning direction in accordance with the curve. Thus, when the vehicle 60 is travelling on a curved road, the face of the driver is sometimes not facing forward (in particular, in the yaw angle direction). Conversely, in the driver monitoring system 1 of the present embodiment, the setting of the zero-point is performed based on the facial orientation angle of the driver recognized by the facial orientation recognition unit 51 when the vehicle 60 is travelling on a straight road. As a result, according to the driver monitoring system 1 of the present embodiment, a more accurate zero-point of the facial orientation angle of the driver (in particular, the yaw angle) can be set.
Further, in the present embodiment, the average values of the provisional pitch angle θx′, etc., acquired over a predetermined period are set as the zero-point of the pitch angle, etc., instead of the provisional pitch angle θx′, etc., acquired at an instantaneous timing. Thus, it is possible to set the zero-point relatively accurately even under circumstances in which the driver notices obstacles or the like on the road and thus temporarily does not face forward.
Modification of the First Embodiment
Note that, in the present embodiment, the zero-point setting unit 52 sets the zero-point of all of the pitch angle, the yaw angle, and the roll angle of the face of the driver based on the facial orientation angle of the driver when the zero-point setting conditions have been established. However, the zero-point setting unit 52 need not set the zero-point for all of the pitch angle, the yaw angle, and the roll angle of the face of the driver; it may set the zero-point of a facial orientation angle including at least any one of the pitch angle, the yaw angle, and the roll angle of the face of the driver.
Furthermore, in the above-described embodiment, the zero-point setting conditions include that the vehicle 60 is travelling on a straight road having no inclination. However, such conditions need not necessarily be included as zero-point setting conditions. Alternatively, the zero-point setting conditions may include only that the vehicle 60 is travelling on a road having no inclination or that the vehicle 60 is travelling on a straight road.
Further, in the above-described embodiment, the setting of the zero-point is performed based on the average values of the provisional pitch angle θx′, etc., acquired over a predetermined period. However, it is not necessary to use the average values of the provisional pitch angle θx′, etc., acquired over a predetermined period to set the zero-point. For example, the setting of the zero-point may be performed based on the provisional pitch angle θx′, etc., acquired at a single time point, or the setting of the zero-point may be performed based on the peaks of the frequency distributions of the provisional pitch angle θx′, etc., acquired over a predetermined period.
Furthermore, when performing the setting of the zero-point of the pitch angle, the zero-point setting conditions may include that the absolute value of the current yaw angle be within a predetermined angle (e.g., 15°), which is determined in advance. In other words, if the driver is facing sideways, the facial orientation recognition unit 51 may not be able to accurately calculate the provisional pitch angle θx′. Therefore, if the setting of the zero-point is performed in a state in which the driver is facing sideways, there is a risk that the zero-point of the pitch angle will not be set correctly. Thus, when the absolute value of the current yaw angle is within the predetermined angle, which is determined in advance, the setting of the zero-point of the pitch angle can be accurately performed.
Flowchart
FIG. 5 is a flowchart showing the control routine for determining the execution timing of the setting of the zero-point. The illustrated control routine is repeatedly executed at constant time intervals.
As shown in FIG. 5, first, in step S11, it is determined whether or not the ignition switch of the vehicle 60 has been switched from “OFF” to “ON”. When it is determined that the ignition switch has been switched to “ON”, the routine proceeds to step S14. When it is determined that the ignition switch has not been changed to “ON”, the routine proceeds to step S12.
In step S12, it is determined whether or not the elapsed time T from the previous performance of the setting of the zero-point is equal to or greater than a predetermined reference time Tref. When it is determined that the elapsed time T is greater than or equal to the reference time Tref, the routine proceeds to step S14. When it is determined that the elapsed time T is less than the reference time Tref, the routine proceeds to step S13.
In step S13, it is determined whether or not a change in the driver, i.e., a change in the posture of the driver or a change of driver, has been detected by the driver change detection unit 55. When it is determined that a change in the driver has been detected in step S13, the routine proceeds to step S14. Conversely, when it has been determined that a change in driver has not been detected, the control routine ends.
In step S14, the execution flag Fr of the setting of the zero-point is set to “ON”, and the control routine ends. The execution flag Fr is set to “ON” when the setting of the zero-point is to be performed and is set to “OFF” otherwise.
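The decision made by the FIG. 5 routine can be sketched as a single predicate covering the three timings. The parameter names are illustrative; the routine itself would read these values from the ignition switch, an internal timer (or odometer), and the driver change detection unit 55:

```python
def should_set_zero_point(ignition_just_on, elapsed_since_last, t_ref,
                          driver_change_detected):
    """Sketch of the FIG. 5 routine deciding when to set the zero-point.

    Returns True (execution flag Fr -> ON) at any of the three timings;
    a distance-travelled check could be added alongside the elapsed-time
    check in the same way.
    """
    if ignition_just_on:                 # S11: IG switch OFF -> ON
        return True
    if elapsed_since_last >= t_ref:      # S12: elapsed time T >= Tref
        return True
    if driver_change_detected:           # S13: driver/posture change
        return True
    return False                         # no timing reached: Fr stays OFF
```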
FIG. 6 is a flowchart showing the control routine for setting the zero-point of the facial orientation angle of the driver. The illustrated control routine is repeatedly executed at constant time intervals.
As shown in FIG. 6, first, in step S21, it is determined whether or not the execution flag Fr of the setting of the zero-point has been set to “ON”. If the execution flag Fr has been set to “ON” in step S14 of FIG. 5, it is determined in step S21 that the execution flag Fr has been set to “ON”, and the routine proceeds to step S22.
In step S22, it is determined whether or not the velocity V of the vehicle 60 detected by the vehicle velocity sensor 12 is greater than or equal to a predetermined reference velocity Vref. When it is determined in step S22 that the velocity V of the vehicle 60 is less than the reference velocity Vref, the control routine ends. When it is determined in step S22 that the velocity V of the vehicle 60 is greater than or equal to the reference velocity Vref, the routine proceeds to step S23.
In step S23, it is determined whether or not it has been detected by the road condition detection unit 54 that the vehicle 60 is travelling on a straight road having no inclination. When it is determined that the vehicle 60 is not travelling on a straight road having no inclination, the control routine ends. Conversely, when it is determined in step S23 that the vehicle 60 is travelling on a straight road having no inclination, the routine proceeds to step S24. In step S24, the provisional facial orientation angle θ′ (i.e., the provisional pitch angle θx′, the provisional yaw angle θy′, and the provisional roll angle θz′) is acquired.
Next, in step S25, the value obtained by adding 1 to the current value of a counter CT is set as a new value of the counter CT. The counter CT is a counter indicating the elapsed time since the acquisition of the provisional facial orientation angle θ′ has begun. Next, in step S26, it is determined whether or not the counter CT is greater than or equal to a reference value CTref. When it is determined in step S26 that the counter CT is less than the reference value CTref, the control routine ends. Conversely, when it is determined in step S26 that the counter CT is greater than or equal to the reference value CTref, the routine proceeds to step S27.
In step S27, the zero-point of the facial orientation angle of the driver is set (updated), based on the provisional facial orientation angle θ′ acquired a plurality of times in step S24. Specifically, the average value of the provisional facial orientation angles θ′ is set as the zero-point of the facial orientation angle θ. Next, in step S28, the execution flag Fr is reset to “OFF”, and the control routine ends.
On the other hand, when it is determined in step S21 that the execution flag Fr has been set to “OFF”, the routine proceeds to step S29. In step S29, the counter CT is reset to zero and the control routine ends.
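The FIG. 6 routine as a whole can be sketched as the following class, where one call to `step` corresponds to one execution of the control routine. The reference velocity and the counter reference value are assumed constants:

```python
class ZeroPointRoutine:
    """Sketch of the FIG. 6 control routine.  Constants are assumed."""
    V_REF = 40.0    # reference velocity Vref (km/h), assumed
    CT_REF = 100    # counter reference value CTref (samples), assumed

    def __init__(self):
        self.fr = False          # execution flag Fr
        self.counter = 0         # counter CT
        self.samples = []        # acquired provisional angles
        self.zero_point = None

    def step(self, velocity, on_straight_flat_road, provisional_angle):
        if not self.fr:                        # S21 "OFF" -> S29
            self.counter = 0
            self.samples.clear()
            return
        if velocity < self.V_REF:              # S22: too slow, end
            return
        if not on_straight_flat_road:          # S23: road unsuitable, end
            return
        self.samples.append(provisional_angle)  # S24: acquire angle
        self.counter += 1                       # S25: CT = CT + 1
        if self.counter >= self.CT_REF:         # S26: enough samples?
            # S27: set the zero-point to the average provisional angle
            n = len(self.samples)
            self.zero_point = tuple(sum(a) / n for a in zip(*self.samples))
            self.fr = False                     # S28: reset Fr to "OFF"
```

Running the routine with the flag set and a steady face direction of (15°, 0°, 0°) on a fast, straight, flat road yields that direction as the zero-point once the counter reaches its reference value.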
Second Embodiment
Next, the driver monitoring system 1 according to a second embodiment will be described with reference to FIG. 7. The configuration and control of the driver monitoring system according to the second embodiment is substantially the same as the configuration and control of the driver monitoring system according to the first embodiment. The points of difference from the driver monitoring system of the first embodiment will be mainly described below.
When the driver operates the switches of the vehicle 60, it is likely that the driver is looking toward the switches. Thus, even if the above-described zero-point setting conditions are established, for example, when the driver is operating the switches of the vehicle 60, it is highly likely that the driver is not facing forward.
In the present embodiment, when prohibition conditions, such as the operation of the switches in the vicinity of the driver being detected, are established, the setting of the zero-point is prohibited even if the zero-point setting conditions are established. In particular, in the present embodiment, at such a time, acquisition of the provisional facial orientation angle θ′ from the facial orientation recognition unit 51 by the zero-point setting unit 52 is stopped.
Furthermore, the provisional facial orientation angle θ′ when the driver is facing forward is essentially within a predetermined range (reference angle range). For example, the provisional yaw angle θy′ when the driver is facing forward essentially falls within the range of about −10° to about 10°. Thus, the average value of the provisional yaw angles θy′ calculated for the setting of the zero-point (i.e., the average value of the provisional yaw angles θy′ over the above-mentioned predetermined period) should be within this range.
In the present embodiment, when the average value of the provisional yaw angles θy′ calculated for the setting of the zero-point is not within the reference angle range, the setting of the zero-point is prohibited. Likewise, the setting of the zero-point is prohibited when the average value of the provisional pitch angles θx′ calculated for the setting of the zero-point is not within the corresponding reference angle range, or when the average value of the provisional roll angles θz′ calculated for the setting of the zero-point is not within the corresponding reference angle range. In other words, in the present embodiment, when the provisional facial orientation angle θ′ calculated for the setting of the zero-point is not within the reference angle range, which is set in advance, the setting of the zero-point of the facial orientation angle is prohibited. The reference angle range is set to a range which the facial orientation angle may normally take when the driver is facing forward.
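The reference-angle-range check above can be sketched as follows. The ±10° limits follow the yaw example given earlier; applying the same limits to all three angles is an illustrative simplification, since the patent allows each angle its own range:

```python
def within_reference_range(avg_angle_deg, low=-10.0, high=10.0):
    """True if an averaged provisional angle lies in the reference angle
    range the angle may normally take when the driver faces forward."""
    return low <= avg_angle_deg <= high

def zero_point_allowed(avg_pitch, avg_yaw, avg_roll):
    # Setting of the zero-point is prohibited if any averaged
    # provisional angle falls outside its reference range.
    return (within_reference_range(avg_pitch) and
            within_reference_range(avg_yaw) and
            within_reference_range(avg_roll))
```

With these limits, averages of (0°, 5°, −3°) permit the setting, while a yaw average of 15° prohibits it.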
FIG. 7 is a flowchart showing the control routine for the setting of the zero-point of the facial orientation angle of the driver in the second embodiment. The illustrated control routine is repeatedly executed at constant time intervals. Note that steps S31 to S33, S35 to S37 and S39 to S41 in FIG. 7 are similar to steps S21 to S29 in FIG. 6, respectively, and thus an explanation thereof has been omitted below.
In step S34, it is determined whether or not the prohibition conditions have been established. The prohibition conditions are established, for example, when the driver is operating the switches of the vehicle 60. When it is determined in step S34 that the prohibition conditions have not been established, the routine proceeds to step S35. When it is determined in step S34 that the prohibition conditions have been established, the control routine ends.
Furthermore, in step S38, it is determined whether or not the average value θ′ave of the provisional facial orientation angles (i.e., the provisional pitch angle θx′, the provisional yaw angle θy′, and the provisional roll angle θz′), which have been acquired a plurality of times in step S35, is within the reference angle range (θ′1 to θ′2), which is set in advance. Specifically, it is determined whether or not, for example, the average value of the provisional pitch angles θx′ is within the reference angle range between the first provisional pitch angle θx′1 and the second provisional pitch angle θx′2. When it is determined in step S38 that the average value θ′ave of the provisional facial orientation angles is not within the reference angle range, which is set in advance, the control routine ends. Conversely, when it is determined that the average value θ′ave of the provisional facial orientation angles is within the reference angle range, which is set in advance, the routine proceeds to step S39 and the setting of the zero-point is performed.
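The flow of steps S34 to S39 for a single angle can be combined into one sketch; the sample count and range limits are illustrative assumptions, not values from the patent:

```python
from statistics import mean

def try_set_zero_point(samples, prohibited, n_required=100,
                       ref_range=(-10.0, 10.0)):
    """Illustrative sketch of the FIG. 7 routine for one angle: average
    n_required provisional angles and return the new zero-point, or
    None when the routine ends without setting it."""
    if prohibited:                    # S34: prohibition conditions hold
        return None
    if len(samples) < n_required:     # keep acquiring (S35-S37)
        return None
    avg = mean(samples[:n_required])  # average over the constant period
    low, high = ref_range
    if not (low <= avg <= high):      # S38: outside the reference range
        return None
    return avg                        # S39: zero-point set to the average
```

For example, 100 samples averaging 2° yield a zero-point of 2.0, while the same samples averaging 20°, or any samples gathered while a prohibition condition holds, leave the zero-point unset.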
REFERENCE SIGNS LIST
  • 1. driver monitoring system
  • 11. driver monitoring camera
  • 12. vehicle velocity sensor
  • 13. seat pressure sensor
  • 20. surrounding environment detection device
  • 31. GPS receiver
  • 32. map database
  • 40. HMI
  • 50. ECU
  • 51. facial orientation recognition unit
  • 52. zero-point setting unit
  • 53. state diagnosis unit
  • 54. road condition detection unit
  • 55. driver change detection unit
  • 60. vehicle

Claims (6)

The invention claimed is:
1. A driver monitoring system comprising an image capturing device for capturing a face image of a driver and a controller to which the face image captured by the image capturing device is transmitted, wherein the controller is configured to:
recognize a facial orientation angle of the driver based on the face image captured by the image capturing device;
set a zero-point of the facial orientation angle including at least one of a pitch angle, a yaw angle, and a roll angle of a face of the driver, based on the facial orientation angle of the driver recognized by the controller, when zero-point setting conditions, including that a speed of a vehicle is higher than a predetermined speed faster than zero, are established; and
diagnose a state of the driver based on a current facial orientation angle with respect to the zero-point set by the controller.
2. The driver monitoring system according to claim 1, wherein the zero-point setting conditions include that the setting of the zero-point of the facial orientation angle has not yet been performed after the ignition of the vehicle has been turned on.
3. The driver monitoring system according to claim 1, wherein the zero-point setting conditions include that a predetermined time has elapsed or the vehicle has traveled a predetermined distance since setting of the zero-point of the facial orientation angle has been last performed.
4. The driver monitoring system according to claim 1, wherein the controller is configured to detect a posture of the driver or a change in the driver, wherein
the zero-point setting conditions include that the controller detects that the posture of the driver has changed or that the driver has changed.
5. The driver monitoring system according to claim 1, wherein the controller is configured to detect a condition of a road on which the vehicle is travelling, wherein
the controller sets a zero-point of a pitch angle of the face of the driver, and
the zero-point setting conditions include that the controller detects that the vehicle is travelling on a road having no inclination.
6. The driver monitoring system according to claim 1, wherein the controller is configured to detect the condition of a road on which the vehicle is travelling, wherein
the controller sets a zero-point of a yaw angle of the face of the driver, and
the zero-point setting conditions include that the controller detects that the vehicle is travelling on a straight road.
US16/167,786 2017-11-09 2018-10-23 Driver monitoring system Active 2039-01-31 US10803294B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-216602 2017-11-09
JP2017216602A JP6915503B2 (en) 2017-11-09 2017-11-09 Driver monitor system

Publications (2)

Publication Number Publication Date
US20190138790A1 US20190138790A1 (en) 2019-05-09
US10803294B2 true US10803294B2 (en) 2020-10-13

Family

ID=66179137

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/167,786 Active 2039-01-31 US10803294B2 (en) 2017-11-09 2018-10-23 Driver monitoring system

Country Status (4)

Country Link
US (1) US10803294B2 (en)
JP (1) JP6915503B2 (en)
CN (1) CN109760689A (en)
DE (1) DE102018127886A1 (en)


Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6973298B2 (en) * 2018-05-31 2021-11-24 トヨタ自動車株式会社 Object monitoring device
US11068701B2 (en) * 2019-06-13 2021-07-20 XMotors.ai Inc. Apparatus and method for vehicle driver recognition and applications of same
JP7294194B2 (en) * 2020-03-11 2023-06-20 いすゞ自動車株式会社 Safe driving judgment device
JP7314085B2 (en) * 2020-03-18 2023-07-25 株式会社東海理化電機製作所 Image processing device, computer program, and anomaly estimation system
JP7402719B2 (en) * 2020-03-18 2023-12-21 株式会社東海理化電機製作所 Image processing device, computer program, and abnormality estimation system
JP2021196941A (en) * 2020-06-16 2021-12-27 トヨタ自動車株式会社 Driver monitoring system
JP7359087B2 (en) 2020-07-02 2023-10-11 トヨタ自動車株式会社 Driver monitoring device and driver monitoring method
JP7327301B2 (en) * 2020-07-02 2023-08-16 トヨタ自動車株式会社 Driver monitoring device
FR3119145B1 (en) * 2021-01-22 2023-10-27 Renault Sas Method for determining a level of distraction of a vehicle driver
CN114680889A (en) * 2022-03-08 2022-07-01 中国石油大学(北京) Method and device for identifying unsafe behaviors of drilling operation personnel


Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6496117B2 (en) * 2001-03-30 2002-12-17 Koninklijke Philips Electronics N.V. System for monitoring a driver's attention to driving
JP2005196567A (en) * 2004-01-08 2005-07-21 Nissan Motor Co Ltd Face direction detecting device
JP4894572B2 (en) * 2006-03-14 2012-03-14 オムロン株式会社 Detection apparatus and method, recording medium, and program
CN102910130B (en) * 2012-10-24 2015-08-05 浙江工业大学 Forewarn system is assisted in a kind of driving of real enhancement mode
CN103978932B (en) * 2014-05-16 2016-06-29 长安大学 The detecting device of a kind of bus driver dangerous driving and detection method
CN105313898B (en) * 2014-07-23 2018-03-20 现代摩比斯株式会社 Driver status induction installation and its method
CN105096528B (en) * 2015-08-05 2017-07-11 广州云从信息科技有限公司 A kind of method for detecting fatigue driving and system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000326757A (en) 1999-05-18 2000-11-28 Mitsubishi Electric Corp Dangerous approach preventing device
US6393361B1 (en) 1999-05-18 2002-05-21 Mitsubishi Denki Kabushiki Kaisha Dangerous approach prevention device for vehicle
US20090147993A1 (en) * 2007-07-06 2009-06-11 Harman Becker Automotive Systems Gmbh Head-tracking system
JP2012218592A (en) 2011-04-08 2012-11-12 Nissan Motor Co Ltd On-board equipment control device
JP2016057839A (en) * 2014-09-09 2016-04-21 株式会社デンソーアイティーラボラトリ Facial direction detection device and warning system for vehicle

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230182762A1 (en) * 2021-12-10 2023-06-15 Panasonic Intellectual Property Management Co., Ltd. Driver monitoring device, driver monitoring method, and non-transitory computer-readable medium
US12017676B2 (en) * 2021-12-10 2024-06-25 Panasonic Automotive Systems Co., Ltd. Driver monitoring device, driver monitoring method, and non-transitory computer-readable medium

Also Published As

Publication number Publication date
CN109760689A (en) 2019-05-17
JP6915503B2 (en) 2021-08-04
US20190138790A1 (en) 2019-05-09
JP2019087150A (en) 2019-06-06
DE102018127886A1 (en) 2019-05-09

Similar Documents

Publication Publication Date Title
US10803294B2 (en) Driver monitoring system
US20200082190A1 (en) Apparatus detecting driving incapability state of driver
KR102051142B1 (en) System for managing dangerous driving index for vehicle and method therof
US9731727B2 (en) Method and device for detecting the alertness of a vehicle driver
CN114512030B (en) Device for detecting driving incapacity state of driver
US8447072B2 (en) Inattention determining device
US7403124B2 (en) Driving support equipment for vehicles
KR101950476B1 (en) Driver state sensing system, driver state sensing method, and vehicle including thereof
EP3216667A1 (en) Control system for vehicle
JP6103138B2 (en) Driver support system
JP6683185B2 (en) Information processing device, driver monitoring system, information processing method, and information processing program
US20150258997A1 (en) Driver Attentiveness Detection Method and Device
JP6693489B2 (en) Information processing device, driver monitoring system, information processing method, and information processing program
CN102310771A (en) Motor vehicle safety control method and system based on driver face identification
JP2009294753A (en) Image processor and image processing method
WO2019155913A1 (en) Data processing device, monitoring system, alertness system, data processing method, data processing program, and storage medium
JP2018127084A (en) Automatic drive vehicle
WO2017199709A1 (en) Face orientation estimation device and face orientation estimation method
US20210370956A1 (en) Apparatus and method for determining state
US11881054B2 (en) Device and method for determining image data of the eyes, eye positions and/or a viewing direction of a vehicle user in a vehicle
KR102644324B1 (en) Autonomous driving apparatus and method
JP7276082B2 (en) Driver state estimation device
JP2018094294A (en) State estimation system
KR102480989B1 (en) Display apparatus for vehicle and operating method thereof
US20240262380A1 (en) Driver monitor, method, and computer program for monitoring driver

Legal Events

Date Code Title Description
AS Assignment

Owner name: TOYOTA JIDOSHA KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MATSUMURA, TAKESHI;SATO, JUN;HARA, KENICHIROH;AND OTHERS;REEL/FRAME:047276/0671

Effective date: 20181010

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4