CN114049677A - Vehicle ADAS control method and system based on emotion index of driver - Google Patents
Vehicle ADAS control method and system based on emotion index of driver
- Publication number
- CN114049677A (application CN202111478522.9A)
- Authority
- CN
- China
- Prior art keywords
- driver
- emotion
- emotion index
- threshold value
- intensity
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Life Sciences & Earth Sciences (AREA)
- Artificial Intelligence (AREA)
- Mathematical Physics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Evolutionary Computation (AREA)
- Computing Systems (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Computational Linguistics (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Software Systems (AREA)
- Molecular Biology (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Evolutionary Biology (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention discloses a vehicle ADAS control method and system based on a driver emotion index. A facial image of the driver is acquired in real time while driving, and time-series facial expression category information is extracted from it; the driver emotion index is calculated from this time-series expression category information; the emotion index is then compared with a preset emotion threshold, and the ADAS alarm threshold/intensity is adjusted according to the comparison result. Compared with the prior art, the invention calculates a driver emotion index that accurately quantifies the driver's current driving state from time-series facial expression data captured while driving, and dynamically adjusts the trigger conditions and intensity of ADAS alarms according to that index, thereby better ensuring the driver's comfort and safety while driving.
Description
Technical Field
The invention relates to the technical field of vehicle ADAS (Advanced Driver Assistance Systems), and in particular to a vehicle ADAS control method and system based on a driver emotion index.
Background
Negative driver emotion is one of the main causes of traffic accidents. At present, automobiles lack effective monitoring of, and reminders about, a driver's negative emotional state, and effectively and accurately identifying the driver's emotional state is therefore a key direction for the development of vehicle intelligence and human-machine interaction.
Current emotion recognition technology has mainly been developed in controlled laboratory environments, and its reliability in actual driving scenarios is low. Existing emotion recognition based on electroencephalogram (EEG) signals, electrocardiogram (ECG) signals and the like cannot be applied to driver emotion recognition because the signal acquisition equipment is cumbersome; facial images, by contrast, are easy to acquire, so research on recognizing drivers' facial expressions is growing.
On the other hand, ADAS is becoming indispensable in automobiles, integrating functions such as forward collision warning, lane departure warning, and automatic emergency braking. However, these functions usually alarm against fixed preset thresholds; for example, the forward collision warning function issues a prompt according to a preset remaining-collision-time threshold, triggering an alarm when the remaining collision time between the vehicle and a forward obstacle reaches that threshold. Such an ADAS cannot dynamically adjust alarm timing and intensity according to the driver's state: it may alarm excessively when the driver is in a good driving state, or fail to alarm in time when the driver is in a poor driving state, so the driver's safety cannot be effectively guaranteed and driving comfort may also suffer.
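As a point of reference for the fixed-threshold behaviour described above, the following minimal Python sketch shows a conventional forward collision warning that compares remaining collision time against a single preset threshold regardless of driver state; the function names and the threshold value are illustrative assumptions, not taken from the patent.

```python
TTC_THRESHOLD_S = 2.7  # fixed remaining-collision-time threshold (assumed value)

def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Remaining collision time to the forward obstacle; infinite if the gap is opening."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return gap_m / closing_speed_mps

def fixed_threshold_fcw(gap_m: float, closing_speed_mps: float) -> bool:
    """Alarm whenever the remaining collision time falls to or below the fixed
    threshold, irrespective of the driver's current driving state."""
    return time_to_collision(gap_m, closing_speed_mps) <= TTC_THRESHOLD_S
```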
Disclosure of Invention
The invention provides a vehicle ADAS control method and system based on a driver emotion index, to solve the technical problem that conventional vehicle ADAS alarms are either untimely or too frequent.
In order to solve the above technical problems, the technical solution provided by the invention is as follows:
a vehicle ADAS control method based on a driver emotion index comprises the following steps:
monitoring the driver's facial image in real time while driving, and extracting facial expression category information with time-series information from the facial image;
calculating the driver emotion index from the time-series facial expression category information; comparing the emotion index with a preset emotion threshold; and adjusting the ADAS alarm threshold/intensity according to the comparison result.
Preferably, the method for extracting facial expression category information with time series information from the facial image includes the following steps:
combining simulated driving and real-vehicle experiments to collect facial expression pictures of the driver, expanding the data set by data augmentation, and building a dedicated data set containing a large number of driver facial expressions;
selecting a convolutional neural network model, training and validating it on a mainstream facial expression data set to obtain the network parameters, and establishing an initial neural network model;
fine-tuning the initial neural network model on the established dedicated driver facial expression data set in a transfer learning (fine-tune) manner, retraining part of the convolutional layers and the output layer to obtain the transferred neural network model;
and using the transferred neural network model to recognize the driver's facial expressions while driving, obtaining the driver's facial expression categories in real time.
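A minimal PyTorch sketch of the fine-tune transfer step described above is given below. It assumes a ResNet-18 backbone with ImageNet weights standing in for the "initial neural network model", a hypothetical directory layout driver_expressions/train/<class_name>/<image>.jpg for the dedicated driver data set, and illustrative hyperparameters; none of these specifics come from the patent.

```python
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_CLASSES = 7  # sadness, anger, surprise, fear, happiness, disgust, neutrality

# Pretrained backbone standing in for the initial model trained on a mainstream data set.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze all parameters, then retrain only the last convolutional stage and a
# new output layer, matching the "retrain part of the convolutional layer and
# the output layer" step.
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)  # new output layer (trainable)
for p in model.layer4.parameters():
    p.requires_grad = True

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("driver_expressions/train", transform=transform)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

optimizer = torch.optim.Adam(
    [p for p in model.parameters() if p.requires_grad], lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):  # illustrative epoch count
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```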
Preferably, the emotion index of the driver is calculated according to the facial expression category information with time series information, and the emotion index is calculated according to the following formula:
In the formula, E_k is the driver's emotion index at time k; E_{k-1} is the driver's emotion index at time k-1; α is the coefficient applied to the emotion index at time k-1; β is the expression transition coefficient from time k-1 to time k, which depends on the driver's expression categories at the previous and current moments; and ε is a fixed value set according to the driver's expression category at the current time k.
Preferably, the emotion threshold includes a first emotion threshold and a second emotion threshold, and specifically includes the following steps:
when the emotion index exceeds the first emotion threshold and does not exceed a second emotion threshold, judging that the driver is not suitable for driving at present, and adjusting the alarm threshold/intensity of the ADAS to a first-level alarm threshold/intensity;
and when the emotion index exceeds the second emotion threshold, judging that the driver is extremely unsuitable for driving at present, and adjusting the alarm threshold/intensity of the ADAS to a secondary alarm threshold/intensity.
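A minimal sketch of this two-threshold comparison, assuming the emotion index is normalized to [0, 1]; the threshold values and level names are illustrative assumptions rather than values from the patent.

```python
FIRST_EMOTION_THRESHOLD = 0.6    # assumed value
SECOND_EMOTION_THRESHOLD = 0.85  # assumed value

def adas_alarm_level(emotion_index: float) -> str:
    """Map the driver emotion index to an ADAS alarm threshold/intensity level."""
    if emotion_index > SECOND_EMOTION_THRESHOLD:
        return "secondary"   # driver extremely unsuitable for driving
    if emotion_index > FIRST_EMOTION_THRESHOLD:
        return "primary"     # driver not suitable for driving
    return "normal"          # driver suitable for driving
```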
Preferably, the alarm threshold/intensity includes one or any combination of: a collision distance threshold, a remaining collision time threshold, a yaw angle threshold, and an alarm intensity.
Preferably, the acquisition of the face image of the driver in the real-time driving process is realized by a camera which is arranged above a front windshield of the vehicle and close to the driver side.
A computer system comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the method when executing the computer program.
The invention has the following beneficial effects:
1. The vehicle ADAS control method and system based on a driver emotion index acquire the driver's facial image in real time while driving and extract time-series facial expression category information; calculate the driver emotion index from this information; compare the emotion index with a preset emotion threshold; and adjust the ADAS alarm threshold/intensity according to the comparison result. Compared with the prior art, the invention calculates a driver emotion index that accurately quantifies the driver's current driving state from time-series facial expression category information, and dynamically adjusts the ADAS alarm trigger conditions and intensity according to that index, thereby better ensuring the driver's comfort and safety while driving.
2. In a preferred scheme, the method effectively recognizes the driver's facial expressions based on deep neural networks and transfer learning.
In addition to the objects, features and advantages described above, the present invention has other objects, features and advantages. The present invention will be described in further detail below with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a flow chart of building the transfer-learning-based driver facial expression recognition neural network model in a preferred embodiment of the present invention;
FIG. 2 is a flow chart of a method of vehicle ADAS control based on a driver emotion index in a preferred embodiment of the present invention;
fig. 3 is a schematic diagram of a vehicle ADAS control system based on a driver emotion index in a preferred embodiment of the present invention.
Detailed Description
The embodiments of the invention will be described in detail below with reference to the drawings, but the invention can be implemented in many different ways as defined and covered by the claims.
Embodiment one:
This embodiment discloses a vehicle ADAS control method based on a driver emotion index, comprising the following steps:
monitoring the driver's facial image in real time while driving, and extracting facial expression category information with time-series information from the facial image;
calculating the driver emotion index from the time-series facial expression category information; comparing the emotion index with a preset emotion threshold; and adjusting the ADAS alarm threshold/intensity according to the comparison result.
This embodiment also discloses a computer system including a memory, a processor, and a computer program stored on the memory and executable on the processor; when the processor executes the computer program, the steps of the above method are implemented.
The vehicle ADAS control method and system based on the driver emotion index acquire the driver's facial image in real time while driving and extract time-series facial expression category information; calculate the driver emotion index from this information; compare the emotion index with a preset emotion threshold; and adjust the ADAS alarm threshold/intensity according to the comparison result. Compared with the prior art, the invention calculates a driver emotion index that accurately quantifies the driver's current driving state from time-series facial expression category information, and dynamically adjusts the ADAS alarm trigger conditions and intensity according to that index, thereby better ensuring the driver's comfort and safety while driving.
Embodiment two:
Embodiment two is a preferred version of embodiment one; it differs in that the specific steps of the vehicle ADAS control method based on the driver emotion index are refined and the specific structure of the vehicle ADAS control system based on the driver emotion index is detailed:
As shown in FIG. 1, the present embodiment discloses a vehicle ADAS control method based on a driver emotion index, which includes the following steps:
1) Collect a dedicated driver facial expression data set in a non-laboratory-controlled environment: collect facial expression images of drivers under different emotions through simulated driving, real-vehicle experiments and the like, label them, and obtain, after data augmentation, a facial expression data set containing driver driving-scene pictures;
2) Select a deep neural network model, train and validate it on a mainstream facial expression data set to obtain the model parameters, and establish an initial neural network model;
3) Freeze the network parameters of the initial neural network model except for the output layer and part of the convolutional layers, perform transfer training on the dedicated driver emotional expression data set, and adjust those parameters to obtain a neural network model fine-tuned for driver emotional expressions;
4) Recognize the driver's facial expressions in real time while driving, and calculate the driver emotion index from the time-series facial expression category information.
In the present embodiment, the facial expression categories include: sadness, anger, surprise, fear, happiness, disgust, neutrality.
Wherein, the calculation formula of the emotion index is as follows:
In the formula, E_k is the driver's emotion index at time k; E_{k-1} is the driver's emotion index at time k-1; α is the coefficient applied to the emotion index at time k-1; β is the expression transition coefficient from time k-1 to time k, which depends on the driver's expression categories at the previous and current moments; and ε is a fixed value set according to the driver's expression category at the current time k.
Here, α and β are set empirically: the smaller α is, the more the previous emotion index is attenuated and the smaller its proportion; the larger α is, the less it is attenuated and the larger its proportion. β depends on the expression categories at the previous and current moments, as illustrated in Table 1: with the previous expression fixed, the more positive the current expression, the smaller β, and the more negative the current expression, the larger β; with the current expression fixed, the more positive the previous expression, the smaller β, and the more negative the previous expression, the larger β. ε is set so that the more negative the current expression, the larger ε, and the more positive the current expression, the smaller ε.
In this embodiment, the values of β are as follows:
Table 1: Example values of β
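The sketch below assumes the additive update E_k = α·E_{k-1} + β + ε suggested by the variable definitions above; the α value, the β lookup built from a simple negativity score, and the ε scaling are all illustrative assumptions rather than the patent's actual Table 1 values.

```python
ALPHA = 0.8  # weight of the previous emotion index (assumed value)

# Simple negativity score per expression category: more negative expressions
# contribute larger beta and epsilon values, per the qualitative rules above.
NEGATIVITY = {
    "happiness": -1.0, "neutrality": 0.0, "surprise": 0.2,
    "sadness": 0.6, "fear": 0.8, "disgust": 0.8, "anger": 1.0,
}

def beta(prev_expr: str, curr_expr: str) -> float:
    """Expression transition coefficient: grows as either the previous or the
    current expression becomes more negative (illustrative stand-in for Table 1)."""
    return 0.05 * (NEGATIVITY[prev_expr] + NEGATIVITY[curr_expr])

def epsilon(curr_expr: str) -> float:
    """Fixed offset for the current expression: larger for more negative expressions."""
    return 0.1 * NEGATIVITY[curr_expr]

def update_emotion_index(e_prev: float, prev_expr: str, curr_expr: str) -> float:
    """One step of the assumed update E_k = ALPHA * E_{k-1} + beta + epsilon,
    clamped to [0, 1] (the clamping is an added assumption)."""
    e_k = ALPHA * e_prev + beta(prev_expr, curr_expr) + epsilon(curr_expr)
    return min(max(e_k, 0.0), 1.0)
```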
5) Adjust the ADAS alarm thresholds and intensity according to the driver emotion index: when the emotion index indicates the driver is suitable for driving, the ADAS operates with the normal collision distance threshold, normal remaining collision time threshold, normal yaw angle threshold, and normal alarm intensity; when the emotion index indicates the driver is not suitable for driving, the ADAS operates with the primary collision distance threshold, primary remaining collision time threshold, primary yaw angle threshold, and primary alarm intensity; when the emotion index indicates the driver is extremely unsuitable for driving, the ADAS operates with the secondary collision distance threshold, secondary remaining collision time threshold, secondary yaw angle threshold, and secondary alarm intensity.
In this embodiment, the primary collision distance threshold, primary remaining collision time threshold, primary yaw angle threshold, and primary alarm intensity are collectively referred to as the first-level alarm threshold/intensity; the secondary collision distance threshold, secondary remaining collision time threshold, secondary yaw angle threshold, and secondary alarm intensity are collectively referred to as the second-level alarm threshold/intensity. The first emotion threshold is smaller than the second emotion threshold, and the specific values depend on how the emotion index is defined; the first-level alarm threshold is larger than the second-level alarm threshold, the first-level alarm intensity is weaker than the second-level alarm intensity, and the second-level alarm may add further alarm modes such as lights and active seat-belt pre-tensioning.
Further, in the present embodiment, as shown in fig. 2 and 3, there is disclosed a vehicle ADAS control system based on a driver emotion index, including:
1) An image acquisition module: comprising one or more cameras mounted at suitable positions above the vehicle's front windshield, close to the driver's side;
2) An image processing module: using the established neural network model, it acquires the driver's facial expression image from the image acquisition module, recognizes the facial expression, and calculates the driver emotion index; the emotion index at time k is calculated as in formula (1) above.
3) An ADAS module: comprising a forward collision warning module and a lane departure warning module, in which are configured a normal collision distance threshold, normal remaining collision time threshold, normal yaw angle threshold and normal alarm intensity; a primary collision distance threshold, primary remaining collision time threshold, primary yaw angle threshold and primary alarm intensity; and a secondary collision distance threshold, secondary remaining collision time threshold, secondary yaw angle threshold and secondary alarm intensity. The ADAS module receives the driver emotion index from the image processing module. When the driver is judged suitable for driving, the forward collision warning module activates the normal collision distance and remaining collision time thresholds, the lane departure warning module activates the normal yaw angle threshold, and the ADAS module uses the normal alarm intensity; when the driver is judged not suitable for driving, the forward collision warning module activates the primary collision distance and remaining collision time thresholds, the lane departure warning module activates the primary yaw angle threshold, and the ADAS module uses the primary alarm intensity; when the driver is judged extremely unsuitable for driving, the forward collision warning module activates the secondary collision distance and remaining collision time thresholds, the lane departure warning module activates the secondary yaw angle threshold, and the ADAS module uses the secondary alarm intensity. The normal collision distance threshold is greater than the primary collision distance threshold, which is greater than the secondary collision distance threshold; the normal remaining collision time threshold is greater than the primary remaining collision time threshold, which is greater than the secondary remaining collision time threshold; the normal yaw angle threshold is greater than the primary yaw angle threshold, which is greater than the secondary yaw angle threshold; the normal alarm intensity is weaker than the primary alarm intensity, which is weaker than the secondary alarm intensity. When any monitored parameter is detected to be less than or equal to its threshold, the corresponding alarm mechanism is activated.
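A compact sketch of this level-dependent configuration: three threshold sets ordered as stated above (normal greater than primary greater than secondary for the distance, time and yaw thresholds, with alarm intensity increasing instead), and an alarm raised once any monitored parameter falls to or below its active threshold. All numeric values are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AdasThresholds:
    collision_distance_m: float
    remaining_collision_time_s: float
    yaw_angle_deg: float
    alarm_intensity: int  # higher means a stronger alert (e.g. adds lights, belt pre-tensioning)

THRESHOLD_SETS = {
    "normal":    AdasThresholds(30.0, 3.5, 6.0, alarm_intensity=1),
    "primary":   AdasThresholds(25.0, 3.0, 5.0, alarm_intensity=2),
    "secondary": AdasThresholds(20.0, 2.5, 4.0, alarm_intensity=3),
}

def check_alarms(level: str, distance_m: float, ttc_s: float, yaw_deg: float):
    """Return (function, intensity) pairs for every ADAS function whose parameter
    has fallen to or below its threshold under the currently active level."""
    t = THRESHOLD_SETS[level]
    fired = []
    if distance_m <= t.collision_distance_m or ttc_s <= t.remaining_collision_time_s:
        fired.append(("forward_collision_warning", t.alarm_intensity))
    if yaw_deg <= t.yaw_angle_deg:
        fired.append(("lane_departure_warning", t.alarm_intensity))
    return fired
```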
In summary, the vehicle ADAS control method and system based on the driver emotion index collect driver facial images in simulated driving or real scenes in a non-laboratory-controlled environment and use transfer learning to improve the accuracy of driver facial expression recognition. A method for calculating the driver emotion index is also provided, combining time-series driver facial expression category information so that facial expressions characterize the driving state. The invention further provides an adaptive advanced driver assistance system and method based on the driver emotion index, which dynamically adjusts the ADAS alarm thresholds and intensity according to that index. The invention therefore not only improves the accuracy of driver facial expression recognition and obtains an emotion index reflecting the driving state, but also avoids ADAS false alarms or untimely alarms and improves the driver's comfort and safety while driving.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (7)
1. A vehicle ADAS control method based on a driver emotion index is characterized by comprising the following steps:
in the real-time driving process of a driver, monitoring a facial image of the driver in real time, and extracting facial expression category information with time series information from the facial image;
calculating the emotion index of the driver according to the facial expression category information with the time series information; and comparing the emotion index with a preset emotion threshold value, and adjusting the alarm threshold value/intensity of ADAS according to the comparison result.
2. The driver emotion index-based vehicle ADAS control method as claimed in claim 1, wherein extracting facial expression category information with time series information from the facial image comprises the steps of:
combining simulated driving and real-vehicle experiments to collect facial expression pictures of the driver, expanding the data set by data augmentation, and building a dedicated data set containing a large number of driver facial expressions;
selecting a convolutional neural network model, training and verifying on a mainstream face data set to obtain neural network parameters, and establishing an initial neural network model;
performing migration training on the initial neural network model on the established special data set for the facial expression of the driver by adopting a Fine-tune migration learning mode, and retraining part of the convolutional layer and the output layer again to obtain a neural network model after migration learning;
and identifying the facial expression of the driver in the real-time driving process by using the neural network model after the transfer learning to obtain the facial expression category of the driver in the real-time driving process.
3. The method of claim 1, wherein the emotion index of the driver is calculated according to the facial expression category information with time series information, and is implemented by the following formula:
In the formula, E_k is the driver's emotion index at time k; E_{k-1} is the driver's emotion index at time k-1; α is the coefficient applied to the emotion index at time k-1; β is the expression transition coefficient from time k-1 to time k, which depends on the driver's expression categories at the previous and current moments; and ε is a fixed value set according to the driver's expression category at the current time k.
4. The vehicle ADAS control method according to claim 2, wherein the emotion threshold includes a first emotion threshold and a second emotion threshold, and specifically includes the following steps:
when the emotion index exceeds the first emotion threshold and does not exceed a second emotion threshold, judging that the driver is not suitable for driving at present, and adjusting the alarm threshold/intensity of the ADAS to a first-level alarm threshold/intensity;
and when the emotion index exceeds the second emotion threshold, judging that the driver is extremely unsuitable for driving at present, and adjusting the alarm threshold/intensity of the ADAS to a secondary alarm threshold/intensity.
5. The driver emotion index based vehicle ADAS control method as claimed in claim 1, wherein the alarm threshold/intensity includes: one or a combination of any several of a collision distance threshold value, a residual collision time threshold value, a yaw angle threshold value and alarm intensity.
6. The vehicle ADAS control method according to claim 1, wherein the face image of the driver during real-time driving is captured by a camera installed above the front windshield of the vehicle and close to the driver's side.
7. A computer system comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the steps of the method of any one of claims 1 to 6 are performed when the computer program is executed by the processor.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111478522.9A CN114049677B (en) | 2021-12-06 | 2021-12-06 | Vehicle ADAS control method and system based on driver emotion index |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111478522.9A CN114049677B (en) | 2021-12-06 | 2021-12-06 | Vehicle ADAS control method and system based on driver emotion index |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114049677A true CN114049677A (en) | 2022-02-15 |
CN114049677B CN114049677B (en) | 2023-08-25 |
Family
ID=80212240
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111478522.9A Active CN114049677B (en) | 2021-12-06 | 2021-12-06 | Vehicle ADAS control method and system based on driver emotion index |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114049677B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114610433A (en) * | 2022-03-23 | 2022-06-10 | 中国第一汽车股份有限公司 | Vehicle instrument parameterization dynamic display method and system |
CN115376115A (en) * | 2022-10-25 | 2022-11-22 | 中国第一汽车股份有限公司 | Reckless driving behavior marking method, vehicle, cloud server and storage medium |
WO2023236434A1 (en) * | 2022-06-07 | 2023-12-14 | 公安部第三研究所 | Safe driving early warning system based on driver emotion intervention |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016149063A (en) * | 2015-02-13 | 2016-08-18 | オムロン株式会社 | Emotion estimation system and emotion estimation method |
CN106627589A (en) * | 2016-12-27 | 2017-05-10 | 科世达(上海)管理有限公司 | Vehicle driving safety auxiliary method and system and vehicle |
CN106803423A (en) * | 2016-12-27 | 2017-06-06 | 智车优行科技(北京)有限公司 | Man-machine interaction sound control method, device and vehicle based on user emotion state |
CN107292778A (en) * | 2017-05-19 | 2017-10-24 | 华中师范大学 | A kind of cloud classroom learning evaluation method and its device based on cognitive emotion perception |
CN110472511A (en) * | 2019-07-19 | 2019-11-19 | 河海大学 | A kind of driver status monitoring device based on computer vision |
CN110619301A (en) * | 2019-09-13 | 2019-12-27 | 道和安邦(天津)安防科技有限公司 | Emotion automatic identification method based on bimodal signals |
CN112380395A (en) * | 2020-10-30 | 2021-02-19 | 华南理工大学 | Method and system for obtaining emotion of graph convolution network based on double-flow architecture and storage medium |
CN113715833A (en) * | 2021-09-09 | 2021-11-30 | 重庆金康赛力斯新能源汽车设计院有限公司 | Road rage preventing method, device and system |
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016149063A (en) * | 2015-02-13 | 2016-08-18 | オムロン株式会社 | Emotion estimation system and emotion estimation method |
CN106627589A (en) * | 2016-12-27 | 2017-05-10 | 科世达(上海)管理有限公司 | Vehicle driving safety auxiliary method and system and vehicle |
CN106803423A (en) * | 2016-12-27 | 2017-06-06 | 智车优行科技(北京)有限公司 | Man-machine interaction sound control method, device and vehicle based on user emotion state |
CN107292778A (en) * | 2017-05-19 | 2017-10-24 | 华中师范大学 | A kind of cloud classroom learning evaluation method and its device based on cognitive emotion perception |
CN110472511A (en) * | 2019-07-19 | 2019-11-19 | 河海大学 | A kind of driver status monitoring device based on computer vision |
CN110619301A (en) * | 2019-09-13 | 2019-12-27 | 道和安邦(天津)安防科技有限公司 | Emotion automatic identification method based on bimodal signals |
CN112380395A (en) * | 2020-10-30 | 2021-02-19 | 华南理工大学 | Method and system for obtaining emotion of graph convolution network based on double-flow architecture and storage medium |
CN113715833A (en) * | 2021-09-09 | 2021-11-30 | 重庆金康赛力斯新能源汽车设计院有限公司 | Road rage preventing method, device and system |
Non-Patent Citations (4)
Title |
---|
MAIK AURICHT, RAINER STARK: "How to consider emotional reactions of the driver within the development of Advanced Driver Assistance Systems (ADAS)?", 24th CIRP Design Conference, pages 70-74 *
MIRA JEONG, BYOUNG CHUL KO: "Driver's Facial Expression Recognition in Real-Time for Safe Driving", Sensors, pages 1-17 *
YU Zhuoping et al.: "Adaptive collision warning system based on driver face detection", Internal Combustion Engine & Parts (《内燃机与配件》), pages 198-201 *
WANG Yang: "Research on facial expression recognition methods based on dynamic sequences", China Master's Theses Full-text Database, Information Science and Technology (《中国优秀硕士学位论文全文数据库 信息科技辑》), pages 138-289 *
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114610433A (en) * | 2022-03-23 | 2022-06-10 | 中国第一汽车股份有限公司 | Vehicle instrument parameterization dynamic display method and system |
WO2023236434A1 (en) * | 2022-06-07 | 2023-12-14 | 公安部第三研究所 | Safe driving early warning system based on driver emotion intervention |
CN115376115A (en) * | 2022-10-25 | 2022-11-22 | 中国第一汽车股份有限公司 | Reckless driving behavior marking method, vehicle, cloud server and storage medium |
WO2024087887A1 (en) * | 2022-10-25 | 2024-05-02 | 中国第一汽车股份有限公司 | Reckless driving behavior marking method, vehicle, cloud server, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN114049677B (en) | 2023-08-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114049677B (en) | Vehicle ADAS control method and system based on driver emotion index | |
US10049284B2 (en) | Vision-based rain detection using deep learning | |
CN111434553B (en) | Brake system, method and device, and fatigue driving model training method and device | |
CN105769120A (en) | Fatigue driving detection method and device | |
WO2021017341A1 (en) | Method and apparatus for recognizing driving state of intelligent driving device, and device | |
CN104573622A (en) | Face detection apparatus, face detection method, and program | |
CN116001795A (en) | Channel change risk assessment and personalized channel change decision method | |
JP7154959B2 (en) | Apparatus and method for recognizing driver's state based on driving situation judgment information | |
JP2022539697A (en) | On-vehicle active learning method and apparatus for learning the perception network of an autonomous driving vehicle | |
CN115937830A (en) | Special vehicle-oriented driver fatigue detection method | |
CN112052829A (en) | Pilot behavior monitoring method based on deep learning | |
CN117734705A (en) | Driving assistance method, driving assistance system, electronic device and storage medium | |
JP7443283B2 (en) | Wakefulness estimation method, wakefulness estimation device, and wakefulness estimation program | |
CN111186436B (en) | Vehicle component control method and device and vehicle | |
CN113370991A (en) | Driving assistance method, device, equipment, storage medium and computer program product | |
CN113085888A (en) | Intelligent networked automobile driving-assisting safety information detection system | |
CN118269999B (en) | Multi-mode interaction new energy automobile control system | |
Hashimoto et al. | Quantification of Contributing Degree to Braking Operation of Driver based on the Hidden Markov Model | |
JP7351889B2 (en) | Vehicle interior monitoring/situation understanding sensing method and its system | |
KR102594384B1 (en) | Image recognition learning apparatus of autonomous vehicle using error data insertion and image recognition learning method using the same | |
JP7462113B2 (en) | Information processing device, information processing method, and computer program | |
JP7230710B2 (en) | Image processing device, monitoring device, control system, image processing method, and program | |
CN111891132B (en) | Acceleration and deceleration-based service processing method, device, equipment and storage medium | |
CN113460074B (en) | Automatic driving manual takeover request time adjustment method and system | |
CN115691047A (en) | Anti-fatigue reminding device based on image acquisition |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |