
CN116611605A - Evaluation system and method for intelligent cabin man-machine interface and storage medium - Google Patents


Info

Publication number
CN116611605A
Authority
CN
China
Prior art keywords
data
analysis module
driver
module
evaluation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211531796.4A
Other languages
Chinese (zh)
Inventor
Name withheld at the applicant's request
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kingfar International Inc
Original Assignee
Kingfar International Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kingfar International Inc filed Critical Kingfar International Inc
Priority to CN202211531796.4A
Publication of CN116611605A
Legal status: Pending

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/023Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024Detecting, measuring or recording pulse rate or heart rate
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/318Heart-related electrical modalities, e.g. electrocardiography [ECG]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/369Electroencephalography [EEG]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/369Electroencephalography [EEG]
    • A61B5/372Analysis of electroencephalograms
    • A61B5/374Detecting the frequency distribution of signals, e.g. detecting delta, theta, alpha, beta or gamma waves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/40Business processes related to the transportation industry
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/59Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174Facial expression recognition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00Evaluating a particular growth phase or type of persons or animals
    • A61B2503/20Workers
    • A61B2503/22Motor vehicles operators, e.g. drivers, pilots, captains

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Medical Informatics (AREA)
  • Theoretical Computer Science (AREA)
  • Public Health (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Pathology (AREA)
  • Human Resources & Organizations (AREA)
  • Multimedia (AREA)
  • Cardiology (AREA)
  • Economics (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Evolutionary Computation (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Educational Administration (AREA)
  • Development Economics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Psychiatry (AREA)
  • Psychology (AREA)
  • Physiology (AREA)
  • Tourism & Hospitality (AREA)
  • Marketing (AREA)
  • Quality & Reliability (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)

Abstract

The application provides an evaluation system, an evaluation method and a storage medium for an intelligent cabin human-computer interface. The system comprises: a data acquisition module for acquiring vehicle evaluation data and driver evaluation data; an interaction behavior analysis module for determining the driver's interaction behavior from the driver evaluation data acquired by the data acquisition module; a delay analysis module for determining the driver's fatigue level based on a fluctuation time difference; a track analysis module for comparing the vehicle's running track with a preset track and outputting track comparison data; a vehicle running analysis module for determining the vehicle's running stability grade from the yaw rate, centroid sideslip angle and lateral acceleration in the vehicle evaluation data acquired by the data acquisition module; a subjective evaluation module for sending a preset questionnaire scale to the driving end and receiving the questionnaire scale fed back by the driving end; and a report generation module for outputting the received data as an analysis report according to a report generation instruction.

Description

Evaluation system and method for intelligent cabin man-machine interface and storage medium
Technical Field
The application relates to the technical field of vehicle evaluation, in particular to an evaluation system, an evaluation method and a storage medium of an intelligent cabin human-computer interface.
Background
In view of the demand for sustainable development, governments worldwide continue to support the manufacture of lower-carbon, environmentally friendly and safe automobiles. Because it is difficult for traditional automobiles to achieve the required breakthroughs through mechanical structure alone, electronic and information technologies have been developed to control vehicles more precisely, reducing fuel consumption, improving safety, and providing more comfortable and intelligent human-computer interaction.
Further, research shows that human factors are the dominant influence on driving behavior and driving safety. A driver dynamically controls the vehicle in real time through perception, judgment and decision-making, and driving behavior is complex, ambiguous, nonlinear and time-varying, which makes the driver the most complex part of the human-vehicle-road-environment system.
The HMI of intelligent equipment is evolving rapidly: driven by 5G, artificial intelligence, machine learning and high-performance chips in high-end equipment and Internet services, intelligent equipment has changed from an initially mechanical device into a highly complex information-integration carrier that exchanges, processes, analyzes and learns from information, encompassing mobile-terminal HMIs, Internet HMIs, complex high-end intelligent HMIs and the like.
As the main medium of information exchange between people and the system, the intelligent-equipment HMI is an important focus and carrier for the development of intelligence, networking and automation.
The importance of evaluating intelligent-equipment HMIs is increasingly evident, and the intelligent cabin HMI, as a hot spot of automotive intelligent technology, has become an important research and development field for automobile manufacturers. Function and performance tests of cabin systems are common in industry and corresponding test standards have been formulated, but cabin-system testing is still not systematic, scientific or standardized, and its test dimensions and scope remain relatively limited.
Existing evaluation methods are mainly subjective; subjective evaluation is easily affected by external factors, so the results are not objective. Existing objective evaluation methods are independent of the subjective methods, and the two cannot be combined.
In summary, society has placed higher demands on the standardization and scientific rigor of the ergonomic evaluation of intelligent cabins, and comprehensive methods and platforms for human-factors research on human-vehicle-road systems are urgently needed. This application therefore focuses on an evaluation system for the human-computer interaction interface of an intelligent cabin.
Disclosure of Invention
In view of this, embodiments of the present application provide an assessment system for an intelligent cockpit human-machine interface that obviates or mitigates one or more of the disadvantages of the prior art.
One aspect of the present application provides an assessment system for an intelligent cockpit human-machine interface, the system comprising:
the data acquisition module is based on the installed recording equipment and is used for acquiring vehicle evaluation data and driver evaluation data;
the interaction behavior analysis module is used for determining the interaction behavior of the driver according to the driver evaluation data acquired by the data acquisition module, wherein the interaction behavior comprises limb behaviors and psychological activities, and the driver evaluation data comprises video data, eye movement data, electroencephalogram (EEG) data and/or electrocardiogram (ECG) data;
the delay analysis module is used for calculating fluctuation time difference of the evaluation data of the driver based on the same trigger event and determining the fatigue level of the driver based on the fluctuation time difference;
the track analysis module is used for comparing the vehicle running track of the vehicle evaluation data acquired by the data acquisition module with a preset track and outputting track comparison data;
the vehicle running analysis module is used for determining the running stability grade of the vehicle according to the yaw rate, centroid sideslip angle and lateral acceleration in the vehicle evaluation data acquired by the data acquisition module;
the subjective evaluation module is used for sending a preset questionnaire scale to the driving end and receiving the questionnaire scale fed back by the driving end;
In some embodiments of the present application, the questionnaire scale includes questions such as a driving comfort score, a maneuverability score and a vehicle performance score. The driving end may be a computer installed in the vehicle on which the driver operates; after completion, the questionnaire scale is uploaded to reflect the driver's subjective experience.
The report generation module is used for receiving the data output by the data acquisition module, the track analysis module, the interaction behavior analysis module, the vehicle running analysis module, the delay analysis module and the subjective evaluation module, and outputting the received data as an analysis report according to a report generation instruction.
With this scheme, the system can collect data throughout the experiment, including eye movement, heart rate, galvanic skin response, skin temperature, respiration, EEG, near-infrared, motion capture, behavior, interaction records and facial expressions. All channels are recorded synchronously on the same platform and written directly to the backend. In the experiment playback stage the data are analyzed: the system is compatible with multiple analysis modules for subjective and objective data, each containing diversified standard analysis algorithms, and the analyzed data can be exported with one key according to the operator's report generation instruction. On the one hand, this ensures that cabin-system evaluation is systematic, scientific and standardized; on the other hand, because the exported report draws on the subjective evaluation module together with the data acquisition, track analysis, interaction behavior analysis, vehicle running analysis and delay analysis modules, subjective and objective evaluation are combined and the subjective evaluation is protected from external influences.
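The vehicle running analysis module above derives a running stability grade from the yaw rate, centroid sideslip angle and lateral acceleration. The patent does not give concrete thresholds or grade labels, so the sketch below uses wholly hypothetical values to illustrate one possible threshold-count grading:

```python
# Hypothetical sketch of the vehicle running analysis module: grade driving
# stability from yaw rate (deg/s), centroid sideslip angle (deg) and lateral
# acceleration (m/s^2). All thresholds and grade names are illustrative
# assumptions, not taken from the patent.

def stability_grade(yaw_rate: float, sideslip_angle: float, lateral_accel: float) -> str:
    """Return a coarse stability grade by counting exceeded thresholds."""
    score = 0
    if abs(yaw_rate) > 10.0:        # pronounced yawing motion
        score += 1
    if abs(sideslip_angle) > 4.0:   # large sideslip suggests loss of grip
        score += 1
    if abs(lateral_accel) > 3.0:    # strong lateral load on occupants
        score += 1
    return ["stable", "moderate", "unstable", "critical"][score]
```

A production implementation would normally combine the three signals in a vehicle-dynamics model rather than count independent thresholds; the sketch only shows the data flow.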
In some embodiments of the present application, the interaction behavior analysis module includes:
the action behavior analysis module is used for calculating the number and frequency of each action behavior according to the recorded action behaviors of the driver;
the eye movement behavior analysis module is used for determining the gazing behaviors of the driver in each interest area according to the eye movement data of the driver based on the interest areas which are divided into the in-vehicle space in advance, and determining the gazing time of each gazing behavior;
the electroencephalogram behavior analysis module is used for determining the current conscious behavior of the driver based on the five EEG frequency bands (delta, theta, alpha, beta and gamma) in the EEG data;
and the electrocardio behavior analysis module is used for determining the fluctuation range of the heart rate within a preset time period according to the collected heart rate of the driver, and determining whether the driver is tense based on the fluctuation range of the heart rate data within that period.
The mental activities include tension.
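The electrocardio behavior analysis described above amounts to comparing the heart-rate fluctuation range within a window against a threshold. A minimal sketch; the 20 bpm threshold is an assumption, not a value from the patent:

```python
# Hypothetical sketch of the electrocardio behavior analysis module.
# `samples` is a list of heart-rate readings (bpm) within the preset window.

def heart_rate_range(samples: list) -> float:
    """Fluctuation range of the heart rate within the window."""
    return max(samples) - min(samples)

def is_tense(samples: list, threshold_bpm: float = 20.0) -> bool:
    """Flag tension when the fluctuation range exceeds the (assumed) threshold."""
    return heart_rate_range(samples) > threshold_bpm
```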
In some embodiments of the application, the interaction behavior analysis module comprises an action gesture analysis module,
the action gesture analysis module being used for determining the driver's current action health grade from the angles of the driver's joints acquired by the data acquisition module.
In some embodiments of the present application, the interactive behavior analysis module comprises an expression action analysis module,
the expression action analysis module being used for recognizing the driver's expression with a preset neural network model, based on the facial regions of the driver in the video data acquired by the data acquisition module.
In some embodiments of the present application, in the step of calculating the number and frequency of each action behavior from the recorded action behaviors of the driver, the action behavior analysis module calculates them from the driver's action behaviors recorded in the video data.
In some embodiments of the present application, in the same step, the action behavior analysis module obtains the operation records of the vehicle's own devices, determines the driver's action behavior from the operated device, and determines the number and frequency of action behaviors from the number and frequency with which the devices were operated.
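Counting the number and frequency of action behaviors from device operation records can be sketched as follows; the log format (timestamp, device name) is a hypothetical assumption:

```python
# Hypothetical sketch of the action behavior analysis module's counting step.
from collections import Counter

def action_statistics(operation_log: list, duration_s: float) -> dict:
    """operation_log: list of (timestamp_s, device_name) tuples from the
    vehicle's own devices. Returns per-device count and frequency
    (operations per minute) over the recording duration."""
    counts = Counter(device for _, device in operation_log)
    return {dev: {"count": n, "per_minute": n * 60.0 / duration_s}
            for dev, n in counts.items()}
```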
In some embodiments of the application, in the step of calculating the fluctuation time difference of the driver evaluation data based on the same trigger event,
the trigger event comprises an obstacle event: the delay analysis module obtains from the data acquisition module the time at which an obstacle appears in front of the vehicle, takes the time at which the driver evaluation data first fluctuates after that moment as the reaction time, takes the difference between the obstacle time and the reaction time as the fluctuation time difference, and determines the driver's fatigue level based on that difference.
In some embodiments of the present application, in the step of taking, as the reaction time, the time at which the driver evaluation data first fluctuates after the obstacle appears, the delay analysis module:
takes the time of the first pupil dilation after the obstacle appears as the time of the first fluctuation of the eye movement data;
and takes the time of the first peak in the EEG and/or ECG data after the obstacle appears as the time of the first fluctuation of the EEG or ECG data.
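The fluctuation time difference and fatigue grading might look like the sketch below for the eye-movement channel. The pupil-dilation rise threshold and the fatigue cut-offs are assumptions; the patent specifies neither:

```python
# Hypothetical sketch of the delay analysis module for pupil data.
# `samples` is a list of (timestamp_s, pupil_diameter_mm) pairs.

def reaction_time(obstacle_t: float, samples: list, baseline: float,
                  rise: float = 0.15):
    """Return the fluctuation time difference: time from the obstacle event to
    the first sample whose pupil diameter exceeds baseline by `rise` mm."""
    for t, diameter in samples:
        if t >= obstacle_t and diameter - baseline > rise:
            return t - obstacle_t
    return None  # no fluctuation detected in the recording

def fatigue_level(dt):
    """Map the fluctuation time difference (s) to a coarse fatigue level."""
    if dt is None or dt > 1.5:
        return "high"
    return "low" if dt < 0.8 else "medium"
```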
In some embodiments of the present application, the report generation module outputs the received data as the analysis report according to a report generation instruction, the instruction designating either all of the data from the data acquisition module, the track analysis module, the interaction behavior analysis module, the vehicle running analysis module, the delay analysis module and the subjective evaluation module, or a designated part of that data.
In some embodiments of the present application, if the report generation instruction designates part of the data from the data acquisition module, the track analysis module, the interaction behavior analysis module, the vehicle running analysis module and the delay analysis module, the instruction takes one of two modes: an event statistics report instruction or a segment statistics report instruction:
if the report generation instruction is an event statistics report instruction, the data before or after the occurrence of the trigger event are counted based on that trigger event and presented as an analysis report;
and if the report generation instruction is a segment statistics report instruction, the data before or after a given frame of the video data are counted based on that frame and presented as an analysis report.
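Both instruction modes reduce to slicing time-stamped data around an anchor moment, either a trigger-event time or a video frame converted to a time. A sketch with a hypothetical sample format:

```python
# Hypothetical sketch of the report generation module's slicing step.
# `samples` is a list of (timestamp_s, value) pairs from any analysis channel.

def slice_around(samples: list, t0: float,
                 before_s: float = 0.0, after_s: float = 5.0) -> list:
    """Keep samples whose timestamp falls in [t0 - before_s, t0 + after_s],
    i.e. the data before or after the anchor moment."""
    return [(t, v) for t, v in samples if t0 - before_s <= t <= t0 + after_s]

def frame_to_time(frame_index: int, fps: float = 30.0) -> float:
    """Convert a video frame index to a timestamp for segment statistics."""
    return frame_index / fps
```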
The application also provides an evaluation method of the intelligent cabin human-computer interface, which comprises the following steps:
the method comprises the steps that vehicle evaluation data and driver evaluation data are collected through a data collection module, and the data collection module is used for collecting data based on installed recording equipment;
the method comprises the steps that the interactive behavior of a driver is determined through an interactive behavior analysis module, the interactive behavior analysis module determines the interactive behavior of the driver according to driver evaluation data collected by a data collection module, the interactive behavior comprises limb behaviors and psychological activities, and the driver evaluation data comprises video data, eye movement data, electroencephalogram data and/or electrocardiographic data;
determining the fatigue level of the driver through a delay analysis module, wherein the delay analysis module calculates the fluctuation time difference of the evaluation data of the driver based on the same trigger event, and determines the fatigue level of the driver based on the fluctuation time difference;
comparing the vehicle running track of the vehicle evaluation data acquired by the data acquisition module with a preset track through the track analysis module, and outputting track comparison data;
determining a running stability grade of the vehicle through a vehicle running analysis module, wherein the vehicle running analysis module determines the running stability grade of the vehicle according to the yaw rate, centroid sideslip angle and lateral acceleration in the vehicle evaluation data acquired by the data acquisition module;
the method comprises the steps that a questionnaire scale fed back by a driving end is obtained through a subjective evaluation module, the subjective evaluation module sends a preset questionnaire scale to the driving end, and the questionnaire scale fed back by the driving end is received;
and the report generation module is used for receiving the data output by the data acquisition module, the track analysis module, the interaction behavior analysis module, the vehicle running analysis module, the delay analysis module and the subjective evaluation module, and outputting the received data as an analysis report according to a report generation instruction.
The application also provides a computer readable storage medium, on which a computer program is stored which, when executed by a processor, implements the steps of the method for evaluating a human-computer interface of an intelligent cabin. The computer readable storage medium may be a tangible storage medium such as Random Access Memory (RAM), memory, Read-Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a floppy disk, a hard disk, a removable memory disk, a CD-ROM, or any other form of storage medium known in the art.
Additional advantages, objects, and features of the application will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and drawings.
It will be appreciated by those skilled in the art that the objects and advantages that can be achieved with the present application are not limited to the above-described specific ones, and that the above and other objects that can be achieved with the present application will be more clearly understood from the following detailed description.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate and together with the description serve to explain the application.
FIG. 1 is a schematic diagram of an embodiment of an evaluation system for an intelligent cockpit human-computer interface of the present application;
FIG. 2 is a schematic diagram of another embodiment of an evaluation system for an intelligent cockpit human-computer interface of the present application;
FIG. 3 is a schematic diagram of one embodiment of an interaction behavior analysis module.
Detailed Description
The present application will be described in further detail with reference to the following embodiments and the accompanying drawings, in order to make the objects, technical solutions and advantages of the present application more apparent. The exemplary embodiments of the present application and the descriptions thereof are used herein to explain the present application, but are not intended to limit the application.
It should be noted here that, in order to avoid obscuring the present application due to unnecessary details, only structures and/or processing steps closely related to the solution according to the present application are shown in the drawings, while other details not greatly related to the present application are omitted.
It should be emphasized that the term "comprises/comprising" when used herein is taken to specify the presence of stated features, elements, steps or components, but does not preclude the presence or addition of one or more other features, elements, steps or components.
It is also noted herein that the term "coupled" may refer to not only a direct connection, but also an indirect connection in which an intermediate is present, unless otherwise specified.
Hereinafter, embodiments of the present application will be described with reference to the accompanying drawings. In the drawings, the same reference numerals represent the same or similar components, or the same or similar steps.
In order to solve the above problems, as shown in fig. 1 and 2, the present application provides an evaluation system for a man-machine interface of an intelligent cabin, comprising:
the intelligent cabin HMI prototype intelligent reading technology in fig. 1: by developing the HMI central control system platform, the HMI source data is directly read, the HMI source data comprises but is not limited to HMI codes, the component information of the HMI human-computer interaction interface comprises positions, sizes, colors and the like, the human-computer interaction data of a driver and the HMI interface are self-calculated and analyzed, and the human-computer interaction data comprise finger clicking, voice interaction, eye control interaction and the like.
The data acquisition module is based on the installed recording equipment and is used for acquiring vehicle evaluation data and driver evaluation data;
the recording device includes a biosensor, an environmental sensor, and a vehicle parameter sensor;
the biological sensor comprises a blood volume pulse measuring instrument, a skin electric measuring instrument, a respiration measuring instrument, a skin temperature sensor, a blood oxygen saturation measuring instrument, an electrocardiograph, a myoelectricity measuring instrument, a gyroscope, an acceleration sensor, a magnetometer, a motion capture recorder, a wireless electroencephalogram analyzer, a remote eye movement recorder, a virtual reality eye movement recorder and a camera;
the motion capture recorder comprises a real-time angle sensor, a position sensor, a speed sensor and a torque sensor which are arranged at each joint of a driver;
the environment sensor comprises a temperature sensor, a humidity sensor, an atmospheric pressure sensor and an illumination sensor;
the vehicle parameter sensor comprises a GPS, an IMU, a pedal pressure sensor, a steering wheel force angle sensor, an engine rotating speed sensor, a vehicle speed sensor, an engine temperature sensor, an air inlet pressure sensor and a vehicle load sensor.
The interactive behavior analysis module is used for determining the interactive behavior of the driver according to the driver evaluation data acquired by the data acquisition module, wherein the interactive behavior includes limb behaviors and psychological activities, and the driver evaluation data includes video data, eye movement data, electroencephalogram data and/or electrocardiographic data;
the limb behaviors include changes in joint angle and changes in eye movement data; the psychological activities include changes in the driver's electroencephalogram data, changes in the electrocardiographic data, and conscious behaviors of the driver.
The video data is recorded through a camera, the eye movement data is recorded through a remote eye movement recorder or a virtual reality eye movement recorder, the electroencephalogram data is recorded through a wireless electroencephalogram analyzer, and the electrocardiographic data is recorded through an electrocardiograph.
The delay analysis module is used for calculating fluctuation time difference of the evaluation data of the driver based on the same trigger event and determining the fatigue level of the driver based on the fluctuation time difference;
the track analysis module is used for comparing the vehicle running track of the vehicle evaluation data acquired by the data acquisition module with a preset track and outputting track comparison data;
In some embodiments of the present application, the vehicle running track is recorded by GPS. The recorded track and the preset track are drawn in the same display area, which may be the same map; the image comparing the two tracks within that display area constitutes the track comparison data.
The portions of the vehicle running track that deviate from the preset track are obtained from the track comparison data.
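A minimal sketch of how the deviating portion of the track could be extracted by point-wise comparison of the two tracks; the planar coordinates (meters), the tolerance value, and the function name are illustrative assumptions, not specified by the application:

```python
from math import hypot

def track_deviation(recorded, preset, tolerance_m=2.0):
    """Return indices of recorded track points farther than tolerance_m
    from the matching preset-track point (point-wise comparison)."""
    deviating = []
    for i, ((rx, ry), (px, py)) in enumerate(zip(recorded, preset)):
        if hypot(rx - px, ry - py) > tolerance_m:
            deviating.append(i)
    return deviating
```

The returned indices identify the segments of the recorded track that deviate from the preset track, which could then be highlighted in the shared display area.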
The vehicle running analysis module is used for determining the running stability grade of the vehicle according to the yaw rate, the centroid sideslip angle data and the lateral acceleration in the vehicle evaluation data acquired by the data acquisition module;
in some embodiments of the application, the vehicle assessment data further includes a sideslip angle and a friction circle;
the yaw rate, the centroid sideslip angle data, the sideslip angle and the friction circle are all measured through the IMU, and the lateral acceleration is measured through the vehicle speed sensor;
the subjective evaluation module is used for sending a preset questionnaire scale to the driving end and receiving the questionnaire scale fed back by the driving end;
The step of determining the running stability grade of the vehicle may specifically be: preset a yaw rate threshold, a centroid sideslip angle threshold and a lateral acceleration threshold; when any one of the yaw rate, the centroid sideslip angle data and the lateral acceleration exceeds its corresponding threshold, the vehicle is determined to be in an unstable state; otherwise, it is in a stable state.
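The thresholding step just described can be sketched as follows; the numeric thresholds are illustrative assumptions (the application specifies no values), and the function name is hypothetical:

```python
# Hypothetical thresholds; the application does not give numeric values.
YAW_RATE_MAX = 0.35        # rad/s
SIDESLIP_ANGLE_MAX = 0.10  # rad, centroid sideslip angle
LATERAL_ACC_MAX = 4.0      # m/s^2

def driving_stability_state(yaw_rate, sideslip_angle, lateral_acc):
    """Return 'unstable' if any quantity exceeds its threshold, else 'stable'."""
    if (abs(yaw_rate) > YAW_RATE_MAX
            or abs(sideslip_angle) > SIDESLIP_ANGLE_MAX
            or abs(lateral_acc) > LATERAL_ACC_MAX):
        return "unstable"
    return "stable"
```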
In some embodiments of the present application, the questionnaire scale includes questionnaire questions, which may cover a driving comfort score, a manipulability score, a vehicle performance score, etc. The driving end may be a computer installed on the vehicle; the driver fills in the questionnaire on it and, after completion, the questionnaire scale is uploaded to reflect the driver's subjective feelings.
The report generation module is used for receiving the data output by the data acquisition module, the track analysis module, the interaction behavior analysis module, the vehicle running analysis module, the delay analysis module and the subjective evaluation module, and outputting the received data as an analysis report according to a report generation instruction.
In some embodiments of the present application, the report generating module extracts all or part of the data output by the data acquisition module, the track analysis module, the interaction behavior analysis module, the vehicle driving analysis module, the delay analysis module and the subjective assessment module based on the report generating instruction, and creates an analysis report;
The data output by the data acquisition module, the track analysis module, the interaction behavior analysis module, the vehicle running analysis module, the delay analysis module and the subjective evaluation module may be divided based on time or on any trigger event. The trigger event includes an obstacle event, a steering event, a fatigue event or an electrocardiographic event: the steering event is the vehicle steering; the fatigue event may be the driver reaching a fatigue threshold, which may be fatigue grade 2; the electrocardiographic event may be the driver's electrocardiographic data exceeding a preset threshold. The analysis report is output for the time located by the trigger event.
By adopting this scheme, the system can collect data throughout the experiment, including eye movement, heart rate, galvanic skin response, skin temperature, respiration, electroencephalogram, near-infrared, motion capture, behavior, interaction records, facial expression monitoring and the like, all synchronized on the same platform, with every data stream recorded directly to the background. In the experiment playback stage, data analysis is performed; the system is compatible with multiple analysis modules for subjective and objective data, containing diversified standard analysis algorithms. The analyzed data can be exported with one key according to the operator's report generation instruction, and the exported report can combine the subjective evaluation module, the data acquisition module, the track analysis module, the interaction behavior analysis module, the vehicle running analysis module and the delay analysis module, so that the subjective and objective evaluation is protected from the influence of external factors.
As shown in fig. 3, in some embodiments of the present application, the interaction behavior analysis module includes:
the action behavior analysis module is used for calculating the number and frequency of each action behavior according to the recorded action behaviors of the driver;
the action may be braking, adjusting a seat, etc.
The above actions can be identified by camera recording.
The eye movement behavior analysis module is used for determining the driver's gazing behaviors in each interest area according to the driver's eye movement data, based on interest areas into which the in-vehicle space is divided in advance, and determining the gazing time of each gazing behavior;
in some embodiments of the present application, the division of the interest area may divide the dashboard into one interest area and the vehicle navigation into one interest area.
The electroencephalogram behavior analysis module is used for determining the current conscious behavior of the driver based on the electroencephalogram data in the five frequency bands δ, θ, α, β and γ;
The δ wave spans 0 Hz-4 Hz: when its power is high, the driver is judged unable to think; when low, the brain is judged unable to recover, indicating poor sleep.
The θ wave spans 4 Hz-8 Hz: when high, impulsive behavior or inattention is judged; when low, anxiety symptoms are judged.
The α wave spans 8 Hz-12 Hz: when high, inability to focus attention is judged; when low, anxiety symptoms are judged.
The β wave spans 12 Hz-40 Hz: when high, tension and inability to relax are judged; when low, poor concentration and cognitive ability are judged.
The γ wave spans 40 Hz-100 Hz: when high, anxiety is judged; when low, hyperactivity symptoms are judged.
In some embodiments of the present application, "high" and "low" are determined against preset thresholds for each band, and the combined results of the judgments over the five bands δ, θ, α, β and γ constitute the conscious behavior.
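Under that per-band threshold reading, the band-to-state mapping could be sketched as follows; the band limits and state labels follow the description, while the threshold values and function name are assumptions:

```python
# Band limits (Hz) as given in the description.
BANDS = {
    "delta": (0, 4), "theta": (4, 8), "alpha": (8, 12),
    "beta": (12, 40), "gamma": (40, 100),
}

# State labels for high/low power per band, as described in the text.
HIGH_STATE = {
    "delta": "unable to think",
    "theta": "impulsive / inattentive",
    "alpha": "unable to focus attention",
    "beta": "tense, unable to relax",
    "gamma": "anxious",
}
LOW_STATE = {
    "delta": "brain not recovering, poor sleep",
    "theta": "anxiety",
    "alpha": "anxiety",
    "beta": "poor concentration",
    "gamma": "hyperactivity",
}

def conscious_behavior(band_power, high_thr, low_thr):
    """Map per-band power to the described states using preset thresholds."""
    states = {}
    for band, power in band_power.items():
        if power > high_thr[band]:
            states[band] = HIGH_STATE[band]
        elif power < low_thr[band]:
            states[band] = LOW_STATE[band]
        else:
            states[band] = "normal"
    return states
```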
And the electrocardiographic behavior analysis module is used for determining the fluctuation range of the heart rate within a preset duration from the collected heart rate of the driver, and determining whether the driver is tense based on that fluctuation range.
The psychological activities include tension, anxiety and inattention.
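A sketch of the heart-rate fluctuation check described above; the 15 bpm fluctuation threshold and the function name are assumed for illustration, as the application does not specify values:

```python
def is_tense(heart_rates, max_fluctuation_bpm=15):
    """Flag tension when the heart-rate fluctuation range within the
    observation window exceeds a preset threshold (assumed value)."""
    if not heart_rates:
        return False
    return (max(heart_rates) - min(heart_rates)) > max_fluctuation_bpm
```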
As shown in fig. 3, in some embodiments of the application, the interaction behavior analysis module includes an action gesture analysis module,
the action gesture analysis module is used for determining the driver's current action health grade based on the joint angles of the driver acquired by the data acquisition module.
In some embodiments of the present application, in the step of determining the driver's current action health grade, the grade may be determined by how far a joint angle exceeds a preset normal range: for example, exceeding the range by 3 degrees corresponds to grade 1, and exceeding it by 6 degrees corresponds to grade 2.
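The 3-degrees-per-grade example can be sketched as follows; the normal-range bounds, the linear grading rule beyond the two given data points, and the function name are assumptions:

```python
import math

def action_health_grade(joint_angle_deg, normal_min=0.0, normal_max=90.0,
                        step_deg=3.0):
    """Grade 0 inside the normal range; one grade per 3 degrees beyond it,
    matching the 3-degree -> grade 1, 6-degree -> grade 2 example."""
    if normal_min <= joint_angle_deg <= normal_max:
        return 0
    excess = (normal_min - joint_angle_deg if joint_angle_deg < normal_min
              else joint_angle_deg - normal_max)
    return math.ceil(excess / step_deg)
```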
As shown in fig. 3, in some embodiments of the present application, the interactive behavior analysis module includes an expression action analysis module,
the expression action analysis module is used for identifying the driver's expression, based on a preset neural network model, from the facial segments of the driver in the video data acquired by the data acquisition module.
The expression action analysis module can recognize expressions through a preset convolutional neural network model.
In some embodiments of the present application, in the step of calculating the number and frequency of each action behavior, the action behavior analysis module calculates them from the driver's action behaviors recorded in the video data.
The action behavior comprises gear engagement, braking, turning and the like.
In some embodiments of the present application, in the step of calculating the number and frequency of each action behavior, the action behavior analysis module obtains the operation records of the vehicle's own devices, determines the driver's action behaviors from the devices operated, and determines the number and frequency of action behaviors from the number and frequency of device operations.
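In either embodiment, counting the number and frequency of each action behavior might be sketched as below; interpreting frequency as occurrences per unit time, and the function name, are assumptions:

```python
from collections import Counter

def action_statistics(actions, duration_s):
    """Count each recorded action behavior and its frequency
    (occurrences per second over the observation window)."""
    counts = Counter(actions)
    return {a: {"count": n, "freq_hz": n / duration_s}
            for a, n in counts.items()}
```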
In some embodiments of the application, the delay analysis module, in the step of calculating a fluctuation time difference of the driver assessment data based on the same trigger event,
the triggering event comprises an obstacle event, the time of the occurrence of the obstacle in front of the vehicle is obtained from the data acquisition module, the time of the first occurrence of fluctuation of the evaluation data of the driver after the time of the occurrence of the obstacle is taken as the reaction time, the time difference between the time of the occurrence of the obstacle and the reaction time is taken as the fluctuation time difference, and the fatigue level of the driver is determined based on the fluctuation time difference.
In some embodiments of the present application, the system captures the appearance of a front obstacle through the camera and records its time, then records the time at which the driver reacts; the reaction time may be the time of pupil dilation recorded by the remote eye movement recorder or the virtual reality eye movement recorder, the time of a peak in the electroencephalogram data, or the time of a peak in the electrocardiographic data.
In some embodiments of the present application, the fatigue level corresponding to the fluctuation time difference is preset, for example, the fluctuation time difference is 1ms corresponding to the fatigue level 1, and the fluctuation time difference is 2ms corresponding to the fatigue level 2.
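The mapping from fluctuation time difference to fatigue grade in this example can be sketched as follows; the linear one-unit-per-grade rule generalizes the two data points given (1 ms → grade 1, 2 ms → grade 2) and is otherwise an assumption, as is the function name:

```python
import math

def fatigue_grade(obstacle_time_ms, reaction_time_ms, ms_per_grade=1.0):
    """Fluctuation time difference = reaction time - obstacle time;
    one fatigue grade per ms_per_grade of delay."""
    diff = reaction_time_ms - obstacle_time_ms
    return max(0, math.ceil(diff / ms_per_grade))
```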
In some embodiments of the present application, the delay analysis module is configured to, in the step of taking, as the reaction time, a time at which the driver's evaluation data fluctuates for the first time after the time at which the obstacle appears:
the time of first pupil dilation after the time of obstacle appearance is taken as the time of first fluctuation of eye movement data of driver evaluation data;
and taking the time of the first occurrence of the peak of the electroencephalogram data and/or the electrocardiographic data of the evaluation data of the driver after the time of the occurrence of the obstacle as the time of the first occurrence of the fluctuation of the electroencephalogram data or the electrocardiographic data.
In some embodiments of the present application, the report generation module outputs the received data as the analysis report according to a report generation instruction, where the instruction designates either all or part of the data output by the data acquisition module, the track analysis module, the interaction behavior analysis module, the vehicle running analysis module, the delay analysis module and the subjective evaluation module.
In some embodiments of the present application, if the report generating instruction is a part of data that designates the data acquisition module, the track analysis module, the interaction behavior analysis module, the vehicle driving analysis module, and the delay analysis module, the instruction mode includes an event statistics report instruction and a fragment statistics report instruction:
if the report generation instruction is an event statistics report instruction, based on a trigger event, counting data before or after the trigger event occurs, and displaying the data as an analysis report;
and if the report generation instruction is a segment statistics report instruction, based on one frame of image in the video data, counting the data before or after the frame of image, and presenting the data as an analysis report.
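The two instruction modes can be sketched as simple time-based slicing of the recorded data; the sample representation (timestamp, value pairs) and function names are assumptions:

```python
def slice_by_event(samples, event_time, before=True):
    """Event-statistics report: keep samples before (or after) the
    trigger event. Each sample is a (timestamp_s, value) pair."""
    if before:
        return [s for s in samples if s[0] <= event_time]
    return [s for s in samples if s[0] > event_time]

def slice_by_frame(samples, frame_index, fps, before=True):
    """Segment-statistics report: anchor the cut at a video frame by
    converting the frame index to a timestamp."""
    return slice_by_event(samples, frame_index / fps, before)
```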
The application also provides an evaluation method of the intelligent cabin human-computer interface, which comprises the following steps:
the method comprises the steps that vehicle evaluation data and driver evaluation data are collected through a data collection module, and the data collection module is used for collecting data based on installed recording equipment;
the method comprises the steps that the interactive behavior of a driver is determined through an interactive behavior analysis module, the interactive behavior analysis module determines the interactive behavior of the driver according to driver evaluation data collected by a data collection module, the interactive behavior comprises limb behaviors and psychological activities, and the driver evaluation data comprises video data, eye movement data, electroencephalogram data and/or electrocardiographic data;
determining the fatigue level of the driver through a delay analysis module, wherein the delay analysis module calculates the fluctuation time difference of the evaluation data of the driver based on the same trigger event, and determines the fatigue level of the driver based on the fluctuation time difference;
comparing the vehicle running track of the vehicle evaluation data acquired by the data acquisition module with a preset track through the track analysis module, and outputting track comparison data;
determining the running stability grade of the vehicle through the vehicle running analysis module, wherein the vehicle running analysis module determines the running stability grade according to the yaw rate, the centroid sideslip angle data and the lateral acceleration in the vehicle evaluation data acquired by the data acquisition module;
the method comprises the steps that a questionnaire scale fed back by a driving end is obtained through a subjective evaluation module, the subjective evaluation module sends a preset questionnaire scale to the driving end, and the questionnaire scale fed back by the driving end is received;
and the report generation module is used for receiving the data output by the data acquisition module, the track analysis module, the interaction behavior analysis module, the vehicle running analysis module, the delay analysis module and the subjective evaluation module, and outputting the received data as an analysis report according to a report generation instruction.
The application also provides an evaluation device of the intelligent cabin human-computer interface, which comprises computer equipment, wherein the computer equipment comprises a processor and a memory, the memory is stored with computer instructions, the processor is used for executing the computer instructions stored in the memory, and when the computer instructions are executed by the processor, the device realizes the functions of each module of the system as described above.
The application also provides a computer-readable storage medium, on which a computer program is stored; when executed by a processor, the program implements the steps of the method for evaluating a human-computer interface of an intelligent cabin. The computer-readable storage medium may be a tangible storage medium such as Random Access Memory (RAM), Read-Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a floppy disk, a hard disk, a removable memory disk, a CD-ROM, or any other form of storage medium known in the art.
Those of ordinary skill in the art will appreciate that the various illustrative components, systems and methods described in connection with the embodiments disclosed herein can be implemented as hardware, software, or a combination of both. Whether a particular implementation is hardware or software depends on the specific application and the design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application. When implemented in hardware, it may be, for example, an electronic circuit, an application-specific integrated circuit (ASIC), suitable firmware, a plug-in, a function card, or the like. When implemented in software, the elements of the application are the program or code segments used to perform the required tasks. The program or code segments may be stored in a machine-readable medium or transmitted over a transmission medium or a communication link by a data signal carried in a carrier wave.
It should be understood that the application is not limited to the particular arrangements and instrumentality described above and shown in the drawings. For the sake of brevity, a detailed description of known methods is omitted here. In the above embodiments, several specific steps are described and shown as examples; however, the process of the method of the present application is not limited to those specific steps, and those skilled in the art may make various changes, modifications and additions, or alter the order between steps, after appreciating the spirit of the present application.
In this disclosure, features that are described and/or illustrated with respect to one embodiment may be used in the same way or in a similar way in one or more other embodiments and/or in combination with or instead of the features of the other embodiments.
The above description is only of the preferred embodiments of the present application and is not intended to limit the present application, and various modifications and variations may be made to the embodiments of the present application by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. An assessment system for an intelligent cockpit human-machine interface, the system comprising:
the data acquisition module is based on the installed recording equipment and is used for acquiring vehicle evaluation data and driver evaluation data;
the interactive behavior analysis module is used for determining the interactive behavior of the driver according to the driver evaluation data acquired by the data acquisition module, wherein the interactive behavior comprises limb behaviors and psychological activities, and the driver evaluation data comprises video data, eye movement data, electroencephalogram data and/or electrocardiographic data;
the delay analysis module is used for calculating fluctuation time difference of the evaluation data of the driver based on the same trigger event and determining the fatigue level of the driver based on the fluctuation time difference;
the track analysis module is used for comparing the vehicle running track of the vehicle evaluation data acquired by the data acquisition module with a preset track and outputting track comparison data;
the vehicle running analysis module is used for determining the running stability grade of the vehicle according to the yaw rate, the centroid sideslip angle data and the lateral acceleration in the vehicle evaluation data acquired by the data acquisition module;
the subjective evaluation module is used for sending a preset questionnaire scale to the driving end and receiving the questionnaire scale fed back by the driving end;
the report generation module is used for receiving the data output by the data acquisition module, the track analysis module, the interaction behavior analysis module, the vehicle running analysis module, the delay analysis module and the subjective evaluation module, and outputting the received data as an analysis report according to a report generation instruction.
2. The assessment system of an intelligent cockpit human-computer interface of claim 1 wherein the interactive behavior analysis module comprises:
the action behavior analysis module is used for calculating the times and the frequency of each action behavior according to the action behaviors recorded by the driver;
the eye movement behavior analysis module is used for determining the gazing behaviors of the driver in each interest area according to the eye movement data of the driver based on the interest areas which are divided into the in-vehicle space in advance, and determining the gazing time of each gazing behavior;
the electroencephalogram behavior analysis module is used for determining the current conscious behavior of the driver based on the electroencephalogram data in the five frequency bands δ, θ, α, β and γ;
and the electrocardio behavior analysis module is used for determining the fluctuation range of the heart rate in a preset time length according to the collected heart rate of the driver and determining whether the driver has tension or not based on the fluctuation range of the heart rate data in the preset time length.
3. The assessment system of an intelligent cockpit human-computer interface of claim 1 or 2 wherein the interactive behavior analysis module comprises an action gesture analysis module,
the action gesture analysis module is used for determining the current action health grade of the driver based on the angles of the joints according to the angles of the joints of the driver acquired by the data acquisition module.
4. The assessment system of an intelligent cockpit human-computer interface of claim 1 or 2 wherein the interactive behavior analysis module comprises an expression action analysis module,
the expression action analysis module is used for identifying the expression of the driver based on a preset neural network model according to the facial segments of the driver in the video data acquired by the data acquisition module.
5. The assessment system of claim 1, wherein the delay analysis module, in the step of calculating a fluctuation time difference of the driver assessment data based on the same trigger event,
the triggering event comprises an obstacle event, the time of the occurrence of the obstacle in front of the vehicle is obtained from the data acquisition module, the time of the first occurrence of fluctuation of the evaluation data of the driver after the time of the occurrence of the obstacle is taken as the reaction time, the time difference between the time of the occurrence of the obstacle and the reaction time is taken as the fluctuation time difference, and the fatigue level of the driver is determined based on the fluctuation time difference.
6. The evaluation system of an intelligent cockpit human-machine interface of claim 5, wherein the delay analysis module, in the step of taking as the reaction time the time at which the driver's evaluation data first fluctuates after the time at which the obstacle appears:
the time of first pupil dilation after the time of obstacle appearance is taken as the time of first fluctuation of eye movement data of driver evaluation data;
and taking the time of the first occurrence of the peak of the electroencephalogram data and/or the electrocardiographic data of the evaluation data of the driver after the time of the occurrence of the obstacle as the time of the first occurrence of the fluctuation of the electroencephalogram data or the electrocardiographic data.
7. The evaluation system of an intelligent cabin human-computer interface according to claim 1, wherein, in the step of outputting the received data as an analysis report according to a report generation instruction, the report generation instruction designates either all or part of the data output by the data acquisition module, the track analysis module, the interaction behavior analysis module, the vehicle running analysis module, and the delay analysis module.
8. The system of claim 7, wherein if the report generation instruction is a part of data that designates the data acquisition module, the track analysis module, the interaction behavior analysis module, the vehicle driving analysis module, and the delay analysis module, the instruction mode includes an event statistics report instruction and a segment statistics report instruction:
if the report generation instruction is an event statistics report instruction, based on a trigger event, counting data before or after the trigger event occurs, and displaying the data as an analysis report;
and if the report generation instruction is a segment statistics report instruction, based on one frame of image in the video data, counting the data before or after the frame of image, and presenting the data as an analysis report.
9. An evaluation method of an intelligent cabin man-machine interface is characterized by comprising the following steps:
the method comprises the steps that vehicle evaluation data and driver evaluation data are collected through a data collection module, and the data collection module is used for collecting data based on installed recording equipment;
the method comprises the steps that the interactive behavior of a driver is determined through an interactive behavior analysis module, the interactive behavior analysis module determines the interactive behavior of the driver according to driver evaluation data collected by a data collection module, the interactive behavior comprises limb behaviors and psychological activities, and the driver evaluation data comprises video data, eye movement data, electroencephalogram data and/or electrocardiographic data;
determining the fatigue level of the driver through a delay analysis module, wherein the delay analysis module calculates the fluctuation time difference of the evaluation data of the driver based on the same trigger event, and determines the fatigue level of the driver based on the fluctuation time difference;
comparing the vehicle running track of the vehicle evaluation data acquired by the data acquisition module with a preset track through the track analysis module, and outputting track comparison data;
determining the running stability grade of the vehicle through the vehicle running analysis module, wherein the vehicle running analysis module determines the running stability grade according to the yaw rate, the centroid sideslip angle data and the lateral acceleration in the vehicle evaluation data acquired by the data acquisition module;
the method comprises the steps that a questionnaire scale fed back by a driving end is obtained through a subjective evaluation module, the subjective evaluation module sends a preset questionnaire scale to the driving end, and the questionnaire scale fed back by the driving end is received;
and the report generation module is used for receiving the data output by the data acquisition module, the track analysis module, the interaction behavior analysis module, the vehicle running analysis module, the delay analysis module and the subjective evaluation module, and outputting the received data as an analysis report according to a report generation instruction.
10. A computer readable storage medium having stored thereon a computer program which when executed by a processor implements the assessment method of a smart cockpit human-machine interface of claim 9.
CN202211531796.4A 2022-12-01 2022-12-01 Evaluation system and method for intelligent cabin man-machine interface and storage medium Pending CN116611605A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211531796.4A CN116611605A (en) 2022-12-01 2022-12-01 Evaluation system and method for intelligent cabin man-machine interface and storage medium


Publications (1)

Publication Number Publication Date
CN116611605A true CN116611605A (en) 2023-08-18

Family

ID=87673413

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211531796.4A Pending CN116611605A (en) 2022-12-01 2022-12-01 Evaluation system and method for intelligent cabin man-machine interface and storage medium

Country Status (1)

Country Link
CN (1) CN116611605A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102855410A (en) * 2012-09-20 2013-01-02 上海品铭机械工程有限公司 Method and system for evaluation of man-machine work efficiency of cabin simulation test bed
CN113359689A (en) * 2021-06-04 2021-09-07 西北工业大学 New man-machine cooperative intelligent navigation technology in unstructured environment
CN113460062A (en) * 2021-07-30 2021-10-01 安波福电子(苏州)有限公司 Driving behavior analysis system
CN114298469A (en) * 2021-11-24 2022-04-08 重庆大学 User experience test evaluation method for intelligent cabin of automobile
CN114347998A (en) * 2022-01-07 2022-04-15 中山大学 Vehicle driving assistance control method, system, equipment and medium
CN114889591A (en) * 2022-05-12 2022-08-12 合肥职业技术学院 Minimum safe distance generation algorithm based on driver reaction time
WO2022183449A1 (en) * 2021-03-04 2022-09-09 吉林大学 Automatic driving testing method and system, vehicle and storage medium
CN115165399A (en) * 2022-08-23 2022-10-11 中汽院智能网联科技有限公司 Intelligent automobile human-computer interaction evaluation system and test method
CN115407872A (en) * 2022-08-12 2022-11-29 北京津发科技股份有限公司 Intelligent man-machine cooperative system evaluation method and device and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
SHUHANG HAN ET AL.: "Evaluation method of human-vehicle tactile interaction experience based on EEG", 2020 12th International Conference on Intelligent Human-Machine Systems and Cybernetics, 23 September 2020 (2020-09-23) *
YU Shucong (郁淑聪) et al.: "基于驾驶员的智能座舱人机工效测评研究" [Driver-based ergonomics evaluation of the intelligent cockpit], Automotive Engineering (《汽车工程》), vol. 44, no. 1, 25 January 2022 (2022-01-25), pages 37-41 *

Similar Documents

Publication Publication Date Title
Doudou et al. Driver drowsiness measurement technologies: Current research, market solutions, and challenges
Healey et al. Detecting stress during real-world driving tasks using physiological sensors
Begum Intelligent driver monitoring systems based on physiological sensor signals: A review
Braunagel et al. Online recognition of driver-activity based on visual scanpath classification
Aljaafreh et al. Driving style recognition using fuzzy logic
Pratama et al. A review on driver drowsiness based on image, bio-signal, and driver behavior
Ji et al. Real-time nonintrusive monitoring and prediction of driver fatigue
Katsis et al. Toward emotion recognition in car-racing drivers: A biosignal processing approach
CN111104820A (en) Gesture recognition method based on deep learning
CN107822621A (en) Integrated on-board data collection
Nacpil et al. Application of physiological sensors for personalization in semi-autonomous driving: A review
Rong et al. Artificial intelligence methods in in-cabin use cases: A survey
CN114298469A (en) User experience test evaluation method for intelligent cabin of automobile
CN114387587A (en) Fatigue driving monitoring method
CN111858696A (en) Assessing cognitive responses to airborne updates
Milardo et al. Understanding drivers’ stress and interactions with vehicle systems through naturalistic data analysis
Tao et al. A multimodal physiological dataset for driving behaviour analysis
Walocha et al. Activity and stress estimation based on OpenPose and electrocardiogram for user-focused level-4-vehicles
Fu et al. Advancements in the Intelligent Detection of Driver Fatigue and Distraction: A Comprehensive Review
Dehzangi et al. Unobtrusive driver drowsiness prediction using driving behavior from vehicular sensors
Ani et al. A critical review on driver fatigue detection and monitoring system
CN116611605A (en) Evaluation system and method for intelligent cabin man-machine interface and storage medium
Sukumar et al. Physiological and Physical Sensors for Stress Level, Drowsiness Detection, and Behaviour Analysis
Gong et al. TFAC-Net: A Temporal-Frequential Attentional Convolutional Network for Driver Drowsiness Recognition With Single-Channel EEG
Armenta et al. An intelligent multi-sourced sensing system to study driver’s visual behaviors

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination