CN113734166B - Automatic automobile driving control system and method based on sensing fusion SWC - Google Patents
Info
- Publication number
- CN113734166B (application CN202111155811.5A)
- Authority
- CN
- China
- Prior art keywords
- data
- module
- rte
- swc
- signal
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/14—Adaptive cruise control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R16/00—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
- B60R16/02—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
- B60R16/023—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
- B60R16/0231—Circuits relating to the driving or the functioning of the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/09—Taking automatic action to avoid collision, e.g. braking and steering
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/50—Barriers
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Mechanical Engineering (AREA)
- Transportation (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention discloses an automobile automatic driving control system based on perception fusion and relates to automobile automatic driving technology. L3 automatic driving perception code and simulink-based SWC function modules such as AEB and ACC are integrated in an open-source compiling environment, and a complete automobile automatic driving framework covering data transceiving, perception processing, target identification and decision planning is constructed through a middleware interface component, so that low-cost L1/L2 automatic driving functions can be realized autonomously and can be extended and iterated. Road target and obstacle data are collected and identified; all external SWC module signals are integrated, safety functions are checked, signal states are given and signal distribution is completed; the multi-sensor data are fused in a single-thread data processing mode and target-level data are output. The AEB function SWC controls emergency obstacle avoidance of the vehicle with respect to obstacles ahead, and the ACC function SWC realizes intelligent adaptive cruise of the vehicle.
Description
Technical Field
The invention relates to an automobile control system, in particular to emergency obstacle avoidance and intelligent adaptive cruise in an unmanned automobile.
Background
At present, the perception fusion and control function code that can run on a low-cost system on chip is mostly a commercial integrated software package, represented by Bosch, together with its corresponding hardware system. Scheme one: perception code and control code based on various open-source development platforms independently realize target environment perception and classification, vehicle trajectory planning, and control. The cross-platform functions are dispersed and difficult to develop as a coherent whole. Scheme two: a complete L1/L2 functional architecture based on a vehicle-level domain controller, including level-one and level-two automatic driving functions such as AEB and IACC. The function code is closed-source and difficult to modify, and the cost is high.
Chinese patent publication No. CN112429012A discloses an automobile electric control system, an automatic driving control method and an automobile. The system comprises an Ethernet bus and, connected to it, an intelligent cabin domain processing unit, an automatic driving domain processing unit, a vehicle control domain processing unit, a central control domain processing unit and a sensor system; an automatic driving limp-home algorithm program is backed up in the intelligent cabin domain processing unit. The automatic driving domain processing unit receives the sensing data placed on the Ethernet bus by the sensor system and outputs corresponding automatic driving control instructions to the vehicle control domain processing unit; the vehicle control domain processing unit receives these instructions and controls vehicle motion based on them. The vehicle control domain processing unit also monitors the running states of the automatic driving domain processing unit, the intelligent cabin domain processing unit and the vehicle control domain in real time; when the running state of the automatic driving domain processing unit is abnormal and that of the intelligent cabin domain processing unit is normal, the intelligent cabin domain processing unit is controlled to run the automatic driving limp-home algorithm program and to receive sensing data from the Ethernet bus for fusion operation and decision control, so that the vehicle control domain processing unit performs limp-home safety control of the vehicle. This addresses failure of the automatic driving domain controller, ensures the safety of the vehicle and its passengers, and provides a safety backup for the automatic driving domain controller functions.
The prior art disclosed by that patent uses a domain controller as its hardware basis and is functionally complete, but the system is bulky, costly and complex. Using it directly for assisted-safety automatic driving functions centered on automatic emergency braking and intelligent adaptive cruise clearly wastes software and hardware resources. Moreover, the perception data collected in various formats by the various vehicle sensors are not fused, which hinders sharing and interactive use of data across different systems and vehicle models. The present invention takes automatic emergency braking and intelligent adaptive cruise as its functional requirements and, for these characteristics, develops a lightweight, low-cost, high-performance automatic driving system based on fused perception data.
Disclosure of Invention
Aiming at the problems of dispersed cross-platform functions that are difficult to develop as a whole, closed-source function code that cannot be modified, and high cost, the invention integrates, in the open-source compiling environment HighTec, L3 automatic driving perception code together with function modules such as automatic emergency braking AEB and intelligent adaptive cruise ACC built with the matlab toolkit simulink, and constructs, through the middleware interface component RTE, a complete automobile automatic driving framework covering data transceiving, perception processing, target identification and decision planning, so that low-cost assisted or partially assisted automatic driving functions can be realized autonomously and can be extended and iterated.
The technical scheme for solving the above technical problem is an automobile automatic driving control system based on perception fusion, comprising a data transceiver module, a signal middleware interface RTE module, a perception fusion SWC module, an emergency braking AEB function SWC module and an ACC function SWC module. The data transceiver module acquires road target and obstacle data in front of the automobile in real time and transmits the data to the signal middleware interface RTE module in the signal format specified by a preset signal list. The signal middleware interface RTE module integrates all SWC module signals, checks safety functions, determines and gives the signal states (for example signal normal, or signal abnormal and over threshold) and completes signal distribution. The perception fusion module SWC fuses the multi-sensor data in a single-thread data processing mode and, according to the driving area of the vehicle, screens and outputs a certain amount of target-level data having a close influence on driving safety and driving performance. According to the acquired target-level data, the AEB function SWC module controls the vehicle to perform emergency obstacle avoidance for obstacles ahead, and the ACC function SWC controls the vehicle to realize intelligent adaptive cruise.
The data transceiver module further comprises a front-view camera, which can adopt a Horizon dual-core chip. The camera integrates a monocular camera target identification core and provides a data interface; it can also acquire information such as the speed and heading of the own vehicle through the intermediate interface RTE module, and finally sends the target information identified from the image to the back-end perception fusion module through the intermediate interface. The front millimeter wave radar obtains the required signals such as vehicle speed and heading through the intermediate data interface, and sends the targets identified by clustering to the back-end perception fusion module through the intermediate interface.
Furthermore, the front-view camera data transceiver module is integrated on the master chip and the Horizon target recognition module is integrated on the slave chip; the master chip and the slave chip exchange data through the RTE module over a serial peripheral interface, and the road target and obstacle data are encapsulated using the SPI communication protocol, forming signal API (application program interface) functions prefixed with "Rte_fc_" followed by the signal name. The front millimeter wave radar data transceiver module is integrated on the master chip, interacts with the vehicle body system through the vehicle body CAN (controller area network) interface, interacts with the RTE module using the CAN communication protocol, and performs data analysis and packing using the factor, offset, max and min parameters.
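As an illustration of the factor/offset/max/min analysis just described, the following C++ sketch shows dbc-style raw-to-physical decoding; the struct layout and the example signal definition are assumptions for illustration and are not taken from the patent.

```cpp
#include <cstdint>
#include <optional>

// Minimal sketch of dbc-style decoding as described above:
// physical = raw * factor + offset, validated against [min, max].
struct SignalDef {
    double factor;
    double offset;
    double min;
    double max;
};

// Decode one raw CAN signal; return empty optional if the result is out of range.
std::optional<double> decodeSignal(uint32_t raw, const SignalDef& def) {
    const double physical = static_cast<double>(raw) * def.factor + def.offset;
    if (physical < def.min || physical > def.max) {
        return std::nullopt;  // unreasonable value, caller may flag it as abnormal
    }
    return physical;
}

// Example: a hypothetical radar target range signal with 0.1 m resolution.
// const SignalDef kFrTargetRange{0.1, 0.0, 0.0, 250.0};
// auto range_m = decodeSignal(raw_bits, kFrTargetRange);
```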
When the RTE module receives forward millimeter wave radar FR data, the data are encapsulated using the SPI communication protocol to form signal API functions prefixed with "Rte_fr_" followed by the signal name. The RTE module encapsulates all data into standard external interface functions: signals related to the front camera FC are encapsulated in Rte_spi.h, millimeter wave radar FR and vehicle body data signals are encapsulated in Rte_com.h, and signals related to the perception fusion module SDF are encapsulated in Rte_sdf.h.
The RTE module and all SWC modules exchange data through the corresponding interface functions. Data coming from the RTE module are distinguished by the prefix Rte; the same data are redefined to obtain signals meeting the requirements of each SWC module; validity is checked using a rolling counter (check flag bit); data rationality is judged according to the dbc physical signal definition; and the data that pass through the validity and rationality judgment carry one of two states, RTE_E_OK and RTE_E_CHECK.
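The validity and rationality check described here can be sketched as follows; RTE_E_OK and RTE_E_CHECK follow the states named in the text, while the struct layout, the 4-bit rolling counter and the range parameters are illustrative assumptions.

```cpp
#include <cstdint>

// Illustrative sketch of the validity/rationality check described above.
enum RteStatus : uint8_t { RTE_E_OK = 0, RTE_E_CHECK = 1 };

struct RteSignal {
    double value;            // physical value after dbc decoding
    uint8_t rollingCounter;  // counter incremented by the sender each frame
};

// Validity: the rolling counter must advance by exactly one (modulo 16).
// Rationality: the value must stay inside the range defined for the signal.
RteStatus checkSignal(const RteSignal& sig, uint8_t lastCounter,
                      double minValid, double maxValid) {
    const bool counterOk = ((lastCounter + 1u) & 0x0Fu) == (sig.rollingCounter & 0x0Fu);
    const bool rangeOk = (sig.value >= minValid) && (sig.value <= maxValid);
    return (counterOk && rangeOk) ? RTE_E_OK : RTE_E_CHECK;
}
```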
The perception fusion module SWC input end is provided with a sensor module: an abstract sensor receiver parent-class object receives, through a pure virtual receive function, the data collected by all vehicle sensors, and three receiving subclass objects for Ethernet data, CAN data and CANFd data are set according to whether the data come from the Ethernet, the CAN network or the CANFd network; each of the three subclass objects implements the pure virtual function.
The perception fusion module SWC sets an abstract parsing Parser parent class and, according to the data transmission mode, manufacturer and model of each sensor and the CAN dbc or signal list it provides, performs data parsing that converts the raw data acquired by the sensors into a readable and usable data form.
On the basis of data receiving and parsing, a perception preprocessing module is built into the perception fusion module SWC: unreasonable data are removed according to the data characteristics of each sensor, target data fusion and lane line data fusion are then carried out, targets are screened according to the driving area of the vehicle, and targets having a close influence on driving safety and driving performance are output.
The time synchronization module adopts the absolute time of the system as the time axis, selects the output time of any one of the sensors as the reference time, establishes a sequential query mechanism with a sequence container, compares each frame of data of the remaining sensors with the reference time in turn, and takes the frame with the smallest time interval as the synchronized frame. The space synchronization module takes the front bumper center point as the origin and uses an own-vehicle coordinate system with left/front positive and right/back negative as the reference coordinate system for all sensors.
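A minimal sketch of this time-synchronization rule is given below, assuming a simple frame type carrying an absolute timestamp; only the nearest-timestamp search itself comes from the text.

```cpp
#include <cmath>
#include <vector>

// Sketch of the time-synchronization rule: for a reference timestamp, scan the
// buffered frames of another sensor (a sequence container) and pick the frame
// whose timestamp is closest. The frame type is an illustrative assumption.
struct SensorFrame {
    double timestamp;  // absolute system time in seconds
    // ... decoded target data would live here
};

// Returns the index of the frame closest in time to refTime, or -1 if empty.
int findSynchronizedFrame(const std::vector<SensorFrame>& frames, double refTime) {
    int bestIdx = -1;
    double bestDelta = 1e9;
    for (int i = 0; i < static_cast<int>(frames.size()); ++i) {
        const double delta = std::fabs(frames[i].timestamp - refTime);
        if (delta < bestDelta) {
            bestDelta = delta;
            bestIdx = i;
        }
    }
    return bestIdx;
}
```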
The invention also provides an automobile automatic driving control method based on perception fusion: the multi-sensor set formed by the front-view camera, the Horizon target recognition module and the front millimeter wave radar data transceiver module acquires and identifies road target and obstacle data in front of the driving automobile in real time; the signals are integrated, safety functions are checked, signal states are confirmed and signal distribution is completed; the multi-sensor data are fused in a single-thread data processing mode, and a certain amount of target-level data having a close influence on driving safety and driving performance is screened and output according to the driving area of the vehicle. According to the acquired target-level data, the AEB function SWC module controls the vehicle to perform emergency obstacle avoidance for obstacles ahead, and the ACC function SWC controls the vehicle to realize intelligent adaptive cruise.
The front millimeter wave radar data transceiver module interacts with the vehicle body system through the vehicle body CAN network interface, and the RTE module and the FR module exchange data using the CAN communication protocol.
When the RTE module receives FR data, it encapsulates them using the SPI communication protocol to form signal API functions prefixed with "Rte_fr_" followed by the signal name, and the RTE module encapsulates all data into standard external interface functions: FC-related signals are encapsulated in Rte_spi.h, FR and vehicle body data signals are encapsulated in Rte_com.h, and SDF-related signals are encapsulated in Rte_sdf.h.
The RTE module and all SWC modules exchange data through the corresponding interface functions; data coming from the RTE are distinguished by the prefix Rte; the same data are redefined through signal definitions to obtain signals meeting the requirements of each SWC module; valid-bit verification is performed through the signal flag bits; data rationality is judged according to the dbc physical signal definition; and the data that pass the validity and rationality judgment output RTE_E_OK or RTE_E_CHECK.
The perception fusion module SWC input end is provided with a sensor module: an abstract sensor receiver parent-class object receives, through a pure virtual receive function, the data collected by all vehicle sensors, and three receiving subclass objects for Ethernet data, CAN data and CANFd data are set according to whether the data come from the Ethernet, CAN network or CANFd network, each subclass object implementing the pure virtual function. The perception fusion module SWC sets an abstract parsing Parser parent class, which performs data parsing according to the data transmission mode, manufacturer, model and the CAN dbc or signal list provided by the sensor and converts the raw data acquired by the sensor into a readable and usable data form. A perception preprocessing module is built into the perception fusion module SWC: unreasonable data are removed according to the data characteristics of each sensor, target data fusion and lane line data fusion are then carried out, and targets are screened according to the vehicle driving area.
The time synchronization module adopts the absolute time of the system as the time axis, selects the output time of any sensor as the reference time, queries sequentially using a vector container and a map container, compares each frame of data of the remaining sensors with the reference time in turn, and takes the frame with the smallest time interval as the synchronized frame. The space synchronization module takes the front bumper center point as the origin and uses an own-vehicle coordinate system with left/front positive and right/back negative as the reference coordinate system for all sensors.
An abstract sensor module receive function is arranged at the input end of the perception fusion module SWC and is responsible for managing the acquisition and reception of data from all vehicle sensors. To remain applicable to various working conditions, the Ethernet, CAN and CANFd communication protocols are all supported, implemented with the polymorphism concept in C++. The perception fusion module SWC sets an abstract Parser parent class and, according to the data transmission mode, manufacturer, model and the CAN message or signal list provided by the sensor, performs data parsing that converts the raw data acquired by the sensor into a readable and usable data form. Changing an external sensor or its communication protocol only requires changing part of the parsing code, which isolates and decouples data transmission and gives good granularity. On the basis of data receiving and parsing, a perception preprocessing module is built into the perception fusion module SWC: unreasonable data are removed according to the data characteristics of each sensor, the target data fusion module and the lane line data fusion module then run, and the target screening module works according to the vehicle driving area. Time synchronization and space synchronization modules are also provided to solve the time and space synchronization problems of the cameras, radars and other sensors mounted at different positions of the vehicle body.
The invention provides a complete system based on L1/L2 (assisted automatic driving) and realizes functions such as automatic emergency braking and intelligent adaptive cruise. All signals of the system are uniformly managed through the data intermediate interface; the different targets and data acquired by the sensors are unified in format, classified, identified and effectively fused, meeting the data requirements of intelligent automobile driving, control and communication; new automatic driving function interfaces are reserved, providing a specification and the possibility of utilizing the different data of different types of sensing equipment, and offering system support for future function upgrades. Thanks to the simplified architecture, the system can be deployed on low-power, low-cost control chips such as a system on chip; at the same time, the modular design fully decouples the system functions and meets the design idea of atomized automatic driving function services.
Drawings
FIG. 1 is a schematic block diagram of an automatic driving control system of an automobile according to the present invention.
Detailed Description
The invention will be further described and illustrated with reference to the drawings and specific examples. The present embodiment is only used for clearly illustrating and describing the implementation process of the present invention, and the protection scope of the present invention is not limited to the present embodiment, and any technical solution obtained based on simple reasoning and logic combination by those skilled in the art on the basis of the present embodiment shall also belong to the protection scope of the present invention.
Fig. 1 is a schematic block diagram of the system structure of the invention. The system comprises a front-view camera data transceiver module, a front millimeter wave radar data transceiver module, a signal middleware interface RTE module, a perception fusion SWC module, an automatic emergency braking AEB function SWC and an adaptive cruise ACC function SWC. The front-view camera data transceiver module acquires in real time the road target and pedestrian data identified and output by the monocular camera and transmits them according to the signal format of a preset signal list; the module is externally connected to a front-view camera built on a Horizon dual-core chip, which integrates a monocular camera target identification core and provides a data interface. The front millimeter wave radar data transceiver module acquires in real time the road target data sent by the front millimeter wave radar and transmits them according to a predefined signal format. The data intermediate interface RTE module is the central hub of the whole system architecture: it integrates and uniformly manages the signals of all external modules and all functional SWC modules, and all inter-module signal interaction is distributed and managed through the RTE.
The perception fusion module SWC fuses the multi-sensor data to provide the perception data needed for automatic driving and outputs target-level data of relatively high reliability. In line with the compiling environment and hardware characteristics, perception fusion adopts a single-thread data processing mode: the target information detected by the camera and the millimeter wave radar passes through data association, data fusion and target evaluation steps, and the fused target results are output, including comprehensive information such as lateral and longitudinal position, lateral and longitudinal speed, lateral and longitudinal acceleration, heading angle, tracking number, target type and target motion state.
The SDF perception fusion module SWC defines an abstract receiving class pointer according to the actual external hardware sensor and its communication mode, and the pointer is bound to the communication mode in use to obtain the concrete receive implementation. It likewise defines an abstract data parsing class pointer, which is bound to the concrete sensor to obtain the concrete Parser (data parsing) implementation for that data; each sensor is parsed according to the signal list predefined by its manufacturer, and the raw sensor data obtained by receive (data receiving) are finally parsed into the target data required inside the perception module (such as target position, speed, number and motion attribute). On this basis, perception preprocessing is performed on the targets: the parsed target data are filtered to remove obviously abnormal and unreliable data. The obj_fusion (target fusion) module then performs data association on data belonging to the same target from multiple sensors, aggregating the multi-sensor data onto the same target by Mahalanobis-distance and probability-ellipse matching, and performs data fusion according to the output characteristics of each sensor: the FC camera classifies targets well and identifies them accurately, but the lateral position it outputs fluctuates strongly, it is easily affected by weather, and its ability to measure the speed of distant objects is limited; the millimeter wave radar FR, using the Doppler effect, measures fast-moving objects well but is easily disturbed by the environment, and the target position it outputs is unstable. The fusion strategy fully considers the working conditions in which each sensor excels, fuses the associated target attributes into the attributes that are optimal in the current environment, and outputs target attribute information (target position, speed, number, motion attribute, etc.) with higher redundancy and higher precision. The lane fusion module checks the rationality of the cubic-curve parameters of the ground lane lines acquired by the FC camera, filters the accompanying interference in the input according to the principle that a lane line cannot be suddenly distorted and must remain continuous and smooth, and outputs stable and reliable lane line parameters (cubic curve parameters). Finally, the result output by target_selector (fusion target screening) filters out targets and pedestrians outside the ego lane and the left and right adjacent lanes and, taking the own vehicle as the center and combining the requirements of the assisted automatic driving functions, numbers and outputs the surrounding vehicles according to the front, rear, left, right, front-side and rear-side positions.
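The Mahalanobis-distance gating used for the camera/radar association described above can be sketched as follows; the 2D state with independent variances and the chi-square gate value are simplifying assumptions, not the patent's exact formulation.

```cpp
#include <cmath>

// Sketch of Mahalanobis-distance gating for target association. A 2D
// lateral/longitudinal position with independent variances is assumed; a real
// association step would use the full covariance of each track.
struct Target2D {
    double x;     // longitudinal position [m]
    double y;     // lateral position [m]
    double varX;  // position variance along x
    double varY;  // position variance along y
};

// Squared Mahalanobis distance between a camera target and a radar target,
// treating the summed variances as a diagonal covariance.
double mahalanobisSq(const Target2D& cam, const Target2D& radar) {
    const double dx = cam.x - radar.x;
    const double dy = cam.y - radar.y;
    return dx * dx / (cam.varX + radar.varX) + dy * dy / (cam.varY + radar.varY);
}

// Two detections are associated to the same object if the distance falls
// inside the probability ellipse (gate). 9.21 is the chi-square 99% bound
// for 2 degrees of freedom.
bool associate(const Target2D& cam, const Target2D& radar, double gate = 9.21) {
    return mahalanobisSq(cam, radar) < gate;
}
```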
The automatic emergency braking AEB function SWC module realizes emergency obstacle avoidance of the vehicle on a structured road with respect to obstacles ahead, and the adaptive cruise ACC function SWC module realizes intelligent adaptive cruise of the vehicle on a structured road. The respective function logic can be developed with the simulink tool in matlab: AEB and ACC use the target attributes output by the SDF, perform their internal logic operations according to the target positions and motion states, and finally generate vehicle control signals such as electronic steering wheel angle, accelerator pedal control amount and brake pedal control amount, with which the back-end actuators control the vehicle. Other development tools and platforms may also be used.
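The patent does not disclose the internal AEB decision logic; the time-to-collision rule below is only a common illustrative way in which such fused target data could drive a brake-pedal request, with all thresholds assumed.

```cpp
// Generic TTC-based illustration only; not the patent's AEB logic.
// All thresholds and the braking levels are assumptions.
struct FusedTarget {
    double range;     // longitudinal distance to the obstacle ahead [m]
    double relSpeed;  // closing speed, positive when approaching [m/s]
};

// Returns a brake pedal control amount in [0, 1].
double aebBrakeRequest(const FusedTarget& tgt) {
    if (tgt.relSpeed <= 0.0) return 0.0;          // not closing in on the target
    const double ttc = tgt.range / tgt.relSpeed;  // time to collision [s]
    if (ttc > 2.5) return 0.0;                    // no intervention needed
    if (ttc > 1.5) return 0.4;                    // partial braking
    return 1.0;                                   // full emergency braking
}
```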
In this embodiment, an AC chip is combined with a Horizon J2 vision processing chip for target output; L3 automatic driving C++ perception code and function modules such as automatic emergency braking AEB and adaptive cruise ACC developed with matlab are integrated in the open-source compiling environment HighTec; a complete automobile automatic driving framework covering data transceiving, perception processing, target identification and decision planning is constructed through the data intermediate interface RTE; the low-cost assisted automatic driving functions are thus realized autonomously and can be extended and iterated.
Using the HighTec cross-platform compiling environment, the data interaction and verification of all modules are realized through the middleware RTE, achieving modular and standardized management of the system architecture while keeping the data flow smooth. The millimeter wave radar, the front-view camera and the vehicle body information, as external devices, are connected to the RTE as external inputs; the perception fusion SWC module performs target fusion using the FR (front millimeter wave radar), FC (front camera) and vehicle body information, outputs the fused target results and sends them back to the RTE; the automatic emergency braking and adaptive cruise modules mounted on the RTE take the fused targets as input, perform their internal operations, and finally obtain the actual vehicle control variables, which are sent to the RTE and finally received by the vehicle, where ESP vehicle body attitude control is performed, realizing automatic driving.
The front-view camera data transceiver module can be integrated on a master chip (for example an Infineon SAK-TC233LP-32F200N AC chip can be adopted), the data transmission mode being a serial peripheral interface, with the Horizon target identification chip as the slave chip. The data communication of the master and slave chips is exchanged through the data intermediate interface RTE; at the bottom layer, the serial peripheral interface communication protocol is used to encapsulate the camera output targets, finally forming the signal application program interface functions prefixed with "Rte_fc_" (data intermediate interface, front camera target information) followed by the signal name, from which the RTE can obtain the camera target data. The front millimeter wave radar data transceiver module is integrated on the master chip and, unlike the front-view camera, carries out CAN communication and data interaction with the automobile hardware system through a CAN interface. Therefore CAN communication is used between the intermediate data interface RTE and the front millimeter wave radar FR, and signals are sent and received according to the signal definitions of the CAN message signal list.
The data communication between the front-view camera data transceiver module and the Horizon target identification module is realized through the SPI serial peripheral interface; the road target and obstacle data are encapsulated using the SPI communication protocol, forming the signal API (application program interface) functions prefixed with "Rte_fc_" followed by the signal name. The front millimeter wave radar data transceiver module carries out data interaction with the vehicle body system through the vehicle body CAN network interface; the CAN communication protocol is used between the RTE module and the FR module, and the offset and scale factor parameters are used for data analysis and packing.
When the RTE module receives the FR data, it encapsulates them using the SPI communication protocol to form the signal API functions prefixed with "Rte_fr_" (data intermediate interface, millimeter wave radar target information) followed by the signal name. The RTE module encapsulates all data into standard external interface functions: the FC-related signals are encapsulated in Rte_spi.h (front camera intermediate interface data), the FR and vehicle body data signals are encapsulated in Rte_com.h (vehicle body intermediate interface data), and the SDF-related signals are encapsulated in Rte_sdf.h (fusion data intermediate interface data).
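The following header sketch illustrates the kind of standard external interface functions the RTE exposes; the concrete signal names and return types are hypothetical, and only the Rte_fc_/Rte_fr_ prefix convention and the grouping into Rte_spi.h, Rte_com.h and Rte_sdf.h come from the text.

```cpp
// Declarations only; signal names and types are hypothetical illustrations of
// the prefix convention and header grouping described above.

// --- Rte_spi.h : front camera (FC) intermediate interface data --------------
float Rte_fc_TargetLongPos(void);   // longitudinal position of an FC target
float Rte_fc_TargetLatPos(void);    // lateral position of an FC target

// --- Rte_com.h : millimeter wave radar (FR) and vehicle body data -----------
float Rte_fr_TargetRange(void);     // range of an FR target
float Rte_fr_TargetRelSpeed(void);  // relative speed of an FR target
float Rte_VehicleSpeed(void);       // ego vehicle speed from the body CAN

// --- Rte_sdf.h : perception fusion (SDF) intermediate interface data --------
float Rte_sdf_FusedTargetLongPos(void);  // fused target longitudinal position
```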
The RTE module and all SWC modules exchange data through the corresponding interface functions; data coming from the RTE are distinguished by the Rte prefix, and the same data are redefined using variable name definitions to obtain signals meeting the requirements of each SWC module. The valid bit is checked using the check flag bit, data rationality is judged according to the physical signal definition of the signal list, and the data that pass the validity and rationality judgment output one of two states, RTE_E_OK (signal normal) and RTE_E_CHECK (signal abnormal).
The signal distribution of the RTE may specifically proceed as follows: after the RTE receives the FR data, it exchanges data with the other components through the SPI; the bottom layer uses the SPI communication protocol to encapsulate each signal, finally forming the signal API functions prefixed with "Rte_fr_" (data intermediate interface, millimeter wave radar target information) followed by the signal name, from which the millimeter wave target data can be obtained.
The RTE middleware designs the corresponding SWC components according to the functional requirements and constrains the signals each component may send and receive, so as to realize unified interface management.
In the RTE, all required signals are encapsulated with the open-source compiler software HighTec so that they become standard external interface functions. The signals related to the front camera FC are encapsulated in the Rte_spi.h (front camera intermediate interface data) file, the millimeter wave radar FR output targets and the vehicle body data signals are encapsulated in the Rte_com.h (vehicle body intermediate interface data) header file, and the signals related to the perception fusion output targets are encapsulated in the Rte_sdf.h (fusion data intermediate interface data) header file; the control function end modules call signals from this encapsulation library for use.
To keep the signal interfaces and the design pattern unified, the data interaction between the data intermediate interface module RTE and the signals of all functional modules uses the corresponding intermediate interface data: each target signal output by the front camera FC has an external interface function so that the back end can obtain the signal. For example, if the perception fusion module SDF needs the target longitudinal position output by the front camera, the SDF directly calls the signal acquisition function already encapsulated in Rte_spi.h, and the return value of the function is the target longitudinal position; other signals are similar. At the same time, the data interface module also performs a rationality judgment when transmitting each signal: besides the physical definition it refers to, each signal carries in its function a pair of signal rationality flags, RTE_E_OK (signal normal) and RTE_E_CHECK (signal abnormal). In the data intermediate module RTE, corresponding rationality judgment logic is written for each signal: if the received signal exceeds its threshold, RTE_E_CHECK is set to 1 and RTE_E_OK is set to 0; otherwise RTE_E_CHECK is set to 0 and RTE_E_OK is set to 1.
The perception fusion module SWC contains the L3 automatic driving fusion-related functions. To let the same perception code adapt to the communication protocols required by various types of sensors, the fusion input end uses C++ polymorphism: a receiving class receiver (receiving abstract class) is established in a parent abstract layer, and the concrete receiving modes and signal parsing are implemented in subclasses according to the sensors actually connected. The obtained target signals output by the sensors are then used by the back-end perception fusion code for data association and fusion. The sensor communication protocols fall mainly into Ethernet, CAN (controller area network) and CANFd (CAN with flexible data rate in the frame data segment); the three subclasses inherit from the parent abstract receiving class, and the functions of the communication protocols each subclass requires are realized polymorphically in the concrete receive implementations.
The perception fusion module SWC input end is provided with a Sensor module: an abstract Sensor receiver parent-class object receives, through a pure virtual receive function, the data collected by all vehicle sensors, and three receiving subclass objects, socket_sensor_receptions, can_sensor_receptions and canfd_receptions, are set according to whether the data come from the Ethernet, the CAN network or the CANFd network; each subclass object implements the virtual receive function.
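A minimal C++ sketch of this polymorphic receiving layer is given below; the class and member names are illustrative, since the text fixes only the abstract parent with a pure virtual receive function and the Ethernet/CAN/CANFd subclasses.

```cpp
#include <cstdint>
#include <memory>
#include <vector>

// Illustrative sketch of the receiving layer: an abstract receiver parent
// class with a pure virtual receive function and one subclass per network.
struct RawSensorMsg {
    std::vector<uint8_t> bytes;  // raw payload handed to the parsing stage
};

class SensorReceiver {
public:
    virtual ~SensorReceiver() = default;
    virtual bool receive(RawSensorMsg& out) = 0;  // pure virtual receive
};

class SocketSensorReceiver : public SensorReceiver {  // Ethernet data
public:
    bool receive(RawSensorMsg& out) override { /* read UDP/TCP socket into out */ return false; }
};

class CanSensorReceiver : public SensorReceiver {     // CAN data
public:
    bool receive(RawSensorMsg& out) override { /* read CAN frame into out */ return false; }
};

class CanFdSensorReceiver : public SensorReceiver {   // CANFd data
public:
    bool receive(RawSensorMsg& out) override { /* read CANFd frame into out */ return false; }
};

// The fusion input end holds a pointer to the abstract class and binds it to
// the communication mode actually in use, as the description states.
std::unique_ptr<SensorReceiver> makeReceiverForCan() {
    return std::make_unique<CanSensorReceiver>();
}
```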
The perception fusion module SWC sets an abstract parsing Parser parent class and, according to the data transmission mode, manufacturer, model and the CAN signal list provided by the sensor, converts the raw data acquired by the sensor into a readable and usable data form. On the basis of data receiving and parsing, a perception preprocessing module is built into the perception fusion module SWC: unreasonable data are removed according to the data characteristics of each sensor, the obj_fusion module and the lane line data fusion module then run, and target screening target_selector is carried out according to the driving area of the vehicle.
On the basis of correct reception of the raw sensor data, the raw data are stored in a RawSensorMsg (sensor raw data) structure for parsing by the back end. By setting an abstract parsing parent class Parser, the parsing work of all raw data is uniformly managed: data parsing is realized according to the data transmission mode, manufacturer, model and the signal list provided by each sensor, the raw data acquired by the sensors are converted into a readable and usable data form, and the results are finally stored uniformly in a predefined SensorMsg (sensor internal data) structure. Changing an external sensor or its communication protocol only requires changing part of the parsing code, which isolates and decouples data transmission and gives good granularity.
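The parsing layer can be sketched as follows, assuming simple RawSensorMsg and SensorMsg layouts; the field names and the radar subclass are illustrative assumptions, not the patent's concrete definitions.

```cpp
#include <cstdint>
#include <vector>

// Illustrative Parser layer: one abstract parent, one subclass per sensor,
// converting RawSensorMsg into the internal SensorMsg form.
struct RawSensorMsg {           // raw bytes as delivered by a receiver
    std::vector<uint8_t> bytes;
};

struct TargetMsg {              // one parsed target, as used inside the SDF
    double longPos;             // longitudinal position [m]
    double latPos;              // lateral position [m]
    double relSpeed;            // relative speed [m/s]
};

struct SensorMsg {              // sensor internal data structure
    std::vector<TargetMsg> targets;
    double timestamp;
};

class Parser {
public:
    virtual ~Parser() = default;
    virtual SensorMsg parse(const RawSensorMsg& raw) = 0;
};

// A radar parser would decode according to the manufacturer's signal list.
class FrRadarParser : public Parser {
public:
    SensorMsg parse(const RawSensorMsg& raw) override {
        SensorMsg msg{};
        // ... decode raw.bytes using the factor/offset from the dbc definition
        (void)raw;
        return msg;
    }
};
```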
On the basis of the sensor data, the perception algorithm fusion module SDF realizes the association and fusion of the multi-sensor data: a perception preprocessing module preprocesses the data according to the data characteristics of each sensor and eliminates unreasonable data; the obj_fusion module and the lane line data fusion module then let the output characteristics of the sensors complement each other's weaknesses; finally, vehicle-region screening target_selector is performed according to the driving area of the vehicle, and a certain number (e.g. 0-18) of targets having a close influence on driving safety and driving performance is output as the final result.
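A sketch of the target_selector region screening follows, assuming a 3.5 m lane width and the upper bound of 18 targets that the text gives as an example; the exact screening rule is not specified by the patent.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Keep only targets in the ego lane and the two adjacent lanes, then cap the
// output at a fixed count. Lane width and cap are illustrative assumptions.
struct FusedObj {
    double longPos;  // longitudinal position in the own-vehicle frame [m]
    double latPos;   // lateral position, left positive [m]
};

std::vector<FusedObj> selectTargets(const std::vector<FusedObj>& fused,
                                    double laneWidth = 3.5,
                                    std::size_t maxTargets = 18) {
    std::vector<FusedObj> kept;
    for (const auto& obj : fused) {
        // ego lane plus left and right adjacent lanes: |lat| < 1.5 lane widths
        if (std::fabs(obj.latPos) < 1.5 * laneWidth && obj.longPos > 0.0) {
            kept.push_back(obj);
        }
        if (kept.size() >= maxTargets) break;
    }
    return kept;
}
```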
The result is target-level data of relatively high reliability, and the output accuracy is higher than that of any single sensor. The time synchronization problem of the different sensors is also considered: the time synchronization module adopts the absolute time of the system as the time axis, selects the output time of one sensor as the reference time, queries sequentially using a data container, compares each frame of data of the remaining sensors with the reference time in turn, and takes the frame with the smallest time interval as the synchronized frame. The outputs of the sensors mounted at different positions of the vehicle body must also be spatially synchronized: the space synchronization module takes the front bumper center point as the origin and uses an own-vehicle coordinate system with left/front positive and right/back negative as the reference coordinate system for all sensors.
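The spatial synchronization amounts to transforming each detection into the own-vehicle frame anchored at the front bumper center; the sketch below assumes per-sensor mounting offsets and yaw as calibration inputs, which the patent does not enumerate.

```cpp
#include <cmath>

// Transform a sensor detection into the own-vehicle frame whose origin is the
// front bumper center point. Mounting offsets and yaw are assumed calibration
// values, given here only for illustration.
struct Point2D {
    double x;  // longitudinal, forward positive [m]
    double y;  // lateral, left positive [m]
};

struct SensorMount {
    double offsetX;  // sensor position relative to the front bumper center [m]
    double offsetY;
    double yaw;      // sensor mounting yaw relative to the vehicle axis [rad]
};

Point2D toVehicleFrame(const Point2D& inSensor, const SensorMount& mount) {
    const double c = std::cos(mount.yaw);
    const double s = std::sin(mount.yaw);
    return Point2D{
        mount.offsetX + c * inSensor.x - s * inSensor.y,
        mount.offsetY + s * inSensor.x + c * inSensor.y,
    };
}
```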
The automatic emergency braking AEB, adaptive cruise ACC and other functional software modules SWC use, through the intermediate interface RTE, the fused target results output by the perception fusion module SDF; after the internal operations of the AEB and ACC modules, the actual vehicle control quantities are obtained, and the vehicle body control signals are sent to the vehicle end through the RTE, realizing automatic driving.
Claims (15)
1. An automotive autopilot control system based on perceptual fusion, comprising: the system comprises a data transceiver module, a signal middleware interface RTE module, a perception fusion SWC module, an AEB function SWC module and an ACC function SWC module, wherein the data transceiver module acquires road target and obstacle data in front of the automobile in real time and transmits the data to the signal middleware interface RTE module according to a preset dbc signal format; the signal middleware interface RTE module integrates all received SWC module signals, performs safety function verification, determines signal states and completes signal distribution; the sensing fusion module SWC fuses the multi-sensor data by adopting a single-thread data processing mode, a C++ polymorphism mode is utilized by a fused input end, a father class abstract layer is established to receive abstract classes, specific receiving modes and signal analysis are respectively realized in subclasses according to different actually accessed sensors, and target-level data which has close influence on driving safety and driving performance is screened and output according to a vehicle driving area; and the AEB function SWC module controls the vehicle to emergently avoid the obstacle to the front obstacle according to the target-level data, and the ACC function SWC controls the vehicle to intelligently and adaptively cruise.
2. The system of claim 1, wherein the data transceiver module comprises a front-view camera data transceiver module and a front millimeter wave radar data transceiver module; the front-view camera data transceiver module is integrated on a master chip and the Horizon target recognition module is integrated on a slave chip; the master chip and the slave chip exchange data through the RTE module over a serial peripheral interface, and the road target and obstacle data are encapsulated using the SPI communication protocol, forming signal API functions prefixed with "Rte_fc_" followed by the signal name; the front millimeter wave radar data transceiver module is integrated on the master chip, performs data interaction with the vehicle body system through the vehicle body CAN network interface, interacts with the RTE module using the CAN communication protocol, and performs data analysis and packing using the factor, offset, max and min parameters.
3. The system of claim 1 wherein FR data received by the RTE module is encapsulated using the SPI communication protocol to form signal API functions prefixed with a "rte_fr_signal name" as standard external interface functions, FC related signals are encapsulated in rte_spi.h, FR and body data signals are encapsulated in rte_com.h, and SDF related signals are encapsulated in rte_sdf.h.
4. A system according to any one of claims 1-3, wherein the RTE module and all SWC modules exchange data through corresponding interface functions; the RTE module data are distinguished by the prefix Rte; the same data are redefined by typedef to obtain signals meeting the requirements of the SWC modules; the rolling counter is used for validity verification; the dbc signal definition is used for data rationality judgment; and the data that pass the validity and rationality judgment output the signal states normal and abnormal.
5. A system according to any one of claims 1-3, wherein a sensor module is arranged at the input end of the perception fusion module SWC: an abstract receiver parent-class object receives, through a pure virtual receive function, the data collected by all vehicle sensors, and three receiving subclass objects for Ethernet data, CAN data and CANFd data are set according to whether the data come from the Ethernet, CAN network or CANFd network, each of the three subclass objects implementing the pure virtual function.
6. The system of claim 5, wherein the sensor fusion module SWC sets an abstract parsing parent class, and performs data parsing according to a data transmission mode, a manufacturer, a model, and a CAN dbc or signal list provided by the sensor, so as to convert raw data collected by the sensor into a readable and usable data form.
7. The system according to claim 5, wherein the perception fusion module SWC has a perception preprocessing module built therein, which eliminates unreasonable data according to the data characteristics of each sensor, performs target data fusion, performs lane line data fusion, performs target screening according to the vehicle driving area, and outputs targets having a close influence on driving safety and driving performance.
8. The system of claim 7, further comprising time synchronization and space synchronization modules for the cameras, radars and other sensors mounted at different positions of the vehicle body; the time synchronization module uses the absolute time of the system as the time axis, selects the output time of any one of the sensors as the reference time, queries sequentially using a sequence container, compares each frame of data of the remaining sensors with the reference time in turn, and takes the frame with the smallest time interval as the synchronized frame; the space synchronization module takes the vehicle front bumper center point as the origin and uses an own-vehicle coordinate system with left/front positive and right/back negative as the reference coordinate system for all sensors.
9. An automobile automatic driving control method based on perception fusion, characterized by comprising a multi-sensor set formed by a front-view camera, Horizon target recognition and a front millimeter wave radar data transceiver module, wherein the multi-sensor set acquires and identifies road target and obstacle data in front of the driving automobile in real time; the signals are integrated, safety functions are checked, signal states are confirmed and signal distribution is completed; the multi-sensor data are fused in a single-thread data processing mode, the fusion input end uses C++ polymorphism to establish a parent-class abstract receiving class and implements the concrete receiving modes and signal parsing in subclasses according to the sensors actually connected, and a certain amount of target-level data having a close influence on driving safety and driving performance is screened and output according to the driving area of the vehicle; and the AEB function SWC module controls the vehicle to perform emergency obstacle avoidance for obstacles ahead according to the target-level data, and the ACC function SWC controls the vehicle to realize intelligent adaptive cruise.
10. The method of claim 9, wherein the data communication between the front-view camera data transceiver module and the horizon object identification module is via an SPI serial peripheral interface, the road object and the obstacle data are encapsulated using an SPI communication protocol to form respective signal API functions prefixed with a "rte_fc_signal name", the front millimeter wave radar data transceiver module is via a car body CAN network interface for data interaction with a car body system, and the Rte module and the FR module are via a CAN communication protocol for data interaction.
11. The method of claim 9 wherein the RTE module encapsulates the received FR data using the SPI communication protocol to form respective signal API functions prefixed with a "rte_fr_signal name", and encapsulates all data to make it a standard external interface function, wherein FC related signals are encapsulated in rte_spi.h, FR related signals and body data signals are encapsulated in rte_com.h, and SDF related signals are encapsulated in rte_sdf.h.
12. The method of claim 11, wherein the RTE module and all SWC modules interact with each other by corresponding interface functions, the data from the RTE is distinguished by a prefix RTE, the same data is redefined to obtain a signal meeting the requirements of the SWC modules, the signal flag bit is used for checking the validity bit, the data rationality is determined according to the dbc physical signal, and two states of rte_e_ok and rte_e_check are output by the data judged by the validity and rationality.
13. The method of claim 11, wherein a sensor module is arranged at the input end of the perception fusion module SWC to receive, through the pure virtual receive function, the data collected by all sensors, and three receiving subclass objects for Ethernet data, CAN data and CANFd data are set according to whether the data come from the Ethernet, CAN network or CANFd network, the three subclass objects each implementing the pure virtual function.
14. The method of claim 13, wherein the perception fusion module SWC sets an abstract parsing parent class, performs data parsing according to the data transmission mode, manufacturer, model and the CAN dbc or signal list provided by the sensor, and converts the raw data collected by the sensor into a readable and usable data form; a perception preprocessing module is arranged in the perception fusion module SWC, unreasonable data are removed according to the data characteristics of each sensor, then target data fusion and lane line data fusion are carried out, and target screening is carried out according to the vehicle driving area.
15. The method according to any one of claims 9 to 14, wherein the time synchronization module uses the absolute time of the system as the time axis, selects the output time of any sensor as the reference time, queries sequentially using a sequence container, compares each frame of data of the remaining sensors with the reference time in turn, and takes the frame with the smallest time interval as the synchronized frame; the space synchronization module takes the front bumper center point as the origin and uses an own-vehicle coordinate system with left/front positive and right/back negative as the reference coordinate system for all sensors.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111155811.5A CN113734166B (en) | 2021-09-30 | 2021-09-30 | Automatic automobile driving control system and method based on sensing fusion SWC |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111155811.5A CN113734166B (en) | 2021-09-30 | 2021-09-30 | Automatic automobile driving control system and method based on sensing fusion SWC |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113734166A CN113734166A (en) | 2021-12-03 |
CN113734166B true CN113734166B (en) | 2023-09-22 |
Family
ID=78741913
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111155811.5A Active CN113734166B (en) | 2021-09-30 | 2021-09-30 | Automatic automobile driving control system and method based on sensing fusion SWC |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113734166B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114460583A (en) * | 2022-01-22 | 2022-05-10 | 重庆长安汽车股份有限公司 | Active security software implementation method and system based on system on chip |
CN114684154B (en) * | 2022-03-24 | 2024-06-21 | 重庆长安汽车股份有限公司 | Method for visually detecting target course angle based on Lei Dadian cloud correction and storage medium |
CN115042784A (en) * | 2022-07-20 | 2022-09-13 | 重庆长安汽车股份有限公司 | Control method and device for automobile adaptive cruise system, vehicle and storage medium |
CN115497195A (en) * | 2022-08-09 | 2022-12-20 | 重庆长安汽车股份有限公司 | Data control system, method, device and medium for realizing driving record |
CN118144806B (en) * | 2024-05-06 | 2024-07-26 | 北京茵沃汽车科技有限公司 | Camera sensor and fault detection method thereof |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20100070590A (en) * | 2008-12-18 | 2010-06-28 | 재단법인대구경북과학기술원 | Method for authenticating control signal of vehicle's electronic control unit, control signal transmitter and ecu using the same |
CN103318176A (en) * | 2013-06-28 | 2013-09-25 | 郑州宇通客车股份有限公司 | Coach self-adaptive cruise control system and control method thereof |
CN108445885A (en) * | 2018-04-20 | 2018-08-24 | 鹤山东风新能源科技有限公司 | A kind of automated driving system and its control method based on pure electric vehicle logistic car |
CN110147109A (en) * | 2019-05-21 | 2019-08-20 | 重庆长安汽车股份有限公司 | A kind of archetype development system of automated driving system |
CN111469838A (en) * | 2020-04-22 | 2020-07-31 | 芜湖伯特利汽车安全系统股份有限公司 | Collaborative ACC/AEB decision management system based on Internet of vehicles and vehicle |
CN111634290A (en) * | 2020-05-22 | 2020-09-08 | 华域汽车系统股份有限公司 | Advanced driving assistance forward fusion system and method |
CN113313154A (en) * | 2021-05-20 | 2021-08-27 | 四川天奥空天信息技术有限公司 | Integrated multi-sensor integrated automatic driving intelligent sensing device |
Also Published As
Publication number | Publication date |
---|---|
CN113734166A (en) | 2021-12-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113734166B (en) | Automatic automobile driving control system and method based on sensing fusion SWC | |
CN111123933B (en) | Vehicle track planning method and device, intelligent driving area controller and intelligent vehicle | |
US10139834B2 (en) | Methods and systems for processing local and cloud data in a vehicle and a cloud server for transmitting cloud data to vehicles | |
US7974748B2 (en) | Driver assistance system with vehicle states, environment and driver intention | |
CN106537180B (en) | Method for mitigating radar sensor limitations with camera input for active braking of pedestrians | |
US20230286519A1 (en) | Endogenic protection method for function security and network security of sensing and decision-making module of intelligent connected vehicle | |
CN108334087B (en) | Software definition-based platform advanced driving assistance system | |
CN109298713B (en) | Instruction sending method, device and system and automatic driving vehicle | |
US11897511B2 (en) | Multi-hypothesis object tracking for automated driving systems | |
US11631325B2 (en) | Methods and systems for traffic light state monitoring and traffic light to lane assignment | |
CN111169443A (en) | Anti-collision automatic emergency braking system and method for bus | |
WO2020038446A1 (en) | Vehicle controller, vehicle control method, and vehicle | |
WO2023162491A1 (en) | Distributed processing of vehicle sensor data | |
CN116901875A (en) | Perception fusion system, vehicle and control method | |
US20230339486A1 (en) | Autonomous driving system and autonomous vehicle | |
WO2023201563A1 (en) | Control method and apparatus, and means of transportation | |
CN116546067A (en) | Internet of vehicles formation method, system and medium based on hong Mongolian system | |
KR20240015793A (en) | Vehicle data collection and transmission apparatus and method | |
CN114104004A (en) | Method and device for taking over longitudinal control of vehicle by driver, automobile and computer readable storage medium | |
Sathiyan et al. | A comprehensive review on cruise control for intelligent vehicles | |
CN109017634B (en) | Vehicle-mounted network system | |
CN112351407A (en) | AEB strategy method based on 5G hierarchical decision | |
US20240087445A1 (en) | Method for providing an object message about an object, which is recognized in the surroundings of a road user, in a communication network for the communication with other road users | |
WO2024043011A1 (en) | Verification on prediction function of vehicle | |
Iclodean et al. | Autonomous Driving Systems |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||