CN113884123A - Sensor calibration method and device, vehicle and storage medium - Google Patents
- Publication number: CN113884123A (application CN202111117177.6A)
- Authority
- CN
- China
- Prior art keywords
- image
- sensor
- sub
- target
- comparison images
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01D—MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
- G01D18/00—Testing or calibrating apparatus or arrangements provided for in groups G01D1/00 - G01D15/00
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
Abstract
The embodiment of the invention discloses a sensor verification method and device, a vehicle and a storage medium. The method comprises the following steps: acquiring a target image collected by a target sensor; acquiring reference images collected by at least two reference sensors to obtain at least two reference images, wherein each reference sensor shares a coincident acquisition region with the target sensor; determining a first sub-image corresponding to each coincident acquisition region from the target image and a second sub-image corresponding to the same region from the reference image, obtaining at least two groups of comparison images, wherein each group comprises the first sub-image and the second sub-image for one coincident acquisition region; judging whether the first sub-image and the second sub-image of each group of comparison images are the same; and when the judgment results of every group of comparison images differ, determining that the target sensor is abnormal. Whether a sensor is abnormal is thus verified accurately purely by adding software logic, improving the operating reliability of vehicle sensors and thereby the safety of the vehicle.
Description
Technical Field
The invention relates to the technical field of driver assistance, and in particular to a sensor verification method and device, a vehicle and a storage medium.
Background
With the development of the automobile industry and advances in human-computer interaction technology, intelligent vehicles have increasingly entered people's field of view. At present, most intelligent vehicles are equipped with an Advanced Driver Assistance System (ADAS). During driving, the ADAS uses the various sensors installed on the vehicle to sense the surrounding environment at all times, collect data, and identify, detect and track static and moving objects. Combined with navigation map data, it performs systematic computation and analysis, so that drivers become aware of possible dangers in advance, effectively improving the comfort and safety of driving.
Because the ADAS depends on its sensors, the reliability requirements on them are extremely high; in actual use, however, some sensors still fail. Although some sensors can judge the running state of their own hardware through hardware self-diagnosis, special conditions cannot be fully diagnosed by hardware alone, for example a CMOS camera whose output freezes (hangs) on one frame. If software then executes on the erroneous image signal delivered by the camera, a vehicle safety problem can result.
Disclosure of Invention
The embodiment of the invention discloses a sensor verification method and device, a vehicle and a storage medium, which are used for improving the operating reliability of vehicle sensors so as to improve the safety of the vehicle.
The first aspect of the embodiments of the present invention discloses a sensor verification method, which may include:
acquiring a target image acquired by a target sensor;
acquiring reference images collected by at least two reference sensors to obtain at least two reference images, wherein each reference sensor shares a coincident acquisition region with the target sensor;
determining a first sub-image corresponding to each coincident acquisition region from the target image and a second sub-image corresponding to the same region from the reference image, obtaining at least two groups of comparison images, wherein each group of comparison images comprises the first sub-image and the second sub-image corresponding to one coincident acquisition region;
judging whether the first sub-image and the second sub-image of each group of the comparison images are the same;
and when the judgment results of each group of comparison images are different, determining that the target sensor is abnormal.
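The steps of the first aspect can be sketched as follows. This is a minimal illustration in Python, assuming 2-D grayscale images as NumPy arrays, known pixel-box coordinates for each coincident acquisition region, and an illustrative mean-difference threshold; none of these specifics are fixed by the method itself.

```python
import numpy as np

def check_sensor(target_img, ref_imgs, overlaps, diff_threshold=30.0):
    """Return True when the target sensor is judged abnormal.

    target_img : 2-D uint8 array from the target sensor.
    ref_imgs   : one 2-D array per reference sensor (at least two).
    overlaps   : per reference sensor, a pair of pixel boxes
                 (target_box, ref_box), each (x0, y0, x1, y1), marking the
                 coincident acquisition region in the two images.
    """
    group_differs = []
    for ref_img, (tb, rb) in zip(ref_imgs, overlaps):
        first_sub = target_img[tb[1]:tb[3], tb[0]:tb[2]]   # from the target image
        second_sub = ref_img[rb[1]:rb[3], rb[0]:rb[2]]     # from the reference image
        mean_diff = np.mean(np.abs(first_sub.astype(float)
                                   - second_sub.astype(float)))
        group_differs.append(mean_diff >= diff_threshold)
    # Abnormal only when EVERY comparison group differs: a single differing
    # group could just as well mean that particular reference sensor failed.
    return all(group_differs)
```

Requiring every group to differ is what distinguishes a faulty target sensor from a faulty reference sensor, which would corrupt only one group.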
As an optional implementation manner, in the first aspect of the embodiment of the present invention, before determining that the target sensor is abnormal when the judgment results of each group of comparison images differ, the method further includes:
when the judgment results of each group of comparison images are different, judging whether the pixel value of the target image is matched with a preset pixel value;
and if so, executing the step of determining that the target sensor is abnormal.
As an optional implementation manner, in the first aspect of this embodiment of the present invention, the method further includes:
when the judgment results of all the comparison images are the same, or only some (but not all) of the comparison images are judged to be different, judging whether the pixel value of the target image matches the preset pixel value;
if not, determining that the target sensor is normal;
and if it matches, determining that the target sensor is abnormal.
As an optional implementation manner, in the first aspect of the embodiment of the present invention, the determining whether the first sub-image and the second sub-image of each group of the comparison images are the same includes:
comparing the difference value between the first sub-image and the second sub-image of each group of comparison images and judging whether the difference value is not less than a difference threshold; or
Comparing whether the characteristic parameters of the first sub-image and the second sub-image in each group of comparison images are the same or not; or
And comparing whether the laser point cloud data corresponding to the first sub-image and the laser point cloud data corresponding to the second sub-image in each group of comparison images are the same or not.
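The first two comparison criteria can be sketched as follows. The difference threshold, the choice of a normalised intensity histogram as the characteristic parameter, and its threshold are all illustrative assumptions, since the description does not fix the feature used; the laser point cloud criterion is omitted.

```python
import numpy as np

def differ_by_difference_value(sub1, sub2, diff_threshold=30.0):
    """Criterion 1: mean absolute pixel difference against a threshold."""
    diff = np.mean(np.abs(sub1.astype(float) - sub2.astype(float)))
    return diff >= diff_threshold

def differ_by_feature(sub1, sub2, bins=16, hist_threshold=0.5):
    """Criterion 2: compare a characteristic parameter of the two sub-images.
    A normalised intensity histogram is used here as the feature; this is an
    illustrative choice, not one the description prescribes."""
    h1 = np.histogram(sub1, bins=bins, range=(0, 256))[0] / sub1.size
    h2 = np.histogram(sub2, bins=bins, range=(0, 256))[0] / sub2.size
    return np.abs(h1 - h2).sum() >= hist_threshold  # L1 distance, range [0, 2]
```

A feature-based criterion tolerates the parallax and exposure differences that an exact pixel comparison between two physically separate cameras would otherwise flag.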
As an optional implementation manner, in the first aspect of the embodiment of the present invention, after determining that the target sensor is abnormal, the method further includes:
and uploading abnormal information for indicating the abnormality of the target sensor to a cloud server, and performing early warning indication.
As an optional implementation manner, in the first aspect of the embodiment of the present invention, before the acquiring the target image acquired by the target sensor, the method further includes:
detecting a second sensor on the vehicle, wherein the acquisition region of the second sensor is adjacent to, but does not intersect, the acquisition region of a first sensor;
and adjusting the acquisition angles of the first sensor and the second sensor so that their acquisition regions share a coincident acquisition region, wherein after the adjustment each sensor on the vehicle shares a coincident acquisition region with at least two other sensors.
As an optional implementation manner, in the first aspect of the embodiment of the present invention, before the acquiring the target image acquired by the target sensor, the method further includes:
adjusting all sensors on the vehicle so that the acquisition frequency of all sensors is the same.
The second aspect of the embodiments of the present invention discloses a sensor verification apparatus, which may include:
the acquisition module is used for acquiring a target image acquired by the target sensor;
the acquisition module is further configured to acquire reference images acquired by at least two reference sensors to obtain at least two reference images, and each reference sensor and the target sensor respectively have a coincident acquisition region;
the image processing module is used for determining a first sub-image corresponding to the overlapping acquisition area from the target image and determining a second sub-image corresponding to the overlapping acquisition area from the reference image to obtain at least two groups of comparison images, wherein each group of comparison images comprises the first sub-image and the second sub-image corresponding to the overlapping acquisition area;
the judging module is used for judging whether the first sub-image and the second sub-image of each group of comparison images are the same;
and the abnormality determining module is used for determining that the target sensor is abnormal when the judging module judges that the judging results of each group of the comparison images are different.
As an optional implementation manner, in the second aspect of the embodiment of the present invention, the judging module is further configured to judge, when it determines that the judgment results of each group of comparison images differ, whether the pixel value of the target image matches a preset pixel value, and to trigger the abnormality determining module to determine that the target sensor is abnormal when they match.
As an optional implementation manner, in the second aspect of the embodiment of the present invention, the judging module is further configured to judge whether the pixel value of the target image matches the preset pixel value when the judgment results of all the comparison images are the same or only some of the comparison images differ; if it does not match, the abnormality determination module determines that the target sensor is normal; if it matches, the abnormality determination module determines that the target sensor is abnormal.
As an optional implementation manner, in the second aspect of the embodiment of the present invention, the manner that the determining module is configured to determine whether the first sub-image and the second sub-image of each group of the comparison images are the same is specifically:
judging whether the first sub-image and the second sub-image are the same according to the difference value of the first sub-image and the second sub-image of each group of comparison images; or
Comparing whether the characteristic parameters of the first sub-image and the second sub-image in each group of comparison images are the same or not; or
And comparing whether the laser point cloud data corresponding to the first sub-image and the laser point cloud data corresponding to the second sub-image in each group of comparison images are the same or not.
As an optional implementation manner, in the second aspect of the embodiment of the present invention, the apparatus further includes:
and the early warning module is used for uploading abnormal information for indicating the abnormality of the target sensor to a cloud server and carrying out early warning indication after the abnormality determining module determines that the target sensor is abnormal.
A third aspect of an embodiment of the present invention discloses a vehicle, which may include:
a memory storing executable program code;
a processor coupled with the memory;
the processor calls the executable program code stored in the memory to execute the sensor verification method disclosed by the first aspect of the embodiment of the invention.
A fourth aspect of the embodiments of the present invention discloses a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, performs the steps of any one of the methods of the first aspect of the embodiments of the present invention.
A fifth aspect of embodiments of the present invention discloses a computer program product, which, when run on a computer, causes the computer to perform some or all of the steps of any one of the methods of the first aspect.
A sixth aspect of the embodiments of the present invention discloses an application publishing platform configured to publish a computer program product which, when run on a computer, causes the computer to perform part or all of the steps of any one of the methods of the first aspect.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
in the embodiment of the invention, a target image collected by a target sensor is acquired, and reference images collected by at least two reference sensors are acquired to obtain at least two reference images, each reference sensor sharing a coincident acquisition region with the target sensor. A first sub-image corresponding to each coincident acquisition region is determined from the target image and a second sub-image from the corresponding reference image, yielding at least two groups of comparison images, each group containing the first and second sub-images for one coincident acquisition region. Whether the first and second sub-images of each group are the same is then judged; if every group differs, the target sensor is determined to be abnormal. By having the sensors verify one another through the images they collect, the embodiment overcomes the shortcomings of hardware diagnosis in the conventional technology: whether a sensor is abnormal is verified accurately purely by adding software logic. The approach is particularly suited to diagnosing conditions such as a hung sensor, improves the operating reliability of vehicle sensors, and thereby improves vehicle safety.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a schematic view of an application scenario of a sensor verification method according to an embodiment of the present invention;
Fig. 2 is a schematic flow chart of a sensor verification method according to a first embodiment of the present invention;
Fig. 3 is a schematic flow chart of a sensor verification method according to a second embodiment of the present invention;
Fig. 4 is a schematic flow chart of a sensor verification method according to a third embodiment of the present invention;
Fig. 5 is a schematic structural diagram of a sensor verification apparatus according to a first embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a sensor verification apparatus according to a second embodiment of the present invention;
Fig. 7 is a schematic structural diagram of a sensor verification apparatus according to a third embodiment of the present invention;
Fig. 8 is a schematic structural diagram of a vehicle according to an embodiment of the present invention;
Fig. 9 is a schematic diagram of the front camera in fig. 1 being verified by the left camera and the right camera.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first", "second", "third", and "fourth" and the like in the description and the claims of the present invention are used for distinguishing different objects, and are not used for describing a specific order. The terms "comprises," "comprising," and any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
First, referring to fig. 1, fig. 1 is a schematic view of an application scenario of a sensor verification method according to an embodiment of the present invention. Fig. 1 takes a vehicle fitted with four cameras as an example; the viewing angle and range of each camera are as shown in the figure. The coincident acquisition regions between pairs of cameras are determined as follows: a front-left coincident acquisition region FL (between the front and left cameras), a front-right coincident acquisition region FR (between the front and right cameras), a rear-left coincident acquisition region BL (between the rear and left cameras), and a rear-right coincident acquisition region BR (between the rear and right cameras). The four cameras acquire images simultaneously. For the abnormality check of the front camera, the following are obtained: front-left coincident sub-image 1 (the part of the front camera's image covering FL), front-left coincident sub-image 2 (the part of the left camera's image covering FL), front-right coincident sub-image 1 (the part of the front camera's image covering FR), and front-right coincident sub-image 2 (the part of the right camera's image covering FR). It is then judged whether front-left coincident sub-image 1 is the same as front-left coincident sub-image 2, and whether front-right coincident sub-image 1 is the same as front-right coincident sub-image 2. If both comparison results differ, the front camera is determined to be abnormal; alternatively, it can further be judged whether the pixel values of the image acquired by the front camera match a preset pixel value (such as the value corresponding to black, white or grey), the front camera being determined abnormal when they match. If both comparison results are the same, or only one of them differs, whether the pixel values acquired by the front camera match the preset pixel value is further judged: if they match, the front camera is determined to be abnormal; if not, it is determined to be normal. The abnormality checks of the left, right and rear cameras follow the same steps as those of the front camera and are not repeated here.
In fig. 1, the front camera is arranged at the vehicle head, the vehicle head faces forward, the left camera is arranged at the left side of the vehicle, the right camera is arranged at the right side of the vehicle, and the rear camera is arranged at the rear of the vehicle.
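The four-camera layout above implies a fixed overlap topology, which could be represented as a simple lookup table. The names follow fig. 1; the representation itself is illustrative and not part of the disclosure.

```python
# Overlap topology of the four-camera layout in fig. 1. Each camera shares
# a coincident acquisition region with the two adjacent cameras, so every
# camera has the two reference sensors the verification method requires.
OVERLAP_PARTNERS = {
    "front": ("left", "right"),   # regions FL and FR
    "rear":  ("left", "right"),   # regions BL and BR
    "left":  ("front", "rear"),   # regions FL and BL
    "right": ("front", "rear"),   # regions FR and BR
}

def reference_sensors(target_camera):
    """Return the reference sensors available for a given target camera."""
    partners = OVERLAP_PARTNERS[target_camera]
    # The method requires at least two reference sensors per target sensor.
    assert len(partners) >= 2
    return partners
```

With this layout every camera can serve as the target sensor in turn, each round reusing the same lookup to pick its two reference sensors.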
In combination with the above description, the embodiments of the invention disclose a sensor verification method and device, a vehicle and a storage medium, for improving the operating reliability of vehicle sensors so as to improve the safety of the vehicle's driver assistance. The technical solution of the present invention will be described in detail through the following embodiments.
Referring to fig. 2, fig. 2 is a schematic flow chart illustrating a sensor calibration method according to an embodiment of the present invention; as shown in fig. 2, the sensor verification method may include:
201. and acquiring a target image acquired by the target sensor.
It can be understood that a plurality of sensors capable of acquiring images, such as cameras and laser radars, are installed on the vehicle; a camera may, for example, be a panoramic camera or a full-view camera. Each sensor shares a coincident acquisition region with other sensors; referring to the application scenario of fig. 1, for example, there is a front-left coincident acquisition region between the front camera and the left camera. In the embodiment of the invention, any sensor on the vehicle can serve as the target sensor, the image it acquires is the target image, and the other sensors sharing coincident acquisition regions with it serve as reference sensors.
202. And acquiring reference images acquired by at least two reference sensors to obtain at least two reference images, wherein each reference sensor and the target sensor respectively have a superposed acquisition area.
In the embodiment of the invention, after the target sensor is determined, the other sensors on the vehicle that share coincident acquisition regions with it are determined as the reference sensors, and the image collected by each reference sensor is acquired as a reference image. Since there are at least two reference sensors, there are at least two reference images; the reference images correspond one-to-one with the reference sensors.
It is understood that the target sensor and the reference sensors acquire images at the same frequency; that is, the target image and the reference images in the embodiment of the present invention are acquired at the same moment.
It should be noted that step 201 and step 202 may be executed sequentially or simultaneously.
203. And determining a first sub-image corresponding to the overlapping acquisition area from the target image, determining a second sub-image corresponding to the overlapping acquisition area from the reference image, and obtaining at least two groups of comparison images, wherein each group of comparison images comprises the first sub-image and the second sub-image corresponding to one overlapping acquisition area.
In an embodiment of the invention, a reference sensor and the target sensor share at least one coincident acquisition region. The reference image acquired by that reference sensor then contains a second sub-image for each such coincident acquisition region, and the target image contains the corresponding first sub-image; together they form a group of comparison images. Each reference sensor therefore yields at least one group, and with at least two reference sensors, at least two groups of comparison images are obtained. For example, when the front camera in fig. 1 is the target sensor, the left and right cameras serve as reference sensors and two groups of comparison images are formed: the group for the front-left coincident acquisition region (front-left coincident sub-image 1, front-left coincident sub-image 2) and the group for the front-right coincident acquisition region (front-right coincident sub-image 1, front-right coincident sub-image 2).
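The grouping in step 203 can be sketched as a small data structure. This is illustrative Python: the pixel boxes for the coincident acquisition regions are assumed known in advance from the fixed camera geometry, which the description does not detail.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class ComparisonGroup:
    region: str              # name of the coincident acquisition region
    first_sub: np.ndarray    # sub-image cropped from the target image
    second_sub: np.ndarray   # sub-image cropped from a reference image

def build_comparison_groups(target_img, ref_frames):
    """ref_frames: iterable of (region_name, ref_img, target_box, ref_box),
    where each box is (x0, y0, x1, y1) in pixel coordinates."""
    groups = []
    for region, ref_img, (tx0, ty0, tx1, ty1), (rx0, ry0, rx1, ry1) in ref_frames:
        groups.append(ComparisonGroup(
            region=region,
            first_sub=target_img[ty0:ty1, tx0:tx1],
            second_sub=ref_img[ry0:ry1, rx0:rx1],
        ))
    return groups
```

Each `ComparisonGroup` is then passed to the same-or-different judgment of step 204.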
204. Judging whether the first sub-image and the second sub-image of each group of comparison images are the same; if the determination results of each group of comparison images are different, turning to step 205; otherwise, the flow is ended.
Exemplarily, referring to fig. 9 in conjunction with fig. 1 (fig. 9 is a schematic diagram of checking the front camera using the left and right cameras of fig. 1): the target sensor is the front camera and the reference sensors are the left and right cameras. The acquisition region of the front camera and that of the left camera (each indicated in the figure by the angle of its acquisition region) share coincident acquisition region 1 (the left hatched region), and the acquisition regions of the front and right cameras share coincident acquisition region 2 (the right hatched region). After the front, left and right cameras acquire images, two groups of comparison images are obtained, comparison image 1 and comparison image 2. Comparison image 1 contains front-left coincident sub-image 1 (the part of the front camera's image corresponding to region 1) and front-left coincident sub-image 2 (the part of the left camera's image corresponding to region 1); comparison image 2 contains front-right coincident sub-image 1 (the part of the front camera's image corresponding to region 2) and front-right coincident sub-image 2 (the part of the right camera's image corresponding to region 2). Whether front-left coincident sub-image 1 is the same as front-left coincident sub-image 2, and whether front-right coincident sub-image 1 is the same as front-right coincident sub-image 2, are then judged; if both judgment results differ, the process goes to step 205 and the target sensor is determined to be abnormal.
It should be noted that, in the embodiment of the present invention, ending the process means the target sensor is considered normal and the image information it acquires can be used normally; the next round of checking, in which another sensor serves as the target sensor, can then begin.
205. An abnormality of the target sensor is determined.
It can be understood that when the target sensor is determined to be abnormal, the target image information it acquires is unreliable and cannot be used as reference data for the ADAS; excluding it improves the driving safety of the vehicle.
As an optional implementation manner, after the target sensor is determined to be abnormal, abnormality information indicating that the target sensor is abnormal is uploaded to a cloud server, and an early-warning indication is given. Through this embodiment, the prompt upload of the abnormality information lets the cloud server learn in time that the target sensor is abnormal and that the image data it collects is unreliable, so that this data is not provided to on-vehicle systems, such as the ADAS, that require image data, thereby improving the driving safety of the vehicle.
Optionally, the abnormality report may also carry the ID of the target sensor, the acquired image, or details of the abnormality.
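One possible shape for such a report is sketched below. The field names and the JSON encoding are assumptions; the description only says the report indicates the abnormal sensor and may carry its ID, the image, or abnormality details.

```python
import json
import time

def abnormality_report(sensor_id, detail="hung output suspected"):
    """Build the abnormality report uploaded to the cloud server.
    All field names here are illustrative, not prescribed by the method."""
    return json.dumps({
        "sensor_id": sensor_id,          # which sensor failed verification
        "status": "abnormal",
        "detail": detail,                # free-form abnormality description
        "timestamp": int(time.time()),   # when the abnormality was detected
    })
```

A structured payload like this lets the server both raise the early warning and mark the sensor's image stream as unreliable for downstream consumers.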
Further optionally, after it is determined that the target sensor is abnormal, a voice prompt may be broadcast in the vehicle or a prompt output on the central control screen, so that the driver can intervene manually, for example by finding the cause of the abnormality or returning to a service shop for repair.
It can be seen that, in the above embodiment, a target image collected by the target sensor is acquired, and reference images collected by at least two reference sensors are acquired to obtain at least two reference images, each reference sensor sharing a coincident acquisition region with the target sensor. A first sub-image for each coincident acquisition region is determined from the target image and a second sub-image from the corresponding reference image, yielding at least two groups of comparison images, each containing the first and second sub-images for the same coincident acquisition region. Whether the first and second sub-images of each group are the same is judged; if every group differs, the target sensor is determined to be abnormal. By having sensors verify one another through the images they collect, the embodiment overcomes the shortcomings of hardware diagnosis in the conventional technology and accurately verifies whether a sensor is abnormal purely through added software logic. It is particularly suited to diagnosing conditions such as a hung sensor, improves the operating reliability of vehicle sensors, and thereby improves vehicle safety.
Referring to fig. 3, fig. 3 is a schematic flow chart of a sensor verification method according to a second embodiment of the present invention; as shown in fig. 3, the sensor verification method may include:
301. Acquire the target image collected by the target sensor.
302. Acquire reference images collected by at least two reference sensors to obtain at least two reference images, where each reference sensor shares a coincident acquisition area with the target sensor.
It should be noted that step 301 and step 302 may be executed sequentially or simultaneously.
303. Determine a first sub-image corresponding to each coincident acquisition area from the target image and a second sub-image corresponding to the same area from the corresponding reference image, obtaining at least two groups of comparison images, where each group comprises a first sub-image and a second sub-image corresponding to the same coincident acquisition area.
304. Judge whether the first sub-image and the second sub-image of each group of comparison images are the same; when the judgment results of all groups of comparison images are that the sub-images differ, go to step 305; otherwise, end the flow.
It should be noted that, in the embodiment of the present invention, ending the flow means that the target sensor is considered normal, the image information it collects can be used normally, and the next round of verification can begin. Multiple rounds may also be executed simultaneously, each taking a different sensor as the target sensor.
Of course, to improve the accuracy of determining sensor abnormality, and thus the operational reliability of the sensor, in the embodiment of the present invention, when the judgment results of all the comparison images are the same, or the judgment result of at least one comparison image differs, it is further judged whether the pixel value of the target image matches a preset pixel value; if not, the target sensor is determined to be normal; if so, the target sensor is determined to be abnormal. By combining the pixel values of the image, whether the target sensor is abnormal can be detected accurately, improving detection accuracy.
Optionally, an image collected by a sensor already confirmed to be abnormal may be obtained in advance, and its pixel value set as the preset pixel value. For example, the image collected by an abnormal sensor may be completely black, in which case the pixel value corresponding to black, obtained from that image, is set as the preset pixel value.
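As an illustration of this preset-pixel-value idea, the following sketch flags an image that is (nearly) uniformly equal to a stored failure value such as all-black. The `tolerance` and `ratio` parameters are assumptions for illustration, not values from the patent.

```python
import numpy as np

def matches_preset_pixel_value(image, preset_value=0, tolerance=2, ratio=0.99):
    """True if at least `ratio` of the pixels lie within `tolerance`
    of the preset failure value (e.g. 0 for an all-black frame)."""
    close = np.abs(image.astype(int) - preset_value) <= tolerance
    return float(np.mean(close)) >= ratio
```

A small tolerance absorbs sensor noise, so a hung camera emitting a near-black frame still matches the preset value.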
Further, when the judgment result of at least one group of comparison images differs, the target sensor may be marked as a sensor to be confirmed, requiring further confirmation. The reference sensor corresponding to the second sub-image in each differing group is then obtained and taken as a new target sensor, and whether the sensor to be confirmed is abnormal is determined by combining the abnormality judgment result of that new target sensor.
For example, when the target sensor is stuck or hung, it may still output images as usual, but the output may be an all-black or all-white image. If the ADAS were to use such a target image as reference data for automated driver assistance, errors and driving risks could result; combining the pixel values therefore allows the condition of the sensor to be judged more accurately.
As an alternative embodiment, the step 304 of determining whether the first sub-image and the second sub-image of each group of comparison images are the same may include:
judging whether the first sub-image and the second sub-image of each group of comparison images are the same according to the difference value of the two sub-images; or
Comparing whether the characteristic parameters of the first sub-image and the second sub-image in each group of comparison images are the same or not; or
And comparing whether the laser point cloud data corresponding to the first sub-image and the laser point cloud data corresponding to the second sub-image in each group of comparison images are the same or not.
Since the sensors used for collecting images may be cameras or laser radars, the target sensor and the reference sensor corresponding to a group of comparison images may appear in the following combinations:
1. the target sensor and the reference sensor are both cameras.
For such a combination, the above-mentioned determining whether the first sub-image and the second sub-image of the comparison image are the same specifically includes:
Judging whether the first sub-image is the same as the second sub-image according to the difference value between the first sub-image and the second sub-image of the comparison image.
Because the target sensor and the reference sensor are both cameras, the similarity between the first sub-image and the second sub-image can be judged through a difference measure, thereby determining whether the two are the same.
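One possible "difference value" criterion for two camera sub-images might look like the following; the patent does not fix a specific metric, so the per-pixel tolerance and agreement ratio here are assumptions.

```python
import numpy as np

def sub_images_same(sub1, sub2, pixel_tol=15, agree_ratio=0.95):
    """Treat the sub-images as the same if most pixels differ by <= pixel_tol."""
    if sub1.shape != sub2.shape:
        return False
    agree = np.abs(sub1.astype(int) - sub2.astype(int)) <= pixel_tol
    return float(np.mean(agree)) >= agree_ratio
```

The ratio-based form tolerates small localized differences (parallax, noise) while still rejecting sub-images whose content genuinely disagrees.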
2. The target sensor is a camera and the reference sensor is a laser radar, or the target sensor is a laser radar and the reference sensor is a camera.
For such a combination, the above-mentioned determining whether the feature parameter of the first sub-image is the same as the feature parameter of the second sub-image in the comparison image may specifically include:
A feature parameter may be extracted from each of the first sub-image and the second sub-image and the two compared. For example, if the feature parameter extracted from the first sub-image indicates that a pedestrian exists in the coincident acquisition area while the feature parameter extracted from the second sub-image detects no pedestrian there, then the feature parameters of the two sub-images are different.
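A feature-parameter comparison of this kind might reduce each sub-image to a list of detected objects and compare the lists. The representation below (label plus position, with a position tolerance) is purely illustrative and not prescribed by the patent.

```python
def feature_params_same(dets_a, dets_b, pos_tol=1.0):
    """dets_*: lists of (label, x, y) detections in a shared coordinate frame.
    The feature parameters are the same if every detection in one list has a
    same-label match within pos_tol in the other, and vice versa."""
    def covered(src, dst):
        return all(
            any(lab2 == lab1 and abs(x2 - x1) <= pos_tol and abs(y2 - y1) <= pos_tol
                for lab2, x2, y2 in dst)
            for lab1, x1, y1 in src)
    return covered(dets_a, dets_b) and covered(dets_b, dets_a)
```

For the pedestrian example: if one sub-image yields `[("pedestrian", 3.0, 1.0)]` and the other yields an empty list, the function returns `False`, i.e. the parameters differ.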
3. The target sensor and the reference sensor are laser radars.
For such a combination, the above-mentioned determining whether the first sub-image and the second sub-image of the comparison image are the same may specifically include:
and comparing whether the laser point cloud data corresponding to the first sub-image and the laser point cloud data corresponding to the second sub-image in each group of comparison images are the same or not.
And judging based on the laser point cloud data to determine whether the laser point cloud data corresponding to the first sub-image and the laser point cloud data corresponding to the second sub-image are the same.
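A point-cloud comparison along these lines might use nearest-neighbor distances between the two clouds. The patent names no metric, so the distance tolerance and agreement ratio below are assumptions for illustration.

```python
import numpy as np

def point_clouds_same(cloud_a, cloud_b, dist_tol=0.2, agree_ratio=0.9):
    """cloud_*: (N, 3) arrays of XYZ points in a common coordinate frame.
    Same if most points in A have a neighbor in B within dist_tol."""
    if len(cloud_a) == 0 or len(cloud_b) == 0:
        return len(cloud_a) == len(cloud_b)
    # Pairwise distances from each point in A to every point in B
    d = np.linalg.norm(cloud_a[:, None, :] - cloud_b[None, :, :], axis=2)
    nearest = d.min(axis=1)
    return float(np.mean(nearest <= dist_tol)) >= agree_ratio
```

The brute-force pairwise distance is fine for a sketch; a real implementation would use a spatial index (e.g. a k-d tree) for large clouds.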
Therefore, whichever combination applies, a corresponding judgment mode can accurately detect whether the first sub-image and the second sub-image are the same, improving the accuracy of judging the condition of the vehicle sensor and the operational reliability of the sensors in the vehicle.
305. Judge whether the pixel value of the target image matches the preset pixel value; if so, go to step 306; if not, go to step 307.
Through step 305, when the judgment results of all groups of comparison images differ, whether the target sensor is abnormal can be further determined by combining the pixel values of the target image, improving the accuracy of the abnormality determination. Thus, in the embodiment of the invention, secondary verification is achieved by presetting a pixel value and combining the comparison of the coincident-acquisition-area images with the pixel-value matching judgment, improving the accuracy of sensor abnormality determination.
306. Determine that the target sensor is abnormal.
307. Determine that the target sensor is normal.
Obviously, by implementing the above embodiment, secondary verification can be achieved through the preset pixel value, combining the comparison of coincident-acquisition-area images with the pixel-value matching judgment. This discards the drawbacks of hardware diagnosis in conventional technology: whether a sensor is abnormal can be verified accurately merely by adding software logic. The approach is particularly suitable for diagnosing abnormal conditions such as a hung sensor, improves the reliability of vehicle sensors in operation, and thus enhances vehicle safety.
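The whole Fig. 3 flow can be condensed into a small decision function. This is a sketch: `compare_fn` stands for any of the three comparison modes of step 304, and `matches_preset` for the pixel-value check of step 305; neither name comes from the patent.

```python
def diagnose(target_img, sub_pairs, compare_fn, matches_preset):
    """sub_pairs: list of (first_sub_image, second_sub_image), one per group.

    Steps 304-307: if every group's sub-images differ, fall through to the
    secondary pixel-value verification; otherwise the sensor is normal.
    """
    all_differ = all(not compare_fn(a, b) for a, b in sub_pairs)
    if not all_differ:
        return "normal"   # at least one group agrees: end the flow
    # Step 305: secondary verification against the preset pixel value
    return "abnormal" if matches_preset(target_img) else "normal"
```

Plugging in a trivial equality comparator shows the three outcomes: disagreement plus a matching failure pixel value yields "abnormal"; disagreement alone, or any agreeing group, yields "normal".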
Referring to fig. 4, fig. 4 is a schematic flow chart of a sensor verification method according to a third embodiment of the present invention; as shown in fig. 4, the sensor verification method may include:
401. Adjust all sensors on the vehicle so that their acquisition frequencies are the same.
It should be noted that "all sensors" in the embodiment of the present invention refers to the sensors described in the embodiment corresponding to fig. 2; for a camera, the acquisition frequency corresponds to a specific scanning frequency. The embodiment therefore applies only to eligible sensors: "all sensors on the vehicle" may exclude sensors that cannot be adjusted to the same acquisition frequency and/or cannot share a coincident acquisition area with other sensors.
402. Detect a second sensor on the vehicle whose acquisition area is adjacent to and does not intersect the acquisition area of the first sensor.
The sensor mentioned in step 401 is taken as the first sensor, and a sensor whose acquisition area is detected to be adjacent to but not intersecting that of the first sensor is taken as the second sensor.
403. Adjust the acquisition angles of the first sensor and the second sensor so that their acquisition areas share a coincident acquisition area, and so that, after adjustment, each sensor shares a coincident acquisition area with at least two other sensors on the vehicle.
By adjusting the acquisition angles of the first sensor and the second sensor, the adjusted sensors can share a coincident acquisition area while still meeting the image-data requirements of systems such as the ADAS; that is, any one sensor shares a coincident acquisition area with at least two other sensors. Mutual verification of the sensors can then be realized through software logic, abnormal sensors can be identified, and the operational reliability of the sensors is improved.
It is understood that the other sensors in step 403 are those among all the sensors on the vehicle that satisfy the condition of step 401.
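Whether two adjusted sensors actually share a coincident acquisition area can be checked with simple angular geometry. The model below, which reduces each sensor to a yaw heading and a horizontal field of view in degrees, is an assumption for illustration; real mounting geometry (range, height, pitch) is richer.

```python
def sensors_overlap(yaw_a_deg, fov_a_deg, yaw_b_deg, fov_b_deg):
    """True if the horizontal fields of view of two sensors intersect.

    Each sensor is modeled as a wedge of fov degrees centered on its yaw.
    """
    # Minimal angular difference between the two headings, in [0, 180]
    diff = abs((yaw_a_deg - yaw_b_deg + 180.0) % 360.0 - 180.0)
    # The wedges intersect when the heading gap is below the sum of half-FOVs
    return diff < (fov_a_deg + fov_b_deg) / 2.0
```

For example, a front sensor at yaw 0 with a 90-degree FOV and a front-left sensor at yaw 60 with a 90-degree FOV overlap, while front (0, 60) and rear (180, 60) sensors do not; step 403 would keep adjusting angles until each sensor overlaps at least two others.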
404. Acquire the target image collected by the target sensor.
405. Acquire reference images collected by at least two reference sensors to obtain at least two reference images, where each reference sensor shares a coincident acquisition area with the target sensor.
It should be noted that step 404 and step 405 may be executed sequentially or simultaneously.
406. Determine a first sub-image corresponding to each coincident acquisition area from the target image and a second sub-image corresponding to the same area from the corresponding reference image, obtaining at least two groups of comparison images, where each group comprises a first sub-image and a second sub-image corresponding to the same coincident acquisition area.
407. Judge whether the first sub-image and the second sub-image of each group of comparison images are the same; when the judgment results of all groups of comparison images differ, go to step 408; otherwise, end the flow.
408. When the judgment results of all groups of comparison images differ, determine that the target sensor is abnormal.
Therefore, by implementing this embodiment, the acquisition frequency can be set and the acquisition areas adjusted for all eligible sensors on the vehicle, so that any one sensor shares a coincident acquisition area with at least two other sensors. Sensor abnormality can then be judged in software logic, ensuring the reliability of the image data the sensors collect and improving vehicle driving safety.
Referring to fig. 5, fig. 5 is a schematic structural diagram of a sensor verification apparatus according to a first embodiment of the present invention; as shown in fig. 5, the sensor verification apparatus may include:
an obtaining module 510, configured to obtain a target image acquired by a target sensor;
the acquiring module 510 is further configured to acquire reference images acquired by at least two reference sensors to obtain at least two reference images, where each reference sensor and the target sensor respectively have a coincident acquisition region;
the image processing module 520 is configured to determine a first sub-image corresponding to the coincident acquisition region from the target image and a second sub-image corresponding to the coincident acquisition region from the reference image, obtaining at least two sets of comparison images, where each set of comparison images includes a first sub-image and a second sub-image corresponding to the same coincident acquisition region;
a judging module 530, configured to judge whether the first sub-image and the second sub-image of each group of comparison images are the same;
and an anomaly determination module 540, configured to determine that the target sensor is anomalous when the determination module 530 determines that the determination results of each set of comparison images are different.
By implementing the embodiment of the invention, the mutual verification can be carried out by collecting images through the sensor, the defects of hardware diagnosis in the traditional technology are overcome, whether the sensor is abnormal or not is accurately verified only by adding software logic, the method is particularly suitable for diagnosing abnormal conditions such as sensor hang-up and the like, the reliability of the vehicle sensor in the working process can be improved, and the vehicle safety is improved.
As an optional implementation manner, before the target sensor is determined to be abnormal on the basis that the judgment results of all sets of comparison images differ, the determining module 530 is further configured to judge whether the pixel value of the target image matches the preset pixel value when all sets differ; when they match, the abnormality determining module 540 determines that the target sensor is abnormal.
As an alternative embodiment, when the determination results of all the comparison images are the same or the determination result of at least one comparison image is different, the determining module 530 determines whether the pixel value of the target image matches the preset pixel value; if not, the anomaly determination module 540 determines that the target sensor is normal; if so, the anomaly determination module 540 determines that the target sensor is anomalous; by the embodiment, whether the target sensor is abnormal or not can be accurately detected by combining the pixel value of the image, and the detection accuracy is improved.
As an optional implementation manner, the manner of determining whether the first sub-image and the second sub-image of each group of comparison images are the same by the determining module 530 is specifically:
judging whether the first sub-image and the second sub-image of each group of comparison images are the same according to the difference value of the two sub-images; or comparing whether the characteristic parameters of the first sub-image and the second sub-image in each group of comparison images are the same or not; or comparing whether the laser point cloud data corresponding to the first sub-image and the laser point cloud data corresponding to the second sub-image in each group of comparison images are the same or not.
Through the various detection modes, the detection accuracy of the target sensor can be further improved, so that the running safety of the vehicle is improved.
Referring to fig. 6, fig. 6 is a schematic structural diagram of a sensor verification apparatus according to a second embodiment of the present invention; the apparatus shown in fig. 6 further comprises:
the sending module 610 is configured to upload, to the cloud server, abnormal information indicating that the target sensor is abnormal after the abnormality determining module 540 determines that the target sensor is abnormal, and perform an early warning indication.
Referring to fig. 7, fig. 7 is a schematic structural diagram of a sensor verification apparatus according to a third embodiment of the present invention; the apparatus shown in fig. 7 further comprises:
an adjusting module 710, configured to detect a second sensor on the vehicle, where a collection area of the second sensor is adjacent to and does not intersect with a collection area of the first sensor, before the obtaining module 510 obtains the target image collected by the target sensor; and adjusting the acquisition angle of the first sensor and the second sensor so that the acquisition area of the first sensor and the acquisition area of the second sensor have the overlapped acquisition area, and after the acquisition angle is adjusted, each sensor and at least two other sensors on the vehicle respectively have the overlapped acquisition area.
As an alternative embodiment, the adjusting module 710 is further configured to adjust all sensors on the vehicle before acquiring the target image acquired by the target sensor, so that the acquisition frequency of all sensors is the same.
Through the embodiment, any one sensor and at least two other sensors can have the overlapped acquisition area, so that the abnormal condition of the sensor can be logically judged from software, the reliability of acquiring image data information by the sensor is realized, and the vehicle driving safety is improved.
Referring to fig. 8, fig. 8 is a schematic structural diagram of a vehicle according to an embodiment of the disclosure; the vehicle shown in fig. 8 may include:
a memory 801 in which executable program code is stored;
a processor 802 coupled with the memory 801;
the processor 802 calls the executable program code stored in the memory 801 to execute a part of the steps of any one of the sensor verification methods of fig. 2 to 4.
The embodiment of the invention also discloses a computer readable storage medium, which stores a computer program, wherein the computer program enables a computer to execute the sensor verification method disclosed in fig. 2 to 4.
An embodiment of the present invention further discloses a computer program product, which, when running on a computer, causes the computer to execute part or all of the steps of any one of the methods disclosed in fig. 2 to 4.
An embodiment of the present invention further discloses an application publishing platform, where the application publishing platform is configured to publish a computer program product, where when the computer program product runs on a computer, the computer is enabled to execute part or all of the steps of any one of the methods disclosed in fig. 2 to fig. 4.
It will be understood by those skilled in the art that all or part of the steps in the methods of the embodiments described above may be implemented by a program instructing the relevant hardware, and the program may be stored in a computer-readable storage medium. The storage medium includes Read-Only Memory (ROM), Random Access Memory (RAM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), One-Time Programmable Read-Only Memory (OTPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Compact Disc Read-Only Memory (CD-ROM) or other optical disc memory, magnetic disk memory, magnetic tape memory, or any other computer-readable medium that can be used to carry or store data.
The sensor calibration method and apparatus, vehicle, and storage medium disclosed in the embodiments of the present invention are described in detail above. Specific examples are applied herein to explain the principle and implementation of the invention, and the description of the above embodiments is only intended to help understand the method and its core idea. Meanwhile, for those skilled in the art, there may be variations in the specific embodiments and application scope according to the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.
Claims (10)
1. A sensor verification method, comprising:
acquiring a target image acquired by a target sensor;
acquiring reference images acquired by at least two reference sensors to obtain at least two reference images, wherein each reference sensor and the target sensor respectively have a superposition acquisition area;
determining a first sub-image corresponding to the overlapping acquisition area from the target image, determining a second sub-image corresponding to the overlapping acquisition area from the reference image, and obtaining at least two groups of comparison images, wherein each group of comparison images comprises the first sub-image and the second sub-image corresponding to the overlapping acquisition area;
judging whether the first sub-image and the second sub-image of each group of the comparison images are the same;
and when the judgment results of each group of comparison images are different, determining that the target sensor is abnormal.
2. The method according to claim 1, wherein before determining that the target sensor is abnormal when the judgment results of each group of the comparison images are different, the method further comprises:
when the judgment results of each group of comparison images are different, judging whether the pixel value of the target image is matched with a preset pixel value;
and if so, executing the step of determining that the target sensor is abnormal.
3. The method of claim 2, further comprising:
when the judgment results of all the comparison images are the same or the judgment result of at least one comparison image is different, judging whether the pixel value of the target image is matched with the preset pixel value;
if not, determining that the target sensor is normal;
and if the target sensor is matched with the target sensor, determining that the target sensor is abnormal.
4. The method of any of claims 1 to 3, wherein said determining whether the first sub-image and the second sub-image of each set of the comparison images are the same comprises:
judging whether the first sub-image and the second sub-image are the same according to the difference value of the first sub-image and the second sub-image of each group of comparison images; or
Comparing whether the characteristic parameters of the first sub-image and the second sub-image in each group of comparison images are the same or not; or
And comparing whether the laser point cloud data corresponding to the first sub-image and the laser point cloud data corresponding to the second sub-image in each group of comparison images are the same or not.
5. The method of claim 4, wherein after determining that the target sensor is abnormal, the method further comprises:
and uploading abnormal information for indicating the abnormality of the target sensor to a cloud server, and performing early warning indication.
6. The method of any of claims 1 to 3, wherein prior to acquiring the image of the target acquired by the target sensor, the method further comprises:
detecting a second sensor on the vehicle, wherein the acquisition area of the second sensor is adjacent to and not intersected with the acquisition area of the first sensor;
and adjusting the collection angle of the first sensor and the second sensor so that the collection area of the first sensor and the collection area of the second sensor have the overlapped collection area, and after the collection angle is adjusted, each sensor and at least two other sensors on the vehicle respectively have the overlapped collection areas.
7. The method of claim 6, wherein prior to acquiring the image of the target acquired by the target sensor, the method further comprises:
adjusting all sensors on the vehicle so that the acquisition frequency of all sensors is the same.
8. A sensor verification device, comprising:
the acquisition module is used for acquiring a target image acquired by the target sensor;
the acquisition module is further configured to acquire reference images acquired by at least two reference sensors to obtain at least two reference images, and each reference sensor and the target sensor respectively have a coincident acquisition region;
the image processing module is used for determining a first sub-image corresponding to the overlapping acquisition area from the target image and determining a second sub-image corresponding to the overlapping acquisition area from the reference image to obtain at least two groups of comparison images, wherein each group of comparison images comprises the first sub-image and the second sub-image corresponding to the overlapping acquisition area;
the judging module is used for judging whether the first sub-image and the second sub-image of each group of comparison images are the same;
and the abnormality determining module is used for determining that the target sensor is abnormal when the judging module judges that the judging results of each group of the comparison images are different.
9. A vehicle, characterized by comprising:
a memory storing executable program code;
a processor coupled with the memory;
the processor invokes the executable program code stored in the memory to perform a sensor verification method of any of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, implements the steps of the method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111117177.6A CN113884123A (en) | 2021-09-23 | 2021-09-23 | Sensor calibration method and device, vehicle and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113884123A true CN113884123A (en) | 2022-01-04 |
Family
ID=79010419
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111117177.6A Pending CN113884123A (en) | 2021-09-23 | 2021-09-23 | Sensor calibration method and device, vehicle and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113884123A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116074479A (en) * | 2023-03-03 | 2023-05-05 | 山东交通学院 | Image analysis-based passenger monitoring system, method, equipment and storage medium |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008283527A (en) * | 2007-05-11 | 2008-11-20 | Alpine Electronics Inc | Image processor |
CN204350309U (en) * | 2014-09-29 | 2015-05-20 | 罗斯蒙特公司 | Wireless industrial process monitor and wireless supervisory control system |
CN105141945A (en) * | 2014-05-16 | 2015-12-09 | 通用汽车环球科技运作有限责任公司 | Surround-view camera system (vpm) online calibration |
US20160007018A1 (en) * | 2014-07-02 | 2016-01-07 | Denso Corporation | Failure detection apparatus and failure detection program |
CN108156452A (en) * | 2017-12-22 | 2018-06-12 | 深圳怡化电脑股份有限公司 | A kind of method, apparatus of detection sensor, equipment and storage medium |
CN109166152A (en) * | 2018-07-27 | 2019-01-08 | 深圳六滴科技有限公司 | Bearing calibration, system, computer equipment and the storage medium of panorama camera calibration |
US20190049958A1 (en) * | 2017-08-08 | 2019-02-14 | Nio Usa, Inc. | Method and system for multiple sensor correlation diagnostic and sensor fusion/dnn monitor for autonomous driving application |
US20190149813A1 (en) * | 2016-07-29 | 2019-05-16 | Faraday&Future Inc. | Method and apparatus for camera fault detection and recovery |
CN111401423A (en) * | 2020-03-10 | 2020-07-10 | 北京百度网讯科技有限公司 | Data processing method and device for automatic driving vehicle |
CN111788828A (en) * | 2018-09-25 | 2020-10-16 | 松下知识产权经营株式会社 | Processing system, sensor system, moving object, abnormality determination method, and program |
CN112348784A (en) * | 2020-10-28 | 2021-02-09 | 北京市商汤科技开发有限公司 | Method, device and equipment for detecting state of camera lens and storage medium |
CN113068019A (en) * | 2021-03-17 | 2021-07-02 | 杭州海康消防科技有限公司 | Dual-optical camera calibration apparatus, method, electronic apparatus, and storage medium |
Similar Documents
Publication | Title
---|---
CN106840242B (en) | Sensor self-checking system and multi-sensor fusion system of intelligent driving automobile
US20170067764A1 (en) | Method and device for detecting at least one sensor malfunction of at least one first sensor of at least one first vehicle
US7218757B2 (en) | Method and device for detecting obstruction in image sensor systems
US9769469B2 (en) | Failure detection apparatus and failure detection program
CN110562249B (en) | Automatic parking assistance method, readable storage medium, and electronic device
US20170293895A1 (en) | Device and method for calculating damage repair cost
CN112578683B (en) | Optimized in-loop simulation test method for automobile auxiliary driving controller
EP3140777B1 (en) | Method for performing diagnosis of a camera system of a motor vehicle, camera system and motor vehicle
US11367349B2 (en) | Method of detecting speed using difference of distance between object and monitoring camera
CN108466616B (en) | Method for automatically identifying collision event, storage medium and vehicle-mounted terminal
US11230286B2 (en) | System and method for inspecting vehicle lane keeping performance
CN105554493B (en) | Image processing apparatus
WO2021111599A1 (en) | Method, device, system, and computer program for detecting wheel fastener looseness and computer-readable medium storing computer program
CN111222441A (en) | Point cloud target detection and blind area target detection method and system based on vehicle-road cooperation
CN113884123A (en) | Sensor calibration method and device, vehicle and storage medium
CN115878494A (en) | Test method and device for automatic driving software system, vehicle and storage medium
CN105022987A (en) | Method of misalignment correction and diagnostic function for lane sensing sensor
CN111800508B (en) | Automatic driving fault monitoring method based on big data
CN112927552B (en) | Parking space detection method and device
CN111800314B (en) | Automatic driving fault monitoring system
US20240078632A1 (en) | Vehicle vision system
JP3760426B2 (en) | Traveling lane detection method and apparatus
JP3400643B2 (en) | In-vehicle image processing device
US11544934B2 (en) | Method for operating an environment sensor system of a vehicle and environment sensor system
CN114730453A (en) | Method for detecting a movement state of a vehicle
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20220104 |