EP3526725A1 - Method for detecting objects in an environmental region of a motor vehicle considering sensor data in the infrared wavelength range, object detection apparatus, driver assistance system as well as motor vehicle - Google Patents
- Publication number
- EP3526725A1 (application EP17757697.2A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- sensor data
- motor vehicle
- wavelength range
- basis
- classified
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
Definitions
- the detection device comprises a camera for providing the first sensor data and a sensor for providing the second sensor data.
- the camera may be a camera of the type typically used for object detection in the automotive field. With the camera, for example, images of the environmental region can be provided as the first sensor data. With the sensor, the second sensor data in the infrared wavelength range, and in particular in the near infrared wavelength range, can additionally be provided.
- the sensor may be, for example, a corresponding infrared camera.
- the detection device can comprise a camera, which can provide both the first sensor data and the second sensor data.
- a camera can be used which does not have an infrared filter.
- this camera can provide, for example, images that contain information in the infrared wavelength range in addition to the information in the visible wavelength range.
- a driver assistance system comprises an object detection apparatus according to the invention.
- objects and in particular pedestrians can be detected reliably.
- static objects such as walls, curbs, guardrails or the like can be detected with the object detection apparatus based on the first sensor data and/or the second sensor data.
- other motor vehicles, motorcycles or cyclists can be detected. This information can be used by the driver assistance system in order to assist the driver when driving the motor vehicle.
- a motor vehicle according to the invention includes a driver assistance system according to the invention.
- the motor vehicle is in particular formed as a passenger car.
- Fig. 1 a motor vehicle according to an embodiment of the present invention, comprising a driver assistance system with an object detection apparatus;
- Fig. 2 an image which is provided by the object detection apparatus.
- Fig. 1 shows a motor vehicle 1 according to an embodiment of the present invention in a plan view.
- the motor vehicle 1 is configured as a passenger car.
- the motor vehicle 1 comprises a driver assistance system 2, which is used to assist a driver when driving the motor vehicle 1.
- the driver assistance system 2 includes an object detection apparatus 3. With the aid of the object detection apparatus 3, objects 9 can be detected in an environmental region 8 of the motor vehicle 1. In the present case, a pedestrian 10 and a plant 11, here a tree, are located as objects 9 in the environmental region 8 of the vehicle 1, in the direction of travel in front of the motor vehicle 1.
- the driver assistance system 2 can intervene in a steering system, a drive motor and/or a braking system of the motor vehicle 1 in order to prevent a collision between the motor vehicle 1 and the object 9.
- the object detection apparatus 3 includes a camera 5, by means of which the first sensor data can be provided.
- the first sensor data describe the environmental region 8 in the visible wavelength range.
- as the first sensor data, for example, image data or images can be provided by the camera 5.
- the object detection apparatus 3 comprises a sensor 6 by means of which the second sensor data can be provided.
- the second sensor data describe the environmental region 8 in the infrared wavelength range.
- the sensor 6 is embodied as an infrared camera.
- the object detection apparatus 3 includes a computing device 7.
- the computing device 7 is connected with the camera 5 and the sensor 6 for data transmission. By means of the computing device 7, the first sensor data from the camera 5 and the second sensor data from the sensor 6 can be received.
- the objects 9 can be detected in the environmental region 8 by means of the computing device 7.
- the first sensor data and the second sensor data can be provided, for example, as an image. It can also be provided that a composite image 12 is determined by means of the computing device 7 based on the first sensor data and the second sensor data.
- the objects 9 can be detected by means of a corresponding object detection algorithm. For example, a segmentation method can be used. By means of the computing device 7, the detected objects 9 are classified additionally. In particular, it is provided that the objects 9 are classified as static objects or as dynamic objects.
- an image 12 showing the environmental region 8 and the objects 9 is provided with the computing device 7 based on the first sensor data and the second sensor data.
- Such an image 12 is exemplarily shown in Fig. 2.
- the image 12 shows the pedestrian 10 and a part of the plant 11 or the tree. It can be the case that both the pedestrian 10 and the plant 11 are classified as a static or moving object, and in particular as a pedestrian. This may be due to the plant 11, and in particular the leaves of the plant 11, moving as a result of environmental conditions such as wind.
- if the object 9 is classified as static or as dynamic, it can be verified on the basis of the second sensor data whether the object 9 reflects or scatters radiation in the near infrared wavelength range, in particular in a wavelength range between 0.7 μm and 1.1 μm. This takes into account that plants 11 scatter radiation in this wavelength range in order to prevent overheating and thereby caused damage to the cells. Alternatively or additionally, it can be verified on the basis of the first sensor data whether the object scatters light in a wavelength range of between 0.4 μm and 0.7 μm. This takes into account that the chlorophyll in the leaves of plants strongly absorbs light in this wavelength range.
- as the image 12, a composite image can be provided that shows components both in the visible wavelength range and in the near infrared wavelength range.
- the image 12 can show the normalized difference vegetation index (NDVI).
- the pedestrian 10 and the plant 11 have different values of the NDVI.
- the plant can have an NDVI of 1. This is schematically illustrated by the hatching 13.
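The discrimination sketched for Fig. 2 can be illustrated with a small numeric example. The NDVI values below are invented for the sketch (they are not taken from the patent): pixels covering the plant 11 have an NDVI near 1, pixels covering the pedestrian 10 an NDVI near 0, and the vegetation cut-off of 0.5 is a hypothetical choice.

```python
import numpy as np

# Invented NDVI map for a scene like Fig. 2: the left columns cover the
# plant 11 (NDVI near 1), the right column the pedestrian 10 (NDVI near 0).
ndvi_map = np.array([
    [0.95, 0.90, 0.05],
    [0.92, 0.88, 0.10],
])

plant_mask = ndvi_map > 0.5          # hypothetical vegetation threshold
vegetation_fraction = plant_mask.mean()
# plant_mask marks 4 of the 6 pixels as vegetation
```

A mask like `plant_mask` can then be used to suppress pedestrian detections that fall entirely inside vegetated pixels.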
Abstract
The invention relates to a method for detecting objects (9) in an environmental region (8) of a motor vehicle (1), wherein by means of a detecting device (4) of the motor vehicle (1) first sensor data describing the environmental region (8) in the visible wavelength range, and second sensor data describing the environmental region (8) in the infrared wavelength range, are provided and the object (9) is detected on the basis of the first sensor data and/or the second sensor data, wherein the object (9) is classified as a static object or as a dynamic object on the basis of the first sensor data and/or the second sensor data and if the object (9) is classified as a static or a dynamic object, it is verified on the basis of the second sensor data, whether the object is a plant (11).
Description
Method for detecting objects in an environmental region of a motor vehicle considering sensor data in the infrared wavelength range, object detection apparatus, driver assistance system as well as motor vehicle
The present invention relates to a method for detecting objects in an environmental region of a motor vehicle, wherein, by means of a detecting device of the motor vehicle, first sensor data describing the environmental region in the visible wavelength range and second sensor data describing the environmental region in the infrared wavelength range are provided, and the object is detected on the basis of the first sensor data and/or the second sensor data. Moreover, the present invention relates to an object detection apparatus. Furthermore, the present invention relates to a driver assistance system. Finally, the present invention relates to a motor vehicle.
The interest is presently directed to the detection of objects in an environmental region of a motor vehicle. For this purpose, it is known from the prior art that objects in the environment are detected with sensors which are arranged on the motor vehicle. The information about the objects can then be used by the driver assistance systems of the motor vehicle. For example, a warning can be output to the driver of the motor vehicle if a collision with an object or obstacle is imminent. The objects can, for example, be detected with the aid of cameras that provide images of the environmental region. Using appropriate object detection methods, the objects can then be detected in the images. It is further known that the objects are classified or assigned to a group. For example, the objects can be detected as pedestrians, vehicles or road markings. Further, it may be provided that the objects are classified as static or dynamic objects.
From the prior art it is also known that, for detecting objects, sensor data are used which describe the surrounding region in the infrared wavelength range. For this purpose, WO 2015/161208 A1 discloses a vehicle with a vision system comprising a stereo camera for light in the visible wavelength range. In addition, the vision system comprises an infrared camera. It can further be provided that the vision system comprises a camera for the near infrared wavelength range. With the aid of the vision system, a three-dimensional point cloud can be determined for recognizing objects in the surroundings of the vehicle.
It is an object of the present invention to demonstrate a solution as to how objects in an environment of the motor vehicle can be detected and classified more reliably.
According to the invention, this object is solved by a method, by an object detection apparatus, by a driver assistance system as well as by a motor vehicle having the features according to the respective independent claims. Advantageous developments of the present invention are the subject matter of the dependent claims.
In an embodiment of a method for detecting objects in an environmental region of a motor vehicle, preferably by means of a detecting device of the motor vehicle, first sensor data describing in particular the environmental region in the visible wavelength range and second sensor data describing in particular the environmental region in the infrared wavelength range are provided. In addition, the object is preferably detected on the basis of the first sensor data and/or the second sensor data. Furthermore, the object is preferably classified as a static object or as a dynamic object on the basis of the first sensor data and/or the second sensor data, and if the object is classified as a static or as a dynamic object, it is preferably verified on the basis of the second sensor data whether the object is a plant.
A method according to the invention serves for detecting objects in an environmental region of a motor vehicle. Here, by means of a detecting device of the motor vehicle, first sensor data describing the environmental region in the visible wavelength range and second sensor data describing the environmental region in the infrared wavelength range are provided. Further, the object is detected on the basis of the first sensor data and/or the second sensor data. Furthermore, the object is classified as a static object or as a dynamic object on the basis of the first sensor data and/or the second sensor data. If the object is classified as a static or as a dynamic object, it is verified on the basis of the second sensor data whether the object is a plant.
With the aid of the method, objects should be detected in the environmental region of the motor vehicle. For this purpose, the first sensor data are provided with the detection device. The first sensor data describe the environmental region in the visible wavelength range. Thus, the first sensor data describe in particular light in the visible wavelength range, which is reflected from the object and/or emitted by the object. In addition, the second sensor data are provided by means of the detection device, which describe the surrounding region and particularly the object in the infrared wavelength range. Thus, the second sensor data describe in particular the radiation in the infrared wavelength range, which is reflected by the object and/or is emitted from it.
On the basis of the first sensor data and/or the second sensor data, the object can then be detected. Thus, for example, a computing device can be used by means of which the first sensor data and/or the second sensor data can be evaluated. If the first sensor data and the second sensor data are each provided as an image, an appropriate object recognition algorithm can, for example, be carried out using the computing device to recognize the object. In this case, for example, methods of segmentation and/or classification can be used to detect the object in the images. It can also be provided that the detected object is compared with known forms. In particular, it is provided that the object is classified based on the first sensor data and/or the second sensor data. In the classification, it can in particular be differentiated between a static or non-moving object on the one hand and a dynamic or moving object on the other hand. Dynamic objects are to be understood in particular as those objects which move relative to the motor vehicle. The dynamic objects can move in particular on a road or a floor. Static objects are to be understood in particular as those objects which do not move relative to the motor vehicle. The static objects can be located in particular on the road or near the road.
According to an essential aspect of the invention it is provided that if the object is classified as a static object or as a dynamic object, it is verified on the basis of the second sensor data, whether the object is a plant. This is based on the finding that plants can move as a result of environmental influences. For example, the plants and in particular their leaves can move in wind or precipitation. This can lead to the situation that these plants or parts thereof are classified as a moving object. Furthermore, textures in static vegetation can mimic the patterns of objects or pedestrians used in classification methodologies. Thus, these textures of the static plants can be classified as static objects. If the object is classified as a static or as a moving object, the second sensor data describing the object can be used. Thus, the infrared radiation, and particularly the near infrared radiation which is reflected and/or emitted from the object is examined. This takes into account that plants typically reflect or scatter radiation in the near infrared wavelength region during photosynthesis. The plants reflect or scatter the radiation in the near infrared wavelength range in order to prevent overheating and damage to the cells. By additionally considering the second sensor data, it is possible to distinguish the objects classified as static or as moving objects from plants. Thus, the false positive rate can be significantly reduced.
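The verification step described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, the mask representation and the reflectance threshold of 0.4 for the near infrared band are all assumptions made for the sketch.

```python
import numpy as np

# Hypothetical reflectance threshold for the 0.7-1.1 um band above which a
# detected object is treated as vegetation; the value is illustrative and
# not taken from the patent.
NIR_PLANT_THRESHOLD = 0.4

def is_probably_plant(nir_image: np.ndarray, object_mask: np.ndarray) -> bool:
    """Verify a static/dynamic candidate object: a high mean near infrared
    reflectance inside the object's mask suggests a plant rather than,
    for example, a pedestrian."""
    region = nir_image[object_mask]
    return region.size > 0 and float(region.mean()) > NIR_PLANT_THRESHOLD

# A leafy region reflects strongly in the NIR, clothing much less:
nir = np.array([[0.55, 0.50],
                [0.10, 0.12]])
leaf_mask = np.array([[True, True], [False, False]])
person_mask = ~leaf_mask
```

Here `is_probably_plant(nir, leaf_mask)` flags the leafy region as vegetation, while the low-reflectance region under `person_mask` is kept as a potential pedestrian, which is exactly the false-positive filtering the paragraph describes.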
Preferably, it is verified on the basis of the second sensor data whether the object is a plant, if the object is classified as a pedestrian. Based on the first sensor data and/or the second sensor data, the object can be classified for example as a pedestrian, which moves relative to the motor vehicle. Here, tests have shown that in particular moving plants are often recognized as a moving pedestrian. Based on the first sensor data and/or the second sensor data, the object can be classified as a static pedestrian. As already mentioned, this may be due to the texture of the plants. By the verification based on the second sensor data it can be determined in a reliable manner, whether it is a pedestrian or a plant, for example a tree, a bush or the like. This information can then be used for example by a driver assistance system of the motor vehicle to avoid a collision between the motor vehicle and the pedestrian.
Furthermore, it is advantageous if the object that is classified as static or as dynamic is detected as a plant on the basis of the radiation emitted from the object in the near infrared wavelength range, in particular in a wavelength range of between 0.7 μm and 1.1 μm. Here, the knowledge is taken into account that plants, especially during photosynthesis, significantly reflect or scatter radiation in the near infrared region to prevent overheating and thus damage to the cells. If it is now recognized that the object emits radiation in this wavelength range, it can be assumed with a high probability that it is a plant.
In another embodiment, it is additionally verified on the basis of the first sensor data whether the object that is classified as static or as dynamic is a plant. Also the first sensor data, describing the surrounding area in the visible wavelength range, can be used to assess the object that is classified as a static object or as a dynamic object more closely. For example, it can be verified which colour the object has. If the object has a green colour, it can be assumed with a high probability that it is a plant. Here, the colour of the object or a part of the object can be compared with predetermined colours, which describe different plants. This takes into account that leaves of trees and shrubs can have different colours. Thus, the false positive rate can be further reduced.
It is provided in particular that the object that is classified as static or as dynamic is detected as a plant based on a radiation emitted from the object in a wavelength range of between 0.4 μm and 0.7 μm. This takes into account that the chlorophyll in the plants, and in particular in the leaves of plants, reflects or scatters light in the visible wavelength range of between 0.4 μm and 0.7 μm. This enables a reliable detection of plants in the area surrounding the motor vehicle.
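The colour comparison described in the two paragraphs above can be sketched with a crude heuristic. The function name and the channel margins are illustrative assumptions, not values from the patent, which only states that the object's colour is compared with predetermined plant colours.

```python
def looks_plant_green(rgb):
    """Crude check whether an object's mean colour is predominantly green.

    `rgb` is an (R, G, B) triple in the range 0..255. The margin of 20 is
    an illustrative heuristic, not a value taken from the patent.
    """
    r, g, b = rgb
    return g > r + 20 and g > b + 20

# Mean colour of a leafy region versus a grey wall:
print(looks_plant_green((60, 140, 55)))    # True
print(looks_plant_green((120, 125, 118)))  # False
```

In practice one would compare against a whole palette of predetermined plant colours, as the text suggests, rather than a single green test.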
According to another embodiment, a normalized difference vegetation index of the object that is classified as static or as dynamic is determined on the basis of the first sensor data and the second sensor data, and it is verified on the basis of this index whether the object is a plant. The normalized difference vegetation index (NDVI) is usually calculated on the basis of satellite data. Here, the NDVI can instead be determined from the first sensor data and the second sensor data. In particular, the NDVI can be determined from reflection values in the near infrared range, which are included in the second sensor data, and reflection values in the red visible range, which are included in the first sensor data. Based on the NDVI, plants in the environmental region can be detected in a simple and reliable manner.
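The NDVI named above has a standard definition, (NIR − Red) / (NIR + Red), computed per pixel from the two reflectance bands. A minimal sketch of that computation, assuming the near-infrared and red bands are available as arrays of reflectance values:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Per-pixel normalized difference vegetation index.

    nir: reflectance in the near-infrared band (second sensor data),
    red: reflectance in the red visible band (first sensor data).
    Values near +1 indicate dense vegetation; bare surfaces, clothing
    and skin typically yield values near 0 or below."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    # Guard against division by zero for pixels with no signal in either band.
    safe = np.where(denom > 0, denom, 1.0)
    return np.where(denom > 0, (nir - red) / safe, 0.0)

# A vegetation-like pixel (high NIR, low red) gives a high index (~0.78 here).
print(ndvi(np.array([0.8]), np.array([0.1])))
```

This assumes the two bands are spatially registered, i.e. each NIR pixel corresponds to the same scene point as the red pixel at the same index.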
According to a further embodiment, an image is determined on the basis of the first sensor data and the second sensor data, and it is verified on the basis of the image whether the object is a plant. In particular, a composite image containing information in both the visible wavelength range and the infrared wavelength range can be provided based on the first sensor data and the second sensor data. For example, this image can describe the NDVI. It can also be provided that a so-called NRG image (NRG - near-infrared/red/green) is provided, which describes the environmental region in the near infrared, red and green wavelength ranges. Thus, rural and urban areas can easily be distinguished. Furthermore, the proportion of plants or vegetation in the environmental region can be determined.
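An NRG false-colour composite of the kind mentioned above is commonly built by stacking the near-infrared, red and green bands as the three display channels, so that strongly NIR-reflective vegetation appears bright in the first channel. A minimal sketch, assuming three registered, equally sized band arrays:

```python
import numpy as np

def nrg_image(nir, red, green):
    """Stack near-infrared, red and green bands into a false-colour NRG
    image. Vegetation, which reflects strongly in the NIR band, then
    shows up bright in the first (displayed as red) channel."""
    bands = [np.asarray(b, dtype=np.float64) for b in (nir, red, green)]
    img = np.stack(bands, axis=-1)  # shape: (H, W, 3)
    # Normalize to [0, 1] for display; guard against an all-zero image.
    peak = img.max()
    return img / peak if peak > 0 else img

# A single vegetation-like pixel: NIR dominates, so channel 0 is brightest.
px = nrg_image([[0.8]], [[0.2]], [[0.4]])
print(px.shape, px[0, 0])
```

The per-image maximum normalization here is a display convenience only; a real pipeline would use fixed radiometric calibration per band.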
An object detection apparatus according to the invention for a driver assistance system of a motor vehicle includes a detecting device and a computing device. The object detection apparatus is adapted for performing a method according to the invention and advantageous embodiments thereof. The computing device is in particular connected to the detecting device for data transmission and can receive the first sensor data and the second sensor data from the detecting device. The computing device may also identify and classify the object in the environmental region based on the first sensor data and/or the second sensor data. Further, the computing device can verify, based on the second sensor data, whether the object that is classified as static or as dynamic is a plant.
In one embodiment, the detecting device comprises a camera for providing the first sensor data and a sensor for providing the second sensor data. The camera may be of the type typically used for object detection in the automotive field. With the camera, for example, images of the environmental region can be provided as the first sensor data. With the sensor, the second sensor data in the infrared wavelength range, and in particular in the near-infrared wavelength range, can additionally be provided. The sensor may be, for example, a corresponding infrared camera.
According to an alternative embodiment, the detecting device can comprise a camera which provides both the first sensor data and the second sensor data. In this case, a camera without an infrared filter can be used as the detecting device. Such a camera can provide, for example, images that contain, in addition to the information in the visible wavelength range, also information in the infrared wavelength range.
A driver assistance system according to the invention comprises an object detection apparatus according to the invention. With the aid of the object detection apparatus objects and in particular pedestrians can be detected reliably. Further, static objects, such as walls, curbs, guardrails or the like can be detected with the object detection apparatus based on the first sensor data and/or the second sensor data. Furthermore, other motor vehicles, motorcycles or cyclists can be detected. This information can be used by the driver assistance system in order to assist the driver when driving the motor vehicle.
A motor vehicle according to the invention includes a driver assistance system according to the invention. The motor vehicle is in particular formed as a passenger car.
The preferred embodiments presented with respect to the method according to the invention and the advantages thereof correspondingly apply to the object detection apparatus according to the invention, the driver assistance system according to the invention as well as to the motor vehicle according to the invention.
Further features of the invention are apparent from the claims, the figures and the description of figures. The features and feature combinations mentioned above in the description as well as the features and feature combinations mentioned below in the description of figures and/or shown in the figures alone are usable not only in the respectively specified combination, but also in other combinations or alone without departing from the scope of the invention. Thus, implementations are also to be considered as encompassed and disclosed by the invention, which are not explicitly shown in the figures and explained, but arise from and can be generated by separated feature combinations from the explained implementations. Implementations and feature combinations are also to be considered as disclosed, which thus do not have all of the features of an originally formulated independent claim. Moreover, implementations and feature combinations are to be
considered as disclosed, in particular by the implementations set out above, which extend beyond or deviate from the feature combinations set out in the relations of the claims.
Now, the invention is explained in more detail based on preferred embodiments as well as with reference to the attached drawings.
These show in:
Fig. 1 a motor vehicle according to an embodiment of the present invention, comprising a driver assistance system with an object detection apparatus; and
Fig. 2 an image which is provided by the object detection apparatus.
In the figures, identical and functionally identical elements are provided with the same reference characters.
Fig. 1 shows a motor vehicle 1 according to an embodiment of the present invention in a plan view. Presently, the motor vehicle 1 is configured as a passenger car.
The motor vehicle 1 comprises a driver assistance system 2, which is used to assist a driver in driving the motor vehicle 1. The driver assistance system 2 includes an object detection apparatus 3. With the aid of the object detection apparatus 3, objects 9 can be detected in an environmental region 8 of the motor vehicle 1. In the present case, a pedestrian 10 and a plant 11, here a tree, are located as objects 9 in the environmental region 8 in the direction of travel in front of the motor vehicle 1.
If an object 9 is detected in the environmental region 8 by means of the object detection apparatus 3, for example a warning to the driver of the motor vehicle 1 can be output. Alternatively or additionally, the driver assistance system 2 can intervene in a steering system, a drive motor and/or a braking system of the motor vehicle 1 in order to prevent a collision between the motor vehicle 1 and the object 9.
The object detection apparatus 3 includes a camera 5, by means of which the first sensor data can be provided. The first sensor data describe the environmental region 8 in the visible wavelength range. As the first sensor data, for example, image data or images can be provided by the camera 5. In addition, the object detection apparatus 3 comprises a sensor 6, by means of which the second sensor data can be provided. The second sensor data describe the environmental region 8 in the infrared wavelength range. Here, the sensor 6 is embodied as an infrared camera. In addition, the object detection apparatus 3 includes a computing device 7. The computing device 7 is connected with the camera 5 and the sensor 6 for data transmission. By means of the computing device 7, the first sensor data from the camera 5 and the second sensor data from the sensor 6 can be received.
With the aid of the computing device 7, the objects 9 can be detected in the
environmental region 8 based on the first sensor data and/or the second sensor data. The first sensor data and the second sensor data can be provided, for example, as an image. It can also be provided that a composite image 12 is determined by means of the computing device 7 based on the first sensor data and the second sensor data. The objects 9 can be detected by means of a corresponding object detection algorithm; for example, a segmentation method can be used. By means of the computing device 7, the detected objects 9 are additionally classified. In particular, it is provided that the objects 9 are classified as static objects or as dynamic objects.
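The patent does not specify how the static/dynamic classification is performed; one very simplified way to sketch the idea is frame differencing on the object's image patch. The function name, thresholds and the two-frame setup below are illustrative assumptions only, and a real automotive system would additionally compensate the vehicle's own motion (ego-motion) before comparing frames.

```python
import numpy as np

def classify_motion(prev_patch, curr_patch, threshold=10.0, min_fraction=0.02):
    """Toy static/dynamic decision for a detected object: compare the
    object's image patch across two consecutive frames and call it
    dynamic if a sufficient fraction of pixels changed by more than
    `threshold` grey levels. Both thresholds are illustrative."""
    diff = np.abs(curr_patch.astype(np.float64) - prev_patch.astype(np.float64))
    changed = np.count_nonzero(diff > threshold) / diff.size
    return "dynamic" if changed >= min_fraction else "static"

# Identical patches -> static; a patch with a moved region -> dynamic.
a = np.zeros((8, 8))
b = a.copy()
print(classify_motion(a, b))
b[2:4, 2:4] = 255.0
print(classify_motion(a, b))
```

This also illustrates why wind-moved foliage ends up classified as dynamic: its pixels change between frames just as a walking pedestrian's do, which is exactly the ambiguity the infrared verification step is meant to resolve.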
If the objects 9 in the environmental region 8 are classified as static or as moving objects, it is additionally verified whether the object that is classified as static or as dynamic is a plant 11. For this purpose, an image 12 showing the environmental region 8 and the objects 9 is provided by the computing device 7 based on the first sensor data and the second sensor data. Such an image 12 is shown by way of example in Fig. 2. The image 12 shows the pedestrian 10 and a part of the plant 11 or the tree. It can be the case that both the pedestrian 10 and the plant 11 are classified as a static or moving object, and in particular as a pedestrian. This may be due to the plant 11, and in particular its leaves, moving as a result of environmental conditions such as wind.
If the object 9 is classified as static or as dynamic, it can be verified on the basis of the second sensor data whether the object 9 reflects or scatters radiation in the near infrared wavelength range, in particular in a wavelength range between 0.7 μm and 1.1 μm. This takes into account that plants 11 scatter radiation in this wavelength range in order to prevent overheating and the resulting damage to their cells. Alternatively or additionally, it can be verified on the basis of the first sensor data whether the object scatters light in a wavelength range between 0.4 μm and 0.7 μm. This takes into account that the chlorophyll in the leaves of plants strongly absorbs light in this wavelength range.
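The two-part verification described above can be summarized as a simple reflectance test on the object region: high mean near-infrared reflectance combined with low mean visible reflectance suggests foliage rather than a pedestrian. The following is a minimal sketch; the function name and both numeric thresholds are illustrative assumptions and are not taken from the patent.

```python
def verify_plant(mean_nir_reflectance, mean_visible_reflectance,
                 nir_min=0.5, visible_max=0.2):
    """Two-part plant check following the description: plants scatter
    strongly in the near-infrared band (0.7-1.1 um) while chlorophyll
    absorbs most visible light (0.4-0.7 um). An object region with
    high mean NIR and low mean visible reflectance is flagged as a
    plant. Both thresholds are illustrative, not from the patent."""
    return (mean_nir_reflectance >= nir_min
            and mean_visible_reflectance <= visible_max)

# Foliage-like region passes; a pedestrian-like region does not.
print(verify_plant(0.8, 0.1))
print(verify_plant(0.3, 0.3))
```

A detection flagged this way can then be suppressed as a pedestrian candidate, reducing the false positive rate without discarding genuine pedestrians, whose skin and clothing do not show this reflectance signature.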
In the present example, the image 12 is provided as a composite image that shows components both in the visible wavelength range and in the near infrared wavelength range. In particular, the image 12 can show the normalized difference vegetation index (NDVI). In the image 12, the pedestrian 10 and the plant 11 have different NDVI values. For example, the plant can have an NDVI of 1. This is schematically illustrated by the hatching 13.
In this way, objects 9 that are initially classified as static or as dynamic objects are examined in more detail, and the false positive rate in the detection of static or moving objects can be significantly reduced. This allows a reliable detection of objects 9 in the environmental region 8 of the motor vehicle 1 and thus a reliable operation of the driver assistance system 2.
Claims
1. Method for detecting objects (9) in an environmental region (8) of a motor vehicle
(1), wherein by means of a detecting device (4) of the motor vehicle (1) first sensor data describing the environmental region (8) in the visible wavelength range and second sensor data describing the environmental region (8) in the infrared wavelength range are provided, and the object (9) is detected on the basis of the first sensor data and/or the second sensor data,
characterized in that
the object (9) is classified as a static object or as a dynamic object on the basis of the first sensor data and/or the second sensor data and, if the object (9) is classified as a static object or a dynamic object, it is verified on the basis of the second sensor data whether the object is a plant (11).
2. Method according to claim 1,
characterized in that
it is verified on the basis of the second sensor data whether the object (9) is a plant (11), if the object (9) is classified as a pedestrian (10).
3. Method according to claim 1 or 2,
characterized in that
the object that is classified as static or dynamic is detected as a plant (11) on the basis of the radiation emitted from the object (9) in the near infrared wavelength range, in particular in a wavelength range between 0.7 μm and 1.1 μm.
4. Method according to any one of the preceding claims,
characterized in that
it is additionally verified on the basis of the first sensor data whether the object that is classified as static or dynamic is a plant (11).
5. Method according to claim 4,
characterized in that
the object classified as static or dynamic is detected as a plant (11) on the basis of radiation emitted from the object (9) in a wavelength range between 0.4 μm and 0.7 μm.
6. Method according to any one of the preceding claims,
characterized in that
a normalized difference vegetation index of the object classified as static or dynamic is determined on the basis of the first sensor data and the second sensor data, and it is verified on the basis of the normalized difference vegetation index whether the object is a plant (11).
7. Method according to any one of the preceding claims,
characterized in that
an image (12) is determined on the basis of the first sensor data and the second sensor data, and it is verified on the basis of the image (12) whether the object is a plant (11).
8. Object detection apparatus (3) for a driver assistance system (2) of a motor vehicle (1), comprising a detecting device (4) and a computing device (7), wherein the object detection apparatus (3) is adapted to perform a method according to any one of the preceding claims.
9. Object detection apparatus (3) according to claim 8,
characterized in that
the detecting device (4) comprises a camera (5) for providing the first sensor data and a sensor (6) for providing the second sensor data.
10. Object detection apparatus (3) according to claim 8,
characterized in that
the detecting device (4) comprises a camera which is adapted for providing the first sensor data and the second sensor data.
11. Driver assistance system (2) for a motor vehicle (1) comprising an object detection apparatus (3) according to any one of claims 8 to 10.
12. Motor vehicle (1) comprising a driver assistance system (2) according to claim 11.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102016119592.8A DE102016119592A1 (en) | 2016-10-14 | 2016-10-14 | Method for detecting objects in an environmental region of a motor vehicle taking into account sensor data in the infrared wavelength range, object recognition device, driver assistance system and motor vehicle |
PCT/EP2017/069928 WO2018068919A1 (en) | 2016-10-14 | 2017-08-07 | Method for detecting objects in an environmental region of a motor vehicle considering sensor data in the infrared wavelength range, object detection apparatus, driver assistance system as well as motor vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3526725A1 true EP3526725A1 (en) | 2019-08-21 |
Family
ID=59702673
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17757697.2A Withdrawn EP3526725A1 (en) | 2016-10-14 | 2017-08-07 | Method for detecting objects in an environmental region of a motor vehicle considering sensor data in the infrared wavelength range, object detection apparatus, driver assistance system as well as motor vehicle |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP3526725A1 (en) |
DE (1) | DE102016119592A1 (en) |
WO (1) | WO2018068919A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102020105821A1 (en) | 2020-03-04 | 2021-09-09 | Connaught Electronics Ltd. | Method and system for driving a vehicle |
DE102022127833A1 (en) | 2022-10-21 | 2024-05-02 | Bayerische Motoren Werke Aktiengesellschaft | Driver assistance system and driver assistance method for a vehicle |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2454891C (en) * | 2001-07-24 | 2009-07-21 | The Board Of Regents For Oklahoma State University | A process for in-season nutrient application based on predicted yield potential |
DE102006059033A1 (en) * | 2006-12-14 | 2008-06-19 | Volkswagen Ag | Method and system for recognizing a road user and for generating a warning |
US8350724B2 (en) * | 2009-04-02 | 2013-01-08 | GM Global Technology Operations LLC | Rear parking assist on full rear-window head-up display |
US20120155714A1 (en) * | 2009-06-11 | 2012-06-21 | Pa Llc | Vegetation indices for measuring multilayer microcrop density and growth |
BR112015030886B1 (en) | 2014-04-18 | 2022-09-27 | Autonomous Solutions, Inc. | VEHICLE, VISION SYSTEM FOR USE BY A VEHICLE AND METHOD OF STEERING A VEHICLE WITH THE USE OF A VISION SYSTEM |
DE102014223741A1 (en) * | 2014-11-20 | 2016-05-25 | Conti Temic Microelectronic Gmbh | Detecting terahertz radiation to assist a driver of a vehicle |
DE102014224857A1 (en) * | 2014-12-04 | 2016-06-09 | Conti Temic Microelectronic Gmbh | Sensor system and method for classifying road surfaces |
Also Published As
Publication number | Publication date |
---|---|
DE102016119592A1 (en) | 2018-05-03 |
WO2018068919A1 (en) | 2018-04-19 |
Legal Events

Code | Title | Description
---|---|---
PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Original code: 0009012
17P | Request for examination filed | Effective date: 20190402
AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
AX | Request for extension of the European patent | Extension states: BA ME
STAA | Information on the status of an EP patent application or granted EP patent | Status: the application has been withdrawn
18W | Application withdrawn | Effective date: 20191202