KR20170069096A - Driver Assistance Apparatus and Vehicle Having The Same - Google Patents
Driver Assistance Apparatus and Vehicle Having The Same Download PDFInfo
- Publication number
- KR20170069096A (publication) · KR1020150176349A (application)
- Authority
- KR
- South Korea
- Prior art keywords
- vehicle
- information
- unit
- vehicle driving
- processor
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/0098—Details of control systems ensuring comfort, safety or stability not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
- B60R21/01—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
- B60R21/013—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
- B60R21/0134—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to imminent contact with an obstacle, e.g. using radar systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W10/00—Conjoint control of vehicle sub-units of different type or different function
- B60W10/18—Conjoint control of vehicle sub-units of different type or different function including control of braking systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B60W2050/0081—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
-
- B60W2550/22—
Abstract
A vehicle driving assistance apparatus according to an embodiment of the present invention includes: a sensor unit for sensing a first object around a vehicle; a processor for recognizing the sensed first object and generating risk information for the first object based on the recognized movement state of the first object; and a communication unit for transmitting the risk information generated by the processor to the outside.
Description
BACKGROUND OF THE INVENTION
A vehicle is an apparatus that moves a user in a desired direction. A typical example is an automobile.
Automobiles are classified into internal combustion engine vehicles, external combustion engine vehicles, gas turbine vehicles, and electric vehicles according to the prime mover used.
Electric vehicles use electric energy to drive an electric motor, and include pure electric vehicles, hybrid electric vehicles (HEV), plug-in hybrid electric vehicles (PHEV), and hydrogen fuel cell electric vehicles (FCEV).
Meanwhile, in recent years, the development of intelligent vehicles (smart vehicles) has been actively pursued for the safety and convenience of drivers, pedestrians, and the like.
An intelligent vehicle, also called a smart vehicle, is a cutting-edge vehicle that combines information technology (IT). Intelligent vehicles provide optimal transportation efficiency through interworking with intelligent transportation systems (ITS) as well as by introducing advanced systems into the vehicle itself.
For example, intelligent vehicles maximize the safety of pedestrians as well as occupants through the development of key safety technologies such as Adaptive Cruise Control (ACC), obstacle detection, and collision detection or mitigation.
In addition, V2V (Vehicle to Vehicle) technology, in which nearby vehicles drive while exchanging wireless communication with each other, has recently been attracting attention. With V2V technology, vehicles can maintain a constant distance from each other on the road. In particular, sudden traffic accidents can be prevented by sharing the position and speed information of nearby vehicles in real time.
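The real-time exchange of position and speed information described above can be illustrated with a minimal sketch of a V2V status message. The message fields, names, and JSON encoding below are illustrative assumptions for explanation only; they are not taken from the patent or from any V2V standard.

```python
import json
import time
from dataclasses import dataclass, asdict

# Hypothetical V2V status message; the field names are assumptions,
# chosen only to illustrate the kind of data shared in real time.
@dataclass
class V2VStatus:
    vehicle_id: str
    latitude: float
    longitude: float
    speed_mps: float       # current speed in metres per second
    heading_deg: float     # heading, 0 = north, clockwise
    timestamp: float

def encode_status(msg: V2VStatus) -> bytes:
    """Serialize a status message for periodic broadcast."""
    return json.dumps(asdict(msg)).encode("utf-8")

def decode_status(payload: bytes) -> V2VStatus:
    """Rebuild the message on the receiving vehicle."""
    return V2VStatus(**json.loads(payload.decode("utf-8")))

msg = V2VStatus("KR-1234", 37.5665, 126.9780, 13.9, 90.0, time.time())
round_trip = decode_status(encode_status(msg))
```

A receiving vehicle would decode such periodic broadcasts to maintain the positions and speeds of its neighbours.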
On the other hand, many accidents that occur while driving are caused by objects that suddenly enter the lane from a blind spot.
That is, with current vehicle technology it is difficult to recognize an object entering from a blind spot, and it is difficult to respond to a jaywalking pedestrian.
Embodiments of the present invention provide a vehicle driving assistance apparatus capable of recognizing and tracking information on an object that has entered a lane and notifying neighboring vehicles of the corresponding danger information, and a vehicle including the same.
In addition, embodiments of the present invention provide a vehicle driving assistance apparatus capable of providing warning information about an approaching vehicle to an object that has entered a lane, and a vehicle including the same.
The technical objectives to be achieved by the embodiments are not limited to those mentioned above, and other technical objectives not mentioned will be clearly understood from the following description by those skilled in the art to which the embodiments belong.
A vehicle driving assistance apparatus according to an embodiment of the present invention includes: a sensor unit for sensing a first object around a vehicle; a processor for recognizing the sensed first object and generating risk information for the first object based on the recognized movement state of the first object; and a communication unit for transmitting the risk information generated by the processor to the outside.
A vehicle driving assistance apparatus according to another embodiment of the present invention includes: a sensor unit for sensing an object existing in the vicinity of the vehicle; a display unit for displaying an image of the traffic situation around the vehicle; and a processor for displaying, when the object is sensed, information on the object on the image based on the position of the sensed object, wherein the processor divides the image into a plurality of regions, displays the information on the object in the region corresponding to the position of the object among the plurality of regions, and displays the information on the image in different ways according to the degree of danger of the object.
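The division of the traffic-situation image into a plurality of regions can be sketched as a simple grid lookup that maps an object's pixel position to one cell of the grid. The 3x3 grid size below is an illustrative assumption; the patent does not specify the number of regions.

```python
def region_index(x: float, y: float, width: int, height: int,
                 cols: int = 3, rows: int = 3):
    """Map an object's pixel position to one cell of a rows x cols grid
    laid over the traffic-situation image (grid size is an assumption)."""
    col = min(int(x * cols / width), cols - 1)
    row = min(int(y * rows / height), rows - 1)
    return row, col

# An object detected at pixel (500, 50) in a 600x300 image falls in
# the top-right cell of a 3x3 grid.
cell = region_index(500, 50, 600, 300)
```

The display unit would then draw the object's information (and, per the degree of danger, a different color or icon) inside the returned cell.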
According to an embodiment of the present invention, an object that has entered a lane (for example, a pedestrian, a bicycle, or a motorcycle) can be detected from a camera image, the moving direction of the detected object can be predicted, and the resulting danger information can be transmitted to nearby vehicles, thereby reducing the accident rate for pedestrians entering from positions that are difficult to predict.
Further, according to an embodiment of the present invention, a more stable driving environment can be provided by combining inter-vehicle communication technology with the pedestrian assistance function.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 shows the appearance of a vehicle equipped with a vehicle driving assistance apparatus according to an embodiment of the present invention.
FIG. 2 is a block diagram of a vehicle driving assistance apparatus according to an embodiment of the present invention.
FIG. 3 is a plan view of a vehicle having a sensor unit according to an embodiment of the present invention.
FIGS. 5 and 6 are views for explaining an example of a method of generating image information from a camera image according to an embodiment of the present invention.
FIG. 7 is a view for explaining an indicator output unit according to an embodiment of the present invention.
FIG. 8 is a flowchart for explaining the steps of transmitting the risk information according to an embodiment of the present invention.
FIG. 9 is a view for explaining an object entering from a blind spot according to an embodiment of the present invention.
FIG. 10 is a view for explaining an object crossing at an unauthorized or hard-to-recognize position according to an embodiment of the present invention.
FIG. 11 is a diagram for explaining a communication structure according to an embodiment of the present invention.
FIG. 12 is a view for explaining a communication structure between vehicles according to an embodiment of the present invention.
FIG. 13 is a diagram for explaining risk levels according to an embodiment of the present invention.
FIG. 14 is a view for explaining danger information displayed through an internal display unit according to an embodiment of the present invention.
FIGS. 15 to 19 are views for explaining notification information according to an embodiment of the present invention.
FIGS. 20 and 21 are views showing a traffic situation image according to an embodiment of the present invention.
FIG. 22 is an example of an internal block diagram of the vehicle of FIG. 1 including the above-described vehicle driving assistance apparatus.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals designate identical or similar elements and redundant description thereof is omitted. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably only for ease of drafting the specification, and do not themselves have distinct meanings or roles. In describing the embodiments, a detailed description of related known art will be omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. The accompanying drawings are intended only to facilitate understanding of the embodiments disclosed herein and do not limit the technical idea disclosed herein; the invention should be understood to cover all modifications, equivalents, and alternatives falling within its spirit and scope.
Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.
When an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that no intervening elements are present.
The singular expressions include plural expressions unless the context clearly dictates otherwise.
In the present application, terms such as "comprises" and "having" are intended to specify the presence of stated features, numbers, steps, operations, elements, components, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
The vehicle described herein may include both an automobile and a motorcycle. Hereinafter, the description will focus on automobiles.
The vehicle described in the present specification may be a concept including both an internal combustion engine vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as a power source, and an electric vehicle having an electric motor as a power source.
In the following description, the left side of the vehicle means the left side in the running direction of the vehicle, and the right side of the vehicle means the right side in the running direction of the vehicle.
Unless otherwise mentioned in the following description, the LHD (Left Hand Drive) vehicle will be mainly described.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Hereinafter, a vehicle driving assistance apparatus according to an embodiment will be described in detail with reference to the drawings.
FIG. 1 shows the appearance of a vehicle equipped with a vehicle driving assistance apparatus according to an embodiment of the present invention.
Referring to FIG. 1, the vehicle 700 according to the embodiment may include the vehicle driving assistance apparatus 100.
In the embodiment, the vehicle
A set of some of the units of the
2) of the vehicle
In addition, the vehicle driving
In other words, the other nearby vehicles may not be able to detect the object by the
For convenience of explanation, it is assumed that the vehicle driving
The vehicle driving assistance apparatus 100 senses an object in the vicinity of the vehicle and generates danger information on the object based on the sensed movement state of the object.
At this time, the vehicle driving
Here, the moving state of the object may include at least one of moving direction and moving speed of the object. Preferably, the moving state may include a moving speed of the object.
Then, the vehicle driving assistance apparatus 100 determines a risk level of the object based on the moving speed of the object, and generates risk information for the object including information on the determined risk level.
The risk level can be defined as shown in Table 1 below.
Risk level 1: pedestrian, stroller, car, etc.
Risk level 2: bicycle, running pedestrian, etc.
Risk level 3: motorcycle, car, bicycle, etc.
In the above, the higher the risk level, the greater the risk of injury in the event of an accident.
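The mapping from an object's moving speed to a risk level can be sketched as follows. The numeric thresholds (2 m/s and 6 m/s) are illustrative assumptions: the patent only states that the risk level is determined from the moving speed and that higher levels mean a greater risk of injury.

```python
def risk_level(speed_mps: float) -> int:
    """Assign a risk level from the object's moving speed.
    Thresholds are illustrative assumptions, not from the patent."""
    if speed_mps < 2.0:       # walking pedestrian, stroller
        return 1
    if speed_mps < 6.0:       # bicycle, running pedestrian
        return 2
    return 3                  # motorcycle, car

# Walking (1 m/s), cycling (4 m/s), and motorcycle (12 m/s) speeds
# map to increasing risk levels.
levels = [risk_level(v) for v in (1.0, 4.0, 12.0)]
```

The determined level would then be embedded in the risk information that the communication unit broadcasts to nearby vehicles.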
The vehicle driving assistance apparatus 100 also detects a nearby vehicle existing in the vicinity of the vehicle 700. The nearby vehicle can be detected using an image photographed through a camera, or alternatively through a sensor such as a proximity sensor.
Further, when the surrounding vehicle is detected, the vehicle driving
At this time, the vehicle driving
That is, the vehicle driving
In other words, the vehicle driving
At this time, the indicator I is output in order to provide warning information to the notification target determined in the vehicle driving
Also, the notification status may include a situation where a specific object exists around the
In addition, the indicator display method refers to various methods of determining an indicator display position, size, brightness, saturation, color, phase, and indicator image and displaying the indicator outside the vehicle.
Hereinafter, each unit constituting the vehicle driving
FIG. 2 is a block diagram of a vehicle driving assistance apparatus according to an embodiment of the present invention. FIG. 3 is a plan view of a vehicle having a sensor unit according to an embodiment of the present invention. FIGS. 5 and 6 are views for explaining an example of a method of generating image information from a camera image according to an embodiment of the present invention, and FIG. 7 is a view for explaining an indicator output unit according to an embodiment of the present invention.
2, the vehicle driving
Returning to the description of the configuration, first, the vehicle driving
For example, the user inputs a transmission condition for transmitting the danger information through the
The
Next, the vehicle driving
The vehicle driving
In addition, the vehicle driving
In addition, the vehicle driving
The
At this time, when the object is sensed and the risk information for the object is generated, the
Accordingly, when the walking signal is changed in a situation where the pedestrian corresponding to the object does not cross the pedestrian crossing, the
The
In addition, the
In addition, the
In particular, the
The
Here, the common vehicle running information may include at least one of direction information, position information, vehicle speed information, acceleration information, movement route information, forward / backward information, adjacent vehicle information, and turn signal information.
Accordingly, the vehicle driving
In addition, when the user is boarding the
The
More specifically, the communication unit can communicate wirelessly using a wireless data communication scheme. Wireless data communication schemes include, but are not limited to, technical standards or communication schemes for mobile communication, e.g., GSM (Global System for Mobile communications), CDMA (Code Division Multiple Access), CDMA2000, EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), WCDMA (Wideband CDMA), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced).
In addition, the
In addition, the
In addition, the vehicle driving
Next, the vehicle driving
In detail, the vehicle driving
Such navigation information and sensor information may be used as additional information by the
Further, the vehicle driving
To this end, the
In detail, the
The
Here, the sensor information includes direction information of the
Also, the sensor information may be obtained from a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/backward sensor, a wheel sensor, a vehicle speed sensor, a vehicle body inclination sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle internal temperature sensor, a vehicle internal humidity sensor, a door sensor, and the like. The position module may include a GPS module for receiving GPS information.
The
In addition, the
Next, the
The
And this application program is stored in the
In addition, the
Meanwhile, the
For example, when the image acquired through the
The
In addition, the vehicle driving
Next, the
In detail, the
In detail, the
For example, the
The biometric information detected by the
Next, the vehicle driving
As described above, the vehicle driving
The
The
The
3, the
The
For example, the distance sensor may be a laser sensor, which measures the distance to an object using a time-of-flight (TOF) method and/or a phase-shift method according to the laser signal modulation scheme. More specifically, the time-of-flight method emits a pulsed laser signal and measures the time taken for pulses reflected from objects within the measurement range to arrive at the receiver, thereby measuring the distance to the object.
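The time-of-flight relationship described above reduces to distance = (speed of light × round-trip time) / 2, since the pulse travels to the object and back. A minimal sketch:

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_s: float) -> float:
    """Distance from a pulsed-laser time-of-flight measurement:
    the pulse travels out and back, so halve the total path."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

# A reflected pulse arriving 200 ns after emission puts the object
# at roughly 30 m.
d = tof_distance(200e-9)
```

Phase-shift ranging, the alternative mentioned above, instead infers the same round-trip delay from the phase difference of a modulated continuous beam.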
On the other hand, whether or not the object corresponds to the notification target and the attribute of the object for determining the notification status can be obtained by analyzing the image captured by the
To this end, the
More specifically, the vehicle driving
Here, the image information may be included in the sensor information as at least one of the type of the object, the traffic signal information displayed by the object, the distance between the object and the vehicle, and the position of the object.
More specifically, the
In order for the
The
The
3, a plurality of
The
The
Further, the
The
Further, the
Such a
In an embodiment, the
Hereinafter, with reference to Figs. 4 to 6, a method for the
4, the
The
This vehicle driving
5, the
An image preprocessor 410 may receive an image from the camera and perform preprocessing on it.
Specifically, the image preprocessor 410 may perform noise reduction, rectification, calibration, color enhancement, color space conversion (CSC), interpolation, and the like on the image.
The disparity calculator 420 receives the images processed by the image preprocessor 410, performs stereo matching on the received images, and obtains a disparity map based on the stereo matching. That is, disparity information about the stereo images of the area in front of the vehicle can be obtained.
At this time, the stereo matching may be performed on a pixel-by-pixel basis of stereo images or on a predetermined block basis. On the other hand, the disparity map may mean a map in which binaural parallax information of stereo images, i.e., left and right images, is numerically expressed.
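Block-based stereo matching as described above can be sketched with a sum-of-absolute-differences (SAD) search along an image row: a block from the left image is slid across the right image, and the shift with the smallest difference is the disparity. The block size and search range below are illustrative assumptions.

```python
import numpy as np

def block_disparity(left, right, y, x, block=3, max_disp=8):
    """Disparity of the block centred at (y, x) in the left image:
    slide the block along the same row of the right image and keep
    the shift with the smallest sum of absolute differences (SAD)."""
    h = block // 2
    ref = left[y-h:y+h+1, x-h:x+h+1].astype(np.int32)
    best_d, best_sad = 0, None
    for d in range(max_disp + 1):
        if x - h - d < 0:
            break
        cand = right[y-h:y+h+1, x-h-d:x+h+1-d].astype(np.int32)
        sad = np.abs(ref - cand).sum()
        if best_sad is None or sad < best_sad:
            best_d, best_sad = d, sad
    return best_d

# Synthetic pair: the right image is the left shifted 2 px leftward,
# so the recovered disparity is 2.
left = np.tile(np.arange(32, dtype=np.uint8), (9, 1)) * 7
right = np.roll(left, -2, axis=1)
d = block_disparity(left, right, 4, 16)
```

Repeating this per pixel (or per block) yields the disparity map, in which larger disparity values correspond to nearer objects.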
The segmentation unit 432 may perform segmenting and clustering on at least one of the images based on the disparity information from the disparity calculating unit 420. [
Specifically, the segmentation unit 432 can separate the background and the foreground for at least one of the stereo images based on the disparity information.
For example, an area in the disparity map whose disparity information is equal to or less than a predetermined value can be treated as the background and excluded, whereby the foreground is relatively separated. As another example, an area in the disparity map whose disparity information is equal to or greater than a predetermined value can be treated as the foreground and extracted, whereby the foreground is separated.
In this way, by separating the foreground and the background based on the disparity information extracted from the stereo images, the signal processing time and the amount of signal processing for subsequent object detection can be reduced.
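The threshold-based background exclusion can be sketched as a boolean mask over the disparity map; the threshold value used here is an illustrative assumption.

```python
import numpy as np

def split_foreground(disparity_map, threshold):
    """Foreground/background split used before object detection:
    pixels whose disparity (inverse depth) is below the threshold
    are treated as distant background and excluded."""
    return disparity_map >= threshold

# Tiny disparity map: the high-disparity (near) patch on the right
# survives as foreground; the low-disparity background is dropped.
disp = np.array([[1, 1, 8],
                 [1, 9, 8],
                 [1, 1, 1]])
mask = split_foreground(disp, threshold=5)
```

Object detection then runs only on the masked-in pixels, which is what shortens the subsequent processing.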
Next, the object detector 434 can detect the object based on the image segment from the segmentation unit 432. [
That is, the object detecting unit 434 can detect an object for at least one of the images based on the disparity information.
More specifically, the object detecting unit 434 can detect an object for at least one of the images. For example, an object can be detected from a foreground separated by an image segment.
The object verification unit 436 then classifies and verifies the isolated object.
For this purpose, the object verification unit 436 may use an identification method based on a neural network, a support vector machine (SVM) method, an AdaBoost method using Haar-like features, or a histograms of oriented gradients (HOG) technique.
On the other hand, the object checking unit 436 can check the objects by comparing the objects stored in the
For example, the object identifying unit 436 can identify nearby vehicles, lanes, roads, signs, hazardous areas, tunnels, etc., located in the vicinity of the vehicle.
The object tracking unit 440 may perform tracking of the verified object. For example, it sequentially verifies the object in the acquired stereo images, calculates the motion or motion vector of the verified object, and tracks the movement of the object based on the calculated motion or motion vector. Accordingly, it is possible to track nearby vehicles, lanes, roads, signs, dangerous areas, tunnels, and the like located in the vicinity of the vehicle.
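The motion-vector-based tracking can be sketched as a constant-velocity extrapolation from the last two tracked positions. The constant-velocity model is an assumption; the patent only states that a motion vector of the verified object is calculated and used to track its movement.

```python
def predict_next(positions, dt=1.0):
    """Linear motion-vector prediction: from the object's last two
    tracked positions, extrapolate where it will be after dt frames.
    (Constant-velocity model is an illustrative assumption.)"""
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return (x1 + vx * dt, y1 + vy * dt)

# An object tracked at (10, 5) then (12, 6) is predicted at (14, 7)
# one frame later.
track = [(10.0, 5.0), (12.0, 6.0)]
nxt = predict_next(track)
```

Such a predicted position is what allows the apparatus to anticipate an object's moving direction and generate danger information before the object actually enters the lane.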
Next, the application unit 450 can calculate the risk of the vehicle and the like based on various objects located in the vicinity of the vehicle, for example, other vehicles, lanes, roads, signs and the like. It is also possible to calculate the possibility of a collision with a preceding vehicle, whether the vehicle is slipping or the like.
Then, the application unit 450 can output a message or the like for notifying the user to the user as vehicle driving assistance information, based on the calculated risk, possibility of collision, or slip. Alternatively, a control signal for attitude control or running control of the vehicle may be generated as the vehicle control information.
The image preprocessor 410, the disparity calculator 420, the segmentation unit 432, the object detection unit 434, the object verification unit 436, the object tracking unit 440, and the application unit 450 may be internal components of an image processing unit in the processor.
The
Referring to FIG. 6, during the first frame period, the
The disparity calculating unit 420 in the
The
On the other hand, when such a disparity map is displayed, it may be displayed so as to have a higher luminance as the disparity level becomes larger, and a lower luminance as the disparity level becomes smaller.
In the figure, first to
The segmentation unit 432, the object detection unit 434, and the object verification unit 436 perform segmentation, object detection, and object verification for at least one of the stereo images FR1a and FR1b based on the disparity map.
In the figure, using the
That is, the first to
With the image processing as described above, the vehicle driving
In addition, the vehicle driving
In addition, when the vehicle driving
To this end, the output unit may include an
First, the
The
In addition, the
Further, the
Hereinafter, an indicator display method of the
In detail, the
Referring to FIG. 7, a
A
The indicator output units can display the indicator around the vehicle by irradiating laser beams to the arranged positions.
In addition, the seventh
The plurality of
A
That is, the plurality of
Accordingly, the
For example, the
The arrangement of the above-described indicator output units is only an example; other embodiments may include only some of these indicator output units, or various other arrangements capable of displaying indicators on the front, rear, left, and right sides of the vehicle.
The
The
On the other hand, the
In addition, the
To this end, the
First, the
Once the first object is detected, the
The
The
The
Such a
That is, the
The projection image displayed on the
In addition, the
That is, the
Also, the
More specifically, the
The
The
In addition, the vehicle driving
In detail, the
For example, the
In addition, the
In particular, the
That is, the
That is, in general, when an object moves between vehicles stopped or parked on the shoulder of a lane, the object is hidden by the stopped or parked vehicles, creating a situation in which a passing vehicle or its driver cannot detect the object.
Accordingly, in the present invention, power is continuously supplied to specific components so that, even in a stopped or parked vehicle, detection of the object and transmission of danger information can be performed.
Finally, the vehicle driving
In addition,
Such a
The
In addition to the operations associated with the application programs stored in the
Hereinafter, the operation of the vehicle driving
FIG. 8 is a flowchart for explaining steps of transmitting the risk information of the
First, referring to FIG. 8, the
The object may be a pedestrian suddenly entering the lane from a blind spot, or a pedestrian entering the lane during jaywalking or a signal change.
FIG. 9 is a view for explaining an object entering from a blind spot according to an embodiment of the present invention, and FIG. 10 is a view for explaining an object crossing at an unauthorized or unrecognizable position according to an embodiment of the present invention.
Referring to FIG. 9, a
However, the
As shown in (b), the
However, the
Referring to FIG. 10, the
However, the
As shown in (b), the
However, the
In such a situation, the vehicle driving
At this time, the vehicle driving assistant (100) determines a risk level based on the moving state of the first object (900), generates risk information including information on the determined risk level, and transmits the generated risk information.
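The risk-level determination described here, based on the object's moving state (direction and speed, per the claims), might be sketched as follows. The thresholds, level names, and message fields are assumptions for illustration only.

```python
# Illustrative risk classification from an object's moving state.
# Thresholds and level names are hypothetical, not from the patent.

def risk_level(moving_toward_lane, speed_mps):
    """Classify risk from the object's direction and speed of movement."""
    if not moving_toward_lane:
        return "LOW"
    if speed_mps > 2.0:   # e.g. a pedestrian running toward the lane
        return "HIGH"
    return "MEDIUM"

def make_risk_info(object_id, position, moving_toward_lane, speed_mps):
    """Bundle the risk level with position and movement state for broadcast."""
    return {
        "object": object_id,
        "position": position,
        "speed_mps": speed_mps,
        "risk": risk_level(moving_toward_lane, speed_mps),
    }
```

A message built this way could then be handed to the communication unit for transmission to nearby vehicles.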
FIG. 11 is a diagram illustrating a communication structure according to an embodiment of the present invention.
Referring to FIG. 11, in a state where the
Accordingly, the detection of the
To this end, when the object is sensed, the vehicle driving
That is, the vehicle driving
Alternatively, the
That is, the vehicle driving
If it is detected that the vehicle is in danger of collision with the object, the
In addition, if no other vehicle with a risk of collision with the object is sensed, in other words, the
Then, the
In addition, the
On the road, there is a
Accordingly, the
FIG. 12 is a view for explaining a communication structure between the
Referring to FIG. 12, a
Accordingly, the vehicle driving
Meanwhile, when the vehicle driving
FIG. 13 is a diagram for explaining the degree of danger according to an embodiment of the present invention.
Referring to FIG. 13, the
At this time, the peripheral region of the
In other words, around the
Accordingly, the vehicle driving
Meanwhile, the vehicle driving
In addition, the vehicle driving
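The FIG. 13 discussion above, and the corresponding claim, divide the peripheral region of the vehicle into a plurality of regions, each with its own risk level. A minimal sketch treats the regions as distance bands around the vehicle; the band radii and level names are assumed values for illustration.

```python
# Illustrative zoning of the vehicle's surroundings into distance bands,
# each carrying a risk level. Radii and levels are hypothetical.

ZONES = [
    (10.0, "HIGH"),    # within 10 m of the vehicle
    (25.0, "MEDIUM"),  # 10-25 m
    (50.0, "LOW"),     # 25-50 m
]

def zone_risk(distance_m):
    """Risk level assigned to an adjacent vehicle at the given distance."""
    for outer_radius, level in ZONES:
        if distance_m <= outer_radius:
            return level
    return None  # outside the monitored peripheral region
```

Risk information containing the level for each band could then be transmitted to the adjacent vehicles located in that band.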
FIG. 14 is a view for explaining danger information displayed through an internal display unit according to an embodiment of the present invention.
Referring to FIG. 14, the
At this time, the vehicle function may include an automatic full brake function.
For example, when, based on the current speed of the vehicle and the time to collision, the brake operating pressure is insufficient and the vehicle cannot be stopped within the time to collision even if the brake is applied immediately, the vehicle driving assistance apparatus (100) automatically activates the full brake so that the vehicle can stop before colliding with the object.
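This brake decision can be sketched with the usual constant-deceleration stopping-distance formula, v²/(2a): if the vehicle cannot stop within the distance it will travel before the predicted impact, the full brake is engaged. The deceleration value below is an assumption for illustration, not a figure from the patent.

```python
# Illustrative full-brake decision: compare the stopping distance under
# assumed constant deceleration with the distance available before the
# predicted collision (current speed x time to collision).

def needs_full_brake(speed_mps, time_to_collision_s, decel_mps2=6.0):
    """True when ordinary braking cannot stop the car before impact."""
    stopping_distance = speed_mps ** 2 / (2.0 * decel_mps2)
    distance_to_impact = speed_mps * time_to_collision_s
    return stopping_distance > distance_to_impact

# At 20 m/s with 1 s to collision the car needs about 33 m to stop but
# has only 20 m, so the full brake would be engaged.
urgent = needs_full_brake(20.0, 1.0)
```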
On the other hand, the vehicle driving
FIGS. 15 to 19 are diagrams for explaining notification information according to an embodiment of the present invention.
Referring to FIG. 15, the
Referring to FIG. 16, the vehicle driving
Referring to FIG. 17, the
Referring to FIG. 18, the vehicle driving
As shown in FIG. 18(a), the notification information may be stop information (E5) for generating a stop signal according to the presence of the object. Alternatively, it may be the information (E6) shown in FIG. 18(b).
Referring to FIG. 19, the vehicle driving
The transmitted risk information may include not only the above-described information on the risk level but also the location information and movement state information of the object.
Meanwhile, the vehicle driving
FIGS. 20 and 21 are views showing a traffic situation image according to an embodiment of the present invention.
Referring to FIGS. 20 and 21, a plurality of objects are positioned on the road based on the
At this time, the vehicle driving
That is, the vehicle driving
Then, the vehicle driving
At this time, the image is divided into a plurality of areas, and the information of each object is displayed in the divided area corresponding to that object's position.
At this time, the information of the object is displayed on the image in different ways depending on the degree of danger of the object. In other words, the area in which the information on the object is displayed among the divided areas on the image is shown in a different color or pattern according to the degree of danger of the object.
For example, the area where the high-risk object is located is displayed in red, and the area where the low-risk object is located may be displayed in blue. Alternatively, as shown in FIG. 21, different patterns may be displayed according to the degree of danger.
When the areas in which different objects are displayed on the image overlap each other, that is, when the area in which the first object is displayed and the area in which the second object is displayed overlap, information indicating a risk of mutual collision is transmitted to the first object and the second object.
In other words, when the areas overlap each other, it means that the first object and the second object may collide with each other. Therefore, the
At this time, an area in which no object is located among the divided areas on the image is displayed as an empty area. Here, the empty area may be classified into a safe area, into which the vehicle can move, and a danger area, into which the vehicle cannot move or into which movement is dangerous.
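The overlap condition described above can be sketched as a standard axis-aligned rectangle intersection test on the two displayed areas. Representing an area as (x1, y1, x2, y2) is an assumption for illustration.

```python
# Illustrative overlap test for two displayed object areas on the image.
# Each rectangle is (x1, y1, x2, y2) with x1 < x2 and y1 < y2.

def rects_overlap(a, b):
    """True when the two axis-aligned rectangles intersect."""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    return ax1 < bx2 and bx1 < ax2 and ay1 < by2 and by1 < ay2

# Areas (0,0)-(4,4) and (3,3)-(6,6) share ground, signalling that the
# two objects are at risk of colliding with each other.
risk = rects_overlap((0, 0, 4, 4), (3, 3, 6, 6))
```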
Accordingly, if there is a risk of a collision in the surroundings as described above, the vehicle driving
According to the embodiment of the present invention, it is possible to detect an object (for example, a pedestrian, a bicycle, or a motorcycle) that has entered the lane through a camera image, predict the moving direction of the detected object, and transmit the corresponding danger information to nearby vehicles, thereby reducing the rate of accidents involving pedestrians entering from positions that are difficult to predict.
Further, according to the embodiment of the present invention, by utilizing the inter-vehicle communication technology and the pedestrian assist function, a more stable driving environment can be provided.
Referring to Fig. 22, the above-described vehicle driving
The
The
The
The
The
Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (Worldwide Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced); the wireless Internet module (712) transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above. For example, the
The short-
The short
The
The
The light receiving section can convert the light signal into an electric signal and receive the information. The light receiving unit may include a photodiode (PD) for receiving light. Photodiodes can convert light into electrical signals. For example, the light receiving section can receive information of the front vehicle through light emitted from the light source included in the front vehicle.
The light emitting unit may include at least one light emitting element for converting an electric signal into an optical signal. Here, the light emitting element is preferably an LED (Light Emitting Diode). The optical transmitter converts the electrical signal into an optical signal and transmits it to the outside. For example, the optical transmitter can emit the optical signal to the outside through the blinking of the light emitting element corresponding to the predetermined frequency. According to an embodiment, the light emitting portion may include a plurality of light emitting element arrays. According to the embodiment, the light emitting portion can be integrated with the lamp provided in the vehicle. For example, the light emitting portion may be at least one of a headlight, a tail light, a brake light, a turn signal lamp, and a car light. For example, the
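As a hedged illustration of communication via a blinking light-emitting element and a receiving photodiode, a byte can be encoded with simple on-off keying, one LED state per bit per symbol period. This is only a sketch of the idea; it is not the patent's actual modulation scheme, and all names are illustrative.

```python
# Illustrative on-off keying: the transmitter turns a byte into a
# sequence of LED states, and the receiver reverses the mapping from
# the photodiode's detected light levels.

def byte_to_blinks(value):
    """One byte -> 8 LED states (MSB first); 1 = lit, 0 = dark."""
    return [(value >> (7 - i)) & 1 for i in range(8)]

def blinks_to_byte(states):
    """Recover the byte from the 8 detected light states."""
    value = 0
    for bit in states:
        value = (value << 1) | bit
    return value

# Round trip: what the LED transmits is what the photodiode recovers.
original = 0b10110010
recovered = blinks_to_byte(byte_to_blinks(original))
```

In practice the blinking would occur at a predetermined frequency, as the description notes, so that the receiver can sample each symbol period reliably.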
The
The driving operation means 721 receives a user input for driving the vehicle. The driving operation means 721 may include a steering input means 721A, a shift input means 721D, an acceleration input means 721C, and a brake input means 721B.
The steering input means 721A receives the input of the traveling direction of the vehicle from the user. The steering input means 721A is preferably formed in a wheel shape so that steering input is possible by rotation. According to the embodiment, the steering input means 721A may be formed of a touch screen, a touch pad, or a button.
The shift input means 721D receives inputs of parking (P), forward (D), neutral (N), and reverse (R) of the vehicle from the user. The shift input means 721D is preferably formed in a lever shape. According to an embodiment, the shift input means 721D may be formed of a touch screen, a touch pad, or a button.
The acceleration input means 721C receives an input for acceleration of the vehicle from the user. The brake inputting means 721B receives an input for decelerating the vehicle from the user. The acceleration input means 721C and the brake input means 721B are preferably formed in the form of a pedal. According to the embodiment, the acceleration input means 721C or the brake input means 721B may be formed of a touch screen, a touch pad, or a button.
The
The
Referring to FIG. 22, the
The
The
The
The
Thereby, the
In addition, the
The
The
The
The
The
Meanwhile, the
Meanwhile, according to the embodiment, the
The
The
The
The power
For example, when the fossil fuel-based engine (not shown) is a power source, the power
As another example, when the electric motor (not shown) is a power source, the power
The
The
The
The air
The
The
The
The
The
The
Meanwhile, the
The
The
The
The
Alternatively, the
The
The AVN (Audio Video Navigation)
The features, structures, effects and the like described in the embodiments are included in at least one embodiment and are not necessarily limited to only one embodiment. Furthermore, the features, structures, effects and the like illustrated in the embodiments can be combined and modified by other persons skilled in the art to which the embodiments belong. Accordingly, the contents of such combinations and modifications should be construed as being included in the scope of the embodiments.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those skilled in the art that various modifications and applications in form and details may be made therein without departing from the spirit and scope of the invention. For example, each component specifically shown in the embodiments can be modified and implemented. It is to be understood that differences relating to such modifications and applications fall within the scope of the present invention.
Claims (19)
A processor for recognizing the sensed first object and generating risk information for the first object based on the recognized movement state of the first object; And
And a communication unit for transmitting the risk information generated through the processor to the outside
Vehicle driving assistance device.
The moving state may include:
And at least one of a moving direction and a moving speed of the first object,
The processor comprising:
Determines a risk level of the first object based on at least one of a moving direction and a moving speed of the first object, and generates risk information including information on the determined risk level
Vehicle driving assistance device.
The processor comprising:
When the ignition of the vehicle is turned off, driving power is supplied to the sensor unit for sensing the first object and to the communication unit for transmitting the danger information
Vehicle driving assistance device.
The processor comprising:
Detecting at least one surrounding vehicle existing in the vicinity of the vehicle, and transmitting the danger information to the at least one nearby vehicle
Vehicle driving assistance device.
The processor comprising:
Detecting a neighboring vehicle that is moving to the position of the first object among the sensed neighboring vehicles, and transmitting the risk information to the detected adjacent vehicle
Vehicle driving assistance device.
The processor comprising:
And notifying the presence of the adjacent vehicle to the first object
Vehicle driving assistance device.
The processor comprising:
A lamp driving signal is outputted so that light is generated in a position of the first object and a moving direction of the first object
Vehicle driving assistance device.
Further comprising an announcement information output unit for outputting announcement information informing existence of the first object to the adjacent vehicle,
The announcement information output unit,
And an indicator output unit for displaying an indicator according to the first object outside the vehicle
Vehicle driving assistance device.
The processor comprising:
Detecting a danger level of the first object with respect to the adjacent vehicle on the basis of the moving state of the adjacent vehicle and the moving state of the first object and transmitting the danger information including the detected danger level to the adjacent vehicle doing
Vehicle driving assistance device.
The processor comprising:
A peripheral region of the vehicle is divided into a plurality of regions according to positions,
Setting a different risk level for each of the plurality of divided areas,
And transmits information on the set risk level to an adjacent vehicle existing in each of the divided areas
Vehicle driving assistance device.
Wherein,
The risk information is transmitted to a traffic server existing in the vicinity of the vehicle and the traffic light system is changed based on the transmitted risk information
Vehicle driving assistance device.
Further comprising a display unit for receiving the danger information on the second object recognized by the adjacent vehicle from the adjacent vehicle and displaying the received danger information
Vehicle driving assistance device.
The displayed risk information may include,
Wherein the second object includes at least one of position information of the second object, moving state information of the second object, and danger level information of the second object
Vehicle driving assistance device.
The displayed risk information may include,
Further comprising collision time information on the second object,
The processor comprising:
And the brake is automatically controlled based on the collision required time information
Vehicle driving assistance device.
A display unit for displaying an image of a traffic situation around the vehicle; And
And a processor for displaying information of the object on the image based on a position of the detected object when the object is sensed,
The processor comprising:
Dividing the image into a plurality of regions,
The information of the object is displayed in an area corresponding to the position of the object among the divided areas,
Wherein the information of the object includes:
Are displayed on the image in different ways depending on the degree of danger of the object
Vehicle driving assistance device.
Wherein the area in which the information about the object among the divided areas is displayed,
Depending on the degree of risk, different colors or patterns are distinguished
Vehicle driving assistance device.
Further comprising a communication unit for transmitting the collision risk information according to the overlapping to the first and second objects when the first and second objects in the image overlap with each other,
Vehicle driving assistance device.
The processor comprising:
If the collision risk is detected, the vehicle is moved to a position corresponding to the safe zone on the image
Vehicle driving assistance device.
vehicle.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150176349A KR20170069096A (en) | 2015-12-10 | 2015-12-10 | Driver Assistance Apparatus and Vehicle Having The Same |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150176349A KR20170069096A (en) | 2015-12-10 | 2015-12-10 | Driver Assistance Apparatus and Vehicle Having The Same |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20170069096A true KR20170069096A (en) | 2017-06-20 |
Family
ID=59281140
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150176349A KR20170069096A (en) | 2015-12-10 | 2015-12-10 | Driver Assistance Apparatus and Vehicle Having The Same |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20170069096A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20200064439A (en) * | 2018-11-29 | 2020-06-08 | 주식회사 알티스트 | System and method of obstacle verification based on inter-vehicular communication |
KR20200097831A (en) * | 2019-02-08 | 2020-08-20 | 배민재 | Device and method for controlliing sound singal of vehicle, and device of outputting soung signal |
US11450156B2 (en) | 2019-02-08 | 2022-09-20 | Minjae BAE | Device and method for controlling sound signal of vehicle, and device of outputting sound signal |
KR20200104221A (en) * | 2019-02-26 | 2020-09-03 | 도요타지도샤가부시키가이샤 | In-vehicle information processing device, inter-vehicle information processing system, and information processing system |