[go: up one dir, main page]
More Web Proxy on the site http://driver.im/

KR20170069096A - Driver Assistance Apparatus and Vehicle Having The Same - Google Patents

Driver Assistance Apparatus and Vehicle Having The Same Download PDF

Info

Publication number
KR20170069096A
KR20170069096A KR1020150176349A KR20150176349A KR20170069096A KR 20170069096 A KR20170069096 A KR 20170069096A KR 1020150176349 A KR1020150176349 A KR 1020150176349A KR 20150176349 A KR20150176349 A KR 20150176349A KR 20170069096 A KR20170069096 A KR 20170069096A
Authority
KR
South Korea
Prior art keywords
vehicle
information
unit
vehicle driving
processor
Prior art date
Application number
KR1020150176349A
Other languages
Korean (ko)
Inventor
김석근
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 filed Critical 엘지전자 주식회사
Priority to KR1020150176349A priority Critical patent/KR20170069096A/en
Publication of KR20170069096A publication Critical patent/KR20170069096A/en

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/0098Details of control systems ensuring comfort, safety or stability not otherwise provided for
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/013Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
    • B60R21/0134Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to imminent contact with an obstacle, e.g. using radar systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00Conjoint control of vehicle sub-units of different type or different function
    • B60W10/18Conjoint control of vehicle sub-units of different type or different function including control of braking systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/0081
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146Display means
    • B60W2550/22

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Human Computer Interaction (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)

Abstract

A vehicle driving assist apparatus according to an embodiment of the present invention includes: a sensor unit for sensing a first object around a vehicle; A processor for recognizing the sensed first object and generating risk information for the first object based on the recognized movement state of the first object; And a communication unit for transmitting the risk information generated through the processor to the outside.

Description

BACKGROUND OF THE INVENTION 1. Field of the Invention [0001] The present invention relates to a vehicle driving assist apparatus,

BACKGROUND OF THE INVENTION 1. Field of the Invention [0001] The present invention relates to a vehicle driving assist apparatus provided in a vehicle, a control method thereof, and a vehicle including the same.

A vehicle is a device that moves a user in a desired direction by a boarding user. Typically, automobiles are examples.

The automobiles are internal combustion engine cars, external combustion engine cars, gas turbine cars or electric vehicles according to the prime mover used.

Electric vehicles are electric vehicles that use electric energy to drive electric motors. They include pure electric vehicles, hybrid electric vehicles (HEV), plug-in hybrid electric vehicles (PHEV), and hydrogen fuel cell vehicles (FCEV).

Meanwhile, in recent years, the development of an intelligent vehicle (Smart Vehicle) has been actively developed for safety and convenience of drivers, pedestrians, and the like.

Intelligent automobiles are also called smart automobiles, cutting-edge vehicles that combine information technology (IT) technology. Intelligent automobiles provide optimum transportation efficiency through interworking with intelligent transportation system (ITS) as well as introducing advanced system of automobile itself.

For example, intelligent vehicles have the technical advantage of maximizing the safety of the occupants as well as the occupants by developing key safety-related technologies such as Adaptive Cruise Control (ACC), obstacle detection, collision detection or mitigation.

In addition, recently, V2V (Vehicle to Vehicle) technology which attracts automobiles to run while exchanging wireless communication between automobiles running adjacent to each other is getting attention. With this V2V technology, automobiles can run at constant distance from each other on the road. In detail, it is possible to prevent a sudden traffic accident by sharing the location and speed information of the nearby vehicle in real time.

On the other hand, accidents that occur while driving are caused by objects that have suddenly entered the blind spot.

That is, it is difficult to recognize an object entering from a blind spot with current vehicle technology, and it is difficult to defend against an unauthorized traverse.

The embodiments of the present invention provide a vehicle driving assistant device capable of recognizing and tracking object information that has entered a lane and informing neighboring vehicles of the danger information accordingly, and a vehicle including the same.

In addition, the embodiment of the present invention provides a vehicle driving assistant device capable of providing warning information informing an approaching object to an object that has entered a lane, and a vehicle including the same.

It is to be understood that the technical objectives to be achieved by the embodiments are not limited to the technical matters mentioned above and that other technical subjects not mentioned are apparent to those skilled in the art to which the embodiments proposed from the following description belong, It can be understood.

A vehicle driving assist apparatus according to an embodiment of the present invention includes: a sensor unit for sensing a first object around a vehicle; A processor for recognizing the sensed first object and generating risk information for the first object based on the recognized movement state of the first object; And a communication unit for transmitting the risk information generated through the processor to the outside.

A sensor unit for sensing an object existing in the vicinity of the vehicle; A display unit for displaying an image of a traffic situation around the vehicle; And a processor for, when the object is sensed, displaying information of the object on the image based on the position of the sensed object, wherein the processor divides the image into a plurality of regions, The information of the object is displayed in an area corresponding to the position of the object among the areas, and information of the object is displayed on the image in different ways according to the degree of danger of the object.

According to the embodiment of the present invention, it is possible to detect an object (for example, a pedestrian, a bicycle, a motorcycle or the like) that has entered a lane through a camera image, predict a moving direction of the detected object, By transmitting the information, it is possible to reduce the accident occurrence rate of the pedestrian entering from a position difficult to predict.

Further, according to the embodiment of the present invention, by utilizing the inter-vehicle communication technology and the pedestrian assist function, a more stable driving environment can be provided.

BRIEF DESCRIPTION OF THE DRAWINGS Fig. 1 shows an appearance of a vehicle equipped with a vehicle driving assist system according to an embodiment of the present invention.
FIG. 2 is a block diagram of a vehicle driving assist system according to an embodiment of the present invention. FIG. 3 is a plan view of a vehicle having a sensor unit according to an embodiment of the present invention. 5 and 6 are views for explaining an example of a method of generating image information from a camera image according to an embodiment of the present invention, FIG. 7 is a diagram for explaining an indicator output unit according to FIG.
8 is a flowchart for explaining steps of transmitting the risk information of the vehicle driving assistant 100 according to the embodiment of the present invention.
FIG. 9 is a view for explaining an object entering in a blind spot according to an embodiment of the present invention, FIG. 10 is a view for explaining an object which is traversed in an unauthorized position or in an unrecognizable position according to an embodiment of the present invention FIG.
11 is a diagram for setting a communication structure according to an embodiment of the present invention.
12 is a view for explaining a communication structure between the server 500 and vehicles according to the embodiment of the present invention.
13 is a diagram for explaining the degree of danger according to the embodiment of the present invention.
14 is a view for explaining danger information displayed through an internal display unit according to an embodiment of the present invention.
15 to 19 are diagrams for explaining notification information according to an embodiment of the present invention.
20 and 21 are views showing a traffic situation image according to an embodiment of the present invention.
Fig. 22 is an example of an internal block diagram of the vehicle of Fig. 1 including the above-described vehicle driving assist system.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant description thereof will be omitted. The suffix "module" and " part "for the components used in the following description are given or mixed in consideration of ease of specification, and do not have their own meaning or role. In the following description of the embodiments of the present invention, a detailed description of related arts will be omitted when it is determined that the gist of the embodiments disclosed herein may be blurred. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the invention as claimed. , ≪ / RTI > equivalents, and alternatives.

Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.

It is to be understood that when an element is referred to as being "connected" or "connected" to another element, it may be directly connected or connected to the other element, . On the other hand, when an element is referred to as being "directly connected" or "directly connected" to another element, it should be understood that there are no other elements in between.

The singular expressions include plural expressions unless the context clearly dictates otherwise.

In the present application, the terms "comprises", "having", and the like are used to specify that a feature, a number, a step, an operation, an element, a component, But do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, or combinations thereof.

The vehicle described herein may be a concept including a car, a motorcycle. Hereinafter, the vehicle will be described mainly with respect to the vehicle.

The vehicle described in the present specification may be a concept including both an internal combustion engine vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as a power source, and an electric vehicle having an electric motor as a power source.

In the following description, the left side of the vehicle means the left side in the running direction of the vehicle, and the right side of the vehicle means the right side in the running direction of the vehicle.

Unless otherwise mentioned in the following description, the LHD (Left Hand Drive) vehicle will be mainly described.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, a vehicle driving assistance device according to an embodiment will be described in detail with reference to the drawings.

BRIEF DESCRIPTION OF THE DRAWINGS Fig. 1 shows an appearance of a vehicle equipped with a vehicle driving assist system according to an embodiment of the present invention.

Referring to Fig. 1, a vehicle 700 according to an embodiment may include wheels 13FL and 13RL rotated by a power source and a vehicle driving assist device.

In the embodiment, the vehicle driving assistant device 100 is a separate device that transmits and receives necessary information through data communication with the vehicle 700 and transmits necessary information to nearby vehicles existing in the vicinity of the vehicle 700 . In addition, the vehicle driving assistant device 100 can perform an indicator display function. The indicator display function may be a function for informing a nearby vehicle existing in the vicinity of the vehicle 700 of existence of a specific object. In addition, the indicator display function may be a function for informing an object existing in the vicinity of the vehicle 700 of approaching the neighboring vehicle.

 A set of some of the units of the vehicle 700 may also be defined as the vehicle driving assistant device 100. [

2) of the vehicle driving assistant device 100 is not included in the vehicle driving assistant device 100, and the vehicle driving assistant device 100 is not included in the vehicle driving assistant device 100 700, or another unit mounted on the vehicle 700. The external units can be understood as being included in the vehicle driving assistant device 100 by transmitting and receiving data through the interface of the vehicle driving assistant device 100. [

In addition, the vehicle driving assistant device 100 can exchange data with the surrounding vehicles in the vicinity of the vehicle 700, as well as with the function of exchanging data with the vehicle 700. Preferably, the vehicle driving assistant device 100 transmits danger information about an object acquired by the vehicle driving assist device 100 to other nearby vehicles in the vicinity of the vehicle 700.

In other words, the other nearby vehicles may not be able to detect the object by the vehicle 700. Accordingly, the vehicle driving assistant apparatus 100 generates danger information on the object, transmits the generated danger information to other nearby vehicles existing in the vicinity of the vehicle 700, So that the presence of the object can be recognized.

For convenience of explanation, it is assumed that the vehicle driving assistant apparatus 100 according to the embodiment directly includes the respective units shown in FIG.

The vehicle driving assistant device (100) senses an object in the vicinity of the vehicle, and generates danger information on the object based on the sensed movement state of the object.

At this time, the vehicle driving assistant apparatus 100 determines a danger level of the object according to the moving state of the object, and generates danger information including information on the determined danger level.

Here, the moving state of the object may include at least one of moving direction and moving speed of the object. Preferably, the moving state may include a moving speed of the object.

Then, the vehicle driving assistant (100) determines a risk level of the object based on the moving speed of the object, and generates risk information for the object, including information on the determined risk level.

The risk level can be defined as shown in Table 1 below.

Risk Level Object example Stage 1 Static object Step 2 Objects moving at low speed
(Pedestrian, stroller, car, etc.)
Step 3 Objects moving with speed
(Such as a bicycle or a running pedestrian)
Step 4 Objects passing at high speed
(Such as motorcycles, cars and bicycles)

In the above, the higher the risk level, the greater the risk of injury in the event of an accident. Accordingly, the vehicle driving assistant 100 transmits information about the risk level at the time of transmitting the risk information to the object .

The vehicle driving assistant (100) detects a peripheral vehicle existing in the vicinity of the vehicle (700). The neighboring vehicle can be detected using an image photographed through a camera, or alternatively can be detected through a sensor such as a proximity sensor.

Further, when the surrounding vehicle is detected, the vehicle driving assistant device 100 transmits the generated danger information to the detected nearby vehicle. That is, the neighboring vehicle may not be able to detect the presence of the object obscured by the vehicle 700. Accordingly, the vehicle driving assistant 100 senses the object, and transmits dangerous information to the nearby vehicle .

At this time, the vehicle driving assistant device 100 may provide the danger information to all the detected nearby vehicles, and may provide the danger information only to specific nearby vehicles.

That is, the vehicle driving assistant device 100 determines whether or not the adjacent vehicle being moved to the position of the object among the neighboring vehicles based on the moving direction and the moving speed of the object, the position, the moving direction, And may provide the danger information only to the detected adjacent vehicle.

In other words, the vehicle driving assistant device 100 generates the danger information and transmits the danger information to the neighboring vehicle or the neighboring vehicle when the object is detected and the notification condition for informing the detection of the object is detected. In addition, the vehicle driving assistant apparatus 100 determines an indicator (I) indicating the presence of the danger information and a display method of the indicator, outputs the indicator I according to a determined display method, It is possible to inform the outside to prevent accidents and to cooperate with the driver to carry out the smooth running.

At this time, the indicator I is output in order to provide warning information to the notification target determined in the vehicle driving assistant device 100. At this time, the notification object may be an object detected by the vehicle driving assistant device 100, or may be an adjacent vehicle or an adjacent vehicle existing around the vehicle 700. [

Also, the notification status may include a situation where a specific object exists around the vehicle 700, or the object is approaching by moving in a specific direction.

In addition, the indicator display method refers to various methods of determining an indicator display position, size, brightness, saturation, color, phase, and indicator image and displaying the indicator outside the vehicle.

Hereinafter, each unit constituting the vehicle driving assistant apparatus 100 for transmitting the danger information will be described in detail with reference to FIG. 2 to FIG.

FIG. 2 is a block diagram of a vehicle driving assist system according to an embodiment of the present invention. FIG. 3 is a plan view of a vehicle having a sensor unit according to an embodiment of the present invention. 5 and 6 are views for explaining an example of a method of generating image information from a camera image according to an embodiment of the present invention, FIG. 7 is a diagram for explaining an indicator output unit according to FIG.

2, the vehicle driving assistance apparatus 100 includes an input unit 110, a communication unit 120, an interface unit 130, a memory 140, a monitoring unit 150, a sensor unit 190, a processor 170, a display unit 180, an audio output unit 185, and a power supply unit 190. However, the units of the vehicle driving assistance device 100 shown in FIG. 2 are not essential for implementing the vehicle driving assistance device 100, so that the vehicle driving assistant device 100 described in this specification can be applied to the configuration It can have more or fewer components than components.

Returning to the description of the configuration, first, the vehicle driving assistance apparatus 100 may include an input unit 110 for sensing a user's input.

For example, the user inputs a transmission condition for transmitting the danger information through the input unit 110 or inputs an execution command for turning on / off the vehicle driving assistant 100 .

The input unit 110 may include a gesture input unit (e.g., an optical sensor) for sensing a user gesture, a touch input unit (e.g., a touch sensor, a touch key, A microphone, a mechanical key, and the like, and a microphone for sensing a voice input.

Next, the vehicle driving assistant apparatus 100 may include a communication unit 120 that communicates with the other vehicle 510, the terminal 600, and the server 500 and the like.

The vehicle driving assistant apparatus 100 may receive at least one of navigation information, other vehicle driving information, and traffic information through the communication unit 120. [ The information received by the communication unit 120 may be used as additional information for detecting nearby traffic conditions or surrounding objects.

In addition, the vehicle driving assistant apparatus 100 may use the big data of the server 500 to detect the traffic situation or object around the communication unit 120. [

In addition, the vehicle driving assistant apparatus 100 senses an object, transmits information about the sensed object to the server 500, and receives the danger information about the sensed object from the server 500, It is possible to transmit the danger information to a nearby vehicle or an adjacent vehicle existing in the vicinity of the vehicle.

The server 500 may include an ITS (Intelligent Transport System) server.

At this time, when the object is sensed and the risk information for the object is generated, the vehicle driving assistant 100 transmits the generated risk information to the server 500 as well as the neighboring vehicle or the adjacent vehicle .

Accordingly, when the walking signal is changed in a situation where the pedestrian corresponding to the object does not cross the pedestrian crossing, the server 500 can extend the walking signal for a predetermined time based on the danger information, .

The communication unit 120 receives at least one of the location information, the weather information, and the traffic situation information of the road (for example, TPEG (Transport Protocol Expert Group)) from the mobile terminal 600 and / or the server 500 Lt; / RTI >

In addition, the communication unit 120 can receive traffic information from the server 500 equipped with the intelligent traffic system (ITS). Here, the traffic information may include traffic light information, lane information, and the like.

In addition, the communication unit 120 may receive navigation information from the server 500 and / or the mobile terminal 600. [ Here, the navigation information may include at least one of map information related to a vehicle running, lane information, vehicle location information, set destination information, and route information according to a destination.

In particular, the communication unit 120 can receive the real-time position of the vehicle 700 with the navigation information. For example, the communication unit 120 may include a GPS (Global Positioning System) module or a WiFi (Wireless Fidelity) module to acquire the position of the vehicle.

The communication unit 120 may also receive the running information of the other vehicle 510 from the other vehicle 510 and transmit the running information of the vehicle 700 to share the running information of the vehicle.

Here, the common vehicle running information may include at least one of direction information, position information, vehicle speed information, acceleration information, movement route information, forward / backward information, adjacent vehicle information, and turn signal information.

Accordingly, the vehicle driving assistant apparatus 100 can detect a nearby vehicle or an adjacent vehicle existing in the vicinity of the vehicle 700 based on the vehicle running information transmitted from the other vehicle 510, The danger information can be transmitted to the nearby vehicle or the adjacent vehicle.

In addition, when the user is boarding the vehicle 700, the user's mobile terminal 600 and the vehicle driving assistant 100 may perform pairing with each other automatically or by execution of the user's application.

The communication unit 120 may exchange data with another vehicle 510, the mobile terminal 600, or the server 500 in a wireless manner.

More specifically, the communication unit can wirelessly communicate using a wireless data communication system. Wireless data communication schemes include, but are not limited to, technical standards or communication schemes for mobile communication (e.g., Global System for Mobile communications (GSM), Code Division Multi Access (CDMA), Code Division Multi Access 2000 (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), WCDMA (Wideband CDMA), HSDPA (High Speed Downlink Packet Access), HSUPA (Long Term Evolution), LTE Term Evolution-Advanced) or the like can be used.

In addition, the communication unit 120 may use a wireless Internet technology. For example, the wireless communication unit 120 may be a WLAN (Wireless LAN), a Wi-Fi (Wireless-Fidelity) (HSDPA), Long Term Evolution (LTE), Long Term Evolution (LTE), and Long Term Evolution (LTE). Term Evolution-Advanced).

In addition, the communication unit 120 may use short range communication, and may be a Bluetooth ™, a Radio Frequency Identification (RFID), an Infrared Data Association (IrDA), an Ultra Wideband (UWB) ), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct and Wireless Universal Serial Bus (USB) technologies.

In addition, the vehicle driving assistant device 100 may exchange data with other vehicles and servers wirelessly by using a wireless communication module of the mobile terminal, by pairing with a mobile terminal inside the vehicle using a local communication method .

Next, the vehicle driving assistance apparatus 100 may include an interface unit 130 that receives vehicle-related data or transmits a signal processed or generated by the processor 170 to the outside.

In detail, the vehicle driving assistance apparatus 100 can receive at least one of the navigation information and the sensor information through the interface unit 130. [

Such navigation information and sensor information may be used as additional information by the processor 170 to detect an object and a dangerous situation according to the object.

Further, the vehicle driving assistant device 100 can transmit a control signal for executing the vehicle driving assistant function, information generated in the vehicle driving assistant device 100, and the like through the interface unit 130. [

To this end, the interface unit 130 performs data communication with at least one of a control unit 770, an AVN (Audio Video Navigation) device 400 and a sensing unit 760 in the vehicle by a wired communication or a wireless communication method .

In detail, the interface unit 130 can receive the navigation information by the data communication with the control unit 770, the AVN apparatus 400 and / or the separate navigation apparatus.

The interface unit 130 may receive the sensor information from the control unit 770 or the sensing unit 760.

Here, the sensor information includes direction information of the vehicle 700, position information, vehicle speed information, acceleration information, tilt information, forward / backward information, fuel information, distance information to front and rear vehicles, distance information between the vehicle and the lane, And may include at least one or more of the information.

Also, the sensor information may include a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward / backward sensor, a wheel sensor, a vehicle speed sensor, A vehicle body inclination sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor by steering wheel rotation, a vehicle internal temperature sensor, a vehicle internal humidity sensor, and a door sensor. On the other hand, the position module may include a GPS module for receiving GPS information.

The interface unit 130 may receive a user input received through the user input unit 110 of the vehicle 700. The interface unit 130 may receive the user input from the input unit of the vehicle 700 or through the control unit 770. That is, when the input unit is arranged in the vehicle itself, the user input may be received through the interface unit 130.

In addition, the interface unit 130 may receive the traffic information obtained from the server 500. The server 500 may be a server located in a traffic control station that controls traffic. For example, when traffic information is received from the server 500 through the communication unit 120 of the vehicle 700, the interface unit 130 may receive the traffic information from the control unit 770.

Next, the memory 140 may store various data for the overall operation of the vehicle driving assistant 100, such as a program for processing or control by the processor 170.

The memory 140 may store a plurality of application programs (or applications) driven by the vehicle driving assistant 100, and data and commands for the operation of the vehicle driving assistant 100. At least some of these application programs may be downloaded from an external server via wireless communication. Also, at least some of these application programs may be present on the vehicle driving assistance device 100 from the time of shipment for a basic function (e.g., a vehicle driving assistance function) of the vehicle driving assistance device 100.

Such an application program may be stored in the memory 140 and driven by the processor 170 to perform an operation (or function) of the vehicle driving assistant device 100.

In addition, the memory 140 may store various indicators (e.g., a stop icon, an object icon (pedestrian, motorcycle, bicycle, etc.)) to be displayed according to the notification status. For example, the memory 140 may store an icon corresponding to the image of the detected object. Thus, a nearby or adjacent vehicle can easily recognize the kind of object present in the vicinity of the vehicle 700.

Meanwhile, the memory 140 may store data for identification of objects included in the image. For example, when a predetermined object is detected in the vehicle surroundings image acquired through the camera 160, the memory 140 may store data for confirming what the object corresponds to according to a predetermined algorithm.

For example, when the image acquired through the camera 160 includes a predetermined object such as a lane, a traffic sign, a two-wheeled vehicle, or a pedestrian, the memory 140 may store data for determining, by the predetermined algorithm, what the object corresponds to.

The memory 140 may be implemented in hardware as at least one of a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk.

In addition, the vehicle driving assistance apparatus 100 may be operated in association with a web storage that performs a storage function of the memory 140 on the Internet.

Next, the monitoring unit 150 can acquire biometric information of the user.

In detail, the monitoring unit 150 can detect the operation or biometric information of the user and obtain monitoring information. The obtained monitoring information can be used to determine the risk level for a sensed object.

In detail, the monitoring unit 150 may acquire an image for biometrics of the user. That is, the monitoring unit 150 may include an image acquisition module disposed inside the vehicle.

For example, the monitoring unit 150 may include a monitoring camera that captures an in-vehicle image, so that the inside of the vehicle can be photographed. The processor 170 can process the image photographed inside the vehicle and monitor the status of the driver or a passenger in real time.

The biometric information detected by the monitoring unit 150 may include image information of the user, fingerprint information, iris-scan information, retina-scan information, hand geometry information, facial recognition information, and voice recognition information. The monitoring unit 150 may include a sensor for sensing such biometric information.

Next, the vehicle driving assistant device 100 may further include a sensor unit 150 for sensing an object around the vehicle.

As described above, the vehicle driving assistant device 100 may include a separate sensor unit 150 to sense peripheral objects, or may receive the sensor information obtained by the sensing unit 760 of the vehicle 700 itself through the interface unit 130.

The sensor unit 150 may include a distance sensor 191 that senses the position of the object.

The distance sensor 191 can precisely detect the direction of the object relative to the vehicle 700, the distance to the object, and the moving direction of the object. Also, the distance sensor 191 can continuously measure the positional relationship with the sensed object and accurately detect changes in the positional relationship.

The distance sensor 191 may sense objects located at the front, back, right, and left sides of the vehicle 700. To this end, the distance sensor 191 may be located at various positions of the vehicle 700.

Referring to FIG. 3, the distance sensor 191 may be disposed at at least one of the front, rear, left, and right sides 191a, 191b, 191c, and 191d of the body of the vehicle 700 and the ceiling 191e.

The distance sensor 191 may include various distance measuring sensors such as a lidar sensor, a laser sensor, an ultrasonic sensor, and a stereo camera.

For example, the distance sensor 191 may be a laser sensor, which measures the distance to an object using a time-of-flight (TOF) method and/or a phase-shift method according to the laser signal modulation scheme. More specifically, in the time-of-flight method, a pulsed laser signal is emitted, and the distance to an object is measured from the time taken for the pulses reflected from objects within the measurement range to arrive at the receiver.
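The time-of-flight relation described above can be sketched in a few lines. The following is an illustrative sketch, not part of the original disclosure; the constant and the example delay are assumed values.

```python
# Illustrative sketch (not part of the original disclosure): estimating
# distance with the time-of-flight (TOF) method described above. The only
# inputs are the speed of light and the measured round-trip pulse delay.

SPEED_OF_LIGHT_M_S = 299_792_458.0  # metres per second

def tof_distance(round_trip_delay_s: float) -> float:
    """Distance to the reflecting object from the pulse round-trip delay.

    The emitted pulse travels to the object and back, so the one-way
    distance is half the total path covered during the delay.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_delay_s / 2.0

# A pulse returning after 200 nanoseconds corresponds to roughly 30 m.
print(round(tof_distance(200e-9), 2))
```

The phase-shift variant mentioned above would instead compare the phase of a modulated continuous beam against the returned signal, but the distance recovered is the same quantity.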

On the other hand, whether the object corresponds to a notification target, and the attributes of the object used for determining the notification status, can be obtained by the processor 170 analyzing the image captured by the camera 160.

To this end, the sensor unit 150 may include a camera 160.

More specifically, the vehicle driving assistance apparatus 100 may include a camera 160 for acquiring a vehicle surroundings image. The processor 170 can then detect an object around the vehicle in the acquired vehicle surroundings image, detect the attributes of the object, and generate image information.

Here, the image information may include at least one of the type of the object, traffic signal information displayed by the object, the distance between the object and the vehicle, and the position of the object, and may be included in the sensor information.

More specifically, the processor 170 generates the image information by performing object analysis on the captured image through image processing, such as detecting an object, tracking the object, measuring the distance to the object, and verifying the object.

In order for the processor 170 to perform object analysis more easily, the camera 160 may be a stereo camera that takes images and measures distances to objects in the captured images. However, the embodiment is not limited thereto.

The camera 160 may include an internal camera that is disposed inside the vehicle and captures a forward image of the vehicle.

The camera 160 may be provided at various positions outside the vehicle.

Referring to FIG. 3, a plurality of cameras 160 may be disposed on at least one of the left side, the rear, the right side, the front, and the ceiling of the vehicle 700.

The left camera 160b may be disposed in a case surrounding the left side mirror. Alternatively, the left camera 160b may be disposed outside the case surrounding the left side mirror. Alternatively, the left camera 160b may be disposed in one area outside the left front door, the left rear door, or the left fender.

The right camera 160c may be disposed in a case surrounding the right side mirror. Alternatively, the right camera 160c may be disposed outside the case surrounding the right side mirror. Alternatively, the right camera 160c may be disposed in one area outside the right front door, the right rear door, or the right fender.

Further, the rear camera 160d may be disposed in the vicinity of the rear license plate or the trunk switch. The front camera 160a may be disposed in the vicinity of the emblem or the radiator grill.

The processor 170 may synthesize the images photographed in all directions to provide an around view image of the vehicle 700 as viewed from above. When the around view image is generated, boundary portions occur between the image areas. These boundary portions can be displayed naturally by image blending processing.
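The blending step at the seams of the around view image can be illustrated with a simple linear alpha blend over the overlap region. This is a minimal sketch, not the patent's implementation; the strip representation is an assumption.

```python
# Illustrative sketch (not part of the original disclosure): smoothing the
# boundary between two adjacent camera images of an around view by alpha
# blending across the overlap, so the seam fades instead of showing a hard
# edge. Each strip is one row of grey-level pixels from the overlap zone.

def blend_boundary(left_strip, right_strip):
    """Blend two overlapping pixel strips of equal length."""
    n = len(left_strip)
    blended = []
    for i, (l, r) in enumerate(zip(left_strip, right_strip)):
        alpha = i / (n - 1) if n > 1 else 0.5  # 0 at left edge, 1 at right edge
        blended.append(round((1 - alpha) * l + alpha * r))
    return blended

# Grey levels from the left camera fade into those from the right camera.
print(blend_boundary([100, 100, 100, 100, 100], [200, 200, 200, 200, 200]))
```

Production systems typically blend along the whole two-dimensional seam and compensate exposure differences first; the linear ramp above is the core idea.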

Further, the ceiling camera 160e may be disposed on the ceiling of the vehicle 700 to photograph all of the front, rear, left, and right sides of the vehicle 700.

Such a camera 160 may directly include an image sensor and an image processing module. The camera 160 may process still images or moving images obtained by an image sensor (e.g., CMOS or CCD). In addition, the image processing module may process the still image or the moving image obtained through the image sensor, extract required image information, and transmit the extracted image information to the processor 170.

In an embodiment, the sensor unit 150 may include a stereo camera that serves as both the distance sensor 191 and the camera 160. That is, the stereo camera can acquire an image and sense the position of an object at the same time.

Hereinafter, with reference to Figs. 4 to 6, a method for the processor 170 to detect positional information and image information using a stereo camera will be described in more detail.

Referring to FIG. 4, the stereo camera 160 may include a first camera 160a having a first lens 163a and a second camera 160b having a second lens 163b.

The stereo camera 160 may further include a first light shield 162a and a second light shield 162b for shielding light incident on the first lens 163a and the second lens 163b, respectively.

This vehicle driving assistance apparatus 100 acquires a stereo image of the surroundings of the vehicle from the first and second cameras 160a and 160b, performs disparity detection based on the stereo image, performs object detection on at least one of the stereo images based on the disparity information, and continuously tracks the motion of the object after object detection.

Referring to FIG. 5, the processor 170 in the vehicle driving assistance apparatus 100 may include an image preprocessing unit 410, a disparity computing unit 420, an object detecting unit 434, an object tracking unit 440, and an application unit 450. In FIG. 5, an image is processed in the order of the image preprocessing unit 410, the disparity computing unit 420, the object detecting unit 434, the object tracking unit 440, and the application unit 450, but the embodiment is not limited thereto.

An image preprocessor 410 may receive an image from the camera 160 and perform preprocessing.

Specifically, the image preprocessing unit 410 may perform noise reduction, rectification, calibration, color enhancement, color space conversion (CSC), interpolation, camera 160 gain control, and the like. Accordingly, an image clearer than the stereo image captured by the camera 160 can be obtained.
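Of the preprocessing operations just listed, noise reduction is the simplest to illustrate. The following sketch (not part of the original disclosure) uses a tiny moving average as a stand-in; real preprocessing would also cover rectification, calibration, CSC, and gain control.

```python
# Illustrative sketch (not part of the original disclosure): a moving-average
# filter as a stand-in for the noise reduction step of the image
# preprocessing unit 410, applied to one row of pixel intensities.

def smooth_row(pixels, window=3):
    """Mean-filter one row of pixel intensities to suppress noise."""
    half = window // 2
    out = []
    for i in range(len(pixels)):
        lo, hi = max(0, i - half), min(len(pixels), i + half + 1)
        out.append(round(sum(pixels[lo:hi]) / (hi - lo)))
    return out

# The isolated noise spike at index 2 is damped toward its neighbours.
print(smooth_row([10, 10, 90, 10, 10]))
```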

The disparity calculator 420 receives the image signals processed by the image preprocessing unit 410, performs stereo matching on the received images, and obtains a disparity map based on the stereo matching. That is, it is possible to obtain disparity information about the stereo image of the area in front of the vehicle.

At this time, the stereo matching may be performed on a pixel-by-pixel basis or on a predetermined block basis of the stereo images. The disparity map may mean a map in which the binocular parallax information of the stereo images, i.e., the left and right images, is expressed numerically.

The segmentation unit 432 may perform segmentation and clustering on at least one of the images based on the disparity information from the disparity calculating unit 420.

Specifically, the segmentation unit 432 can separate the background and the foreground for at least one of the stereo images based on the disparity information.

For example, an area in the disparity map whose disparity information is equal to or less than a predetermined value can be computed as the background and excluded, so that the foreground is relatively separated. As another example, an area whose disparity information is equal to or greater than a predetermined value can be computed as the foreground and extracted, so that the foreground is separated.

Thus, by separating the foreground and the background based on the disparity information extracted from the stereo image, it becomes possible to reduce the signal processing time, the amount of signal processing, and the like during subsequent object detection.
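The foreground/background split described above reduces to thresholding the disparity map. A minimal sketch, not part of the original disclosure; the map layout and threshold are assumptions.

```python
# Illustrative sketch (not part of the original disclosure): separating
# foreground from background with a disparity threshold, as the
# segmentation unit 432 described above does. Larger disparity means a
# closer point, so pixels at or above the threshold form the foreground.

def split_by_disparity(disparity_map, threshold):
    """Return (foreground, background) lists of (row, col) pixel coordinates."""
    foreground, background = [], []
    for y, row in enumerate(disparity_map):
        for x, d in enumerate(row):
            (foreground if d >= threshold else background).append((y, x))
    return foreground, background

# A 2x3 map: the single large-disparity pixel is the nearby object.
fg, bg = split_by_disparity([[1, 1, 8],
                             [1, 1, 1]], threshold=5)
print(fg)
```

Subsequent object detection then only needs to examine the (usually much smaller) foreground set, which is exactly the processing saving the paragraph above claims.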

Next, the object detector 434 can detect an object based on the image segments from the segmentation unit 432.

That is, the object detecting unit 434 can detect an object for at least one of the images based on the disparity information.

More specifically, the object detecting unit 434 can detect an object in at least one of the images. For example, an object can be detected from the foreground separated by the image segmentation.

The object verification unit 436 then classifies and verifies the separated object.

For this purpose, the object verification unit 436 may use a neural network identification method, a support vector machine (SVM) method, an AdaBoost identification method using Haar-like features, or a histograms of oriented gradients (HOG) method.

On the other hand, the object verification unit 436 can verify objects by comparing the detected objects with the objects stored in the memory 140.
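Verification against stored objects can be sketched as a nearest-template comparison. This is an illustration only, not the patent's method: the feature vectors, labels, and distance measure are assumptions.

```python
# Illustrative sketch (not part of the original disclosure): verifying a
# detected object by comparing it against objects stored in memory, as the
# object verification unit 436 above may do. The detection and each stored
# object are represented by (assumed) feature vectors; the closest stored
# vector by squared Euclidean distance determines the label.

def verify_object(detected_features, stored_objects):
    """Return the label of the stored object closest to the detection."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(stored_objects,
               key=lambda label: sq_dist(detected_features, stored_objects[label]))

# Hypothetical two-dimensional feature templates for two object classes.
templates = {"pedestrian": [0.9, 0.1], "vehicle": [0.1, 0.9]}
print(verify_object([0.8, 0.2], templates))
```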

For example, the object verification unit 436 can verify nearby vehicles, lanes, road surfaces, signs, hazardous areas, tunnels, and the like located in the vicinity of the vehicle.

The object tracking unit 440 may perform tracking on the verified object. For example, it sequentially identifies an object in the acquired stereo images, calculates the motion or motion vector of the identified object, and tracks the movement of the object based on the calculated motion or motion vector. Accordingly, it is possible to track nearby vehicles, lanes, road surfaces, signs, dangerous areas, tunnels, and the like located in the vicinity of the vehicle.
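The motion-vector tracking just described can be sketched with a constant-velocity assumption. A minimal illustration, not part of the original disclosure; the coordinate convention is assumed.

```python
# Illustrative sketch (not part of the original disclosure): tracking an
# identified object across frames by computing its motion vector from
# consecutive positions, as the object tracking unit 440 above describes.

def motion_vector(prev_pos, curr_pos):
    """Frame-to-frame displacement (dx, dy) of a tracked object."""
    return (curr_pos[0] - prev_pos[0], curr_pos[1] - prev_pos[1])

def predict_next(curr_pos, vector):
    """Constant-velocity prediction of the next-frame position."""
    return (curr_pos[0] + vector[0], curr_pos[1] + vector[1])

# An object at (10, 5) moves to (13, 7): motion vector (3, 2), so the
# tracker expects it near (16, 9) in the next frame and can re-associate
# the detection closest to that prediction.
v = motion_vector((10, 5), (13, 7))
print(v, predict_next((13, 7), v))
```

The predicted position is what lets a tracker keep the same identity for an object across frames even when several detections are present.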

Next, the application unit 450 can calculate the risk to the vehicle based on the various objects located in the vicinity of the vehicle, for example, other vehicles, lanes, road surfaces, and signs. It is also possible to calculate the possibility of a collision with a preceding vehicle, whether the vehicle is slipping, and the like.

Then, based on the calculated risk, collision possibility, or slip, the application unit 450 can output a message or the like for notifying the user as vehicle driving assistance information. Alternatively, a control signal for attitude control or driving control of the vehicle may be generated as vehicle control information.

The image preprocessing unit 410, the disparity computing unit 420, the segmentation unit 432, the object detection unit 434, the object verification unit 436, the object tracking unit 440, and the application unit 450 may be internal components of an image processing unit in the processor 170.

The processor 170 may include only some of the image preprocessing unit 410, the disparity computing unit 420, the segmentation unit 432, the object detection unit 434, the object verification unit 436, the object tracking unit 440, and the application unit 450. For example, when the camera 160 is a mono camera 160 or an around view camera 160, the disparity calculating unit 420 may be omitted. Also, according to the embodiment, the segmentation unit 432 may be omitted.

Referring to FIG. 6, during the first frame period, the camera 160 may acquire a stereo image.

The disparity calculating unit 420 in the processor 170 receives the stereo images FR1a and FR1b signal-processed by the image preprocessing unit 410, performs stereo matching on the received stereo images FR1a and FR1b, and obtains a disparity map 520.

The disparity map 520 expresses the parallax between the stereo images FR1a and FR1b as levels; the higher the disparity level, the closer the distance to the vehicle can be calculated to be, and the lower the disparity level, the farther the distance.
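The inverse relation between disparity level and distance noted above follows from stereo geometry: depth is proportional to focal length times baseline divided by disparity. A minimal sketch, not part of the original disclosure; the focal length and baseline are assumed example values, not values from the patent.

```python
# Illustrative sketch (not part of the original disclosure): converting a
# disparity level to a distance. A larger disparity yields a smaller
# distance, matching the description of the disparity map 520 above.

FOCAL_LENGTH_PX = 800.0   # assumed focal length in pixels
BASELINE_M = 0.3          # assumed distance between the two stereo lenses

def depth_from_disparity(disparity_px: float) -> float:
    """Distance in metres from a disparity in pixels (pinhole stereo model)."""
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

near = depth_from_disparity(40.0)   # large disparity -> close object
far = depth_from_disparity(8.0)     # small disparity -> distant object
print(round(near, 1), round(far, 1))
```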

On the other hand, when such a disparity map is displayed, it may be displayed with higher luminance as the disparity level becomes larger and with lower luminance as the disparity level becomes smaller.

In the figure, the first to fourth lanes 528a, 528b, 528c, and 528d have corresponding disparity levels in the disparity map 520, and the construction area 522, the first front vehicle 524, and the second front vehicle 526 each have corresponding disparity levels.

The segmentation unit 432, the object detection unit 434, and the object verification unit 436 perform segmentation, object detection, and object verification for at least one of the stereo images FR1a and FR1b based on the disparity map 520.

In the figure, object detection and verification are performed for the second stereo image FR1b using the disparity map 520.

That is, object detection and verification may be performed for the first to fourth lanes 538a, 538b, 538c, and 538d, the construction area 532, the first forward vehicle 534, and the second forward vehicle 536 in the image 530.

With the image processing described above, the vehicle driving assistant device 100 can accurately sense, using the sensor unit 150, what objects are nearby, where they are located, and the like. Thus, the vehicle driving assistant device 100 can determine whether notification is necessary, identify the notification target, and determine the notification information to be displayed to the notification target.

In addition, the vehicle driving assistant apparatus 100 may include an output unit for outputting notification information on a dangerous situation to the inside and outside of the vehicle 700.

In addition, when notification to the inside or outside of the vehicle is necessary, the vehicle driving assistant device 100 may display an indicator through the display unit or output an audio signal to notify the notification target inside or outside the vehicle.

To this end, the output unit may include an indicator output unit 181, a display unit 183, and an audio output unit 185.

First, the indicator output unit 181 can display an indicator with a light source outside the vehicle. For example, the indicator output unit 181 can display an indicator by irradiating a laser onto the road surface so that an image of the indicator is formed on the road surface.

The indicator output unit 181 may display an indicator on at least one area around the vehicle.

In addition, the indicator output unit 181 can generate notification information by, for example, producing fog at the headlamp. In other words, the indicator output section may include a fog generating section for generating the fog.

Further, the indicator output section 181 may include a washer liquid spraying section provided at the windshield or the head lamp to spray washer liquid. In other words, the washer liquid sprayed through the washer liquid spraying section can be utilized as an indicator for notifying a dangerous situation.

Hereinafter, an indicator display method of the indicator output unit 181 will be described in more detail with reference to FIG. 7. The indicator output unit 181 of FIG. 7 outputs an indicator, such as a specific image, to the outside of the vehicle using a laser.

In detail, the indicator output unit 181 includes a plurality of indicator output units arranged to irradiate laser beams from different positions of the vehicle 700, so that an indicator can be displayed in all of the front, rear, left, and right regions UA of the vehicle.

Referring to FIG. 7, a first indicator output unit 181a may be disposed near the left headlamp of the vehicle, a second indicator output unit 181b on the left side body of the vehicle, and a third indicator output unit 181c near the left tail lamp, so that an indicator can be output to the front and rear regions on the left side of the vehicle.

Similarly, a fourth indicator output unit 181d may be disposed near the right head lamp of the vehicle 700, a fifth indicator output unit 181e on the right side body of the vehicle, and a sixth indicator output unit 181f near the right tail lamp, so that an indicator can be output to the front and rear regions on the right side of the vehicle.

The indicator output units can display indicators around the vehicle by irradiating laser beams from their respective positions.

In addition, a seventh indicator output section 181g may be disposed on the ceiling of the vehicle to display an indicator in all of the front, rear, left, and right areas of the vehicle.

The plurality of indicator output units 181 may display an indicator not only in front of and behind the vehicle but also in the left and right regions, and may display the indicator at an appropriate position according to the positional relationship with the notification target.

For example, the second indicator output unit 181b disposed on the left side body of the vehicle 700 may display an indicator notifying nearby vehicles of the approach of an object on the left side of the vehicle 700, thereby securing the safety of the object while inducing safe driving of the nearby vehicles.

That is, the plurality of indicator output units 181 may display an indicator within the field of view of the notification target (e.g., a nearby vehicle, an adjacent vehicle, or a detected object).

Accordingly, the indicator output unit 181 can improve the visibility of the indicator by displaying it in the area corresponding to the change in the positional relationship with the notification target.

For example, the indicator output unit 181 may output an indicator to the left side area of the vehicle so that the indicator is displayed within the field of view of the notification target, and when the moving direction of the notification target changes to the right, the indicator may be displayed in the rear or rear-right area of the vehicle.
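The region selection described above, keeping the indicator in the target's field of view as the target moves, can be sketched as a mapping from the target's bearing to a display region. This is an illustration only, not part of the original disclosure; the angle convention (degrees, 0 = straight ahead, clockwise) and region boundaries are assumptions.

```python
# Illustrative sketch (not part of the original disclosure): selecting the
# indicator display region from the bearing of the notification target,
# so that the indicator stays within the target's field of view.

def indicator_region(bearing_deg: float) -> str:
    """Map a target bearing (0 = ahead, clockwise) to a display region."""
    b = bearing_deg % 360
    if b < 45 or b >= 315:
        return "front"
    if b < 135:
        return "right"
    if b < 225:
        return "rear"
    return "left"

# A target drifting from the vehicle's left toward its rear moves the
# indicator from the left region to the rear region accordingly.
print(indicator_region(270), indicator_region(200))
```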

The arrangement of the indicator output units described above is only an example; in other embodiments, including embodiments with only some of the above indicator output units, various arrangements capable of displaying indicators on the front, rear, left, and right sides of the vehicle are possible.

The indicator output unit 181 may display indicators including various images. That is, the indicator output unit 181 can display indicators showing different images according to the situation, and can thereby transmit accurate notification information to the notification target.

 The indicator output unit 181 may display an indicator including a symbol image indicating notification information.

On the other hand, the output unit 190 can output information about an indicator displayed outside, setting information, and the like to the interior of the vehicle, and can also deliver the notification information to the inside of the vehicle.

In addition, the output unit 190 may output the danger information transmitted from another nearby vehicle or an adjacent vehicle.

To this end, the output unit 190 may include a display unit 183 and an audio output unit 185.

First, the display unit 183 can display, inside the vehicle, danger information about an object. The object may be an object that the vehicle cannot detect by itself but that another vehicle has detected. Hereinafter, an object detected by the vehicle itself is referred to as a 'first object', and an object detected by another vehicle is referred to as a 'second object'.

Once the first object is detected, the processor 170 generates risk information for the first object and transmits the risk information to the neighboring vehicle or the adjacent vehicle. Hereinafter, the neighboring vehicle and the adjacent vehicle will be collectively referred to as another vehicle.

The processor 170 displays the danger information on the sensed first object through the display unit 183.

When risk information about a second object transmitted from the outside is received through the communication unit 120, the processor 170 displays the received risk information about the second object on the display unit 183.

The display unit 183 may include a plurality of display units.

Such a display unit 183 can project an image onto a windshield W of the vehicle 700.

That is, the display unit 183 may be a head up display (HUD), and may include a projection module that projects an image onto the windshield W. The projection image projected by the projection module has a certain transparency, so that the user can simultaneously view the projected image and the scene behind it.

The projection image displayed by the display unit 183 overlaps the actual view through the windshield W to form an augmented reality (AR).

In addition, the display unit 183 can display an image on a side glass G.

That is, the display unit 183 may display an image on the left side glass, the right side glass, the rear-left side glass, or the rear-right side glass.

Also, the display unit 183 may be separately installed inside the vehicle to display an image.

More specifically, the display unit 183 may be a display of a vehicle navigation apparatus or a cluster inside the vehicle.

The display unit 183 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and an electronic ink (e-ink) display.

The display unit 183 may be combined with a touch input unit to form a touch screen.

In addition, the vehicle driving assistance apparatus 100 may further include an audio output unit 185 and a power supply unit 145.

In detail, the audio output unit 185 can output, through audio, a message confirming a function of the vehicle driving assistant device 100, whether the function is executable, and notification information. That is, the vehicle driving assistant device 100 can supplement the visual output through the display portion 183 with the sound output of the audio output portion 185.

For example, the audio output unit 185 may output a beep warning sound together with an indicator requesting driving attention based on the danger information about the sensed object, thereby improving the delivery of the notification information.

In addition, the power supply unit 145 may receive external power and internal power under the control of the processor 170 to supply power necessary for operation of the respective components.

In particular, even when the vehicle power is turned off, the power supply unit 145 may continuously supply power to specific components so as to detect the first object and transmit danger information about the detected first object.

That is, the power supply unit 145 continuously supplies power to the processor 170, the communication unit 120, and the sensor unit 150 even when the vehicle power is turned off.

That is, in general, when an object moves between vehicles stopped or parked on the shoulder of a road, the object is hidden by the stopped or parked vehicles, and a situation occurs in which another vehicle or its driver cannot detect the object.

Accordingly, in the present invention, power is continuously supplied to the specific components so that detection of an object hidden by a stopped or parked vehicle and transmission of the corresponding danger information can be performed.

Finally, the vehicle driving assistance apparatus 100 may include a processor 170 that controls the overall operation of each unit in the vehicle driving assistance apparatus 100.

In addition, the processor 170 may control at least some of the components discussed with reference to FIG. 2 in order to drive an application program. Further, the processor 170 may operate at least two of the components included in the vehicle driving assistant device 100 in combination with each other to drive the application program.

Such a processor 170 may be implemented in hardware using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.

The processor 170 may be controlled by the control unit or may control various functions of the vehicle 700 through the control unit.

In addition to operations associated with the application programs stored in the memory 140, the processor 170 typically controls the overall operation of the vehicle driving assistance apparatus 100. The processor 170 may provide or process appropriate information or functions for the user by processing signals, data, information, and the like input or output through the components discussed above, or by driving the application programs stored in the memory 140.

Hereinafter, the operation of the vehicle driving assist system 100 according to the embodiment of the present invention will be described in more detail with reference to FIGS.

FIG. 8 is a flowchart illustrating the steps of transmitting the danger information by the vehicle driving assistant 100 according to the embodiment of the present invention.

First, referring to FIG. 8, the vehicle driving assistant 100 senses an object existing in the vicinity of the vehicle 700 (operation 100).

The object may be a pedestrian suddenly entering the lane from a blind spot, or a pedestrian entering the road while jaywalking or when the signal changes.

FIG. 9 is a view for explaining an object entering from a blind spot according to an embodiment of the present invention, and FIG. 10 is a view for explaining an object crossing at an unauthorized or unrecognizable position according to an embodiment of the present invention.

As shown in FIG. 9(a), in a situation where the vehicle 700, a first other vehicle 800a, and a second other vehicle 800b are traveling on the road and a first object 900 suddenly enters the lane, the vehicle 700 and the first other vehicle 800a may sense the first object 900.

However, the second other vehicle 800b may not detect the first object 900 hidden by the vehicle 700 or the first other vehicle 800a.

As shown in FIG. 9(b), in a situation where the vehicle 700, the first other vehicle 800a, and the second other vehicle 800b are traveling on the road and the first object 900 suddenly crosses into the lane from the sidewalk, the vehicle 700 and the first other vehicle 800a can sense the first object 900.

However, the second vehicle 800b may not be able to detect the first object 900 hidden by the vehicle 700.

Referring to FIG. 10(a), in a situation where the vehicle 700 and the first vehicle 800a are traveling on the lane and the first object 900 suddenly enters the lane, the vehicle 700 may sense the first object 900.

However, the first vehicle 800a may not be able to detect the first object 900 hidden by the vehicle 700.

As shown in FIG. 10(b), in a situation where the vehicle 700 and the first vehicle 800a are traveling on the lane and the first object 900 is located in front of the vehicle 700, the vehicle 700 may sense the presence of the first object 900 while passing the first vehicle 800a.

However, the first vehicle 800a may not be able to detect the first object 900 hidden by the vehicle 700.

In such a situation, the vehicle driving assistance apparatus 100 provided in the vehicle 700 detects the presence of the first object 900, generates danger information about the sensed first object 900, and transmits the generated danger information to the first vehicle 800a or the second vehicle 800b.

At this time, the vehicle driving assistance apparatus 100 determines a danger level based on the moving state of the first object 900, generates the danger information including information on the determined danger level, and transmits the generated danger information.

FIG. 11 is a diagram illustrating a communication structure according to an embodiment of the present invention.

Referring to FIG. 11, in a state where the vehicle 700, the first vehicle 800a, the second vehicle 800b, and a third vehicle 800c are traveling on the lane, the vehicle 700, the first vehicle 800a, the second vehicle 800b, and the third vehicle 800c can be connected to one another to exchange information.

Accordingly, the first object 900 can be sensed by the vehicle 700, the first vehicle 800a, and the second vehicle 800b, and the danger information about the sensed first object 900 may be transmitted from the vehicle 700, the first vehicle 800a, or the second vehicle 800b to the third vehicle 800c.

To this end, when the object is sensed, the vehicle driving assistance apparatus 100 determines a notification target to be notified of the presence of the sensed object.

That is, the vehicle driving assistance apparatus 100 senses other vehicles existing in the vicinity (operation 110). As described above, the other vehicle may be a nearby vehicle existing around the vehicle 700, or an adjacent vehicle next to the vehicle 700. Then, the vehicle driving assistance apparatus 100 transmits the danger information for notifying the presence of the object to the sensed other vehicle.

Alternatively, the vehicle driving assistance apparatus 100 determines whether, among the sensed other vehicles, there is another vehicle at risk of collision with the detected object (operation 120).

That is, the vehicle driving assistance apparatus 100 detects whether there is another vehicle that may collide with the object, based on the position, moving direction, and moving speed of the other vehicle and on the position, moving direction, and moving speed of the object.
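A minimal sketch of such a check, assuming straight-line constant-velocity motion over a short prediction horizon, could look like the following; the horizon, step size, and safety threshold are illustrative assumptions, not values taken from the embodiment:

```python
import math

def closest_approach(p_obj, v_obj, p_veh, v_veh, horizon=3.0, dt=0.1):
    """Predict the minimum distance between an object and a vehicle over a
    short horizon, assuming both keep their current velocity."""
    d_min = math.inf
    t = 0.0
    while t <= horizon:
        ox, oy = p_obj[0] + v_obj[0] * t, p_obj[1] + v_obj[1] * t
        vx, vy = p_veh[0] + v_veh[0] * t, p_veh[1] + v_veh[1] * t
        d_min = min(d_min, math.hypot(ox - vx, oy - vy))
        t += dt
    return d_min

def vehicles_at_risk(obj, vehicles, threshold=2.0):
    """Return the other vehicles whose predicted closest approach to the
    object falls below a (hypothetical) safety threshold in metres."""
    return [v for v in vehicles
            if closest_approach(obj["pos"], obj["vel"],
                                v["pos"], v["vel"]) < threshold]
```

Only the vehicles returned by such a filter would then receive the danger information; the remaining vehicles would receive the weaker warning information described below.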

If another vehicle at risk of collision with the object is detected, the vehicle driving assistance apparatus 100 transmits the danger information about the detected object to the other vehicle at risk of collision (operation 130).

In addition, if no other vehicle at risk of collision with the object is sensed, the vehicle driving assistance apparatus 100 may transmit only warning information indicating the existence of the object to the other vehicles having no risk of collision with the object (operation 160).

Then, the vehicle driving assistance apparatus 100 outputs, to the surrounding vehicles, notification information indicating that there is a risk of collision due to the presence of the object (operation 140).

In addition, the vehicle driving assistant 100 outputs notification information (i.e., an indicator) indicating the presence of the object to the other vehicle (operation 150).

On the road, there is a server 500 equipped with an intelligent transportation system (ITS). The server 500 manages traffic lights and the like existing on the road.

Accordingly, when the danger information on the object is generated, the vehicle driving assistance apparatus 100 transmits the generated danger information to the server 500.

FIG. 12 is a view for explaining a communication structure between the server 500 and vehicles according to the embodiment of the present invention.

Referring to FIG. 12, a server 500 is provided on a road, and the server and the vehicles traveling on the road can exchange data with each other.

Accordingly, the vehicle driving assistance apparatus 100 transmits the generated danger information not only to the other vehicle but also to the server 500. That is, when an object entering the pedestrian crossing is detected just before the walking signal changes, the object may be exposed to danger when the signal changes. Therefore, the vehicle driving assistance apparatus 100 transmits the danger information of the object to the server 500, and the walking signal can be extended for a predetermined time through the server 500 to secure the safety of the object.
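The server-side decision described above can be sketched as follows; the function name, parameters, and the fixed extension value are all illustrative assumptions, since the embodiment specifies only that the walking signal is extended for a predetermined time:

```python
def extended_walk_time(remaining_walk_s, clearing_time_s, extension_s=5.0):
    """Decide how long the walk signal should stay on when a danger report
    indicates a pedestrian is still on the crosswalk.  The extension is
    capped below by the time the pedestrian actually needs to clear."""
    if remaining_walk_s < clearing_time_s:
        # Pedestrian cannot clear before the signal changes: extend the
        # walk phase by a fixed amount, but at least by the clearing time.
        return max(clearing_time_s, remaining_walk_s + extension_s)
    return remaining_walk_s
```

For example, with 2 s of walk time remaining and an estimated 6 s needed to clear, the sketch extends the phase to 7 s.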

Meanwhile, when transmitting the danger information, the vehicle driving assistance apparatus 100 detects a danger level according to the positional relationship between the object and the other vehicle and transmits the danger information including information on the detected danger level.

FIG. 13 is a diagram for explaining the danger level according to the embodiment of the present invention.

Referring to FIG. 13, the first vehicle 800a and the second vehicle 800b are present around the vehicle 700.

At this time, the peripheral region of the vehicle 700 is divided into a plurality of regions according to positions, and a different degree of danger is set in the plurality of divided regions. For example, a region of the plurality of regions close to the vehicle 700 is set to a high risk level, and a region farther away from the vehicle 700 is set to a low risk level.

In other words, the peripheral area around the vehicle 700 may be divided into a first zone Z1, a second zone Z2, and a third zone Z3. The first zone Z1 is a critical zone, and a high danger level is set accordingly; the second zone Z2 is a major zone, and an intermediate danger level is set accordingly; and the third zone Z3 is a minor zone, and a low danger level is set accordingly.
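The zone assignment can be sketched as a simple distance classification; the boundary distances below are illustrative assumptions, since the embodiment fixes only the ordering of the danger levels, not the distances:

```python
def danger_zone(distance_m):
    """Map the distance from the vehicle to one of the three concentric
    zones (Z1 critical, Z2 major, Z3 minor).  Boundary distances are
    hypothetical; only the ordering follows the description above."""
    if distance_m <= 5.0:
        return ("Z1", "critical", 3)   # highest danger level
    if distance_m <= 15.0:
        return ("Z2", "major", 2)
    return ("Z3", "minor", 1)          # lowest danger level
```

The danger level returned for each other vehicle's position would then be embedded in the danger information sent to that vehicle.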

Accordingly, the vehicle driving assistance apparatus 100 transmits the danger information about the object to the first and second vehicles 800a and 800b, and the danger information transmitted to the first vehicle 800a differs from that transmitted to the second vehicle 800b. That is, since the first vehicle 800a is located in the critical zone, danger information including a high danger level is transmitted to it, and since the second vehicle 800b is located in the minor zone, danger information including a low danger level is transmitted to it.

Meanwhile, the vehicle driving assistance apparatus 100 detects a collision time according to the positional relationship between the object and the other vehicle and transmits the danger information including information on the detected collision time.
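A minimal sketch of the collision-time estimate, assuming straight-line constant-velocity motion (the closest-approach time is used as the collision-time estimate, which is an assumption, not the embodiment's stated method):

```python
def time_to_collision(p_obj, v_obj, p_veh, v_veh):
    """Estimate the collision time from the positional relationship.
    Returns None when the object and the vehicle are not closing."""
    rx, ry = p_obj[0] - p_veh[0], p_obj[1] - p_veh[1]   # relative position
    vx, vy = v_obj[0] - v_veh[0], v_obj[1] - v_veh[1]   # relative velocity
    closing = -(rx * vx + ry * vy)        # positive when the gap shrinks
    speed_sq = vx * vx + vy * vy
    if closing <= 0 or speed_sq == 0:
        return None
    return closing / speed_sq             # time of closest approach
```

The returned time would be packaged into the danger information alongside the danger level.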

In addition, the vehicle driving assistant device 100 may receive danger information transmitted from an external vehicle. Accordingly, the danger information for the object is displayed on the display unit 183 of the user or on the display unit of the other vehicle.

FIG. 14 is a view for explaining danger information displayed through an internal display unit according to an embodiment of the present invention.

Referring to FIG. 14, the display unit 183 displays first information IF1 indicating the type of the object represented by the danger information, second information IF2 indicating the time to collision according to the distance between the vehicle and the object, and third information IF3 indicating the vehicle function to be automatically activated according to the collision time.

At this time, the vehicle function may include an automatic full brake function.

For example, when, based on the current speed of the vehicle and the collision time, the brake operating pressure is insufficient and the vehicle cannot be stopped within the collision time even if the brake is applied at the present moment, the vehicle driving assistance apparatus 100 automatically activates the full brake so that the vehicle can stop before colliding with the object.
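This decision can be sketched with the standard stopping-distance formula v²/(2a); the two deceleration values are illustrative assumptions about normal versus full braking, not figures from the embodiment:

```python
def needs_full_brake(speed_mps, collision_time_s, normal_decel=4.0):
    """Return True when braking at the normal deceleration cannot stop
    the vehicle before the estimated point of collision, so the full
    brake must be activated automatically."""
    distance_to_object = speed_mps * collision_time_s      # closing gap
    normal_stop_dist = speed_mps ** 2 / (2 * normal_decel)  # v^2 / (2a)
    return normal_stop_dist > distance_to_object
```

For instance, at 20 m/s with 2 s to collision, normal braking at 4 m/s² needs 50 m but only 40 m remain, so the full brake would be triggered.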

On the other hand, the vehicle driving assistant device 100 can output the notification information as described above. The notification information may be information provided to the object, or may be information provided to the other vehicle.

FIGS. 15 to 19 are diagrams for explaining notification information according to an embodiment of the present invention.

Referring to FIG. 15, the vehicle driving assistance apparatus 100 may output a horn sound E1 to inform the object of the existence of other vehicles in the vicinity. Accordingly, the attention of the object is drawn toward the vehicle 700, even if only for a moment.

Referring to FIG. 16, the vehicle driving assistance apparatus 100 operates the washer fluid injector provided at the windshield W to spray washer fluid E2, thereby guiding the gaze of the object. Alternatively, the vehicle driving assistance apparatus 100 operates a washer fluid injector provided at the headlamp to generate a mist E3, thereby guiding the gaze of the object.

Referring to FIG. 17, the vehicle driving assistance apparatus 100 generates a lamp driving signal so that light E4 is projected in the direction in which the object exists. The light E4 allows nearby vehicles to easily recognize the object when the object is detected on a road at night. In addition, the vehicle driving assistance apparatus 100 may change the color of the light E4 so that the object can be noticed more easily. At this time, the vehicle driving assistance apparatus 100 projects the light only up to the chest height of the object so as not to inconvenience the object. Then, as the object moves, the vehicle driving assistance apparatus 100 shifts the position of the light in the moving direction of the object.

Referring to FIG. 18, the vehicle driving assistant device 100 outputs notification information for informing the neighboring vehicle of the existence of the object through the display unit 183 provided in the side mirror.

As shown in FIG. 18(a), the notification information may be stop information E5 for generating a stop signal according to the existence of the object. Alternatively, as shown in FIG. 18(b), the notification information may be output in the form of information E6.

Referring to FIG. 19, the vehicle driving assistant device 100 outputs a specific indicator E7 to the outside of the vehicle through a laser, as described above.

The transmitted danger information may further include not only the above-described information on the danger level but also the location information and moving state information of the object.

Meanwhile, the vehicle driving assistance apparatus 100 displays an image of the surrounding traffic situation through the display unit 183 and can determine, using the displayed image, whether to transmit the danger information of the object and to whom to transmit it.

FIGS. 20 and 21 are views showing a traffic situation image according to an embodiment of the present invention.

Referring to FIGS. 20 and 21, a plurality of objects are positioned on the road based on the vehicle 700. The plurality of objects move on the road.

At this time, the vehicle driving assistant device 100 may utilize an internal sensor unit or receive various information from outside in order to generate the traffic situation image.

That is, in order to generate the traffic situation image, the vehicle driving assistance apparatus 100 may use server information, forward camera images, following vehicle information, preceding vehicle information, oncoming vehicle information, traffic signal recognition (TSR) information, and traffic light information.

Then, the vehicle driving assistant device 100 displays an image including information about various objects disposed around the current vehicle 700 using the obtained information.

At this time, the image is divided into a plurality of areas, and the information of each object is displayed in the divided area corresponding to the position of that object.

At this time, the information of the object is displayed on the image differently depending on the danger level of the object. In other words, among the divided areas on the image, the area in which the information on an object is displayed is rendered in a different color or pattern according to the danger level of the object.

For example, the area where the high-risk object is located is displayed in red, and the area where the low-risk object is located may be displayed in blue. Alternatively, as shown in FIG. 21, different patterns may be displayed according to the degree of danger.

When the areas in which different objects are displayed on the image overlap each other, that is, when the area in which a first object is displayed and the area in which a second object is displayed overlap, information indicating that there is a risk of mutual collision is transmitted to the first object and the second object.

In other words, when the areas overlap each other, it means that the first object and the second object may collide with each other. Therefore, the vehicle driving assistance apparatus 100 transmits the collision risk information according to the overlapping of the areas.
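The overlap test on the displayed areas can be sketched as an axis-aligned rectangle intersection; the coordinate convention and function names are illustrative assumptions:

```python
def rects_overlap(a, b):
    """Axis-aligned overlap test between two displayed areas, each given
    as (left, top, right, bottom) in image coordinates."""
    return not (a[2] <= b[0] or b[2] <= a[0] or
                a[3] <= b[1] or b[3] <= a[1])

def collision_pairs(areas):
    """Return the pairs of object ids whose display areas overlap, i.e.
    the objects to which mutual-collision risk information is sent."""
    ids = list(areas)
    return [(ids[i], ids[j])
            for i in range(len(ids))
            for j in range(i + 1, len(ids))
            if rects_overlap(areas[ids[i]], areas[ids[j]])]
```

Each pair returned would receive the mutual-collision risk information described above.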

At this time, an area in which no object is located among the divided areas on the image is displayed as a blank area. The blank area may be classified into a safe area into which the vehicle can move, and a danger area into which the vehicle cannot move or would be endangered by moving.

Accordingly, if there is a risk of collision in the surroundings as described above, the vehicle driving assistance apparatus 100 may move the vehicle to a position corresponding to a safe area among the blank areas on the image, thereby preventing dangerous situations in advance.

According to the embodiment of the present invention, it is possible to detect an object (for example, a pedestrian, a bicycle, or a motorcycle) that has entered the lane through a camera image, predict the moving direction of the detected object, and transmit the resulting danger information, thereby reducing the accident rate involving pedestrians entering from positions that are difficult to predict.

Further, according to the embodiment of the present invention, by utilizing the inter-vehicle communication technology and the pedestrian assist function, a more stable driving environment can be provided.

Referring to FIG. 22, the above-described vehicle driving assistance apparatus 100 may be included in the vehicle.

The vehicle 700 includes a communication unit 710, an input unit 720, a sensing unit 760, an output unit 740, a vehicle driving unit 750, a memory 730, an interface unit 780, a control unit 770, a power supply unit 790, the vehicle driving assistance apparatus 100, and an AVN apparatus 400. Here, the units of the vehicle driving assistance apparatus 100 and the units of the vehicle 700 are described as being provided in the vehicle, but the present invention is not limited thereto.

The communication unit 710 may include one or more modules that enable wireless communication between the vehicle and the mobile terminal 600, between the vehicle and the external server 510, or between the vehicle and the other vehicle 520. In addition, the communication unit 710 may include one or more modules that connect the vehicle to one or more networks.

The communication unit 710 may include a broadcast receiving module 711, a wireless Internet module 712, a local area communication module 713, a location information module 714, and an optical communication module 715.

The broadcast receiving module 711 receives broadcast signals or broadcast-related information from an external broadcast management server through a broadcast channel. Here, the broadcast includes a radio broadcast or a TV broadcast.

The wireless Internet module 712 is a module for wireless Internet access, and can be built in or externally mounted in a vehicle. The wireless Internet module 712 is configured to transmit and receive wireless signals in a communication network according to wireless Internet technologies.

Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced). The wireless Internet module 712 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above. For example, the wireless Internet module 712 can exchange data with the external server 510 wirelessly. The wireless Internet module 712 can receive weather information and road traffic situation information (for example, TPEG (Transport Protocol Expert Group) information) from the external server 510.

The short-range communication module 713 is for short-range communication and can support short-range communication using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless USB (Universal Serial Bus).

The short range communication module 713 may form short range wireless communication networks (Wireless Area Networks) to perform short range communication between the vehicle and at least one external device. For example, the short-range communication module 713 can exchange data with the mobile terminal 600 wirelessly. The short distance communication module 713 can receive weather information and traffic situation information of the road (for example, TPEG (Transport Protocol Expert Group)) from the mobile terminal 600. For example, when the user has boarded the vehicle, the user's mobile terminal 600 and the vehicle can perform pairing with each other automatically or by execution of the user's application.

The position information module 714 is a module for acquiring the position of the vehicle, and a representative example thereof is a Global Positioning System (GPS) module. For example, when the vehicle utilizes a GPS module, it can acquire the position of the vehicle using a signal sent from the GPS satellite.

The optical communication module 715 may include a light emitting portion and a light receiving portion.

The light receiving section can convert the light signal into an electric signal and receive the information. The light receiving unit may include a photodiode (PD) for receiving light. Photodiodes can convert light into electrical signals. For example, the light receiving section can receive information of the front vehicle through light emitted from the light source included in the front vehicle.

The light emitting unit may include at least one light emitting element for converting an electric signal into an optical signal. Here, the light emitting element is preferably an LED (Light Emitting Diode). The optical transmitter converts the electrical signal into an optical signal and transmits it to the outside. For example, the optical transmitter can emit the optical signal to the outside through the blinking of the light emitting element corresponding to the predetermined frequency. According to an embodiment, the light emitting portion may include a plurality of light emitting element arrays. According to the embodiment, the light emitting portion can be integrated with the lamp provided in the vehicle. For example, the light emitting portion may be at least one of a headlight, a tail light, a brake light, a turn signal lamp, and a car light. For example, the optical communication module 715 can exchange data with another vehicle 520 via optical communication.

The input unit 720 may include a driving operation unit 721, a camera 195, a microphone 723, and a user input unit 724.

The driving operation means 721 receives a user input for driving the vehicle. The driving operation means 721 may include a steering input means 721A, a shift input means 721D, an acceleration input means 721C, and a brake input means 721B.

The steering input means 721A receives the input of the traveling direction of the vehicle from the user. The steering input means 721A is preferably formed in a wheel shape so that steering input is possible by rotation. According to the embodiment, the steering input means 721A may be formed of a touch screen, a touch pad, or a button.

The shift input means 721D receives inputs of parking (P), forward (D), neutral (N), and reverse (R) of the vehicle from the user. The shift input means 721D is preferably formed in a lever shape. According to an embodiment, the shift input means 721D may be formed of a touch screen, a touch pad, or a button.

The acceleration input means 721C receives an input for acceleration of the vehicle from the user. The brake inputting means 721B receives an input for decelerating the vehicle from the user. The acceleration input means 721C and the brake input means 721B are preferably formed in the form of a pedal. According to the embodiment, the acceleration input means 721C or the brake input means 721B may be formed of a touch screen, a touch pad, or a button.

The camera 722 may include an image sensor and an image processing module. The camera 722 may process still images or moving images obtained by an image sensor (e.g., CMOS or CCD). The image processing module processes the still image or moving image obtained through the image sensor, extracts necessary information, and transmits the extracted information to the control unit 770. Meanwhile, the vehicle may include a camera 722 for photographing the vehicle front image or the vehicle periphery image, and a monitoring unit 150 for photographing the in-vehicle image.

The monitoring unit 150 may acquire an image of the passenger. The monitoring unit 150 may obtain an image for biometrics of the passenger.

In FIG. 22, the monitoring unit 150 and the camera 722 are described as being included in the input unit 720; however, the camera 722 may also be described as a component of the vehicle driving assistance apparatus, as discussed above.

The microphone 723 can process an external sound signal as electrical data. The processed data can be used variously depending on the function being performed in the vehicle. The microphone 723 can convert the voice command of the user into electrical data. The converted electrical data can be transmitted to the control unit 770.

The camera 722 or the microphone 723 may be a component included in the sensing unit 760, rather than a component included in the input unit 720.

The user input unit 724 is for receiving information from a user. When information is input through the user input unit 724, the control unit 770 can control the operation of the vehicle to correspond to the input information. The user input unit 724 may include touch input means or mechanical input means. According to an embodiment, the user input 724 may be located in one area of the steering wheel. In this case, the driver can operate the user input portion 724 with his / her finger while holding the steering wheel.

The sensing unit 760 senses signals related to the running of the vehicle and the like. To this end, the sensing unit 760 may include a collision sensor, a wheel sensor, a speed sensor, a tilt sensor, a weight sensor, a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/reverse sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, an ultrasonic sensor, a radar, and the like.

Thereby, the sensing unit 760 can acquire vehicle collision information, vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, steering wheel rotation angle, and the like.

In addition, the sensing unit 760 may include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.

The sensing unit 760 may include a biometric information sensing unit. The biometric information sensing unit senses and acquires the biometric information of the passenger. The biometric information may include fingerprint information, iris-scan information, retina-scan information, hand geometry information, facial recognition information, and voice recognition information. The biometric information sensing unit may include a sensor that senses the passenger's biometric information. Here, the monitoring unit 150 and the microphone 723 may operate as such sensors. The biometric information sensing unit can acquire the hand geometry information and the facial recognition information through the monitoring unit 150.

The output unit 740 is for outputting information processed by the control unit 770 and may include a display unit 741, a sound output unit 742, and a haptic output unit 743.

The display unit 741 can display information processed in the control unit 770. For example, the display unit 741 can display the vehicle-related information. Here, the vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for a driving guide to the vehicle driver. Further, the vehicle-related information may include vehicle state information indicating the current state of the vehicle or vehicle driving information related to the driving of the vehicle.

The display unit 741 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a 3D display, and an e-ink display.

The display unit 741 may have a mutual layer structure with the touch sensor or may be integrally formed to realize a touch screen. This touch screen may function as a user input 724 that provides an input interface between the vehicle and the user, while providing an output interface between the vehicle and the user. In this case, the display unit 741 may include a touch sensor that senses a touch with respect to the display unit 741 so that a control command can be received by a touch method. When a touch is made to the display unit 741, the touch sensor senses the touch, and the control unit 770 generates a control command corresponding to the touch based on the touch. The content input by the touch method may be a letter or a number, an instruction in various modes, a menu item which can be designated, and the like.

Meanwhile, the display unit 741 may include a cluster so that the driver can check the vehicle state information or the vehicle driving information while driving. Clusters can be located on the dashboard. In this case, the driver can confirm the information displayed in the cluster while keeping the line of sight ahead of the vehicle.

Meanwhile, according to the embodiment, the display unit 741 may be implemented as a Head Up Display (HUD). When the display unit 741 is implemented as a HUD, information can be output through a transparent display provided in the windshield. Alternatively, the display unit 741 may include a projection module to output information through an image projected on the windshield.

The sound output unit 742 converts an electric signal from the control unit 770 into an audio signal and outputs the audio signal. For this purpose, the sound output unit 742 may include a speaker or the like. The sound output unit 742 may also output a sound corresponding to the operation of the user input unit 724.

The haptic output unit 743 generates a tactile output. For example, the haptic output section 743 may operate to vibrate the steering wheel, the seat belt, and the seat so that the user can recognize the output.

The vehicle driving unit 750 can control the operation of various devices of the vehicle. The vehicle driving unit 750 may include a power source driving unit 751, a steering driving unit 752, a brake driving unit 753, a lamp driving unit 754, an air conditioning driving unit 755, a window driving unit 756, an airbag driving unit 757, a sunroof driving unit 758, and a suspension driving unit 759.

The power source drive section 751 can perform electronic control of the power source in the vehicle.

For example, when a fossil fuel-based engine (not shown) is the power source, the power source driving unit 751 can perform electronic control of the engine, so that the output torque of the engine and the like can be controlled. When the power source is an engine, the speed of the vehicle can be limited by limiting the engine output torque under the control of the control unit 770.

As another example, when the electric motor (not shown) is a power source, the power source driving unit 751 can perform control on the motor. Thus, the rotation speed, torque, etc. of the motor can be controlled.

The steering driver 752 may perform electronic control of a steering apparatus in the vehicle. Thus, the traveling direction of the vehicle can be changed.

The brake driver 753 can perform electronic control of a brake apparatus (not shown) in the vehicle. For example, it is possible to reduce the speed of the vehicle by controlling the operation of the brakes disposed on the wheels. As another example, it is possible to adjust the traveling direction of the vehicle to the left or right by differently operating the brakes respectively disposed on the left wheel and the right wheel.

The lamp driver 754 can control the turn-on / turn-off of the lamps disposed inside and outside the vehicle. Also, the intensity, direction, etc. of the light of the lamp can be controlled. For example, it is possible to perform control on a direction indicating lamp, a brake lamp, and the like.

The air conditioning driving unit 755 can perform electronic control on an air conditioner (not shown) in the vehicle. For example, when the temperature inside the vehicle is high, the air conditioner can be operated to control the cool air to be supplied to the inside of the vehicle.

The window driving unit 756 may perform electronic control of a window apparatus in the vehicle. For example, it is possible to control the opening or closing of the side of the vehicle with respect to the left and right windows.

The airbag driving unit 757 can perform electronic control of the airbag apparatus in the vehicle. For example, in case of danger, the airbag can be controlled to fire.

The sunroof driving unit 758 may perform electronic control of a sunroof apparatus (not shown) in the vehicle. For example, the opening or closing of the sunroof can be controlled.

The suspension driving unit 759 can perform electronic control of a suspension apparatus (not shown) in the vehicle. For example, when there is a curvature on the road surface, it is possible to control the suspension device so as to reduce the vibration of the vehicle.

The memory 730 is electrically connected to the control unit 770. The memory 730 may store basic data for each unit, control data for controlling the operation of each unit, and input/output data. In hardware, the memory 730 can be any of various storage devices such as a ROM, a RAM, an EPROM, a flash drive, a hard drive, and the like. The memory 730 may store various data for the operation of the entire vehicle, such as programs for the processing or control of the control unit 770.

The interface unit 780 can serve as a pathway to various kinds of external devices connected to the vehicle. For example, the interface unit 780 may include a port that can be connected to the mobile terminal 600, and may be connected to the mobile terminal 600 through the port. In this case, the interface unit 780 can exchange data with the mobile terminal 600.

Meanwhile, the interface unit 780 may serve as a channel for supplying electrical energy to the connected mobile terminal 600. When the mobile terminal 600 is electrically connected to the interface unit 780, the interface unit 780 provides electric energy supplied from the power supply unit 790 to the mobile terminal 600 under the control of the control unit 770.

The control unit 770 can control the overall operation of each unit in the vehicle. The control unit 770 may be referred to as an ECU (Electronic Control Unit).

When the vehicle driving assistance apparatus transmits an execution signal, the control unit 770 can perform a function corresponding to the transmitted signal.

The control unit 770 may be implemented in hardware as one or more of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing functions.

The control unit 770 can take over the role of the processor 170 described above. That is, the processor 170 of the vehicle driving assistance apparatus may be implemented directly in the control unit 770 of the vehicle. In this embodiment, the vehicle driving assistance apparatus is understood to refer to a combination of some components of the vehicle.

Alternatively, the control unit 770 may control the respective components so as to transmit the information requested by the processor 170.

The power supply unit 790 can supply the power necessary for the operation of each component under the control of the control unit 770. In particular, the power supply unit 790 can receive power from a battery (not shown) in the vehicle.

The AVN (Audio Video Navigation) device 400 can exchange data with the control unit 770. The control unit 770 can receive navigation information from the AVN device 400 or a separate navigation device (not shown). Here, the navigation information may include set destination information, route information according to the destination, map information for vehicle driving, or current vehicle location information.

The features, structures, effects and the like described in the embodiments are included in at least one embodiment and are not necessarily limited to only one embodiment. Furthermore, the features, structures, effects and the like illustrated in the embodiments can be combined and modified by other persons skilled in the art to which the embodiments belong. Accordingly, the contents of such combinations and modifications should be construed as being included in the scope of the embodiments.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention. For example, each component specifically shown in the embodiments can be modified in implementation. It is to be understood that the present invention may be embodied in many other specific forms without departing from its spirit or essential characteristics.

Claims (19)

A sensor unit for sensing a first object around the vehicle;
A processor for recognizing the sensed first object and generating risk information for the first object based on the recognized movement state of the first object; And
And a communication unit for transmitting the risk information generated through the processor to the outside
Vehicle driving assistance device.
The method according to claim 1,
The moving state may include:
And at least one of a moving direction and a moving speed of the first object,
The processor comprising:
Determines a risk level of the first object based on at least one of a moving direction and a moving speed of the first object, and generates risk information including information on the determined risk level
Vehicle driving assistance device.
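Claim 2 derives a risk level from an object's moving direction and moving speed. As a minimal illustration only (the two-input moving state, the speed threshold, and the three-level scale are assumptions, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class MovingState:
    heading_toward_vehicle: bool  # simplified stand-in for "moving direction"
    speed_mps: float              # moving speed in metres per second

def risk_level(state: MovingState) -> str:
    """Return a coarse risk level for a detected object."""
    if not state.heading_toward_vehicle:
        return "low"
    return "high" if state.speed_mps > 2.0 else "medium"

print(risk_level(MovingState(True, 3.5)))   # high
print(risk_level(MovingState(False, 3.5)))  # low
```

The resulting level string would be the "information on the determined risk level" packaged into the risk information that the communication unit transmits.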
The method according to claim 1,
The processor comprising:
When the ignition of the vehicle is turned off, driving power is supplied to the sensor unit and the communication unit for sensing the first object and transmitting the risk information
Vehicle driving assistance device.
The method according to claim 1,
The processor comprising:
Detecting at least one nearby vehicle existing in the vicinity of the vehicle, and transmitting the risk information to the at least one nearby vehicle
Vehicle driving assistance device.
5. The method of claim 4,
The processor comprising:
Detecting a neighboring vehicle that is moving to the position of the first object among the sensed neighboring vehicles, and transmitting the risk information to the detected adjacent vehicle
Vehicle driving assistance device.
6. The method of claim 5,
The processor comprising:
And notifying the presence of the adjacent vehicle to the first object
Vehicle driving assistance device.
6. The method of claim 5,
The processor comprising:
Outputs a lamp driving signal so that light is emitted toward the position of the first object and the moving direction of the first object
Vehicle driving assistance device.
6. The method of claim 5,
Further comprising an announcement information output unit for outputting announcement information informing existence of the first object to the adjacent vehicle,
The announcement information output unit,
And an indicator output unit for displaying an indicator according to the first object outside the vehicle
Vehicle driving assistance device.
5. The method of claim 4,
The processor comprising:
Detects a risk level of the first object with respect to the adjacent vehicle on the basis of the moving state of the adjacent vehicle and the moving state of the first object, and transmits the risk information including the detected risk level to the adjacent vehicle
Vehicle driving assistance device.
10. The method of claim 9,
The processor comprising:
A peripheral region of the vehicle is divided into a plurality of regions according to positions,
Setting a different risk level for each of the plurality of divided areas,
And transmits information on the set risk level to an adjacent vehicle existing in each of the divided areas
Vehicle driving assistance device.
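Claim 10 divides the vehicle's surroundings into a plurality of regions and assigns each region a different risk level. A hedged sketch of one possible zoning (the concentric-ring scheme, the ring radii, and the numeric levels are assumptions for illustration):

```python
import math

def zone_risk(dx: float, dy: float) -> int:
    """Map an object's position relative to the vehicle (in metres)
    to a risk level: 3 = innermost ring, 1 = outermost, 0 = outside."""
    distance = math.hypot(dx, dy)
    if distance <= 5.0:
        return 3
    if distance <= 15.0:
        return 2
    if distance <= 30.0:
        return 1
    return 0

print(zone_risk(3.0, 0.0))   # 3 (inside the innermost ring)
print(zone_risk(0.0, 20.0))  # 1 (outermost ring)
```

Each nearby vehicle would then receive the level of whichever region it currently occupies, so a vehicle close to the sensed object is warned more urgently than one far away.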
The method according to claim 1,
Wherein the communication unit:
Transmits the risk information to a traffic server existing in the vicinity of the vehicle, and a traffic signal system is changed based on the transmitted risk information
Vehicle driving assistance device.
The method according to claim 1,
Further comprising a display unit for receiving risk information on a second object recognized by an adjacent vehicle from the adjacent vehicle and displaying the received risk information
Vehicle driving assistance device.
13. The method of claim 12,
The displayed risk information may include,
At least one of position information of the second object, moving state information of the second object, and risk level information of the second object
Vehicle driving assistance device.
14. The method of claim 13,
The displayed risk information may include,
Further includes time-to-collision information on the second object,
The processor comprising:
Automatically controls the brake based on the time-to-collision information
Vehicle driving assistance device.
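A time-to-collision check such as the one in claim 14 might look roughly like this. This is only a sketch under assumptions; the 2-second threshold and the constant-closing-speed estimate are not specified by the patent.

```python
def should_auto_brake(distance_m: float, closing_speed_mps: float,
                      ttc_threshold_s: float = 2.0) -> bool:
    """Return True when the estimated time-to-collision with the
    second object falls below the threshold, triggering the brake."""
    if closing_speed_mps <= 0.0:   # not closing in: no collision expected
        return False
    return distance_m / closing_speed_mps < ttc_threshold_s

print(should_auto_brake(10.0, 10.0))  # True  (TTC = 1 s)
print(should_auto_brake(50.0, 10.0))  # False (TTC = 5 s)
```

In practice the closing speed would come from the received moving-state information rather than being constant, but the threshold comparison is the core of the decision.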
A sensor unit for sensing an object existing around the vehicle;
A display unit for displaying an image of a traffic situation around the vehicle; And
And a processor for displaying information of the object on the image based on a position of the detected object when the object is sensed,
The processor comprising:
Dividing the image into a plurality of regions,
The information of the object is displayed in an area corresponding to the position of the object among the divided areas,
Wherein the information of the object includes:
Are displayed on the image in different ways depending on the degree of danger of the object
Vehicle driving assistance device.
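Claim 15 splits the around-view image into a plurality of regions and displays the object's information in the region matching its position, styled by degree of danger. A toy version, where the 3x3 grid, the 640x480 image size, and the colour names are all illustrative assumptions:

```python
def image_region(x: int, y: int, width: int, height: int,
                 cols: int = 3, rows: int = 3) -> tuple[int, int]:
    """Map a pixel coordinate to a (column, row) cell of the image grid."""
    col = min(cols - 1, x * cols // width)
    row = min(rows - 1, y * rows // height)
    return col, row

RISK_COLORS = {3: "red", 2: "orange", 1: "yellow", 0: "green"}

def region_style(risk: int) -> str:
    """Pick a display colour for a region by its degree of danger."""
    return RISK_COLORS.get(risk, "green")

print(image_region(639, 479, 640, 480))  # (2, 2) - bottom-right cell
print(region_style(3))                   # red
```

Distinguishing regions by colour or pattern, as in claim 16, then reduces to looking up the style for each region's current risk level.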
16. The method of claim 15,
Wherein the area in which the information about the object among the divided areas is displayed,
Depending on the degree of risk, different colors or patterns are distinguished
Vehicle driving assistance device.
17. The method of claim 16,
Further comprising a communication unit for transmitting the collision risk information according to the overlapping to the first and second objects when the first and second objects in the image overlap with each other,
Vehicle driving assistance device.
18. The method of claim 17,
The processor comprising:
If a collision risk is detected, moves the vehicle to a position corresponding to a safe zone on the image
Vehicle driving assistance device.
A vehicle comprising the vehicle driving assistance device of claim 1.
KR1020150176349A 2015-12-10 2015-12-10 Driver Assistance Apparatus and Vehicle Having The Same KR20170069096A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150176349A KR20170069096A (en) 2015-12-10 2015-12-10 Driver Assistance Apparatus and Vehicle Having The Same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150176349A KR20170069096A (en) 2015-12-10 2015-12-10 Driver Assistance Apparatus and Vehicle Having The Same

Publications (1)

Publication Number Publication Date
KR20170069096A true KR20170069096A (en) 2017-06-20

Family

ID=59281140

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150176349A KR20170069096A (en) 2015-12-10 2015-12-10 Driver Assistance Apparatus and Vehicle Having The Same

Country Status (1)

Country Link
KR (1) KR20170069096A (en)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200064439A (en) * 2018-11-29 2020-06-08 주식회사 알티스트 System and method of obstacle verification based on inter-vehicular communication
KR20200097831A (en) * 2019-02-08 2020-08-20 배민재 Device and method for controlliing sound singal of vehicle, and device of outputting soung signal
US11450156B2 (en) 2019-02-08 2022-09-20 Minjae BAE Device and method for controlling sound signal of vehicle, and device of outputting sound signal
KR20200104221A (en) * 2019-02-26 2020-09-03 도요타지도샤가부시키가이샤 In-vehicle information processing device, inter-vehicle information processing system, and information processing system

Similar Documents

Publication Publication Date Title
KR101750178B1 (en) Warning Method Outside Vehicle, Driver Assistance Apparatus For Executing Method Thereof and Vehicle Having The Same
KR101916993B1 (en) Display apparatus for vehicle and control method thereof
KR101832466B1 (en) Parking Assistance Apparatus and Vehicle Having The Same
US10737689B2 (en) Parking assistance apparatus and vehicle having the same
KR101826408B1 (en) Display Apparatus and Vehicle Having The Same
EP3481692B1 (en) Driver assistance apparatus
CN109789778B (en) Automatic parking assist device and vehicle comprising same
KR20170058188A (en) Driver Assistance Apparatus and Vehicle Having The Same
KR101860626B1 (en) Driver Assistance Apparatus and Vehicle Having The Same
KR20180040235A (en) Parking Assistance Apparatus and Vehicle Having The Same
KR20170099188A (en) Driver Assistance Apparatus and Vehicle Having The Same
KR20180037730A (en) Display apparatus for vehicle and vehicle having the same
KR20170111084A (en) Display Apparatus and Vehicle Having The Same
KR101962348B1 (en) Driver Assistance Apparatus and Vehicle Having The Same
KR101790426B1 (en) Apparatus for automatic parking and vehicle having the same
KR20170072092A (en) Driver assistance appratus and method thereof
KR20170033612A (en) Driver Assistance Apparatus and Vehicle Having The Same
KR101972352B1 (en) Parking Assistance Apparatus and Vehicle Having The Same
KR101929294B1 (en) Parking Assistance Apparatus and Vehicle Having The Same
KR101843535B1 (en) Driver Assistance Apparatus and Vehicle Having The Same
KR20170069096A (en) Driver Assistance Apparatus and Vehicle Having The Same
KR101888259B1 (en) Vehicle Assistance Apparatus and Vehicle Having The Same
KR20180073042A (en) Driving assistance apparatus and vehicle having the same
KR101737236B1 (en) Driver Assistance Apparatus and Vehicle Having The Same
KR101894636B1 (en) Driver Assistance Apparatus and Vehicle Having The Same