WO2012172713A1 - 道路形状判定装置、車載用画像認識装置、撮像軸調整装置およびレーン認識方法 - Google Patents
- Publication number: WO2012172713A1 (application PCT/JP2012/001576)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- unit
- imaging
- lane
- road
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/06—Road conditions
- B60W40/072—Curvature of the road
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W2050/0062—Adapting control system settings
- B60W2050/0075—Automatic parameter input, automatic initialising or calibrating means
- B60W2050/0083—Setting, resetting, calibration
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/02—Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
- B60W50/0205—Diagnosing or detecting failures; Failure detection models
- B60W2050/0215—Sensor drifts or sensor failures
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/10—Longitudinal speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/18—Steering angle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/10—Path keeping
- B60W30/12—Lane keeping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30256—Lane; Road marking
Definitions
- the present invention relates to a technique for recognizing a lane shape or the like on which a vehicle travels by a camera mounted on the vehicle.
- in conventional techniques, the left and right lane markers of the traveling lane are image-recognized from the image captured by the camera. The intersections of the extension lines of the recognized left and right lane markers are then obtained, and these are collected and averaged to obtain the camera mounting angle error.
- the present invention has been made in view of the above problems, and an object of the present invention is to determine, with a smaller calculation load, whether the road is a straight road so that the imaging angle of an imaging unit provided in a vehicle can be corrected.
- the periphery of the vehicle is imaged by an imaging unit provided in the vehicle, and the lane shape of the traveling lane in which the vehicle travels is recognized from the captured image.
- based on the lane shape in a near region relatively close to the host vehicle and the lane shape in a far region far from the host vehicle, among the recognized lane shapes, the deviation between the intersection of the extension lines that linearly approximate the left and right lane markers located in the near region and the intersection of the extension lines that linearly approximate the left and right lane markers located in the far region is obtained; when this deviation is equal to or less than a preset threshold value, the road is determined to be a straight road.
- FIG. 1 is a diagram showing an example of a vehicle equipped with the in-vehicle image recognition device according to the first embodiment of the present invention.
- FIG. 2 is a functional block diagram showing an example of the configuration of the in-vehicle image recognition apparatus according to the first embodiment of the present invention.
- FIG. 3 is a functional block diagram illustrating an example of the configuration of the lane shape recognition unit 102.
- FIG. 4 is a schematic diagram showing the concept of the processing in the lane shape recognition unit.
- FIG. 5 is a schematic diagram showing the concept of performing the lane recognition process separately for a near region and a far region.
- FIG. 6 is a flowchart showing an example of the processing in the in-vehicle image recognition apparatus according to the first embodiment of the present invention.
- FIG. 1 is a diagram illustrating an example of a vehicle equipped with an in-vehicle image recognition device according to the present embodiment.
- the in-vehicle image recognition apparatus according to the present embodiment is an apparatus for recognizing a lane in which a vehicle travels from an image captured by an in-vehicle camera.
- the vehicle 1 includes a camera 10 incorporating an image processing device 10a, a vehicle speed detection device 20, a steering angle detection device 30, a steering angle control device 40, and a steering angle actuator 50.
- the camera 10 captures an image in front of the vehicle 1.
- the camera 10 is a digital camera provided with an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor). More specifically, the camera 10 is a progressive scan type 3CMOS camera that captures images at high speed.
- CCD Charge Coupled Device
- CMOS Complementary Metal Oxide Semiconductor
- the camera 10 is installed, for example, at the front center of the ceiling of the vehicle 1 so as to image the area ahead of the vehicle 1, and images the traveling road ahead of the vehicle 1 through the windshield.
- any other installation mode may be used as long as the camera captures an image of the traveling path of the host vehicle 1.
- for example, the camera 10 can be attached to the rear of the vehicle 1 like a back view camera, or to the front end of the vehicle 1 such as the bumper; the vanishing point need not even appear in the field of view of the camera 10.
- in that case, a virtual vanishing point can be calculated by detecting the edges of the lane markers and calculating their approximate straight lines.
- the image processing apparatus 10a is an apparatus that executes the lane recognition process according to the present embodiment. That is, the camera 10 incorporating the image processing apparatus 10a of FIG. 1 corresponds to the in-vehicle image recognition apparatus according to the present embodiment. Information output from the image processing device 10a, the vehicle speed detection device 20, and the steering angle detection device 30 is input to the steering angle control device 40. The steering angle control device 40 then outputs a signal for realizing the target steering to the steering angle actuator 50.
- the camera 10 and the steering angle control device 40 are each provided with a microcomputer, peripheral components thereof, drive circuits for various actuators, and the like, and transmit / receive information to / from each other via a communication circuit.
- the lane recognition process according to the present embodiment is realized by the hardware configuration as described above.
- the camera 10 incorporating the image processing apparatus 10a functionally includes an imaging unit 101, a lane shape recognition unit 102, a vehicle behavior recognition unit 103, an imaging angle derivation unit 104, an information bias determination unit 105, and an imaging angle correction unit 106.
- the imaging unit 101 images the periphery of the vehicle 1.
- the lane shape recognition unit 102 recognizes the lane shape of the travel lane in which the vehicle 1 travels from the captured image captured by the imaging unit 101.
- a traveling lane detection method for example, a known method described in JP-A-2004-252827 may be employed.
- a method for calculating the shape of the traveling lane and the position and posture of the vehicle for example, a known method described in Japanese Patent Application Laid-Open No. 2004-318618 may be employed.
- using the lane shape recognized in this way, the lane shape recognition unit 102 obtains the intersection coordinates from the extension lines of the pair of left and right lane markers in the far region and the near region. For example, the intersection coordinates are obtained by the following method. The lane shape recognition unit 102 includes an arithmetic unit that analyzes the image captured by the imaging unit 101 and calculates the yaw angle C of the vehicle 1, the pitch angle D of the vehicle 1, the height H of the imaging unit 101 above the road surface, the lateral displacement A from the lane center, and the curvature B of the traveling lane.
- the lane shape recognition unit 102 outputs the yaw angle C of the vehicle 1, the lateral displacement A from the center of the lane, and the curvature B of the traveling lane calculated by the calculation device to the steering angle control device 40. Thereby, for example, automatic steering of the vehicle 1 is realized.
- FIG. 3 is a diagram illustrating a configuration example of the lane shape recognition unit 102.
- FIG. 4 is a schematic diagram showing the concept of processing in the lane shape recognition unit 102.
- the lane shape recognition unit 102 includes a white line candidate point detection unit 200, a lane recognition processing unit 300, and an optical axis correction unit 400.
- the white line candidate point detection unit 200 detects white line candidate points to be lane markings based on image data captured by the imaging unit 101.
- the white line candidate point detection unit 200 acquires an image obtained by imaging the travel path of the host vehicle 1 from the imaging unit 101, and detects the white line edge Ed by performing image processing.
- the position of the image processing frame F is determined, based on road parameters (the road shape and the vehicle posture with respect to the road) described later, for the lane markings (white lines) located on the left and right of the acquired captured image.
- within the set image processing frame F, first-order spatial differentiation is applied, for example using a Sobel filter, to emphasize the edges of the boundary between the white line and the road surface, and the white line edge Ed is then extracted.
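As a rough sketch of this edge step (assuming OpenCV; the frame geometry and the threshold standing in for Edth are illustrative, not values from the patent):

```python
import cv2
import numpy as np

def extract_white_line_edges(gray, frame_rect, edge_threshold=80):
    """Emphasize white-line / road-surface boundaries inside the image
    processing frame F with a horizontal Sobel filter, then keep only
    pixels whose gradient magnitude reaches the threshold (Edth)."""
    x, y, w, h = frame_rect
    roi = gray[y:y + h, x:x + w]
    # First-order spatial differentiation in x: lane markings run
    # roughly vertically in the image, so their boundaries appear as
    # strong horizontal intensity gradients.
    sobel_x = cv2.Sobel(roi, cv2.CV_32F, 1, 0, ksize=3)
    magnitude = np.abs(sobel_x)
    return (magnitude >= edge_threshold).astype(np.uint8) * 255
```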
- the lane recognition processing unit 300 includes a road shape calculation unit 310 that linearly approximates the road shape, and a road parameter estimation unit 320 that estimates the road shape and the vehicle posture with respect to the road.
- the road shape calculation unit 310 calculates an approximate straight line Rf of the road shape by using the Hough transform to extract a straight line that passes through at least a preset number Pth of pixels in which the intensity of the white line edge Ed extracted by the white line candidate point detection unit 200 is equal to or greater than a preset threshold Edth, and that connects one point on the upper side and one point on the lower side of the detection region.
- the road image data obtained by imaging is divided into two regions, a far region and a near region, and the road shape is linearly approximated in each region (see FIG. 5).
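A minimal sketch of the Hough-based fit under the same OpenCV assumption: the edge image is masked into far (upper) and near (lower) regions and the dominant straight line Rf is extracted in each. The HoughLinesP thresholds are stand-ins for the patent's pixel-count threshold Pth.

```python
import cv2
import numpy as np

def fit_line_rf(edges):
    """Fit the dominant straight line Rf (as x = slope*y + intercept)
    to a binary edge image with the probabilistic Hough transform."""
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=50,
                            minLineLength=40, maxLineGap=10)
    if lines is None:
        return None
    # Keep the longest segment as the approximation of the lane marker.
    x1, y1, x2, y2 = max(lines[:, 0],
                         key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
    slope = (x2 - x1) / (y2 - y1 + 1e-9)
    return slope, x1 - slope * y1

def near_far_lines(edges, split_row):
    """Fit Rf separately in the far (upper) and near (lower) regions,
    keeping a single image coordinate frame for both fits."""
    far = edges.copy();  far[split_row:] = 0
    near = edges.copy(); near[:split_row] = 0
    return fit_line_rf(near), fit_line_rf(far)
```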
- the road parameter estimation unit 320 estimates the road parameters (the road shape and the vehicle posture with respect to the road) as a road model equation from the approximate straight line Rf detected by the road shape calculation unit 310, using the following equation (1).
- the parameters A, B, C, D, and H in the equation (1) are road parameters and vehicle state quantities estimated by the road parameter estimation unit 320.
- the parameters A, B, C, D, and H are, respectively, the lateral displacement (A) of the vehicle 1 with respect to the lane, the road curvature (B), the yaw angle (C) of the host vehicle 1 with respect to the lane, the pitch angle (D) of the vehicle 1, and the height (H) of the imaging unit 101 above the road surface.
- W is a constant indicating the lane width (the distance between the inside of the left and right white lines on the actual road)
- f is a camera perspective transformation constant
- j is a parameter distinguishing the left and right white lines: j = 0 for the left white line and j = 1 for the right white line.
- (x, y) are the coordinates on the road image of an arbitrary point on the inner edge of the left or right white line, taking the upper left of the road image as the origin, the rightward direction as the positive x-axis, and the downward direction as the positive y-axis.
- the optical axis correction unit 400 includes a straight road determination unit 410 that determines that the travel path of the host vehicle 1 is a straight road, a parallel travel determination unit 420 that determines that the host vehicle 1 is traveling parallel to the travel path, and a virtual vanishing point calculation unit 430 that calculates a virtual vanishing point from the approximate straight lines Rf of the road shape.
- the straight road determination unit 410 determines whether the travel path of the host vehicle 1 is a straight road by comparing the degree of coincidence of the slopes and intercepts of the linear equations of the far and near approximate straight lines Rf calculated by the road shape calculation unit 310.
- the parallel travel determination unit 420 determines that the host vehicle 1 is traveling parallel to the travel path from the vehicle posture with respect to the travel path estimated by the road parameter estimation unit 320. Specifically, the parallel travel determination unit 420 calculates the lateral speed of the host vehicle 1 with respect to the lane (the differential value of the lateral displacement A) from the difference between the current and past values of the lateral displacement A, which is one of the vehicle state quantities estimated by the road parameter estimation unit 320. When the calculated lateral speed is equal to or less than a preset threshold value, it determines that the host vehicle 1 is traveling parallel to the travel path. If the travel path is a straight road, the host vehicle 1 traveling parallel to it means that the host vehicle 1 is traveling straight.
- when the straight road determination unit 410 and the parallel travel determination unit 420 determine that the travel path of the host vehicle 1 is a straight road and that the host vehicle 1 is traveling parallel to it, the virtual vanishing point calculation unit 430 calculates the intersection of the left and right approximate straight lines Rf of the road shape as the virtual vanishing point.
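The virtual vanishing point is then simply the intersection of the left and right line fits. A sketch, reusing the x = slope·y + intercept parameterization assumed above:

```python
def virtual_vanishing_point(left, right):
    """Intersection of the left and right approximate straight lines Rf,
    each given as (slope, intercept) with x expressed as a function of y.
    Returns (x, y) image coordinates, or None for near-parallel lines."""
    (sl, il), (sr, ir) = left, right
    if abs(sl - sr) < 1e-9:
        return None
    y = (ir - il) / (sl - sr)
    return sl * y + il, y
```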
- FIG. 5 is a schematic diagram showing a concept when a lane recognition process is performed by dividing a near area and a far area.
- a curved road having a relatively large radius is shown as an example.
- the lane shape recognition unit 102 divides the image data captured by the imaging unit 101 into a near region (lower part of the image) and a far region (center part of the image), and in each region the white line candidate point detection unit 200 and the lane recognition processing unit 300 detect the white line edge Ed and the approximate straight line Rf.
- based on the degree of coincidence between these lines, the straight road determination unit 410 determines whether the travel path is straight. For example, when the slopes and intercepts of the approximate straight lines Rf in the near and far regions differ by no more than preset thresholds, the straight road determination unit 410 determines that these approximate straight lines Rf coincide and that the travel path is a straight line.
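A sketch of this coincidence test, assuming each region yields a (slope, intercept) pair as above and applied separately to the left and right markers; the tolerance values are illustrative, as the patent states only "preset thresholds":

```python
def is_straight_road(near_fit, far_fit, slope_tol=0.05, intercept_tol=5.0):
    """Treat the travel path as straight when the near- and far-region
    line fits agree in slope and intercept within preset thresholds."""
    (sn, cn), (sf, cf) = near_fit, far_fit
    return abs(sn - sf) <= slope_tol and abs(cn - cf) <= intercept_tol
```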
- the vehicle behavior recognition unit 103 recognizes the behavior of the vehicle 1 on which the imaging unit 101 is mounted. Specifically, the vehicle behavior recognition unit 103 judges the behavior of the vehicle 1 from the vehicle speed (traveling speed) of the vehicle 1 detected by the vehicle speed detection device 20, the steering angle detected by the steering angle detection device 30, the longitudinal and lateral accelerations detected by an acceleration sensor (not shown), the yaw rate value detected by a yaw rate sensor, and the like.
- the imaging angle deriving unit 104 obtains the imaging angle of the imaging unit 101 from the lane shape recognition of the lane shape recognition unit 102.
- the information bias determination unit 105 determines whether or not there is a bias with respect to at least one of the lane shape recognition unit 102 and the vehicle behavior recognition unit 103.
- the imaging angle correction unit 106 corrects the imaging angle of the imaging unit 101 using the imaging angle output by the imaging angle deriving unit 104 when the information bias determination unit 105 determines that the bias is equal to or less than a preset threshold value.
- in step S101, the lane shape recognition unit 102 reads the image ahead of the vehicle 1 captured by the imaging unit 101.
- in step S102, the vehicle behavior recognition unit 103 reads the vehicle speed of the vehicle 1 detected by the vehicle speed detection device 20, the steering angle detected by the steering angle detection device 30, and the like.
- in step S103, the lane shape recognition unit 102 processes the captured image of the imaging unit 101 read in step S101 to recognize the traveling lane in which the vehicle 1 is traveling, and calculates the position and posture of the vehicle 1 with respect to the traveling lane, and the like.
- in step S104, the lane shape recognition unit 102 uses the lane shape recognized in step S103 to obtain the intersection coordinates from the extension lines of the pair of left and right lane markers in the far and near regions. As described above, the intersection coordinates of the near region obtained by the lane shape recognition unit 102 in this step are defined as (Pn_x, Pn_y), and those of the far region as (Pf_x, Pf_y).
- in step S105, the straight road determination unit 410 determines from the intersection coordinates of the far and near regions obtained in step S104 whether the road is a straight road, using the following expression. If the expression is satisfied, the process proceeds to step S106; otherwise, it proceeds to step S110.
- abs(Pn_x - Pf_x) ≤ TH_Px (0-1)
- abs (A) is a function that returns the absolute value of A.
- the value of TH_Px is a preset positive value such as 1.0. Satisfying equation (0-1) above means that the intersection coordinates on the extensions of the left and right lane markers detected from the near region of the camera imaging screen and those detected from the far region are close to each other. In other words, satisfying this condition means that the lines point in the same direction from far to near.
- in step S106, the parallel travel determination unit 420 takes as input the lateral offset position Y of the host vehicle with respect to the host lane (the lateral distance to the lane marker) obtained in step S103, pseudo-differentiates it with respect to time using the following transfer function, and calculates the lateral speed Ydot of the host vehicle. If the following expression (0-4) is satisfied, the process proceeds to step S107; otherwise, it proceeds to step S110.
- G(Z^-1) = (c - c·Z^-2) / (1 - a·Z^-1 + b·Z^-2) (0-2)
- Ydot = G(Z^-1)·Y (0-3)
- abs(Ydot) ≤ TH_Ydot (0-4)
- Z^-1 is a delay operator, and the coefficients a, b, and c are all positive numbers, discretized at a sampling period of 50 ms so as to realize a preset frequency characteristic.
- the value of TH_Ydot is a preset positive value such as 0.03, and may be increased in accordance with the vehicle speed.
- satisfying the above expression (0-4) means that the vehicle has not moved laterally with respect to the lane marker. In other words, it means that the vehicle is traveling along the lane marker without wobbling from side to side. Further, when the above equations (0-1) and (0-4) are satisfied at the same time, it means that the vehicle is traveling straight on a straight road.
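The pseudo-differentiation of equations (0-2) and (0-3) reduces to a second-order difference equation. In the sketch below the coefficients a, b, and c are illustrative stable values, since the patent states only that they are positive and are discretized at the 50 ms sampling period to a preset frequency characteristic; the output is compared against TH_Ydot as in (0-4):

```python
class PseudoDifferentiator:
    """Discrete pseudo time-differentiation of the lateral offset Y:
    Ydot = G(z^-1) Y with G(z^-1) = (c - c z^-2) / (1 - a z^-1 + b z^-2),
    run once per 50 ms sample."""

    def __init__(self, a=1.56, b=0.64, c=1.0):
        self.a, self.b, self.c = a, b, c
        self.y_prev = [0.0, 0.0]    # Y[k-1], Y[k-2]
        self.yd_prev = [0.0, 0.0]   # Ydot[k-1], Ydot[k-2]

    def step(self, y):
        # Difference equation form of (0-2)/(0-3):
        # Ydot[k] = a*Ydot[k-1] - b*Ydot[k-2] + c*(Y[k] - Y[k-2])
        yd = (self.a * self.yd_prev[0] - self.b * self.yd_prev[1]
              + self.c * (y - self.y_prev[1]))
        self.y_prev = [y, self.y_prev[0]]
        self.yd_prev = [yd, self.yd_prev[0]]
        return yd
```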
- in step S107, the straight road determination unit 410 determines whether the road curvature Row obtained in step S103 satisfies all of the following conditions. If it does, the process proceeds to step S108; otherwise, it proceeds to step S110.
- abs(Row) < TH_ROW (1-1)
- abs(SumTotalRow + Row) < TH_ROW (1-2)
- abs (A) is a function that returns the absolute value of A.
- SumTotalRow is the total value of the road curvature Row.
- TH_ROW is a threshold value when the traveling lane is regarded as a straight road.
- the information bias determination unit 105 determines that the traveling lane is a straight road when the absolute value of the road curvature Row and the absolute value of the sum SumTotalRow + Row are less than TH_ROW.
- the value of TH_ROW is a preset positive value such as 0.0003, for example.
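Conditions (1-1) and (1-2) amount to the following check (TH_ROW = 0.0003 as given in the text):

```python
def lane_is_straight(row, sum_total_row, th_row=0.0003):
    """Equations (1-1)/(1-2): both the instantaneous road curvature Row
    and the accumulated total SumTotalRow + Row must stay below TH_ROW."""
    return abs(row) < th_row and abs(sum_total_row + row) < th_row
```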
- the imaging angle of the imaging unit 101 is corrected in step S108 and subsequent steps.
- the reason the imaging angle of the imaging unit 101 is corrected only in scenes corresponding to a straight road is that, on a straight road, the point at infinity of the road coincides with the center coordinates on the image when image processing is performed. That is, obtaining the center coordinates from the intersection of the pair of left and right markers recognized on a straight road is generally more accurate than obtaining them by correcting from the estimated curvature value on a curved road.
- in step S108, the total value SumTotalRow of the road curvature is updated by the following equation.
- SumTotalRow = SumTotalRow + Row (1-3)
- in step S109, the imaging angle correction unit 106 corrects the imaging angle of the imaging unit 101 by the following equations and determines the corrected imaging angle of the imaging unit 101.
- FOE_X_est = 0.9 * FOE_X_est + 0.1 * Pn_x (1-4)
- FOE_Y_est = 0.9 * FOE_Y_est + 0.1 * Pn_y (1-5)
- FOE_X_est and FOE_Y_est are coordinates on the captured image ahead of the vehicle 1 that correspond to the imaging angle of the imaging unit 101. Their initial values are measurements of the camera mounting error (also called the camera imaging angle error) with respect to a fixed target, obtained by initial adjustment at a factory or the like (calibration of the mounting error, called factory aiming or the like).
- the coordinates calculated by the above formulas (1-4) and (1-5) are used as the origin coordinates when performing the lane recognition process in the subsequent step S103.
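Equations (1-4) and (1-5) are a plain exponential smoothing of the estimated imaging-axis coordinates toward the near-region intersection; a sketch:

```python
def update_foe_estimate(foe_x, foe_y, pn_x, pn_y, alpha=0.1):
    """Equations (1-4)/(1-5): blend the current estimate (FOE_X_est,
    FOE_Y_est) with the near-region intersection (Pn_x, Pn_y) using the
    0.9/0.1 weights given in the text."""
    return ((1.0 - alpha) * foe_x + alpha * pn_x,
            (1.0 - alpha) * foe_y + alpha * pn_y)
```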
- aiming refers to optical axis adjustment.
- in step S110, the past values used in the filters and the counter values used in the timers are updated, and the processing ends. Note that the value of SumTotalRow is initialized to "0" before the processing flow of FIG. 6 is executed.
- the in-vehicle image recognition apparatus described above recognizes the lane shape of the travel lane in which the vehicle 1 travels from the image captured by the imaging unit 101, which images the travel path around the vehicle 1. It also obtains the imaging angle of the imaging unit 101 from the recognized lane shape. It then determines whether the recognized lane shape is biased, and corrects the imaging angle of the imaging unit 101 using the imaging angle obtained when it is determined that there is no bias. As a result, the imaging angle error of the imaging unit can be accurately detected and corrected with less processing load.
- although the present embodiment does not include a standard deviation calculation unit, the corrections of equations (1-4) and (1-5) may be applied using, as input, only results obtained in the specific unbiased state in which the vehicle travels straight on a straight road. In this case, the input values tend strongly toward a normal distribution, so the correction accuracy is also high.
- the in-vehicle image recognition apparatus of the present embodiment is an in-vehicle image recognition apparatus provided in the vehicle 1.
- the imaging unit 101 images the periphery of the vehicle 1.
- the lane shape recognition unit 102 recognizes the lane shape of the traveling lane in which the vehicle 1 travels from the captured image of the imaging unit 101.
- the imaging angle deriving unit 104 obtains the imaging angle of the imaging unit 101 from the lane shape recognized by the lane shape recognition unit 102.
- based on the lane shape in the near region relatively close to the host vehicle and the lane shape in the far region far from the host vehicle, among the lane shapes recognized by the lane shape recognition unit, the straight road determination unit 410 determines the deviation between the intersection of the extension lines that linearly approximate the left and right lane markers located in the near region and the intersection of the extension lines that linearly approximate the left and right lane markers located in the far region.
- the imaging angle correction unit 106 corrects the imaging angle of the imaging unit 101 using the imaging angle obtained by the imaging angle deriving unit 104 when the straight road determination unit 410 determines that the bias is within the threshold.
- as a result, even when the road shape is biased, such as when the vehicle travels on a highway in only one direction or on a road with many right curves, the imaging angle of the imaging unit can be estimated with a smaller calculation load.
- the straight path can be determined with higher accuracy by using the deviation of the intersection.
- the straight road determination unit 410 determines that the bias is within the threshold when the absolute value of the value indicating the lane shape recognized by the lane shape recognition unit 102 is smaller than a predetermined threshold and the absolute value of the sum of the values indicating the lane shape is also smaller than the threshold. Further, the information bias determination unit 105 determines the bias using the road curvature recognized by the lane shape recognition unit 102. This makes it possible to estimate the imaging angle of the imaging unit with a smaller calculation load even when the road shape is biased, such as when the vehicle travels on a highway in only one direction or on a road with many right curves.
- the vehicle speed detection device 20 detects the vehicle speed of the vehicle 1.
- the steering angle detection device 30 detects the steering angle of the vehicle 1.
- the vehicle behavior recognition unit 103 recognizes the behavior of the vehicle 1 from the vehicle speed detected by the vehicle speed detection device 20 and the steering angle detected by the steering angle detection device 30. The imaging angle of the imaging unit 101 is corrected when it is determined that the deviation from the behavior of the vehicle 1 recognized by the vehicle behavior recognition unit 103 is within the threshold. As a result, even when the road shape is biased, such as when the vehicle travels on a highway in only one direction or on a road with many right curves, the imaging angle of the imaging unit can be estimated with a smaller calculation load.
- the lane shape recognition unit 102 detects a parameter related to road curvature.
- the information bias determination unit 105 determines that the parameter value related to the road curvature has converged within a predetermined range.
- the information bias determination unit 105 integrates parameter values within a preset time from the time of determining the convergence.
- the information bias determination unit 105 determines the straight running state of the vehicle by determining that the integrated value is within a predetermined value. Then, the image recognition apparatus performs an aiming process when the information bias determination unit 105 determines that the vehicle is in a straight traveling state.
- as a result, even when the road shape is biased, such as when the vehicle travels on a highway in only one direction or on a road with many right curves, the imaging angle of the imaging unit can be estimated with a smaller calculation load.
- FIG. 7 is a diagram illustrating an example of the configuration of the in-vehicle image recognition apparatus according to the present embodiment.
- the in-vehicle image recognition apparatus according to the present embodiment includes an imaging unit 101, a lane shape recognition unit 102, a vehicle behavior recognition unit 103, an imaging angle derivation unit 104, an information bias determination unit 105, an imaging angle correction unit 106, and a standard deviation calculation unit 107.
- the standard deviation calculating unit 107 calculates the standard deviation of the imaging angle obtained by the imaging angle deriving unit 104 when the information bias determining unit 105 determines that there is no bias.
- the imaging angle correction unit 106 corrects the imaging angle of the imaging unit 101 according to the standard deviation calculated by the standard deviation calculation unit 107.
- in step S201, the lane shape recognition unit 102 reads the image ahead of the vehicle 1 captured by the imaging unit 101.
- in step S202, the vehicle behavior recognition unit 103 reads the vehicle speed of the vehicle 1 detected by the vehicle speed detection device 20, the steering angle detected by the steering angle detection device 30, the longitudinal acceleration detected by the acceleration sensor, and the yaw rate value from the yaw rate sensor.
- in step S203, the lane shape recognition unit 102 processes the captured image of the imaging unit 101 read in step S201 to recognize the traveling lane in which the vehicle 1 is traveling, and calculates the position and posture of the vehicle 1 with respect to the traveling lane, and the like.
- in step S204, the lane shape recognition unit 102 uses the lane shape recognized in step S203 to obtain the intersection coordinates from the extension lines of the pair of left and right lane markers in the far and near regions. As described above, the intersection coordinates of the near region obtained by the lane shape recognition unit 102 in this step are defined as (Pn_x, Pn_y), and those of the far region as (Pf_x, Pf_y).
- in step S205, the straight road determination unit 410 determines from the intersection coordinates of the far and near regions obtained in step S204 whether the road is a straight road, using the following equations. If both are satisfied, the process proceeds to step S206; otherwise, it proceeds to step S213. This process is the same as step S105 of the first embodiment.
- abs(Pn_x - Pf_x) < TH_PX (2-1)
- abs(Pn_y - Pf_y) < TH_PY (2-2)
- TH_PX is a threshold value for the difference in intersection coordinates between the distant region and the nearby region in the horizontal direction of the imaging screen.
- TH_PY is a threshold value for a difference in intersection coordinates between a far area and a near area in the vertical direction of the imaging screen.
- in step S206, the parallel travel determination unit 420 takes as input the lateral offset position Y of the host vehicle with respect to the host lane (the lateral distance to the lane marker) obtained in step S203, pseudo-differentiates it with respect to time using the transfer function of equations (0-2) and (0-3) above, and calculates the lateral speed Ydot of the host vehicle. If equation (0-4) above is satisfied, the process proceeds to step S207; otherwise, it proceeds to step S213.
- in step S207, the information bias determination unit 105 determines whether all of the conditions of the following expressions are satisfied. If they are, the process proceeds to step S208; otherwise, it proceeds to step S213.
- abs(SumTotalPx + Pn_x - Pf_x) < TH_PX (2-3)
- abs(SumTotalPy + Pn_y - Pf_y) < TH_PY (2-4)
- abs(YawRate) < TH_YR (2-5)
- abs(SumTotalYR + YawRate) < TH_YR (2-6)
- abs(VspDot) < TH_VD (2-7)
- abs(SumTotalVD + VspDot) < TH_VD (2-8)
- YawRate is a yaw rate value representing the speed of the vehicle 1 in the rotational direction.
- SumTotalYR is the total value of YawRate.
- TH_YR is a threshold for regarding the vehicle 1 as traveling straight; when the absolute values of YawRate and of the total SumTotalYR + YawRate are less than TH_YR, the vehicle 1 is regarded as traveling straight (equations (2-5) and (2-6)).
- VspDot is the acceleration in the longitudinal direction of the vehicle 1.
- TH_VD is a threshold value when the vehicle 1 is considered to be traveling at a constant speed. When the absolute value of VspDot is less than TH_VD, the vehicle 1 is regarded as traveling at a constant speed. SumTotalVD is the total value of VspDot.
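Taken together, the step S207 conditions (2-3) to (2-8) form a single gate on both the instantaneous values and the running totals. A sketch follows; the threshold values and the sums container are placeholders, since the patent gives no numbers for TH_YR or TH_VD:

```python
def no_information_bias(dpx, dpy, yaw_rate, vsp_dot, sums,
                        th_px=1.0, th_py=1.0, th_yr=0.01, th_vd=0.1):
    """Conditions (2-3)-(2-8): the accumulated intersection deviations,
    the yaw rate, and the longitudinal acceleration must all stay small
    before the imaging angle is allowed to be corrected.

    sums holds SumTotalPx/Py/YR/VD under the keys 'px', 'py', 'yr',
    'vd' (a hypothetical container, not from the patent).
    """
    return (abs(sums['px'] + dpx) < th_px
            and abs(sums['py'] + dpy) < th_py
            and abs(yaw_rate) < th_yr
            and abs(sums['yr'] + yaw_rate) < th_yr
            and abs(vsp_dot) < th_vd
            and abs(sums['vd'] + vsp_dot) < th_vd)
```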
- in step S208 and subsequent steps, the imaging angle of the imaging unit 101 is corrected.
- the reason the imaging angle of the imaging unit 101 is corrected in a scene in which the vehicle 1 travels straight on a straight road and there is no bias in either the travel path or the travel is as follows. Time delays in hardware, such as the capture of the image of the imaging unit 101, and time delays in software, such as image processing, always occur; even so, in such a scene the result is hardly affected by disturbances due to the behavior of the vehicle 1, and the intersection coordinates corresponding to the imaging angle of the imaging unit 101 are calculated with high accuracy. Furthermore, even when the encounter frequencies of right and left curves differ greatly, the camera mounting angle error can be obtained correctly.
- in step S208, SumTotalPx, SumTotalPy, SumTotalYR, SumTotalVD, and SumCount are updated, and the coordinate data of the near-region intersection is stored in the collection memory, according to the following equations.
- SumTotalPx = SumTotalPx + Pn_x - Pf_x (2-9)
- SumTotalPy = SumTotalPy + Pn_y - Pf_y (2-10)
- SumTotalYR = SumTotalYR + YawRate (2-11)
- SumTotalVD = SumTotalVD + VspDot (2-12)
- FOE_X_DataRcd[SumCount] = Pn_x (2-13)
- FOE_Y_DataRcd[SumCount] = Pn_y (2-14)
- SumCount = SumCount + 1 (2-15)
- FOE_X_DataRcd[] is a parameter for storing the horizontal coordinates on the captured image ahead of the vehicle 1 in the traveling direction, and FOE_Y_DataRcd[] is a parameter for storing the vertical coordinates.
- These parameters are stored in a RAM memory or the like (not shown).
- SumCount is a counter that counts the number of collected coordinate data of neighboring intersections, and the initial value is “0”. Note that SumCount is initialized before the processing flow of FIG. 8 is executed.
- in step S210, the imaging angle derivation unit 104 calculates the imaging angle of the imaging unit 101 using the following equations (2-17) and (2-18), and the standard deviation calculation unit 107 calculates the standard deviation of the imaging angle of the imaging unit 101 by equations (2-19) and (2-20).
- FOE_X_e_tmp = Σ FOE_X_DataRcd / SumCount (2-17)
- FOE_Y_e_tmp = Σ FOE_Y_DataRcd / SumCount (2-18)
- FOE_X_stdev = √( Σ (FOE_X_e_tmp - FOE_X_DataRcd)² / SumCount ) (2-19)
- FOE_Y_stdev = √( Σ (FOE_Y_e_tmp - FOE_Y_DataRcd)² / SumCount ) (2-20)
- Σ in the above equations is an operator that takes the sum over the SumCount collected near-region intersection coordinate data.
- in step S211, the variation of the imaging angle candidates of the imaging unit 101 obtained by the imaging angle derivation unit 104 is determined. Specifically, if all of the conditions of the following expressions are satisfied, the process proceeds to step S212; otherwise, it proceeds to step S213.
- FOE_X_stdev < TH_STDEV
- FOE_Y_stdev < TH_STDEV
- TH_STDEV represents a threshold value indicating a variation allowed for the imaging angle candidate of the imaging unit 101 obtained by the imaging angle deriving unit 104.
- TH_STDEV takes a positive value such as 1.0 pix. That is, when the standard deviations FOE_X_stdev and FOE_Y_stdev obtained in step S210 are smaller than TH_STDEV, the variation of the imaging angle candidates of the imaging unit 101 obtained by the imaging angle derivation unit 104 is determined to be small, and the imaging angle of the imaging unit 101 is corrected in step S212.
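A sketch of the statistics of steps S210 and S211: population mean and standard deviation over the collected intersection coordinates, followed by the TH_STDEV gate (1.0 pix per the text):

```python
import math

def foe_candidate_stats(xs, ys):
    """Equations (2-17) to (2-20): mean and population standard deviation
    of the collected near-region intersection coordinates
    (FOE_X_DataRcd[] / FOE_Y_DataRcd[], SumCount entries each)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    std_x = math.sqrt(sum((mean_x - x) ** 2 for x in xs) / n)
    std_y = math.sqrt(sum((mean_y - y) ** 2 for y in ys) / n)
    return (mean_x, mean_y), (std_x, std_y)

def variation_is_small(stds, th_stdev=1.0):
    """Step S211 gate: adopt the averaged candidate only when the scatter
    of the collected data is below TH_STDEV."""
    return stds[0] < th_stdev and stds[1] < th_stdev
```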
- by performing the correction only when the variation, as measured by the obtained standard deviation, is small, the correction accuracy can be improved over the first embodiment. Furthermore, the correction accuracy of the imaging angle achieved by the present invention after factory aiming can be specified as a concrete value.
- in step S212, the imaging angle correction unit 106 determines the corrected imaging angle of the imaging unit 101 by the following equations. These coordinates are used as the origin coordinates when the lane recognition process of step S203 is subsequently performed.
- FOE_X_est = FOE_X_e_tmp (2-23)
- FOE_Y_est = FOE_Y_e_tmp (2-24)
- in step S213, the past values used in the filters and the counter values used in the timers are updated, and the processing ends. Note that the value of SumCount is initialized to "0" before the processing flow of FIG. 8 is executed.
- the on-vehicle image recognition apparatus of this embodiment is the same as that of the first embodiment except for the configuration of the standard deviation calculation unit 107.
- the standard deviation calculation unit 107 calculates the standard deviation of the imaging angle of the imaging unit 101 when it is determined that there is no bias. Then, the imaging angle of the imaging unit 101 is corrected according to the calculated standard deviation. Thereby, the accuracy of estimation of the imaging angle of the imaging unit 101 can be improved.
- the standard deviation calculation unit 107 calculates the standard deviation of the imaging angle obtained by the imaging angle deriving unit 104 when the information bias determination unit 105 determines that the bias is within the threshold.
- the imaging angle correction unit 106 corrects the imaging angle of the imaging unit 101 according to the standard deviation calculated by the standard deviation calculation unit 107.
- since the information bias determination unit 105 determines the presence or absence of information bias, and only information determined to be free of bias is collected for calculating the standard deviation, even a small number of data points (for example, 50) tends strongly toward a normal distribution, and a correct determination of the degree of variation can be realized with a small calculation load.
- the behavior of the vehicle 1 recognized by the vehicle behavior recognition unit 103 is information regarding the rotational behavior of the vehicle 1 in the vehicle width direction. Further, the vehicle behavior recognition unit 103 recognizes the behavior of the vehicle 1 from the position in the vehicle width direction of the vehicle 1 with respect to the travel lane recognized by the lane shape recognition unit 102 or the time change of the yaw angle. Thereby, the accuracy of estimation of the imaging angle of the imaging unit 101 can be improved.
- FIG. 9 is a diagram illustrating an example of the configuration of the in-vehicle image recognition apparatus according to the present embodiment.
- the in-vehicle image recognition apparatus according to the present embodiment includes an imaging unit 101, a lane shape recognition unit 102, a vehicle behavior recognition unit 103, an imaging angle derivation unit 104, an information bias determination unit 105, an imaging angle correction unit 106, a standard deviation calculation unit 107, and an end unit 108.
- when the standard deviation calculated by the standard deviation calculation unit 107 is less than a predetermined value, the end unit 108 ends the correction of the imaging angle.
- in step S301, the lane shape recognition unit 102 reads the image ahead of the vehicle 1 captured by the imaging unit 101.
- in step S302, the vehicle behavior recognition unit 103 reads the vehicle speed of the vehicle 1 detected by the vehicle speed detection device 20, the steering angle detected by the steering angle detection device 30, the longitudinal acceleration detected by the acceleration sensor, and the yaw rate value from the yaw rate sensor.
- in step S303, the lane shape recognition unit 102 processes the captured image of the imaging unit 101 read in step S301 to recognize the traveling lane in which the vehicle 1 is traveling, and calculates the position and posture of the vehicle 1 with respect to the traveling lane, and the like.
- in step S304, the end unit 108 determines whether the imaging angle correction processing of the imaging unit 101 has been completed. If it has been completed, the process proceeds to step S305; if not, it proceeds to step S314. Specifically, if the condition of the following expression is satisfied, the process proceeds to step S305; otherwise, it proceeds to step S314.
- FlgAimComplt is a flag indicating whether or not the imaging angle correction processing of the imaging unit 101 has been completed.
- the initial value of FlgAimComplt is “0”.
- in step S305, the lane shape recognition unit 102 uses the lane shape recognized in step S303 to obtain the intersection coordinates from the extension lines of the pair of left and right lane markers in the far and near regions.
- the intersection coordinates of the neighboring area obtained by the lane shape recognition unit 102 are defined as (Pn_x, Pn_y), and the intersection coordinates of the far area are defined as (Pf_x, Pf_y).
- in step S306, the straight road determination unit 410 determines from the intersection coordinates of the far and near regions obtained in step S305 whether the road is a straight road, using equations (2-1) and (2-2) above. If both are satisfied, the process proceeds to step S307; otherwise, it proceeds to step S314. This process is the same as step S205 of the second embodiment.
- in step S307, the parallel travel determination unit 420 takes as input the lateral offset position Y of the host vehicle with respect to the host lane (the lateral distance to the lane marker) obtained in step S303, pseudo-differentiates it with respect to time using the transfer function of equations (0-2) and (0-3) above, and calculates the lateral speed Ydot of the host vehicle. If equation (0-4) above is satisfied, the process proceeds to step S308; otherwise, it proceeds to step S314. This process is the same as step S206 of the second embodiment.
- in step S308, the information bias determination unit 105 determines whether all of the conditions of the following expressions are satisfied. If they are, the process proceeds to step S309; otherwise, it proceeds to step S314.
- ysoku is a parameter indicating the speed of the vehicle 1 in the vehicle width direction.
- the vehicle-width-direction speed among the state variables of the lane recognition Kalman filter used in the lane recognition process of step S303 may be used as it is, or an equivalent value may be obtained by time-differentiating the position of the vehicle in the vehicle width direction with respect to the travel lane.
- SumTotalYsoku is the total value of ysoku.
- TH_YS is a threshold for regarding the vehicle 1 as traveling straight; as shown in equations (3-4) and (3-5), when the absolute value of the vehicle-width-direction speed ysoku and the absolute value of its total SumTotalYsoku + ysoku are less than TH_YS, the vehicle 1 is determined to be traveling straight.
- the yaw rate among the state variables of the lane recognition Kalman filter used in the lane recognition process of step S303 may be used as it is, or the yaw angle with respect to the lane may be time-differentiated.
- the meanings of expressions other than expressions (3-4) and (3-5) are the same as those in the first embodiment and the second embodiment.
- SumTotalRow = SumTotalRow + Row (3-10)
- SumTotalYsoku = SumTotalYsoku + ysoku (3-11)
- SumTotalYR = SumTotalYR + YawRate (3-12)
- SumTotalVD = SumTotalVD + VspDot (3-13)
- FOE_X_DataRcd[SumCount] = Pn_x (3-14)
- FOE_Y_DataRcd[SumCount] = Pn_y (3-15)
- SumCount = SumCount + 1 (3-16)
- in step S313, the imaging angle correction unit 106 sets the completion flag FlgAimComplt of the imaging angle estimation process of the imaging unit 101 and determines the imaging angle of the imaging unit 101 by the following equations. These coordinates are used as the origin coordinates when the lane recognition process of step S303 is subsequently performed.
- FlgAimComplt = 1 (3-17)
- FOE_X_est = FOE_X_e_tmp (3-18)
- FOE_Y_est = FOE_Y_e_tmp (3-19)
- in step S314, the past values used in the filters and the counter values used in the timers are updated, and the processing ends. Note that the values of FlgAimComplt and SumTotalRow are initialized to "0" before the processing flow of FIG. 10 is executed.
- the on-vehicle image recognition apparatus of this embodiment is the same as that of the second embodiment except for the configuration of the end unit 108.
- the end unit 108 ends the correction of the imaging angle when the calculated standard deviation is less than a predetermined value. Once the variation of the imaging angle candidates of the imaging unit 101 is determined to be small, the correction processing can thus be terminated, which reduces the processing load of the in-vehicle image recognition device.
- FIG. 11 is a diagram illustrating the effects achieved by the in-vehicle image recognition device according to the present embodiment.
- the graph shown in FIG. 11 is a result of executing the lane recognition process according to the present embodiment in a scene where there are many gentle curves on the highway.
- the data in a range 70 surrounded by a circle is data indicating the result of Pn_x calculated in step S305.
- the data in range 71 is the result of Pn_x collected in step S307.
- the worst values indicated by the broken lines 80 and 81 approached the true value of 120.0 pixels by about 10 pixels. It was also confirmed from the standard deviation that the variation was almost halved (a reduction of about 44%). Furthermore, the number of data points was reduced from 8000 to 50, which also reduced the processing load of the standard deviation calculation.
- the behavior of the vehicle 1 recognized by the vehicle behavior recognition unit 103 is information regarding the translational behavior of the vehicle 1 in the vehicle width direction. Further, the vehicle behavior recognition unit 103 recognizes the behavior of the vehicle 1 from the position in the vehicle width direction of the vehicle 1 with respect to the travel lane recognized by the lane shape recognition unit 102 or the time change of the yaw angle. Thereby, the accuracy of estimation of the imaging angle of the imaging unit 101 can be improved.
- the vehicle speed detection device 20 constitutes a vehicle speed detection unit.
- the steering angle detection device 30 constitutes a steering angle detection unit.
- the lane shape recognition unit 102 constitutes a parameter detection unit.
- the vehicle behavior recognition unit 103, or the vehicle speed detection device 20, the steering angle detection device 30, and the steering angle control device 40 constitute a parameter detection unit.
- the straight road determination unit 410 constitutes an intersection bias determination unit and a recognition bias determination unit.
- the information bias determination unit 105 constitutes a convergence determination unit, an integration unit, and a straight traveling state determination unit.
- the imaging angle derivation unit 104 and the imaging angle correction unit 106 constitute an aiming execution unit.
Description
The present invention has been made in view of the above problems, and an object thereof is to determine, with a smaller calculation load, whether the road is a straight road so that the imaging angle of an imaging unit provided in a vehicle can be corrected.
(First embodiment)
(Configuration of the in-vehicle image recognition apparatus)
FIG. 1 is a diagram showing an example of a vehicle equipped with the in-vehicle image recognition apparatus according to the present embodiment. The in-vehicle image recognition apparatus according to the present embodiment is mounted on a vehicle and recognizes, from images captured by an in-vehicle camera, the lane in which the vehicle travels. The vehicle 1 includes a camera 10 incorporating an image processing device 10a, a vehicle speed detection device 20, a steering angle detection device 30, a steering angle control device 40, and a steering angle actuator 50.
The camera 10 is, for example, a digital camera provided with an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor). More specifically, the camera 10 is a progressive scan type 3CMOS camera that captures images at high speed.
Information output from the image processing device 10a, the vehicle speed detection device 20, and the steering angle detection device 30 is input to the steering angle control device 40. The steering angle control device 40 then outputs a signal for realizing the target steering to the steering angle actuator 50.
As shown in FIG. 2, the camera 10 incorporating the image processing device 10a functionally includes an imaging unit 101, a lane shape recognition unit 102, a vehicle behavior recognition unit 103, an imaging angle derivation unit 104, an information bias determination unit 105, and an imaging angle correction unit 106.
The lane shape recognition unit 102 recognizes the lane shape of the travel lane in which the vehicle 1 travels from the image captured by the imaging unit 101. As a method for detecting the travel lane, for example, the known method described in JP-A-2004-252827 may be employed. As a method for calculating the shape of the travel lane and the position and posture of the vehicle 1, for example, the known method described in JP-A-2004-318618 may be employed.
That is, the lane shape recognition unit 102 includes an arithmetic unit that analyzes the image captured by the imaging unit 101 and calculates the yaw angle C of the vehicle 1, the pitch angle D of the vehicle 1, the height H of the imaging unit 101 above the road surface, the lateral displacement A from the lane center, and the curvature B of the travel lane. The lane shape recognition unit 102 outputs the yaw angle C of the vehicle 1, the lateral displacement A from the lane center, and the curvature B of the travel lane calculated by this arithmetic unit to the steering angle control device 40. Thereby, for example, automatic steering of the vehicle 1 is realized.
In FIG. 3, the lane shape recognition unit 102 includes a white line candidate point detection unit 200, a lane recognition processing unit 300, and an optical axis correction unit 400.
The white line candidate point detection unit 200 detects candidate points of the white lines serving as lane markings based on the image data captured by the imaging unit 101.
As shown in FIG. 4, the road shape calculation unit 310 calculates an approximate straight line Rf of the road shape by using the Hough transform to extract a straight line that passes through at least a preset number Pth of pixels in which the intensity of the white line edge Ed extracted by the white line candidate point detection unit 200 is equal to or greater than a preset threshold Edth, and that connects one point on the upper side and one point on the lower side of the detection region. In the present embodiment, the captured road image data is divided into two regions, a far region and a near region, and the road shape is linearly approximated in each region (see FIG. 5).
The road parameter estimation unit 320 estimates the road parameters (the road shape and the vehicle posture with respect to the road) as a road model equation from the approximate straight line Rf of the road shape detected by the road shape calculation unit 310, using the following equation (1).
The straight road determination unit 410 determines whether the travel path of the host vehicle 1 is a straight road by comparing the degree of coincidence of the slopes and intercepts of the linear equations of the far and near approximate straight lines Rf calculated by the road shape calculation unit 310.
As shown in FIG. 5, the lane shape recognition unit 102 divides the image data captured by the imaging unit 101 into a near region (lower part of the image) and a far region (center part of the image), and in each region the white line candidate point detection unit 200 and the lane recognition processing unit 300 detect the white line edge Ed and the approximate straight line Rf. Based on the degree of coincidence between them, the straight road determination unit 410 determines whether the travel path is straight. For example, when the slopes and intercepts of the approximate straight lines Rf in the near and far regions differ by no more than preset thresholds, the straight road determination unit 410 determines that these approximate straight lines Rf coincide and that the travel path is straight.
Returning to FIG. 2, the vehicle behavior recognition unit 103 recognizes the behavior of the vehicle 1 on which the imaging unit 101 is mounted. Specifically, the vehicle behavior recognition unit 103 determines the behavior of the vehicle 1 from the vehicle speed (travel speed) of the vehicle 1 detected by the vehicle speed detection device 20, the steering angle detected by the steering angle detection device 30, the longitudinal and lateral accelerations of the vehicle detected by an acceleration sensor (not shown), the yaw rate value detected by a yaw rate sensor, and the like.
The information bias determination unit 105 determines whether there is a bias in the output of at least one of the lane shape recognition unit 102 and the vehicle behavior recognition unit 103.
The imaging angle correction unit 106 corrects the imaging angle of the imaging unit 101 using the imaging angle output by the imaging angle derivation unit 104 when the information bias determination unit 105 determines that the bias is equal to or less than a preset threshold.
The processing of the in-vehicle image recognition device according to the present embodiment will now be described with reference to the flowchart shown in FIG. 6. The processing of FIG. 6 is repeatedly executed at preset time intervals (for example, every 50 ms (milliseconds)).
In step S101, the lane shape recognition unit 102 reads the image ahead of the vehicle 1 captured by the imaging unit 101. In step S102, the vehicle behavior recognition unit 103 reads the vehicle speed of the vehicle 1 detected by the vehicle speed detection device 20, the steering angle detected by the steering angle detection device 30, and so on.
In step S104, the lane shape recognition unit 102 uses the lane shape recognized in step S103 to obtain the coordinates of the intersection of the extensions of the left and right pair of lane markers in each of the far and near regions. As described above, the near-region intersection coordinates obtained by the lane shape recognition unit 102 in this step are defined as (Pn_x, Pn_y), and the far-region intersection coordinates are defined as (Pf_x, Pf_y).
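The intersection of the two marker extensions in a region follows directly from the slope/intercept pairs; a sketch with hypothetical example values:

```python
def marker_intersection(left, right):
    """Intersect two lines given as (slope, intercept); None if parallel."""
    (m_l, b_l), (m_r, b_r) = left, right
    if abs(m_l - m_r) < 1e-9:
        return None
    x = (b_r - b_l) / (m_l - m_r)
    y = m_l * x + b_l
    return x, y

# Illustrative per-region fits (not from the patent):
left_near, right_near = (-1.20, 900.0), (1.10, -300.0)
left_far, right_far = (-1.15, 880.0), (1.05, -280.0)
Pn = marker_intersection(left_near, right_near)   # (Pn_x, Pn_y)
Pf = marker_intersection(left_far, right_far)     # (Pf_x, Pf_y)
```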
abs( Pn_x - Pf_x ) ≤ TH_Px (0-1)
G(Z^-1) = ( c - c*Z^-2 ) / ( 1 - a*Z^-1 + b*Z^-2 ) (0-2)
Ydot = G(Z^-1) * Y (0-3)
abs( Ydot ) ≤ TH_Ydot (0-4)
abs( Row ) < TH_ROW (1-1)
abs( SumTotalRow + Row ) < TH_ROW (1-2)
SumTotalRow = SumTotalRow + Row (1-3)
In step S109, the imaging angle correction unit 106 corrects the imaging angle of the imaging unit 101 according to the following expressions, and adopts the result as the corrected imaging angle of the imaging unit 101.
FOE_X_est = 0.9 * FOE_X_est + 0.1 * Pn_x (1-4)
FOE_Y_est = 0.9 * FOE_Y_est + 0.1 * Pn_y (1-5)
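Expressions (1-4) and (1-5) are a first-order low-pass (exponential moving average) of the near-region intersection; transcribed directly, with the 0.9/0.1 weights generalized to a parameter:

```python
def update_foe(foe_x_est, foe_y_est, pn_x, pn_y, alpha=0.1):
    """One update step of expressions (1-4) and (1-5)."""
    foe_x_est = (1.0 - alpha) * foe_x_est + alpha * pn_x
    foe_y_est = (1.0 - alpha) * foe_y_est + alpha * pn_y
    return foe_x_est, foe_y_est
```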
In step S110, the past values used by the filters and the counter values used by the timers and the like are updated, and the processing ends.
Before the processing flow of FIG. 6 is executed, the value of SumTotalRow is initialized to "0".
The in-vehicle image recognition device described above recognizes the lane shape of the travel lane in which the vehicle 1 travels from the image captured by the imaging unit 101, which images the road around the vehicle 1. It also obtains the imaging angle of the imaging unit 101 from the recognized lane shape. It then determines whether there is a bias with respect to the recognized lane shape, and corrects the imaging angle of the imaging unit 101 using the imaging angle obtained when it is determined that there is no bias.
Although the present embodiment does not include a standard deviation calculation unit, the correction of expressions (1-4) and (1-5) may be performed using, as input, results targeting the specific state of traveling straight on a straight road without bias. In this case, the input values tend strongly toward a normal distribution, so the correction accuracy is also high.
The present embodiment provides the following effects.
(1) The in-vehicle image recognition device of the present embodiment is an in-vehicle image recognition device mounted on the vehicle 1. The imaging unit 101 images the surroundings of the vehicle 1. The lane shape recognition unit 102 recognizes the lane shape of the travel lane in which the vehicle 1 travels from the image captured by the imaging unit 101. The imaging angle derivation unit 104 obtains the imaging angle of the imaging unit 101 from the lane shape recognized by the lane shape recognition unit 102. Based on the lane shape of the near region relatively close to the host vehicle and the lane shape of the far region distant from the host vehicle among the lane shapes recognized by the lane shape recognition unit, the straight road determination unit 410 determines the bias between the intersection of the extensions of straight-line approximations of the left and right lane markers located in the near region and the intersection of the extensions of straight-line approximations of the left and right lane markers located in the far region. The imaging angle correction unit 106 corrects the imaging angle of the imaging unit 101 using the imaging angle obtained by the imaging angle derivation unit 104 when the straight road determination unit 410 determines that the bias is within the threshold.
Further, by using the bias between the intersections, a straight road can be determined more accurately.
As a result, even when the road shape is biased, for example when the vehicle travels an expressway in only one direction or travels a road with many right curves, the imaging angle of the imaging unit can be estimated with a reduced computational load.
Next, a second embodiment will be described with reference to the drawings. Components similar to those of the first embodiment are denoted by the same reference numerals in the description.
(Configuration of the in-vehicle image recognition device)
The basic configuration of the present embodiment is the same as that of the first embodiment. The in-vehicle image recognition device of the present embodiment differs, however, in that it further includes a standard deviation calculation unit.
The standard deviation calculation unit 107 calculates the standard deviation of the imaging angles obtained by the imaging angle derivation unit 104 when the information bias determination unit 105 determines that there is no bias.
The imaging angle correction unit 106 corrects the imaging angle of the imaging unit 101 in accordance with the standard deviation calculated by the standard deviation calculation unit 107.
The processing of the in-vehicle image recognition device according to the present embodiment will now be described with reference to the flowchart shown in FIG. 8. The processing of FIG. 8 is repeatedly executed at preset time intervals (for example, every 50 ms (milliseconds)).
In step S201, the lane shape recognition unit 102 reads the image ahead of the vehicle 1 captured by the imaging unit 101. In step S202, the vehicle behavior recognition unit 103 reads the vehicle speed of the vehicle 1 detected by the vehicle speed detection device 20, the steering angle detected by the steering angle detection device 30, the longitudinal acceleration detected by the acceleration sensor, and the yaw rate value from the yaw rate sensor.
abs( Pn_x - Pf_x ) < TH_PX (2-1)
abs( Pn_y - Pf_y ) < TH_PY (2-2)
In step S206, the parallel travel determination unit 420 takes as input the lateral offset position Y of the host vehicle with respect to the host lane (the lateral distance to the lane marker) obtained in step S203, pseudo-differentiates it with respect to time using the transfer function of expressions (0-2) and (0-3), and calculates the lateral velocity Ydot of the host vehicle. If expression (0-4) is satisfied, the processing proceeds to step S207; otherwise, it proceeds to step S213.
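Rearranging the transfer function of expression (0-2) gives the difference equation Ydot[k] = a*Ydot[k-1] - b*Ydot[k-2] + c*(Y[k] - Y[k-2]); a sketch of the pseudo-differentiator, with the coefficients left open since the patent does not give values for a, b, and c:

```python
class PseudoDifferentiator:
    """Discrete pseudo-differentiation per expressions (0-2) and (0-3)."""

    def __init__(self, a, b, c):
        self.a, self.b, self.c = a, b, c
        self.y_prev = [0.0, 0.0]    # Y[k-1], Y[k-2]
        self.yd_prev = [0.0, 0.0]   # Ydot[k-1], Ydot[k-2]

    def step(self, y):
        """Feed one lateral-offset sample Y[k]; return Ydot[k]."""
        yd = (self.a * self.yd_prev[0] - self.b * self.yd_prev[1]
              + self.c * (y - self.y_prev[1]))
        self.y_prev = [y, self.y_prev[0]]
        self.yd_prev = [yd, self.yd_prev[0]]
        return yd
```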
abs( SumTotalPx + Pn_x - Pf_x ) < TH_PX (2-3)
abs( SumTotalPy + Pn_y - Pf_y ) < TH_PY (2-4)
abs( YawRate ) < TH_YR (2-5)
abs( SumTotalYR + YawRate ) < TH_YR (2-6)
abs( VspDot ) < TH_VD (2-7)
abs( SumTotalVD + VspDot ) < TH_VD (2-8)
VspDot is the longitudinal acceleration of the vehicle 1. TH_VD is the threshold below which the vehicle 1 is regarded as traveling at a constant speed; when the absolute value of VspDot is less than TH_VD, the vehicle 1 is regarded as traveling at a constant speed. SumTotalVD is the cumulative sum of VspDot.
SumTotalPx = SumTotalPx + Pn_x - Pf_x (2-9)
SumTotalPy = SumTotalPy + Pn_y - Pf_y (2-10)
SumTotalYR = SumTotalYR + YawRate (2-11)
SumTotalVD = SumTotalVD + VspDot (2-12)
FOE_X_DataRcd[ SumCount ] = Pn_x (2-13)
FOE_Y_DataRcd[ SumCount ] = Pn_y (2-14)
SumCount = SumCount + 1 (2-15)
SumCount is a counter that counts the number of collected near-region intersection coordinate data items; its initial value is "0". SumCount is initialized before the processing flow of FIG. 8 is executed.
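Expressions (2-5) through (2-15) together form one gate: each instantaneous value and its running sum must stay inside the threshold, and only then are the sums updated and the intersection recorded. A sketch with the variable names of the patent and assumed threshold values:

```python
TH_YR, TH_VD = 0.01, 0.1   # assumed values; the patent leaves them abstract

def gate_and_record(yaw_rate, vsp_dot, pn_x, pn_y, s):
    """Apply (2-5)-(2-8); on success apply (2-11)-(2-15). Returns True if passed."""
    ok = (abs(yaw_rate) < TH_YR and abs(s["yr"] + yaw_rate) < TH_YR
          and abs(vsp_dot) < TH_VD and abs(s["vd"] + vsp_dot) < TH_VD)
    if ok:
        s["yr"] += yaw_rate        # (2-11)
        s["vd"] += vsp_dot         # (2-12)
        s["foe_x"].append(pn_x)    # (2-13); SumCount == len(s["foe_x"])
        s["foe_y"].append(pn_y)    # (2-14), (2-15)
    return ok

state = {"yr": 0.0, "vd": 0.0, "foe_x": [], "foe_y": []}
```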
SumCount >= 50 (2-16)
FOE_X_e_tmp = Σ FOE_X_DataRcd / SumCount (2-17)
FOE_Y_e_tmp = Σ FOE_Y_DataRcd / SumCount (2-18)
FOE_X_stdev = √( Σ ( FOE_X_e_tmp - FOE_X_DataRcd )^2 / SumCount ) (2-19)
FOE_Y_stdev = √( Σ ( FOE_Y_e_tmp - FOE_Y_DataRcd )^2 / SumCount ) (2-20)
Here, Σ in the above expressions is an operator that takes the sum over the number of near-region intersection coordinate data items indicated by SumCount.
FOE_X_stdev < TH_STDEV (2-21)
FOE_Y_stdev < TH_STDEV (2-22)
FOE_X_est = FOE_X_e_tmp (2-23)
FOE_Y_est = FOE_Y_e_tmp (2-24)
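Expressions (2-16) through (2-24) transcribe directly: once 50 samples are collected, take the mean and population standard deviation of the stored intersections, and adopt the mean only if both deviations pass the check. A sketch assuming NumPy and an illustrative TH_STDEV:

```python
import numpy as np

TH_STDEV = 3.0   # assumed value; the patent leaves the threshold abstract

def try_adopt_foe(foe_x_data, foe_y_data):
    """Return the new (FOE_X_est, FOE_Y_est), or None if not yet adoptable."""
    if len(foe_x_data) < 50:                    # (2-16)
        return None
    x, y = np.asarray(foe_x_data), np.asarray(foe_y_data)
    x_mean, y_mean = x.mean(), y.mean()         # (2-17), (2-18)
    x_std, y_std = x.std(), y.std()             # (2-19), (2-20): /SumCount form
    if x_std < TH_STDEV and y_std < TH_STDEV:   # (2-21), (2-22)
        return x_mean, y_mean                   # (2-23), (2-24)
    return None
```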
In step S213, the past values used by the filters and the counter values used by the timers and the like are updated, and the processing ends.
Before the processing flow of FIG. 8 is executed, SumCount and the cumulative sum values are each initialized to "0".
The in-vehicle image recognition device of the present embodiment has the same configuration as the first embodiment except for the standard deviation calculation unit 107.
In the in-vehicle image recognition device of the present embodiment, the standard deviation calculation unit 107 calculates the standard deviation of the imaging angles of the imaging unit 101 obtained when it is determined that there is no bias. The imaging angle of the imaging unit 101 is then corrected in accordance with the calculated standard deviation.
This improves the accuracy of the estimation of the imaging angle of the imaging unit 101.
In addition to the effects of the first embodiment, the present embodiment provides the following effect.
(1) The standard deviation calculation unit 107 calculates the standard deviation of the imaging angles obtained by the imaging angle derivation unit 104 when the information bias determination unit 105 determines that the bias is within the threshold. The imaging angle correction unit 106 corrects the imaging angle of the imaging unit 101 in accordance with the standard deviation calculated by the standard deviation calculation unit 107.
This improves the accuracy of the estimation of the imaging angle of the imaging unit 101.
Next, a third embodiment will be described with reference to the drawings. Components similar to those of the first and second embodiments are denoted by the same reference numerals in the description.
(Configuration of the in-vehicle image recognition device)
The basic configuration of the present embodiment is the same as that of the second embodiment. The in-vehicle image recognition device of the present embodiment differs, however, in that it further includes a termination unit.
The termination unit 108 terminates the correction of the imaging angle when the standard deviation calculated by the standard deviation calculation unit 107 is less than a predetermined value.
The processing of the in-vehicle image recognition device according to the present embodiment will now be described with reference to the flowchart shown in FIG. 10. The processing of FIG. 10 is repeatedly executed at preset time intervals (for example, every 50 ms (milliseconds)).
In step S301, the lane shape recognition unit 102 reads the image ahead of the vehicle 1 captured by the imaging unit 101. In step S302, the vehicle behavior recognition unit 103 reads the speed of the vehicle 1 in the vehicle width direction detected via the vehicle speed detection device 20, the steering angle detected by the steering angle detection device 30, the longitudinal acceleration detected by the acceleration sensor, and the yaw rate value from the yaw rate sensor.
In step S304, the termination unit 108 determines whether the correction processing of the imaging angle of the imaging unit 101 has been completed. If it has not yet been completed, the processing proceeds to step S305; if it has been completed, the processing proceeds to step S314. Specifically, if the condition of the following expression is satisfied, the processing proceeds to step S305; otherwise, it proceeds to step S314.
FlgAimComplt < 1 (3-1)
abs( Row ) < TH_ROW (3-2)
abs( SumTotalRow + Row ) < TH_ROW (3-3)
abs( ysoku ) < TH_YS (3-4)
abs( SumTotalYsoku + ysoku ) < TH_YS (3-5)
abs( YawRate ) < TH_YR (3-6)
abs( SumTotalYR + YawRate ) < TH_YR (3-7)
abs( VspDot ) < TH_VD (3-8)
abs( SumTotalVD + VspDot ) < TH_VD (3-9)
The meanings of the expressions other than (3-4) and (3-5) are the same as in the first and second embodiments. Expressions (3-4) and (3-5) apply the corresponding instantaneous-value and cumulative-sum conditions to the lateral velocity ysoku of the host vehicle, with threshold TH_YS and cumulative sum SumTotalYsoku.
SumTotalRow = SumTotalRow + Row (3-10)
SumTotalYsoku = SumTotalYsoku + ysoku (3-11)
SumTotalYR = SumTotalYR + YawRate (3-12)
SumTotalVD = SumTotalVD + VspDot (3-13)
FOE_X_DataRcd[ SumCount ] = Pn_x (3-14)
FOE_Y_DataRcd[ SumCount ] = Pn_y (3-15)
SumCount = SumCount + 1 (3-16)
Then, in step S313, the imaging angle correction unit 106 sets the completion flag FlgAimComplt of the imaging angle estimation processing of the imaging unit 101 and determines the imaging angle of the imaging unit 101 according to the following expressions. These coordinates are also used as the origin coordinates in the lane recognition processing of step S303 from then on.
FlgAimComplt = 1 (3-17)
FOE_X_est = FOE_X_e_tmp (3-18)
FOE_Y_est = FOE_Y_e_tmp (3-19)
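A sketch of how the completion flag gates the aiming work across cycles, reusing try_adopt_foe from the sketch above; the state container is an assumption:

```python
state = {"FlgAimComplt": 0, "FOE_X_est": 0.0, "FOE_Y_est": 0.0}

def aiming_cycle(state, foe_x_data, foe_y_data):
    """One 50 ms cycle of the third embodiment's termination logic."""
    if state["FlgAimComplt"] >= 1:     # (3-1) fails: correction already done
        return
    result = try_adopt_foe(foe_x_data, foe_y_data)
    if result is not None:
        state["FOE_X_est"], state["FOE_Y_est"] = result   # (3-18), (3-19)
        state["FlgAimComplt"] = 1                         # (3-17)
```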
In step S314, the past values used by the filters and the counter values used by the timers and the like are updated, and the processing ends.
Before the processing flow of FIG. 10 is executed, the values of FlgAimComplt and SumTotalRow are each initialized to "0".
The in-vehicle image recognition device of the present embodiment has the same configuration as the second embodiment except for the termination unit 108.
In the in-vehicle image recognition device of the present embodiment, the termination unit 108 terminates the correction of the imaging angle when the calculated standard deviation is less than the predetermined value.
As a result, once the variation of the imaging angle candidates of the imaging unit 101 is determined to be small, the processing for correcting the imaging angle can be completed, so the processing load of the in-vehicle image recognition device can be reduced.
In FIG. 11, the data in the circled range 70 are the Pn_x results calculated in step S305, and the data in range 71 are the Pn_x results collected in step S307.
In addition to the effects of the first and second embodiments, the present embodiment provides the following effect. (1) The termination unit 108 terminates the correction of the imaging angle when the standard deviation calculated by the standard deviation calculation unit 107 is less than a predetermined value.
As a result, once the variation of the imaging angle candidates of the imaging unit 101 is determined to be small, the processing for correcting the imaging angle can be completed, so the processing load of the in-vehicle image recognition device can be reduced.
This also improves the accuracy of the estimation of the imaging angle of the imaging unit 101.
Although the description above has referred to a limited number of embodiments, the scope of rights is not limited to them, and modifications of each embodiment based on the above disclosure will be obvious to those skilled in the art.
10 camera
10a image processing device
20 vehicle speed detection device
30 steering angle detection device
40 steering angle control device
50 steering angle actuator
101 imaging unit
102 lane shape recognition unit
103 vehicle behavior recognition unit
104 imaging angle derivation unit
105 information bias determination unit
106 imaging angle correction unit
107 standard deviation calculation unit
108 termination unit
200 white line candidate point detection unit
300 lane recognition processing unit
310 road shape calculation unit
320 road parameter estimation unit
400 optical axis correction unit
410 straight road determination unit
420 parallel travel determination unit
430 virtual vanishing point calculation unit
Claims (13)
- an imaging unit that images the surroundings of the vehicle;
a lane shape recognition unit that recognizes, based on the image captured by the imaging unit, the lane shape of the travel lane in which the vehicle travels;
a parallel travel determination unit that determines that the host vehicle is traveling parallel to the travel road;
an intersection bias determination unit that determines, based on the lane shape of a near region relatively close to the host vehicle and the lane shape of a far region distant from the host vehicle among the lane shapes recognized by the lane shape recognition unit, the bias between the intersection of the extensions of straight-line approximations of the left and right lane markers located in the near region and the intersection of the extensions of straight-line approximations of the left and right lane markers located in the far region; and
a straight road determination unit that determines that the road is a straight road when the intersection bias determination unit determines that the bias of the intersections is equal to or less than a preset threshold,
a road shape determination device comprising the above. - the road shape determination device according to claim 1; and
an imaging angle correction unit that corrects the imaging angle of the imaging unit when the road shape determination device determines that the road is a straight road,
an in-vehicle image recognition device comprising the above. - further comprising an imaging angle derivation unit that obtains the imaging angle of the imaging unit from the lane shape recognized by the lane shape recognition unit,
wherein the imaging angle correction unit corrects the imaging angle of the imaging unit using the imaging angle obtained by the imaging angle derivation unit when the road shape determination device determines that the road is a straight road, in the in-vehicle image recognition device according to claim 2. - further comprising a standard deviation calculation unit that calculates the standard deviation of the imaging angles obtained by the imaging angle derivation unit when the intersection bias determination unit determines that the bias of the intersections is equal to or less than the preset threshold,
wherein the imaging angle correction unit corrects the imaging angle of the imaging unit in accordance with the standard deviation calculated by the standard deviation calculation unit, in the in-vehicle image recognition device according to claim 3. - The in-vehicle image recognition device according to claim 4, wherein the correction of the imaging angle is terminated when the standard deviation calculated by the standard deviation calculation unit is less than a predetermined value.
- further comprising a recognition bias determination unit that determines the shape of the travel road from the bias with respect to the recognized lane shape, using the road curvature recognized by the lane shape recognition unit,
wherein the imaging angle correction unit corrects the imaging angle of the imaging unit when the intersection bias determination unit determines that the bias of the intersections is equal to or less than the preset threshold and the recognition bias determination unit determines that the road is a straight road, in the in-vehicle image recognition device according to any one of claims 2 to 5. - The in-vehicle image recognition device according to claim 6, wherein the recognition bias determination unit determines that the road is a straight road when the absolute value of the bias with respect to the lane shape recognized by the lane shape recognition unit is smaller than a predetermined threshold and the cumulative sum of the bias is smaller than the threshold.
- a vehicle speed detection unit that detects the vehicle speed of the vehicle;
a steering angle detection unit that detects the steering angle of the vehicle; and
a vehicle behavior recognition unit that recognizes the behavior of the vehicle from the vehicle speed detected by the vehicle speed detection unit and the steering angle detected by the steering angle detection unit,
wherein the imaging angle correction unit further corrects the imaging angle of the imaging unit when it is determined that the bias with respect to the vehicle behavior recognized by the vehicle behavior recognition unit is equal to or less than a preset threshold, in the in-vehicle image recognition device according to any one of claims 2 to 7. - The in-vehicle image recognition device according to claim 8, wherein the behavior of the vehicle recognized by the vehicle behavior recognition unit is information regarding the translational behavior of the vehicle in the vehicle width direction.
- The in-vehicle image recognition device according to claim 8, wherein the behavior of the vehicle recognized by the vehicle behavior recognition unit is information regarding the rotational behavior of the vehicle in the vehicle width direction.
- The in-vehicle image recognition device according to claim 8, wherein the vehicle behavior recognition unit recognizes the behavior of the vehicle from the temporal change of the position of the vehicle in the vehicle width direction with respect to the travel lane recognized by the lane shape recognition unit, or of the yaw angle.
- An imaging axis adjustment device that automatically adjusts the imaging axis of an imaging unit mounted on a vehicle, comprising:
a parameter detection unit that detects a parameter relating to road curvature;
a convergence determination unit that determines that the value of the parameter relating to road curvature has converged within a predetermined range;
an integration unit that integrates the parameter values within a predetermined time from the point in time of the determination by the convergence determination unit;
a straight traveling state determination unit that determines the straight traveling state of the vehicle by determining that the integrated value obtained by the integration unit is within a predetermined value; and
an aiming execution unit that performs aiming processing when the straight traveling state determination unit determines that the vehicle is in a straight traveling state,
the imaging axis adjustment device comprising the above. - A lane recognition method comprising: imaging the surroundings of a vehicle with an imaging unit mounted on the vehicle; recognizing, from the captured image, the lane shape of the travel lane in which the vehicle travels; obtaining the imaging angle of the imaging unit from the recognized lane shape; and
correcting the imaging angle of the imaging unit using the obtained imaging angle when the bias between the intersection of the extensions of straight-line approximations of the left and right lane markers located in a near region relatively close to the host vehicle and the intersection of the extensions of straight-line approximations of the left and right lane markers located in a far region distant from the host vehicle, among the recognized lane shapes, is determined to be equal to or less than a preset threshold.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/125,832 US20140118552A1 (en) | 2011-06-13 | 2012-03-07 | Road shape determining device, in-vehicle image recognizing device, imaging axis adjusting device, and lane recognizing method |
EP12801148.3A EP2720213A4 (en) | 2011-06-13 | 2012-03-07 | DEVICE FOR DETERMINING A ROAD PROFILE, VEHICLE INTERNAL IMAGE RECOGNITION DEVICE, APPARATUS FOR ADJUSTING A PICTURE BASE AXIS AND TRACE RECOGNITION METHOD |
CN201280026555.2A CN103582907B (zh) | 2011-06-13 | 2012-03-07 | 车载用图像识别装置及车道识别方法 |
JP2013520407A JP5733395B2 (ja) | 2011-06-13 | 2012-03-07 | 車載用画像認識装置、撮像軸調整装置およびレーン認識方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011-131222 | 2011-06-13 | ||
JP2011131222 | 2011-06-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012172713A1 true WO2012172713A1 (ja) | 2012-12-20 |
Family
ID=47356734
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/001576 WO2012172713A1 (ja) | 2011-06-13 | 2012-03-07 | 道路形状判定装置、車載用画像認識装置、撮像軸調整装置およびレーン認識方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20140118552A1 (ja) |
EP (1) | EP2720213A4 (ja) |
JP (1) | JP5733395B2 (ja) |
CN (1) | CN103582907B (ja) |
WO (1) | WO2012172713A1 (ja) |
Families Citing this family (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5747482B2 (ja) * | 2010-03-26 | 2015-07-15 | 日産自動車株式会社 | 車両用環境認識装置 |
WO2014170989A1 (ja) * | 2013-04-18 | 2014-10-23 | 西日本高速道路エンジニアリング四国株式会社 | 道路走行面の形状を調査する装置 |
JP6093314B2 (ja) * | 2014-02-14 | 2017-03-08 | 株式会社デンソー | 境界線認識装置 |
JP6299373B2 (ja) * | 2014-04-18 | 2018-03-28 | 富士通株式会社 | 撮像方向の正常性の判定方法、撮像方向の正常性の判定プログラムおよび撮像方向の正常性の判定装置 |
JP6189816B2 (ja) * | 2014-11-19 | 2017-08-30 | 株式会社Soken | 走行区画線認識装置 |
JP6389119B2 (ja) * | 2014-12-25 | 2018-09-12 | 株式会社デンソー | 車線境界線認識装置 |
JP6363518B2 (ja) * | 2015-01-21 | 2018-07-25 | 株式会社デンソー | 区画線認識装置 |
KR101673776B1 (ko) * | 2015-06-05 | 2016-11-07 | 현대자동차주식회사 | 자동차용 헤드유닛 및 카메라 유닛의 고장 진단 방법 |
KR101748269B1 (ko) * | 2015-11-11 | 2017-06-27 | 현대자동차주식회사 | 자율 주행 차량의 조향 제어 방법 및 장치 |
KR102433791B1 (ko) | 2015-11-20 | 2022-08-19 | 주식회사 에이치엘클레무브 | 차선 이탈 경고 장치 및 방법 |
KR102503253B1 (ko) * | 2015-12-14 | 2023-02-22 | 현대모비스 주식회사 | 주변 차량 인지 시스템 및 방법 |
US20170307743A1 (en) * | 2016-04-22 | 2017-10-26 | Delphi Technologies, Inc. | Prioritized Sensor Data Processing Using Map Information For Automated Vehicles |
DE102016207436A1 (de) | 2016-04-29 | 2017-11-02 | Ford Global Technologies, Llc | System und Verfahren zum Steuern- und/oder Regeln eines Lenksystems eines Fahrzeugs sowie Fahrzeug |
JP6637399B2 (ja) * | 2016-09-30 | 2020-01-29 | 株式会社デンソー | 領域認識装置及び領域認識方法 |
CN110140158A (zh) * | 2017-01-10 | 2019-08-16 | 三菱电机株式会社 | 行驶路径识别装置及行驶路径识别方法 |
JP2018173834A (ja) * | 2017-03-31 | 2018-11-08 | 本田技研工業株式会社 | 車両制御装置 |
CN106910358B (zh) * | 2017-04-21 | 2019-09-06 | 百度在线网络技术(北京)有限公司 | 用于无人车的姿态确定方法和装置 |
JP6627822B2 (ja) * | 2017-06-06 | 2020-01-08 | トヨタ自動車株式会社 | 車線変更支援装置 |
CN108099905B (zh) * | 2017-12-18 | 2020-08-18 | 深圳大学 | 车辆偏航检测方法、系统及机器视觉系统 |
CN108229386B (zh) * | 2017-12-29 | 2021-12-14 | 百度在线网络技术(北京)有限公司 | 用于检测车道线的方法、装置和介质 |
CN108427417B (zh) * | 2018-03-30 | 2020-11-24 | 北京图森智途科技有限公司 | 自动驾驶控制系统及方法、计算机服务器和自动驾驶车辆 |
CN108921079B (zh) * | 2018-06-27 | 2022-06-10 | 盯盯拍(深圳)技术股份有限公司 | 拍摄角度调整方法、拍摄角度调整设备以及车载摄像装置 |
CN112352260B (zh) * | 2018-06-27 | 2024-09-17 | 日本电信电话株式会社 | 车道估计装置、方法以及计算机可读存储介质 |
KR102132899B1 (ko) * | 2018-10-08 | 2020-07-21 | 주식회사 만도 | 교차로에서의 경로 생성 장치 및 교차로에서의 차량 제어 장치 및 방법 |
CN113016179A (zh) * | 2018-11-15 | 2021-06-22 | 松下知识产权经营株式会社 | 摄像机系统和车辆 |
US10728461B1 (en) * | 2019-01-31 | 2020-07-28 | StradVision, Inc. | Method for correcting misalignment of camera by selectively using information generated by itself and information generated by other entities and device using the same |
CN112686904A (zh) * | 2020-12-14 | 2021-04-20 | 深兰人工智能(深圳)有限公司 | 车道划分方法、装置、电子设备和存储介质 |
CN112862899B (zh) * | 2021-02-07 | 2023-02-28 | 黑芝麻智能科技(重庆)有限公司 | 用于图像获取设备的外参标定方法、装置和系统 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0624035B2 (ja) * | 1988-09-28 | 1994-03-30 | 本田技研工業株式会社 | 走行路判別装置 |
US5359666A (en) * | 1988-09-28 | 1994-10-25 | Honda Giken Kogyo Kabushiki Kaisha | Driving way judging device and method |
JP3357749B2 (ja) * | 1994-07-12 | 2002-12-16 | 本田技研工業株式会社 | 車両の走行路画像処理装置 |
JP3521860B2 (ja) * | 2000-10-02 | 2004-04-26 | 日産自動車株式会社 | 車両の走行路認識装置 |
JP3645196B2 (ja) * | 2001-02-09 | 2005-05-11 | 松下電器産業株式会社 | 画像合成装置 |
DE602006020231D1 (de) * | 2005-12-06 | 2011-04-07 | Nissan Motor | Detektionsvorrichtung und -verfahren |
JP4820221B2 (ja) * | 2006-06-29 | 2011-11-24 | 日立オートモティブシステムズ株式会社 | 車載カメラのキャリブレーション装置およびプログラム |
JP2008241446A (ja) * | 2007-03-27 | 2008-10-09 | Clarion Co Ltd | ナビゲーション装置及びその制御方法 |
JP4801821B2 (ja) * | 2007-09-21 | 2011-10-26 | 本田技研工業株式会社 | 道路形状推定装置 |
JP5375958B2 (ja) * | 2009-06-18 | 2013-12-25 | 富士通株式会社 | 画像処理装置および画像処理方法 |
-
2012
- 2012-03-07 JP JP2013520407A patent/JP5733395B2/ja active Active
- 2012-03-07 CN CN201280026555.2A patent/CN103582907B/zh not_active Expired - Fee Related
- 2012-03-07 US US14/125,832 patent/US20140118552A1/en not_active Abandoned
- 2012-03-07 EP EP12801148.3A patent/EP2720213A4/en not_active Withdrawn
- 2012-03-07 WO PCT/JP2012/001576 patent/WO2012172713A1/ja active Application Filing
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000242899A (ja) | 1999-02-24 | 2000-09-08 | Mitsubishi Electric Corp | 白線認識装置 |
JP2002259995A (ja) * | 2001-03-06 | 2002-09-13 | Nissan Motor Co Ltd | 位置検出装置 |
JP2004252827A (ja) | 2003-02-21 | 2004-09-09 | Nissan Motor Co Ltd | 車線認識装置 |
JP2004318618A (ja) | 2003-04-17 | 2004-11-11 | Nissan Motor Co Ltd | 車線認識装置 |
JP2008003959A (ja) * | 2006-06-23 | 2008-01-10 | Omron Corp | 車両用通信システム |
JP2009234543A (ja) * | 2008-03-28 | 2009-10-15 | Mazda Motor Corp | 車両の車線逸脱警報装置 |
WO2010140578A1 (ja) * | 2009-06-02 | 2010-12-09 | 日本電気株式会社 | 画像処理装置、画像処理方法、及び画像処理用プログラム |
JP2011221983A (ja) * | 2010-03-26 | 2011-11-04 | Nissan Motor Co Ltd | 車両用環境認識装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2720213A4 |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103991474A (zh) * | 2013-02-14 | 2014-08-20 | 本田技研工业株式会社 | 车辆的转向控制装置 |
JP2018005617A (ja) * | 2016-07-04 | 2018-01-11 | 株式会社Soken | 走路形状認識装置、走路形状認識方法 |
US20200408586A1 (en) * | 2018-03-22 | 2020-12-31 | Panasonic Intellectual Property Management Co., Ltd. | Axle load measuring apparatus and axle load measuring method |
US11976960B2 (en) * | 2018-03-22 | 2024-05-07 | Panasonic Intellectual Property Management Co., Ltd. | Axle load measuring apparatus and axle load measuring method |
US10800412B2 (en) * | 2018-10-12 | 2020-10-13 | GM Global Technology Operations LLC | System and method for autonomous control of a path of a vehicle |
WO2022145054A1 (ja) * | 2021-01-04 | 2022-07-07 | 日本電気株式会社 | 画像処理装置、画像処理方法、及び記録媒体 |
JP7505596B2 (ja) | 2021-01-04 | 2024-06-25 | 日本電気株式会社 | 画像処理装置、画像処理方法、及び画像処理プログラム |
JP7359922B1 (ja) | 2022-09-26 | 2023-10-11 | 株式会社デンソーテン | 情報処理装置、情報処理方法およびプログラム |
JP2024047398A (ja) * | 2022-09-26 | 2024-04-05 | 株式会社デンソーテン | 情報処理装置、情報処理方法およびプログラム |
Also Published As
Publication number | Publication date |
---|---|
US20140118552A1 (en) | 2014-05-01 |
JP5733395B2 (ja) | 2015-06-10 |
CN103582907B (zh) | 2016-07-20 |
EP2720213A4 (en) | 2015-03-11 |
CN103582907A (zh) | 2014-02-12 |
EP2720213A1 (en) | 2014-04-16 |
JPWO2012172713A1 (ja) | 2015-02-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5733395B2 (ja) | 車載用画像認識装置、撮像軸調整装置およびレーン認識方法 | |
CN106096525B (zh) | 一种复合型车道识别系统及方法 | |
US8670590B2 (en) | Image processing device | |
EP3179445B1 (en) | Outside environment recognition device for vehicles and vehicle behavior control device using same | |
JP3711405B2 (ja) | カメラを利用した車両の道路情報抽出方法及びシステム | |
US7542835B2 (en) | Vehicle image processing device | |
JP3780848B2 (ja) | 車両の走行路認識装置 | |
US11398051B2 (en) | Vehicle camera calibration apparatus and method | |
US20100201814A1 (en) | Camera auto-calibration by horizon estimation | |
EP2933790A1 (en) | Moving object location/attitude angle estimation device and moving object location/attitude angle estimation method | |
EP3282389B1 (en) | Image processing apparatus, image capturing apparatus, moving body apparatus control system, image processing method, and program | |
US8730325B2 (en) | Traveling lane detector | |
US20150363653A1 (en) | Road environment recognition system | |
US10235579B2 (en) | Vanishing point correction apparatus and method | |
CN111164648B (zh) | 移动体的位置推断装置及位置推断方法 | |
JP6035095B2 (ja) | 車両の衝突判定装置 | |
JP3961584B2 (ja) | 区画線検出装置 | |
US20120128211A1 (en) | Distance calculation device for vehicle | |
EP3287948B1 (en) | Image processing apparatus, moving body apparatus control system, image processing method, and program | |
JP6963490B2 (ja) | 車両制御装置 | |
JP3319383B2 (ja) | 走行路認識装置 | |
JP5559650B2 (ja) | 車線推定装置 | |
WO2014167393A1 (en) | Travel path detection apparatus and travel path detection method | |
EP3288260B1 (en) | Image processing device, imaging device, equipment control system, equipment, image processing method, and carrier means | |
EP3825648A1 (en) | Object detection device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12801148 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2013520407 Country of ref document: JP Kind code of ref document: A |
|
REEP | Request for entry into the european phase |
Ref document number: 2012801148 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012801148 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14125832 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |