WO2011158304A1 - Driving support device, driving support system, and driving support camera unit - Google Patents
Driving support device, driving support system, and driving support camera unit
- Publication number
- WO2011158304A1 (PCT/JP2010/004085)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- state
- vehicle
- image
- camera
- information
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/168—Driving aids for parking, e.g. acoustic or visual feedback on parking space
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
Definitions
- the present invention relates to a driving support device that assists driving by allowing a driver to visually recognize the situation around the vehicle when the stopped vehicle moves backward or forward.
- the driving support device captures a situation around the vehicle with a camera attached to the vehicle, and displays the captured camera image according to the state of the vehicle. For example, the situation around the vehicle is imaged with a plurality of cameras, and when the vehicle is stopped, an image with the number of viewpoints corresponding to the number of cameras is displayed so that the driver can easily grasp the surrounding situation.
- Since the driving support device of Patent Document 1 switches from the multi-viewpoint image to a single-viewpoint image as soon as the vehicle starts to move, it becomes difficult to check the surroundings from the moment the movement starts. Therefore, there is a problem that the vehicle cannot be moved slowly while the situation around the vehicle is checked.
- Since the driving support device of Patent Document 2 displays an image with a small angle of view when the vehicle starts to move with a small steering angle of the steering wheel, there is a problem that it is difficult to confirm the surrounding situation when the vehicle starts moving.
- In these devices, the display of the image is not appropriately switched according to the state of the vehicle.
- An object of the present invention is to provide a driving support device capable of displaying an image that allows a wide range of the road surface to be checked for a predetermined period after the vehicle starts moving, and an image that allows a sense of distance to be easily grasped after that period elapses.
- a driving support device is connected to a camera having a wide-angle lens that is attached to a vehicle and images a road surface in a direction in which the vehicle moves, and displays an image based on a camera image that is an image captured by the camera on a display device.
- The driving support device includes: an information storage unit that stores image generation information including lens distortion information indicating distortion of the camera image due to the lens shape of the camera and projection information indicating distortion of the camera image due to the projection method of the wide-angle lens; a vehicle information acquisition unit that acquires vehicle information including a gear state, which is the state of the transmission of the vehicle, and a speed; a vehicle state determination unit that determines a vehicle state, which is the state of the vehicle, based on the vehicle information; and an image generation unit that generates the image to be displayed on the display device by processing the camera image according to the vehicle state using the image generation information.
- The vehicle state determination unit determines, as the vehicle state, a movement preparation state in which the vehicle is movable and stopped, a movement start state in which the vehicle is moving after the movement is started and until a predetermined moving condition is satisfied, and a moving state in which the vehicle is moving after the moving condition is satisfied.
- When the vehicle state is the movement preparation state or the movement start state, the image generation unit generates a wide-angle image, which is an image that is distorted but shows a wide range; when the vehicle state is the moving state, it generates a distortion-free image, which is an image obtained by removing from the camera image the distortion due to the lens shape and the distortion due to the projection method.
- A driving support camera unit according to the present invention is a driving support camera unit that captures an image of a road surface in a direction in which a vehicle moves and displays an image based on the captured camera image on a display device.
- It includes a camera that is attached to the vehicle and has a wide-angle lens for imaging the road surface, and an information storage unit that stores image generation information including lens distortion information indicating distortion of the camera image due to the lens shape of the camera and projection information indicating distortion of the camera image due to the projection method of the wide-angle lens.
- It further includes a vehicle information acquisition unit that acquires vehicle information including a gear state, which is the state of the transmission of the vehicle, and a speed; a vehicle state determination unit that determines a vehicle state, which is the state of the vehicle, based on the vehicle information; and an image generation unit that generates the image to be displayed on the display device by processing the camera image according to the vehicle state using the image generation information.
- The vehicle state determination unit determines, as the vehicle state, a movement preparation state in which the vehicle is movable and stopped, a movement start state in which the vehicle is moving until a predetermined moving condition is satisfied, and a moving state in which the vehicle is moving after the moving condition is satisfied.
- When the vehicle state is the movement preparation state or the movement start state, the image generation unit generates a wide-angle image, which is an image that is distorted but shows a wide range; when the vehicle state is the moving state, it generates a distortion-free image, which is an image obtained by removing from the camera image the distortion due to the lens shape and the distortion due to the projection method.
- According to the present invention, an image for confirming a wide range of the road surface in the direction in which the vehicle moves is displayed for a predetermined period after the vehicle starts moving, and after that period elapses, an image in which a sense of distance can be easily grasped can be displayed.
- FIG. 1 is a block diagram illustrating a configuration of a driving support system according to Embodiment 1.
- FIG. 2 is a block diagram illustrating a configuration of a guide line calculation unit of the driving support system according to Embodiment 1.
- FIG. 3 is an example of guide lines in real space calculated by a guide line generation unit of the driving support system according to Embodiment 1.
- FIG. 4 is a block diagram illustrating a configuration of a camera image correction unit of the driving support system according to Embodiment 1.
- FIG. 5 is an example of a guide line image displayed under a first display condition in the driving support system according to Embodiment 1.
- FIG. 6 is an example of a guide line image displayed under a second display condition in the driving support system according to Embodiment 1.
- FIG. 7 is a photograph of an image displayed on a display device, illustrating an example of the relationship between a wide-angle image displayed under the first display condition and an undistorted image displayed under the second display condition.
- FIG. 8 is a photograph of an image displayed on a display device, illustrating an example of the relationship between a wide-angle image displayed under the first display condition and a different viewpoint undistorted image displayed under the third display condition.
- FIG. 9 is an example of a guide line image displayed under the fourth display condition in the driving support system according to Embodiment 1.
- FIG. 10 is a diagram explaining the changes in the vehicle state recognized by the display condition determination unit of the driving support system according to Embodiment 1.
- FIG. 11 is a flowchart illustrating an operation for determining a vehicle state in a display condition determination unit of the driving support system according to Embodiment 1.
- FIG. 12 is a flowchart illustrating an operation for determining a vehicle state in a display condition determination unit of the driving support system according to Embodiment 1.
- FIG. 13 is a block diagram illustrating a configuration of a driving support system according to Embodiment 2.
- FIG. 14 is a flowchart illustrating an operation for determining a vehicle state in a display condition determination unit of the driving support system according to Embodiment 2.
- FIG. 15 is a flowchart illustrating an operation for determining a vehicle state in a display condition determination unit of the driving support system according to Embodiment 2.
- FIG. 16 is a flowchart illustrating an operation for determining a vehicle state in a display condition determination unit of the driving support system according to Embodiment 2.
- FIG. 17 is a block diagram illustrating a configuration of a driving support system according to Embodiment 3.
- FIG. 18 is a block diagram illustrating a configuration of a driving support system according to Embodiment 4.
- FIG. 1 is a block diagram illustrating a configuration of the driving support system according to the first embodiment.
- the driving support system includes a host unit 1 and a camera unit 2 which are driving support devices.
- The electronic control unit 3 is an ECU (Electronic Control Unit) generally mounted on a vehicle, which controls electronic devices mounted on the vehicle by electronic circuits; it detects vehicle information and outputs it to the host unit 1.
- The electronic control unit 3 outputs to the host unit 1 vehicle information including, in particular: gear state information indicating the position of the select lever operated by the driver to change the state of the transmission of the vehicle (hereinafter referred to as the gear state); speed information indicating the vehicle speed; acceleration information indicating the vehicle acceleration; moving distance information indicating the moving distance of the vehicle in one cycle in which the vehicle information is detected; and side brake information indicating the position of the side brake (parking brake).
- The vehicle is an AT (Automatic Transmission) vehicle that does not require the driver to operate a clutch.
- an automobile (vehicle) is equipped with a navigation device for guiding a route to a destination.
- The navigation device may be pre-installed in the vehicle, or may be sold separately from the vehicle and attached to it.
- the ECU is provided with a terminal for outputting vehicle information so that a commercially available navigation device can be attached. Therefore, in the driving support system according to the present embodiment, vehicle information can be acquired by connecting the host unit 1 to this output terminal.
- the host unit 1 may be integrated with the navigation device or may be a separate device.
- The host unit 1 superimposes a guide line image, which is an image of guide lines set at predetermined positions behind the vehicle, on a camera image, which is an image of the area around the vehicle (particularly behind it) captured by the camera having a wide-angle lens that is the imaging unit included in the camera unit 2, and displays the result on the display unit 18 (display device), for example a monitor in the passenger compartment.
- The host unit 1 determines the vehicle state, which is the state of the vehicle related to movement, from the vehicle speed and the gear state, and changes the displayed image according to the determined vehicle state, so that the driver can easily recognize the surrounding situation.
- The host unit 1 includes: the display unit 18 for displaying an image; a vehicle information acquisition unit 10 that acquires the vehicle information output from the electronic control unit 3; an information storage unit 11 (guide line generation information storage unit) that stores information for calculating guide lines; a display condition determination unit 12 (vehicle state determination unit) that generates, based on the vehicle information acquired by the vehicle information acquisition unit 10, display condition information specifying how the guide line image and the camera image are displayed on the display unit 18; a guide line calculation unit 13 (guide line information generation unit) that calculates guide line information, which is information on the drawing position and shape of the guide lines, based on the information stored in the information storage unit 11 and the display condition information; a line drawing unit 14 (guide line image generation unit) that generates a guide line image in which the guide lines are drawn, based on the guide line information calculated by the guide line calculation unit 13; a camera image reception unit 15 that receives the camera image transmitted from the camera unit 2; a camera image correction unit 16 that corrects the camera image received by the camera image reception unit 15, based on the information stored in the information storage unit 11 and the display condition information; and an image superimposing unit 17 that superimposes the guide line image and the corrected camera image.
- the guide line image and the corrected camera image having different layers output from the image superimposing unit 17 are combined and displayed on the display unit 18 as one image.
- the camera image correction unit 16 and the image superimposing unit 17 constitute an image output unit.
- When the gear state of the vehicle acquired by the vehicle information acquisition unit 10 of the host unit 1 is reverse, the host unit 1 controls the camera unit 2 so that its camera operates and transmits the captured camera image.
- The display unit 18 displays an image in which the guide line image generated by the line drawing unit 14 is superimposed on the camera image transmitted from the camera unit 2. By confirming this image, the driver can park the vehicle using the guide lines as a reference while visually confirming the situation behind and around the vehicle. Note that, when there is an instruction from the driver, an image captured by the camera may be displayed on the display unit 18.
- Next, each component constituting the driving support device will be described.
- the information storage unit 11 stores the following information as guide line calculation information for calculating a guide line to be described later.
- the attachment information is information indicating how the camera is attached to the vehicle, that is, the attachment position and the attachment angle of the camera.
- the angle-of-view information is angle information indicating a range of a subject imaged by the camera of the camera unit 2 and display information indicating a display range when an image is displayed on the display unit 18.
- the angle information includes the maximum horizontal field angle Xa and the maximum vertical field angle Ya or diagonal field angle of the camera.
- the display information includes the maximum horizontal drawing pixel size Xp and the maximum vertical drawing pixel size Yp of the display unit 18.
- the projection information is information indicating the projection method of the lens used for the camera of the camera unit 2.
- The value of the projection information is any one of stereographic projection, equidistant projection, equisolid angle projection, and orthographic projection.
- (D) Lens distortion information.
- the lens distortion information is information on lens characteristics relating to image distortion caused by the lens.
- (E) Viewpoint information.
- the viewpoint information is information related to another position where it is assumed that there is a camera.
- the guide line interval information is parking width information, vehicle width information, and distance information on safety distance, caution distance, and warning distance from the rear end of the vehicle.
- the parking width information is information indicating a parking width obtained by adding a predetermined margin width to the width of the vehicle (for example, the width of the parking section).
- The distance information on the safety distance, the caution distance, and the warning distance indicates distances from the rear end of the vehicle, for example a safety distance of 1 m, a caution distance of 50 cm, and a warning distance of 10 cm, and serves as an indication of distance behind the vehicle. Based on the safety distance, the caution distance, and the warning distance from the rear end of the vehicle, the driver can grasp how far an obstacle appearing behind the vehicle is from the rear end of the vehicle.
- (C) projection information, (D) lens distortion information, and (E) viewpoint information are also image generation information used to convert a camera image captured by a camera.
- FIG. 2 is a block diagram showing the configuration of the guide line calculation unit 13.
- the guide line calculation unit 13 includes a guide line generation unit 131, a lens distortion function calculation unit 132, a projection function calculation unit 133, a projection plane conversion function calculation unit 134, a viewpoint conversion function calculation unit 135, and a video output conversion function calculation unit 136. It is configured to include.
- the lens distortion function calculation unit 132, the projection function calculation unit 133, and the viewpoint conversion function calculation unit 135 may not be operated depending on display condition information. Therefore, for the sake of simplicity, the case where all the above-described components operate will be described first.
- When gear state information indicating that the gear state of the vehicle is reverse is input from the vehicle information acquisition unit 10, the guide line generation unit 131 virtually sets guide lines on the road surface behind the vehicle based on the guide line interval information acquired from the information storage unit 11.
- FIG. 3 shows an example of guide lines in real space calculated by the guide line generation unit 131.
- a straight line L1 is a guide line indicating the width of the parking section
- a straight line L2 is a guide line indicating the width of the vehicle
- straight lines L3 to L5 are guide lines indicating a distance from the rear end of the vehicle.
- L3 indicates a warning distance
- L4 indicates a caution distance
- L5 indicates a safety distance.
- the straight lines L1 and L2 start from a straight line L3 closest to the vehicle, and have a length equal to or greater than the length of the parking section on the side far from the vehicle.
- the straight lines L3 to L5 are drawn so as to connect the straight lines L2 on both sides.
- a direction D1 indicates a direction in which the vehicle enters the parking section.
- Although guide lines for both the vehicle width and the parking width are shown here, only one of them may be displayed. Further, the number of guide lines indicating the distance from the rear end of the vehicle may be two or less or four or more. For example, a guide line may be displayed at a position at the same distance as the vehicle length from any of the straight lines L3 to L5. Only the guide lines parallel to the traveling direction of the vehicle (L1 and L2 in FIG. 3) may be displayed.
- The display form (color, thickness, line type, etc.) of the guide lines parallel to the traveling direction of the vehicle may be changed depending on the distance from the rear end of the vehicle, and their interval may be either the parking width or the vehicle width.
- The guide line generation unit 131 obtains and outputs the coordinates of the start point and the end point of each guide line shown in FIG. 3.
- Each function calculation unit in the subsequent stage calculates, for the necessary points on each guide line, coordinate values to which the same influences are applied as those the image receives when it is captured by the camera.
- Based on the calculated guide line information, the line drawing unit 14 generates a guide line image.
- the display unit 18 displays an image in which the guide line image is superimposed with no deviation from the camera image.
- the coordinate P can be defined as a position on orthogonal coordinates with a point on the road surface behind the vehicle at a predetermined distance from the vehicle as an origin.
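- To make the guide line layout concrete, the following is a minimal sketch (not taken from the patent; the coordinate convention, variable names, and numeric values are illustrative assumptions) of how the endpoints of the guide lines L1 to L5 in FIG. 3 could be generated in such a road-surface coordinate system.

```python
# Sketch: generating guide line endpoints on the road surface (assumed frame:
# origin at the center of the vehicle's rear end, x to the right, y rearward).
def guide_line_endpoints(vehicle_width=1.7, parking_width=2.3, section_length=5.0,
                         warning=0.1, caution=0.5, safety=1.0):
    lines = {}
    # L1 (parking width) and L2 (vehicle width): start at the warning line
    # closest to the vehicle and extend beyond the parking section.
    for name, w in (("L1", parking_width), ("L2", vehicle_width)):
        half = w / 2.0
        lines[name + "_left"] = ((-half, warning), (-half, warning + section_length))
        lines[name + "_right"] = ((half, warning), (half, warning + section_length))
    # L3 to L5: distance lines (warning, caution, safety) connecting the L2 lines.
    half = vehicle_width / 2.0
    for name, d in (("L3", warning), ("L4", caution), ("L5", safety)):
        lines[name] = ((-half, d), (half, d))
    return lines

if __name__ == "__main__":
    for name, (start, end) in guide_line_endpoints().items():
        print(name, start, end)
```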
- The lens distortion function calculation unit 132 applies a lens distortion function i() determined based on the lens distortion information acquired from the information storage unit 11 to the coordinates P indicating the guide lines calculated by the guide line generation unit 131, thereby converting them into coordinates i(P) subjected to lens distortion.
- the lens distortion function i () is a function expressing the distortion that a camera image receives due to the lens shape when a subject is imaged by the camera of the camera unit 2.
- the lens distortion function i () can be obtained by, for example, a Zhang model relating to lens distortion. In the Zhang model, lens distortion is modeled by radial distortion, and the following calculation is performed.
- (x0, y0) is obtained from the mounting information of the camera unit 2.
- It is assumed that the optical axis of the lens is perpendicular to the imaging surface and passes through the above (x0, y0).
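- For reference, the radial distortion calculation of the Zhang model mentioned above can be sketched as follows; the coefficients k1 and k2 and their default values are placeholders, not values from the patent.

```python
# Sketch: radial lens distortion in the style of the Zhang model.
# A point (x, y) is displaced radially from the distortion center (x0, y0)
# by a factor that depends on its squared distance r2 from that center.
def lens_distort(x, y, x0=0.0, y0=0.0, k1=-0.25, k2=0.05):
    dx, dy = x - x0, y - y0
    r2 = dx * dx + dy * dy
    factor = k1 * r2 + k2 * r2 * r2
    return x + dx * factor, y + dy * factor   # i(P): distorted coordinates

print(lens_distort(0.4, 0.2))
```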
- The projection function calculation unit 133 further applies a function h() based on the projection method, determined from the projection information acquired from the information storage unit 11, to the coordinates i(P) subjected to the lens distortion output from the lens distortion function calculation unit 132, thereby converting them into coordinates h(i(P)) affected by the projection method (hereinafter referred to as projection distortion).
- The function h() based on the projection method is a function indicating how far from the lens center the light incident on the lens at an angle θ is collected.
- Where f is the focal length of the lens, θ is the incident angle of the incident light (that is, the half angle of view), and Y is the image height (the distance between the lens center and the light-condensing position) on the imaging surface of the camera, h() is expressed for each projection method as follows: stereographic projection Y = 2f·tan(θ/2); equidistant projection Y = f·θ; equisolid angle projection Y = 2f·sin(θ/2); orthographic projection Y = f·sin(θ).
- The projection function calculation unit 133 converts the coordinates i(P) subjected to the lens distortion output from the lens distortion function calculation unit 132 into an incident angle θ with respect to the lens, substitutes the angle into one of the above projection expressions to calculate the image height Y, and converts the image height Y back into coordinates, thereby calculating the coordinates h(i(P)) subjected to the projection distortion.
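- The substitution step described above can be summarized in code. The following sketch (an assumed implementation with a placeholder focal length) computes the image height Y from the incident angle θ for each of the four projection methods named in the projection information.

```python
import math

# Sketch: image height Y as a function of the incident angle theta (radians)
# for the four projection methods that the projection information can name.
def image_height(theta, f, method):
    if method == "stereographic":      # Y = 2 f tan(theta / 2)
        return 2.0 * f * math.tan(theta / 2.0)
    if method == "equidistant":        # Y = f theta
        return f * theta
    if method == "equisolid":          # Y = 2 f sin(theta / 2)
        return 2.0 * f * math.sin(theta / 2.0)
    if method == "orthographic":       # Y = f sin(theta)
        return f * math.sin(theta)
    raise ValueError("unknown projection method: " + method)

for m in ("stereographic", "equidistant", "equisolid", "orthographic"):
    print(m, round(image_height(math.radians(60), f=1.8, method=m), 3))
```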
- the projection plane conversion function calculation unit 134 is determined based on the attachment information acquired from the information storage unit 11 with respect to the coordinate h (i (P)) subjected to the projection distortion output from the projection function calculation unit 133. By calculating the projection plane conversion function f (), it is converted into coordinates f (h (i (P))) subjected to the projection plane conversion.
- Projection plane conversion refers to conversion that adds the influence of the mounting state because the image captured by the camera depends on the mounting state such as the mounting position and angle of the camera. By this conversion, each coordinate indicating the guide line is converted into a coordinate imaged by a camera attached to the vehicle at a position defined by the attachment information.
- The mounting information used in the projection plane conversion function f() includes the height L of the camera mounting position with respect to the road surface, the mounting vertical angle θv, which is the tilt angle of the optical axis of the camera with respect to the vertical line, the mounting horizontal angle θh, which is the inclination angle of the optical axis with respect to the center line that longitudinally crosses the vehicle, and the distance H of the mounting position from the center of the vehicle width.
- The projection plane conversion function f() is expressed by a geometric function using these values. It is assumed that the camera is correctly attached, without rotational displacement about the optical axis.
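- The patent describes f() only as a geometric function of the mounting information. The sketch below shows one possible form of such a conversion, assuming a simple pinhole-style projection of a road-surface point into the camera frame; the rotation order, sign conventions, and default values are illustrative assumptions, not the actual function used.

```python
import numpy as np

# Sketch: projecting a point on the road surface into normalized camera
# image coordinates, given mounting information (height L, lateral offset H,
# vertical tilt theta_v, horizontal tilt theta_h). Illustrative only.
def project_road_point(p_road, L=0.9, H=0.0, theta_v=np.radians(50), theta_h=0.0):
    # Road frame: x to the right, y rearward from the rear end, z up.
    x, y = p_road
    p_world = np.array([x - H, y, 0.0 - L])        # point relative to the camera position
    # Rotate world -> camera: yaw about z (theta_h), then tilt about x (theta_v).
    cz, sz = np.cos(theta_h), np.sin(theta_h)
    cv, sv = np.cos(theta_v), np.sin(theta_v)
    yaw = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    tilt = np.array([[1, 0, 0], [0, cv, -sv], [0, sv, cv]])
    p_cam = tilt @ (yaw @ p_world)
    # Pinhole-style projection: divide by the depth component (here the rotated
    # y axis is taken as the viewing direction).
    depth = p_cam[1]
    return p_cam[0] / depth, p_cam[2] / depth

print(project_road_point((0.85, 2.0)))
```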
- The viewpoint conversion function calculation unit 135 further applies a viewpoint conversion function j(), determined based on the viewpoint information acquired from the information storage unit 11, to the coordinates f(h(i(P))) subjected to the projection plane conversion output from the projection plane conversion function calculation unit 134, thereby converting them into coordinates j(f(h(i(P)))) subjected to viewpoint conversion.
- The image obtained when a subject is imaged by the camera is an image as if the subject were seen from the position where the camera is attached. The conversion of this image into an image taken by a camera at another position (for example, a camera virtually installed at a predetermined height above the road surface behind the vehicle so as to face the road surface), that is, from another viewpoint, is the viewpoint conversion.
- This viewpoint transformation adds a kind of transformation called affine transformation to the original image.
- Affine transformation is coordinate transformation that combines translation and linear mapping.
- the parallel movement in the affine transformation corresponds to moving the camera from the attachment position defined by the attachment information to the other position.
- the linear mapping corresponds to rotating the camera so that it matches the direction of the camera existing at the other position from the direction defined by the mounting information.
- the viewpoint information includes parallel movement information related to the difference between the camera attachment position and the position of another viewpoint, and rotation information related to the difference between the direction defined by the camera attachment information and the direction of another viewpoint. Note that the image conversion used for the viewpoint conversion is not limited to the affine transformation, and may be another type of conversion.
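- As an illustration of the affine transformation described above (translation combined with a linear mapping), the following minimal sketch applies a 2x3 affine transform to an image coordinate; the matrix parameters are placeholders and are not derived from actual viewpoint information.

```python
import numpy as np

# Sketch: an affine transformation j() combining a linear map (rotation/scale)
# and a translation, applied to a 2-D image coordinate.
def affine_viewpoint(pt, angle_deg=12.0, scale=0.8, tx=40.0, ty=-25.0):
    a = np.radians(angle_deg)
    linear = scale * np.array([[np.cos(a), -np.sin(a)],
                               [np.sin(a),  np.cos(a)]])   # linear mapping part
    translation = np.array([tx, ty])                        # parallel movement part
    return linear @ np.asarray(pt, dtype=float) + translation

print(affine_viewpoint((320.0, 240.0)))
```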
- The video output conversion function calculation unit 136 further applies a video output function g(), determined based on the angle-of-view information acquired from the information storage unit 11, to the coordinates j(f(h(i(P)))) subjected to the viewpoint conversion, thereby converting them into video output coordinates g(j(f(h(i(P))))). Since the size of the camera image captured by the camera and the size of the image that can be displayed by the display unit 18 generally differ, the image is converted to a size that the display unit 18 can display.
- the video output conversion function g () is represented by a mapping function that uses the maximum horizontal field angle Xa and maximum vertical field angle Ya of the camera, and the maximum horizontal drawing pixel size Xp and maximum vertical drawing pixel size Yp in video output.
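- One way to read the mapping performed by g() is sketched below: a linear mapping from the camera's angle-of-view range (Xa, Ya) to the drawing pixel size (Xp, Yp). The concrete numbers and the centering convention are assumptions for illustration.

```python
# Sketch: video output conversion g() as a linear mapping from an angle-of-view
# range (degrees, centered on the optical axis) to drawing pixel coordinates.
def video_output(angle_x, angle_y, Xa=190.0, Ya=150.0, Xp=640, Yp=480):
    px = (angle_x / Xa + 0.5) * Xp   # map [-Xa/2, +Xa/2] -> [0, Xp]
    py = (angle_y / Ya + 0.5) * Yp   # map [-Ya/2, +Ya/2] -> [0, Yp]
    return px, py

print(video_output(30.0, -10.0))
```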
- The lens distortion function, the projection function, the viewpoint conversion function, the projection plane conversion function, and the video output function are calculated for each coordinate indicating the guide lines; the calculation does not have to be performed in this particular order.
- The projection plane conversion function f() in the projection plane conversion function calculation unit 134 includes the camera field angle (the maximum horizontal field angle Xa and the maximum vertical field angle Ya) as information indicating the size of the captured camera image. Therefore, even when a part of the camera image received by the camera image reception unit 15 is cut out and displayed, the guide lines can be displayed so as to match the cut-out camera image by changing the field-angle coefficient in the projection plane conversion function f().
- FIG. 4 is a block diagram showing a configuration of the camera image correction unit 16.
- the camera image correction unit 16 includes a lens distortion inverse function calculation unit 161, a projection distortion inverse function calculation unit 162, and a viewpoint conversion function calculation unit 163. These configurations may not be operated depending on display condition information. Therefore, for the sake of simplicity, the case where all of these configurations operate will be described first.
- The lens distortion inverse function calculation unit 161 obtains the inverse function i⁻¹() of the lens distortion function i() described above based on the lens distortion information included in the image generation information, and applies it to the camera image. Since the camera image transmitted from the camera unit 2 is affected by lens distortion when captured by the camera, calculating the inverse lens distortion function i⁻¹() corrects the camera image into one that is not affected by lens distortion.
- The projection inverse function calculation unit 162 obtains the inverse function h⁻¹() of the projection function h() described above based on the projection information included in the image generation information, and applies it to the camera image, output from the lens distortion inverse function calculation unit 161, that is no longer affected by lens distortion. Since the camera image transmitted from the camera unit 2 is distorted by the projection method of the lens when captured by the camera, calculating the inverse projection function h⁻¹() corrects the camera image into one that is not distorted by the projection method.
- The viewpoint conversion function calculation unit 163 applies the viewpoint conversion function j() described above, based on the viewpoint information included in the image generation information, to the camera image output from the projection inverse function calculation unit 162 that is no longer subjected to the projection distortion. In this way, a camera image subjected to viewpoint conversion is obtained.
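- In practice, corrections of this kind are often implemented as a per-pixel inverse mapping: for every pixel of the corrected output image, the forward distortion is applied to find the corresponding pixel in the captured camera image. The sketch below illustrates the idea with plain NumPy and nearest-neighbour sampling, reusing the illustrative radial-distortion model from the earlier sketch rather than the patent's exact functions.

```python
import numpy as np

# Sketch: building a corrected (undistorted) image by inverse mapping.
# For each output pixel we apply the *forward* distortion to find the source
# pixel in the captured camera image, then copy it (nearest neighbour).
def undistort(camera_image, k1=-0.25, k2=0.05):
    h, w = camera_image.shape[:2]
    out = np.zeros_like(camera_image)
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    for v in range(h):
        for u in range(w):
            # Normalized coordinates of the ideal (undistorted) pixel.
            x, y = (u - cx) / cx, (v - cy) / cy
            r2 = x * x + y * y
            factor = 1.0 + k1 * r2 + k2 * r2 * r2      # forward radial distortion
            su = int(round(cx + x * factor * cx))
            sv = int(round(cy + y * factor * cy))
            if 0 <= su < w and 0 <= sv < h:
                out[v, u] = camera_image[sv, su]
    return out

demo = (np.random.rand(120, 160) * 255).astype(np.uint8)
print(undistort(demo).shape)
```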
- The image superimposing unit 17 superimposes the guide line image drawn by the line drawing unit 14 and the corrected camera image output from the camera image correction unit 16 as images of different layers, so that the guide line image is overlaid on the corrected camera image.
- The display unit 18 applies the video output function g() to the corrected camera image (of the guide line image and the corrected camera image, which are in different layers), thereby changing the size of the corrected camera image to a size that the display unit 18 can display. Then, the guide line image and the size-changed corrected camera image are combined and displayed.
- the video output function g () may be executed by the camera image correction unit 16.
- the video output function g () may be executed on the guide line image by the display unit 18 instead of the guide line calculation unit 13.
- the operations of the guide line calculation unit 13 and the camera image correction unit 16 differ depending on the display condition information output from the display condition determination unit 12.
- As the display condition information, for example, the following four display conditions are conceivable, depending on the operation of the camera image correction unit 16, that is, on the difference in how the camera image is displayed.
- (1) First display condition: the camera image captured by the camera is displayed as it is, and a guide line image is drawn so as to match the camera image.
- the guide line calculation unit 13 calculates guide line information to which projection plane conversion is applied by adding lens distortion and distortion by a projection method. Since the camera lens of the camera unit 2 is a so-called fish-eye lens having an angle of view of 180 degrees or more, a wide range including the periphery of the camera installation location is displayed on the camera image, and the situation around the vehicle is easily grasped. It is suitable for confirming that there are no pedestrians around the vehicle when the vehicle starts. Since an image displayed under the first display condition is an image having distortion but a wide range can be seen, an image displayed under the first display condition is referred to as a wide-angle image.
- (2) Second display condition: the camera image correction unit 16 corrects the camera image so as to remove the lens distortion and the distortion due to the projection method.
- The guide line calculation unit 13 calculates guide line information to which only the projection plane conversion is applied. Since the result is an image in a rectangular coordinate system in which distances are easy to grasp, it is suitable for backward movement, in which grasping the sense of distance is important. Note that there is a limit to the angle of view at which linearity can be maintained, so the field of view is narrower than under the first display condition.
- An image displayed under the second display condition, which is an image obtained by removing the distortion due to the lens shape and the distortion due to the projection method, is referred to as an undistorted image.
- (3) Third display condition: the camera image correction unit 16 removes the lens distortion and the distortion due to the projection method, and further corrects the camera image as if the viewpoint had been converted.
- the guide line calculation unit 13 calculates guide line information to which projection plane conversion and viewpoint conversion are applied.
- the viewpoint after the viewpoint conversion is, for example, at a predetermined position and a predetermined height (for example, 5 m) such that the center of the rear end of the vehicle comes to the end of the image, and is facing directly below.
- The camera image converted to this viewpoint is an image of the road surface behind the vehicle viewed from directly above; angles between directions parallel and perpendicular to the vehicle appear as right angles, and the image gives a sense of distance close to the actual distances in the horizontal and vertical directions, so it is easy to grasp the positional relationship of the vehicle on the road surface.
- An image displayed under the third display condition is referred to as another viewpoint undistorted image.
- (4) Fourth display condition: the camera image correction unit 16 corrects the camera image as if only the viewpoint had been changed.
- the guide line calculation unit 13 calculates guide line information to which projection plane transformation and viewpoint transformation are applied by adding lens distortion and projection-type distortion.
- the viewpoint after the viewpoint conversion is the same as in the third display condition.
- the camera image converted into the viewpoint is an image obtained by viewing the road surface behind the vehicle from directly above, and although there is distortion, a wide range around the vehicle can be seen.
- An image displayed under the fourth display condition is referred to as a different viewpoint wide-angle image.
- An image displayed under the third display condition or the fourth display condition is referred to as a different viewpoint image.
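- The four display conditions differ only in which processing stages are applied to the camera image (and, correspondingly, which transformations are applied to the guide lines). The table below is a sketch summarizing the behaviour described above; the flag names are assumptions.

```python
# Sketch: which processing stages are active under each display condition,
# following the description of the camera image correction unit 16
# (True = stage is applied to the camera image).
DISPLAY_CONDITIONS = {
    1: {"remove_lens_distortion": False, "remove_projection_distortion": False,
        "viewpoint_conversion": False},   # wide-angle image
    2: {"remove_lens_distortion": True,  "remove_projection_distortion": True,
        "viewpoint_conversion": False},   # undistorted image
    3: {"remove_lens_distortion": True,  "remove_projection_distortion": True,
        "viewpoint_conversion": True},    # different viewpoint undistorted image
    4: {"remove_lens_distortion": False, "remove_projection_distortion": False,
        "viewpoint_conversion": True},    # different viewpoint wide-angle image
}

print(DISPLAY_CONDITIONS[3])
```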
- Under the first display condition, the guide line image generated by the line drawing unit 14 is as shown in FIG. 5.
- FIG. 5 is an example of a guide line image generated under the first display condition.
- a guide line image to which similar distortion is applied is generated so as to be matched with a camera image having lens distortion and projection-type distortion.
- A line L1a is a guide line indicating the width of the parking section, and corresponds to the straight line L1 in FIG. 3.
- A line L2a is a guide line indicating the width of the vehicle, and corresponds to the straight line L2 in FIG. 3.
- Lines L3a to L5a are guide lines indicating the distance from the vehicle, and correspond to the straight lines L3 to L5 in FIG. 3. Further, under the first display condition, none of the components of the camera image correction unit 16 shown in FIG. 4 is operated. That is, the camera image correction unit 16 outputs the input camera image to the image superimposing unit 17 as it is.
- Under the second display condition, in the configuration of the guide line calculation unit 13 shown in FIG. 2, the lens distortion function calculation unit 132, the projection function calculation unit 133, and the viewpoint conversion function calculation unit 135 are not operated.
- the coordinate P output from the guide line generation unit 131 is input to the projection plane conversion function calculation unit 134 as it is.
- The guide line image generated by the line drawing unit 14 is as shown in FIG. 6.
- FIG. 6 is an example of a guide line image generated under the second display condition. A guide line image without distortion is generated so as to match the camera image from which the lens distortion and the distortion due to the projection method have been removed.
- A straight line L1b is a guide line indicating the width of the parking section, and corresponds to the straight line L1 in FIG. 3.
- A straight line L2b is a guide line indicating the width of the vehicle, and corresponds to the straight line L2 in FIG. 3.
- Straight lines L3b to L5b are guide lines indicating the distance from the vehicle, and correspond to the straight lines L3 to L5 in FIG. 3.
- In the configuration of the camera image correction unit 16 shown in FIG. 4, the components other than the viewpoint conversion function calculation unit 163 are operated. That is, the camera image output from the projection inverse function calculation unit 162 is input to the image superimposing unit 17 as the corrected camera image.
- FIG. 7 shows a photograph of an image displayed on the display device, illustrating the relationship between the wide-angle image displayed under the first display condition and the undistorted image displayed under the second display condition.
- the upper side of FIG. 7 is a wide-angle image displayed under the first display condition, and a wide range is displayed although the peripheral portion of the image is distorted.
- the lower side is an undistorted image displayed under the second display condition. In the non-distorted image, the portion surrounded by the black square at the center of the wide-angle image is displayed without distortion.
- In a normal lens, the image height Y is a tangent function (f·tan θ) of the incident angle θ, and the range of incident angles that can be imaged is roughly −45 to +45 degrees. Incident light with an incident angle outside this range is greatly distorted, so that it either does not reach the imaging surface or, even if an image is formed, the image is heavily distorted.
- the camera unit 2 according to the present embodiment uses a fisheye lens, it can capture a wider field angle with less distortion than a normal lens.
- Under the third display condition, in the configuration of the guide line calculation unit 13 illustrated in FIG. 2, the components other than the lens distortion function calculation unit 132 and the projection function calculation unit 133 are operated. That is, the coordinates P of the points on the guide lines generated by the guide line generation unit 131 are input to the viewpoint conversion function calculation unit 135 as they are.
- the guide line image generated by the line drawing unit 14 is as shown in FIG.
- all the components of the camera image correction unit 16 shown in FIG. 4 are operated.
- A guide line image without distortion, as seen from another viewpoint, is superimposed and displayed on a camera image from which the lens distortion and the projection-type distortion have been removed and which has been converted to the other viewpoint.
- FIG. 8 shows a photograph of an image displayed on the display device, illustrating an example of the relationship between the wide-angle image displayed under the first display condition and the different viewpoint undistorted image displayed under the third display condition.
- the lower side of FIG. 8 is an undistorted image displayed under the third display condition.
- a portion surrounded by a black square at the center of the wide-angle image is displayed as an image having no distortion as viewed from the viewpoint above the rear of the vehicle.
- FIG. 9 is an example of a guide line image generated under the fourth display condition.
- a guide line image as seen from another viewpoint is generated by applying the same distortion so as to match with a camera image having a lens distortion taken from another viewpoint and distortion by a projection method.
- A line L1c is a guide line indicating the width of the parking section, and corresponds to the straight line L1 in FIG. 3.
- A line L2c is a guide line indicating the width of the vehicle, and corresponds to the straight line L2 in FIG. 3.
- Lines L3c to L5c are guide lines indicating the distance from the vehicle, and correspond to the straight lines L3 to L5 in FIG. 3. Further, under the fourth display condition, only the viewpoint conversion function calculation unit 163 is operated in the configuration of the camera image correction unit 16 illustrated in FIG. 4. In other words, the camera image received by the camera image reception unit 15 is directly input to the viewpoint conversion function calculation unit 163, and the image subjected to the viewpoint conversion by the viewpoint conversion function calculation unit 163 is output to the image superimposing unit 17 as the corrected camera image.
- FIG. 10 is a diagram illustrating the changes in the vehicle state recognized by the display condition determination unit 12.
- The vehicle states recognized by the display condition determination unit 12 include the following states.
- Note that the speed V is treated as positive when the vehicle is moving in the reverse direction.
- Reverse preparation state (JB) A state in which preparations are made for reverse operation.
- the condition C JB for the reverse preparation state (JB) is as follows.
- C JB The gear state is reverse, the moving distance L is zero, and the speed V is zero.
- Reverse start state (JC) A state from the start of reversing until the vehicle has moved a predetermined distance (L1).
- When the following condition C JC is satisfied, the reverse start state (JC) is set.
- C JC The gear state is reverse, the moving distance L is positive and less than the predetermined distance (L1), and the speed V is positive and less than the predetermined speed (Vr1).
- Reversible state (JD) A state in which the vehicle has stopped after the start of reversing but before moving the predetermined distance (L1).
- C JD The gear state is reverse, the moving distance L is positive and less than the predetermined distance (L1), the speed V is zero, and the side brake is OFF (not effective). If the side brake is turned on (effective) in the reversible state (JD), a reverse stop state (JM) described later is set.
- Non-reversible state (JE) A state in which the transmission has been put into a state other than reverse while in the reversible state (JD) and the predetermined time (Tn1) has not elapsed. When the predetermined time (Tn1) elapses, the initial state (JA) is set.
- C JE The moving distance L is positive and less than the predetermined distance (L1), the speed V is zero, the gear state is other than reverse, the non-reverse duration (Tn) is less than the predetermined time (Tn1), and the side brake is OFF. If the side brake is turned on in the non-reversible state (JE), the reverse stop state (JM) described later is set. If the gear state becomes reverse, the reversible state (JD) is set.
- Reverse state (JF) A state in which reversing continues after the vehicle has moved the predetermined distance (L1) or more from the start of reversing, and the deceleration condition, which is the stop transition detection condition, is not satisfied.
- When the deceleration condition is satisfied, the reverse stop transition state (JG) described next is set.
- The deceleration condition is that deceleration continues, that is, the acceleration a remains negative for a predetermined time (Ta1). The duration requirement is imposed so that, when the acceleration a fluctuates frequently between negative and zero or more, the reverse state (JF) and the reverse stop transition state (JG) are not switched frequently at short intervals.
- C JF The gear state is reverse, the moving distance L is not less than the predetermined distance (L1), the speed V is positive and less than the predetermined speed (Vr1), and the deceleration condition C gn is not satisfied.
- C gn Acceleration a is negative and duration (Ta) in which acceleration a is negative is equal to or longer than a predetermined time (Ta1).
- Reverse stop transition state (JG) A state in which the vehicle is moving backward while the deceleration condition is satisfied, after entering the reverse state (JF).
- C JG The gear state is reverse, the moving distance L is not less than the predetermined distance (L1), the speed V is positive and less than the predetermined speed (Vr1), and the deceleration condition C gn is satisfied.
- Re-reversible state (JH) A state in which the vehicle has stopped and can reverse again after entering the reverse stop transition state (JG).
- C JH The gear state is reverse, the side brake is OFF, the moving distance L is not less than the predetermined distance (L1), and the speed V is zero.
- Non-retreatable state (JK) A state in which the transmission has been put into a state other than reverse while in the re-reversible state (JH) and the predetermined time (Tn1) has not elapsed. When the predetermined time (Tn1) elapses, the initial state (JA) is set.
- C JK The moving distance L is not less than the predetermined distance (L1), the speed V is zero, the gear state is other than reverse, the non-reverse duration (Tn) is less than the predetermined time (Tn1), and the side brake is OFF. If the side brake is turned on in the non-retreatable state (JK), the reverse stop state (JM) described later is set. If the gear state becomes reverse, the re-reversible state (JH) is set.
- Re-reverse state (JL) A state in which the vehicle is reversing again immediately after the re-reversible state (JH).
- C JL The gear state is reverse, the speed V is positive and less than the predetermined speed (Vr1), and the moving distance L is equal to or greater than the predetermined distance (L1).
- Reverse stop state (JM) A state in which the vehicle has stopped and cannot reverse, after having been in a state other than the reverse preparation state (JB).
- C JM The speed V is zero and the side brake is ON.
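- The state definitions and conditions C JB to C JM above can be read as a small state machine driven by the gear state, speed, acceleration, moving distance, and side brake. The sketch below is an illustrative simplification (it is not the flowchart of FIGS. 11 and 12, the deceleration condition is omitted, and the threshold values are placeholders) showing how a few of the conditions might be checked.

```python
from dataclasses import dataclass

# Sketch: checking a few of the vehicle-state conditions described above.
# L1_DIST, VR1 and TN1 stand in for the predetermined distance (L1),
# speed (Vr1) and time (Tn1); the values are placeholders.
L1_DIST, VR1, TN1 = 0.5, 10.0, 3.0

@dataclass
class VehicleInfo:
    gear_reverse: bool
    speed: float          # treated as positive while reversing
    moved: float          # distance moved since reversing started
    side_brake_on: bool
    non_reverse_time: float = 0.0

def classify(v: VehicleInfo) -> str:
    if v.gear_reverse and v.moved == 0 and v.speed == 0:
        return "JB"                               # reverse preparation state
    if v.gear_reverse and 0 < v.moved < L1_DIST and 0 < v.speed < VR1:
        return "JC"                               # reverse start state
    if v.gear_reverse and 0 < v.moved < L1_DIST and v.speed == 0 and not v.side_brake_on:
        return "JD"                               # reversible state
    if v.gear_reverse and v.moved >= L1_DIST and 0 < v.speed < VR1:
        return "JF"                               # reverse state (deceleration check omitted)
    if v.speed == 0 and v.side_brake_on:
        return "JM"                               # reverse stop state
    return "JA"                                   # otherwise treated as the initial state

print(classify(VehicleInfo(gear_reverse=True, speed=4.0, moved=0.2, side_brake_on=False)))
```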
- The display condition determination unit 12 determines the display conditions as follows. (1) The first display condition is set in the reverse preparation state (JB), the reverse start state (JC), the reversible state (JD), and the non-reversible state (JE).
- the camera image is an image captured by the camera as it is, and has lens distortion and distortion due to the projection method. Since the camera lens of the camera unit 2 is a so-called fish-eye lens having an angle of view of 180 degrees or more, a wide range including the periphery of the camera installation location is displayed on the camera image, and the situation around the vehicle is easily grasped. It is suitable for confirming that there are no pedestrians around the vehicle when the vehicle starts. Since the guide line image is also displayed so as to match the camera image, it is easy to grasp the distance from the parking section.
- the reverse preparation state (JB), the reverse possibility state (JD), and the reverse impossible state (JE) are movement preparation states in which the vehicle is movable and the vehicle is stopped.
- the predetermined moving condition for determining that the vehicle is moving is that the vehicle moves a predetermined distance (L1).
- the reverse start state (JC), which is a state where the vehicle is moving backward until the vehicle moves a predetermined distance (L1), is the movement start state.
- The reverse state (JF), in which the vehicle moves backward after moving the predetermined distance (L1), is a moving state in which the vehicle is moving after the moving condition is satisfied. (2) The second display condition is set in the reverse state (JF).
- (3) The third display condition is set in the reverse stop transition state (JG), the re-reversible state (JH), the reverse stop state (JM), and the non-retreatable state (JK).
- the camera image converted from the viewpoint is an image when the road surface behind the vehicle is viewed from directly above, and the angle between the directions parallel or perpendicular to the vehicle appears to be a right angle, and is close to the actual distance in the horizontal and vertical directions. Since the image provides a sense of distance, it is easy to grasp the positional relationship of the vehicle on the road surface.
- The reverse stop transition state (JG) is a stop transition state, that is, a state in which it is detected that a predetermined stop transition detection condition (in this embodiment, the deceleration condition C gn) for detecting that the vehicle has started to stop is satisfied.
- The re-reversible state (JH), the reverse stop state (JM), and the non-retreatable state (JK) are stop states in which the vehicle is stopped after the stop transition state.
- the re-retreat state (JL) is a re-movement state in which the vehicle is moving after the stop state.
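- Summarizing the assignment described above, the mapping from the determined vehicle state to the display condition can be sketched as follows; the handling of the re-reverse state (JL), which shows the wide-angle image during the confirmation period and the different viewpoint undistorted image afterwards, is expressed here as a pair, and this representation is an assumption for illustration.

```python
# Sketch: display condition selected for each vehicle state, following the
# description above (1 = wide-angle, 2 = undistorted, 3 = different viewpoint
# undistorted). For JL the display switches after the confirmation period.
STATE_TO_DISPLAY = {
    "JB": 1, "JC": 1, "JD": 1, "JE": 1,   # movement preparation / movement start
    "JF": 2,                              # moving state
    "JG": 3, "JH": 3, "JM": 3, "JK": 3,   # stop transition / stop states
    "JL": (1, 3),                         # re-moving: wide-angle, then different viewpoint
}

print(STATE_TO_DISPLAY["JF"])
```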
- In the initial state (JA), the screen of the navigation device is displayed on the display device.
- Alternatively, the screen that was displayed before entering the reverse preparation state (JB), or a screen determined by the state at the time of returning to the initial state (JA), may be displayed. Note that the screen of the state immediately before the change to the initial state (JA) may continue to be displayed until an event that changes the display of the screen occurs.
- FIGS. 11 and 12 are flowcharts for explaining the operation of the display condition determination unit 12 for determining the vehicle state.
- FIGS. 11 and 12 will be described together with FIG. 10, which explains the state changes.
- In S5, S O is substituted into S N. If C EN is not satisfied in S4, it is checked in S6 whether S O is the initial state. When C JA is not satisfied, the speed V is equal to or greater than zero and less than the predetermined speed (Vr1), and when the speed V is not zero, the gear state is reverse.
- If S O is not the initial state in S6, the information necessary for determining the vehicle state is calculated in S10 to S16.
- If the gear state is other than R, it is checked in S42 whether the duration (Tn) for which the gear state has been other than R is equal to or longer than the predetermined time (Tn1). If it is equal to or longer than the predetermined time (Tn1), S N is set to the initial state (JA) in S43, and the moving distance L is set to 0 (arrow t16 in FIG. 10). If it is not equal to or longer than the predetermined time (Tn1), S N is set to the non-reversible state (JE) in S44 (arrow t17 in FIG. 10). When S O is not the non-reversible state (JE) in S37, it is checked in S45 whether S O is the reverse state (JF) or the reverse stop transition state (JG).
- S N is set to the reverse state (JF) in S50 (arrows t22 and t23 in FIG. 10). If S O is not the reverse state (JF) or the reverse stop transition state (JG) in S45, it is checked in S51 whether S O is the re-reversible state (JH).
- If S O is the re-reversible state (JH), it is checked in S52 whether the speed V is zero. If the speed V is not zero, S N is set to the re-reverse state (JL) in S53 (arrow t26 in FIG. 10). If the speed V is zero, it is checked in S54 whether the side brake is ON. When the side brake is ON, S N is set to the reverse stop state (JM) in S55 (arrow t27 in FIG. 10). If the side brake is OFF, it is checked in S56 whether or not the gear state is R.
- SN is set in a non-retreatable state (JK) in S66 (arrow t34 in FIG. 10).
- If S O is not the non-retreatable state (JK) in S59, it is checked in S67 whether S O is the re-reverse state (JL).
- If S O is the re-reverse state (JL) in S67, it is checked in S68 whether the speed V is zero.
- If the speed V is zero, S N is set to the re-reversible state (JH) in S69 (arrow t35 in FIG. 10). If the speed V is not zero, S N is set to the re-reverse state (JL) in S70 (arrow t36 in FIG. 10).
- If S O is not the non-retreatable state (JK), S N is set to the reverse stop state (JM) in S66.
- As described above, the display condition determination unit 12 determines the vehicle state from the state of the transmission and other vehicle information, that is, the reverse preparation state (JB), the reverse start state (JC), the reversible state (JD), the non-reversible state (JE), the reverse state (JF), the reverse stop transition state (JG), the re-reversible state (JH), the non-retreatable state (JK), the re-reverse state (JL), and the reverse stop state (JM).
- An appropriate camera image can be displayed for assisting the driver according to the determined vehicle state.
- In the movement preparation state, that is, the reverse preparation state (JB), the reversible state (JD), and the non-reversible state (JE), and in the movement start state, that is, the reverse start state (JC) in which the vehicle is moving until the moving condition is satisfied, a wide-angle image, which is a camera image showing a wide range although it has distortion due to the fisheye lens, is displayed, so that it is easy to confirm the situation around the vehicle.
- In the moving state, that is, the reverse state (JF) in which the vehicle is moving after the moving condition is satisfied, an undistorted image, which is an image from which the lens distortion and the distortion due to the projection method have been removed, is displayed, so that the sense of distance is easy to grasp and the vehicle can easily be backed to an appropriate position.
- In the stop transition state, which is a state in which it is detected that a predetermined stop transition detection condition for detecting that the moving vehicle has started to stop is satisfied, and in the stop state in which the vehicle is stopped after the stop transition state, that is, the re-reversible state (JH), the non-retreatable state (JK), and the reverse stop state (JM), a different viewpoint undistorted image, which is an image from which the lens distortion and the distortion due to the projection method have been removed and in which the area behind the vehicle is viewed from another viewpoint above it, is displayed, so that it is easy to grasp the positional relationship of the vehicle on the road surface.
- After the vehicle enters the re-moving state, the wide-angle image, which is a camera image that is distorted by the fisheye lens but shows a wide range, is displayed during a predetermined moving-direction situation confirmation period, so that it is easy to check the surrounding situation when resuming movement. After the moving-direction situation confirmation period has elapsed, the different viewpoint undistorted image is displayed, so that it is easy to grasp the positional relationship of the vehicle on the road surface.
- In the present embodiment, the guide line image is displayed superimposed on the camera image, but the effects described above can be obtained simply by changing the camera image according to the vehicle state. Displaying the guide line image additionally makes it easier to grasp the position of the vehicle after movement, and is particularly effective when the vehicle is backed up for parking.
- In the present embodiment, the moving condition is that the moving distance after the start of movement is equal to or greater than a predetermined distance; however, other conditions may be used, such as the time after the start of movement being equal to or longer than a predetermined time, or the vehicle speed becoming equal to or higher than a predetermined speed.
- As the predetermined stop transition detection condition for detecting that the moving vehicle has started to stop, it is assumed here that deceleration continues for a predetermined time; however, other conditions may be used, such as the vehicle speed becoming equal to or lower than a predetermined speed, or the vehicle speed becoming equal to or lower than a predetermined speed after the vehicle has moved a certain distance.
- The condition for determining that the vehicle has stopped is that the speed is zero and the side brake is ON; however, other conditions may be used, such as a predetermined time having elapsed since the vehicle stopped.
- Information on the steering angle of the steering device that changes the traveling direction of the vehicle may also be input as vehicle information, and the undistorted image of the area behind the vehicle may be displayed only when it is determined that the vehicle is in the moving state and, from the steering angle, that the vehicle is traveling substantially straight. When the vehicle moves with a large steering angle and is turning, the driver may be trying to avoid an obstacle near the vehicle, so it is desirable to keep displaying the wide-angle image in that case.
- In this embodiment, the vehicle information acquisition unit acquires the moving distance of the vehicle in one cycle from the electronic control unit; however, it may acquire only the speed and obtain the moving distance in one cycle by trapezoidal approximation using the previous and current speeds and the duration of one cycle.
- The acceleration may be output by the electronic control unit, or it may be calculated in the vehicle information acquisition unit from the previous and current speeds.
- In short, the vehicle information acquisition unit may be anything as long as it acquires the vehicle information necessary for the driving support device. The above also applies to the other embodiments.
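The alternative mentioned above, deriving the per-cycle moving distance and the acceleration from successive speed samples, could look like the following Python sketch; the function names and the fixed cycle time are assumptions for illustration only.

```python
def distance_in_cycle(prev_speed_mps, curr_speed_mps, cycle_s):
    """Trapezoidal approximation of the distance moved in one vehicle-information cycle."""
    return 0.5 * (prev_speed_mps + curr_speed_mps) * cycle_s

def acceleration_in_cycle(prev_speed_mps, curr_speed_mps, cycle_s):
    """Acceleration estimated from the previous and current speed samples."""
    return (curr_speed_mps - prev_speed_mps) / cycle_s

# Example: speeds of 0.8 m/s and 1.0 m/s over a 100 ms cycle give
# about 0.09 m of travel and 2.0 m/s^2 of acceleration.
d = distance_in_cycle(0.8, 1.0, 0.1)
a = acceleration_in_cycle(0.8, 1.0, 0.1)
```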
- Embodiment 2. In the driving support system according to Embodiment 1, the case where the vehicle is parked while moving backward has been described; however, the vehicle may also be parked while moving forward. When the vehicle is parked while moving forward, a driving support device is not particularly necessary for a small vehicle because the driver can directly see the situation around the vehicle; in the case of a large vehicle with a high driver's seat, however, it is difficult to check the situation in front of the vehicle from the driver's seat, so the need for a driving support device is high. Therefore, the driving support system according to Embodiment 2 determines the state of the vehicle and switches the displayed camera image when the vehicle is parked while moving forward. In addition, no guide line image for the road surface is displayed.
- FIG. 13 is a block diagram illustrating the configuration of the driving support system according to Embodiment 2. Only the differences from FIG. 1, which shows the configuration of Embodiment 1, will be described.
- The driving support system includes a host unit 1a, which is the driving support device, and a camera unit 2.
- The host unit 1a does not include the guide line calculation unit 13 (guide line information generation unit), the line drawing unit 14 (guide line image generation unit), or the image superimposition unit 17. Therefore, the image output from the camera image correction unit 16 is displayed on the display unit 18, and the camera image correction unit 16 constitutes the image output unit.
- the information storage unit 11a stores field angle information, projection information, lens distortion information, and viewpoint information.
- The vehicle information acquisition unit 10a acquires gear state information indicating the state of the transmission of the vehicle (gear state), speed information indicating the speed of the vehicle, and moving distance information indicating the moving distance of the vehicle in one cycle in which the vehicle information is detected.
- The display condition determination unit 12a (vehicle state determination unit) generates display condition information, indicating how to display the camera image on the display unit 18, based on the vehicle information acquired by the vehicle information acquisition unit 10a.
- The camera unit 2 has a camera installed at a position in front of the vehicle from which a portion that cannot be seen from the driver's seat can be imaged.
- When the gear state acquired by the vehicle information acquisition unit 10a of the host unit 1a is a state in which the vehicle can move forward, for example low (L), second (S), drive (D), or neutral (N), the host unit 1a controls the camera of the camera unit 2 so that it captures an image and transmits the camera image.
- A gear state in which the vehicle can move forward is called a forward gear (abbreviated as Fw).
- FIG. 14 is a diagram for explaining changes in the vehicle state recognized by the display condition determination unit 12a.
- The vehicle states recognized by the display condition determination unit 12a include the following states.
- The speed of the vehicle is taken to be positive when the vehicle is moving in the forward direction.
- Initial state (KA): A state other than the following states. The vehicle enters the initial state when the engine is turned on. It is not a state to be supported by the driving support device.
- When the gear state is not the forward gear, or when the speed V becomes equal to or higher than the predetermined speed (Vr1), the state returns to the initial state (KA).
- The following conditions are not all of the conditions for the initial state (KA), but when the following conditions are satisfied, it can be determined that the state is the initial state (KA).
- The following condition C_KA is referred to as the condition that clearly indicates the initial state during forward movement.
- C_KA: the speed V is negative, the speed V is equal to or higher than the predetermined speed (Vr1), or the gear state is other than the forward gear.
- Forward preparation state (KB): A state of preparing to move forward.
- C_KB: the gear state is the forward gear, the moving distance L is zero, and the speed V is zero.
- Forward start state (KC): A state from the start of forward movement until the vehicle has moved a predetermined distance.
- When the vehicle starts to move forward, the forward start state (KC) is entered.
- C_KC: the gear state is the forward gear, the moving distance L is positive and less than the predetermined distance (L1), and the speed V is positive and less than the predetermined speed (Vr1).
- Forward possible state (KD): A state in which the vehicle has stopped after starting to move forward but before moving the predetermined distance (L1), and the predetermined time (Tz1) has not elapsed since the stop.
- C_KD: the gear state is the forward gear, the moving distance L is positive and less than the predetermined distance (L1), the speed V is zero, and the duration (Tz) for which the speed V has been zero is less than the predetermined time (Tz1).
- If the stop continues for the predetermined time (Tz1) or longer, the state returns to the initial state (KA).
- Forward state (KE): A state in which the vehicle continues to move forward after moving the predetermined distance (L1) or more from the start of forward movement, and the low-speed condition, which is the stop transition detection condition, is not satisfied.
- The low-speed condition is that the speed V remains below the predetermined speed (Vr2, Vr2 < Vr1) for the predetermined time (Tv2) or longer.
- The duration requirement on the speed V being below the predetermined speed (Vr2) is imposed to prevent the forward state (KE) and the forward stop transition state (KF) from switching frequently at short intervals when the speed V fluctuates around the predetermined speed (Vr2).
- C_KE: the gear state is the forward gear, the moving distance L is equal to or greater than the predetermined distance (L1), the speed V is positive and less than the predetermined speed (Vr1), and the low-speed condition C_lw is not satisfied.
- C_lw: the speed V is less than the predetermined speed (Vr2), and the duration (Tv) for which the speed V has been less than the predetermined speed (Vr2) is equal to or longer than the predetermined time (Tv2).
- Forward stop transition state (KF): A state in which the vehicle is still moving forward with the low-speed condition satisfied after the forward state (KE) has been reached.
- C_KF: the gear state is the forward gear, the moving distance L is equal to or greater than the predetermined distance (L1), the speed V is positive and less than the predetermined speed (Vr1), and the low-speed condition C_lw is satisfied.
- Forward stop state (KG): A state in which the vehicle has stopped after entering the forward state (KE), and the predetermined time (Tz1) has not elapsed since the stop.
- C_KG: the speed V is zero, the gear state is the forward gear, and the duration (Tz) for which the speed V has been zero is less than the predetermined time (Tz1).
- Re-forward state (KH): A state in which the vehicle is moving forward again after the forward stop state (KG).
- C_KH: the gear state is the forward gear, the speed V is positive and less than the predetermined speed (Vr1), and the moving distance L is equal to or greater than the predetermined distance (L1).
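To make the relationship between the conditions C_KA to C_KH listed above and the state transitions easier to follow, here is a compressed Python sketch of the forward-parking state determination; the thresholds named L1, Vr1, Vr2, Tz1, and Tv2 come from the description above, while the concrete numeric values, data structure, and function names are assumptions, not the patent's flowchart.

```python
from dataclasses import dataclass

@dataclass
class VehicleInfo:
    forward_gear: bool   # gear state is a forward gear (Fw)
    speed: float         # V, positive when moving forward
    distance: float      # accumulated moving distance L
    t_zero_speed: float  # Tz: how long the speed has been zero
    t_low_speed: float   # Tv: how long the speed has been below Vr2

L1, VR1, VR2, TZ1, TV2 = 2.0, 2.0, 0.5, 10.0, 2.0   # assumed example thresholds

def next_state(prev, v: VehicleInfo):
    """One cycle of the state determination for forward parking (KA..KH) - a sketch."""
    # C_KA: clearly the initial state during forward movement.
    if v.speed < 0 or v.speed >= VR1 or not v.forward_gear:
        return "KA"
    low_speed = v.speed < VR2 and v.t_low_speed >= TV2            # C_lw
    if v.distance == 0 and v.speed == 0:
        return "KB"                                               # C_KB
    if 0 < v.distance < L1:
        if v.speed > 0:
            return "KC"                                           # C_KC
        return "KD" if v.t_zero_speed < TZ1 else "KA"             # C_KD
    # moving distance >= L1 from here on
    if v.speed == 0:
        return "KG" if v.t_zero_speed < TZ1 else "KA"             # C_KG
    if prev in ("KG", "KH"):
        return "KH"                                               # C_KH (re-forward)
    return "KF" if low_speed else "KE"                            # C_KF / C_KE
```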
- The display condition determination unit 12a determines the display conditions as follows. (1) In the forward preparation state (KB), the forward start state (KC), and the forward possible state (KD), the first display condition is set.
- The camera image is the image captured by the camera as it is, and has lens distortion and distortion due to the projection method. Since the lens of the camera of the camera unit 2 is a so-called fisheye lens having an angle of view of 180 degrees or more, a wide range including the surroundings of the camera installation location is shown in the camera image, so the situation around the vehicle is easy to grasp. This is suitable for confirming that there are no pedestrians or the like around the vehicle when the vehicle starts to move.
- The forward preparation state (KB) and the forward possible state (KD) are the movement preparation states in which the vehicle can move and is stopped.
- The predetermined moving condition for determining that the vehicle is moving is that the vehicle has moved the predetermined distance (L1).
- The forward start state (KC), which is a state in which the vehicle is moving forward until it has moved the predetermined distance (L1), is the movement start state.
- The forward state (KE), in which the vehicle is moving forward after moving the predetermined distance (L1), is the moving state in which the vehicle is moving after the moving condition is satisfied.
- (2) In the forward state (KE), the second display condition is set. (3) In the forward stop transition state (KF) and the forward stop state (KG), the third display condition is set.
- The viewpoint after the viewpoint conversion is, for example, at a predetermined position and a predetermined height (for example, 5 m) such that the center of the front end of the vehicle is at the edge of the image, and it faces directly downward.
- The camera image converted to this viewpoint is an image of the road surface in front of the vehicle viewed from directly above; angles between directions parallel or perpendicular to the vehicle appear as right angles, and the image gives a sense of distance close to the actual distances in the horizontal and vertical directions, so the positional relationship of the vehicle on the road surface is easy to grasp.
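One common way to realize such a viewpoint conversion is to warp an already undistorted camera image with a road-plane homography. The following hedged sketch uses OpenCV and four assumed correspondences between road-surface points (in metres, in front of the vehicle) and their pixel positions in the undistorted image; the concrete numbers are placeholders, not values from the patent.

```python
import cv2
import numpy as np

# Four points on the road surface in front of the vehicle (x right, y forward, metres)
# and their assumed pixel positions in the undistorted camera image.
road_pts_m = np.float32([[-1.0, 1.0], [1.0, 1.0], [1.0, 4.0], [-1.0, 4.0]])
image_pts_px = np.float32([[180, 620], [1100, 620], [860, 260], [420, 260]])

# Map road coordinates to pixels of the top-down output image (assumed 100 px per metre,
# with the centre of the vehicle front end at the bottom centre of the output).
out_w, out_h, px_per_m = 640, 480, 100.0
topdown_pts_px = np.float32(
    [[out_w / 2 + x * px_per_m, out_h - y * px_per_m] for x, y in road_pts_m])

H = cv2.getPerspectiveTransform(image_pts_px, topdown_pts_px)

def to_birds_eye(undistorted_bgr):
    """Warp the undistorted camera image into a top-down (another-viewpoint) image."""
    return cv2.warpPerspective(undistorted_bgr, H, (out_w, out_h))
```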
- The forward stop transition state (KF) is the stop transition state, which is a state in which it is detected that the predetermined stop transition detection condition (the low-speed condition C_lw in this embodiment) for detecting that the vehicle is starting to stop is satisfied.
- The forward stop state (KG) is the stop state in which the vehicle is stopped after the stop transition state.
- The re-forward state (KH) is the re-moving state in which the vehicle is moving after the stop state.
- In the initial state (KA), a screen determined by the state of the display unit, for example the screen of the navigation device, is displayed.
- For example, the screen that was displayed before the forward preparation state (KB) was entered, or a screen determined by the state at the time of returning to the initial state (KA), is displayed. Note that the screen of the state immediately before the change to the initial state (KA) may continue to be displayed until an event that changes the screen display occurs.
- FIGS. 15 and 16 are flowcharts for explaining the operation of determining the vehicle state in the display condition determination unit 12a.
- FIGS. 15 and 16 will be described together with their relationship to FIG. 14, which illustrates the state changes.
- First, the display condition determination unit 12a sets the vehicle state (S_O) to the initial state (KA) in U2. Thereafter, the processing from U3 onward is repeatedly executed at the cycle (ΔT) at which vehicle information is input from the ECU, and a new vehicle state (S_N) is determined. In U3, it is checked whether the condition C_KA, which clearly indicates the initial state during forward movement, is satisfied.
- If C_KA is satisfied, S_N is set to the initial state (KA) in U4 and the moving distance L is set to 0 (all arrows entering the initial state (KA) in FIG. 14).
- S_O is then updated to S_N in U5. If C_KA is not satisfied in U3, it is checked in U6 whether S_O is the initial state (KA). When C_KA is not satisfied, the speed V is not less than zero and less than the predetermined speed (Vr1), and the gear state is the forward gear.
- If S_O is not the initial state (KA) in U6, the information necessary for determining the vehicle state is calculated in U10 to U16.
- If S_O is the re-forward state (KH), it is checked in U45 whether the speed V is zero. If the speed V is zero, S_N is set to the forward stop state (KG) in U46 (arrow w24 in FIG. 14). If the speed V is not zero, S_N is set to the re-forward state (KH) in U47 (arrow w25 in FIG. 14).
- As described above, the state of the vehicle is determined, that is, whether the vehicle is in the forward preparation state (KB), the forward start state (KC), the forward possible state (KD), the forward state (KE), the forward stop transition state (KF), the forward stop state (KG), the re-forward state (KH), or the initial state (KA).
- An appropriate camera image can be displayed to assist the driver according to the determined vehicle state. Specifically, in the forward preparation state (KB) and the forward start state (KC), a wide range of the camera image captured through the fisheye lens (with distortion) is displayed, so the surrounding situation is easy to check at the start of forward movement.
- In the forward state (KE), an image from which the lens distortion and the distortion due to the projection method have been removed is displayed, so the sense of distance is easy to grasp and the vehicle can easily be advanced to an appropriate position.
- In the forward stop transition state (KF) and the forward stop state (KG), an image from which the lens distortion and the distortion due to the projection method have been removed and which shows the road surface viewed from above the vehicle is displayed, so the positional relationship of the vehicle on the road surface is easy to grasp.
- In Embodiment 1 the vehicle moves backward, and in Embodiment 2 the vehicle moves forward; in both cases an image is displayed so that the driver can easily understand the road surface condition in the moving direction.
- In this way, the road surface in the moving direction may be displayed by an appropriate display method according to the vehicle state.
- In the embodiments so far, when the vehicle moves again after stopping, the driver is supported only when the vehicle moves in the same direction as before the stop.
- When the vehicle moves again after stopping, the driver may also be supported when the vehicle moves in a direction different from that before the stop.
- The above also applies to the other embodiments.
- In Embodiments 1 and 2, the host unit is provided with a display unit. Alternatively, an image output device 4 that outputs a composite image in which a guide line image is superimposed on a camera image may be combined with an external display device 5, for example an in-vehicle navigation device, so that the composite image output from the image output device 4 is displayed on the display device 5.
- In this embodiment, the image output device 4 is the driving support device.
- FIG. 17 is a block diagram illustrating the configuration of the driving support system according to Embodiment 3. Components that are the same as or correspond to those in FIG. 1 are given the same reference numerals, and descriptions thereof are omitted.
- In FIG. 17, gear state information is output from the electronic control unit 3 to the vehicle information acquisition unit 10 and the display device 5. Since the connection interface with the electronic control unit 3 in the image output device 4 is the same as that of a general navigation device, communication between the image output device 4 and the electronic control unit 3 can be performed without preparing a special interface. The image signal output from the image output device 4 is input to an external input terminal of the display device 5.
- The display device 5 switches to a mode for displaying the image input to the external input terminal while the gear state information indicating that the gear state of the vehicle is reverse is being input from the electronic control unit 3, and displays the image output from the image output device 4. Therefore, when the driver of the vehicle puts the transmission into reverse, the composite image is output from the image output device 4 and displayed on the display device 5. Thus, parking can be supported by displaying an image of the road surface behind the vehicle during parking.
- In this embodiment, the display device 5 displays the image output from the image output device 4 when the gear state information indicating that the gear state of the vehicle is reverse is input from the electronic control unit 3.
- Alternatively, the display device 5 may be provided with a changeover switch for switching to the mode for displaying the image input to its external input terminal, and the image output from the image output device 4 may be displayed when the user presses this changeover switch. This point also applies to the other embodiments.
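The mode switching described here, showing the external input while the reverse-gear signal is asserted or when a manual changeover switch is pressed, could be sketched as follows; the signal names are assumptions used only to illustrate the decision logic.

```python
def display_source(gear_is_reverse: bool, changeover_pressed: bool) -> str:
    """Decide what the display device shows (assumed signal names, not the device's API)."""
    if gear_is_reverse or changeover_pressed:
        return "external_input"   # composite image from the image output device
    return "navigation_screen"    # normal screen of the display device
```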
- In Embodiment 1, the host unit determines the display conditions based on the vehicle state and combines the camera image transmitted from the camera unit with the guide line image.
- Alternatively, the vehicle information acquisition unit, the display condition determination unit, and the camera image correction unit can be provided in the camera unit.
- A camera unit that outputs an image under an appropriate display condition according to the vehicle state, based on the captured camera image, is called a driving support camera unit.
- In Embodiment 4, a driving support system is configured by combining the driving support camera unit with a display device that displays the image output from the driving support camera unit.
- The driving support camera unit of this embodiment also has a configuration for generating a guide line image, such as an information storage unit, a guide line calculation unit, and a line drawing unit, and outputs a composite image in which the guide line image is superimposed on the camera image.
- FIG. 18 is a block diagram illustrating a configuration of the driving support system according to the fourth embodiment.
- the imaging unit 21 of the camera unit 2a captures an image of the road surface behind the vehicle while receiving from the vehicle information acquisition unit 10 the gear state information in which the vehicle gear state is reverse.
- the camera image captured by the imaging unit 21 is output to the camera image correction unit 16.
- the camera image correction unit 16 corrects the camera image as in the first embodiment.
- The image superimposition unit 17 outputs a composite image in which the image output from the camera image correction unit 16 and the guide line image output from the line drawing unit 14 are superimposed.
- An image signal output from the camera unit 2 a is input to an external input terminal of the display device 5.
- The display device 5 in this embodiment also switches to the mode for displaying the image input to the external input terminal while the gear state information indicating that the gear state of the vehicle is reverse is being input from the electronic control unit 3. Therefore, an image for driving assistance is displayed on the display device 5 when the transmission of the vehicle is put into reverse by the operation of the driver of the vehicle.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Closed-Circuit Television Systems (AREA)
- Traffic Control Systems (AREA)
- Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
- Image Processing (AREA)
Abstract
Description
FIG. 1 is a block diagram illustrating the configuration of the driving support system according to Embodiment 1. In FIG. 1, the driving support system includes a host unit 1, which is the driving support device, and a camera unit 2. The electronic control unit 3 is an ECU (Electric Control Unit) generally mounted on a vehicle, which controls the electronic devices mounted on the vehicle with electronic circuits, and is a vehicle information output device that detects vehicle information and outputs it to the host unit 1. In this embodiment, the vehicle information output device outputs to the host unit 1, in particular, vehicle information such as gear state information indicating the position of the select bar operated by the driver to change the state of the transmission of the vehicle (hereinafter referred to as the gear state), speed information indicating the speed of the vehicle, acceleration information indicating the acceleration of the vehicle, moving distance information indicating the moving distance of the vehicle in one cycle in which the vehicle information is detected, and side brake information indicating the position of the side brake. The vehicle is assumed to be an AT (Automatic Transmission) vehicle in which the driver does not need to operate a clutch.
Each component constituting the driving support device will be described below.
(A) Mounting information. The mounting information is information indicating how the camera is mounted on the vehicle, that is, the mounting position and mounting angle of the camera.
(B) Angle-of-view information. The angle-of-view information consists of angle information indicating the range of the subject imaged by the camera of the camera unit 2, and display information indicating the display range when an image is displayed on the display unit 18. The angle information includes the maximum horizontal angle of view Xa and the maximum vertical angle of view Ya, or the diagonal angle of view, of the camera. The display information includes the maximum horizontal drawing pixel size Xp and the maximum vertical drawing pixel size Yp of the display unit 18.
(C) Projection information. The projection information is information indicating the projection method of the lens used in the camera of the camera unit 2. In this embodiment, since a fisheye lens is used as the wide-angle lens of the camera, the value of the projection information is one of stereographic projection, equidistant projection, equisolid-angle projection, and orthographic projection.
(D) Lens distortion information. The lens distortion information is information on the characteristics of the lens relating to the distortion of the image caused by the lens.
(E) Viewpoint information. The viewpoint information is information on another position at which the camera is assumed to be located.
(F) Guide line interval information. The guide line interval information consists of parking width information, vehicle width information, and distance information on the safe distance, caution distance, and warning distance from the rear end of the vehicle. The parking width information is information indicating a parking width obtained by adding a predetermined margin to the width of the vehicle (for example, the width of a parking space). The distance information on the safe distance, caution distance, and warning distance from the rear end of the vehicle indicates distances rearward from the rear end of the vehicle, for example, 1 m for the safe distance, 50 cm for the caution distance, and 10 cm for the warning distance, and gives a guide to distances behind the vehicle. By means of the safe distance, caution distance, and warning distance from the rear end of the vehicle, the driver can grasp how far an obstacle appearing behind the vehicle is from the rear end of the vehicle.
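A small sketch of how the guide line interval information could be turned into road-surface line segments behind the vehicle, before projecting them into the camera image; the coordinate convention, function name, and default vehicle width are assumptions for illustration, with the 1 m / 50 cm / 10 cm distances taken from the example above.

```python
def guide_lines_on_road(vehicle_width_m=1.7, margin_m=0.3,
                        safe_m=1.0, caution_m=0.5, warning_m=0.1):
    """Return guide line segments on the road surface behind the vehicle.

    Coordinates: x to the right, y rearward from the rear end of the vehicle (metres).
    """
    parking_width = vehicle_width_m + 2 * margin_m   # vehicle width plus a margin
    half = parking_width / 2.0
    return {
        "left_parking_edge":  ((-half, 0.0), (-half, safe_m + 2.0)),
        "right_parking_edge": (( half, 0.0), ( half, safe_m + 2.0)),
        "warning": ((-half, warning_m), (half, warning_m)),
        "caution": ((-half, caution_m), (half, caution_m)),
        "safe":    ((-half, safe_m),    (half, safe_m)),
    }
```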
Note that the (C) projection information, (D) lens distortion information, and (E) viewpoint information are also image generation information used for converting the camera image captured by the camera.
Let (u, v) be normalized coordinates that are not affected by the lens distortion and (um, vm) be normalized coordinates that are affected by the lens distortion; then the following relations hold.
um = u + u*(k1*r^2 + k2*r^4)
vm = v + v*(k1*r^2 + k2*r^4)
r^2 = u^2 + v^2
Here, k1 and k2 are the coefficients when the lens distortion due to radial distortion is expressed as a polynomial, and they are constants specific to the lens.
xm = x + (x - x0)*(k1*r^2 + k2*r^4)
ym = y + (y - y0)*(k1*r^2 + k2*r^4)
r^2 = (x - x0)^2 + (y - y0)^2
Here, (x0, y0) is the point on the road surface corresponding to the principal point, which is the center of the radial distortion in coordinates not affected by the lens distortion. (x0, y0) is obtained in advance from the mounting information of the camera unit 2. In the lens distortion function calculation unit 132 and the projection function calculation unit 133, the optical axis of the lens is assumed to be perpendicular to the road surface and to pass through the above (x0, y0).
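A minimal Python sketch of applying the radial distortion model above to road-surface coordinates; k1, k2, and (x0, y0) are placeholders standing in for the lens-specific constants and the principal-point position obtained from the mounting information.

```python
def apply_radial_distortion(x, y, x0, y0, k1, k2):
    """Map an undistorted road-surface point (x, y) to its lens-distorted position (xm, ym)."""
    r2 = (x - x0) ** 2 + (y - y0) ** 2
    factor = k1 * r2 + k2 * r2 ** 2      # k1*r^2 + k2*r^4
    xm = x + (x - x0) * factor
    ym = y + (y - y0) * factor
    return xm, ym

# Example with assumed constants:
xm, ym = apply_radial_distortion(0.5, 2.0, 0.0, 0.0, -0.2, 0.03)
```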
Stereographic projection: Y = 2*f*tan(θ/2)
Equidistant projection: Y = f*θ
Equisolid-angle projection: Y = 2*f*sin(θ/2)
Orthographic projection: Y = f*sin(θ)
The projection function calculation unit 133 converts the coordinates i(P) that have undergone the lens distortion, output from the lens distortion function calculation unit 132, into the incident angle θ on the lens, substitutes the angle into one of the above projection formulas to calculate the image height Y, and converts the image height Y back into coordinates, thereby calculating the coordinates h(i(P)) that have undergone the projection distortion.
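The four projection formulas can be expressed directly as functions of the incident angle θ and the focal length f; the sketch below simply evaluates the image height Y for each projection method named above (the function name is an assumption).

```python
import math

def image_height(theta_rad, f, projection):
    """Image height Y for an incident angle theta under the given projection method."""
    if projection == "stereographic":
        return 2 * f * math.tan(theta_rad / 2)
    if projection == "equidistant":
        return f * theta_rad
    if projection == "equisolid_angle":
        return 2 * f * math.sin(theta_rad / 2)
    if projection == "orthographic":
        return f * math.sin(theta_rad)
    raise ValueError("unknown projection method")

# Example: Y for a 60-degree incident angle with f = 1.4 mm under equidistant projection.
y = image_height(math.radians(60), 1.4, "equidistant")
```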
Since the image displayed under the first display condition is an image that has distortion but shows a wide range, the image displayed under the first display condition is called a wide-angle image.
The vehicle states recognized by the display condition determination unit 12 include the following states. The speed of the vehicle is taken to be positive when the vehicle is moving in the reverse direction.
The following conditions are not all of the conditions for the initial state (JA), but when the following conditions are satisfied, it can be determined that the state is the initial state (JA). The following condition C_JA is referred to as the condition that clearly indicates the initial state.
C_JA = the speed V is negative, or
the speed V is equal to or higher than the predetermined speed (Vr1), or
(the speed V is not zero and the gear state is other than reverse).
C_JB = the gear state is reverse, and
the moving distance L is zero, and
the speed V is zero.
C_JC = the gear state is reverse, and
the moving distance L is positive and less than the predetermined distance (L1), and
the speed V is positive and less than the predetermined speed (Vr1).
C_JD = the gear state is reverse, and
the moving distance L is positive and less than the predetermined distance (L1), and
the speed V is zero, and
the side brake is OFF (not applied).
If the side brake is turned ON (applied) in the reverse possible state (JD), the state becomes the reverse stop state (JM) described later.
C_JE = the moving distance L is positive and less than the predetermined distance (L1), and
the speed V is zero, and
the gear state is other than reverse, and
the duration (Tn) for which the gear state has been other than reverse is less than the predetermined time (Tn1), and
the side brake is OFF.
If the side brake is turned ON in the reverse impossible state (JE), the state becomes the reverse stop state (JM) described later. If the gear state becomes reverse, the state becomes the reverse possible state (JD).
When the vehicle is parked, the state is treated as the reverse impossible state (JE) until the predetermined time (Tn1) elapses, so that the state can change to the reverse stop state (JM) even when the gear state is changed after the vehicle is stopped and before the side brake is turned ON.
C_JF = the gear state is reverse, and
the moving distance L is equal to or greater than the predetermined distance (L1), and
the speed V is positive and less than the predetermined speed (Vr1), and
the deceleration condition C_gn is not satisfied.
C_gn = the acceleration a is negative, and
the duration (Ta) for which the acceleration a has been negative is equal to or longer than the predetermined time (Ta1).
C_JG = the gear state is reverse, and
the moving distance L is equal to or greater than the predetermined distance (L1), and
the speed V is positive and less than the predetermined speed (Vr1), and
the deceleration condition C_gn is satisfied.
C_JH = the gear state is reverse, and
the side brake is OFF, and
the moving distance L is equal to or greater than the predetermined distance (L1), and
the speed V is zero.
C_JK = the moving distance L is equal to or greater than the predetermined distance (L1), and
the speed V is zero, and
the gear state is other than reverse, and
the duration (Tn) for which the gear state has been other than reverse is less than the predetermined time (Tn1), and
the side brake is OFF.
If the side brake is turned ON in the re-reverse impossible state (JK), the state becomes the reverse stop state (JM) described later. If the gear state becomes reverse, the state becomes the re-reverse possible state (JH).
C_JL = the gear state is reverse, and
the speed V is positive and less than the predetermined speed (Vr1), and
the moving distance L is equal to or greater than the predetermined distance (L1).
C_JM = the speed V is zero, and
the side brake is ON.
(1) In the reverse preparation state (JB), the reverse start state (JC), the reverse possible state (JD), and the reverse impossible state (JE), the first display condition is set. The camera image is the image captured by the camera as it is, and has lens distortion and distortion due to the projection method. Since the lens of the camera of the camera unit 2 is a so-called fisheye lens having an angle of view of 180 degrees or more, a wide range including the surroundings of the camera installation location is shown in the camera image, so the situation around the vehicle is easy to grasp, which is suitable for confirming that there are no pedestrians or the like around the vehicle when the vehicle starts to move. Since the guide line image is also displayed so as to match the camera image, the distance to the parking space is easy to grasp.
The reverse state (JF), in which the vehicle is moving backward after moving the predetermined distance (L1), is the moving state in which the vehicle is moving after the moving condition is satisfied.
The reverse stop transition state (JG) is the stop transition state, which is a state in which it is detected that the predetermined stop transition detection condition (the deceleration condition C_gn in this embodiment) for detecting that the vehicle is starting to stop is satisfied. The re-reverse possible state (JH), the reverse stop state (JM), and the re-reverse impossible state (JK) are the stop states in which the vehicle is stopped after the stop transition state.
The re-reverse state (JL) is the re-moving state in which the vehicle is moving after the stop state.
If C_JA is not satisfied in S4, it is checked in S6 whether S_O is the initial state (JA). When C_JA is not satisfied, the speed V is not less than zero and less than the predetermined speed (Vr1), and when the speed V is not zero, the gear state is reverse.
If S_O is the initial state (JA) in S6, it is checked in S7 whether the condition C_JB is satisfied. If C_JB is satisfied, S_N is set to the reverse preparation state (JB) in S8 (arrow t1 in FIG. 10). If C_JB is not satisfied, S_N is set to the initial state (JA) in S9 (arrow t2 in FIG. 10).
In S17, it is checked whether S_O is the reverse preparation state (JB).
If S_O is the reverse preparation state (JB) in S17, it is checked in S18 whether the speed V is zero. If the speed V is not zero, S_N is set to the reverse start state (JC) in S19 (arrow t3 in FIG. 10). If the speed V is zero, it is checked in S20 whether the gear state is R and the side brake is OFF. If the gear state is R and the side brake is OFF, S_N is set to the reverse preparation state (JB) in S21 (arrow t4 in FIG. 10). Otherwise, S_N is set to the initial state (JA) in S22 and the moving distance L is set to 0 (arrow t5 in FIG. 10).
If S_O is not the reverse preparation state (JB) in S17, it is checked in S23 whether S_O is the reverse start state (JC).
If S_O is the reverse start state (JC) in S23, it is checked in S24 whether the moving distance L is equal to or greater than the predetermined distance (L1) (L ≥ L1). If L ≥ L1, the state is set to the reverse state (JF) in S25 (arrow t6 in FIG. 10). If L < L1, it is checked in S26 whether the speed V is zero (V = 0). If the speed V is not zero, S_N is set to the reverse start state (JC) in S27 (arrow t7 in FIG. 10). If the speed V is zero, S_N is set to the reverse possible state (JD) in S28 (arrow t8 in FIG. 10).
If S_O is not the reverse start state (JC) in S23, it is checked in S29 whether S_O is the reverse possible state (JD).
If S_O is the reverse possible state (JD) in S29, it is checked in S30 whether the speed V is zero (V = 0). If the speed V is not zero, S_N is set to the reverse start state (JC) in S31 (arrow t10 in FIG. 10). If the speed V is zero, it is checked in S32 whether the side brake is ON. If the side brake is ON, the moving distance L is set to L1 (L = L1) and S_N is set to the reverse stop state (JM) in S33 (arrow t11 in FIG. 10). If the side brake is OFF, it is checked in S34 whether the gear state is R. If the gear state is R, S_N is set to the reverse possible state (JD) in S35 (arrow t12 in FIG. 10). If the gear state is other than R, S_N is set to the reverse impossible state (JE) in S36 (arrow t13 in FIG. 10).
If S_O is not the reverse possible state (JD) in S29, it is checked in S37 whether S_O is the reverse impossible state (JE).
If S_O is the reverse impossible state (JE) in S37, it is checked in S38 whether the side brake is ON. If the side brake is ON, the moving distance L is set to L1 (L = L1) and S_N is set to the reverse stop state (JM) in S39 (arrow t14 in FIG. 10). If the side brake is OFF, it is checked in S40 whether the gear state is R. If the gear state is R, S_N is set to the reverse possible state (JD) in S41 (arrow t15 in FIG. 10). If the gear state is other than R, it is checked in S42 whether the duration (Tn) for which the gear state has been other than R is equal to or longer than the predetermined time (Tn1). If it is equal to or longer than the predetermined time (Tn1), S_N is set to the initial state (JA) in S43 and the moving distance L is set to 0 (arrow t16 in FIG. 10). Otherwise, S_N is set to the reverse impossible state (JE) in S44 (arrow t17 in FIG. 10).
If S_O is not the reverse impossible state (JE) in S37, it is checked in S45 whether S_O is the reverse state (JF) or the reverse stop transition state (JG).
If S_O is the reverse state (JF) or the reverse stop transition state (JG) in S45 shown in FIG. 12, it is checked in S46 whether the speed V is zero (V = 0). If the speed V is zero, S_N is set to the re-reverse possible state (JH) in S47 (arrows t18 and t19 in FIG. 10). If the speed V is not zero, it is checked in S48 whether the deceleration condition C_gn is satisfied. If C_gn is satisfied, S_N is set to the reverse stop transition state (JG) in S49 (arrows t20 and t21 in FIG. 10). If C_gn is not satisfied, S_N is set to the reverse state (JF) in S50 (arrows t22 and t23 in FIG. 10).
If S_O is not the reverse state (JF) or the reverse stop transition state (JG) in S45, it is checked in S51 whether S_O is the re-reverse possible state (JH).
If S_O is the re-reverse possible state (JH) in S51, it is checked in S52 whether the speed V is zero. If the speed V is not zero, S_N is set to the re-reverse state (JL) in S53 (arrow t26 in FIG. 10). If the speed V is zero, it is checked in S54 whether the side brake is ON. If the side brake is ON, S_N is set to the reverse stop state (JM) in S55 (arrow t27 in FIG. 10). If the side brake is OFF, it is checked in S56 whether the gear state is R. If the gear state is R, S_N is set to the re-reverse possible state (JH) in S57 (arrow t28 in FIG. 10). If the gear state is other than R, S_N is set to the re-reverse impossible state (JK) in S58 (arrow t29 in FIG. 10).
If S_O is not the re-reverse possible state (JH) in S51, it is checked in S59 whether S_O is the re-reverse impossible state (JK).
If S_O is the re-reverse impossible state (JK) in S59, it is checked in S60 whether the side brake is ON. If the side brake is ON, S_N is set to the reverse stop state (JM) in S61 (arrow t31 in FIG. 10). If the side brake is OFF, it is checked in S62 whether the gear state is R. If the gear state is R, S_N is set to the re-reverse possible state (JH) in S63 (arrow t32 in FIG. 10). If the gear state is other than R, it is checked in S64 whether the duration (Tn) for which the gear state has been other than R is equal to or longer than the predetermined time (Tn1). If it is equal to or longer than the predetermined time (Tn1), S_N is set to the initial state (JA) in S65 and the moving distance L is set to 0 (arrow t33 in FIG. 10). Otherwise, S_N is set to the re-reverse impossible state (JK) in S66 (arrow t34 in FIG. 10).
If S_O is not the re-reverse impossible state (JK) in S59, it is checked in S67 whether S_O is the re-reverse state (JL).
If S_O is the re-reverse state (JL) in S67, it is checked in S68 whether the speed V is zero. If the speed V is zero, S_N is set to the re-reverse possible state (JH) in S69 (arrow t35 in FIG. 10). If the speed V is not zero, S_N is set to the re-reverse state (JL) in S70 (arrow t36 in FIG. 10).
If S_O is not the re-reverse state (JL) in S67, S_O is the reverse stop state (JM).
If S_O is the reverse stop state (JM), it is checked in S71 whether C_JM is satisfied. If C_JM is satisfied, S_N is set to the reverse stop state (JM) in S72 (arrow t38 in FIG. 10). If C_JM is not satisfied, S_N is set to the initial state (JA) in S73 and the moving distance L is set to 0 (arrow t39 in FIG. 10).
In the moving state in which the vehicle is moving after the moving condition is satisfied, that is, the reverse state (JF), the undistorted image, which is an image from which the lens distortion and the distortion due to the projection method have been removed, is displayed, so the sense of distance is easy to grasp and the vehicle can easily be reversed to an appropriate position.
In the re-moving state in which the vehicle is moving after the stop state, that is, the re-reverse state (JL), the wide-angle image, which is a camera image covering a wide range although it is distorted by the fisheye lens, is displayed during the predetermined moving-direction situation confirmation period after the re-moving state begins, so the surrounding situation is easy to check when movement resumes. After the moving-direction situation confirmation period has elapsed, the another-viewpoint undistorted image is displayed, so the positional relationship of the vehicle on the road surface is easy to grasp.
The above also applies to the other embodiments.
In the driving support system according to Embodiment 1, the case where the vehicle is parked while moving backward has been described; however, the vehicle may also be parked while moving forward. When the vehicle is parked while moving forward, a driving support device is not particularly necessary for a small vehicle because the driver can directly see the situation around the vehicle; in the case of a large vehicle with a high driver's seat, however, it is difficult to check the situation in front of the vehicle from the driver's seat, so the need for a driving support device is high. Therefore, the driving support system according to Embodiment 2 determines the state of the vehicle and switches the displayed camera image when the vehicle is parked while moving forward. In addition, no guide line image for the road surface is displayed.
The host unit 1a does not include the guide line calculation unit 13 (guide line information generation unit), the line drawing unit 14 (guide line image generation unit), or the image superimposition unit 17. Therefore, the image output from the camera image correction unit 16 is displayed on the display unit 18, and the camera image correction unit 16 constitutes the image output unit.
The vehicle states recognized by the display condition determination unit 12a include the following states. The speed of the vehicle is taken to be positive when the vehicle is moving in the forward direction.
The following conditions are not all of the conditions for the initial state (KA), but when the following conditions are satisfied, it can be determined that the state is the initial state (KA). The following condition C_KA is referred to as the condition that clearly indicates the initial state during forward movement.
C_KA = the speed V is negative, or
the speed V is equal to or higher than the predetermined speed (Vr1), or
the gear state is other than the forward gear.
C_KB = the gear state is the forward gear, and
the moving distance L is zero, and
the speed V is zero.
C_KC = the gear state is the forward gear, and
the moving distance L is positive and less than the predetermined distance (L1), and
the speed V is positive and less than the predetermined speed (Vr1).
C_KD = the gear state is the forward gear, and
the moving distance L is positive and less than the predetermined distance (L1), and
the speed V is zero, and
the duration (Tz) for which the speed V has been zero is less than the predetermined time (Tz1).
If the stop continues for the predetermined time (Tz1) or longer, the state becomes the initial state (KA).
When the vehicle has moved the predetermined distance (L1) or more without the speed V reaching the predetermined speed (Vr2), the state is the forward state (KE) from the detection that the vehicle has moved the predetermined distance (L1) or more until the predetermined time (Tv2) elapses.
C_KE = the gear state is the forward gear, and
the moving distance L is equal to or greater than the predetermined distance (L1), and
the speed V is positive and less than the predetermined speed (Vr1), and
the low-speed condition C_lw is not satisfied.
C_lw = the speed V is less than the predetermined speed (Vr2), and
the duration (Tv) for which the speed V has been less than the predetermined speed (Vr2) is equal to or longer than the predetermined time (Tv2).
C_KF = the gear state is the forward gear, and
the moving distance L is equal to or greater than the predetermined distance (L1), and
the speed V is positive and less than the predetermined speed (Vr1), and
the low-speed condition C_lw is satisfied.
C_KG = the speed V is zero, and
the gear state is the forward gear, and
the duration (Tz) for which the speed V has been zero is less than the predetermined time (Tz1).
C_KH = the gear state is the forward gear, and
the speed V is positive and less than the predetermined speed (Vr1), and
the moving distance L is equal to or greater than the predetermined distance (L1).
(1) In the forward preparation state (KB), the forward start state (KC), and the forward possible state (KD), the first display condition is set. The camera image is the image captured by the camera as it is, and has lens distortion and distortion due to the projection method. Since the lens of the camera of the camera unit 2 is a so-called fisheye lens having an angle of view of 180 degrees or more, a wide range including the surroundings of the camera installation location is shown in the camera image, so the situation around the vehicle is easy to grasp, which is suitable for confirming that there are no pedestrians or the like around the vehicle when the vehicle starts to move.
The forward state (KE), in which the vehicle is moving forward after moving the predetermined distance (L1), is the moving state in which the vehicle is moving after the moving condition is satisfied.
The forward stop transition state (KF) is the stop transition state, which is a state in which it is detected that the predetermined stop transition detection condition (the low-speed condition C_lw in this embodiment) for detecting that the vehicle is starting to stop is satisfied. The forward stop state (KG) is the stop state in which the vehicle is stopped after the stop transition state.
The re-forward state (KH) is the re-moving state in which the vehicle is moving after the stop state.
If C_KA is not satisfied in U3, it is checked in U6 whether S_O is the initial state (KA). When C_KA is not satisfied, the speed V is not less than zero and less than the predetermined speed (Vr1), and the gear state is the forward gear.
If S_O is the initial state (KA) in U6, it is checked in U7 whether the condition C_KB is satisfied. If C_KB is satisfied, S_N is set to the forward preparation state (KB) in U8 (arrow w1 in FIG. 14). If C_KB is not satisfied, S_N is set to the initial state (KA) in U9 and the moving distance L is set to 0 (arrow w2 in FIG. 14).
In U17, it is checked whether S_O is the forward preparation state (KB).
If S_O is the forward preparation state (KB) in U17, it is checked in U18 whether the speed V is zero. If the speed V is zero, S_N is set to the forward preparation state (KB) in U19 (arrow w3 in FIG. 14). If the speed V is not zero, S_N is set to the forward start state (KC) in U20 (arrow w4 in FIG. 14).
If S_O is not the forward preparation state (KB) in U17, it is checked in U21 whether S_O is the forward start state (KC).
If S_O is the forward start state (KC) in U21, it is checked in U22 whether the moving distance L is equal to or greater than the predetermined distance (L1) (L ≥ L1). If L ≥ L1, S_N is set to the forward state (KE) in U23 (arrow w6 in FIG. 14). If L < L1, it is checked in U24 whether the speed V is zero (V = 0). If the speed V is zero, S_N is set to the forward possible state (KD) in U25 (arrow w7 in FIG. 14). If the speed V is not zero, S_N is set to the forward start state (KC) in U26 (arrow w8 in FIG. 14).
If S_O is not the forward start state (KC) in U21, it is checked in U27 whether S_O is the forward possible state (KD).
If S_O is the forward possible state (KD) in U27 shown in FIG. 16, it is checked in U28 whether the speed V is zero (V = 0). If the speed V is not zero, S_N is set to the forward start state (KC) in U29 (arrow w10 in FIG. 14). If the speed V is zero, it is checked in U30 whether the elapsed time (Tz) for which the speed V has been zero is equal to or longer than the predetermined value (Tz1) (Tz ≥ Tz1). If Tz ≥ Tz1, S_N is set to the initial state (KA) in U31 and the moving distance L is set to 0 (arrow w11 in FIG. 14). If Tz < Tz1, S_N is set to the forward possible state (KD) in U32 (arrow w12 in FIG. 14).
If S_O is not the forward possible state (KD) in U27, it is checked in U33 whether S_O is the forward state (KE) or the forward stop transition state (KF).
If S_O is the forward state (KE) or the forward stop transition state (KF) in U33, it is checked in U34 whether the speed V is zero (V = 0). If the speed V is zero, S_N is set to the forward stop state (KG) in U35 (arrows w13 and w14 in FIG. 14). If the speed V is not zero, it is checked in U36 whether the low-speed condition C_lw is satisfied. If C_lw is satisfied, S_N is set to the forward stop transition state (KF) in U37 (arrows w15 and w16 in FIG. 14). If C_lw is not satisfied, S_N is set to the forward state (KE) in U38 (arrows w17 and w18 in FIG. 14).
If S_O is not the forward state (KE) or the forward stop transition state (KF) in U33, it is checked in U39 whether S_O is the forward stop state (KG).
If S_O is the forward stop state (KG) in U39, it is checked in U40 whether the speed V is zero (V = 0). If the speed V is not zero, S_N is set to the re-forward state (KH) in U41 (arrow w21 in FIG. 14). If the speed V is zero, it is checked in U42 whether the elapsed time (Tz) for which the speed V has been zero is equal to or longer than the predetermined value (Tz1) (Tz ≥ Tz1). If Tz ≥ Tz1, S_N is set to the initial state (KA) in U43 and the moving distance L is set to 0 (arrow w22 in FIG. 14). If Tz < Tz1, S_N is set to the forward stop state (KG) in U44 (arrow w23 in FIG. 14).
If S_O is not the forward stop state (KG) in U39, S_O is the re-forward state (KH).
If S_O is the re-forward state (KH), it is checked in U45 whether the speed V is zero. If the speed V is zero, S_N is set to the forward stop state (KG) in U46 (arrow w24 in FIG. 14). If the speed V is not zero, S_N is set to the re-forward state (KH) in U47 (arrow w25 in FIG. 14).
In the embodiments so far, when the vehicle moves again after stopping, the driver is supported only when the vehicle moves in the same direction as before the stop. When the vehicle moves again after stopping, the driver may also be supported when the vehicle moves in a direction different from that before the stop.
The above also applies to the other embodiments.
In Embodiments 1 and 2, the host unit is provided with a display unit; however, it is also possible to combine an image output device 4 that outputs a composite image in which a guide line image is superimposed on a camera image with an external display device 5, for example an in-vehicle navigation device, and to display the composite image output from the image output device 4 on the display device 5. In this embodiment, the image output device 4 is the driving support device. FIG. 17 is a block diagram illustrating the configuration of the driving support system according to Embodiment 3. Components that are the same as or correspond to those in FIG. 1 are given the same reference numerals, and descriptions thereof are omitted. In FIG. 17, gear state information is output from the electronic control unit 3 to the vehicle information acquisition unit 10 and the display device 5. Since the connection interface with the electronic control unit 3 in the image output device 4 is the same as that of a general navigation device, communication between the image output device 4 and the electronic control unit 3 can be performed without preparing a special interface. The image signal output from the image output device 4 is input to an external input terminal of the display device 5.
In Embodiment 1, the host unit determines the display conditions based on the vehicle state and combines the camera image transmitted from the camera unit with the guide line image. Alternatively, the vehicle information acquisition unit, the display condition determination unit, and the camera image correction unit can be provided in the camera unit. A camera unit that outputs an image under an appropriate display condition according to the vehicle state, based on the captured camera image, is called a driving support camera unit. In Embodiment 4, a driving support system is configured by combining the driving support camera unit with a display device that displays the image output from the driving support camera unit.
The driving support camera unit of this embodiment also has a configuration for generating a guide line image, such as an information storage unit, a guide line calculation unit, and a line drawing unit, and outputs a composite image in which the guide line image is superimposed on the camera image.
2 camera unit (camera)
2a camera unit (driving support camera unit)
3 electronic control unit
4 image output device (driving support device)
5 display device
10 vehicle information acquisition unit
11 information storage unit (guide line information storage unit)
11a information storage unit
12, 12a display condition determination unit (vehicle state determination unit)
13 guide line calculation unit (guide line information generation unit)
14 line drawing unit (guide line image generation unit)
15 camera image reception unit
16 camera image correction unit (image generation unit)
17 image superimposition unit
18 display unit (display device)
21 imaging unit (camera)
Claims (9)
- A driving support device that is connected to a camera having a wide-angle lens that is attached to a vehicle and images a road surface in a direction in which the vehicle moves, and that displays, on a display device, an image based on a camera image that is an image captured by the camera, the driving support device comprising: an information storage unit that stores image generation information including lens distortion information indicating distortion of the camera image due to the lens shape of the camera and projection information indicating distortion of the camera image due to the projection method of the wide-angle lens; a vehicle information acquisition unit that acquires vehicle information including a gear state, which is the state of the transmission of the vehicle, and a speed; a vehicle state determination unit that determines a vehicle state, which is the state of the vehicle, based on the vehicle information; and an image generation unit that generates an image to be displayed on the display device by processing the camera image according to the vehicle state using the image generation information, wherein the vehicle state determination unit determines, as the vehicle state, a movement preparation state in which the vehicle can move and is stopped, a movement start state in which the vehicle is moving from the start of movement until a predetermined moving condition is satisfied, and a moving state in which the vehicle is moving after the moving condition is satisfied, and the image generation unit generates a wide-angle image, which is an image that has distortion but shows a wide range, when the vehicle state is the movement preparation state or the movement start state, and generates an undistorted image, which is an image obtained by removing the distortion due to the lens shape and the distortion due to the projection method from the camera image, when the vehicle state is the moving state.
- The driving support device according to claim 1, wherein the information storage unit stores viewpoint information consisting of translation information, which is the difference between the position of a viewpoint located at a position different from the camera and the mounting position of the camera, and rotation information, which is the difference between the direction of the viewpoint and the direction in which the camera is mounted, the vehicle state determination unit determines a stop transition state, which is a state in which it is detected that a predetermined stop transition detection condition for detecting that the moving vehicle is starting to stop is satisfied, and the image generation unit generates, when the vehicle state is the stop transition state, an another-viewpoint undistorted image, which is an image obtained by removing the distortion due to the lens shape and the distortion due to the projection method from the camera image and viewed from the viewpoint.
- The driving support device according to claim 2, wherein the vehicle state determination unit determines a stop state, which is a state in which the vehicle is stopped after the stop transition state, and the image generation unit generates the another-viewpoint undistorted image when the vehicle state is the stop state.
- The driving support device according to claim 3, wherein the vehicle state determination unit determines a re-moving state, which is a state in which the vehicle is moving after the stop state, and the image generation unit generates the wide-angle image during a predetermined moving-direction situation confirmation period after the vehicle state becomes the re-moving state.
- The driving support device according to claim 4, wherein the image generation unit generates the another-viewpoint undistorted image when the vehicle state is the re-moving state after the moving-direction situation confirmation period.
- The driving support device according to claim 4 or claim 5, wherein the vehicle state determination unit determines the re-moving state when the vehicle is moving before a predetermined stop confirmation condition is satisfied, and determines the movement preparation state, the movement start state, and the moving state after the stop confirmation condition is satisfied.
- The driving support device according to any one of claims 1 to 6, further comprising: a guide line information storage unit that stores guide line interval information on the intervals of guide lines set on the road surface in the direction in which the vehicle moves, and mounting information indicating the mounting position and angle of the camera on the vehicle; a guide line information generation unit that generates, based on the information stored in the guide line information storage unit, guide line information on the positions, in the image generated by the image generation unit, of the guide lines set on the road surface; and a guide line image generation unit that generates a guide line image representing the guide lines based on the guide line information, wherein an image in which the guide line image is superimposed on the image generated by the image generation unit is displayed on the display device.
- A driving support system comprising: a camera having a wide-angle lens that is attached to a vehicle and images a road surface in a direction in which the vehicle moves; and the driving support device according to any one of claims 1 to 7, which is connected to the camera and displays, on a display device, an image based on the camera image captured by the camera.
- A driving support camera unit that captures an image of a road surface in a direction in which a vehicle moves and displays, on a display device, an image based on the captured camera image, the driving support camera unit comprising: a camera having a wide-angle lens that is attached to the vehicle and images the road surface; an information storage unit that stores image generation information including lens distortion information indicating distortion of the camera image due to the lens shape of the camera and projection information indicating distortion of the camera image due to the projection method of the wide-angle lens; a vehicle information acquisition unit that acquires vehicle information including a gear state, which is the state of the transmission of the vehicle, and a speed; a vehicle state determination unit that determines a vehicle state, which is the state of the vehicle, based on the vehicle information; and an image generation unit that generates an image to be displayed on the display device by processing the camera image according to the vehicle state using the image generation information, wherein the vehicle state determination unit determines, as the vehicle state, a movement preparation state in which the vehicle can move and is stopped, a movement start state in which the vehicle is moving from the start of movement until a predetermined moving condition is satisfied, and a moving state in which the vehicle is moving after the moving condition is satisfied, and the image generation unit generates a wide-angle image, which is an image that has distortion but shows a wide range, when the vehicle state is the movement preparation state or the movement start state, and generates an undistorted image, which is an image obtained by removing the distortion due to the lens shape and the distortion due to the projection method from the camera image, when the vehicle state is the moving state.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/698,227 US9007462B2 (en) | 2010-06-18 | 2010-06-18 | Driving assist apparatus, driving assist system, and driving assist camera unit |
PCT/JP2010/004085 WO2011158304A1 (ja) | 2010-06-18 | 2010-06-18 | 運転支援装置、運転支援システム、および運転支援カメラユニット |
JP2012520170A JP5052708B2 (ja) | 2010-06-18 | 2010-06-18 | 運転支援装置、運転支援システム、および運転支援カメラユニット |
DE112010005670.6T DE112010005670B4 (de) | 2010-06-18 | 2010-06-18 | Fahrunterstützungsvorrichtung, Fahrunterstützungssystem und Fahrunterstützungs-Kameraeinheit |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2010/004085 WO2011158304A1 (ja) | 2010-06-18 | 2010-06-18 | 運転支援装置、運転支援システム、および運転支援カメラユニット |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011158304A1 true WO2011158304A1 (ja) | 2011-12-22 |
Family
ID=45347732
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/004085 WO2011158304A1 (ja) | 2010-06-18 | 2010-06-18 | 運転支援装置、運転支援システム、および運転支援カメラユニット |
Country Status (4)
Country | Link |
---|---|
US (1) | US9007462B2 (ja) |
JP (1) | JP5052708B2 (ja) |
DE (1) | DE112010005670B4 (ja) |
WO (1) | WO2011158304A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014089490A (ja) * | 2012-10-29 | 2014-05-15 | Hitachi Consumer Electronics Co Ltd | 交通情報通知装置 |
JP2015121591A (ja) * | 2013-12-20 | 2015-07-02 | 株式会社富士通ゼネラル | 車載カメラ |
JP2017163206A (ja) * | 2016-03-07 | 2017-09-14 | 株式会社デンソー | 情報処理装置及びプログラム |
JPWO2018221209A1 (ja) * | 2017-05-30 | 2020-04-02 | ソニーセミコンダクタソリューションズ株式会社 | 画像処理装置、画像処理方法、及び、プログラム |
Families Citing this family (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TW201215126A (en) * | 2010-09-27 | 2012-04-01 | Hon Hai Prec Ind Co Ltd | Image dividing system for cameras and using method of the same |
JP5277272B2 (ja) * | 2011-03-04 | 2013-08-28 | 株式会社ホンダアクセス | 車両後方監視装置 |
JP2014204361A (ja) * | 2013-04-08 | 2014-10-27 | 株式会社ビートソニック | 車載モニタリングシステムにおける車載カメラ用アダプター |
DE102014116441A1 (de) * | 2014-11-11 | 2016-05-12 | Connaught Electronics Ltd. | Verfahren zum Darstellen einer Sicherheitsinformation, Fahrerassistenzsystem und Kraftfahrzeug |
KR101712399B1 (ko) * | 2014-11-25 | 2017-03-06 | 현대모비스 주식회사 | 차량의 후방 장애물 표시 방법 |
CN118816908A (zh) | 2015-02-10 | 2024-10-22 | 御眼视觉技术有限公司 | 用于自主车辆导航的稀疏地图 |
FR3047947B1 (fr) | 2016-02-24 | 2018-03-09 | Renault S.A.S | Procede d'aide a la conduite en marche avant d'un vehicule automobile, muni d'une camera a objectif de type fish-eye |
US10558222B2 (en) | 2016-07-21 | 2020-02-11 | Mobileye Vision Technologies Ltd. | Navigating a vehicle using a crowdsourced sparse map |
EP3305597B1 (en) * | 2016-10-04 | 2020-12-09 | Ficomirrors, S.A.U. | Vehicle driving assist system |
JP6497819B2 (ja) | 2017-03-10 | 2019-04-10 | 株式会社Subaru | 画像表示装置 |
JP6593803B2 (ja) | 2017-03-10 | 2019-10-23 | 株式会社Subaru | 画像表示装置 |
JP6515125B2 (ja) | 2017-03-10 | 2019-05-15 | 株式会社Subaru | 画像表示装置 |
JP6465318B2 (ja) * | 2017-03-10 | 2019-02-06 | 株式会社Subaru | 画像表示装置 |
JP6465317B2 (ja) | 2017-03-10 | 2019-02-06 | 株式会社Subaru | 画像表示装置 |
JP6497818B2 (ja) | 2017-03-10 | 2019-04-10 | 株式会社Subaru | 画像表示装置 |
JP6429413B2 (ja) | 2017-03-10 | 2018-11-28 | 株式会社Subaru | 画像表示装置 |
US10018171B1 (en) | 2017-05-17 | 2018-07-10 | Deere & Company | Work vehicle start system and method with virtual walk-around for authorizing remote start |
US10144390B1 (en) | 2017-05-17 | 2018-12-04 | Deere & Company | Work vehicle start system and method with optical verification for authorizing remote start |
US10132259B1 (en) | 2017-05-17 | 2018-11-20 | Deere & Company | Work vehicle start system and method with engine cycling |
DE102017210264A1 (de) * | 2017-06-20 | 2018-12-20 | Zf Friedrichshafen Ag | Verfahren zur Bedienung eines Fahrzeugbediensystems |
JP7091624B2 (ja) * | 2017-09-15 | 2022-06-28 | 株式会社アイシン | 画像処理装置 |
US10737725B2 (en) * | 2017-09-27 | 2020-08-11 | Gentex Corporation | System and method for assisting parallel parking using orthogonal projection |
DE102017221488A1 (de) * | 2017-11-30 | 2019-06-06 | Volkswagen Aktiengesellschaft | Verfahren zur Anzeige des Verlaufs einer Trajektorie vor einem Fahrzeug oder einem Objekt mit einer Anzeigeeinheit, Vorrichtung zur Durchführung des Verfahrens sowie Kraftfahrzeug und Computerprogramm |
FR3104524B1 (fr) * | 2019-12-13 | 2021-12-31 | Renault Sas | Procédé et dispositif d’assistance au stationnement d’un véhicule et véhicule comportant un tel dispositif. |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000127874A (ja) * | 1998-10-20 | 2000-05-09 | Nissan Motor Co Ltd | 車両用後方確認装置 |
JP2003134507A (ja) * | 2001-10-24 | 2003-05-09 | Nissan Motor Co Ltd | 車両後方監視装置 |
JP2003158736A (ja) * | 2000-07-19 | 2003-05-30 | Matsushita Electric Ind Co Ltd | 監視システム |
JP2007522981A (ja) * | 2004-02-20 | 2007-08-16 | シャープ株式会社 | 状況検出表示システム、状況検出表示方法、状況検出表示システム制御プログラム、および当該プログラムを記録した記録媒体 |
JP2008013022A (ja) * | 2006-07-05 | 2008-01-24 | Sanyo Electric Co Ltd | 車両の運転支援装置 |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5687249A (en) * | 1993-09-06 | 1997-11-11 | Nippon Telephone And Telegraph | Method and apparatus for extracting features of moving objects |
US7366595B1 (en) * | 1999-06-25 | 2008-04-29 | Seiko Epson Corporation | Vehicle drive assist system |
AU2001243285A1 (en) * | 2000-03-02 | 2001-09-12 | Donnelly Corporation | Video mirror systems incorporating an accessory module |
DE60139236D1 (de) * | 2000-05-12 | 2009-08-27 | Toyota Jidoshokki Kariya Kk | Hilfe beim rückwärtsfahren einen fahrzeugs |
EP1303140A4 (en) | 2000-07-19 | 2007-01-17 | Matsushita Electric Ind Co Ltd | MONITORING SYSTEM |
US7253833B2 (en) * | 2001-11-16 | 2007-08-07 | Autonetworks Technologies, Ltd. | Vehicle periphery visual recognition system, camera and vehicle periphery monitoring apparatus and vehicle periphery monitoring system |
JP3855814B2 (ja) * | 2002-03-22 | 2006-12-13 | 日産自動車株式会社 | 車両用画像処理装置 |
JP4766841B2 (ja) * | 2003-09-08 | 2011-09-07 | 株式会社オートネットワーク技術研究所 | 車両に搭載されるカメラ装置及び車両周辺監視装置 |
JP2005110202A (ja) * | 2003-09-08 | 2005-04-21 | Auto Network Gijutsu Kenkyusho:Kk | カメラ装置及び車両周辺監視装置 |
JP2005124010A (ja) * | 2003-10-20 | 2005-05-12 | Nissan Motor Co Ltd | 撮像装置 |
US7415335B2 (en) * | 2003-11-21 | 2008-08-19 | Harris Corporation | Mobile data collection and processing system and methods |
JP4457690B2 (ja) * | 2004-02-18 | 2010-04-28 | 日産自動車株式会社 | 運転支援装置 |
JP4466200B2 (ja) * | 2004-04-19 | 2010-05-26 | 株式会社豊田自動織機 | 駐車支援装置 |
US8427538B2 (en) * | 2004-04-30 | 2013-04-23 | Oncam Grandeye | Multiple view and multiple object processing in wide-angle video camera |
DE102004048185B4 (de) | 2004-09-30 | 2006-09-14 | Magna Donnelly Gmbh & Co. Kg | Verfahren zum Betrieb eines elektronischen Einsichtnahmesystems und Fahrzeug mit einem elektronischen Einsichtnahmesystem |
JP4543983B2 (ja) * | 2005-03-22 | 2010-09-15 | 株式会社豊田自動織機 | 駐車支援装置 |
JP2007114020A (ja) * | 2005-10-19 | 2007-05-10 | Aisin Aw Co Ltd | 車両の移動距離検出方法、車両の移動距離検出装置、車両の現在位置検出方法及び車両の現在位置検出装置 |
JP2007176324A (ja) * | 2005-12-28 | 2007-07-12 | Aisin Seiki Co Ltd | 駐車支援装置 |
JP5020621B2 (ja) * | 2006-12-18 | 2012-09-05 | クラリオン株式会社 | 運転支援装置 |
JP4843517B2 (ja) * | 2007-02-06 | 2011-12-21 | 本田技研工業株式会社 | 車両用視認補助装置 |
JP5182545B2 (ja) * | 2007-05-16 | 2013-04-17 | アイシン精機株式会社 | 駐車支援装置 |
JP2009060404A (ja) * | 2007-08-31 | 2009-03-19 | Denso Corp | 映像処理装置 |
US8359858B2 (en) * | 2007-10-30 | 2013-01-29 | Ford Global Technologies, Llc | Twin turbocharged engine with reduced compressor imbalance and surge |
JP5176184B2 (ja) * | 2008-03-28 | 2013-04-03 | 本田技研工業株式会社 | クラッチ制御装置 |
JP4661917B2 (ja) * | 2008-07-25 | 2011-03-30 | 日産自動車株式会社 | 駐車支援装置および駐車支援方法 |
JP5591466B2 (ja) * | 2008-11-06 | 2014-09-17 | 株式会社名南製作所 | 原木の3次元形状測定装置および方法 |
-
2010
- 2010-06-18 WO PCT/JP2010/004085 patent/WO2011158304A1/ja active Application Filing
- 2010-06-18 DE DE112010005670.6T patent/DE112010005670B4/de not_active Expired - Fee Related
- 2010-06-18 US US13/698,227 patent/US9007462B2/en not_active Expired - Fee Related
- 2010-06-18 JP JP2012520170A patent/JP5052708B2/ja not_active Expired - Fee Related
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000127874A (ja) * | 1998-10-20 | 2000-05-09 | Nissan Motor Co Ltd | 車両用後方確認装置 |
JP2003158736A (ja) * | 2000-07-19 | 2003-05-30 | Matsushita Electric Ind Co Ltd | 監視システム |
JP2003134507A (ja) * | 2001-10-24 | 2003-05-09 | Nissan Motor Co Ltd | 車両後方監視装置 |
JP2007522981A (ja) * | 2004-02-20 | 2007-08-16 | シャープ株式会社 | 状況検出表示システム、状況検出表示方法、状況検出表示システム制御プログラム、および当該プログラムを記録した記録媒体 |
JP2008013022A (ja) * | 2006-07-05 | 2008-01-24 | Sanyo Electric Co Ltd | 車両の運転支援装置 |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014089490A (ja) * | 2012-10-29 | 2014-05-15 | Hitachi Consumer Electronics Co Ltd | 交通情報通知装置 |
JP2015121591A (ja) * | 2013-12-20 | 2015-07-02 | 株式会社富士通ゼネラル | 車載カメラ |
JP2017163206A (ja) * | 2016-03-07 | 2017-09-14 | 株式会社デンソー | 情報処理装置及びプログラム |
WO2017154833A1 (ja) * | 2016-03-07 | 2017-09-14 | 株式会社デンソー | 情報処理装置及びプログラム |
CN108702491A (zh) * | 2016-03-07 | 2018-10-23 | 株式会社电装 | 信息处理装置以及程序 |
JPWO2018221209A1 (ja) * | 2017-05-30 | 2020-04-02 | ソニーセミコンダクタソリューションズ株式会社 | 画像処理装置、画像処理方法、及び、プログラム |
JP7150709B2 (ja) | 2017-05-30 | 2022-10-11 | ソニーセミコンダクタソリューションズ株式会社 | 画像処理装置、画像処理方法、及び、プログラム |
US11521395B2 (en) | 2017-05-30 | 2022-12-06 | Sony Semiconductor Solutions Corporation | Image processing device, image processing method, and program |
Also Published As
Publication number | Publication date |
---|---|
JP5052708B2 (ja) | 2012-10-17 |
JPWO2011158304A1 (ja) | 2013-08-15 |
US20130057690A1 (en) | 2013-03-07 |
DE112010005670T5 (de) | 2013-07-25 |
DE112010005670B4 (de) | 2015-11-12 |
US9007462B2 (en) | 2015-04-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5052708B2 (ja) | 運転支援装置、運転支援システム、および運転支援カメラユニット | |
KR101354068B1 (ko) | 차량주변화상생성장치 | |
JP5379913B2 (ja) | 駐車支援装置、駐車支援システム、および駐車支援カメラユニット | |
JP4807104B2 (ja) | 車両周囲監視システム及び画像表示方法 | |
WO2012172923A1 (ja) | 車両周辺監視装置 | |
WO2012039256A1 (ja) | 運転支援装置 | |
WO2009151053A1 (ja) | 駐車支援装置及び駐車支援方法 | |
JP2013535753A (ja) | 表示装置に画像を表示する方法、および運転者支援システム | |
JP7159802B2 (ja) | 車両用電子ミラーシステム | |
JP5516988B2 (ja) | 駐車支援装置 | |
EP3967554B1 (en) | Vehicular display system | |
WO2017057006A1 (ja) | 周辺監視装置 | |
KR20150019182A (ko) | 영상 표시 방법 및 이를 위한 위한 장치 | |
WO2016129552A1 (ja) | カメラパラメータ調整装置 | |
JP5020621B2 (ja) | 運転支援装置 | |
JP4855918B2 (ja) | 運転支援装置 | |
JP2014129093A (ja) | 車両用周辺監視装置 | |
JP2012001126A (ja) | 車両用周辺監視装置 | |
JP5561478B2 (ja) | 駐車支援装置 | |
JP2008114691A (ja) | 車両周辺監視装置および車両周辺監視映像表示方法 | |
JP2012065225A (ja) | 車載用画像処理装置、周辺監視装置、および、車両 | |
JP5226621B2 (ja) | 車両用画像表示装置 | |
JP4156181B2 (ja) | 駐車支援装置 | |
JP4855919B2 (ja) | 運転支援装置 | |
JP2002083284A (ja) | 描画装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10853186 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012520170 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13698227 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1120100056706 Country of ref document: DE Ref document number: 112010005670 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 10853186 Country of ref document: EP Kind code of ref document: A1 |