KR101853652B1 - Around view generation method and apparatus performing the same
- Publication number
- KR101853652B1 (application number KR1020150188078A)
- Authority
- KR
- South Korea
- Prior art keywords
- view
- distance
- surrounding
- main subject
- environment
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2628—Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/02—Rear-view mirror arrangements
- B60R1/08—Rear-view mirror arrangements involving special optical features, e.g. avoiding blind spots, e.g. convex mirrors; Side-by-side associations of rear-view and other mirrors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- H04N5/2257—
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Automation & Control Theory (AREA)
- Mechanical Engineering (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Transportation (AREA)
- Traffic Control Systems (AREA)
Abstract
The apparatus includes a surrounding environment view processor for generating a surrounding environment view through a plurality of camera sensors, a distance information processor for receiving distance information about at least one main subject in the surrounding environment view through a distance measurement sensor, and an around view processor for generating an around view by fusing the surrounding environment view and the distance information onto a predefined surrounding environment view projection model. Accordingly, the around view generation apparatus can provide more accurate information in the around view through the plurality of camera sensors and the distance measurement sensor, allowing the user to accurately recognize the surrounding environment.
Description
Field of the Invention: The present invention relates to a technique for generating an around view, and more particularly, to a method for generating an around view by fusing information obtained from heterogeneous sensors, which can be used in, for example, an automobile or a robot.
A smart car or robot can travel to a destination autonomously, without a person on board, or can be controlled by a user from a remote location. To this end, the smart car or robot can be equipped with an image sensor, such as a camera, to perceive surrounding obstacles or information about a target.
Conventionally, in the process of synthesizing the 2D images obtained from four wide-angle cameras into a three-dimensional view around the vehicle, information about the vehicle's surroundings, such as nearby pillars and walls, is not properly reflected, so the resulting view cannot convey a proper visual sense of space to the user.
Korean Patent No. 10-1579100 relates to an apparatus for providing a vehicle around view and a vehicle having the same. The apparatus includes first to fourth cameras mounted on a vehicle, a memory storing a reference image corresponding to each of the first to fourth cameras, and a processor that calculates offset information based on the difference between each reference image and each image captured by the first to fourth cameras, and then synthesizes the images from the first to fourth cameras using the offset information to generate an around view image. This makes it possible to provide an accurate around view image based on calibrated images.
Korean Patent No. 10-1504335 discloses an apparatus, method, and vehicle for providing an around view. A camera unit includes a plurality of cameras installed in a vehicle to capture images. A control unit performs calibration for generating the around view using the images captured by the plurality of cameras. In that invention, straight lines in an image are detected using the Hough transform, and the images are matched through a homography matrix.
Korean Patent No. 10-1566964 discloses an around view monitoring method capable of tracking a moving object, an around view monitoring apparatus performing the method, and a recording medium storing the method. The around view monitoring apparatus includes a plurality of cameras that image the periphery of a working vehicle to generate a plurality of images, and an object recognition unit that recognizes a moving object in at least one of the plurality of images, or in an around view generated from the plurality of images, according to the working environment of the working vehicle, and determines whether the recognized object needs to be tracked in the around view. The apparatus can therefore prevent safety accidents by increasing the recognition speed or recognition accuracy for objects according to the working environment of the working vehicle.
One embodiment of the present invention seeks to generate an around view by fusing information obtained from heterogeneous sensors, and may be used, for example, in an automobile or a robot. For example, the heterogeneous sensors may include a plurality of camera sensors and a distance measurement sensor such as a radar, a lidar, or an ultrasonic sensor.
One embodiment of the present invention seeks to provide more accurate information in the around view through a plurality of camera sensors and a distance measurement sensor, so that the user can accurately recognize the surrounding environment.
In embodiments, the method for generating an around view includes (a) generating a surrounding environment view through a plurality of camera sensors, (b) receiving distance information about at least one main subject in the surrounding environment view through a distance measurement sensor, and (c) generating an around view by fusing the surrounding environment view and the distance information onto a predefined surrounding environment view projection model.
Step (c) may include measuring the direction of the main subject relative to the moving object equipped with the around view generation apparatus by analyzing the surrounding environment view, and measuring the absolute distance of the measured main subject from the distance information.
Step (c) may include measuring the distance of the object having the longest absolute distance and determining the projection range of the surrounding environment view projection model based on that distance.
Step (c) may include overlaying each subject on the surrounding environment view projection model according to a relative distance reflecting the position and direction of the at least one main subject, based on the determined projection range.
Step (c) may include distorting the surrounding environment view projection model based on the peripheral shape in the surrounding environment view, and reflecting the distance information in the distorted projection model.
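Purely as an illustrative sketch (the patent publishes no implementation), the following Python fragment shows one way the step (c) computations above could be organized; the names `Subject`, `projection_range`, and `overlay_positions` are hypothetical, and the planar model surface is an assumption made for brevity.

```python
import math
from dataclasses import dataclass

@dataclass
class Subject:
    azimuth_deg: float  # direction of the main subject as seen from the moving object
    distance_m: float   # absolute distance reported by the distance measurement sensor

def projection_range(subjects):
    """Step (c): the projection range of the model is set by the
    main subject with the longest absolute distance."""
    return max(s.distance_m for s in subjects)

def overlay_positions(subjects, model_radius=1.0):
    """Overlay each subject on the projection model according to its relative
    distance (absolute distance normalized by the projection range) and direction."""
    rng = projection_range(subjects)
    placed = []
    for s in subjects:
        rel = s.distance_m / rng              # relative distance in [0, 1]
        theta = math.radians(s.azimuth_deg)
        # the farthest subject (rel == 1) lands on the model boundary
        placed.append((rel * model_radius * math.cos(theta),
                       rel * model_radius * math.sin(theta)))
    return placed

subjects = [Subject(azimuth_deg=30.0, distance_m=4.2),
            Subject(azimuth_deg=200.0, distance_m=9.5)]
print(projection_range(subjects))  # 9.5: the farthest subject fixes the range
print(overlay_positions(subjects))
```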
In an embodiment of the present invention, the around view generation apparatus includes a surrounding environment view processor for generating a surrounding environment view through a plurality of camera sensors, a distance information processor for receiving distance information about at least one main subject in the surrounding environment view through a distance measurement sensor, and an around view processor for generating an around view by fusing the surrounding environment view and the distance information onto a predefined surrounding environment view projection model.
The disclosed technique may have the following effects. However, this does not mean that a specific embodiment must include all of the following effects or only the following effects, so the scope of the disclosed technique should not be understood as limited thereby.
The around view generation method according to an embodiment of the present invention can generate an around view by fusing information obtained from heterogeneous sensors, and can be used in, for example, an automobile or a robot. For example, the heterogeneous sensors may include a plurality of camera sensors and a distance measurement sensor such as a radar, a lidar, or an ultrasonic sensor.
The around view generation method according to an embodiment of the present invention provides more accurate information in the around view through a plurality of camera sensors and a distance measurement sensor, so that the user can accurately recognize the surrounding environment.
FIG. 1 is a view for explaining a moving object equipped with an around view generation apparatus according to an embodiment of the present invention.
FIG. 2 is a block diagram illustrating the around view generation apparatus of FIG. 1.
FIG. 3 is a flowchart of the around view generation process performed by the around view generation apparatus of FIG. 1.
FIG. 4 is a view for explaining the surrounding environment view projection model in the projection model database shown in FIG. 1.
The description of the present invention is merely an example for structural or functional explanation, and the scope of the present invention should not be construed as limited to the embodiments described in the text. That is, the embodiments may be modified in various ways and may take various forms, so the scope of the present invention should be understood to include equivalents capable of realizing the technical idea. Also, the objects and effects presented in the present invention do not mean that a specific embodiment must include all of them or only such effects, so the scope of the present invention should not be understood as limited thereby.
Meanwhile, the meaning of the terms described in the present application should be understood as follows.
The terms "first "," second ", and the like are intended to distinguish one element from another, and the scope of the right should not be limited by these terms. For example, the first component may be referred to as a second component, and similarly, the second component may also be referred to as a first component.
When an element is referred to as being "connected" to another element, it may be directly connected to the other element, but other elements may exist in between. In contrast, when an element is referred to as being "directly connected" to another element, it should be understood that no other elements exist in between. Other expressions describing the relationship between components, such as "between" and "directly between" or "adjacent to" and "directly adjacent to", should be interpreted in the same way.
Singular expressions should be understood to include plural expressions unless the context clearly indicates otherwise. Terms such as "include" or "have" specify the presence of the stated features, numbers, steps, operations, elements, components, or combinations thereof, and should be understood not to preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
In each step, identification codes (e.g., a, b, c, etc.) are used for convenience of explanation; they do not describe the order of the steps, and the steps may occur in an order different from the stated order unless a specific order is clearly described in context. That is, the steps may occur in the stated order, may be performed substantially concurrently, or may be performed in the reverse order.
The present invention can be embodied as computer-readable code on a computer-readable recording medium, where the computer-readable recording medium includes all kinds of recording devices that store data readable by a computer system. Examples of the computer-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the medium may also be implemented in the form of a carrier wave (for example, transmission over the Internet). In addition, the computer-readable recording medium may be distributed over network-connected computer systems so that the computer-readable code can be stored and executed in a distributed manner.
Unless otherwise defined, all terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Commonly used terms, such as those defined in dictionaries, should be interpreted as consistent with their meaning in the context of the related art, and cannot be interpreted as having an ideal or excessively formal meaning unless clearly defined in the present application.
FIG. 1 is a view for explaining a moving object equipped with an around view generation apparatus according to an embodiment of the present invention.
Referring to FIG. 1, a moving object 1 is equipped with an around view generation apparatus 100, a plurality of camera sensors 110, and a distance measurement sensor 120.
FIG. 1A assumes a situation in which the moving object 1 is surrounded by a wall, and FIG. 1B assumes a situation in which a plurality of main subjects are located around the moving object 1.
A plurality of camera sensors 110 capture images of the surroundings of the moving object 1, and the captured images are synthesized into a surrounding environment view.
The distance measurement sensor 120 measures the distance to at least one main subject around the moving object 1 and may be implemented as, for example, a radar, a lidar, or an ultrasonic sensor.
The around view generation apparatus 100 generates an around view by fusing the surrounding environment view obtained through the plurality of camera sensors 110 with the distance information obtained through the distance measurement sensor 120.
FIG. 2 is a block diagram illustrating the around view generation apparatus of FIG. 1.
Referring to FIG. 2, the around view generation apparatus 100 includes a surrounding environment view processor 210, a distance information processor 220, an around view processor 230, a projection model database 240, and a controller 250.
The surrounding environment view processor 210 synthesizes the images received through the plurality of camera sensors 110 to generate a surrounding environment view.
The distance information processor 220 receives distance information about at least one main subject in the surrounding environment view through the distance measurement sensor 120.
In one embodiment, the distance information processor 220 may measure the absolute distance to each of the at least one main subject through the distance measurement sensor 120.
The around view processor 230 generates an around view by fusing the surrounding environment view and the distance information onto a predefined surrounding environment view projection model.
In one embodiment, the around view processor 230 may determine the projection range of the surrounding environment view projection model based on the object having the longest absolute distance, and may overlay each main subject on the projection model according to its relative distance and direction.
The controller 250 controls the overall operation of the around view generation apparatus 100.
FIG. 4 is a view for explaining the surrounding environment view projection model in the projection model database shown in FIG. 1.
In FIG. 4, the surrounding environment view projection model 20 may be defined as one of a parabolic, dish, or cylindrical projection model.
The around view processor 230 may distort the surrounding environment view projection model 20 based on the peripheral shape in the surrounding environment view, and may reflect the distance information in the distorted projection model.
For example, as shown in FIG. 1A, when the moving object 1 is surrounded by a wall, the surrounding environment view projection model 20 may be distorted to follow the shape of the wall.
As another example, as shown in FIG. 1B, when the moving object 1 is surrounded by a plurality of main subjects, each main subject may be overlaid on the surrounding environment view projection model 20 according to its relative distance.
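As a hedged sketch of the FIG. 1A case, the fragment below deforms a cylindrical projection model so that its radius follows the measured wall distance in the directions where a wall was detected; the per-degree sampling, the function name `distorted_cylinder`, and the default radius are assumptions, not details from the patent.

```python
def distorted_cylinder(wall_distance_by_azimuth, default_radius=10.0, samples=360):
    """Return one (azimuth_deg, radius_m) pair per sampled direction. Where a
    wall was detected, the model surface is pulled in to the measured distance;
    elsewhere it keeps the default cylinder radius."""
    surface = []
    for i in range(samples):
        azimuth = i * 360.0 / samples
        radius = wall_distance_by_azimuth.get(int(azimuth), default_radius)
        surface.append((azimuth, radius))
    return surface

# e.g. a wall about 3 m away spanning azimuths 80-100 degrees
walls = {az: 3.0 for az in range(80, 101)}
model = distorted_cylinder(walls)
print(model[90])   # (90.0, 3.0): the projection surface hugs the wall
print(model[180])  # (180.0, 10.0): undistorted elsewhere
```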
Hereinafter, the around view generation process performed by the around view generation apparatus 100 will be described.
Assuming that a wall is detected in the periphery of the moving object 1, the around view generation apparatus 100 may distort the surrounding environment view projection model based on the shape of the wall and reflect the distance to the wall in the distorted model.
Assuming that a plurality of main subjects are detected in the vicinity of the moving object 1, the around view generation apparatus 100 may determine the projection range of the projection model based on the main subject having the longest absolute distance.
In one embodiment, the first main subject 10 may correspond to the object farthest from the moving object 1, so the projection range of the projection model may be determined based on its absolute distance, and the second main subject 11 may then be overlaid according to its relative distance.
FIG. 3 is a flowchart of the around view generation process performed by the around view generation apparatus of FIG. 1.
The surrounding environment view processor 210 generates a surrounding environment view by synthesizing the images received through the plurality of camera sensors 110.
The distance information processor 220 then receives distance information about at least one main subject in the surrounding environment view through the distance measurement sensor 120.
The around view processor 230 determines the projection range of the surrounding environment view projection model based on the main subject having the longest absolute distance.
Finally, the around view processor 230 generates the around view by fusing the surrounding environment view and the distance information onto the predefined surrounding environment view projection model.
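Tying the FIG. 2 components to the FIG. 3 flow, the sketch below mirrors the three processors as cooperating objects; the image stitching is stubbed out, and every class and method name here is an illustrative assumption rather than the patent's implementation.

```python
class SurroundingEnvironmentViewProcessor:
    def generate(self, camera_images):
        # stand-in for stitching the camera images into one surrounding environment view
        return {"stitched": camera_images}

class DistanceInfoProcessor:
    def receive(self, range_readings):
        # range_readings: (azimuth_deg, distance_m) pairs from a radar/lidar/ultrasonic sensor
        return sorted(range_readings, key=lambda r: r[1], reverse=True)

class AroundViewProcessor:
    def fuse(self, view, distances):
        farthest = distances[0][1]  # longest absolute distance sets the projection range
        return {"view": view, "projection_range": farthest, "subjects": distances}

def generate_around_view(camera_images, range_readings):
    view = SurroundingEnvironmentViewProcessor().generate(camera_images)
    distances = DistanceInfoProcessor().receive(range_readings)
    return AroundViewProcessor().fuse(view, distances)

print(generate_around_view(["front", "rear", "left", "right"],
                           [(30.0, 4.2), (200.0, 9.5)]))
```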
It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit and scope of the present invention as defined by the following claims.
100: around view generation apparatus
110: plurality of camera sensors
120: distance measurement sensor
210: surrounding environment view processor
220: distance information processor
230: around view processor
240: projection model database
250: controller
1: moving object
10: first main subject
11: second main subject
20: surrounding environment view projection model
Claims (6)
A method of generating an around view, the method comprising:
(a) generating a surrounding environment view by synthesizing image information received through a plurality of camera sensors;
(b) measuring an absolute distance to at least one main subject through a distance measurement sensor, and receiving distance information about the at least one main subject in the surrounding environment view; and
(c) determining a projection range of a surrounding environment view projection model, defined as one of a parabolic, dish, or cylindrical projection model, by measuring the distance of the object having the longest absolute distance, and generating an around view by fusing the surrounding environment view and the distance information onto the projection model while reflecting the position and direction of each of the at least one main subject based on the determined projection range.
The method further comprising: measuring the direction of the main subject relative to the moving object equipped with the around view generation apparatus by analyzing the surrounding environment view, and measuring the absolute distance of the measured main subject from the distance information.
The method further comprising: distorting the surrounding environment view projection model based on the peripheral shape in the surrounding environment view, and reflecting the distance information in the distorted projection model.
An around view generation apparatus comprising: a surrounding environment view processor for generating a surrounding environment view through a plurality of camera sensors; a distance information processor for receiving distance information about at least one main subject in the surrounding environment view through a distance measurement sensor; and an around view processor for determining a projection range of a surrounding environment view projection model, defined as one of a parabolic, dish, or cylindrical projection model, by measuring the distance of the object having the longest absolute distance, and for generating an around view by fusing the surrounding environment view and the distance information onto the projection model while reflecting the position and direction of each of the at least one main subject based on the determined projection range.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150188078A KR101853652B1 (en) | 2015-12-29 | 2015-12-29 | Around view generation method and apparatus performing the same
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150188078A KR101853652B1 (en) | 2015-12-29 | 2015-12-29 | Around view generation method and apparatus performing the same
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170078005A KR20170078005A (en) | 2017-07-07 |
KR101853652B1 (en) | 2018-05-03
Family ID: 59353898
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150188078A KR101853652B1 (en) | 2015-12-29 | 2015-12-29 | Around view genegation method and apparatus performing the same |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101853652B1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102580653B1 (en) | 2023-05-22 | 2023-09-21 | 고려웍스(주) | Vehicle around view automatic switching device and its method |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102015099B1 (en) * | 2018-01-25 | 2019-10-21 | 전자부품연구원 (Korea Electronics Technology Institute) | Apparatus and method for providing wrap around view monitoring using dis information |
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101378337B1 (en) | 2012-10-30 | 2014-03-27 | 주식회사 이미지넥스트 (ImageNext Co., Ltd.) | Apparatus and method for processing image of camera |
Also Published As
Publication number | Publication date |
---|---|
KR20170078005A (en) | 2017-07-07 |
Similar Documents
Publication | Title
---|---
JP7509501B2 | Vehicle navigation based on aligned imagery and LIDAR information
US8559674B2 | Moving state estimating device
JP5057936B2 | Bird's-eye image generation apparatus and method
US9151626B1 | Vehicle position estimation system
CN108122425B | Apparatus and method for recognizing vehicle position
EP1961613B1 | Driving support method and driving support device
US20140172296A1 | Systems and methods for navigation
CN107122770B | Multi-camera system, intelligent driving system, automobile, method and storage medium
US8885889B2 | Parking assist apparatus and parking assist method and parking assist system using the same
WO2015156821A1 | Vehicle localization system
CN109583416B | Pseudo lane line identification method and system
KR20200001471A | Apparatus and method for detecting lane information and computer recordable medium storing computer program thereof
US20120236287A1 | External environment visualization apparatus and method
US11145112B2 | Method and vehicle control system for producing images of a surroundings model, and corresponding vehicle
WO2018074085A1 | Rangefinder and rangefinder control method
JP4556742B2 | Vehicle direct image display control apparatus and vehicle direct image display control program
CN110750153A | Dynamic virtualization device of unmanned vehicle
KR102031635B1 | Collision warning device and method using heterogeneous cameras having overlapped capture area
US8031908B2 | Object recognizing apparatus including profile shape determining section
JP2019128350A | Image processing method, image processing device, on-vehicle device, moving body and system
KR101853652B1 | Around view generation method and apparatus performing the same
KR101868549B1 | Method of generating around view and apparatus performing the same
KR20160125803A | Apparatus for defining an area in interest, apparatus for detecting object in an area in interest and method for defining an area in interest
KR101316387B1 | Method of object recognition using vision sensing and distance sensing
JP2011177334A | Step detecting device and electric-powered vehicle equipped with the same
Legal Events
Code | Title
---|---
A201 | Request for examination
E902 | Notification of reason for refusal
GRNT | Written decision to grant