
KR20130078990A - Apparatus for convergence in 3d photographing apparatus - Google Patents

Apparatus for convergence in 3d photographing apparatus Download PDF

Info

Publication number
KR20130078990A
Authority
KR
South Korea
Prior art keywords
vfl
value
image
viewing angle
unit
Prior art date
Application number
KR1020120000193A
Other languages
Korean (ko)
Other versions
KR101345971B1 (en)
Inventor
조준동
박찬오
이동훈
김용한
Original Assignee
성균관대학교산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 성균관대학교산학협력단 filed Critical 성균관대학교산학협력단
Priority to KR20120000193A priority Critical patent/KR101345971B1/en
Publication of KR20130078990A publication Critical patent/KR20130078990A/en
Application granted granted Critical
Publication of KR101345971B1 publication Critical patent/KR101345971B1/en

Links

Images

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • H04N13/128 Adjusting depth or disparity
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/271 Image signal generators wherein the generated image signals comprise depth maps or disparity maps

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Studio Devices (AREA)

Abstract

PURPOSE: A convergence angle control device of a stereoscopic imaging device is provided to control the convergence angle by focusing on a desired object according to its visual fatigue level (VFL) value, with only a simple operation and within a short time after taking an image with a stereoscopic camera.

CONSTITUTION: A depth region obtaining part (400) generates a disparity map using disparity information extracted by an image processing part (300) and generates a first depth map from the disparity map. A segmentation part (450) converts the generated first depth map into a histogram and generates a second depth map from the histogram. A VFL obtaining part (500) calculates a VFL value for each object from the second depth map. An automatic convergence angle control part (600) automatically controls the convergence angle by focusing on the object having the highest VFL value among the calculated VFL values.

[Reference numerals] (101) Left camera; (102) Right camera; (200) Image obtaining part; (300) Image processing part; (400) Depth region obtaining part; (450) Segmentation; (500) VFL obtaining part; (600) Automatic convergence angle control part (applies the VFL value); (700) Semiautomatic convergence angle control part; (800) User input; (850) Display part

Description

Convergence Angle Control Device in 3D Photographing Apparatus {APPARATUS FOR CONVERGENCE IN 3D PHOTOGRAPHING APPARATUS}

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a convergence angle control device, and more particularly, to a convergence angle control device in a 3D photographing apparatus capable of precisely controlling the convergence angle within a short time by focusing on a desired object using a VFL value.

The present invention relates to a convergence angle control device. As background art of the present invention, as disclosed in Korean Patent Laid-Open Publication No. 10-2010-0131814, a 3D photographing apparatus (stereo camera) focuses on the center of the image by an auto-focus function when capturing an image. To change the focus, the user must go through several manual adjustment steps to refocus on a specific object within the subject.

However, in the prior art described above, the 3D effect may become excessive and visual fatigue may be caused by the object located at the center.

In addition, the manual adjustment function for focusing on the object desired by the user after shooting must step through the focus positions in sequence, so the process of changing the focus position is quite complicated and inconvenient.

In addition, in the focus-setting function of the prior-art 3D imaging apparatus, the depth steps must be passed through sequentially to select the desired depth, which causes considerable inconvenience and takes a long time.

For reference, FIG. 1 is a view for explaining the prior art; the reference numerals shown in FIG. 1 are unrelated to those of the present invention.

Accordingly, to solve the problems of the prior art described above, it is an object of the present invention to provide a convergence angle control device in a 3D photographing apparatus in which, after a subject is captured by a stereo camera, the depth of the image is divided into stages and the image output to the display is focused according to the VFL value of each object.

Another object of the present invention is to provide a convergence angle control device in a stereoscopic image photographing apparatus capable of setting the focus by selecting a desired object.

The problems to be solved by the present invention are not limited to those mentioned above, and other problems not mentioned will be clearly understood by those skilled in the art from the following description.

The convergence angle control apparatus in the stereoscopic image photographing apparatus according to the present invention includes: a photographing unit 100 which acquires, by a stereo camera, two images having different parallaxes that constitute a stereoscopic image; an image processor 300 which extracts disparity information of the stereo image from the parallax of the two acquired images; a depth area obtaining unit 400 which generates a disparity map using the extracted disparity information and generates, from the disparity map, a first depth map representing the depth of the image for each object included in the stereo image; a segmentation unit 450 which converts the generated first depth map into a histogram and generates, from the histogram, a second depth map that divides the objects by distance; a VFL obtaining unit 500 which calculates a Visual Fatigue Level (VFL) value for each object from the generated second depth map; and an automatic convergence angle control unit 600 which automatically controls the convergence angle by focusing on the object having the highest value among the calculated VFL values and outputs the stereo image with the automatically controlled convergence angle to a display.

The convergence angle control apparatus in the stereoscopic image photographing apparatus according to the present invention preferably further includes a semi-automatic convergence angle control unit for controlling the convergence angle of the stereo image by focusing on one object selected from among the objects included in the stereo image output on the display.

In the convergence angle control apparatus in the stereoscopic image photographing apparatus according to the present invention, the VFL obtaining unit 500 preferably performs: a CZF calculation step of obtaining a Comfort Zone Function (CZF) using Equation 1 below; a weight calculation step of obtaining the weight W for each object divided by distance in the segmentation unit, using the CZF and Equation 2 below; and a VFL value calculation step of calculating a Visual Fatigue Level (VFL) value using Equation 3 below and the weight W of each object divided by distance in the segmentation unit.

[Equation 1]

Figure pat00001

[Equation 2]

Figure pat00002

[Equation 3]

Figure pat00003

In the convergence angle control apparatus in the stereoscopic image photographing apparatus according to the present invention, after calculating the VFL value, the VFL obtaining unit 500 preferably further performs a shift value calculation step of calculating the shift value D for focusing on each object divided by distance in the segmentation unit, using the VFL value and Equation 4 below.

[Equation 4]

Figure pat00004

In the convergence angle control apparatus in the stereoscopic image photographing apparatus according to the present invention, in the automatic convergence angle control performed by the automatic convergence angle control unit 600, the focus is preferably controlled by shifting, according to the calculated shift value D, so that the object having the highest value among the calculated VFL values is brought into focus.

In the convergence angle control apparatus in the stereoscopic image photographing apparatus according to the present invention, in the semi-automatic convergence angle control performed by the semi-automatic convergence angle control unit, the focus is preferably controlled by shifting, according to the calculated shift value D, so that the selected object is brought into focus.

In the convergence angle control apparatus in the stereoscopic image photographing apparatus according to the present invention, the display is preferably configured as a touch screen, and when any one of the objects output on the display is touched and selected, the semi-automatic convergence angle control unit controls the convergence angle of the stereoscopic image by focusing on the selected object.

As described above, according to the convergence angle control apparatus in the stereoscopic image photographing apparatus, the convergence angle can be controlled by focusing on a desired object according to its VFL value, with a simple operation and within a short time after shooting with the stereo camera.

In addition, the user can change the focus by arbitrarily selecting any one of the objects included in the subject.

The effects of the present invention are not limited to those mentioned above, and other effects not mentioned will be clearly understood by those skilled in the art from the following description.

FIG. 1 is a diagram for explaining the background art of the present invention.
FIG. 2 is an explanatory diagram of the convergence angle.
FIG. 3 is a block diagram of the convergence angle control device in a stereoscopic image photographing apparatus according to the present invention.
FIG. 4 is a diagram illustrating two images with different parallaxes obtained through the left camera and the right camera of the photographing unit.
FIG. 5 illustrates an example of the first depth map generated from FIG. 4 by the image processor.
FIG. 6 is an exemplary diagram of the histogram generated from the depth map of FIG. 5.
FIG. 7 illustrates an example of the second depth map generated using the histogram of FIG. 6.
FIG. 8 is an exemplary diagram of the segmentation scheme.
FIGS. 9 to 11 are exemplary views of the VFL algorithm and the convergence angle control scheme according to the present invention.
FIG. 12 is an illustration of VFL values.
FIG. 13 is a general flowchart of the convergence angle control method according to the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Hereinafter, the convergence angle control device in a stereoscopic image photographing apparatus according to the present invention will be described in detail with reference to the accompanying drawings.

Before describing the drawings in detail, it should be clarified that the division into constituent parts in this specification is merely a division by the main function of each part. That is, two or more of the constituent parts described below may be combined into one, or one constituent part may be divided into two or more according to more finely subdivided functions. In addition to its main functions, each constituent part described below may also perform some or all of the functions of other constituent parts, and some of the main functions of each constituent part may, of course, be carried out exclusively by another constituent part. Accordingly, each component described in this specification should be interpreted functionally.

The stereoscopic image capturing apparatus includes a stereo camera, that is, a left camera 101 and a right camera 102, to output a stereo image. In a common configuration, the left and right cameras are fixed an arbitrary distance apart, and stereo image processing is performed using the two images obtained from them.

Here, the convergence angle refers to the angle between the two center lines when the center lines of the left camera 101 and the right camera 102 meet at an object, as shown in FIG. 2. In a stereo camera, the center lines of the left and right cameras must coincide at the center of the object of interest so that the object can be viewed three-dimensionally, like the human eye, and so that visual fatigue is reduced. However, this method requires a mechanical element to move the optical axes of the cameras, which is expensive. In the present invention, the cameras are fixed and a virtual convergence control is used, in which the images are shifted left and right to set the focus. In the following description, convergence control means this virtual convergence control.
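To make the idea of virtual convergence control concrete, the sketch below shows how one of the two images could be shifted horizontally by a given number of pixels. This is a minimal Python/NumPy illustration, not the patented implementation; the image arrays and the shift amount are made-up stand-ins.

```python
import numpy as np

# Made-up stand-ins for the left/right camera images (grayscale).
left_image = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
right_image = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
d_pixels = 12  # hypothetical shift amount in pixels (the D value described later)

def shift_horizontally(image: np.ndarray, d: int) -> np.ndarray:
    """Shift an image d pixels to the right (d > 0) or left (d < 0),
    filling the uncovered border with zeros."""
    shifted = np.zeros_like(image)
    if d > 0:
        shifted[:, d:] = image[:, :-d]
    elif d < 0:
        shifted[:, :d] = image[:, -d:]
    else:
        shifted = image.copy()
    return shifted

# Virtual convergence control: the cameras stay fixed, and one of the two
# images is shifted so that the chosen object lands at zero disparity.
right_converged = shift_horizontally(right_image, d_pixels)
```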

The present invention relates to a convergence angle control device in a stereoscopic image photographing apparatus. The obtained depth information is converted into a histogram to reconstruct a clearer depth map, and the newly obtained depth information is used to calculate a Visual Fatigue Level (VFL) value and to control the convergence angle according to that VFL value (a detailed description will be given later).


FIG. 3 is a block diagram of the convergence angle control device in the stereoscopic image photographing apparatus according to the present invention.

The convergence angle control apparatus in the stereoscopic image photographing apparatus according to the present invention includes: a photographing unit 100 that acquires, by a stereo camera, two images having different parallaxes constituting a stereo image; an image processor 300 that extracts disparity information of the stereo image from the parallax of the two acquired images; a depth area obtaining unit 400 that generates a disparity map using the extracted disparity information and generates, from the disparity map, a first depth map representing the depth of the image for each object included in the stereo image; a segmentation unit 450 that converts the generated first depth map into a histogram and generates, from the histogram, a second depth map dividing the objects by distance; a VFL obtaining unit 500 that calculates a Visual Fatigue Level (VFL) value for each object from the generated second depth map; and an automatic convergence angle control unit 600 that automatically controls the convergence angle by focusing on the object having the highest value among the calculated VFL values and outputs the stereo image with the automatically controlled convergence angle to the display.

That is, the convergence angle control device in the stereoscopic image photographing apparatus according to the present invention includes the photographing unit 100, the image processing unit 300, the depth area obtaining unit 400, the segmentation unit 450, the VFL obtaining unit 500, and the automatic convergence angle control unit 600.

The photographing unit 100 photographs a subject with the stereo camera, that is, the left camera 101 and the right camera 102, changing the distance between the cameras and the subject a predetermined number of times. In this case, two images having parallax, which constitute the stereo image, are obtained through the left camera 101 and the right camera 102.

The image processor 300 generates and extracts disparity map information, which is the disparity information of the stereo image, from the parallax of the two images.

In this case, the disparity information or disparity map information may be calculated using a stereo algorithm that finds matching points between the two images with different parallaxes acquired by the photographing unit 100.

That is, the image processor 300 is a means for receiving the two images with different parallaxes obtained by the photographing unit 100 and processing them to generate the disparity (map) information.

The disparity map generation may be performed through a stereo matching algorithm known in the art.
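The patent leaves the choice of matching algorithm open. As one hedged example, the following Python sketch computes a disparity map with OpenCV's StereoSGBM matcher on rectified grayscale images; the parameter values and the random stand-in images are assumptions, not values from the patent.

```python
import numpy as np
import cv2

# Rectified grayscale stand-ins for the left/right images; in the patent these
# would come from the photographing unit 100 via the image acquisition unit 200.
left_gray = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
right_gray = np.random.randint(0, 256, (480, 640), dtype=np.uint8)

# Semi-global block matcher; the parameter values are illustrative only.
matcher = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=64,  # must be a multiple of 16
    blockSize=5,
)

# compute() returns fixed-point disparities scaled by 16.
disparity_map = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
```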

The convergence angle control device in the 3D photographing apparatus according to the present invention may further include an image acquisition unit 200 for storing the two images with different parallaxes obtained by the photographing unit 100.

The image acquisition unit 200 is connected between the photographing unit 100 and the image processing unit 300 to store the image acquired by the photographing unit 100 and transmit the image to the image processing unit 300.

The depth area obtaining unit 400 is a means for generating a disparity map using the extracted disparity information and for generating, from the disparity map, a first depth map representing the depth of the image for each object included in the stereo image.

The segmentation unit 450 is a means for converting the first depth map into a histogram and for generating, from the histogram, a second depth map that segments the objects by distance.

That is, the segmentation unit 450 converts the first depth map into a histogram and, using the histogram, performs the function of generating second depth map information that divides each object included in the stereo image by distance.

A histogram represents the distribution of intensity values of the pixels in an image as a bar graph; the frequency of each intensity value is counted and represented by the height of the corresponding bar.
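As a small illustration of this step, the sketch below builds such a frequency histogram from a depth map with NumPy; the bin count, value range, and stand-in data are assumptions for illustration only.

```python
import numpy as np

# Stand-in for the first depth map: one 8-bit depth value per pixel.
depth_map = np.random.randint(0, 256, (480, 640), dtype=np.uint8)

# Count how many pixels fall into each of the 256 depth levels; the height of
# each bar of the histogram is this per-level frequency.
hist, bin_edges = np.histogram(depth_map, bins=256, range=(0, 256))
```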

The VFL obtaining unit 500 is a means for calculating a Visual Fatigue Level (VFL) value for each of the objects from the generated second depth map (a detailed description will be given later).

The automatic convergence angle control unit 600 is a means for automatically controlling the convergence angle by focusing on the object having the highest value among the calculated VFL values and for outputting the stereo image with the automatically controlled convergence angle to the display.

The convergence angle control apparatus in the stereoscopic image capturing apparatus according to the present invention may further include a semi-automatic convergence angle control unit 700 for controlling the convergence angle of the stereo image by focusing on one object selected from among the objects included in the stereo image output on the display.

At this time, the VFL obtaining unit 500 performs a CZF (Comfort Zone Function) calculation step, a weight calculation step, and a VFL value calculation step.

In the CZF calculation step, the CZF, a comfort zone function, is obtained using Equation 1 below.

[Equation 1]

Figure pat00005

In the calculating of the weight, a weight, W, of each object divided by distance in the segmentation unit is calculated using the CZF and the following Equation 2.

[Equation 2]

Figure pat00006

In the VFL value calculation step, a Visual Fatigue Level (VFL) value is calculated using Equation 3 below and the weight W of each object divided by distance in the segmentation unit.

[Equation 3]

Figure pat00007

In this case, after the VFL value calculation step, the VFL obtaining unit 500 preferably performs a shift value calculation step of calculating the shift value D for focusing on each object divided by distance in the segmentation unit, using the VFL value and Equation 4 below.

[Equation 4]

Figure pat00008

In the automatic convergence angle control performed by the automatic convergence angle control unit 600, the focus is controlled by shifting, according to the calculated shift value D, so that the object having the highest value among the calculated VFL values is brought into focus.

In the semi-automatic convergence angle control performed by the semi-automatic convergence angle control unit, the focus is controlled by shifting, according to the calculated shift value D, so that the selected object is brought into focus.

In this case, the display may be configured as a touch screen, and when any one of the objects output on the display is touched and selected, the semi-automatic convergence angle control unit may control the convergence angle of the stereoscopic image by focusing on the selected object.
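As a hedged illustration of this semi-automatic path, the sketch below assumes the second depth map is available as a per-pixel label image and that the shift value D of each labelled object has already been computed; the variable and function names (label_map, shift_values, on_touch) are hypothetical and not taken from the patent.

```python
import numpy as np

# Hypothetical inputs: a per-pixel object label image derived from the second
# depth map, and a precomputed shift value D for each label (both made up).
label_map = np.zeros((480, 640), dtype=np.int32)
label_map[200:300, 250:400] = 1          # pretend object #1 occupies this region
shift_values = {0: 0, 1: 12}             # label -> shift value D in pixels

def on_touch(x: int, y: int) -> int:
    """Return the shift value D of the object under the touched pixel."""
    selected_label = int(label_map[y, x])
    return shift_values[selected_label]

d_pixels = on_touch(300, 250)  # a touch inside object #1 -> D = 12
```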

Referring to FIG. 3, the operation of the convergence angle control apparatus in the stereoscopic image photographing apparatus according to the present invention is as follows.

Two images having different parallaxes are acquired through the stereo camera, that is, the left camera 101 and the right camera 102, and are stored by the image acquisition unit 200 for processing.

The image processor 300 obtains disparity information by processing the two images, and the depth region obtaining unit 400 obtains a disparity map using the disparity information.

Subsequently, the segmentation unit 450 segments the objects included in the image using the histogram of the disparity map.

Thereafter, the Visual Fatigue Level is obtained using the algorithm (Equations 1 to 4) devised for the VFL obtaining unit 500. The higher the VFL value, the less the eye fatigue.

The automatic convergence angle control unit 600 automatically focuses on the object causing the least eye fatigue, using the VFL values obtained by the VFL obtaining unit 500.

In this case, when an unwanted stereoscopic image is output, the semi-automatic convergence angle control unit 700 may control the convergence angle semi-automatically through a touch button or a jog dial.

At this time, during the semi-automatic convergence angle control, the VFL values obtained by the VFL obtaining unit 500 are used so that the object causing the least eye fatigue is focused on first.

The semi-automatic convergence angle control is performed through the user input unit 800, which is composed of a touch button or a jog dial.

With the above configuration, a 3D image whose focus is set in consideration of eye fatigue can be output through the display unit 850.

FIG. 4 is a diagram illustrating two images with different parallaxes obtained through the left camera and the right camera of the photographing unit 100, FIG. 5 is an example of the first depth map generated from FIG. 4 by the image processor 300, FIG. 6 is an exemplary diagram of the histogram generated from the depth map of FIG. 5 by the depth region obtaining unit, and FIG. 7 is an exemplary diagram of the second depth map generated using the histogram of FIG. 6.

Using Equation 5 below, the histogram obtained from the first depth map is low-pass filtered.

[Equation 5]

Figure pat00009

Equations 6 and 7 below calculate the peak pixel count Mp[k] of each object and the center depth Ml[k] of each object.

[Equation 6]

Figure pat00010

[Equation 7]

Figure pat00011

Equation 8 below is used to merge many small objects and to select valid peaks through a threshold.

[Equation 8]

Figure pat00012

Using Equation 9 below, the smallest value between the objects is set as the boundary value.

[Equation 9]

Figure pat00013

Equation 10 below is used to obtain the area of each object by summing the histogram values of the object delimited by the boundary values.

[Equation 10]

Figure pat00014
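Because Equations 5 to 10 are reproduced only as images in the publication, the sketch below is a loose Python paraphrase of the steps just described (low-pass filtering of the histogram, peak and valley detection, thresholding of small peaks, and per-object area computation), not the exact formulas; the smoothing width, peak threshold, and toy histogram are all assumptions.

```python
import numpy as np

def segment_histogram(hist: np.ndarray, smooth_width: int = 5, min_peak: float = 200.0):
    """Roughly follows Equations 5-10: smooth the depth histogram, keep
    significant peaks, use the valleys between them as object boundaries,
    and sum the histogram inside each boundary pair to get the area of each object."""
    # Equation 5 (paraphrased): low-pass filter via a moving average.
    kernel = np.ones(smooth_width) / smooth_width
    smooth = np.convolve(hist, kernel, mode="same")

    # Equations 6-8 (paraphrased): local maxima above a threshold are object peaks.
    peaks = [k for k in range(1, len(smooth) - 1)
             if smooth[k] >= smooth[k - 1] and smooth[k] >= smooth[k + 1]
             and smooth[k] >= min_peak]

    # Equation 9 (paraphrased): the minimum between adjacent peaks is a boundary.
    boundaries = [0]
    for a, b in zip(peaks, peaks[1:]):
        boundaries.append(a + int(np.argmin(smooth[a:b + 1])))
    boundaries.append(len(smooth))

    # Equation 10 (paraphrased): area of each object = sum of its histogram bins.
    areas = [float(hist[lo:hi].sum()) for lo, hi in zip(boundaries, boundaries[1:])]
    return peaks, boundaries, areas

# Toy histogram with two depth clusters (purely illustrative).
x = np.arange(256)
toy_hist = 600 * np.exp(-((x - 50) / 8.0) ** 2) + 900 * np.exp(-((x - 170) / 12.0) ** 2)
print(segment_histogram(toy_hist))
```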

FIG. 8 illustrates this process on an arbitrary histogram using Equation 8.

FIGS. 9 and 10 are exemplary diagrams using an image having more depth steps than that of FIG. 4; FIG. 9 is a histogram obtained from the depth map, and FIG. 10 is an exemplary diagram of a depth map in which the objects are segmented by distance using the histogram.

Referring to FIG. 11, when the display is viewed with both eyes, eye fatigue increases sharply for objects in front of the focused portion (the comfort zone, where the disparity value is 0) and increases more gradually for objects behind it (left drawing of FIG. 11). From this, the Comfort Zone Function (see Equation 1 above) is obtained. That is, the right drawing of FIG. 11 is a graph showing the weight as a function of the parallax between the two images.

Here, with respect to the position where the parallax is 0, the left side corresponds to objects close to the camera and the right side to objects far from the camera, and the (eye) Comfort Zone Function (CZF) is obtained accordingly.

Thereafter, the weighted value W for each segmented object is obtained by using Equation 2 and the CZF value obtained in the previous step.

Next, the Visual Fatigue Level (VFL) is obtained by the sum of the product of the area S and the weight W for each segmented object, using Equation 3 above.

In this case, the higher the VFL value, the less eye fatigue.
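The CZF itself is given only as an equation image, so the sketch below assumes a simple piecewise-linear comfort function that equals 1 at zero disparity and falls off faster toward near objects than toward far ones, matching the qualitative shape described for FIG. 11; the VFL is then taken as the sum of area times weight over the segmented objects, as stated above. The limits and the object lists are invented for illustration.

```python
def czf(disparity: float, near_limit: float = 20.0, far_limit: float = 60.0) -> float:
    """Assumed Comfort Zone Function: weight 1.0 at zero disparity, dropping
    steeply toward near objects (negative disparity) and gently toward far ones."""
    if disparity < 0:  # closer than the screen plane
        return max(0.0, 1.0 + disparity / near_limit)
    return max(0.0, 1.0 - disparity / far_limit)

def vfl(objects) -> float:
    """VFL of a candidate focus setting = sum over segmented objects of
    (object area S) x (weight W from the CZF at that object's disparity)."""
    return sum(area * czf(disp) for area, disp in objects)

# Each tuple is (area S in pixels, disparity of the object after refocusing);
# the numbers are purely illustrative.
candidate_a = [(5000, 0.0), (3000, -15.0), (1500, 30.0)]   # focus on object #1
candidate_b = [(5000, 25.0), (3000, 0.0), (1500, 55.0)]    # focus on object #2

best = max([("object #1", candidate_a), ("object #2", candidate_b)],
           key=lambda item: vfl(item[1]))
print("focus target with highest VFL (least fatigue):", best[0])
```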

FIG. 12 is a graph illustrating the VFL value of each segmented object.

Here, the x-axis represents the index of each object produced by the segmentation, and the y-axis represents the Visual Fatigue Level. In this example, object #1, which has the highest VFL value, is focused on and displayed on the screen.

In addition, Equation 4 is used to obtain the amount by which the image must be shifted left or right to focus on the object of each depth step.

At this time, one of the two images is shifted by the D value to focus.

An algorithm according to the present invention will be described with reference to FIG. 13 as follows.

First, left and right images are acquired through the binocular cameras that are the left camera 101 and the right camera 102 (S10).

Next, parallax information is obtained using an algorithm for finding a matching point between two images, and a depth map is generated by obtaining a depth of the image using the parallax information (S20).

Next, histogram information of the depth map is generated to generate a new depth map, that is, a second depth map, which reliably divides the object by distance (S30).

At this time, the VFL value of each object is obtained through the algorithm described above; the higher the VFL of the object being focused on, the less the eye fatigue.

Then, using the VFL values, automatic convergence angle control is performed so as to focus on the object causing the least eye fatigue, and the 3D image is displayed on the screen. That is, the stereoscopic image is displayed through the automatic convergence angle control (S40).

Then, if the desired stereoscopic image appears on the display, the process is terminated (S50); if an unwanted image is obtained, the focus of the stereoscopic image is corrected through the semi-automatic convergence angle control (S60).

Using the VFL values obtained above, the eye fatigue that would result from focusing on each object is calculated, and as the button is pressed, the objects are displayed on the screen in order of increasing eye fatigue.
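A compact sketch of this ordering is given below: objects are sorted by their VFL value, the automatic control picks the most comfortable one first, and each button press steps to the next least-fatiguing object rather than stepping by depth. The per-object numbers and the on_button_press helper are hypothetical stand-ins, not the actual implementation.

```python
# Hypothetical per-object results from the VFL obtaining unit 500:
# (object id, VFL value, shift value D in pixels).
objects = [(1, 0.92, 12), (2, 0.67, -5), (3, 0.31, 40)]

# S40: automatic control focuses on the highest-VFL (least fatiguing) object first.
by_comfort = sorted(objects, key=lambda o: o[1], reverse=True)

# S60: each press of the touch button / jog dial steps to the next object,
# in order of increasing eye fatigue rather than in depth order.
def on_button_press(press_count: int):
    obj_id, vfl_value, shift_d = by_comfort[press_count % len(by_comfort)]
    return obj_id, shift_d  # shift one image by D pixels to refocus (see Equation 4)

print(on_button_press(0))  # -> (1, 12): the automatic choice
print(on_button_press(1))  # -> (2, -5): first semi-automatic alternative
```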

The convergence angle control device in the three-dimensional image recording apparatus according to the present invention is summarized as follows.

As 3D video has recently attracted attention, 3D technology is developing rapidly, and the technology is increasingly being applied not only to 3D digital cameras but also to mobile devices. However, displaying 3D images still involves many difficulties, such as eye strain and motion sickness, and much research has been conducted in both hardware and software to solve these problems.

Existing 3D cameras provide manual convergence control after automatic center-focused control. The user sets the desired focus by shifting the images step by step with hardware buttons. Because the focus is initially set only on the object located at the center, focus may be placed on an unwanted object, which can cause eye fatigue. In addition, it takes a long time because the image must be shifted with the buttons until the desired object is in focus, and even after focusing on the desired object, refocusing is inevitable if visual fatigue occurs.

The present invention converts the obtained (uncertain) depth information into a histogram, segments the objects by distance, and obtains the Visual Fatigue Level (VFL) that would result from setting the focus on each object. The VFL values are used to focus on the object causing the least eye strain and to output the result to the screen. After that, if an unwanted or wrong object is in focus, the focus can be reset through a touch button or a jog dial. When resetting, the objects are offered in order of increasing eye fatigue by referring to the VFL values obtained above, so the operation is quick and easy for the user (see FIGS. 3 and 13).

The core of the present invention is a focusing method that takes into account eye fatigue, which is the most problematic issue in 3D technology.

Two images are acquired through the binocular camera which is the left camera 101 and the right camera 102.

At this time, a disparity map is obtained from the parallax of the two acquired images, and a histogram is then generated to produce a new depth map with clear boundaries between distances.

A Visual Fatigue Level (VFL) value indicating the degree of eye fatigue is obtained for each object, and the shift value D for each depth object is obtained using the VFL values from the previous step.

The D value obtained here indicates how much the image should be shifted to focus on each object.

In FIG. 12, the VFL value of the first object is the largest, which indicates that eye fatigue is minimal when focusing on object #1. The automatic convergence angle control unit 600 automatically focuses on the object having the largest VFL value (the least eye fatigue) and outputs the 3D image to the screen.

In this case, when a wrong or unwanted object is in focus, the user may refocus through the touch button or the jog dial. When the button is pressed for refocusing, the VFL values are applied instead of the depth order, so that less fatiguing objects are given priority. Resetting the focus in order of increasing eye fatigue according to this VFL-based priority, which is one of the core ideas of the present invention, reduces wasted time and makes the device easy and convenient to use.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood that these embodiments are given by way of illustration and example only and are not to be taken by way of limitation. Therefore, the scope of the present invention should not be limited to the described embodiments, but should be defined by the claims below and their equivalents.

100: photographing unit 101: left camera
102: right camera 200: image acquisition unit
300: image processing unit 400: depth region obtaining unit
450: segmentation unit 500: VFL obtaining unit
600: automatic convergence angle control unit 700: semi-automatic convergence angle control unit
800: user input unit 850: display unit

Claims (7)

A convergence angle control apparatus in a stereoscopic image photographing apparatus, comprising:
a photographing unit 100 for acquiring, by a stereo camera, two images having different parallaxes constituting a stereo image;
an image processor 300 for extracting disparity information of the stereo image from the parallax of the two acquired images;
a depth region obtaining unit 400 for generating a disparity map using the extracted disparity information and for generating, from the disparity map, a first depth map indicating the depth of the image for each object included in the stereo image;
a segmentation unit 450 for generating a histogram of the generated first depth map and for generating, from the histogram, a second depth map dividing each of the objects by distance;
a VFL obtaining unit 500 for calculating a Visual Fatigue Level (VFL) value for each of the objects from the generated second depth map; and
an automatic convergence angle control unit 600 for automatically controlling the convergence angle so as to focus on the object having the highest value among the calculated VFL values and for outputting the stereo image with the automatically controlled convergence angle to a display.
The apparatus of claim 1, further comprising a semi-automatic convergence angle control unit for controlling the convergence angle of the stereo image by focusing on one object selected from among the objects included in the stereo image output to the display.
The apparatus of claim 2, wherein the VFL obtaining unit 500 performs:
a CZF calculation step of obtaining a Comfort Zone Function (CZF) using Equation 1 below;
a weight calculation step of obtaining the weight W for each object divided by distance in the segmentation unit, using the CZF and Equation 2 below; and
a VFL value calculation step of calculating a Visual Fatigue Level (VFL) value using Equation 3 below and the weight W of each object divided by distance in the segmentation unit.
[Equation 1]
Figure pat00015

[Equation 2]
Figure pat00016

[Equation 3]
Figure pat00017

The apparatus of claim 3, wherein, after the VFL value calculation step, the VFL obtaining unit 500 further performs a shift value calculation step of calculating a shift value D for focusing on each object divided by distance in the segmentation unit, using the VFL value and Equation 4 below.
[Equation 4]
Figure pat00018

The apparatus of claim 4, wherein, in the automatic convergence angle control performed by the automatic convergence angle control unit 600, the focus is controlled by shifting, according to the calculated shift value D, so that the object having the highest value among the calculated VFL values is brought into focus.
The apparatus of claim 4, wherein, in the semi-automatic convergence angle control performed by the semi-automatic convergence angle control unit, the focus is controlled by shifting, according to the calculated shift value D, so that the selected object is brought into focus.
The apparatus of claim 2, wherein the display is composed of a touch screen, and the semi-automatic convergence angle control unit controls the convergence angle of the stereoscopic image by focusing on the selected object when any one of the objects output on the display is touched and selected.
KR20120000193A 2012-01-02 2012-01-02 Apparatus for convergence in 3d photographing apparatus KR101345971B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR20120000193A KR101345971B1 (en) 2012-01-02 2012-01-02 Apparatus for convergence in 3d photographing apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR20120000193A KR101345971B1 (en) 2012-01-02 2012-01-02 Apparatus for convergence in 3d photographing apparatus

Publications (2)

Publication Number Publication Date
KR20130078990A (en) 2013-07-10
KR101345971B1 KR101345971B1 (en) 2014-01-06

Family

ID=48991830

Family Applications (1)

Application Number Title Priority Date Filing Date
KR20120000193A KR101345971B1 (en) 2012-01-02 2012-01-02 Apparatus for convergence in 3d photographing apparatus

Country Status (1)

Country Link
KR (1) KR101345971B1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190046387A (en) 2017-10-26 2019-05-07 주식회사 그루크리에이티브랩 Method for correcting images of head mounted display and method for displaying corrected images on head mounted display
KR101947372B1 (en) 2017-09-04 2019-05-08 주식회사 그루크리에이티브랩 Method of providing position corrected images to a head mount display and method of displaying position corrected images to a head mount display, and a head mount display for displaying the position corrected images

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101042171B1 (en) * 2009-06-08 2011-06-16 충북대학교 산학협력단 Method and apparatus for controlling vergence of intersting objects in the steroscopic camera
KR101668117B1 (en) * 2010-06-16 2016-10-20 엘지이노텍 주식회사 Apparatus for controlling convergence angle of stereo camera and 3-dimension image processing system with the same

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101947372B1 (en) 2017-09-04 2019-05-08 주식회사 그루크리에이티브랩 Method of providing position corrected images to a head mount display and method of displaying position corrected images to a head mount display, and a head mount display for displaying the position corrected images
US10445888B2 (en) 2017-09-04 2019-10-15 Grew Creative Lab Inc. Method of providing position-corrected image to head-mounted display and method of displaying position-corrected image to head-mounted display, and head-mounted display for displaying the position-corrected image
KR20190046387A (en) 2017-10-26 2019-05-07 주식회사 그루크리에이티브랩 Method for correcting images of head mounted display and method for displaying corrected images on head mounted display

Also Published As

Publication number Publication date
KR101345971B1 (en) 2014-01-06

Similar Documents

Publication Publication Date Title
JP5963422B2 (en) Imaging apparatus, display apparatus, computer program, and stereoscopic image display system
US9948918B2 (en) Method and apparatus for stereoscopic focus control of stereo camera
JP6011862B2 (en) 3D image capturing apparatus and 3D image capturing method
JP2013005259A (en) Image processing apparatus, image processing method, and program
US20110228051A1 (en) Stereoscopic Viewing Comfort Through Gaze Estimation
JP5814692B2 (en) Imaging apparatus, control method therefor, and program
CN103181173B (en) 3-dimensional image processing apparatus, three-dimensional image pickup device and three dimensional image processing method
JP5464279B2 (en) Image processing apparatus, program thereof, and image processing method
CN107209949B (en) Method and system for generating magnified 3D images
US9693036B2 (en) Imaging apparatus, image processing device, computer-readable medium having stored thereon an imaging apparatus controlling program, and computer-readable medium having stored thereon an image processing program
JP6585938B2 (en) Stereoscopic image depth conversion apparatus and program thereof
KR20120028121A (en) Method and apparatus for diciding of convergence angle in stereo camera
JP5840022B2 (en) Stereo image processing device, stereo image imaging device, stereo image display device
US9082210B2 (en) Method and apparatus for adjusting image depth
KR20090037247A (en) Method and device for transformation from multi focused 2d image to 3d image, and recording media
JP4988971B2 (en) Image processing apparatus, imaging apparatus, and image processing method
KR101219859B1 (en) Apparatus and method for semi-auto convergence in 3d photographing apparatus
WO2012002347A1 (en) Image processing device, imaging device and image processing method
CN107155102A (en) 3D automatic focusing display method and system thereof
JP5546690B2 (en) Image processing apparatus, image processing method, program, recording medium, and imaging apparatus
KR101345971B1 (en) Apparatus for convergence in 3d photographing apparatus
JP2013535120A (en) Method and apparatus for auto-convergence based on auto-focus points for stereoscopic frames
US20160065941A1 (en) Three-dimensional image capturing apparatus and storage medium storing three-dimensional image capturing program
JP6045280B2 (en) Imaging device
KR20160041403A (en) Method for gernerating 3d image content using information on depth by pixels, and apparatus and computer-readable recording medium using the same

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20160928

Year of fee payment: 4

LAPS Lapse due to unpaid annual fee