WO2018076529A1 - Scene depth calculation method, device and terminal - Google Patents
- Publication number
- WO2018076529A1 (PCT/CN2016/112696)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- image
- offset
- lens
- ois
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
Definitions
- the embodiments of the present invention relate to the field of communications, and in particular, to a method, a device, and a terminal for calculating a scene depth of a target scene having a dual camera terminal device.
- Optical Image Stabilization (OIS, also commonly referred to as optical anti-shake) is an important means of improving photo quality in low light and is used on more and more mobile phones. OIS works by moving the lens to compensate for hand shake and thereby stabilize the image.
- In the prior art, the image is generally rectified using the calibration parameters of the dual camera so that the left and right images provided by the dual camera are aligned in one direction; the parallax is then calculated and converted into the scene depth. However, OIS shifts the lens, which changes the dual-camera calibration parameters and causes parallax problems (positive and negative parallax exist simultaneously, or the images cannot be aligned in one direction), so the calculated scene depth value is inaccurate.
- The embodiment of the invention provides a scene depth calculation method that solves the problem of inaccurate scene depth values of the target scene caused by the parallax problem (positive and negative parallax existing simultaneously, or the images not being alignable in one direction).
- A scene depth calculation method is provided, comprising: acquiring the lens offset of a camera with an OIS system, wherein the first camera and/or the second camera has an OIS system and the first camera and the second camera are arranged side by side on the body of the same terminal device; converting the lens offset into an image offset according to a preset OIS motor sensitivity calibration parameter; and calculating the scene depth of the target scene according to the compensated first camera calibration parameter and/or the compensated second camera calibration parameter and the first image and the second image obtained by the first camera and the second camera acquiring the target scene at the same time, wherein the calibration parameter of the first camera is compensated according to the lens offset of the first camera and the calibration parameter of the second camera is compensated according to the lens offset of the second camera.
- The method may further include: acquiring angular velocity information of terminal device shake detected by a gyro sensor; converting the angular velocity information into a shake amplitude of the terminal device; driving the OIS motor according to the shake amplitude to push the lens of the first camera and/or the lens of the second camera; and acquiring the lens offset of the first camera and/or the second camera. Because the shake time of the terminal device is not consistent with the exposure time, and the shake time is greater than the exposure time, multiple lens offsets are acquired; one lens offset is determined from the multiple lens offsets according to a preset rule, and the determined lens offset is used when the scene depth of the target scene is subsequently calculated.
- The OIS motor sensitivity calibration parameter is determined according to the following steps: pushing the OIS motor through the OIS controller to move the lens to a designated position; taking a photograph after waiting for the OIS motor to stabilize; and, when a preset number of images have been captured, detecting the feature point coordinates of each image and determining the OIS motor sensitivity calibration parameter according to the designated position of the lens and the feature point coordinates of each image. The OIS motor sensitivity calibration parameter is stored in the terminal device before it leaves the factory, so that after shipment, when the scene depth of the target scene is calculated, the stored OIS motor sensitivity calibration parameter is used to convert the lens offset into an image offset.
- The lens offset is converted into an image offset according to the formula Δx ≈ α × ΔC, where Δx is the image offset, α is the OIS motor sensitivity calibration parameter, and ΔC is the lens offset. Converting the lens offset into an image offset through the OIS motor sensitivity calibration parameter keeps the units consistent with the camera calibration parameters.
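As a minimal illustration of this conversion (the function names and numbers are illustrative only, not taken from the filing), the sketch below applies Δx ≈ α × ΔC with α in pixel/code and then converts the result from pixels to millimetres using the pixel pitch, as the later formulas do:

```python
def lens_offset_to_image_offset(delta_c_code, alpha_px_per_code):
    """Convert a lens offset (Hall-sensor 'code' units) into an image offset in pixels,
    applying delta_x ~= alpha * delta_C with alpha in pixel/code."""
    return alpha_px_per_code * delta_c_code


def image_offset_to_mm(delta_x_px, pixel_pitch_mm):
    """Convert an image offset from pixels to millimetres (pixel pitch = size of one pixel)."""
    return delta_x_px * pixel_pitch_mm


# Illustrative numbers only, not taken from the patent:
dx_px = lens_offset_to_image_offset(delta_c_code=120, alpha_px_per_code=0.05)  # 6.0 px
print(dx_px, image_offset_to_mm(dx_px, pixel_pitch_mm=0.00112))                # 6.0 0.00672
```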
- When the first camera has an OIS system, calculating the scene depth of the target scene according to the compensated first camera calibration parameter and/or the compensated second camera calibration parameter and the first image and the second image obtained by the first camera and the second camera acquiring the target scene at the same time specifically includes determining the scene depth of the target scene, where Z is the scene depth of the target scene, f is the focal length, a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, pixel pitch is the size of one pixel, a1 × Δ1 ≈ Δx1 is the first image offset, a1 × Δ1 × pixel pitch converts the unit of the first image offset from pixel to mm, after compensation the principal point of the first camera changes from u1' to u1 and the baseline changes from B' to B1, the principal point of the second camera is u2, x1 is the imaging point of the first image, and x2 is the imaging point of the second image.
- When the first camera and the second camera both have an OIS system, calculating the scene depth of the target scene according to the compensated first camera calibration parameter and/or the compensated second camera calibration parameter and the first image and the second image obtained by the first camera and the second camera acquiring the target scene at the same time specifically includes determining the scene depth of the target scene, where a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, a1 × Δ1 ≈ Δx1 is the first image offset, a1 × Δ1 × pixel pitch converts the unit of the first image offset from pixel to mm, a2 is the OIS motor sensitivity calibration parameter of the second camera, Δ2 is the lens offset of the second camera, a2 × Δ2 ≈ Δx2 is the second image offset, a2 × Δ2 × pixel pitch converts the unit of the second image offset from pixel to mm, after compensation the principal point of the first camera changes from u1' to u1 and the principal point of the second camera changes from u2' to u2, x1 is the imaging point of the first image, and x2 is the imaging point of the second image.
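The depth formulas themselves appear as equations in the original filing and are not reproduced in this text. The sketch below is therefore only one reading of the surrounding definitions: it assumes the standard triangulation relation Z = f·B/d, shifts each principal point by that camera's image offset, and adjusts the baseline by the same offsets converted to millimetres. The compensation signs and all names are assumptions for illustration; a camera without OIS simply contributes a zero offset.

```python
def depth_with_ois_compensation(f_mm, baseline_mm, pixel_pitch_mm,
                                x1_px, x2_px, u1_prime_px, u2_prime_px,
                                a1=0.0, delta1_code=0.0,
                                a2=0.0, delta2_code=0.0):
    """Hedged sketch: compensate the principal points and the baseline for the OIS
    lens shift, then apply the standard triangulation relation Z = f * B / d.
    a1, a2 are OIS motor sensitivity parameters (pixel/code); delta1, delta2 are
    lens offsets (code). Leave a/delta at zero for a camera without OIS.
    The compensation signs below are assumptions, not the patent's equations."""
    dx1_px = a1 * delta1_code                   # first image offset, in pixels
    dx2_px = a2 * delta2_code                   # second image offset, in pixels
    u1 = u1_prime_px + dx1_px                   # compensated principal point of camera 1 (assumed sign)
    u2 = u2_prime_px + dx2_px                   # compensated principal point of camera 2 (assumed sign)
    b_mm = baseline_mm + (dx1_px - dx2_px) * pixel_pitch_mm   # compensated baseline (assumed)
    disparity_px = (x1_px - u1) - (x2_px - u2)  # parallax relative to the compensated principal points
    return f_mm * b_mm / (disparity_px * pixel_pitch_mm)      # scene depth Z, in mm


# Single-OIS example (second camera has no OIS), illustrative numbers only:
z_mm = depth_with_ois_compensation(f_mm=3.8, baseline_mm=12.0, pixel_pitch_mm=0.00112,
                                   x1_px=2100.0, x2_px=2050.0,
                                   u1_prime_px=2000.0, u2_prime_px=2000.0,
                                   a1=0.05, delta1_code=120.0)
print(round(z_mm, 1))
```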
- An embodiment of the present invention provides a scene depth calculation device, the device including a first acquisition unit, a processing unit, and a calculation unit. The first acquisition unit is configured to acquire the lens offset of a camera with an OIS system, wherein the first camera and/or the second camera has an OIS system and the first camera and the second camera are arranged side by side on the body of the same terminal device. The processing unit is configured to convert the lens offset into an image offset according to a preset OIS motor sensitivity calibration parameter. The calculation unit is configured to calculate the scene depth of the target scene according to the compensated first camera calibration parameter and/or the compensated second camera calibration parameter and the first image and the second image obtained by the first camera and the second camera acquiring the target scene at the same time, wherein the calibration parameter of the first camera is compensated according to the lens offset of the first camera and the calibration parameter of the second camera is compensated according to the lens offset of the second camera.
- The device further includes a second acquisition unit, configured to: acquire angular velocity information of terminal device shake detected by the gyro sensor; convert the angular velocity information into a shake amplitude of the terminal device; drive the OIS motor according to the shake amplitude to push the lens of the first camera and/or the lens of the second camera; and acquire the lens offset of the first camera and/or the second camera.
- Because the shake time of the terminal device is not consistent with the exposure time, and the shake time is greater than the exposure time, multiple lens offsets are acquired; one lens offset is determined from the multiple lens offsets according to a preset rule, and the determined lens offset is used when the scene depth of the target scene is subsequently calculated.
- The device further includes a determining unit, specifically configured to: push the OIS motor through the OIS controller to move the lens to a designated position; take a photograph after waiting for the OIS motor to stabilize; and, when a preset number of images have been captured, detect the feature point coordinates of each image and determine the OIS motor sensitivity calibration parameter according to the designated position of the lens and the feature point coordinates of each image.
- The OIS motor sensitivity calibration parameter is stored in the terminal device before it leaves the factory, so that after shipment, when the scene depth of the target scene is calculated, the stored OIS motor sensitivity calibration parameter is used to convert the lens offset into an image offset.
- The processing unit is specifically configured to convert the lens offset into an image offset according to the formula Δx ≈ α × ΔC, where Δx is the image offset, α is the OIS motor sensitivity calibration parameter, and ΔC is the lens offset. Converting the lens offset into an image offset through the OIS motor sensitivity calibration parameter keeps the units consistent with the camera calibration parameters.
- The calculating unit is specifically configured to determine the scene depth of the target scene, where Z is the scene depth of the target scene, f is the focal length, a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, pixel pitch is the size of one pixel, a1 × Δ1 ≈ Δx1 is the first image offset, a1 × Δ1 × pixel pitch converts the unit of the first image offset from pixel to mm, after compensation the principal point of the first camera changes from u1' to u1 and the baseline changes from B' to B1, the principal point of the second camera is u2, x1 is the imaging point of the first image, and x2 is the imaging point of the second image.
- The calculating unit is specifically configured to determine the scene depth of the target scene, where a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, a1 × Δ1 ≈ Δx1 is the first image offset, a1 × Δ1 × pixel pitch converts the unit of the first image offset from pixel to mm, a2 is the OIS motor sensitivity calibration parameter of the second camera, Δ2 is the lens offset of the second camera, a2 × Δ2 ≈ Δx2 is the second image offset, a2 × Δ2 × pixel pitch converts the unit of the second image offset from pixel to mm, after compensation the principal point of the first camera changes from u1' to u1 and the principal point of the second camera changes from u2' to u2, x1 is the imaging point of the first image, and x2 is the imaging point of the second image.
- An embodiment of the present invention provides a terminal. The terminal includes a first camera and a second camera, configured to acquire the target scene at the same time to obtain a first image and a second image respectively, wherein the first camera and/or the second camera has an OIS system and the first camera and the second camera are arranged side by side on the body of the same terminal device; a memory, configured to store the first image and the second image; and a processor, configured to acquire the lens offset of the camera with the OIS motor, convert the lens offset into an image offset according to a preset OIS motor sensitivity calibration parameter, and calculate the scene depth of the target scene according to the compensated first camera calibration parameter and/or the compensated second camera calibration parameter and the first image and the second image acquired from the memory, wherein the calibration parameter of the first camera is compensated according to the lens offset of the first camera and the calibration parameter of the second camera is compensated according to the lens offset of the second camera.
- The lens offset is used to compensate for the change in the camera calibration parameters caused by terminal device shake, which solves the parallax problem; the compensated camera calibration parameters are used to calculate the scene depth of the target scene, so the calculated scene depth value is more accurate.
- The OIS system is specifically configured to: acquire angular velocity information of terminal device shake detected by a gyro sensor; convert the angular velocity information into a shake amplitude of the terminal device; drive the OIS motor according to the shake amplitude to push the lens of the first camera and/or the lens of the second camera; and acquire the lens offset of the first camera and/or the second camera.
- Because the shake time of the terminal device is not consistent with the exposure time, and the shake time is greater than the exposure time, multiple lens offsets are acquired; one lens offset is determined from the multiple lens offsets according to a preset rule, and the determined lens offset is used when the scene depth of the target scene is subsequently calculated.
- The processor is further configured to: push the OIS motor through the OIS controller to move the lens to a designated position; take a photograph after waiting for the OIS motor to stabilize; and, when a preset number of images have been captured, detect the feature point coordinates of each image and determine the OIS motor sensitivity calibration parameter according to the designated position of the lens and the feature point coordinates of each image.
- the memory is further configured to store the OIS motor sensitivity calibration parameters.
- The OIS motor sensitivity calibration parameter is stored in the terminal before it leaves the factory, so that after shipment, when the scene depth of the target scene is calculated, the stored OIS motor sensitivity calibration parameter is used to convert the lens offset into an image offset.
- The processor is specifically configured to convert the lens offset into an image offset according to the formula Δx ≈ α × ΔC, where Δx is the image offset, α is the OIS motor sensitivity calibration parameter, and ΔC is the lens offset. Converting the lens offset into an image offset through the OIS motor sensitivity calibration parameter keeps the units consistent with the camera calibration parameters.
- The processor is specifically configured to determine the scene depth of the target scene, where Z is the scene depth of the target scene, f is the focal length, a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, pixel pitch is the size of one pixel, a1 × Δ1 ≈ Δx1 is the first image offset, a1 × Δ1 × pixel pitch converts the unit of the first image offset from pixel to mm, after compensation the principal point of the first camera changes from u1' to u1 and the baseline changes from B' to B1, the principal point of the second camera is u2, x1 is the imaging point of the first image, and x2 is the imaging point of the second image.
- The processor is specifically configured to determine the scene depth of the target scene, where a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, a1 × Δ1 ≈ Δx1 is the first image offset, a1 × Δ1 × pixel pitch converts the unit of the first image offset from pixel to mm, a2 is the OIS motor sensitivity calibration parameter of the second camera, Δ2 is the lens offset of the second camera, a2 × Δ2 ≈ Δx2 is the second image offset, a2 × Δ2 × pixel pitch converts the unit of the second image offset from pixel to mm, after compensation the principal point of the first camera changes from u1' to u1 and the principal point of the second camera changes from u2' to u2, x1 is the imaging point of the first image, and x2 is the imaging point of the second image.
- FIG. 1 is a block diagram showing the working principle of the OIS system
- Figure 2 is a block diagram of a depth calculation system
- FIG. 3 is a flowchart of a method for calculating a depth of a scene according to Embodiment 1 of the present invention
- Figure 4a is a schematic diagram of a lens offset scene
- Figure 4b is a schematic diagram of imaging changes before and after lens shift
- Figure 4c is a flow chart for determining the OIS motor sensitivity calibration parameters
- FIG. 5a is a schematic diagram of scene depth calculation according to an embodiment of the present invention.
- FIG. 5b is still another schematic diagram of scene depth calculation according to an embodiment of the present invention.
- Figure 6a is a schematic diagram of an image taken when compensating the calibration parameters of the dual camera
- Figure 6b is a schematic diagram of an image taken when the calibration parameters of the dual camera are not compensated
- Figure 6c is a partial enlarged view of Figure 6a
- Figure 6d is a partial enlarged view of Figure 6b;
- Figure 7a is a schematic diagram of the depth of the scene when the calibration parameters of the dual camera are not compensated
- Figure 7b is a schematic diagram of the depth of the scene after compensating the calibration parameters of the dual camera
- FIG. 8 is a schematic structural diagram of a scene depth calculation apparatus according to Embodiment 2 of the present invention.
- FIG. 9 is a schematic structural diagram of still another scene depth computing apparatus according to Embodiment 2 of the present invention.
- FIG. 10 is a schematic structural diagram of a terminal according to Embodiment 3 of the present invention.
- The terminal device may be a device having a dual camera, including but not limited to a camera (such as a digital camera), a video camera, a mobile phone (such as a smart phone), a tablet (Pad), a personal digital assistant (PDA), a portable device (for example, a portable computer), a wearable device, and the like, which is not specifically limited in the embodiment of the present invention.
- The terminal device may be a mobile phone; the following uses a mobile phone as an example to set forth an embodiment of the present invention.
- The dual camera simulates the principle of human binocular vision to perceive distance: an object is observed from two points, images are acquired at different viewing angles, and, according to the pixel matching relationship between the images, the offset between pixels is calculated by the principle of triangulation to obtain the scene depth of the object.
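As a minimal numerical illustration of this triangulation principle (standard stereo geometry with invented numbers, not the compensated formula of this filing):

```python
f_mm = 3.8            # focal length
baseline_mm = 12.0    # distance between the two lenses
pixel_pitch_mm = 0.00112
disparity_px = 40.0   # pixel offset of the same scene point between the two images

depth_mm = f_mm * baseline_mm / (disparity_px * pixel_pitch_mm)
print(round(depth_mm, 1))  # ~1017.9 mm: a larger disparity means a closer object
```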
- OIS causes the lens to shift, which changes the dual-camera calibration parameters and leads to parallax problems, which in turn makes the scene depth calculation inaccurate. Therefore, the dual-camera calibration parameters need to be compensated so that the scene depth of the target scene is calculated accurately.
- FIG. 1 is a block diagram of the working principle of the OIS system.
- the terminal device includes an OIS system 100 and an Image Signal Processor (ISP) 110.
- the OIS system 100 includes an OIS controller 120, a gyro sensor 130, a Hall sensor 140, a motor 150, and a camera 160.
- the camera 160 includes a first camera and a second camera.
- the first camera and the second camera may be juxtaposed in front of the terminal device, or may be juxtaposed on the back of the terminal device, and may be arranged in a horizontal arrangement or a vertical arrangement.
- the first camera and/or the second camera are provided with an OIS system, and the first camera and the second camera respectively have lenses (not shown in FIG. 1).
- the Hall sensor 140 is a magnetic field sensor that performs displacement measurement based on the Hall effect for acquiring the lens shift amount of the camera with the OIS system, that is, the lens shift amount of the first camera and/or the second camera.
- The gyro sensor 130 is a positioning device based on the orientation of the terminal device in free space, and is used to acquire angular velocity information when the terminal device shakes.
- the OIS controller 120 acquires angular velocity information from the gyro sensor 130, converts the angular velocity information into a jitter amplitude of the terminal device, and transmits the jitter amplitude as a reference signal to the motor 150.
- The motor 150 may be an OIS motor for pushing the lens of the camera with the OIS system to move according to the shake amplitude so as to ensure image sharpness; the movement refers to moving in the X and/or Y direction, where the Y direction refers to the direction of the light passing through the lens and the X direction refers to the direction perpendicular to the Y direction.
- the OIS controller 120 also acquires the first image and the second image obtained by acquiring the target scene at the same time from the first camera and the second camera.
- the ISP 110 stores the lens shift amount, the first image, and the second image acquired from the OIS controller 120.
- the terminal device performs initialization, and usually, when ready, the OIS controller 120 controls the shutter to acquire an image.
- the terminal device may shake, and the OIS controller 120 reads the angular velocity information detected by the gyro sensor 130, converts the angular velocity information into the jitter amplitude of the terminal device, and transmits it as a reference signal to the OIS motor, and the OIS motor according to the jitter amplitude Move the lens of the camera with OIS system to avoid blurring of the captured image caused by the jitter of the terminal device and ensure the sharpness of the image.
- the movement may be that the lens of the first camera moves in the X and/or Y direction and/or the lens of the second camera moves in the X and/or Y direction.
- the OIS controller 120 reads the lens offset of the camera with the OIS system detected by the Hall sensor 140, that is, the lens offset of the first camera and/or the second camera, and acquires the captured image from the camera. That is, the first image and the second image obtained by the target scene are acquired at the same time corresponding to the first camera and the second camera, respectively, and the lens offset and the captured image are sent to the ISP 110.
- the ISP 110 stores the lens shift amount and the first image and the second image captured by the camera.
- The shake time of the terminal device is generally greater than its exposure time; for example, the shake lasts 30 ms while the exposure time is 2 ms. In that case the Hall sensor 140 acquires 15 lens offsets, and the OIS controller 120 reads the 15 lens offsets from the Hall sensor 140 and, according to a preset rule, determines one lens offset from them; in the subsequent process, the determined lens offset is used as the lens offset described in this context to perform the scene depth calculation of the target scene.
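The "preset rule" for picking one offset out of the fifteen readings is not specified here; the sketch below shows one plausible rule, assumed purely for illustration, that keeps the Hall-sensor reading timestamped closest to the midpoint of the exposure window:

```python
def pick_lens_offset(readings, exposure_start_ms, exposure_ms):
    """readings: list of (timestamp_ms, offset_code) pairs from the Hall sensor.
    Returns the offset sampled closest to the midpoint of the exposure window.
    The rule itself is an assumption; the text only requires 'a preset rule'."""
    mid = exposure_start_ms + exposure_ms / 2.0
    return min(readings, key=lambda r: abs(r[0] - mid))[1]


# 15 readings over a 30 ms shake, 2 ms exposure starting at 10 ms (illustrative numbers):
readings = [(2 * i, 100 + i) for i in range(15)]
print(pick_lens_offset(readings, exposure_start_ms=10, exposure_ms=2))  # reading sampled near 11 ms
```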
- FIG. 2 is a block diagram of the depth calculation system.
- the depth calculation system includes an ISP 110 and a depth calculation module 210.
- The depth calculation module 210 acquires the preset calibration information and the stored OIS information from the ISP 110, together with the first image and the second image obtained by the first camera and the second camera acquiring the target scene at the same time, calculates the scene depth of the target scene, and outputs a disparity map/depth map.
- the calibration information is the camera calibration parameters such as focal length, baseline, optical center and principal point when initializing;
- the OIS information is the lens offset.
- The depth calculation module acquires the lens offset. Because the unit of the lens offset is code and the unit of the scene depth is millimetres (mm), the two are inconsistent; therefore the lens offset needs to be converted into an image offset, in pixels, according to the OIS motor sensitivity, and the camera calibration parameters are then compensated with the lens offset. The scene depth value of the target scene is calculated from the compensated camera calibration parameters, so the calculated scene depth value is more accurate.
- FIG. 3 is a flowchart of a method for calculating a scene depth according to Embodiment 1 of the present invention. As shown in FIG. 3, the method includes:
- the lens shift amount can be obtained by Hall sensor detection.
- If the lens offset is greater than a preset threshold, the lens offset is abnormal; if the lens offset is not greater than the preset threshold, the lens offset is not abnormal.
- S340 converting the lens offset into an image offset (see ⁇ x of FIG. 4b).
- Before S340, S330 needs to be executed, that is, the OIS motor sensitivity calibration parameter (the image offset caused by a unit lens offset) is input. Each camera with an OIS system has its corresponding OIS motor sensitivity calibration parameter, and these parameters have been pre-stored before the terminal device leaves the factory. Using the OIS motor sensitivity calibration parameter, the lens offset can be converted into an image offset.
- When the terminal device shakes, some calibration parameters, such as the optical centre, the principal point, and the baseline, change. The image offset is calculated according to the lens offset, and the lens offset is used to compensate for the changed camera calibration parameters, that is, to determine the values of the camera calibration parameters after the change. Referring to Fig. 4, the optical centre changes from C' to C and the principal point changes from u' to u. Referring to Fig. 5a, the optical centre of the first camera lens changes from C1' to C1, the principal point changes from u1' to u1, and the baseline changes from B' to B1. Referring to Fig. 5b, the optical centre of the first camera lens changes from C1' to C1 and the principal point changes from u1' to u1; the optical centre of the second camera lens changes from C2' to C2, the principal point changes from u2' to u2, and the baseline changes from B' to B2.
- S380 Calculate a scene depth of the target scene. See Equation 2 and Equation 4 for the calculation formula. It is necessary to perform S360 in advance before executing S380, that is, input the first image, and S370, that is, input the second image.
- the scene depth of the target scene is determined according to the compensated camera calibration parameters, the first image, and the second image.
- the first camera and the second camera acquire the first image and the second image respectively obtained by the target scene at the same time.
- S390 Output a scene depth of the target scene.
- Figure 4a is a schematic diagram of a lens offset scene.
- The OIS motor pushes the lens from the position of the elliptical dotted line to the specified positions (xi, yi) in turn to take images of a fixed chart, and the image sensor converts the optical image acquired by the camera into an electronic signal. The lens offset can be determined according to the positions before and after the lens is moved, and the image offset is determined according to the images of the fixed chart taken at the two positions.
- Figure 4b is a schematic diagram of imaging changes before and after lens shift.
- Take as an example the OIS motor pushing the lens of a camera in the X direction. Before the lens moves, the camera calibration parameters are: the focal length is f, the optical centre is C', and the principal point is u'. After the lens moves, part of the calibration parameters of the camera change: the optical centre changes from C' to C, and the principal point changes from u' to u.
- The imaging points before and after the lens movement are x' and x, respectively. ΔC is the distance between the optical centre C' and the optical centre C, that is, the lens offset, in code; Δx is the distance between the imaging point x' and the imaging point x, that is, the image offset, in pixels.
- the image offset caused by the unit lens offset can be measured, that is, the OIS motor sensitivity calibration parameter.
- the actual image offset can be calculated based on the lens offset during subsequent shooting to compensate for the camera calibration parameters when the terminal is shaken.
- When determining the OIS motor sensitivity calibration parameter, it is assumed that the relationship between the lens offset ΔC and the image offset Δx is linear, so the OIS motor sensitivity calibration parameter can be obtained as α ≈ Δx / ΔC, where the unit of α is pixel/code. In practice ΔC and Δx are not strictly linear; higher-order models, such as second-order, third-order, or higher, can be used, with more images captured, to obtain more accurate OIS motor sensitivity calibration parameters.
- Figure 4c is a flow chart for determining OIS motor sensitivity calibration parameters. As shown in Figure 4c, it includes:
- the determined OIS motor sensitivity calibration parameter may have a relatively large error, and multiple images may be taken to improve the accuracy of the OIS motor sensitivity calibration parameter.
- the feature point coordinates in the captured image before and after the lens movement are respectively detected, and the image offset amount is acquired.
- the lens shift amount is determined according to the moving distance of the lens, and the OIS motor sensitivity calibration parameter is determined according to the image shift amount and the lens shift amount.
- S450 Store the OIS motor sensitivity calibration parameter into the terminal device.
- The OIS motor sensitivity calibration parameter is stored in the terminal device so that, after the terminal device leaves the factory, when it is used to capture the target scene, the lens offset is converted into an image offset according to the pre-stored OIS motor sensitivity calibration parameter, and the camera calibration parameters are compensated with the lens offset, making the calculated scene depth of the target scene more accurate.
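A hedged sketch of the fitting step described above: given the designated lens positions (in code) and the matching feature-point shifts (in pixels) measured from the chart images, α can be fitted by least squares, or by a higher-order polynomial when the relation is not strictly linear. The sample data are invented and numpy is assumed to be available:

```python
import numpy as np

# Illustrative calibration samples: lens offsets (code) and measured image offsets (pixels).
delta_c = np.array([0.0, 50.0, 100.0, 150.0, 200.0])
delta_x = np.array([0.0, 2.6, 5.1, 7.4, 10.2])

# Linear model: delta_x ~= alpha * delta_C, alpha in pixel/code.
alpha = np.polyfit(delta_c, delta_x, deg=1)[0]

# Optional higher-order model (e.g. second order) for better accuracy, as noted above.
coeffs_2nd = np.polyfit(delta_c, delta_x, deg=2)

print(round(alpha, 4))        # ~0.0504 pixel/code
print(np.round(coeffs_2nd, 6))
```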
- FIG. 5a is a schematic diagram of scene depth calculation according to an embodiment of the present invention.
- the first camera has an OIS system
- the second camera does not have an OIS system
- the lens of the first camera moves in the X direction.
- The lens of the camera is a convex lens; rays entering the lens parallel to the main optical axis converge after refraction, and the point where they intersect the main optical axis is called the focal point of the convex lens. The distance from the lens to the focal plane or the imaging plane, such as the CCD, is called the focal length; the point at the centre of the lens is called the optical centre; the intersection of the optical axis with the imaging plane, such as the film or CCD, is called the principal point; and the distance between the first camera lens and the second camera lens is called the baseline.
- In Fig. 5a, the focal length is f, the optical centre of the lens of the first camera is C1', its principal point is u1', the optical centre of the lens of the second camera is C2, its principal point is u2, and the baseline is B'. If the scene depth calculation is completed with the uncompensated camera calibration parameters, that is, the principal point u1' and the baseline B', the error of the calculated scene depth Z' is large.
- When the shutter is pressed, the terminal device may shake, causing part of the camera calibration parameters to change: the OIS motor pushes the lens of the first camera to shift according to the shake amplitude, the optical centre of the first camera lens changes from C1' to C1 (the corresponding lens offset of the first camera is in code), the principal point changes from u1' to u1, and the baseline changes from B' to B1, so u1 and B1 need to be calculated. The camera calibration parameters are compensated with the lens offset to determine the values of the compensated camera calibration parameters.
- The calculated scene depth Z is then obtained from the compensated parameters, where a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, pixel pitch is the size of one pixel, a1 × Δ1 ≈ Δx1 is the first image offset, and a1 × Δ1 × pixel pitch converts the unit of the first image offset from pixel to mm.
- FIG. 5b is still another schematic diagram of scene depth calculation according to an embodiment of the present invention.
- the first camera and the second camera are simultaneously provided with an OIS system, and the first camera and the second camera are simultaneously moved in the X direction.
- In Fig. 5b, the focal length is f, the optical centre of the lens of the first camera is C1', its principal point is u1', the optical centre of the lens of the second camera is C2', its principal point is u2', and the baseline is B'. The imaging point of the first image acquired by the first camera is x1, and the imaging point of the second image acquired by the second camera is x2. If the depth calculation is completed with the uncompensated camera calibration parameters, that is, the principal points u1', u2' and the baseline B', the error of the calculated scene depth Z' is large.
- Terminal device shake causes some calibration parameters of the cameras to change: the optical centre of the first camera lens changes from C1' to C1 and its principal point changes from u1' to u1; the optical centre of the second camera lens changes from C2' to C2 (the corresponding lens offset of the second camera is in code), its principal point changes from u2' to u2, and the baseline changes from B' to B2, so u1, u2, and B2 need to be calculated.
- Here a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, a1 × Δ1 ≈ Δx1 is the first image offset, a1 × Δ1 × pixel pitch converts the unit of the first image offset from pixel to mm, a2 is the OIS motor sensitivity calibration parameter of the second camera, Δ2 is the lens offset of the second camera, a2 × Δ2 ≈ Δx2 is the second image offset, and a2 × Δ2 × pixel pitch converts the unit of the second image offset from pixel to mm.
- FIG. 5a and FIG. 5b illustrate how the camera calibration parameter compensation is performed by taking the movement of the camera in one direction as an example; it should be understood that the compensation of the camera calibration parameters can likewise be realized when the two cameras move in two directions, which is not described in detail here.
- Figure 6a is a schematic diagram of an image taken after compensating the calibration parameters of the dual camera, Figure 6b is a schematic diagram of an image taken when the calibration parameters of the dual camera are not compensated, Figure 6c is a partial enlarged view of Figure 6a, and Figure 6d is a partial enlarged view of Figure 6b. It can be seen from Figs. 6a-6d that the image alignment is poor when the dual-camera calibration parameters are not compensated and good after the dual-camera calibration parameters are compensated.
- Figure 7a is a schematic diagram of the depth of the scene when the calibration parameters of the dual camera are not compensated
- Figure 7b is a schematic diagram of the depth of the scene after the calibration parameters of the dual camera are compensated.
- different depth values are represented by different colors, and black indicates that the scene depth cannot be calculated.
- In Figure 7a, the scene depth measured at 1000 mm is 1915.8 mm and the scene depth measured at 300 mm is 344.6 mm; in Figure 7b, the scene depth measured at 1000 mm is 909.6 mm and the scene depth measured at 300 mm is 287.4 mm. It follows that the calculated scene depth values are more accurate after the dual-camera calibration parameters are compensated.
- The scene depth calculation method for a dual-camera terminal provided by the embodiment of the invention solves the problem that the scene depth value is inaccurate due to the parallax problem.
- FIG. 8 is a schematic structural diagram of a scene depth calculation apparatus according to Embodiment 2 of the present invention.
- the scene depth calculation apparatus 800 includes a first acquisition unit 810, a processing unit 820, and a calculation unit 830.
- the first acquiring unit 810 is configured to acquire a lens offset of the camera with the OIS system; wherein the first camera and/or the second camera have an OIS system, and the first camera and the second camera are arranged side by side in the same On the body of the terminal device.
- the processing unit 820 is configured to convert the lens offset into an image offset according to a preset OIS motor sensitivity calibration parameter.
- The calculation unit 830 is configured to calculate the scene depth of the target scene according to the compensated first camera calibration parameter and/or the compensated second camera calibration parameter and the first image and the second image obtained by the first camera and the second camera acquiring the target scene at the same time, wherein the calibration parameter of the first camera is compensated according to the lens offset of the first camera and the calibration parameter of the second camera is compensated according to the lens offset of the second camera.
- The processing unit 820 is specifically configured to convert the lens offset into an image offset according to the formula Δx ≈ α × ΔC, where Δx is the image offset, α is the OIS motor sensitivity calibration parameter, and ΔC is the lens offset.
- The calculation unit 830 is specifically configured to determine the scene depth of the target scene, where Z is the scene depth of the target scene, f is the focal length, a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, pixel pitch is the size of one pixel, a1 × Δ1 ≈ Δx1 is the first image offset, a1 × Δ1 × pixel pitch converts the unit of the first image offset from pixel to mm, after compensation the principal point of the first camera changes from u1' to u1 and the baseline changes from B' to B1, the principal point of the second camera is u2, x1 is the imaging point of the first image, and x2 is the imaging point of the second image.
- The calculation unit 830 is specifically configured to determine the scene depth of the target scene, where a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, a1 × Δ1 ≈ Δx1 is the first image offset, a1 × Δ1 × pixel pitch converts the unit of the first image offset from pixel to mm, a2 is the OIS motor sensitivity calibration parameter of the second camera, Δ2 is the lens offset of the second camera, a2 × Δ2 ≈ Δx2 is the second image offset, a2 × Δ2 × pixel pitch converts the unit of the second image offset from pixel to mm, after compensation the principal point of the first camera changes from u1' to u1 and the principal point of the second camera changes from u2' to u2, x1 is the imaging point of the first image, and x2 is the imaging point of the second image.
- FIG. 9 is a schematic structural diagram of still another scene depth computing apparatus according to Embodiment 2 of the present invention.
- the device may also be a scene depth computing device 900.
- the device 900 may further include: a second acquiring unit 910 and a determining unit 920.
- a second obtaining unit 910 configured to acquire angular velocity information of the terminal device jitter detected by the gyro sensor; convert the angular velocity information into a jitter amplitude of the terminal device, and drive the OIS motor to push the lens of the first camera according to the jitter amplitude And/or the lens of the second camera moves and acquires the lens offset of the first camera and/or the second camera.
- The determining unit 920 is configured to: push the OIS motor through the OIS controller to move the lens to a designated position; take a picture after waiting for the OIS motor to stabilize; and, when a preset number of images have been captured, detect the feature point coordinates of each image and determine the OIS motor sensitivity calibration parameter according to the designated position of the lens and the feature point coordinates of each image.
- FIG. 10 is a schematic structural diagram of a terminal according to Embodiment 3 of the present invention.
- The terminal 1000 includes a camera 1010 (the camera 1010 includes a first camera and a second camera), a processor 1020, a memory 1030, and a system bus; the camera 1010, the processor 1020, and the memory 1030 are connected through the system bus.
- The first camera and the second camera are configured to acquire the target scene at the same time to obtain the first image and the second image respectively; the first camera and/or the second camera has an OIS system, and the first camera and the second camera are arranged side by side on the body of the same terminal device.
- the memory 1020 is configured to store the first image and the second image.
- The processor 1030 is configured to acquire the lens offset of the camera with the OIS motor, convert the lens offset into an image offset according to a preset OIS motor sensitivity calibration parameter, and calculate the scene depth of the target scene according to the compensated first camera calibration parameter and/or the compensated second camera calibration parameter and the first image and the second image acquired from the memory, wherein the calibration parameter of the first camera is compensated according to the lens offset of the first camera and the calibration parameter of the second camera is compensated according to the lens offset of the second camera.
- The OIS system is specifically configured to acquire angular velocity information of terminal device shake detected by the gyro sensor, convert the angular velocity information into a shake amplitude of the terminal device, drive the OIS motor according to the shake amplitude to push the lens of the first camera and/or the lens of the second camera, and acquire the lens offset of the first camera and/or the second camera.
- the processor 1030 is further configured to: push the OIS motor through the OIS controller, move the lens to a designated position; wait for the OIS motor to stabilize and take a picture; and when the captured image reaches a preset number of sheets, detect feature points of each image Coordinates, and determining OIS motor sensitivity calibration parameters according to the specified position of the lens and the feature point coordinates of the respective images.
- the memory 1020 is further configured to store the OIS motor sensitivity calibration parameter.
- The processor 1030 is specifically configured to convert the lens offset into an image offset according to the formula Δx ≈ α × ΔC, where Δx is the image offset, α is the OIS motor sensitivity calibration parameter, and ΔC is the lens offset.
- The processor is specifically configured to determine the scene depth of the target scene, where Z is the scene depth of the target scene, f is the focal length, a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, pixel pitch is the size of one pixel, a1 × Δ1 ≈ Δx1 is the first image offset, a1 × Δ1 × pixel pitch converts the unit of the first image offset from pixel to mm, after compensation the principal point of the first camera changes from u1' to u1 and the baseline changes from B' to B1, the principal point of the second camera is u2, x1 is the imaging point of the first image, and x2 is the imaging point of the second image.
- The processor is specifically configured to determine the scene depth of the target scene, where a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, a1 × Δ1 ≈ Δx1 is the first image offset, a1 × Δ1 × pixel pitch converts the unit of the first image offset from pixel to mm, a2 is the OIS motor sensitivity calibration parameter of the second camera, Δ2 is the lens offset of the second camera, a2 × Δ2 ≈ Δx2 is the second image offset, a2 × Δ2 × pixel pitch converts the unit of the second image offset from pixel to mm, after compensation the principal point of the first camera changes from u1' to u1 and the principal point of the second camera changes from u2' to u2, x1 is the imaging point of the first image, and x2 is the imaging point of the second image.
Abstract
Embodiments of the present invention relate to a scene depth calculation method, a device, and a terminal. The method comprises: obtaining a lens offset of a camera having an OIS system, wherein a first camera and/or a second camera have the OIS system; converting the lens offset into an image offset according to a preset OIS motor sensitivity calibration parameter; and calculating the scene depth of a target scene according to a compensated calibration parameter of the first camera and/or a compensated calibration parameter of the second camera, as well as a first image and a second image obtained by the first camera and the second camera photographing the target scene at the same time, wherein the calibration parameter of the first camera is compensated according to the lens offset of the first camera, and the calibration parameter of the second camera is compensated according to the lens offset of the second camera. The parallax problem is thereby solved, and the scene depth of the target scene calculated using the compensated camera calibration parameters is more accurate.
Description
This application claims priority to the Chinese patent application filed with the Patent Office of the State Intellectual Property Office of China on October 25, 2016, with application No. 201610941102.2 and entitled "Method and Terminal for Image Depth Calculation with Dual Camera Terminals", the entire contents of which are incorporated herein by reference.
The embodiments of the present invention relate to the field of communications, and in particular, to a method, a device, and a terminal for calculating the scene depth of a target scene with a dual-camera terminal device.
Currently, depth calculation by means of a dual camera is being applied to more and more mobile phones, such as the Huawei P9. Optical Image Stabilization (OIS, also commonly referred to as optical anti-shake) is an important means of improving photo quality in low light and is also used on more and more mobile phones. OIS works by moving the lens to compensate for hand shake and thereby stabilize the image.
In the prior art, the image is generally rectified using the calibration parameters of the dual camera so that the left and right images provided by the dual camera are aligned in one direction; the parallax is then calculated and converted into the scene depth. However, once one of the cameras has an OIS system, OIS shifts the lens, which changes the dual-camera calibration parameters and causes parallax problems (positive and negative parallax exist simultaneously, or the images cannot be aligned in one direction), so the calculated scene depth value is inaccurate.
Summary of the invention
The embodiment of the invention provides a scene depth calculation method. When one or both cameras have an OIS system, it solves the problem of inaccurate scene depth values of the target scene caused by the parallax problem (positive and negative parallax existing simultaneously, or the images not being alignable in one direction).
In a first aspect, a scene depth calculation method is provided, the method comprising: acquiring the lens offset of a camera with an OIS system, wherein the first camera and/or the second camera has an OIS system and the first camera and the second camera are arranged side by side on the body of the same terminal device; converting the lens offset into an image offset according to a preset OIS motor sensitivity calibration parameter; and calculating the scene depth of the target scene according to the compensated first camera calibration parameter and/or the compensated second camera calibration parameter and the first image and the second image obtained by the first camera and the second camera acquiring the target scene at the same time, wherein the calibration parameter of the first camera is compensated according to the lens offset of the first camera and the calibration parameter of the second camera is compensated according to the lens offset of the second camera. By compensating for the change of the camera calibration parameters caused by terminal device shake, the parallax problem is solved; the scene depth of the target scene is calculated with the compensated camera calibration parameters, and the calculated scene depth value is more accurate.
With reference to the first aspect, in a first possible implementation of the first aspect, the method further includes: acquiring angular velocity information of terminal device shake detected by a gyro sensor; converting the angular velocity information into a shake amplitude of the terminal device; driving the OIS motor according to the shake amplitude to push the lens of the first camera and/or the lens of the second camera; and acquiring the lens offset of the first camera and/or the second camera. Because the shake time of the terminal device is not consistent with the exposure time, and the shake time is greater than the exposure time, multiple lens offsets are acquired; one lens offset is determined from the multiple lens offsets according to a preset rule, and the determined lens offset is used when the scene depth of the target scene is subsequently calculated.
With reference to the first aspect, in a second possible implementation of the first aspect, the OIS motor sensitivity calibration parameter is determined according to the following steps: pushing the OIS motor through the OIS controller to move the lens to a designated position; taking a photograph after waiting for the OIS motor to stabilize; and, when a preset number of images have been captured, detecting the feature point coordinates of each image and determining the OIS motor sensitivity calibration parameter according to the designated position of the lens and the feature point coordinates of each image. The OIS motor sensitivity calibration parameter is stored in the terminal device before it leaves the factory, so that after shipment, when the scene depth of the target scene is calculated, the stored OIS motor sensitivity calibration parameter is used to convert the lens offset into an image offset.
With reference to the first aspect, in a third possible implementation of the first aspect, the lens offset is converted into an image offset according to the following formula:

Δx ≈ α × ΔC

where Δx is the image offset, α is the OIS motor sensitivity calibration parameter, and ΔC is the lens offset. Converting the lens offset into an image offset through the OIS motor sensitivity calibration parameter keeps the units consistent with the camera calibration parameters.
With reference to the first aspect, in a fourth possible implementation of the first aspect, when the first camera has an OIS system, calculating the scene depth of the target scene according to the compensated first camera calibration parameter and/or the compensated second camera calibration parameter and the first image and the second image obtained by the first camera and the second camera acquiring the target scene at the same time specifically includes determining the scene depth of the target scene, where Z is the scene depth of the target scene, f is the focal length, a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset, pixel pitch is the size of one pixel, a1 × Δ1 ≈ Δx1 is the first image offset, a1 × Δ1 × pixel pitch converts the unit of the first image offset from pixel to mm, after compensation the principal point of the first camera changes from u1' to u1 and the baseline changes from B' to B1, the principal point of the second camera is u2, x1 is the imaging point of the first image, and x2 is the imaging point of the second image.
With reference to the first aspect, in a fifth possible implementation manner of the first aspect, when both the first camera and the second camera have an OIS system, calculating the scene depth of the target scene according to the compensated first camera calibration parameters and/or the compensated second camera calibration parameters and the first image and the second image obtained by the first camera and the second camera respectively acquiring the target scene at the same moment specifically includes:
determining the scene depth of the target scene by the following formula,
where a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, a1×Δ1 ≈ Δx1 is the first image offset, a1×Δ1×pixel pitch converts the first image offset from pixels into millimeters, a2 is the OIS motor sensitivity calibration parameter of the second camera, Δ2 is the lens offset of the second camera, a2×Δ2 ≈ Δx2 is the second image offset, a2×Δ2×pixel pitch converts the second image offset from pixels into millimeters, the principal point of the first camera changes from u1' to u1 after compensation, the principal point of the second camera changes from u2' to u2 after compensation, x1 is the imaging point of the first image, and x2 is the imaging point of the second image.
In a second aspect, an embodiment of the present invention provides a scene depth calculation apparatus, the apparatus including a first acquisition unit, a processing unit, and a calculation unit. The first acquisition unit is configured to acquire the lens offset of a camera that has an OIS system, where the first camera and/or the second camera has an OIS system and the first camera and the second camera are arranged side by side on the body of the same terminal device. The processing unit is configured to convert the lens offset into an image offset according to a preset OIS motor sensitivity calibration parameter. The calculation unit is configured to calculate the scene depth of the target scene according to the compensated first camera calibration parameters and/or the compensated second camera calibration parameters and the first image and the second image obtained by the first camera and the second camera respectively acquiring the target scene at the same moment, where the calibration parameters of the first camera are compensated according to the lens offset of the first camera and the calibration parameters of the second camera are compensated according to the lens offset of the second camera. By compensating for the changes in the camera calibration parameters caused by the shake of the terminal device, the parallax problem is resolved; the scene depth is calculated with the compensated camera calibration parameters, and the calculated scene depth value is more accurate.
With reference to the second aspect, in a first possible implementation manner of the second aspect, the apparatus further includes a second acquisition unit. The second acquisition unit is specifically configured to: acquire angular velocity information of the shake of the terminal device detected by a gyro sensor; convert the angular velocity information into a shake amplitude of the terminal device; drive, according to the shake amplitude, the OIS motor to move the lens of the first camera and/or the lens of the second camera; and acquire the lens offset of the first camera and/or the second camera. Because the shake duration of the terminal device differs from its exposure time and the shake duration is longer than the exposure time, multiple lens offsets are acquired; one lens offset is selected from them according to a preset rule and is used when the scene depth of the target scene is subsequently calculated.
With reference to the second aspect, in a second possible implementation manner of the second aspect, the apparatus further includes a determining unit. The determining unit is specifically configured to: push the OIS motor through the OIS controller to move the lens to a specified position; take a photograph after the OIS motor has stabilized; and, when a preset number of images have been captured, detect the feature point coordinates of each image and determine the OIS motor sensitivity calibration parameter according to the specified positions of the lens. The OIS motor sensitivity calibration parameter is stored in the terminal device before it leaves the factory, so that after shipment the stored parameter can be used to convert the lens offset into an image offset when calculating the scene depth of the target scene.
With reference to the second aspect, in a third possible implementation manner of the second aspect, the processing unit is specifically configured to convert the lens offset into an image offset according to the following formula:
Δx ≈ α × ΔC
where Δx is the image offset, α is the OIS motor sensitivity calibration parameter, and ΔC is the lens offset. Converting the lens offset into an image offset by means of the OIS motor sensitivity calibration parameter keeps the result consistent with the units of the camera calibration parameters.
With reference to the second aspect, in a fourth possible implementation manner of the second aspect, when the first camera has an OIS system, the calculation unit is specifically configured to determine the scene depth of the target scene by the following formula,
where Z is the scene depth of the target scene, f is the focal length, a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset, pixel pitch is the size of one pixel, a1×Δ1 ≈ Δx1 is the first image offset, a1×Δ1×pixel pitch converts the first image offset from pixels into millimeters, the principal point of the first camera changes from u1' to u1 after compensation, the baseline changes from B' to B1, the principal point of the second camera is u2, x1 is the imaging point of the first image, and x2 is the imaging point of the second image.
With reference to the second aspect, in a fifth possible implementation manner of the second aspect, when both the first camera and the second camera have an OIS system, the calculation unit is specifically configured to determine the scene depth of the target scene by the following formula,
where a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, a1×Δ1 ≈ Δx1 is the first image offset, a1×Δ1×pixel pitch converts the first image offset from pixels into millimeters, a2 is the OIS motor sensitivity calibration parameter of the second camera, Δ2 is the lens offset of the second camera, a2×Δ2 ≈ Δx2 is the second image offset, a2×Δ2×pixel pitch converts the second image offset from pixels into millimeters, the principal point of the first camera changes from u1' to u1 after compensation, the principal point of the second camera changes from u2' to u2 after compensation, x1 is the imaging point of the first image, and x2 is the imaging point of the second image.
In a third aspect, an embodiment of the present invention provides a terminal. The terminal includes a first camera and a second camera, which are at least configured to acquire a target scene at the same moment to obtain a first image and a second image respectively, where the first camera and/or the second camera has an OIS system and the first camera and the second camera are arranged side by side on the body of the same terminal device; a memory configured to store the first image and the second image; and a processor configured to acquire the lens offset of the camera that has an OIS motor, convert the lens offset into an image offset according to a preset OIS motor sensitivity calibration parameter, and calculate the scene depth of the target scene according to the compensated first camera calibration parameters and/or the compensated second camera calibration parameters and the first image and the second image obtained from the memory, where the calibration parameters of the first camera are compensated according to the lens offset of the first camera and the calibration parameters of the second camera are compensated according to the lens offset of the second camera. The lens offset is used to compensate for the changes in the camera calibration parameters caused by the shake of the terminal device, which resolves the parallax problem; the scene depth of the target scene is calculated with the compensated camera calibration parameters, and the calculated scene depth value is more accurate.
With reference to the third aspect, in a first possible implementation manner of the third aspect, the OIS system is specifically configured to: acquire angular velocity information of the shake of the terminal device detected by a gyro sensor; convert the angular velocity information into a shake amplitude of the terminal device; drive, according to the shake amplitude, the OIS motor to move the lens of the first camera and/or the lens of the second camera; and acquire the lens offset of the first camera and/or the second camera. Because the shake duration of the terminal device differs from its exposure time and the shake duration is longer than the exposure time, multiple lens offsets are acquired; one lens offset is selected from them according to a preset rule and is used when the scene depth of the target scene is subsequently calculated.
With reference to the third aspect, in a second possible implementation manner of the third aspect, the processor is further configured to: push the OIS motor through the OIS controller to move the lens to a specified position; take a photograph after the OIS motor has stabilized; and, when a preset number of images have been captured, detect the feature point coordinates of each image and determine the OIS motor sensitivity calibration parameter according to the specified positions of the lens and the feature point coordinates of the images. The memory is further configured to store the OIS motor sensitivity calibration parameter. The parameter is stored in the terminal before it leaves the factory, so that after shipment the stored OIS motor sensitivity calibration parameter can be used to convert the lens offset into an image offset when calculating the scene depth of the target scene.
With reference to the third aspect, in a third possible implementation manner of the third aspect, the processor is specifically configured to convert the lens offset into an image offset according to the following formula:
Δx ≈ α × ΔC
where Δx is the image offset, α is the OIS motor sensitivity calibration parameter, and ΔC is the lens offset. Converting the lens offset into an image offset by means of the OIS motor sensitivity calibration parameter keeps the result consistent with the units of the camera calibration parameters.
With reference to the third aspect, in a fourth possible implementation manner of the third aspect, when the first camera has an OIS system, the processor is specifically configured to determine the scene depth of the target scene by the following formula,
where Z is the scene depth of the target scene, f is the focal length, a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset, pixel pitch is the size of one pixel, a1×Δ1 ≈ Δx1 is the first image offset, a1×Δ1×pixel pitch converts the first image offset from pixels into millimeters, the principal point of the first camera changes from u1' to u1 after compensation, the baseline changes from B' to B1, the principal point of the second camera is u2, x1 is the imaging point of the first image, and x2 is the imaging point of the second image.
With reference to the third aspect, in a fifth possible implementation manner of the third aspect, when both the first camera and the second camera have an OIS system, the processor is specifically configured to determine the scene depth of the target scene by the following formula,
where a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, a1×Δ1 ≈ Δx1 is the first image offset, a1×Δ1×pixel pitch converts the first image offset from pixels into millimeters, a2 is the OIS motor sensitivity calibration parameter of the second camera, Δ2 is the lens offset of the second camera, a2×Δ2 ≈ Δx2 is the second image offset, a2×Δ2×pixel pitch converts the second image offset from pixels into millimeters, the principal point of the first camera changes from u1' to u1 after compensation, the principal point of the second camera changes from u2' to u2 after compensation, x1 is the imaging point of the first image, and x2 is the imaging point of the second image.
The technical solutions of the embodiments of the present invention are described in further detail below with reference to the accompanying drawings and embodiments.
Figure 1 is a block diagram of the working principle of the OIS system;
Figure 2 is a block diagram of the depth calculation system;
Figure 3 is a flowchart of the scene depth calculation method provided in Embodiment 1 of the present invention;
Figure 4a is a schematic diagram of a lens offset scenario;
Figure 4b is a schematic diagram of the change in imaging before and after the lens is offset;
Figure 4c is a flowchart for determining the OIS motor sensitivity calibration parameter;
Figure 5a is a first schematic diagram of scene depth calculation provided by an embodiment of the present invention;
Figure 5b is a further schematic diagram of scene depth calculation provided by an embodiment of the present invention;
Figure 6a is a schematic diagram of an image captured with the dual-camera calibration parameters compensated;
Figure 6b is a schematic diagram of an image captured without compensating the dual-camera calibration parameters;
Figure 6c is a partial enlarged view of Figure 6a;
Figure 6d is a partial enlarged view of Figure 6b;
Figure 7a is a schematic diagram of the scene depth obtained without compensating the dual-camera calibration parameters;
Figure 7b is a schematic diagram of the scene depth obtained after compensating the dual-camera calibration parameters;
Figure 8 is a schematic structural diagram of the scene depth calculation apparatus provided in Embodiment 2 of the present invention;
Figure 9 is a schematic structural diagram of a further scene depth calculation apparatus provided in Embodiment 2 of the present invention;
Figure 10 is a schematic structural diagram of the terminal provided in Embodiment 3 of the present invention.
In the present invention, the terminal device involved may be a device having dual cameras, including but not limited to a camera (for example a digital camera), a video camera, a mobile phone (for example a smartphone), a tablet computer (Pad), a personal digital assistant (PDA), a portable device (for example a portable computer), a wearable device, and the like; this is not specifically limited in the embodiments of the present invention.
Referring to Figure 1, the terminal device may be a mobile phone; the embodiments of the present invention are described below by taking a mobile phone as an example.
At present, more and more mobile phones use dual cameras to calculate scene depth. A dual camera imitates the principle of human binocular vision to perceive distance: an object is observed from two points, images are acquired from the two viewpoints, and, based on the pixel correspondences between the images, the offset between matched pixels is computed by triangulation to obtain the scene depth of the object. When one or both of the two cameras are equipped with an OIS system, the OIS causes the lens to shift, which changes the dual-camera calibration parameters, leads to parallax problems, and in turn makes the scene depth calculation inaccurate. It is therefore necessary to compensate the dual-camera calibration parameters so that the scene depth of the target scene can be calculated accurately.
Figure 1 is a block diagram of the working principle of the OIS system. As shown in Figure 1, the terminal device includes an OIS system 100 and an image signal processor (Image Signal Processor, ISP) 110. The OIS system 100 includes an OIS controller 120, a gyro sensor 130, a Hall sensor 140, a motor 150, and a camera 160.
The camera 160 includes a first camera and a second camera. The first camera and the second camera may be arranged side by side on the front of the terminal device or side by side on its back, and may be arranged horizontally or vertically. The first camera and/or the second camera has an OIS system, and the first camera and the second camera each have a lens (not shown in Figure 1).
The Hall sensor 140 is a magnetic field sensor that measures displacement on the basis of the Hall effect, and is used to acquire the lens offset of the camera that has an OIS system, that is, the lens offset of the first camera and/or the second camera.
The gyro sensor 130 is a positioning system based on the movement of the terminal device in free space, and is used to acquire angular velocity information when the terminal device shakes.
The OIS controller 120 acquires the angular velocity information from the gyro sensor 130, converts the angular velocity information into the shake amplitude of the terminal device, and sends the shake amplitude to the motor 150 as a reference signal.
The motor 150 may be an OIS motor, and is used to move the lens of the camera that has an OIS system according to the shake amplitude so as to keep the image sharp. The movement is in the X and/or Y direction, where the Y direction is a direction within the plane of the line connecting the optical center of the lens and the focal point, and the X direction is the direction that passes through the optical center of the lens and is perpendicular to the Y direction.
The OIS controller 120 also obtains, from the first camera and the second camera, the first image and the second image acquired of the target scene at the same moment.
The ISP 110 stores the lens offset, the first image, and the second image obtained from the OIS controller 120.
The working principle of the OIS system is described in detail below with reference to the components in Figure 1.
When preparing to shoot, the terminal device is initialized; normally, once preparation is complete, the OIS controller 120 controls the shutter to acquire an image.
During shooting, the terminal device shakes. The OIS controller 120 reads the angular velocity information detected by the gyro sensor 130, converts it into the shake amplitude of the terminal device, and sends the amplitude to the OIS motor as a reference signal. The OIS motor moves the lens of the camera that has an OIS system according to the shake amplitude, preventing the captured image from being blurred by the shake of the terminal device and keeping the image sharp. This movement may be a movement of the lens of the first camera in the X and/or Y direction and/or a movement of the lens of the second camera in the X and/or Y direction. The OIS controller 120 reads the lens offset of the camera that has an OIS system as detected by the Hall sensor 140, that is, the lens offset of the first camera and/or the second camera, obtains the captured images from the cameras, that is, the first image and the second image acquired of the target scene by the first camera and the second camera at the same moment, and sends the lens offset and the captured images to the ISP 110. The ISP 110 stores the lens offset and the first image and second image captured by the cameras.
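By way of illustration only, a minimal Python sketch of this read-out loop is given below. The gain used to turn gyro angular velocity into a lens drive amplitude, the sampling interval, and all function names are assumptions made purely for the sketch; the real conversion is performed inside the OIS controller.

```python
from dataclasses import dataclass
from typing import List, Tuple

GYRO_GAIN = 0.8           # assumed gain: deg/s of shake -> lens drive in code units
SAMPLE_INTERVAL_MS = 2.0  # assumed Hall-sensor sampling interval

@dataclass
class OisSample:
    t_ms: float
    offset_code: Tuple[float, float]  # lens offset (x, y) reported back by the Hall sensor

def shake_amplitude(angular_velocity_dps: Tuple[float, float]) -> Tuple[float, float]:
    """Convert a gyro reading (deg/s about x and y) into a lens drive amplitude in code units."""
    wx, wy = angular_velocity_dps
    return (GYRO_GAIN * wx, GYRO_GAIN * wy)

def record_ois_samples(gyro_trace: List[Tuple[float, float]]) -> List[OisSample]:
    """Drive the (stubbed) OIS motor for each gyro reading and log the resulting lens offsets."""
    samples = []
    for i, reading in enumerate(gyro_trace):
        target = shake_amplitude(reading)
        # On real hardware the Hall sensor would be read back here; the stub assumes
        # the lens tracks the commanded target exactly.
        samples.append(OisSample(t_ms=i * SAMPLE_INTERVAL_MS, offset_code=target))
    return samples

if __name__ == "__main__":
    trace = [(1.5, -0.3), (2.0, -0.1), (0.7, 0.4)]  # made-up gyro readings in deg/s
    for s in record_ois_samples(trace):
        print(f"t={s.t_ms:4.1f} ms  lens offset (code) = {s.offset_code}")
```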
Because the shake duration of the terminal device is generally longer than its exposure time (for example, the shake lasts 30 ms while the exposure time is 2 ms), the Hall sensor 140 acquires multiple lens offsets during one shake, 15 in this example. The OIS controller 120 reads these 15 lens offsets from the Hall sensor 140, selects one lens offset from them according to a preset rule, and in the subsequent process uses the selected lens offset as the lens offset referred to in this description when calculating the scene depth of the target scene.
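The preset rule itself is not spelled out here; one hypothetical rule, sketched below in Python, is to take the Hall-sensor sample whose timestamp is closest to the midpoint of the exposure window (averaging over the exposure window would be another plausible choice). All names and values are illustrative.

```python
from typing import List, Tuple

def select_lens_offset(samples: List[Tuple[float, float]],
                       exposure_start_ms: float,
                       exposure_ms: float) -> float:
    """samples: (timestamp_ms, lens_offset_code) pairs recorded during one shake."""
    mid = exposure_start_ms + exposure_ms / 2.0
    _, offset = min(samples, key=lambda s: abs(s[0] - mid))
    return offset

if __name__ == "__main__":
    # 15 made-up samples over a 30 ms shake; a 2 ms exposure starts at 10 ms
    samples = [(2.0 * i, 5.0 + 0.3 * i) for i in range(15)]
    print(select_lens_offset(samples, exposure_start_ms=10.0, exposure_ms=2.0))
```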
Figure 2 is a block diagram of the depth calculation system. As shown in Figure 2, the depth calculation system includes the ISP 110 and a depth calculation module 210. The depth calculation module 210 obtains the preset calibration information from the ISP 110, reads from the ISP 110 the stored OIS information and the first image and second image acquired of the target scene by the first camera and the second camera at the same moment, calculates the scene depth of the target scene, and outputs a disparity map/depth map. The calibration information consists of the camera calibration parameters obtained at initialization, such as the focal length, baseline, optical center, and principal point; the OIS information is the lens offset.
Specifically, the depth calculation module obtains the lens offset. Because the lens offset is expressed in code units while the scene depth is expressed in millimeters (mm), the two are inconsistent. The lens offset therefore needs to be converted into an image offset, in pixels, according to the OIS motor sensitivity calibration parameter, and the lens offset is then used to compensate the camera calibration parameters. The scene depth of the target scene is calculated from the compensated camera calibration parameters, and the scene depth value calculated in this way is more accurate.
Figure 3 is a flowchart of the scene depth calculation method provided in Embodiment 1 of the present invention. As shown in Figure 3, the method includes:
S310: Acquire the lens offset (see ΔC in Figure 4b, Δ1 in Figure 5a, and Δ1 and Δ2 in Figure 5b).
Specifically, the lens offset can be obtained through detection by the Hall sensor.
S320: Determine whether the lens offset is abnormal; if it is not abnormal, go to S340; if it is abnormal, go to S380.
Specifically, if the lens offset is greater than a preset threshold, the lens offset is abnormal; if the lens offset is not greater than the preset threshold, the lens offset is not abnormal.
S340: Convert the lens offset into an image offset (see Δx in Figure 4b). Before S340 is performed, S330 needs to be performed first, namely inputting the OIS motor sensitivity calibration parameter, that is, the image offset caused by a unit lens offset. Each camera that has an OIS system has its own OIS motor sensitivity calibration parameter, which is stored in the terminal device before it leaves the factory. Using the OIS motor sensitivity calibration parameter, the lens offset can be converted into the image offset.
S350: Compensate the dual-camera calibration parameters.
Specifically, because the terminal device shakes when the photograph is taken, some of the camera calibration parameters (such as the optical center, the principal point, and the baseline) change. The image offset is calculated from the lens offset, the changed camera calibration parameters are compensated with the lens offset, and the values of the compensated camera calibration parameters are determined. Referring to Figure 4b, the optical center changes from C' to C and the principal point changes from u' to u. Referring to Figure 5a, the optical center of the first camera lens changes from C1' to C1, the principal point changes from u1' to u1, and the baseline changes from B' to B1. Referring to Figure 5b, the optical center of the first camera lens changes from C1' to C1, the principal point changes from u1' to u1, the optical center of the second camera lens changes from C2' to C2, the principal point changes from u2' to u2, and the baseline changes from B' to B2.
S380: Calculate the scene depth of the target scene (see Formula 2 and Formula 4 for the calculation). Before S380 is performed, S360 (inputting the first image) and S370 (inputting the second image) need to be performed first.
Specifically, the scene depth of the target scene is determined from the compensated camera calibration parameters, the first image, and the second image, where the first image and the second image are obtained by the first camera and the second camera respectively acquiring the target scene at the same moment.
S390: Output the scene depth of the target scene.
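A compact Python sketch of this S310-S390 flow, for the case where only the first camera has an OIS system (Figure 5a), is given below. The depth formula itself appears in the published document only as an embedded equation image, so the expression used here is a reconstruction from the symbol definitions in the embodiment that follows, and the sign conventions, threshold, and numeric values are assumptions.

```python
def depth_with_ois_compensation(x1, x2, u1_prime, u2, B_prime_mm, f_mm,
                                a1, delta1_code, pixel_pitch_mm,
                                max_offset_code=200.0):
    """Scene depth (mm) with the first camera's calibration compensated for its lens offset."""
    # S320: treat an implausibly large lens offset as abnormal and skip compensation
    if abs(delta1_code) > max_offset_code:
        delta1_code = 0.0

    # S330/S340: lens offset (code) -> image offset (pixels) via the stored sensitivity a1
    dx1_px = a1 * delta1_code

    # S350: compensate the first camera's principal point and the baseline
    u1 = u1_prime + dx1_px
    B1_mm = B_prime_mm + dx1_px * pixel_pitch_mm

    # S360-S380: triangulate; disparity is taken relative to the compensated principal points
    disparity_px = (x1 - u1) - (x2 - u2)
    return f_mm * B1_mm / (disparity_px * pixel_pitch_mm)

if __name__ == "__main__":
    # S390: print the resulting depth for made-up matched image points x1, x2
    Z = depth_with_ois_compensation(x1=2100.0, x2=1980.0, u1_prime=2000.0, u2=2000.0,
                                    B_prime_mm=10.0, f_mm=3.8,
                                    a1=0.05, delta1_code=40.0, pixel_pitch_mm=0.0011)
    print(f"estimated scene depth: {Z:.1f} mm")
```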
How the OIS motor sensitivity calibration parameter is determined is described below with reference to Figures 4a-4c.
Figure 4a is a schematic diagram of a lens offset scenario. As shown in Figure 4a, taking the case where the OIS motor moves the lens of one camera as an example, the OIS motor moves the lens from the position indicated by the dashed ellipse to a specified position (xi, yi), and an image of a fixed chart is captured at each position. The image sensor converts the optical image acquired by the camera into an electronic signal. The lens offset can be determined from the positions of the lens before and after the movement, and the image offset can be determined from the two images of the fixed chart.
Figure 4b is a schematic diagram of the change in imaging before and after the lens is offset. As shown in Figure 4b, taking the case where the OIS motor moves the lens of one camera in the X direction as an example, before the lens moves the camera calibration parameters are: focal length f, optical center C', and principal point u'. After the lens moves, some of the camera calibration parameters change: the optical center changes from C' to C and the principal point changes from u' to u. The imaging points before and after the lens movement are x' and x respectively; ΔC is the distance between the optical center C' and the optical center C, that is, the lens offset, in code units; Δx is the distance between the imaging point x' and the imaging point x, that is, the image offset, in pixels. From the lens offset and the image offset, the image offset caused by a unit lens offset, that is, the OIS motor sensitivity calibration parameter, can be measured. With this OIS motor sensitivity calibration parameter, the actual image offset can be calculated from the lens offset in subsequent shooting, so that the camera calibration parameters can be compensated when the terminal shakes.
When determining the OIS motor sensitivity calibration parameter, if the relationship between the lens offset ΔC and the image offset Δx is assumed to be linear, the OIS motor sensitivity calibration parameter is α ≈ Δx/ΔC, with α in pixels/code.
However, ΔC and Δx are not strictly linear; a higher-order model, for example second order, third order or higher, can be used and more images captured to obtain a more accurate OIS motor sensitivity calibration parameter.
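As a sketch of how such a fit could be carried out, the following Python snippet estimates α from a set of commanded lens offsets and the measured image offsets; a first-order fit reproduces α ≈ Δx/ΔC, and passing a higher degree gives the higher-order model mentioned above. The data values are made up for illustration.

```python
import numpy as np

def fit_motor_sensitivity(lens_offsets_code, image_offsets_px, deg=1):
    """Fit a polynomial mapping lens offset (code) -> image offset (pixels)."""
    return np.polyfit(lens_offsets_code, image_offsets_px, deg)

if __name__ == "__main__":
    dC = np.array([-100.0, -50.0, 0.0, 50.0, 100.0])  # commanded lens offsets (code)
    dx = np.array([-5.1, -2.4, 0.0, 2.6, 5.2])        # measured image offsets (pixels)
    coeffs = fit_motor_sensitivity(dC, dx, deg=1)
    print("alpha ~= %.4f pixels/code" % coeffs[0])
    print("predicted dx at 40 code: %.2f pixels" % np.polyval(coeffs, 40.0))
```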
Figure 4c is a flowchart for determining the OIS motor sensitivity calibration parameter. As shown in Figure 4c, it includes:
S410: Push the OIS motor through the OIS controller to move the lens to a specified position (xi, yi).
Specifically, this may be moving the coordinates of the lens from the position (x1, y1) to the position (xi, yi).
S420: Wait for the OIS motor to stabilize and then take a photograph.
S430: Determine whether the number of captured images has reached a preset number; if not, go to S410; if so, go to S440.
Specifically, if too few images are captured, the determined OIS motor sensitivity calibration parameter may have a relatively large error; multiple images can be captured to improve the accuracy of the OIS motor sensitivity calibration parameter.
In addition, there is not necessarily a strictly linear relationship between the lens offset and the image offset. To determine the image offset under a given lens offset more accurately, it is necessary to measure the image offset under different lens offsets.
S440: Detect the feature point coordinates in each image and, together with the lens positions (xi, yi), determine the OIS motor sensitivity calibration parameter.
Specifically, the feature point coordinates in the images captured before and after the lens movement are detected separately to obtain the image offset. The lens offset is determined from the distance the lens moved, and the OIS motor sensitivity calibration parameter is determined from the image offset and the lens offset.
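If the fixed chart is assumed to be a checkerboard, one way to measure the image offset for a given lens position is to average the displacement of the detected corners relative to a reference image taken with the lens at rest. The OpenCV-based sketch below is only an illustration of that idea; the chart type, pattern size, and file names are assumptions.

```python
import cv2
import numpy as np

def mean_corner_shift(ref_img, shifted_img, pattern_size=(9, 6)):
    """Mean (dx, dy) corner displacement in pixels between two images of the same chart."""
    ok1, c1 = cv2.findChessboardCorners(ref_img, pattern_size)
    ok2, c2 = cv2.findChessboardCorners(shifted_img, pattern_size)
    if not (ok1 and ok2):
        raise RuntimeError("checkerboard corners not found in one of the images")
    # corner arrays have shape (N, 1, 2); average over all corners
    return tuple(np.mean(c2 - c1, axis=(0, 1)))

# Example usage (file names are placeholders):
# ref = cv2.imread("chart_lens_at_rest.png", cv2.IMREAD_GRAYSCALE)
# moved = cv2.imread("chart_lens_at_50code.png", cv2.IMREAD_GRAYSCALE)
# print(mean_corner_shift(ref, moved))
```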
S450: Store the OIS motor sensitivity calibration parameter in the terminal device.
Specifically, the OIS motor sensitivity calibration parameter is stored in the terminal device before it leaves the factory, so that after shipment, when the terminal device acquires a target scene, the lens offset is converted into the image offset according to the pre-stored OIS motor sensitivity calibration parameter and the camera calibration parameters are compensated with the lens offset, making the calculated scene depth of the target scene more accurate.
How the depth calculation is performed is further described below with reference to Figures 5a, 5b, and 6a-6c.
In one example, as shown in Figure 5a, which is a schematic diagram of scene depth calculation provided by an embodiment of the present invention, take the case where the first camera has an OIS system, the second camera does not, and the lens of the first camera moves in the X direction. The lens of a camera is a convex lens; an incident ray and its corresponding, parallel, emergent ray form a conjugate pair, and the intersection of the line connecting the incident point and the exit point with the principal optical axis is called the focal point of the convex lens. The distance from the focal point to the imaging plane, such as the film or CCD, is called the focal length; the point at the center of the lens is called the optical center; the intersection of the principal line of sight with the imaging plane, such as the film or CCD, is called the principal point; and the distance between the lens of the first camera and the lens of the second camera is called the baseline. At initialization, the following calibration parameters of the cameras are known: the focal length is f, the optical center of the lens of the first camera is C1' and its principal point is u1', the optical center of the lens of the second camera is C2 and its principal point is u2, and the baseline is B'. When the shutter is pressed, the imaging point of the first image acquired by the first camera is x1 and the imaging point of the second image acquired by the second camera is x2.
At this point, using the principle of similar triangles, the calculated scene depth Z' is:
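The equation referred to here appears in the published text only as an embedded image. A plausible reconstruction from the quantities just defined, assuming the disparity is measured in pixels relative to the principal points and converted to millimeters by the pixel pitch, is:

$$Z' \approx \frac{f \cdot B'}{\big[(x_1 - u_1') - (x_2 - u_2)\big]\cdot \text{pixel pitch}}$$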
In the prior art, because the change in some of the camera calibration parameters caused by the shake of the terminal device when the shutter is pressed is not taken into account, the scene depth is calculated with the uncompensated camera calibration parameters, namely the principal point u1' and the baseline B'. The scene depth Z' calculated in this way has a large error.
When the shutter is pressed, the shake of the terminal device changes some of the camera calibration parameters. According to the shake amplitude, the OIS motor moves the lens of the first camera: the optical center of the first camera lens changes from C1' to C1 (the corresponding lens offset of the first camera is expressed in code units), the principal point changes from u1' to u1, and the baseline changes from B' to B1, so u1 and B1 need to be calculated. The changed camera calibration parameters are compensated with the lens offset to determine the values of the compensated camera calibration parameters, and the scene depth Z calculated using the principle of similar triangles is:
where a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, pixel pitch is the size of one pixel, a1×Δ1 ≈ Δx1 is the first image offset in pixels, and a1×Δ1×pixel pitch converts the first image offset from pixels into millimeters.
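The compensated-depth equation is likewise an embedded image in the published text. Based on the definitions above, with the principal point compensated as u1 = u1' + a1×Δ1 and the baseline as B1 = B' + a1×Δ1×pixel pitch (the sign convention being an assumption), a plausible reconstruction is:

$$Z \approx \frac{f \cdot \big(B' + a_1\Delta_1\cdot \text{pixel pitch}\big)}{\big[(x_1 - u_1' - a_1\Delta_1) - (x_2 - u_2)\big]\cdot \text{pixel pitch}}$$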
In another example, as shown in Figure 5b, which is a further schematic diagram of scene depth calculation provided by an embodiment of the present invention, take the case where both the first camera and the second camera have an OIS system and the lenses of the first camera and the second camera both move in the X direction. At initialization, the following calibration parameters of the cameras are known: the focal length is f, the optical center of the lens of the first camera is C1' and its principal point is u1', the optical center of the lens of the second camera is C2' and its principal point is u2', and the baseline is B'. When the shutter is pressed, the imaging point of the first image acquired by the first camera is x1 and the imaging point of the second image acquired by the second camera is x2.
At this point, using the principle of similar triangles, the calculated scene depth Z' is:
In the prior art, because the change in some of the camera calibration parameters caused by the shake of the terminal device when the shutter is pressed is not taken into account, the depth is calculated with the uncompensated camera calibration parameters, namely the principal points u1' and u2' and the baseline B'. The scene depth Z' calculated in this way has a large error.
When the shutter is pressed, the shake of the terminal device changes some of the camera calibration parameters: the optical center of the first camera lens changes from C1' to C1, the principal point changes from u1' to u1, the optical center of the second camera lens changes from C2' to C2 (the corresponding lens offset of the second camera is expressed in code units), the principal point changes from u2' to u2, and the baseline changes from B' to B2, so u1, u2, and B2 need to be calculated. The changed camera calibration parameters are compensated with the lens offsets (the lens offset of the first camera and the lens offset of the second camera) to determine the values of the compensated camera calibration parameters, and the scene depth Z calculated using the principle of similar triangles is:
where a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, a1×Δ1 ≈ Δx1 is the first image offset, a1×Δ1×pixel pitch converts the first image offset from pixels into millimeters, a2 is the OIS motor sensitivity calibration parameter of the second camera, Δ2 is the lens offset of the second camera, a2×Δ2 ≈ Δx2 is the second image offset, and a2×Δ2×pixel pitch converts the second image offset from pixels into millimeters.
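As with the preceding formulas, the equation itself is an embedded image. Assuming the compensation u1 = u1' + a1×Δ1, u2 = u2' + a2×Δ2, and B2 = B' + (a1×Δ1 − a2×Δ2)×pixel pitch (both the signs and the form of the baseline correction are assumptions), a plausible reconstruction is:

$$Z \approx \frac{f \cdot \big(B' + (a_1\Delta_1 - a_2\Delta_2)\cdot \text{pixel pitch}\big)}{\big[(x_1 - u_1' - a_1\Delta_1) - (x_2 - u_2' - a_2\Delta_2)\big]\cdot \text{pixel pitch}}$$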
Although Figures 5a and 5b take lens movement in a single direction as an example to explain how the camera calibration parameters are compensated, it should be appreciated that compensation of the camera calibration parameters when the two cameras move in two directions can be achieved in the same way, and details are not repeated here.
Figure 6a is a schematic diagram of an image captured with the dual-camera calibration parameters compensated; Figure 6b is a schematic diagram of an image captured without compensating the dual-camera calibration parameters; Figure 6c is a partial enlarged view of Figure 6a; Figure 6d is a partial enlarged view of Figure 6b. It can be seen from Figures 6a-6d that without compensating the dual-camera calibration parameters the images align poorly, whereas after compensating the dual-camera calibration parameters the images align well.
Figure 7a is a schematic diagram of the scene depth obtained without compensating the dual-camera calibration parameters; Figure 7b is a schematic diagram of the scene depth obtained after compensating the dual-camera calibration parameters. In Figures 7a and 7b, different depth values are shown in different colors, and black indicates that the scene depth cannot be calculated. In Figure 7a, the scene depth measured at 1000 mm is 1915.8 mm and the scene depth measured at 300 mm is 344.6 mm. In Figure 7b, the scene depth measured at 1000 mm is 909.6 mm and the scene depth measured at 300 mm is 287.4 mm. It follows that the scene depth values calculated after compensating the dual-camera calibration parameters are more accurate.
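Expressed as relative errors (simple arithmetic on the values reported above, not additional measurement data), the uncompensated results deviate by roughly (1915.8 − 1000)/1000 ≈ 92% at 1000 mm and (344.6 − 300)/300 ≈ 15% at 300 mm, whereas the compensated results deviate by only about (1000 − 909.6)/1000 ≈ 9% and (300 − 287.4)/300 ≈ 4%.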
Applying the scene depth calculation method for a terminal with dual cameras provided by the embodiments of the present invention solves the problem of inaccurate scene depth values caused by the parallax problem.
Figure 8 is a schematic structural diagram of the scene depth calculation apparatus provided in Embodiment 2 of the present invention. As shown in Figure 8, the scene depth calculation apparatus 800 includes a first acquisition unit 810, a processing unit 820, and a calculation unit 830.
The first acquisition unit 810 is configured to acquire the lens offset of the camera that has an OIS system, where the first camera and/or the second camera has an OIS system and the first camera and the second camera are arranged side by side on the body of the same terminal device.
The processing unit 820 is configured to convert the lens offset into an image offset according to a preset OIS motor sensitivity calibration parameter.
The calculation unit 830 is configured to calculate the scene depth according to the compensated first camera calibration parameters and/or the compensated second camera calibration parameters and the acquired first image and second image obtained by the first camera and the second camera respectively acquiring the target scene at the same moment, where the calibration parameters of the first camera are compensated according to the lens offset of the first camera and the calibration parameters of the second camera are compensated according to the lens offset of the second camera.
Specifically, the processing unit 820 is configured to convert the lens offset into an image offset according to the following formula:
Δx ≈ α × ΔC
where Δx is the image offset, α is the OIS motor sensitivity calibration parameter, and ΔC is the lens offset.
When the first camera has an OIS system and the second camera does not, the calculation unit 830 is specifically configured to determine the scene depth of the target scene by the following formula,
where Z is the scene depth of the target scene, f is the focal length, a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset, pixel pitch is the size of one pixel, a1×Δ1 ≈ Δx1 is the first image offset, a1×Δ1×pixel pitch converts the first image offset from pixels into millimeters, the principal point of the first camera changes from u1' to u1 after compensation, the baseline changes from B' to B1, the principal point of the second camera is u2, x1 is the imaging point of the first image, and x2 is the imaging point of the second image.
When both the first camera and the second camera have an OIS system, the calculation unit 830 is specifically configured to determine the scene depth of the target scene by the following formula,
where a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, a1×Δ1 ≈ Δx1 is the first image offset, a1×Δ1×pixel pitch converts the first image offset from pixels into millimeters, a2 is the OIS motor sensitivity calibration parameter of the second camera, Δ2 is the lens offset of the second camera, a2×Δ2 ≈ Δx2 is the second image offset, a2×Δ2×pixel pitch converts the second image offset from pixels into millimeters, the principal point of the first camera changes from u1' to u1 after compensation, the principal point of the second camera changes from u2' to u2 after compensation, x1 is the imaging point of the first image, and x2 is the imaging point of the second image.
Figure 9 is a schematic structural diagram of a further scene depth calculation apparatus provided in Embodiment 2 of the present invention. As shown in Figure 9, the apparatus may also be a scene depth calculation apparatus 900, which may further include a second acquisition unit 910 and a determining unit 920.
The second acquisition unit 910 is configured to: acquire angular velocity information of the shake of the terminal device detected by the gyro sensor; convert the angular velocity information into the shake amplitude of the terminal device; drive, according to the shake amplitude, the OIS motor to move the lens of the first camera and/or the lens of the second camera; and acquire the lens offset of the first camera and/or the second camera.
The determining unit 920 is configured to: push the OIS motor through the OIS controller to move the lens to a specified position; take a photograph after the OIS motor has stabilized; and, when a preset number of images have been captured, detect the feature point coordinates of each image and determine the OIS motor sensitivity calibration parameter according to the specified positions of the lens.
Figure 10 is a schematic structural diagram of the terminal provided in Embodiment 3 of the present invention. As shown in Figure 10, the terminal 1000 includes a camera 1010 (the camera 1010 includes a first camera and a second camera), a processor 1020, a memory 1030, and a system bus; the camera 1010, the processor 1020, and the memory 1030 are connected through the system bus.
The first camera and the second camera are at least configured to acquire a target scene at the same moment to obtain a first image and a second image respectively, where the first camera and/or the second camera has an OIS system and the first camera and the second camera are arranged side by side on the body of the same terminal device.
The memory 1020 is configured to store the first image and the second image.
The processor 1030 is configured to acquire the lens offset of the camera that has an OIS motor, convert the lens offset into an image offset according to a preset OIS motor sensitivity calibration parameter, and calculate the scene depth of the target scene according to the compensated first camera calibration parameters and/or the compensated second camera calibration parameters and the first image and the second image obtained from the memory, where the calibration parameters of the first camera are compensated according to the lens offset of the first camera and the calibration parameters of the second camera are compensated according to the lens offset of the second camera.
具体的,OIS系统具体用于,获取陀螺仪传感器检测的终端设备抖动的角速度信息;将所述角速度信息转化为终端设备的抖动幅度,根据所述抖动幅度驱动OIS马达推动所述第一摄像头的镜头和/或第二摄像头的镜头移动,并获取第一摄像头和/或第二摄像头的镜头偏移量。Specifically, the OIS system is specifically configured to acquire angular velocity information of the terminal device jitter detected by the gyro sensor, convert the angular velocity information into a jitter amplitude of the terminal device, and drive the OIS motor to push the first camera according to the jitter amplitude. The lens of the lens and/or the second camera moves and acquires the lens offset of the first camera and/or the second camera.
处理器1030还用于,通过OIS控制器,推动OIS马达,将镜头移动至指定位置;等待所述OIS马达稳定后拍照;当拍摄的图像达到预设数量张时,检测各张图像的特征点坐标,并根据镜头的所述指定位置和所述各张图像的特征点坐标,确定OIS马达感度标定参数。The processor 1030 is further configured to: push the OIS motor through the OIS controller, move the lens to a designated position; wait for the OIS motor to stabilize and take a picture; and when the captured image reaches a preset number of sheets, detect feature points of each image Coordinates, and determining OIS motor sensitivity calibration parameters according to the specified position of the lens and the feature point coordinates of the respective images.
The memory 1020 is further configured to store the OIS motor sensitivity calibration parameter.
Further, the processor 1030 is specifically configured to convert the lens offset into the image offset according to the following formula:

Δx ≈ α × ΔC

where Δx is the image offset, α is the OIS motor sensitivity calibration parameter, and ΔC is the lens offset.
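As a minimal sketch of this conversion (assuming a per-axis offset and a known pixel pitch, which the document uses elsewhere to express image offsets in millimetres), the helper below is illustrative only:

```python
def lens_to_image_offset(lens_offset, alpha, pixel_pitch_mm=None):
    """Convert a lens offset into an image offset using dx ~= alpha * dC.

    lens_offset:    lens shift reported by the OIS system.
    alpha:          OIS motor sensitivity calibration parameter.
    pixel_pitch_mm: if given, also return the offset expressed in mm.
    """
    dx_pixels = alpha * lens_offset
    if pixel_pitch_mm is None:
        return dx_pixels
    return dx_pixels, dx_pixels * pixel_pitch_mm   # (pixels, mm)
```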
Further, when the first camera is provided with an OIS system, the processor is specifically configured to determine the scene depth of the target scene using the following formula, where Z is the scene depth of the target scene, f is the focal length, a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset, pixel pitch is the size of one pixel, a1×Δ1 ≈ Δx1 is the first image offset, a1×Δ1×pixel pitch converts the first image offset from pixels to mm, the principal point of the first camera changes from u1' to u1 after compensation, the baseline changes from B' to B1, the principal point of the second camera is u2, x1 is the imaging point in the first image, and x2 is the imaging point in the second image.
Further, when both the first camera and the second camera are provided with an OIS system, the processor is specifically configured to determine the scene depth of the target scene using the following formula, where a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, a1×Δ1 ≈ Δx1 is the first image offset, a1×Δ1×pixel pitch converts the first image offset from pixels to mm, a2 is the OIS motor sensitivity calibration parameter of the second camera, Δ2 is the lens offset of the second camera, a2×Δ2 ≈ Δx2 is the second image offset, a2×Δ2×pixel pitch converts the second image offset from pixels to mm, the principal point of the first camera changes from u1' to u1 after compensation, the principal point of the second camera changes from u2' to u2 after compensation, x1 is the imaging point in the first image, and x2 is the imaging point in the second image.
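The depth expressions themselves appear only as equation images in the source, so the following Python sketch is a hedged reconstruction rather than the patented formula: it assumes the standard pinhole stereo relation Z = f·B/d, measures the disparity d relative to the compensated principal points, and applies each camera's image offset a·Δ to its principal point and to the baseline; the signs of the compensation and all parameter names are assumptions. A camera without OIS simply contributes a zero offset.

```python
def scene_depth(f_mm, baseline_mm, pixel_pitch_mm,
                x1_px, x2_px, u1_prime_px, u2_prime_px,
                alpha1=0.0, delta1=0.0, alpha2=0.0, delta2=0.0):
    """Hedged sketch of OIS-compensated stereo depth (not the exact patented formula).

    x1_px, x2_px:             imaging points of the same scene point in the two images.
    u1_prime_px, u2_prime_px: principal points from the original dual-camera calibration.
    alpha*, delta*:           OIS sensitivity and lens offset per camera (zero if no OIS).
    """
    # Image offsets caused by the OIS lens shifts, in pixels.
    dx1_px = alpha1 * delta1
    dx2_px = alpha2 * delta2

    # Compensate the calibration: shift each principal point by its image offset.
    u1_px = u1_prime_px - dx1_px
    u2_px = u2_prime_px - dx2_px

    # Compensate the baseline by the relative lens shift expressed in mm.
    b_mm = baseline_mm + (dx1_px - dx2_px) * pixel_pitch_mm

    # Disparity relative to the compensated principal points, converted to mm.
    disparity_mm = ((x1_px - u1_px) - (x2_px - u2_px)) * pixel_pitch_mm
    if disparity_mm == 0:
        raise ValueError("zero disparity: point at infinity or bad correspondence")

    # Standard pinhole stereo relation Z = f * B / d.
    return f_mm * b_mm / disparity_mm
```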
A person skilled in the art should further appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of both. To clearly illustrate the interchangeability of hardware and software, the composition and steps of each example have been described above generally in terms of function. Whether these functions are performed in hardware or in software depends on the specific application and the design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementation should not be considered to be beyond the scope of the present invention.
A person of ordinary skill in the art will understand that all or part of the steps of the methods in the above embodiments may be implemented by a program instructing a processor, and the program may be stored in a computer-readable storage medium. The storage medium is a non-transitory medium such as a random access memory, a read-only memory, a flash memory, a hard disk, a solid state disk, a magnetic tape, a floppy disk, an optical disc, or any combination thereof.
Claims (18)
- A scene depth calculation method, wherein the method comprises: acquiring a lens offset of a camera provided with an OIS system, wherein the first camera and/or the second camera is provided with an OIS system, and the first camera and the second camera are arranged side by side on the body of the same terminal device; converting the lens offset into an image offset according to a preset OIS motor sensitivity calibration parameter; and calculating a scene depth of a target scene according to a compensated first camera calibration parameter and/or a compensated second camera calibration parameter and a first image and a second image respectively obtained by the first camera and the second camera acquiring the target scene at the same time, wherein the calibration parameter of the first camera is compensated according to the lens offset of the first camera, and the calibration parameter of the second camera is compensated according to the lens offset of the second camera.
- The method according to claim 1, wherein, before the foregoing steps, the method further comprises: acquiring angular velocity information of terminal device jitter detected by a gyro sensor; and converting the angular velocity information into a jitter amplitude of the terminal device, driving the OIS motor to move the lens of the first camera and/or the lens of the second camera according to the jitter amplitude, and acquiring the lens offset of the first camera and/or the second camera.
- The method according to claim 1, wherein the OIS motor sensitivity calibration parameter is determined according to the following steps: driving the OIS motor through an OIS controller to move the lens to a specified position; waiting for the OIS motor to stabilize and then taking a picture; and, when the number of captured images reaches a preset number, detecting the feature point coordinates of each image, and determining the OIS motor sensitivity calibration parameter according to the specified position of the lens and the feature point coordinates of the respective images.
- The method according to claim 1, wherein converting the lens offset into the image offset according to the preset OIS motor sensitivity calibration parameter specifically comprises: converting the lens offset into the image offset according to the formula Δx ≈ α × ΔC, where Δx is the image offset, α is the OIS motor sensitivity calibration parameter, and ΔC is the lens offset.
- The method according to claim 1, wherein, when the first camera is provided with an OIS system, calculating the scene depth of the target scene according to the compensated first camera calibration parameter and/or the compensated second camera calibration parameter and the first image and the second image respectively obtained by the first camera and the second camera acquiring the target scene at the same time specifically comprises: determining the scene depth of the target scene using the following formula, where Z is the scene depth of the target scene, f is the focal length, a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset, pixel pitch is the size of one pixel, a1×Δ1 ≈ Δx1 is the first image offset, a1×Δ1×pixel pitch converts the first image offset from pixels to mm, the principal point of the first camera changes from u1' to u1 after compensation, the baseline changes from B' to B1, the principal point of the second camera is u2, x1 is the imaging point in the first image, and x2 is the imaging point in the second image.
- The method according to claim 1, wherein, when both the first camera and the second camera are provided with an OIS system, calculating the scene depth of the target scene according to the compensated first camera calibration parameter and/or the compensated second camera calibration parameter and the first image and the second image respectively obtained by the first camera and the second camera acquiring the target scene at the same time specifically comprises: determining the scene depth of the target scene using the following formula, where a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, a1×Δ1 ≈ Δx1 is the first image offset, a1×Δ1×pixel pitch converts the first image offset from pixels to mm, a2 is the OIS motor sensitivity calibration parameter of the second camera, Δ2 is the lens offset of the second camera, a2×Δ2 ≈ Δx2 is the second image offset, a2×Δ2×pixel pitch converts the second image offset from pixels to mm, the principal point of the first camera changes from u1' to u1 after compensation, the principal point of the second camera changes from u2' to u2 after compensation, x1 is the imaging point in the first image, and x2 is the imaging point in the second image.
- A scene depth calculation device, wherein the device comprises a first obtaining unit, a processing unit, and a calculation unit; the first obtaining unit is configured to acquire a lens offset of a camera provided with an OIS system, wherein the first camera and/or the second camera is provided with an OIS system, and the first camera and the second camera are arranged side by side on the body of the same terminal device; the processing unit is configured to convert the lens offset into an image offset according to a preset OIS motor sensitivity calibration parameter; and the calculation unit is configured to calculate a scene depth of a target scene according to a compensated first camera calibration parameter and/or a compensated second camera calibration parameter and a first image and a second image respectively obtained by the first camera and the second camera acquiring the target scene at the same time, wherein the calibration parameter of the first camera is compensated according to the lens offset of the first camera, and the calibration parameter of the second camera is compensated according to the lens offset of the second camera.
- The device according to claim 7, wherein the device further comprises a second obtaining unit, and the second obtaining unit is specifically configured to: acquire angular velocity information of terminal device jitter detected by a gyro sensor; and convert the angular velocity information into a jitter amplitude of the terminal device, drive the OIS motor to move the lens of the first camera and/or the lens of the second camera according to the jitter amplitude, and acquire the lens offset of the first camera and/or the second camera.
- The device according to claim 7, wherein the device further comprises a determining unit, and the determining unit is specifically configured to: drive the OIS motor through an OIS controller to move the lens to a specified position; wait for the OIS motor to stabilize and then take a picture; and, when the number of captured images reaches a preset number, detect the feature point coordinates of each image, and determine the OIS motor sensitivity calibration parameter according to the specified position of the lens.
- The device according to claim 7, wherein the processing unit is specifically configured to convert the lens offset into the image offset according to the formula Δx ≈ α × ΔC, where Δx is the image offset, α is the OIS motor sensitivity calibration parameter, and ΔC is the lens offset.
- The device according to claim 7, wherein, when the first camera is provided with an OIS system, the calculation unit is specifically configured to determine the scene depth of the target scene using the following formula, where Z is the scene depth of the target scene, f is the focal length, a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset, pixel pitch is the size of one pixel, a1×Δ1 ≈ Δx1 is the first image offset, a1×Δ1×pixel pitch converts the first image offset from pixels to mm, the principal point of the first camera changes from u1' to u1 after compensation, the baseline changes from B' to B1, the principal point of the second camera is u2, x1 is the imaging point in the first image, and x2 is the imaging point in the second image.
- The device according to claim 7, wherein, when both the first camera and the second camera are provided with an OIS system, the calculation unit is specifically configured to determine the scene depth of the target scene using the following formula, where a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, a1×Δ1 ≈ Δx1 is the first image offset, a1×Δ1×pixel pitch converts the first image offset from pixels to mm, a2 is the OIS motor sensitivity calibration parameter of the second camera, Δ2 is the lens offset of the second camera, a2×Δ2 ≈ Δx2 is the second image offset, a2×Δ2×pixel pitch converts the second image offset from pixels to mm, the principal point of the first camera changes from u1' to u1 after compensation, the principal point of the second camera changes from u2' to u2 after compensation, x1 is the imaging point in the first image, and x2 is the imaging point in the second image.
- A terminal, wherein the terminal comprises: a first camera and a second camera, configured at least to acquire a target scene at the same time to obtain a first image and a second image respectively, wherein the first camera and/or the second camera is provided with an OIS system, and the first camera and the second camera are arranged side by side on the body of the same terminal device; a memory, configured to store the first image and the second image; and a processor, configured to acquire a lens offset of the camera provided with the OIS motor, convert the lens offset into an image offset according to a preset OIS motor sensitivity calibration parameter, and calculate a scene depth of the target scene according to a compensated first camera calibration parameter and/or a compensated second camera calibration parameter and the first image and the second image obtained from the memory, wherein the calibration parameter of the first camera is compensated according to the lens offset of the first camera, and the calibration parameter of the second camera is compensated according to the lens offset of the second camera.
- The terminal according to claim 13, wherein the OIS system is specifically configured to: acquire angular velocity information of terminal device jitter detected by a gyro sensor; and convert the angular velocity information into a jitter amplitude of the terminal device, drive the OIS motor to move the lens of the first camera and/or the lens of the second camera according to the jitter amplitude, and acquire the lens offset of the first camera and/or the second camera.
- The terminal according to claim 13, wherein the processor is further configured to: drive the OIS motor through an OIS controller to move the lens to a specified position; wait for the OIS motor to stabilize and then take a picture; and, when the number of captured images reaches a preset number, detect the feature point coordinates of each image, and determine the OIS motor sensitivity calibration parameter according to the specified position of the lens and the feature point coordinates of the respective images; and the memory is further configured to store the OIS motor sensitivity calibration parameter.
- The terminal according to claim 13, wherein the processor is specifically configured to convert the lens offset into the image offset according to the formula Δx ≈ α × ΔC, where Δx is the image offset, α is the OIS motor sensitivity calibration parameter, and ΔC is the lens offset.
- The terminal according to claim 13, wherein, when the first camera is provided with an OIS system, the processor is specifically configured to determine the scene depth of the target scene using the following formula, where Z is the scene depth of the target scene, f is the focal length, a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset, pixel pitch is the size of one pixel, a1×Δ1 ≈ Δx1 is the first image offset, a1×Δ1×pixel pitch converts the first image offset from pixels to mm, the principal point of the first camera changes from u1' to u1 after compensation, the baseline changes from B' to B1, the principal point of the second camera is u2, x1 is the imaging point in the first image, and x2 is the imaging point in the second image.
- The terminal according to claim 13, wherein, when both the first camera and the second camera are provided with an OIS system, the processor is specifically configured to determine the scene depth of the target scene using the following formula, where a1 is the OIS motor sensitivity calibration parameter of the first camera, Δ1 is the lens offset of the first camera, a1×Δ1 ≈ Δx1 is the first image offset, a1×Δ1×pixel pitch converts the first image offset from pixels to mm, a2 is the OIS motor sensitivity calibration parameter of the second camera, Δ2 is the lens offset of the second camera, a2×Δ2 ≈ Δx2 is the second image offset, a2×Δ2×pixel pitch converts the second image offset from pixels to mm, the principal point of the first camera changes from u1' to u1 after compensation, the principal point of the second camera changes from u2' to u2 after compensation, x1 is the imaging point in the first image, and x2 is the imaging point in the second image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201680054264.2A CN108260360B (en) | 2016-10-25 | 2016-12-28 | Scene depth calculation method and device and terminal |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610941102 | 2016-10-25 | | |
CN201610941102.2 | 2016-10-25 | | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018076529A1 true WO2018076529A1 (en) | 2018-05-03 |
Family
ID=62024299
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/112696 WO2018076529A1 (en) | 2016-10-25 | 2016-12-28 | Scene depth calculation method, device and terminal |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN108260360B (en) |
WO (1) | WO2018076529A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111833394A (en) * | 2020-07-27 | 2020-10-27 | 深圳惠牛科技有限公司 | Camera calibration method and measuring method based on binocular measuring device |
KR20230039351A (en) | 2021-09-14 | 2023-03-21 | 삼성전자주식회사 | Electronic device applying bokeh effect to image and operating method thereof |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5493942B2 (en) * | 2009-12-15 | 2014-05-14 | ソニー株式会社 | Imaging apparatus and imaging method |
JP6435265B2 (en) * | 2013-08-21 | 2018-12-05 | オリンパス株式会社 | Imaging apparatus, imaging method, and program |
CN103685950A (en) * | 2013-12-06 | 2014-03-26 | 华为技术有限公司 | Method and device for preventing shaking of video image |
- 2016-12-28: WO PCT/CN2016/112696 patent/WO2018076529A1/en (active, Application Filing)
- 2016-12-28: CN CN201680054264.2A patent/CN108260360B/en (active, Active)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102867304A (en) * | 2012-09-04 | 2013-01-09 | 南京航空航天大学 | Method for establishing relation between scene stereoscopic depth and vision difference in binocular stereoscopic vision system |
CN104954689A (en) * | 2015-06-30 | 2015-09-30 | 努比亚技术有限公司 | Method and shooting device for acquiring photo through double cameras |
CN105629427A (en) * | 2016-04-08 | 2016-06-01 | 东莞佩斯讯光电技术有限公司 | Stereoscopic digital photographing device based on double-controllable-lens inclined type voice coil motor |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3582487A1 (en) * | 2018-06-15 | 2019-12-18 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image stabilisation |
US10567659B2 (en) | 2018-06-15 | 2020-02-18 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image compensation method, electronic device and computer-readable storage medium |
CN112581538A (en) * | 2020-12-11 | 2021-03-30 | 昆山丘钛光电科技有限公司 | Method and device for obtaining motor sensitivity |
CN113873157A (en) * | 2021-09-28 | 2021-12-31 | 维沃移动通信有限公司 | Shooting method, shooting device, electronic equipment and readable storage medium |
CN113873157B (en) * | 2021-09-28 | 2024-04-16 | 维沃移动通信有限公司 | Shooting method, shooting device, electronic equipment and readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN108260360B (en) | 2021-01-05 |
CN108260360A (en) | 2018-07-06 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16920386; Country of ref document: EP; Kind code of ref document: A1 |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 16920386; Country of ref document: EP; Kind code of ref document: A1 |