
CN118567098A - Self-adaptive sub-pixel precision light field data calibration method - Google Patents


Info

Publication number
CN118567098A
Authority
CN
China
Prior art keywords
light field
field data
dimensional
original
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202411059811.9A
Other languages
Chinese (zh)
Inventor
卢志
吕文晋
王越笛
杨懿
邬京耀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Hehu Technology Co ltd
Original Assignee
Zhejiang Hehu Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Hehu Technology Co ltd
Priority to CN202411059811.9A
Publication of CN118567098A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4023 Scaling of whole images or parts thereof, e.g. expanding or contracting based on decimating pixels or lines of pixels; based on inserting pixels or lines of pixels
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0025 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for optical correction, e.g. distorsion, aberration

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a self-adaptive sub-pixel precision light field data calibration method, which relates to the technical field of sensor equipment and comprises the following steps: acquiring original two-dimensional light field data, and obtaining the center-point transverse offset distance and the center-point longitudinal offset distance of the original two-dimensional light field data through row projection, column projection and energy distribution rearrangement; determining the pixel center point of the original two-dimensional light field data from these offset distances, and cropping and rearranging the original two-dimensional light field data according to the pixel center point to obtain four-dimensional phase space light field data; calculating the sub-pixel center of the original two-dimensional light field data from the pixel center point and the four-dimensional phase space light field data; and scaling the original two-dimensional light field data about the sub-pixel center with different scaling sizes, then screening out the calibrated sub-pixel precision light field data according to the energy values obtained after four-dimensional phase space rearrangement. The method improves the resolution and quality of light field reconstruction.

Description

Self-adaptive sub-pixel precision light field data calibration method
Technical Field
The invention relates to the technical field of sensor equipment, in particular to a self-adaptive sub-pixel precision light field data calibration method.
Background
A conventional sensor records only the two-dimensional intensity of light and loses the view-angle information, so the two-dimensional image formed by a conventional lens-and-sensor system is a projection of a three-dimensional object onto a two-dimensional plane, in which the depth information of the sample is lost. Microlens-based four-dimensional phase space light field imaging can record the spatial information and the angular information of light rays simultaneously and therefore describes rays in three-dimensional space more completely. A three-dimensional tomographic result of the sample can be obtained by modeling the point spread function of the system and deconvolving the four-dimensional phase space data with that point spread function.
The four-dimensional phase space data are generated from the original two-dimensional light field data by preprocessing steps such as pixel calibration, cropping and rearrangement, where pixel calibration mainly consists of aligning the light field view angles and aligning the microlens magnification. Misalignment of the light field view angles causes a mismatch between the light field data and the point spread function used for reconstruction, which lowers the transverse and axial resolution of the reconstructed tomographic result; misalignment of the microlens magnification causes aliasing and vignetting of the light field data at different view angles, which in turn degrades imaging resolution and quality. Accurate four-dimensional phase space pixel calibration is therefore essential for high-quality reconstruction. In the prior art, pixel calibration is performed by manually calibrating the central view angle of the four-dimensional phase space and rearranging the original two-dimensional light field data onto a regular grid; however, manual calibration adapts poorly to different systems and different data, and regular sampling and reconstruction can achieve only pixel-precision data calibration.
Therefore, how to achieve light field data calibration with sub-pixel precision and how to improve the adaptability of pixel calibration are problems to be solved by those skilled in the art.
Disclosure of Invention
In view of the above, the invention provides a self-adaptive sub-pixel precision light field data calibration method that achieves self-adaptive sub-pixel precision calibration of light field data from different systems and different sample data and improves the resolution and quality of light field reconstruction.
In order to achieve the above purpose, the present invention adopts the following technical scheme:
an adaptive sub-pixel precision light field data calibration method comprises the following steps:
Step 1: acquiring original two-dimensional light field data, and obtaining a center-point transverse offset distance and a center-point longitudinal offset distance of the original two-dimensional light field data through row projection, column projection and energy distribution rearrangement;
Step 2: determining a pixel center point of the original two-dimensional light field data according to the center-point transverse offset distance and the center-point longitudinal offset distance, and cropping and rearranging the original two-dimensional light field data according to the pixel center point to obtain four-dimensional phase space light field data;
Step 3: calculating the sub-pixel center of the original two-dimensional light field data according to the pixel center and the four-dimensional phase space light field data;
Step 4: and scaling the original two-dimensional light field data by adopting different scaling sizes according to the sub-pixel center, constructing a plurality of groups of scaled four-dimensional phase space light field data through four-dimensional phase space rearrangement, and screening out calibrated sub-pixel precision light field data according to the energy value corresponding to the scaled four-dimensional phase space light field data.
Preferably, the specific steps of step 1 are as follows:
Step 11: collecting the original two-dimensional light field data. The original two-dimensional light field data is a two-dimensional light field image whose transverse size and longitudinal size are u×h and v×w respectively, where h and w respectively represent the number of transverse microlenses and the number of longitudinal microlenses in the microlens array light field imaging system, namely the space coordinates, and u and v respectively represent the transverse field pixels and the longitudinal field pixels of a single microlens, namely the angle coordinates;
performing row projection on the original two-dimensional light field data to obtain the transverse energy distribution of the original light field data;
rearranging the transverse energy distribution of the original light field data into the microlens transverse energy distribution, whose size is v×w, and performing row projection on the microlens transverse energy distribution to obtain the energy distribution of the v transverse view angles;
taking the distance of the maximum of the transverse view-angle energy distribution from the center as the center-point transverse offset distance Δx of the original two-dimensional light field data.
Step 12: performing column projection on the original two-dimensional light field data to obtain the longitudinal energy distribution of the original light field data; rearranging the longitudinal energy distribution of the original light field data into the microlens longitudinal energy distribution, whose size is u×h, and performing column projection on the microlens longitudinal energy distribution to obtain the energy distribution of the u longitudinal view angles; taking the distance of the maximum of the longitudinal view-angle energy distribution from the center as the center-point longitudinal offset distance Δy of the original two-dimensional light field data.
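By way of a non-limiting illustration, the following NumPy sketch shows one plausible reading of step 1: project the raw image along one axis, fold the resulting 1D energy profile by the per-microlens pixel count so that samples of the same view angle accumulate, and take the distance of the strongest view angle from the central one as the offset. The array layout (one axis spanning u×h pixels, the other v×w) and the function and variable names are assumptions of this sketch, not the patent's own code.
import numpy as np
def coarse_center_offset(raw2d, n_pix):
    # Offset (in view-angle pixels) of the brightest folded view angle from
    # the nominal central view angle along the image's horizontal axis.
    # n_pix is the number of view-angle pixels behind one microlens along
    # that axis (u in the notation of the description) -- an assumption.
    profile = raw2d.sum(axis=0).astype(np.float64)   # one value per column
    n_lenses = profile.size // n_pix
    # Fold by the microlens pitch; trailing pixels that do not fill a whole
    # microlens are discarded.
    folded = profile[:n_lenses * n_pix].reshape(n_lenses, n_pix).sum(axis=0)
    return int(np.argmax(folded)) - n_pix // 2
# Usage sketch: dx from the horizontal profile, dy from the vertical profile
# (obtained by transposing the image), with u and v as defined above.
# dx = coarse_center_offset(raw2d, u)
# dy = coarse_center_offset(raw2d.T, v)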
Preferably, the specific steps of step 2 include:
Step 21: re-determining the pixel center point (x_c, y_c) of the original two-dimensional light field data according to the center-point transverse offset distance Δx and the center-point longitudinal offset distance Δy, i.e., by offsetting the nominal center of the original two-dimensional light field image by Δx in the transverse direction and Δy in the longitudinal direction;
wherein x_c and y_c respectively represent the abscissa and the ordinate of the pixel center point; h and w respectively represent the number of transverse microlenses and the number of longitudinal microlenses in the microlens array light field imaging system, serving as the space coordinates; u and v respectively represent the transverse field pixels and the longitudinal field pixels of a single microlens, serving as the angle coordinates;
Step 22: cropping the original two-dimensional light field data according to the pixel center point (x_c, y_c) to obtain a center-aligned two-dimensional light field image; rearranging the center-aligned two-dimensional light field image to obtain four-dimensional phase space light field data.
Preferably, the specific step of rearranging the center-aligned two-dimensional light field image comprises:
Step 221: fixing the angle coordinates and, for each group of angle coordinates, taking the pixel values of that group of angle coordinates at the different space coordinates in the center-aligned two-dimensional light field image to form the space coordinate sub-image of that group of angle coordinates, of size h×w;
Step 222: arranging the space coordinate sub-images corresponding to all the angle coordinates in sequence to construct the four-dimensional phase space light field data, of size u×v×h×w.
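As a non-limiting sketch of steps 22 and 221-222, the following NumPy code crops the raw image about an integer pixel center and rearranges it into a (u, v, h, w) array. The assumed layout, in which each microlens occupies a contiguous u×v block of sensor pixels, and the function name crop_and_rearrange are illustrative assumptions rather than the patent's exact indexing.
import numpy as np
def crop_and_rearrange(raw2d, center, u, v, h, w):
    # Crop a u*h-by-v*w window of the raw 2D light field about the pixel
    # center point and rearrange it into 4D phase space data of shape
    # (u, v, h, w). Assumes u*h pixels horizontally, v*w pixels vertically,
    # with each microlens covering a contiguous u-by-v pixel block.
    xc, yc = center                        # column / row of the pixel center
    x0 = int(round(xc)) - (u * h) // 2     # left edge of the centered crop
    y0 = int(round(yc)) - (v * w) // 2     # top edge of the centered crop
    crop = raw2d[y0:y0 + v * w, x0:x0 + u * h]
    # Split the rows into (w microlenses, v angular pixels) and the columns
    # into (h microlenses, u angular pixels), then bring the angular axes
    # to the front: resulting axes are (u, v, h, w).
    return crop.reshape(w, v, h, u).transpose(3, 1, 2, 0)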
Preferably, the specific steps of step 3 are as follows:
Step 31: summing the pixel values of the space coordinate sub-images under each group of angle coordinates in the four-dimensional phase space light field data to obtain the energy distribution of all view angles, of size u×v;
Step 32: fitting a two-dimensional Gaussian function to the energy distribution of all view angles to obtain the two-dimensional coordinates (u_0, v_0) of the energy distribution maximum;
Step 33: determining the sub-pixel center (x_s, y_s) of the original two-dimensional light field data according to the pixel center point (x_c, y_c) and the two-dimensional coordinates (u_0, v_0) of the energy distribution maximum, i.e., by shifting the pixel center point by the sub-pixel offset of the fitted maximum (u_0, v_0) from the central view angle.
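A non-limiting sketch of step 3 follows: the per-view-angle energy of the (u, v, h, w) phase space data is fitted with a two-dimensional Gaussian using scipy.optimize.curve_fit, and the fitted maximum (u_0, v_0) gives the fractional view-angle offset. The sub-pixel center arithmetic in the trailing comments is one plausible interpretation of step 33, since the patent's displayed formula is not reproduced in this text.
import numpy as np
from scipy.optimize import curve_fit
def gauss2d(coords, amp, u0, v0, su, sv, offset):
    # Separable 2D Gaussian plus a constant background, flattened for curve_fit.
    uu, vv = coords
    g = amp * np.exp(-((uu - u0) ** 2 / (2 * su ** 2) + (vv - v0) ** 2 / (2 * sv ** 2)))
    return (g + offset).ravel()
def fit_view_energy_center(phase):
    # Energy of every view angle: sum the two spatial axes of (u, v, h, w) data.
    energy = phase.sum(axis=(2, 3)).astype(np.float64)           # shape (u, v)
    u, v = energy.shape
    uu, vv = np.meshgrid(np.arange(u), np.arange(v), indexing="ij")
    p0 = (energy.max() - energy.min(), u / 2.0, v / 2.0, u / 4.0, v / 4.0, energy.min())
    popt, _ = curve_fit(gauss2d, (uu, vv), energy.ravel(), p0=p0)
    return popt[1], popt[2]                                       # fitted (u_0, v_0)
# One plausible reading of step 33 (an assumption, not the patent's formula):
# xs = xc + (u0 - u // 2)
# ys = yc + (v0 - v // 2)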
Preferably, the specific steps of step 4 include:
Step 41: taking the sub-pixel center (x_s, y_s) as the origin, traversing different scaling sizes at a preset interval within a preset size range, and interpolating the original two-dimensional light field data to obtain the scaled two-dimensional light field data corresponding to the different scaling sizes;
Step 42: performing four-dimensional phase space rearrangement on the scaled two-dimensional light field data corresponding to the different scaling sizes to obtain scaled four-dimensional phase space light field data; the rearrangement is the same as in steps 221-222 above;
Step 43: summing the intensities of all pixel values of the space coordinate sub-images corresponding to the preset angle coordinates of each group of scaled four-dimensional phase space light field data to obtain the total energy at the preset angle coordinates, and selecting the scaled four-dimensional phase space light field data corresponding to the maximum of the total energy as the calibrated sub-pixel precision light field data; the scaling size corresponding to the maximum total energy is the optimal scaling size.
Preferably, the preset size range is [u×h-10, u×h+10]; the preset interval is 1; and the preset angle coordinates are [2:u-1, 2:v-1].
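A non-limiting sketch of step 4 under the preset parameters above: the raw image is bilinearly resampled about the sub-pixel center for each candidate scaling size in [u×h-10, u×h+10] with interval 1, rearranged into phase space, and scored by the total energy of the interior view angles. The helper names, the use of scipy.ndimage.map_coordinates for the interpolation, and the translation of the 1-based angle coordinates [2:u-1, 2:v-1] into Python slices are assumptions of this sketch.
import numpy as np
from scipy.ndimage import map_coordinates
def resample_about_center(raw2d, center, out_w, out_h, scale):
    # Bilinearly sample an out_h-by-out_w grid whose pixel spacing is `scale`
    # raw pixels and whose midpoint sits on the (fractional) sub-pixel center.
    xs, ys = center
    cols = xs + (np.arange(out_w) - out_w / 2.0) * scale
    rows = ys + (np.arange(out_h) - out_h / 2.0) * scale
    rr, cc = np.meshgrid(rows, cols, indexing="ij")
    return map_coordinates(raw2d, [rr, cc], order=1, mode="nearest")
def search_best_scale(raw2d, subpixel_center, u, v, h, w):
    # Sweep target widths u*h-10 .. u*h+10 in steps of 1 and keep the
    # rearranged phase space whose interior-view energy is largest.
    best_energy, best_width, best_phase = -np.inf, None, None
    for target_w in range(u * h - 10, u * h + 10 + 1):
        scale = target_w / float(u * h)            # candidate sampling interval
        img = resample_about_center(raw2d, subpixel_center, u * h, v * w, scale)
        phase = img.reshape(w, v, h, u).transpose(3, 1, 2, 0)   # (u, v, h, w)
        # Angle coordinates [2:u-1, 2:v-1] (1-based, border views excluded)
        # are taken here as the Python slices [1:u-1, 1:v-1].
        energy = phase[1:u - 1, 1:v - 1].sum()
        if energy > best_energy:
            best_energy, best_width, best_phase = energy, target_w, phase
    return best_phase, best_width, best_energy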
Compared with the prior art, the invention discloses a self-adaptive sub-pixel precision light field data calibration method that calibrates the original light field data of different systems and different samples through an automatic computation procedure. Unlike the traditional scheme, the method first performs coarse calibration by computing and applying the center-point offset distances to preliminarily determine the pixel center of the light field data, then performs fine calibration based on the adjusted pixel center to determine the sub-pixel center of the light field data and the correct sampling interval, and finally obtains calibrated sub-pixel precision light field data. Through this sub-pixel precision calibration scheme the invention ultimately improves the resolution and quality of light field reconstruction: compared with manual calibration, it achieves self-adaptive pixel calibration of light field data from different systems and samples; compared with regular sampling, it achieves calibration of the light field data at sub-pixel precision.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required to be used in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are only embodiments of the present invention, and that other drawings can be obtained according to the provided drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a method for calibrating adaptive sub-pixel precision light field data according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The embodiment of the invention discloses a self-adaptive sub-pixel precision light field data calibration method, which is shown in Fig. 1 and comprises the following steps:
S1: acquiring original two-dimensional light field data, and obtaining a center-point transverse offset distance and a center-point longitudinal offset distance of the original two-dimensional light field data through row projection, column projection and energy distribution rearrangement;
S2: determining a pixel center point of the original two-dimensional light field data according to the center-point transverse offset distance and the center-point longitudinal offset distance, and cropping and rearranging the original two-dimensional light field data according to the pixel center point to obtain four-dimensional phase space light field data;
S3: calculating the sub-pixel center of the original two-dimensional light field data according to the pixel center and the four-dimensional phase space light field data;
S4: and scaling the original two-dimensional light field data by adopting different scaling sizes according to the sub-pixel center, constructing scaled four-dimensional phase space light field data through four-dimensional phase space rearrangement, and screening calibrated sub-pixel precision light field data according to the energy value corresponding to the scaled four-dimensional phase space light field data.
In another aspect, in a specific embodiment, an adaptive sub-pixel precision light field data calibration method specifically includes:
S1: coarse calibration;
S11: acquiring original two-dimensional light field data, wherein the transverse size and the longitudinal size of the original two-dimensional light field image are u×h and v×w respectively, and h and w respectively represent the number of transverse microlenses and the number of longitudinal microlenses in the microlens array light field imaging system, namely the space coordinates; u and v respectively represent the transverse field pixels and the longitudinal field pixels of a single microlens, namely the angle coordinates;
S12: performing row projection on the original two-dimensional light field data to obtain the transverse energy distribution of the original light field data; rearranging the transverse energy distribution of the original two-dimensional light field data into the microlens transverse energy distribution, whose size is v×w; performing row projection again on the microlens transverse energy distribution so as to eliminate the influence of the imaged sample on the view-angle energy distribution and obtain the energy distribution of the v transverse view angles; taking the distance of the maximum of this distribution from the center as the center-point transverse offset distance Δx of the original light field data. Performing column projection on the original two-dimensional light field data to obtain the longitudinal energy distribution of the original light field data; rearranging the longitudinal energy distribution of the original two-dimensional light field data into the microlens longitudinal energy distribution, whose size is u×h; performing column projection again on the microlens longitudinal energy distribution so as to eliminate the influence of the imaged sample on the view-angle energy distribution and obtain the energy distribution of the u longitudinal view angles; taking the distance of the maximum of this distribution from the center as the center-point longitudinal offset distance Δy of the original light field data;
S13: re-determining the pixel center point (x_c, y_c) of the original two-dimensional light field data according to Δx and Δy, i.e., by offsetting the nominal image center by Δx in the transverse direction and Δy in the longitudinal direction.
Cropping and rearranging to obtain the four-dimensional phase space light field data of the light field: cropping the original two-dimensional light field data according to the pixel center point (x_c, y_c) to obtain a center-aligned two-dimensional light field image; rearranging the center-aligned two-dimensional light field image to obtain the four-dimensional phase space light field data. The rearrangement process is as follows: fix a group of angle coordinates and take the pixel values of that group of angle coordinates at the different space coordinates in the center-aligned two-dimensional light field image to form the space coordinate sub-image of that group of angle coordinates, of size h×w; arrange the space coordinate sub-images of all the angle coordinates in sequence to form the four-dimensional phase space light field data, of size u×v×h×w. By this operation the pixels of the original white-image two-dimensional light field image are rearranged into four-dimensional phase space light field data;
S2: fine calibration;
S21: summing the pixel values of the space coordinate sub-images under each angle coordinate of the four-dimensional phase space data, i.e., summing over the last two dimensions of the four-dimensional phase space data, to obtain the energy distribution of all view angles, of size u×v;
S22: the view-angle energy distribution is approximately Gaussian, of the form
E(u, v) ∝ exp(-((u - u_0)² / (2σ_u²) + (v - v_0)² / (2σ_v²))),
where (u_0, v_0) is the center point of the Gaussian function; performing a two-dimensional Gaussian fit on the energy distribution of all view angles yields the two-dimensional coordinates (u_0, v_0) of the energy distribution maximum;
S23: determining the sub-pixel center (x_s, y_s) of the light field data according to the pixel-precision center (x_c, y_c) obtained from the row and column projections in S13 and the two-dimensional coordinates (u_0, v_0) of the maximum of the Gaussian-fitted energy distribution (i.e., the sub-pixel accuracy offset);
S24: taking the sub-pixel center (x_s, y_s) of the original two-dimensional light field data as the origin, traversing scaling sizes from u×h-10 to u×h+10 at an interval of 1 and interpolating the two-dimensional light field data to obtain the scaled images; performing four-dimensional phase space rearrangement on each scaled two-dimensional light field data; summing the intensities of all pixel values of the space coordinate sub-images corresponding to the angle coordinates [2:u-1, 2:v-1] of the four-dimensional phase space data to obtain the total energy at those angle coordinates; the scaling size at which this energy is maximal is the optimal scaling size; the four-dimensional phase space rearrangement process is the same as that in S13;
S25: the four-dimensional phase space data obtained by rearrangement at the optimal scaling size is selected as the calibrated sub-pixel precision light field data.
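For orientation only, the sketch below chains the helper functions from the earlier sketches (coarse_center_offset, crop_and_rearrange, fit_view_energy_center, search_best_scale) to mirror S11-S25. It assumes those helpers, the array layout described above, and a pixel/sub-pixel center arithmetic that is an interpretation rather than the patent's exact formulas.
def calibrate(raw2d, u, v, h, w):
    # S11-S13: coarse calibration -- offsets from the folded projections, then
    # a pixel-precision center obtained by shifting the nominal image center
    # (an interpretation of the re-determination step).
    dx = coarse_center_offset(raw2d, u)
    dy = coarse_center_offset(raw2d.T, v)
    xc = raw2d.shape[1] // 2 + dx
    yc = raw2d.shape[0] // 2 + dy
    phase = crop_and_rearrange(raw2d, (xc, yc), u, v, h, w)
    # S21-S23: fine calibration -- Gaussian-fitted view-angle maximum and a
    # sub-pixel center shifted by its fractional offset (an interpretation).
    u0, v0 = fit_view_energy_center(phase)
    xs, ys = xc + (u0 - u // 2), yc + (v0 - v // 2)
    # S24-S25: sweep scaling sizes about the sub-pixel center and keep the
    # phase space with the largest interior-view energy.
    calibrated, best_width, _ = search_best_scale(raw2d, (xs, ys), u, v, h, w)
    return calibrated, best_width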
In the present specification, each embodiment is described in a progressive manner, and each embodiment is mainly described in a different point from other embodiments, and identical and similar parts between the embodiments are all enough to refer to each other. For the device disclosed in the embodiment, since it corresponds to the method disclosed in the embodiment, the description is relatively simple, and the relevant points refer to the description of the method section.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (7)

1. The self-adaptive sub-pixel precision light field data calibration method is characterized by comprising the following steps of:
Step 1: acquiring original two-dimensional light field data, and obtaining a center-point transverse offset distance and a center-point longitudinal offset distance of the original two-dimensional light field data through row projection, column projection and energy distribution rearrangement;
Step 2: determining a pixel center point of the original two-dimensional light field data according to the center-point transverse offset distance and the center-point longitudinal offset distance, and cropping and rearranging the original two-dimensional light field data according to the pixel center point to obtain four-dimensional phase space light field data;
Step 3: calculating a sub-pixel center of the original two-dimensional light field data according to the pixel center and the four-dimensional phase space light field data;
Step 4: and scaling the original two-dimensional light field data by adopting different scaling sizes according to the sub-pixel center, constructing a plurality of groups of scaled four-dimensional phase space light field data through four-dimensional phase space rearrangement, and screening out calibrated sub-pixel precision light field data according to the energy value corresponding to the scaled four-dimensional phase space light field data.
2. The self-adaptive sub-pixel precision light field data calibration method according to claim 1, wherein the specific steps of step 1 are as follows:
Step 11: acquiring the original two-dimensional light field data, and performing row projection on the original two-dimensional light field data to obtain the transverse energy distribution of the original light field data; rearranging the transverse energy distribution of the original light field data into the microlens transverse energy distribution, and performing row projection on the microlens transverse energy distribution to obtain a plurality of transverse view-angle energy distributions; taking the distance of the maximum transverse view-angle energy distribution from the center as the center-point transverse offset distance Δx of the original two-dimensional light field data;
Step 12: performing column projection on the original two-dimensional light field data to obtain the longitudinal energy distribution of the original light field data; rearranging the longitudinal energy distribution of the original light field data into the microlens longitudinal energy distribution, and performing column projection on the microlens longitudinal energy distribution to obtain a plurality of longitudinal view-angle energy distributions; taking the distance of the maximum longitudinal view-angle energy distribution from the center as the center-point longitudinal offset distance Δy of the original two-dimensional light field data.
3. The self-adaptive sub-pixel precision light field data calibration method according to claim 1, wherein the specific steps of step 2 include:
Step 21: re-determining the pixel center point (x_c, y_c) of the original two-dimensional light field data according to the center-point transverse offset distance Δx and the center-point longitudinal offset distance Δy;
wherein x_c and y_c respectively represent the abscissa and the ordinate of the pixel center point; h and w respectively represent the number of transverse microlenses and the number of longitudinal microlenses in the microlens array light field imaging system, serving as the space coordinates; u and v respectively represent the transverse field pixels and the longitudinal field pixels of a single microlens, serving as the angle coordinates;
Step 22: cropping the original two-dimensional light field data according to the pixel center point (x_c, y_c) to obtain a center-aligned two-dimensional light field image; rearranging the center-aligned two-dimensional light field image to obtain the four-dimensional phase space light field data.
4. The self-adaptive sub-pixel precision light field data calibration method according to claim 3, wherein the specific step of rearranging the center-aligned two-dimensional light field image comprises:
Step 221: fixing the angle coordinates and, for each group of angle coordinates, taking the pixel values of that group of angle coordinates at the different space coordinates in the center-aligned two-dimensional light field image to form the space coordinate sub-image of that group of angle coordinates;
Step 222: arranging the space coordinate sub-images corresponding to all the angle coordinates in sequence to construct the four-dimensional phase space light field data.
5. The self-adaptive sub-pixel precision light field data calibration method according to claim 4, wherein the specific steps of step 3 are as follows:
Step 31: summing the pixel values of the space coordinate sub-images under each group of the angle coordinates in the four-dimensional phase space light field data to obtain the energy distribution of all view angles;
Step 32: performing a two-dimensional Gaussian function fit on the energy distribution of all view angles to obtain the two-dimensional coordinates (u_0, v_0) of the energy distribution maximum;
Step 33: determining the sub-pixel center (x_s, y_s) of the original two-dimensional light field data according to the pixel center point (x_c, y_c) and the two-dimensional coordinates (u_0, v_0) of the energy distribution maximum.
6. The self-adaptive sub-pixel precision light field data calibration method according to claim 4, wherein the specific steps of step 4 include:
Step 41: taking the sub-pixel center (x_s, y_s) as the origin, traversing different scaling sizes at a preset interval within a preset size range, and interpolating the original two-dimensional light field data to obtain the scaled two-dimensional light field data corresponding to the different scaling sizes;
Step 42: performing four-dimensional phase space rearrangement on the scaled two-dimensional light field data corresponding to different scaling sizes to obtain scaled four-dimensional phase space light field data;
Step 43: and summing the intensities of all pixel values of the space coordinate sub-images corresponding to the preset angle coordinates of each group of scaled four-dimensional phase space light field data to obtain the total energy at the preset angle coordinates, and selecting the scaled four-dimensional phase space light field data corresponding to the maximum of the total energy as the calibrated sub-pixel precision light field data.
7. The self-adaptive sub-pixel precision light field data calibration method according to claim 6, wherein the preset size range is [u×h-10, u×h+10]; the preset interval is 1; the preset angle coordinates are [2:u-1, 2:v-1]; h represents the number of transverse microlenses in the microlens array light field imaging system, and u and v respectively represent the transverse field pixels and the longitudinal field pixels of a single microlens.
CN202411059811.9A 2024-08-05 2024-08-05 Self-adaptive sub-pixel precision light field data calibration method Pending CN118567098A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202411059811.9A CN118567098A (en) 2024-08-05 2024-08-05 Self-adaptive sub-pixel precision light field data calibration method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202411059811.9A CN118567098A (en) 2024-08-05 2024-08-05 Self-adaptive sub-pixel precision light field data calibration method

Publications (1)

Publication Number Publication Date
CN118567098A 2024-08-30

Family

ID=92475052

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202411059811.9A Pending CN118567098A (en) 2024-08-05 2024-08-05 Self-adaptive sub-pixel precision light field data calibration method

Country Status (1)

Country Link
CN (1) CN118567098A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080056609A1 (en) * 2004-03-26 2008-03-06 Centre National D'etudes Spatiales Fine Stereoscopic Image Matching And Dedicated Instrument Having A Low Stereoscopic Coefficient
CN107610182A (en) * 2017-09-22 2018-01-19 哈尔滨工业大学 A kind of scaling method at light-field camera microlens array center
US20200250856A1 (en) * 2019-02-01 2020-08-06 Molecular Devices (Austria) GmbH Calibration of a light-field imaging system
WO2021121037A1 (en) * 2019-12-16 2021-06-24 首都师范大学 Method and system for reconstructing light field by applying depth sampling
CN118333914A (en) * 2024-06-05 2024-07-12 浙江荷湖科技有限公司 High dynamic range light field data white image correction method, system and medium

Similar Documents

Publication Publication Date Title
CN109767476B (en) Automatic focusing binocular camera calibration and depth calculation method
CN111351446B (en) Light field camera calibration method for three-dimensional topography measurement
CN109413407B (en) High spatial resolution light field acquisition device and image generation method
CN109859272B (en) Automatic focusing binocular camera calibration method and device
CN108426585B (en) A kind of geometric calibration method of light-field camera
CN108305233B (en) A kind of light field image bearing calibration for microlens array error
EP1589482A2 (en) Three-dimensional image measuring apparatus and method
CN114636385B (en) Three-dimensional imaging method and system based on light field camera and three-dimensional imaging measurement production line
CN112381847B (en) Pipeline end space pose measurement method and system
CN109118544B (en) Synthetic aperture imaging method based on perspective transformation
CN107392849B (en) Target identification and positioning method based on image subdivision
CN111080705B (en) Calibration method and device for automatic focusing binocular camera
CN109712232B (en) Object surface contour three-dimensional imaging method based on light field
CN107358628B (en) Linear array image processing method based on target
CN111707187B (en) Measuring method and system for large part
CN115086550B (en) Meta imaging system
CN112489137A (en) RGBD camera calibration method and system
CN114283203A (en) Calibration method and system of multi-camera system
CN110033491B (en) Camera calibration method
CN118333914B (en) High dynamic range light field data white image correction method, system and medium
US11640680B2 (en) Imaging system and a method of calibrating an image system
CN112489141B (en) Production line calibration method and device for single-board single-image strip relay lens of vehicle-mounted camera
CN118567098A (en) Self-adaptive sub-pixel precision light field data calibration method
CN113447241A (en) Rapid calibration method and device for segmentation projection imaging system
CN113160393A (en) High-precision three-dimensional reconstruction method and device based on large field depth and related components thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination