WO2011121837A1 - Stereoscopic imaging device, image reproducing device, and editing software - Google Patents
Stereoscopic imaging device, image reproducing device, and editing software
- Publication number
- WO2011121837A1 WO2011121837A1 PCT/JP2010/069913 JP2010069913W WO2011121837A1 WO 2011121837 A1 WO2011121837 A1 WO 2011121837A1 JP 2010069913 W JP2010069913 W JP 2010069913W WO 2011121837 A1 WO2011121837 A1 WO 2011121837A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- time
- parallax images
- read
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/08—Stereoscopic photography by simultaneous recording
- G03B35/10—Stereoscopic photography by simultaneous recording having single camera with stereoscopic-base-defining system
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/122—Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/172—Processing image signals image signals comprising non-image signal components, e.g. headers or format information
- H04N13/178—Metadata, e.g. disparity information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/246—Calibration of cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/286—Image signal generators having separate monoscopic and stereoscopic modes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/302—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/60—Noise processing, e.g. detecting, correcting, reducing or removing noise
- H04N25/61—Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
Definitions
- the present disclosure relates to a stereoscopic imaging apparatus, and more particularly, to a stereoscopic imaging apparatus that captures a plurality of parallax images having parallax with each other.
- Patent Document 1 discloses an imaging apparatus that includes a first lens barrel having a CCD for obtaining photographing information for the right eye, a second lens barrel having a CCD for obtaining photographing information for the left eye, a camera detection circuit for detecting the focal lengths of the first and second lens barrels, an EEPROM that stores in advance the shift amounts of the optical axis centers of the first and second lens barrels at each focal length, and a CPU that controls the image cut-out area in at least one of the pair of left and right CCDs at each focal length based on the information read from the EEPROM.
- Patent Document 2 discloses an image processing apparatus in which an approximate expression for coordinate correction based on the lens center is set for the lens characteristics of a stereo camera, and the projection coordinates of a target image captured by the camera are corrected based on the approximate expression.
- Patent Document 3 discloses an image processing apparatus that performs high-definition imaging with a plurality of imaging elements, converts each imaging output into a digital image, divides it into a plurality of image areas with an image dividing circuit, and processes these areas in parallel, so that image processing can be performed at a moving-image rate even for a high-definition image.
- Patent Document 1 thus discloses a technique for generating left and right viewpoint images suitable for stereoscopic viewing by changing the image cut-out area based on the optical axis shift caused by individual differences between the two photographing lenses, and Patent Document 2 discloses a technique for correcting image distortion caused by lens distortion.
- For still images, the image clipping processing that corrects the optical axis misalignment caused by individual differences between the shooting lenses and the image distortion correction processing for lens distortion can be performed in the interval between shooting and writing to the memory card.
- For moving images, however, the captured data must be written to the memory card in real time.
- It is therefore difficult to perform heavy lens distortion correction processing during moving image shooting, where imaging processing, image processing, and compression processing are concentrated, and it is especially difficult when shooting and recording high definition (HD) moving images at a high frame rate.
- Patent Document 3 discloses an image processing apparatus that can perform high-speed image processing at a moving image rate even in the case of a high-definition image.
- However, the image processing apparatus described in Patent Document 3 divides the processing region and performs parallel processing, so the circuit configuration becomes complicated and expensive. Further, the image processing described in Patent Document 3 is shading correction; it is neither a computationally complicated image distortion correction process nor a process for three-dimensional (3D) moving images.
- The present disclosure has been made in view of such circumstances, and its object is to provide a stereoscopic imaging apparatus capable of capturing and recording a high-definition 3D moving image at a high frame rate while still allowing correction of aberrations such as heavy lens distortion, as well as an image reproducing apparatus and editing software for 3D moving images shot by the stereoscopic imaging apparatus.
- In order to achieve this object, a stereoscopic imaging apparatus includes a plurality of imaging units, each having an imaging optical system and an imaging element that photoelectrically converts the subject image formed through the imaging optical system, the plurality of imaging units capturing parallax images having parallax with each other; a storage unit that stores first information indicating the optical axis center position and clipping size of each imaging optical system, or a clipping area centered on the optical axis center position of each imaging optical system, and second information indicating the aberrations of each imaging optical system; and a recording unit that records a plurality of time-series parallax images obtained by moving image shooting on a recording medium in association with the first and second information read from the storage unit.
- The first information is information for correcting the deviation amount of the optical axis center of each imaging optical system, acquired in advance by inspection of each imaging optical system of the plurality of imaging units before product shipment.
- The second information is information acquired in advance by inspection of each imaging optical system of the plurality of imaging units before product shipment, and indicates aberrations of each imaging optical system such as distortion and chromatic aberration.
- the first and second information are stored in advance in a nonvolatile storage unit such as an EEPROM.
- The stereoscopic imaging apparatus records the first information and the second information in association with the plurality of time-series parallax images (3D moving image) obtained by moving image shooting, and does not perform heavy correction processing such as lens distortion correction during moving image shooting.
- As a result, a high-definition 3D moving image can be captured and recorded at a high frame rate.
- The first information may relate either to the parallax images before the aberration of the photographing optical system is corrected or to the parallax images after the aberration is corrected; that is, it may be information based on the uncorrected parallax images (images that still contain the aberration) or information based on the corrected parallax images.
- the stereoscopic imaging apparatus according to the first aspect or the second aspect further includes a parallax amount adjusting unit that adjusts a parallax amount between the plurality of parallax images output from the plurality of imaging units.
- the recording unit records third information indicating the parallax amount adjusted by the parallax amount adjusting unit in association with the plurality of parallax images on the recording medium.
- the third information is information indicating a shift amount in the left-right direction between a plurality of parallax images, and is information for adjusting the intensity of parallax.
- According to another aspect, the photographing optical system is a zoom lens, and the storage unit stores the first information and the second information for each zoom position of the zoom lens. The recording unit records on the recording medium, in association with the plurality of parallax images, either all of the first and second information for each zoom position stored in the storage unit together with information indicating the time-series zoom positions of the zoom lens at the time of moving image shooting, or the time-series first and second information read from the storage unit based on the zoom positions of the zoom lens at the time of moving image shooting.
- That is, the storage unit stores the first and second information for each zoom position of the zoom lens, and either the per-zoom-position first and second information together with information indicating the time-series zoom positions at the time of moving image shooting, or the time-series first and second information read from the storage unit based on the zoom position at the time of moving image shooting, is recorded in association with the 3D moving image.
- According to another aspect, the stereoscopic imaging device further includes a readout unit that reads the plurality of time-series parallax images, the first information, and the second information from the recording medium; an image processing unit that cuts out the read time-series parallax images for stereoscopic display based on the read first information and generates a plurality of time-series parallax images in which the aberration has been corrected based on the read second information; and an output unit that outputs the plurality of time-series parallax images generated by the image processing unit to a stereoscopic display unit in the apparatus or an external stereoscopic display unit.
- That is, during reproduction the stereoscopic imaging device reads the plurality of time-series parallax images, the first information, and the second information from the recording medium, generates from them a plurality of time-series parallax images (3D moving image) in which the optical axis deviation and the aberration are corrected, and outputs them to a stereoscopic display unit in the apparatus or an external stereoscopic display unit.
- According to another aspect, the stereoscopic imaging apparatus further includes a readout unit that reads the plurality of time-series parallax images and the first, second, and third information from the recording medium; an image processing unit that cuts out the read time-series parallax images for stereoscopic display based on the read first and third information and generates a plurality of time-series parallax images in which the aberration has been corrected based on the read second information; and an output unit that outputs the plurality of time-series parallax images generated by the image processing unit to a stereoscopic display unit in the apparatus or an external stereoscopic display unit.
- That is, this stereoscopic imaging device differs from the device according to the fifth aspect in that the parallax amount of the 3D moving image is adjusted based on the third information before output.
- According to another aspect, the stereoscopic imaging device includes a readout unit that reads the plurality of time-series parallax images and the first, second, and third information from the recording medium; an image processing unit that cuts out the read time-series parallax images for stereoscopic display based on the read first information and generates a plurality of time-series parallax images in which the aberration has been corrected based on the read second information; and an output unit that shifts the plurality of time-series parallax images generated by the image processing unit based on the third information and outputs them to a stereoscopic display unit in the apparatus or an external stereoscopic display unit.
- That is, this stereoscopic imaging device differs from the device according to the sixth aspect in that the third information is used to shift the plurality of parallax images relative to each other (i.e., it is used as an offset amount during stereoscopic display).
- According to another aspect, the stereoscopic imaging device further includes, instead of the output unit or together with the output unit, a recording unit that records the plurality of time-series parallax images generated by the image processing unit on the recording medium.
- That is, the stereoscopic imaging apparatus records the 3D moving image whose aberration and the like have been corrected by the image processing unit on the recording medium. The processed 3D moving image may be recorded in place of the unprocessed 3D moving image, or may be recorded separately from it.
- An image reproducing device according to another aspect includes a readout unit that reads the plurality of time-series parallax images, the first information, and the second information from a recording medium recorded by the stereoscopic imaging device according to the first, second, or fourth aspect; an image processing unit that cuts out the read time-series parallax images for stereoscopic display based on the read first information and generates a plurality of time-series parallax images in which the aberration has been corrected based on the read second information; and a stereoscopic display unit that displays a stereoscopic video based on the plurality of time-series parallax images generated by the image processing unit.
- An image reproducing device according to another aspect includes a readout unit that reads the plurality of time-series parallax images and the first, second, and third information from a recording medium recorded by the stereoscopic imaging device according to the third aspect; an image processing unit that cuts out the read time-series parallax images for stereoscopic display based on the read first and third information and generates a plurality of time-series parallax images in which the aberration has been corrected based on the read second information; and a stereoscopic display unit that displays a stereoscopic video based on the plurality of time-series parallax images generated by the image processing unit.
- An image reproducing device according to another aspect includes a readout unit that reads the plurality of time-series parallax images and the first, second, and third information from a recording medium recorded by the stereoscopic imaging device according to the third aspect; an image processing unit that cuts out the read time-series parallax images for stereoscopic display based on the read first information and generates a plurality of time-series parallax images in which the aberration has been corrected based on the read second information; and a stereoscopic display unit that displays a stereoscopic video by shifting the plurality of time-series parallax images generated by the image processing unit based on the third information.
- That is, the image reproducing device reads from the recording medium the unprocessed 3D moving image that has not been subjected to correction processing such as aberration correction, together with the first and second information, or the first, second, and third information, corrects the unprocessed 3D moving image, and outputs it to the stereoscopic display unit, so that a 3D moving image in which the optical axis shift, lens distortion, and the like have been corrected can be reproduced.
- Editing software according to another aspect causes a computer to realize a reading function of reading the plurality of time-series parallax images, the first information, and the second information from a recording medium recorded by the stereoscopic imaging device according to the first, second, or fourth aspect; an image processing function of cutting out the read time-series parallax images for stereoscopic display based on the read first information and generating a plurality of time-series parallax images in which the aberration has been corrected based on the read second information; and a recording function of recording the generated time-series parallax images on a recording medium.
- Editing software according to another aspect causes a computer to realize a reading function of reading the plurality of time-series parallax images and the first, second, and third information from a recording medium recorded by the stereoscopic imaging device according to the third aspect; an image processing function of cutting out the read time-series parallax images for stereoscopic display based on the read first and third information and generating a plurality of time-series parallax images in which the aberration has been corrected based on the read second information; and a recording function of recording the generated time-series parallax images on a recording medium.
- Editing software according to another aspect causes a computer to realize a reading function of reading the plurality of time-series parallax images and the first, second, and third information from a recording medium recorded by the stereoscopic imaging device according to the third aspect; an image processing function of cutting out the read time-series parallax images for stereoscopic display based on the read first information and generating a plurality of time-series parallax images in which the aberration has been corrected based on the read second information; and an output function of shifting the generated time-series parallax images based on the third information and outputting them to a stereoscopic display unit in the apparatus or an external stereoscopic display unit.
- As described above, according to the present disclosure, the 3D moving image that has not been subjected to correction processing for aberrations such as heavy lens distortion, and the information necessary for that correction processing, are recorded on a recording medium in association with each other. Since heavy correction processing does not have to be performed at the time of moving image shooting, a high-definition 3D moving image can be shot and recorded at a high frame rate.
- At the time of reproduction, correction processing for the optical axis deviation, lens distortion, and the like is performed based on the unprocessed 3D moving image read from the recording medium and the correction information recorded in association with it, so that a corrected 3D moving image can be reproduced.
- FIG. 1A is a diagram (part 1) illustrating an appearance of a stereoscopic imaging device according to the present disclosure.
- FIG. 1B is a diagram (part 2) illustrating an appearance of a stereoscopic imaging device according to the present disclosure.
- FIG. 2 is a block diagram illustrating an embodiment of a stereoscopic imaging device according to the present disclosure.
- FIG. 3 is a flowchart illustrating an operation at the time of 3D moving image shooting of the stereoscopic imaging device according to the present disclosure.
- FIG. 4 is a table showing an example of correction parameters unique to the camera acquired in the inspection before shipment.
- FIG. 5 is a diagram showing the relationship between the left and right full-angle images and the image cut-out area when the zoom position is at the wide end.
- FIG. 6 is a diagram illustrating the relationship between the left and right angle-of-view images and the image cut-out area when the zoom position is moved to the tele side.
- FIG. 7 is a diagram illustrating an example of a file structure of a recording file for recording a 3D moving image.
- FIG. 8 is a flowchart illustrating an operation during playback of a 3D moving image of the stereoscopic imaging apparatus according to the present disclosure.
- FIG. 9A is a diagram for explaining image clipping for correcting optical axis misalignment or the like of the left and right photographing optical systems (part 1).
- FIG. 9B is a diagram for explaining image clipping for correcting optical axis misalignment and the like of the left and right photographing optical systems (part 2).
- FIG. 10 is a flowchart illustrating 3D moving image conversion processing by the stereoscopic imaging device according to the present disclosure.
- FIG. 1A and 1B are diagrams illustrating an external appearance of a stereoscopic imaging apparatus according to the present disclosure
- FIG. 1A is a perspective view of the stereoscopic imaging apparatus viewed from the front side
- FIG. 1B is a rear view.
- This stereoscopic imaging device (compound-eye camera) 10 is a digital camera capable of recording and reproducing 2D/3D still images and 2D/3D moving images, and has a thin, rectangular parallelepiped camera body as shown in FIGS. 1A and 1B.
- a shutter button 11 and a zoom button 12 are disposed on the upper surface of the camera.
- a lens barrier 13 having a width substantially equal to the width in the left-right direction of the camera body is disposed so as to be movable in the up-down direction of the camera body.
- The lens barrier 13 can simultaneously open and close the front surfaces of the pair of left and right photographing optical systems 14-1 and 14-2 by moving in the vertical direction between the position indicated by the solid line and the other illustrated position. As the photographing optical systems 14-1 and 14-2, zoom lenses of a bending optical system are used.
- the camera power supply can be turned on / off in conjunction with the opening / closing operation of the lens front surface by the lens barrier 13.
- a 3D liquid crystal monitor 16 is disposed at the center of the back of the camera body.
- the liquid crystal monitor 16 can display a plurality of parallax images (right-eye image and left-eye image) as directional images each having a predetermined directivity by a parallax barrier.
- A monitor that uses a lenticular lens, or one in which the right-eye image and the left-eye image can be viewed individually by wearing dedicated glasses such as polarized glasses or liquid crystal shutter glasses, is also applicable as the 3D liquid crystal monitor 16.
- the operation switch 18A is a changeover switch for switching between still image shooting and moving image shooting
- the operation switch 18B is a parallax adjustment switch for adjusting the amount of parallax between the right-eye image and the left-eye image
- the operation switch 18C is a changeover switch for switching between 2D shooting and 3D shooting.
- the operation switch 18D is a seesaw key that functions as both a MENU / OK button and a playback button
- the operation switch 18E is a multifunction cross key
- the operation switch 18F is a DISP / BACK key.
- the MENU/OK button is an operation switch having both a function as a menu button for instructing display of a menu on the screen of the liquid crystal monitor 16 and a function as an OK button for instructing confirmation and execution of the selected contents.
- the playback button is a button for switching from the shooting mode to the playback mode.
- the cross key is an operation switch for inputting instructions in four directions, up, down, left, and right.
- Functions such as a macro button, a flash button, and a self-timer button are also assigned to the cross key; when a menu is selected, the cross key functions as a switch (cursor moving operation unit) for selecting an item from the displayed menu screen or for instructing selection of various setting items from each menu.
- the left / right key of the cross key functions as a frame advance (forward / reverse feed) button in the playback mode.
- the DISP / BACK key is used for switching the display form of the liquid crystal monitor 16, canceling the instruction content on the menu screen, or returning to the previous operation state.
- reference numeral 15 indicates a stereo microphone.
- FIG. 2 is a block diagram showing an embodiment of the stereoscopic imaging apparatus 10.
- The stereoscopic imaging apparatus 10 mainly includes a plurality of imaging units 20-1 and 20-2, a central processing unit (CPU) 32, an operation unit 34 including the shutter button 11, the zoom button 12, and the various operation switches described above, a display control unit 36, the liquid crystal monitor 16, a recording control unit 38, a compression/expansion processing unit 42, a digital signal processing unit 44, an AE (Automatic Exposure) detection unit 46, an AF (Auto Focus) detection unit 48, an AWB (Automatic White Balance) detection unit 50, a VRAM (Video RAM) 52, a RAM 54, a ROM 56, an EEPROM (Electrically Erasable and Programmable ROM) 58, and the like. The imaging units 20-1 and 20-2 capture two parallax images having parallax with each other, an image for the left eye and an image for the right eye, but there may be three or more imaging units.
- The imaging unit 20-1 that captures the image for the left eye includes a prism (not shown), a photographing optical system including a focus lens and a zoom lens 21, an optical unit including a diaphragm 22 and a mechanical shutter 23, a solid-state image sensor (CCD) 24, an analog signal processing unit 25, an A/D converter 26, an image input controller 27, a lens driving unit 28 that drives the optical unit, an aperture driving unit 29, a shutter control unit 30, and a CCD control unit 31 that controls the CCD 24.
- the imaging unit 20-2 that captures the image for the right eye has the same configuration as the imaging unit 20-1 that captures the image for the left eye, and thus the description of the specific configuration is omitted.
- the CPU 32 controls the overall operation of the camera according to a predetermined control program based on the input from the operation unit 34.
- the ROM 56 stores a control program executed by the CPU 32 and various data necessary for the control.
- The EEPROM 58 stores various information indicating the results of inspections performed before product shipment, for example pixel defect information of the CCD 24, and correction parameters and tables used for image processing. The details of the various information stored here will be described later.
- the VRAM 52 is a memory for temporarily storing image data for display displayed on the liquid crystal monitor 16, and the RAM 54 includes a calculation work area for the CPU 32 and a temporary storage area for image data.
- the focus lens and zoom lens 21 included in the photographing optical system are driven by the lens driving unit 28 and moved back and forth along the optical axis.
- The CPU 32 controls the lens driving unit 28 to adjust the position of the focus lens so that the subject is in focus, and controls the zoom position of the zoom lens in response to a zoom command from the zoom button 12 of the operation unit 34 to change the zoom magnification.
- the diaphragm 22 is configured by an iris diaphragm, for example, and is driven by the diaphragm driving unit 29 to operate.
- the CPU 32 controls the aperture amount (aperture value) of the aperture 22 via the aperture drive unit 29 and controls the amount of light incident on the CCD 24.
- the mechanical shutter 23 determines the exposure time in the CCD 24 by opening and closing the optical path, and prevents unwanted light from entering the CCD 24 when the image signal is read from the CCD 24, thereby preventing smear.
- the CPU 32 outputs a shutter close signal synchronized with the exposure end time corresponding to the shutter speed to the shutter control unit 30 to control the mechanical shutter 23.
- the CCD 24 is composed of a two-dimensional color CCD solid-state imaging device. A large number of photodiodes are two-dimensionally arranged on the light receiving surface of the CCD 24, and color filters are arranged in a predetermined arrangement on each photodiode.
- the optical image of the subject imaged on the CCD light receiving surface via the optical unit having the above configuration is converted into a signal charge corresponding to the amount of incident light by the photodiode.
- the signal charge accumulated in each photodiode is sequentially read out from the CCD 24 as a voltage signal (image signal) corresponding to the signal charge based on a drive pulse given from the CCD control unit 31 according to a command from the CPU 32.
- the CCD 24 has an electronic shutter function, and the exposure time (shutter speed) is controlled by controlling the charge accumulation time in the photodiode.
- the electronic shutter controls the charge accumulation start time corresponding to the shutter speed, and the exposure end time (charge accumulation end time) is controlled by closing the mechanical shutter 23.
- the CCD 24 is used as the image pickup device, but an image pickup device having another configuration such as a CMOS sensor may be used.
- the analog signals R, G, and B read from the CCD 24 are subjected to correlated double sampling (CDS) and amplification by the analog signal processing unit 25, and then the R, G, and B analog signals are output by the A / D converter 26. Converted to a digital signal.
- The image input controller 27 has a built-in line buffer of a predetermined capacity, temporarily accumulates the R, G, and B image signals (CCD RAW data) A/D converted by the A/D converter 26, and then stores them in the RAM 54 via a bus 60.
- the CPU 32 controls the imaging unit 20-2 that captures the image for the right eye in the same manner as the imaging unit 20-1 that captures the image for the left eye in the 3D shooting mode.
- the AE detection unit 46 calculates subject brightness necessary for AE control based on an image signal captured when the shutter button 11 is half-pressed, and outputs a signal indicating the subject brightness (shooting EV value) to the CPU 32.
- the CPU 32 sets the shutter speed (exposure time), aperture value, and imaging sensitivity in the plurality of imaging units 20-1 and 20-2 according to a predetermined program diagram based on the input imaging EV value.
- the AF detection unit 48 integrates the absolute value of the high frequency component of the image signal in the AF area captured when the shutter button 11 is half-pressed, and outputs this integrated value (AF evaluation value) to the CPU 32.
- the CPU 32 moves the focus lens from the closest position to the infinity side, searches for a focus position where the AF evaluation value detected by the AF detection unit 48 is maximum, and moves the focus lens to the focus position. Adjust the focus on the subject (main subject).
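- A minimal sketch of the contrast-AF search described above, assuming hypothetical helper callables `move_focus_lens()` and `af_evaluation_value()` that stand in for the lens driving unit 28 and the AF detection unit 48:

```python
def contrast_af_search(positions, move_focus_lens, af_evaluation_value):
    """Move the focus lens through candidate positions (closest -> infinity) and
    return the position whose AF evaluation value (integrated high-frequency
    energy in the AF area) is largest."""
    best_pos, best_value = None, float("-inf")
    for pos in positions:
        move_focus_lens(pos)              # drive the focus lens via the lens driving unit
        value = af_evaluation_value()     # integrated |high-frequency| of the AF area
        if value > best_value:
            best_pos, best_value = pos, value
    move_focus_lens(best_pos)             # settle on the in-focus position
    return best_pos
```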
- The AWB detection unit 50 automatically determines the light source type (the color temperature of the scene) based on the R, G, and B image signals acquired at the time of main imaging, and reads the corresponding white balance gains from a table storing the R, G, and B white balance gains (white balance correction values).
- The digital signal processing unit 44 includes a first image processing unit comprising a white balance correction circuit, a gradation conversion processing circuit (for example, a gamma correction circuit), a synchronization circuit that interpolates the spatial shifts of the R, G, and B color signals caused by the color filter array of the single-plate CCD so as to align the positions of the color signals, a contour correction circuit, and a luminance/color-difference signal generation circuit; the first image processing unit performs first image processing on the R, G, and B image signals (CCD RAW data) stored in the RAM 54.
- That is, the R, G, and B CCD RAW data are multiplied in the digital signal processing unit 44 by the white balance gains detected by the AWB detection unit 50 to apply white balance correction, subjected to predetermined processing such as gradation conversion (for example, gamma correction), and then converted into a YC signal composed of a luminance signal (Y signal) and color difference signals (Cr and Cb signals).
- the YC signal processed by the digital signal processing unit 44 is stored in the RAM 54.
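- A simplified NumPy sketch of this first image processing chain (white balance gain, gamma correction, YC conversion). The BT.601-style conversion coefficients and the gamma value are shown only as one common choice; the patent does not specify the exact formulas:

```python
import numpy as np

def first_image_processing(rgb_raw, wb_gains, gamma=2.2):
    """rgb_raw: HxWx3 float array in [0, 1] after demosaicing (synchronization)."""
    rgb = np.clip(rgb_raw * np.asarray(wb_gains), 0.0, 1.0)   # white balance correction
    rgb = rgb ** (1.0 / gamma)                                 # gradation (gamma) conversion
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b                     # luminance signal (Y)
    cb = -0.169 * r - 0.331 * g + 0.500 * b                    # color difference (Cb)
    cr = 0.500 * r - 0.419 * g - 0.081 * b                     # color difference (Cr)
    return np.stack([y, cb, cr], axis=-1)                      # YC signal to be stored in RAM
```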
- The digital signal processing unit 44 also includes a second image processing unit, comprising an image cut-out processing circuit that corrects the optical axis shift of the imaging optical systems of the imaging units 20-1 and 20-2 by cutting out predetermined clipping areas from the left and right viewpoint images during 3D video playback, and a lens distortion correction processing circuit that corrects the lens distortion of those imaging optical systems. The second image processing unit performs second image processing on the YC signal processed by the first image processing unit; details of its processing contents will be described later.
- The compression/decompression processing unit 42 compresses the YC signal stored in the RAM 54 in accordance with a command from the CPU 32 when recording to the memory card 40, and decompresses the compressed data recorded on the memory card 40 back into a YC signal.
- the recording control unit 38 converts the compressed data compressed by the compression / decompression processing unit 42 into an image file of a predetermined format (for example, a 3D still image is an MP (multi-picture) format image file) and records it on the memory card 40. Alternatively, the image file is read from the memory card 40.
- the liquid crystal monitor 16 is used as an image display unit for displaying captured images, and is used as a GUI (graphical user interface) at various settings.
- the liquid crystal monitor 16 is used as an electronic viewfinder for confirming the angle of view in the shooting mode.
- the display control unit 36 alternately displays the left-eye image and the right-eye image held in the VRAM 52 pixel by pixel.
- the left and right images alternately arranged pixel by pixel are visually recognized separately by the left and right eyes of the user observing from a predetermined distance. This enables stereoscopic viewing.
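- As an illustration of how the display control unit 36 might interleave the two viewpoint images column by column for a parallax-barrier monitor, here is a hedged NumPy sketch (the actual hardware ordering of left/right columns is not specified in the text):

```python
import numpy as np

def interleave_for_parallax_barrier(left, right):
    """left, right: HxWx3 arrays. Returns an image whose even pixel columns come
    from the left-eye image and odd columns from the right-eye image, so that a
    parallax barrier directs each set of columns to the corresponding eye."""
    assert left.shape == right.shape
    out = np.empty_like(left)
    out[:, 0::2] = left[:, 0::2]     # left-eye pixels on even columns (assumed ordering)
    out[:, 1::2] = right[:, 1::2]    # right-eye pixels on odd columns
    return out
```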
- the stereoscopic imaging apparatus 10 also has a function of recording and reproducing audio information (audio data) acquired by the stereo microphone 15 shown in FIGS. 1A and 1B.
- Switching to the moving image shooting mode with the operation switch 18A shown in FIG. 1B and to the 3D shooting mode with the operation switch 18C sets the shooting mode for shooting a 3D moving image (hereinafter referred to as the "3D moving image shooting mode").
- The stereoscopic imaging apparatus 10 set to the 3D moving image shooting mode can display a 3D through image on the liquid crystal monitor 16 before or during moving image recording, and while viewing the 3D through image the user can adjust the parallax amount between the left-eye image and the right-eye image of the 3D moving image by operating the parallax amount adjustment switch 18B (see FIG. 1B).
- The parallax amount of the 3D moving image can be increased or decreased by operating the parallax amount adjustment switch 18B in the + direction or the - direction.
- The CPU 32 monitors the operation of the parallax amount adjustment switch 18B, and when the switch is operated, sets the parallax adjustment value H_SHIFT in the internal memory (RAM 54) (steps S10 and S12). When the parallax amount adjustment switch 18B is not operated, the parallax adjustment value is set to zero.
- the CPU 32 determines whether or not there is a moving image shooting instruction (full depression of the shutter button 11) (step S14).
- the CPU 32 first creates a recording file for 3D moving image recording in the memory card 40, and records the data of the table shown in FIG. 4 in its header (step S16).
- the EEPROM 58 shown in FIG. 2 stores data unique to the camera acquired in the inspection before shipment as shown in the table of FIG.
- As shown in the table of FIG. 4, information (first information) indicating the coordinates of the cut-out center (optical axis center) used for cutting an image out of the image covering the full shooting angle of view of the CCD (hereinafter referred to as "all images") and the cut-out size (numbers of vertical and horizontal pixels), and information (second information) indicating the lens distortion correction parameters and calculation formula, are stored separately for the left and right photographing optical systems.
- FIG. 5 and FIG. 6 show the cutting center, cutting size, and the like for all the left and right images at the zoom position ZP1 (wide end; 35 mm angle of view) and the zoom position ZP3 (Tele side; 45 mm angle of view), respectively.
- the clipping center (Lx1, Ly1) for all images for the left eye and the clipping center (Rx1, Ry1) for all images for the right eye are the optical axis centers of the left and right imaging optical systems, respectively. It deviates from the center (x mark) of the CCD imaging surface.
- The lens distortion shown in FIG. 5 is an example of barrel distortion. The lens distortion is inspected in advance, and information (second information) for correcting an image having this type of lens distortion is stored.
- The cut-out centers (Lx3, Ly3) for all images for the left eye and (Rx3, Ry3) for all images for the right eye shown in FIG. 6 differ from the cut-out centers (Lx1, Ly1) and (Rx1, Ry1) shown in FIG. 5. This is because the optical axis centers of the left and right photographing optical systems differ for each zoom position of the zoom lens.
- The lens distortion correction parameters are held for each zoom position of the zoom lens for each of the left and right photographing optical systems, as shown in the table of FIG. 4.
- Note that the cut-out center and cut-out size in this embodiment are values on the image on which lens distortion correction has already been performed. When the lens characteristics of the left and right photographing optical systems are the same, the same lens distortion correction parameters may be used for both.
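- The camera-specific data of FIG. 4 can be thought of as a small lookup table keyed by zoom position and by left/right optical system. A hedged sketch of such a structure follows; the field names and numeric values are illustrative, not taken from the patent:

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class CorrectionParams:
    cutout_center: Tuple[float, float]    # (x, y) optical axis center on the full image
    cutout_size: Tuple[int, int]          # (width, height) in pixels
    distortion_coeffs: Tuple[float, ...]  # coefficients of the lens distortion formula

# first/second information stored in EEPROM, indexed by (zoom position, "L"/"R")
correction_table: Dict[Tuple[str, str], CorrectionParams] = {
    ("ZP1", "L"): CorrectionParams((1012.0, 768.5), (1920, 1080), (1.0e-7, -2.0e-13)),
    ("ZP1", "R"): CorrectionParams((1030.4, 771.2), (1920, 1080), (1.1e-7, -2.1e-13)),
    # ... one entry per zoom position ZP1..ZPn and per eye (values are placeholders)
}
```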
- the CPU 32 determines whether or not the zoom operation using the zoom button 12 has been performed (step S18). When the zoom operation is performed (in the case of “Yes”), the left and right zoom lenses are moved to the zoom position ZPx instructed by the zoom operation (step S20). When the zoom operation is not performed (in the case of “No”), the process proceeds to step S22. The CPU 32 always knows the current zoom position ZPx regardless of whether or not the zoom operation is performed.
- step S22 the CPU 32 operates the AE detection unit 46 to acquire a photographing EV value indicating the brightness of the subject, and operates the AF detection unit 48 to obtain an AF evaluation value indicating the contrast of the subject image in the AF area. Then, the left and right focus lenses are driven so that the AF evaluation value is maintained at the maximum value.
- The CPU 32 determines the shooting conditions (aperture value, shutter speed, shooting sensitivity) for the left and right imaging units 20-1 and 20-2 based on the shooting EV value acquired in step S22 (steps S24-1 and S24-2). Note that if the performance of the left and right imaging units 20-1 and 20-2 varies, or if different lenses are used, the shooting conditions must be determined individually as described above; when the imaging units 20-1 and 20-2 have the same performance, common shooting conditions may be used.
- the CPU 32 performs exposure control in the left and right imaging units 20-1 and 20-2 according to the imaging conditions determined as described above, and acquires left and right viewpoint images (CCDRAW data) for one frame (step S26-). 1, S26-2).
- The obtained CCD RAW data is subjected to image processing such as white balance correction, gamma correction, and YC conversion by the first image processing unit of the digital signal processing unit 44 (steps S28-1 and S28-2). At this stage, the heavy processing by the second image processing unit (the image cut-out processing circuit and the lens distortion correction processing circuit) of the digital signal processing unit 44 is not performed.
- The YC signals before lens distortion correction, which were YC-converted from the CCD RAW data in steps S28-1 and S28-2, are temporarily stored in the RAM 54. Every predetermined number of frames (for example, 60 frames when the frame rate is 60 frames/second), the YC signals are compressed frame by frame, and the compressed, uncorrected full-angle-of-view images are stored in the recording file created in step S16 in association with the current zoom position ZPx of the zoom lens and the parallax adjustment value H_SHIFT set in step S12 (step S30).
- Motion JPEG, H.264, MPEG-4, MPEG-4 MVC, and the like can be applied as the compression method for the moving image.
- FIG. 7 shows an example of the file structure of the recording file.
- In the header (tag) of the recording file, a table of the cut-out centers (optical axis centers), cut-out sizes, and lens distortion correction parameters (including the calculation formula), thumbnails of the 3D moving image (reduced left and right images of a 3D still image representing the 3D moving image), and the parallax adjustment values of the thumbnails are recorded.
- In the main body of the recording file, for each second of the 3D moving image, a packet of the parallax adjustment value H_SHIFT and the zoom position ZPx information (one entry per frame for one second), the compressed 3D moving image (for example, 60 frames of time-series left and right parallax images), and one second of audio data are recorded.
- Such per-second information is recorded continuously for the duration of the moving image shooting.
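- A hedged sketch of this recording-file layout, expressed as Python data classes; the field names and packet granularity are assumptions used only to make the structure concrete, and the actual container format is not specified here:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class FileHeader:                      # recorded in the tag (header) of the file
    correction_table: Dict             # FIG. 4 table: cut-out centers/sizes, distortion params
    thumbnail_lr: bytes                # reduced left/right images representing the 3D movie
    thumbnail_parallax: int            # parallax adjustment value of the thumbnail

@dataclass
class OneSecondChunk:                  # repeated in the file body for each second of video
    h_shift: int                       # parallax adjustment value H_SHIFT for this second
    zoom_positions: List[str]          # zoom position ZPx, one per frame (e.g., 60 entries)
    compressed_frames: List[bytes]     # compressed left/right parallax images, uncorrected
    audio: bytes                       # one second of audio data

@dataclass
class RecordingFile:
    header: FileHeader
    body: List[OneSecondChunk] = field(default_factory=list)
```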
- In step S32, the CPU 32 determines whether or not there is an instruction to end shooting of the 3D moving image. For example, if the shutter button 11 was fully pressed in step S14 and is then fully pressed again, the CPU 32 determines that a shooting end instruction has been given.
- When the shooting end instruction is given, any remaining 3D moving image data of less than one second is recorded in the recording file, and the shooting and recording of the 3D moving image are then ended.
- the stereoscopic imaging apparatus 10 can play back the image recorded on the memory card 40.
- When a recording file in which a 3D moving image is recorded is selected and its reproduction is instructed (step S50), the CPU 32 reads the data of the table shown in FIG. 4 from the tag of the recording file (see FIG. 7) and stores it in the internal memory (RAM 54) (step S52).
- the CPU 32 reads the 3D moving image and the zoom position ZPx and the parallax adjustment value H_SHIFT recorded in association with the 3D moving image from the main body of the recording file (step S54).
- The read 3D moving image (compressed YC signal) is decompressed by the compression/decompression processing unit 42 and restored to an uncompressed YC signal (step S56), and is then subjected to image processing by the second image processing unit of the digital signal processing unit 44 (steps S58 to S64).
- The restored YC signals of the left and right viewpoint images are first subjected to lens distortion correction by the lens distortion correction processing circuit.
- This lens distortion correction processing reads the corresponding lens distortion correction parameters and cut-out center (optical axis center) coordinates from the table shown in FIG. 4, based on the zoom position ZPx recorded in association with each frame.
- A new coordinate value is obtained by substituting the coordinate value of each pixel of the viewpoint image, measured from the optical axis center, into the calculation formula that uses the lens distortion correction parameters as coefficients, and the pixel is moved to that coordinate value.
- In this way, parallax images with corrected lens distortion are obtained (step S58; see FIGS. 5 and 6).
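- A minimal sketch of this remapping step. The radial polynomial r' = r(1 + k1·r² + k2·r⁴) is a common form of lens distortion model and is used here only as an example; the patent leaves the actual calculation formula to the stored second information:

```python
import numpy as np

def correct_lens_distortion(img, center, coeffs):
    """Remap pixels so that coordinates measured from the optical axis center are
    moved according to a radial polynomial (example model with k1, k2 coefficients)."""
    h, w = img.shape[:2]
    k1, k2 = coeffs
    ys, xs = np.indices((h, w), dtype=np.float64)
    x = xs - center[0]
    y = ys - center[1]
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2            # correction factor from stored parameters
    # for each output pixel, sample the source at the remapped location (nearest neighbour)
    src_x = np.clip(np.round(center[0] + x * scale), 0, w - 1).astype(int)
    src_y = np.clip(np.round(center[1] + y * scale), 0, h - 1).astype(int)
    return img[src_y, src_x]
```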
- the parallax images of all the left and right angles of view that have been subjected to lens distortion correction are cut out by adjusting the cut-out areas respectively by the image cut-out processing circuit, and the optical axis deviations of the left and right photographing optical systems are corrected. That is, the image cutout processing circuit reads out the coordinates of the cutout center (optical axis center) and the cutout size information from the RAM 54 in correspondence with the zoom positions ZPx of the parallax images of all angles of view, and based on these information, The left and right parallax images in which the optical axis shift for 3D display is corrected are cut out from the parallax images at the angle of view (step S60).
- the left and right parallax images with the corrected optical axis deviation are cut out from the left and right parallax images with the corrected lens distortion shown in FIG. 9A.
- When the parallax adjustment value H_SHIFT is set, the image cut-out processing circuit cuts out the images in cut-out areas offset in the left-right direction by half of the parallax adjustment value H_SHIFT, as shown in FIG. 9C (steps S62 and S64). In the example illustrated in FIG. 9C, the cut-out is performed in the direction in which the amount of parallax increases; when the sign of the parallax adjustment value H_SHIFT is reversed, the cut-out is performed in the direction in which the amount of parallax decreases.
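- A hedged sketch of this cut-out step: each cut-out area is centered on the stored optical axis center for its eye, and when a parallax adjustment value H_SHIFT is set, the left and right areas are offset horizontally by half of it in opposite directions (the sign convention and lack of bounds checking are simplifications of this sketch):

```python
def cut_out(image, center, size, h_shift=0, eye="L"):
    """Cut a (w, h) region centered on `center` out of the full-angle-of-view image.
    The left/right cut-out centers are shifted by +/- h_shift/2 to adjust parallax."""
    w, h = size
    offset = h_shift / 2 if eye == "L" else -h_shift / 2   # opposite shift per eye (assumed sign)
    cx = int(round(center[0] + offset))
    cy = int(round(center[1]))
    x0, y0 = cx - w // 2, cy - h // 2
    return image[y0:y0 + h, x0:x0 + w]

# usage: cut the optical-axis-corrected, parallax-adjusted views for 3D display
# left_view  = cut_out(left_full,  params_left.cutout_center,  params_left.cutout_size,  h_shift, "L")
# right_view = cut_out(right_full, params_right.cutout_center, params_right.cutout_size, h_shift, "R")
```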
- the left and right parallax images cut out as described above are output to the liquid crystal monitor 16 (stereoscopic display unit) via the display control unit 36, and are displayed as a 3D image (step S66).
- In step S68, it is determined whether the video frames have ended or whether the end of moving image reproduction has been instructed; if "No", the process returns to step S54.
- the reproduction process is repeated for each frame, and a 3D moving image is displayed on the liquid crystal monitor 16.
- the output destination of the 3D moving image is not limited to the liquid crystal monitor 16 and may be an external 3D display device.
- When the video frames end or the end of moving image reproduction is instructed (in the case of "Yes" in step S68), the 3D moving image reproduction process is terminated (step S70).
- an image that has been subjected to parallax adjustment based on the parallax adjustment value H_SHIFT is further cut out from the cut-out image.
- the coordinates of the cut-out center of the left and right parallax images are offset by a value that is half the parallax adjustment value H_SHIFT, and the offset cut-out center and cut-out size are used to cut out from the image of the full angle of view at once. May be.
- the cutout image may be subjected to parallax adjustment based on the parallax adjustment value H_SHIFT (shifting the image).
- In this case, regions of width H_SHIFT/2 at the left and right edges of the display image may be filled with black or the like; of course, the image may also be displayed as it is, without being filled.
- When the operation switch 18D (MENU/OK button) of the stereoscopic imaging apparatus 10 is operated to display the moving image conversion menu on the liquid crystal monitor 16 and execution of the moving image conversion menu is instructed, the 3D moving image conversion process is started (step S80).
- a 3D moving image to be converted is selected.
- the selection of the 3D moving image can be performed by displaying a thumbnail of the 3D moving image and selecting a desired thumbnail.
- by displaying the thumbnail before conversion processing and the thumbnail after conversion processing so as to be identifiable by a marker, characters, or the like, it is possible to prevent the conversion-processed 3D moving image from being selected.
- When an instruction to convert the selected 3D moving image is given, lens distortion correction processing, image cut-out processing for correcting the optical axis deviation, and the like are performed frame by frame as in steps S52 to S64 described above.
- When the image processing for the prescribed number of frames corresponding to one second is completed, the left and right parallax images are compressed (step S86) and either overwritten onto the recording file or stored in a newly created recording file (step S88).
- Steps S54 to S88 are repeated until the last video frame has been processed. In this way, a converted 3D moving image recording file in which the lens distortion, the optical axis center deviation, and the like have been corrected can be generated.
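- Putting the pieces together, the conversion process of FIG. 10 amounts to the following loop. This is a schematic sketch reusing the hypothetical helpers and data classes sketched above; the actual codec and container handling (here the assumed `decode`/`encode` callables) are omitted:

```python
def convert_recording_file(recording_file, correction_table, decode, encode):
    """Apply distortion correction, optical-axis cut-out, and parallax adjustment to every
    frame of an unprocessed recording file and return the converted per-second chunks."""
    converted = []
    for chunk in recording_file.body:                      # one chunk per second of video
        out_frames = []
        for frame_bytes, zp in zip(chunk.compressed_frames, chunk.zoom_positions):
            left, right = decode(frame_bytes)              # decompress L/R parallax images
            pl, pr = correction_table[(zp, "L")], correction_table[(zp, "R")]
            left = correct_lens_distortion(left, pl.cutout_center, pl.distortion_coeffs)
            right = correct_lens_distortion(right, pr.cutout_center, pr.distortion_coeffs)
            left = cut_out(left, pl.cutout_center, pl.cutout_size, chunk.h_shift, "L")
            right = cut_out(right, pr.cutout_center, pr.cutout_size, chunk.h_shift, "R")
            out_frames.append(encode(left, right))         # re-compress the corrected pair
        converted.append(out_frames)
    return converted
```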
- In this embodiment, the stereoscopic imaging apparatus 10 performs the 3D moving image reproduction process shown in FIG. 8; however, the present invention is not limited to this, and a 3D reproduction apparatus having a 3D display may perform the reproduction process.
- the 3D playback device acquires the recording file of the 3D moving image recorded by the stereoscopic imaging device 10 from the recording medium directly or via communication means.
- Similarly, in this embodiment the stereoscopic imaging apparatus 10 performs the 3D moving image conversion process shown in FIG. 10; however, the present invention is not limited to this, and a personal computer in which editing software that performs the 3D moving image conversion process shown in FIG. 10 is installed, or another external device, may take the unprocessed 3D moving image as input and perform the conversion process.
- In this embodiment, the coordinates of the cut-out center and the cut-out size information are stored for the image after lens distortion correction; alternatively, they may be stored for the image before distortion correction. In that case, an image in which the deviation of the optical axis center is corrected is first cut out from the image before lens distortion correction, and the lens distortion of the cut-out image is then corrected.
- In this embodiment, the coordinates of the cut-out center (optical axis center) and the cut-out size information are provided for each zoom position ZP; however, the present invention is not limited to this, and diagonal coordinate values of the cut-out area may be provided instead.
- the present disclosure is not limited to correction of lens distortion of the photographing optical system, but can be applied to correction of other aberrations such as chromatic aberration.
- In this embodiment, the table shown in FIG. 4 is stored in the tag of the recording file and the zoom position ZPx information is recorded in association with each frame of the moving image. Alternatively, the data selected from the table shown in FIG. 4 based on ZPx may be recorded directly in association with the frames of the moving image; in this case, it is not necessary to record the table shown in FIG. 4 or the zoom position ZPx information.
- DESCRIPTION OF SYMBOLS: 10 ... Stereoscopic imaging device, 11 ... Shutter button, 12 ... Zoom button, 16 ... Liquid crystal monitor, 20-1, 20-2 ... Imaging unit, 21 ... Focus lens and zoom lens, 24 ... CCD, 25 ... Analog signal processing unit, 32 ... Central processing unit (CPU), 34 ... Operation unit, 44 ... Digital signal processing unit, 54 ... RAM, 56 ... ROM, 58 ... EEPROM
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Stereoscopic And Panoramic Photography (AREA)
- Camera Data Copying Or Recording (AREA)
- Studio Devices (AREA)
Abstract
Description
FIGS. 1A and 1B are diagrams illustrating the appearance of a stereoscopic imaging apparatus according to the present disclosure; FIG. 1A is a perspective view of the stereoscopic imaging apparatus seen from the front side, and FIG. 1B is a rear view.
FIG. 2 is a block diagram showing an embodiment of the stereoscopic imaging apparatus 10.
Next, the operation of the stereoscopic imaging apparatus 10 according to the present disclosure at the time of 3D moving image shooting will be described with reference to the flowchart shown in FIG. 3.
Returning to FIG. 3, the CPU 32 determines whether or not a zoom operation using the zoom button 12 has been performed (step S18). When a zoom operation has been performed ("Yes"), the left and right zoom lenses are moved to the zoom position ZPx designated by the zoom operation (step S20). When no zoom operation has been performed ("No"), the process proceeds to step S22. The CPU 32 always knows the current zoom position ZPx regardless of whether a zoom operation has been performed.
Next, the operation of the stereoscopic imaging apparatus 10 according to the present disclosure at the time of 3D moving image reproduction will be described with reference to the flowchart shown in FIG. 8.
Next, the 3D moving image conversion processing of the stereoscopic imaging apparatus 10 according to the present disclosure will be described with reference to the flowchart shown in FIG. 10. Parts common to the flowchart of moving image reproduction shown in FIG. 8 are given the same step numbers, and their detailed description is omitted.
In this embodiment, the stereoscopic imaging apparatus 10 performs the 3D moving image reproduction process shown in FIG. 8; however, the present invention is not limited to this, and a 3D reproduction apparatus having a 3D display may perform the 3D moving image reproduction process shown in FIG. 8. In that case, the 3D reproduction apparatus acquires the recording file of the 3D moving image recorded by the stereoscopic imaging apparatus 10 directly from the recording medium or via communication means.
Claims (14)
- 撮影光学系と該撮影光学系を介して結像される被写体像をそれぞれ光電変換する撮像素子とを有する複数の撮像部であって、互いに視差を有する視差画像を撮像する複数の撮像部と、
各撮影光学系の光軸中心位置及び切り出しサイズ、又は各撮影光学系の光軸中心位置を中心にした切り出しエリアを示す第1の情報と、各撮影光学系の収差を示す第2の情報とが記憶された記憶部と、
前記複数の撮像部から立体動画となる時系列の複数の視差画像を出力させる撮像制御部と、
動画撮影時に前記複数の撮像部から出力される時系列の複数の視差画像を記録媒体に記録するとともに、前記記憶部から読み出した前記第1の情報及び第2の情報を前記複数の視差画像に関連付けて前記記録媒体に記録する記録部と、
を備える立体撮像装置。 - 前記第1の情報は、前記撮影光学系の収差を補正する前の視差画像、又は収差を補正した後の視差画像に対する情報である、請求項1に記載の立体撮像装置。
- 前記複数の撮像部から出力される複数の視差画像間の視差量を調整する視差量調整部をさらに備え、
前記記録部は、前記視差量調整部により調整された視差量を示す第3の情報を前記複数の視差画像に関連付けて前記記録媒体に記録する、請求項1又は2に記載の立体撮像装置。 - 前記撮影光学系はズームレンズであり、
前記記憶部は、前記ズームレンズのズーム位置毎に前記第1の情報及び第2の情報を記憶し、
前記記録部は、前記記憶部に記憶された前記ズーム位置毎の第1の情報及び第2の情報の全情報と動画撮影時の前記ズームレンズの時系列のズーム位置を示す情報、又は動画撮影時の前記ズームレンズのズーム位置に基づいて前記記憶部から読み出した時系列の前記第1の情報及び第2の情報を、前記複数の視差画像に関連付けて前記記録媒体に記録する請求項1から3のいずれかに記載の立体撮像装置。 - 前記記録媒体から時系列の複数の視差画像と前記第1の情報及び第2の情報とを読み出す読出部と、
an image processing unit that cuts out the read plurality of time-series parallax images for stereoscopic display based on the read first information, and generates a plurality of time-series parallax images whose aberration is corrected based on the read second information; and
an output unit that outputs the plurality of time-series parallax images generated by the image processing unit to a stereoscopic display unit within the device or an external stereoscopic display unit,
the stereoscopic imaging device according to claim 1, 2 or 4, further comprising the above. - A reading unit that reads the plurality of time-series parallax images and the first information, second information and third information from the recording medium;
an image processing unit that cuts out the read plurality of time-series parallax images for stereoscopic display based on the read first information and third information, and generates a plurality of time-series parallax images whose aberration is corrected based on the read second information; and
an output unit that outputs the plurality of time-series parallax images generated by the image processing unit to a stereoscopic display unit within the device or an external stereoscopic display unit,
the stereoscopic imaging device according to claim 3, further comprising the above. - A reading unit that reads the plurality of time-series parallax images and the first information, second information and third information from the recording medium;
an image processing unit that cuts out the read plurality of time-series parallax images for stereoscopic display based on the read first information, and generates a plurality of time-series parallax images whose aberration is corrected based on the read second information; and
an output unit that shifts the plurality of time-series parallax images generated by the image processing unit based on the third information and outputs them to a stereoscopic display unit within the device or an external stereoscopic display unit,
the stereoscopic imaging device according to claim 3, further comprising the above. - The stereoscopic imaging device according to any one of claims 5 to 7, further comprising a recording unit that records the plurality of time-series parallax images generated by the image processing unit on the recording medium, instead of or together with the output unit.
- A reading unit that reads a plurality of time-series parallax images and the first information and second information from the recording medium recorded by the stereoscopic imaging device according to claim 1, 2 or 4;
an image processing unit that cuts out the read plurality of time-series parallax images for stereoscopic display based on the read first information, and generates a plurality of time-series parallax images whose aberration is corrected based on the read second information; and
a stereoscopic display unit that displays a stereoscopic moving image based on the plurality of time-series parallax images generated by the image processing unit,
an image reproducing device comprising the above. - A reading unit that reads a plurality of time-series parallax images and the first information, second information and third information from the recording medium recorded by the stereoscopic imaging device according to claim 3;
an image processing unit that cuts out the read plurality of time-series parallax images for stereoscopic display based on the read first information and third information, and generates a plurality of time-series parallax images whose aberration is corrected based on the read second information; and
a stereoscopic display unit that displays a stereoscopic moving image based on the plurality of time-series parallax images generated by the image processing unit,
an image reproducing device comprising the above. - A reading unit that reads a plurality of time-series parallax images and the first information, second information and third information from the recording medium recorded by the stereoscopic imaging device according to claim 3;
an image processing unit that cuts out the read plurality of time-series parallax images for stereoscopic display based on the read first information, and generates a plurality of time-series parallax images whose aberration is corrected based on the read second information; and
a stereoscopic display unit that shifts the plurality of time-series parallax images generated by the image processing unit based on the third information and displays a stereoscopic moving image,
an image reproducing device comprising the above. - A reading function of reading a plurality of time-series parallax images and the first information and second information from the recording medium recorded by the stereoscopic imaging device according to claim 1, 2 or 4;
an image processing function of cutting out the read plurality of time-series parallax images for stereoscopic display based on the read first information, and generating a plurality of time-series parallax images whose aberration is corrected based on the read second information; and
a recording function of recording the generated plurality of time-series parallax images on a recording medium,
editing software that causes a computer to realize the above. - A reading function of reading a plurality of time-series parallax images and the first information, second information and third information from the recording medium recorded by the stereoscopic imaging device according to claim 3;
an image processing function of cutting out the read plurality of time-series parallax images for stereoscopic display based on the read first information and third information, and generating a plurality of time-series parallax images whose aberration is corrected based on the read second information; and
a recording function of recording the generated plurality of time-series parallax images on a recording medium,
editing software that causes a computer to realize the above. - A reading function of reading a plurality of time-series parallax images and the first information, second information and third information from the recording medium recorded by the stereoscopic imaging device according to claim 3;
an image processing function of cutting out the read plurality of time-series parallax images for stereoscopic display based on the read first information, and generating a plurality of time-series parallax images whose aberration is corrected based on the read second information; and
an output function of shifting the plurality of time-series parallax images generated by the image processing unit based on the third information and outputting them to a stereoscopic display unit within the device or an external stereoscopic display unit,
editing software that causes a computer to realize the above.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012508018A JP5166650B2 (ja) | 2010-03-29 | 2010-11-09 | Stereoscopic imaging device, image reproducing device, and editing software |
US13/638,555 US9256116B2 (en) | 2010-03-29 | 2010-11-09 | Stereoscopic imaging device, image reproducing device, and editing software |
CN201080065964.4A CN102860015B (zh) | 2010-03-29 | 2010-11-09 | 立体成像装置、图像再现装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010075750 | 2010-03-29 | ||
JP2010-075750 | 2010-03-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011121837A1 true WO2011121837A1 (ja) | 2011-10-06 |
Family
ID=44711611
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/069913 WO2011121837A1 (ja) | 2010-03-29 | 2010-11-09 | Stereoscopic imaging device, image reproducing device, and editing software |
Country Status (4)
Country | Link |
---|---|
US (1) | US9256116B2 (ja) |
JP (1) | JP5166650B2 (ja) |
CN (1) | CN102860015B (ja) |
WO (1) | WO2011121837A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103220460A (zh) * | 2012-01-20 | 2013-07-24 | Altek Corporation | Image processing method and device thereof |
JP2013247518A (ja) * | 2012-05-25 | 2013-12-09 | Sharp Corp | Spatial information calculation system |
TWI625051B (zh) * | 2017-03-21 | 2018-05-21 | Himax Technologies, Inc. | Depth sensing device |
WO2022219985A1 (ja) * | 2021-04-12 | 2022-10-20 | Sony Semiconductor Solutions Corporation | Information processing method, information processing device, and program |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4787369B1 (ja) * | 2010-03-30 | 2011-10-05 | Fujifilm Corporation | Image processing apparatus and method, and program |
US20140225991A1 (en) * | 2011-09-02 | 2014-08-14 | Htc Corporation | Image capturing apparatus and method for obtaining depth information of field thereof |
WO2013042311A1 (ja) * | 2011-09-20 | 2013-03-28 | Panasonic Corporation | Stereoscopic video processing device and stereoscopic video processing method |
TWI524735B (zh) * | 2012-03-30 | 2016-03-01 | Altek Corporation | Three-dimensional image generation method and device |
DE102013012988A1 (de) * | 2013-08-03 | 2015-02-05 | Carl Zeiss Microscopy Gmbh | Method for calibrating a digital optical imaging system, method for correcting imaging errors in a digital optical imaging system, and digital optical imaging system |
CN105023275B (zh) * | 2015-07-14 | 2018-08-28 | Tsinghua University | Super-resolution light field acquisition device and three-dimensional reconstruction method thereof |
US11147647B2 (en) * | 2016-03-30 | 2021-10-19 | Sony Olympus Medical Solutions Inc. | Medical stereoscopic observation device, medical stereoscopic observation method, and program |
CN107801012B (zh) * | 2017-10-30 | 2019-05-17 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | White balance processing method and device, electronic device, and computer-readable storage medium |
US11765336B2 (en) * | 2019-03-18 | 2023-09-19 | Sony Group Corporation | Image-capturing apparatus, information processing method, and program |
JP2023079119A (ja) | 2021-11-26 | 2023-06-07 | Canon Inc. | Image processing device, imaging device, and image processing method |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004007304A (ja) * | 2002-06-03 | 2004-01-08 | Fuji Photo Film Co Ltd | Digital photographing device |
JP2004128565A (ja) * | 2002-09-30 | 2004-04-22 | Fuji Photo Film Co Ltd | Digital camera |
JP2008131551A (ja) * | 2006-11-24 | 2008-06-05 | Konica Minolta Opto Inc | Imaging device |
JP2008141518A (ja) * | 2006-12-01 | 2008-06-19 | Fujifilm Corp | Photographing device |
JP2009302657A (ja) * | 2008-06-10 | 2009-12-24 | Canon Inc | Image processing apparatus, control method, and program |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08317424A (ja) | 1995-05-19 | 1996-11-29 | Olympus Optical Co Ltd | Stereoscopic photographing device |
JP2002247593A (ja) | 2001-02-16 | 2002-08-30 | Olympus Optical Co Ltd | Image processing device |
JP4397573B2 (ja) | 2002-10-02 | 2010-01-13 | Honda Motor Co., Ltd. | Image processing device |
WO2006062325A1 (en) * | 2004-12-06 | 2006-06-15 | Electronics And Telecommunications Research Institute | Apparatus for correcting image distortion of stereo-camera and method thereof |
CN101001309A (zh) * | 2006-01-13 | 2007-07-18 | Hongfujin Precision Industry (Shenzhen) Co., Ltd. | Imaging system and method |
- 2010
- 2010-11-09 WO PCT/JP2010/069913 patent/WO2011121837A1/ja active Application Filing
- 2010-11-09 JP JP2012508018A patent/JP5166650B2/ja not_active Expired - Fee Related
- 2010-11-09 US US13/638,555 patent/US9256116B2/en not_active Expired - Fee Related
- 2010-11-09 CN CN201080065964.4A patent/CN102860015B/zh not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
JP5166650B2 (ja) | 2013-03-21 |
CN102860015B (zh) | 2015-08-26 |
CN102860015A (zh) | 2013-01-02 |
US9256116B2 (en) | 2016-02-09 |
JPWO2011121837A1 (ja) | 2013-07-04 |
US20130038699A1 (en) | 2013-02-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5166650B2 (ja) | Stereoscopic imaging device, image reproducing device, and editing software | |
JP4875225B2 (ja) | Stereoscopic imaging device | |
JP4897940B2 (ja) | Stereoscopic imaging device | |
JP4692770B2 (ja) | Compound-eye digital camera | |
JP5722975B2 (ja) | Imaging device, shading correction method for imaging device, and program for imaging device | |
JP5640143B2 (ja) | Imaging device and imaging method | |
JP4686795B2 (ja) | Image generation device and image reproduction device | |
WO2011136190A1 (ja) | Stereoscopic image reproducing device and method, stereoscopic imaging device, and stereoscopic display device | |
WO2011136191A1 (ja) | Stereoscopic image reproducing device and method, stereoscopic imaging device, and stereoscopic display device | |
JP5433109B2 (ja) | Stereoscopic imaging device and stereoscopic imaging method | |
US9310672B2 (en) | Stereoscopic image capturing device and method of controlling thereof | |
WO2012101916A1 (ja) | Stereoscopic moving image processing device, stereoscopic moving image processing program and recording medium therefor, stereoscopic imaging device, and stereoscopic moving image processing method | |
JP2010237582A (ja) | Stereoscopic imaging device and stereoscopic imaging method | |
WO2012105120A1 (ja) | Stereoscopic moving image reproducing device, stereoscopic moving image reproducing program and recording medium therefor, stereoscopic display device, stereoscopic imaging device, and stereoscopic moving image reproducing method | |
JP2012124650A (ja) | Imaging device and imaging method | |
JP2010200024A (ja) | Stereoscopic image display device and stereoscopic image display method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080065964.4 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10849010 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012508018 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13638555 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 10849010 Country of ref document: EP Kind code of ref document: A1 |