WO2014157270A1 - Image processing device, imaging device, program, and image processing method - Google Patents
Image processing device, imaging device, program, and image processing method
- Publication number
- WO2014157270A1 (international application PCT/JP2014/058407)
- Authority
- WO
- WIPO (PCT)
Classifications
- H04N23/62 — Control of parameters via user interfaces (cameras or camera modules comprising electronic image sensors; control thereof)
- G02B7/34 — Systems for automatic generation of focusing signals using different areas in a pupil plane
- H04N23/631 — Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
- H04N23/64 — Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
- H04N23/672 — Focus control based on electronic image sensor signals, based on the phase difference signals
- H04N23/843 — Demosaicing, e.g. interpolating colour pixel values
- H04N25/134 — Arrangement of colour filter arrays [CFA] characterised by the spectral characteristics of the filter elements, based on three different wavelength filter elements
- H04N25/704 — Pixels specially adapted for focusing, e.g. phase difference pixel sets
Definitions
- The present invention relates to an image processing apparatus, an imaging apparatus, a program, and an image processing method, and in particular to an image processing apparatus, an imaging apparatus, a program, and an image processing method that generate and display an image for confirming the in-focus state of a subject image.
- In imaging apparatuses such as digital cameras and camera-equipped mobile phones, a split image is displayed within the live view image (a so-called through image) to facilitate manual focus adjustment (so-called manual focus).
- The split image is an image obtained by combining part of a right-eye image and part of a left-eye image corresponding to a predetermined area of the subject image.
- In the split image, a shift along the parallax generation direction arises between the right-eye image and the left-eye image according to the in-focus state.
- The user confirms the in-focus state by visually checking this shift between the right-eye image and the left-eye image.
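As a rough illustration (a sketch, not the patent's implementation), the split-image composition described above can be expressed with NumPy by stacking one divided region taken from the right-eye image over the corresponding region of the left-eye image; function and variable names here are invented for illustration:

```python
import numpy as np

def make_split_image(right_eye: np.ndarray, left_eye: np.ndarray) -> np.ndarray:
    """Compose a two-band split image: the top half comes from the
    right-eye image, the bottom half from the left-eye image. When the
    subject is in focus the two halves line up along the parallax
    (horizontal) direction; when out of focus they appear shifted."""
    if right_eye.shape != left_eye.shape:
        raise ValueError("the two parallax images must have the same shape")
    half = right_eye.shape[0] // 2
    return np.vstack([right_eye[:half], left_eye[half:]])
```

A real implementation would divide each image into more than two bands and select among them, but the alignment cue the user inspects is the same.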
- Patent Document 1 discloses an imaging apparatus capable of switching and displaying a split image and a part of a live view image when performing manual focus.
- This imaging device generates image data of a live view image based on a signal obtained by photoelectrically converting an optical image formed by a light beam incident on an imaging optical system from a subject.
- Further, the imaging apparatus generates a split image corresponding to the phase difference between two optical images, based on signals obtained by photoelectrically converting the two optical images formed by the light beam divided in two.
- The imaging apparatus also generates a partial image corresponding to part of the image data, and during manual focus switches between displaying the split image and the partial image on the display means.
- There is also known an imaging apparatus that includes a touch panel and, during manual focus, allows part of the focus adjustment to be performed by a simple operation via the touch panel.
- Patent Document 2 discloses an imaging apparatus capable of performing focus control by a simple operation when focus control is performed in a focus detection region designated by a touch panel.
- the imaging apparatus includes a focus adjustment unit that extracts a focus signal indicating a focused state of a subject image included in a focus detection region within an imaging range and adjusts the position of the focus lens based on the focus signal.
- the imaging apparatus includes a touch panel that is provided on the surface of a display unit that displays an image and receives designation of the position of the focus detection region within the imaging range.
- During manual focus, the focus adjustment unit extracts a focus signal in the focus detection region designated via the touch panel while moving the focus lens within a predetermined range around the position to which the lens was moved by the user operation, and adjusts the position of the focus lens based on that focus signal.
- Patent Document 3 discloses, as a touch-screen scrolling method, a technique for performing a scroll operation in response to a circular drag gesture.
- However, the technique of Patent Document 3 is directed at the problem of scrolling an entire screen with a single touch operation; even if it were simply applied to the techniques of Patent Documents 1 and 2 described above, adjusting the position of the focus lens could feel like an unnatural operation to the user.
- The present invention has been made in view of the above problems, and an object thereof is to provide an image processing apparatus, an imaging apparatus, a program, and an image processing method capable of performing focusing control using a split image through an intuitive operation.
- To achieve this object, an image processing apparatus according to the present invention comprises: a generation unit that, from an imaging element having first and second pixel groups on which subject images having passed through first and second regions of an imaging lens including a focus lens are formed after pupil division, takes a first image based on the image signal output from the first pixel group and a second image based on the image signal output from the second pixel group, and generates a display image used for focusing confirmation, in which a first divided image, selected from a plurality of divided images obtained by dividing the first image along a predetermined division direction, is arranged together with a second divided image, selected from the divided images obtained by dividing the second image along the division direction, excluding the divided image representing the divided region corresponding to the first divided image; a display unit having a display area with a touch panel provided on its surface; a display control unit that performs control to display the display image generated by the generation unit on the display unit; a first detection unit that detects, via the touch panel, that a selection operation on the first divided image or the second divided image has been performed while the display image is displayed on the display unit; a second detection unit that detects, via the touch panel, that a movement operation on the display image in a crossing direction intersecting the division direction has been performed; and a focusing control unit that, when the selection operation is detected by the first detection unit and the movement operation is subsequently detected by the second detection unit, controls a moving unit, which moves the focus lens in the optical axis direction, so as to move the focus lens according to the movement operation.
- According to the image processing apparatus of the present invention, the generation unit generates the display image (corresponding to a split image) from the first and second pixel groups, on which the subject images that have passed through the first and second regions of the imaging lens including the focus lens are formed after pupil division.
- The display control unit performs control to display the display image generated by the generation unit on the display unit, which has a display area with a touch panel provided on its surface.
- The first detection unit detects, via the touch panel, that a selection operation on the first divided image or the second divided image in the display image has been performed.
- The second detection unit detects, via the touch panel, that a movement operation on the display image in the crossing direction intersecting the division direction has been performed.
- The moving unit moves the focus lens in the optical axis direction.
- When the selection operation is detected by the first detection unit and the movement operation is then detected by the second detection unit, the focusing control unit controls the moving unit so as to move the focus lens according to the movement operation.
- In other words, when the first divided image or the second divided image in the display image (split image) is selected via the touch panel and a movement operation is further performed, the image processing apparatus moves the focus lens according to that operation.
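The select-then-drag control flow just described might be modeled as a small state machine; the following is a hypothetical sketch (class and method names are invented, and the lens-position units are arbitrary):

```python
class FocusGestureController:
    """Move the focus lens only after a divided image has been selected
    and a subsequent drag in the crossing direction is detected."""

    def __init__(self):
        self.selected = None      # which divided image was touched, if any
        self.lens_position = 0.0  # position along the optical axis

    def on_select(self, divided_image: str) -> None:
        """First detection unit: record the selected divided image."""
        self.selected = divided_image

    def on_move(self, delta_x: float) -> float:
        """Second detection unit: ignore drags until a selection has been
        made; otherwise move the lens by an amount tied to the drag."""
        if self.selected is None:
            return self.lens_position
        self.lens_position += delta_x
        return self.lens_position
```

The key behavior is that a drag without a preceding selection does nothing, which is what distinguishes this gesture from ordinary scrolling.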
- In the image processing apparatus according to the present invention, the generation unit may further generate a second display image used for confirming the imaging range, based on the image signal output from the imaging element, and the display control unit may perform control to further display this second display image on the display unit.
- The imaging element may further include a third pixel group on which the subject image that has passed through the imaging lens is formed without pupil division and which outputs a third image, and the generation unit may generate the second display image based on the third image output from the third pixel group. Focusing control using the split image can then be performed while the imaging range is checked.
- In the image processing apparatus according to the present invention, the focusing control unit may perform a first determination of whether the selection operation detected by the first detection unit is a selection operation on the first divided image or on the second divided image, and a second determination of the direction of the movement operation detected by the second detection unit, and may determine the movement direction of the focus lens based on the results of the first and second determinations and control the moving unit to move the focus lens. Focusing control can thus be performed by a simple operation on the split image.
- In the image processing apparatus according to the present invention, the first image may be a right-eye image and the second image a left-eye image. When the result of the first determination is a selection operation on the right-eye image and the result of the second determination is a rightward movement operation as viewed by the operator observing the display unit, the focusing control unit may control the moving unit to move the focus lens in the direction that brings the in-focus position closer to the image sensor than the current in-focus position; when the result of the first determination is a selection operation on the right-eye image and the result of the second determination is a leftward movement operation as viewed by the operator, it may control the moving unit to move the focus lens in the direction that takes the in-focus position farther from the image sensor than the current in-focus position.
- Similarly, when the result of the first determination is a selection operation on the left-eye image and the result of the second determination is a rightward movement operation as viewed by the operator observing the display unit, the focusing control unit may control the moving unit to move the focus lens in the direction that takes the in-focus position farther from the image sensor than the current in-focus position; when the result of the first determination is a selection operation on the left-eye image and the result of the second determination is a leftward movement operation, it may control the moving unit to move the focus lens in the direction that brings the in-focus position closer to the image sensor than the current in-focus position.
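The four selection/direction combinations above amount to a small lookup table; a hypothetical sketch (the string labels are illustrative, not from the patent):

```python
def lens_direction(selected_image: str, swipe_direction: str) -> str:
    """Map (selected divided image, swipe direction as seen by the
    operator) to the lens movement: 'nearer' brings the in-focus
    position closer to the image sensor, 'farther' takes it away."""
    table = {
        ("right_eye", "right"): "nearer",
        ("right_eye", "left"): "farther",
        ("left_eye", "right"): "farther",
        ("left_eye", "left"): "nearer",
    }
    return table[(selected_image, swipe_direction)]
```

The symmetry (swapping either the selected image or the swipe direction flips the result) is what makes the gesture feel intuitive: the user drags the out-of-focus half toward alignment.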
- In the image processing apparatus according to the present invention, the first divided image and the second divided image may be arranged adjacent to each other in the division direction within the display image, and a third detection unit may detect, via the touch panel, a movement operation that crosses the boundary line between the first and second divided images. When the selection operation is detected by the first detection unit and such a movement operation is then detected, the focusing control unit may control the moving unit to move the focus lens according to the movement operation. Focusing control can thus be performed by a simple operation on the split image.
- Further, the focusing control unit may perform a third determination of the position of at least one of the first and second divided images with respect to the boundary line, a fourth determination of the direction of the movement operation detected by the second detection unit, and a fifth determination of the direction of the movement operation detected by the third detection unit, determine the movement direction of the focus lens based on the results of these determinations, and control the moving unit to move the focus lens. Focusing control can thus be performed by a simple operation on the split image.
- In the image processing apparatus according to the present invention, the first image may be a right-eye image and the second image a left-eye image, and the result of the third determination may be that the second divided image is positioned above the boundary line as viewed by the operator observing the display unit. In this case, when the movement operations detected successively by the second and third detection units trace a clockwise circle as viewed by the operator, in either order of detection, the focusing control unit may control the moving unit to move the focus lens in the direction that takes the in-focus position farther from the image sensor than the current in-focus position. Focusing control can thus be performed by the simple operation of a clockwise drag on the split image.
- Conversely, under the same third-determination result, when the movement operations detected successively by the second and third detection units trace a counterclockwise circle as viewed by the operator, the focusing control unit may control the moving unit to move the focus lens in the direction that brings the in-focus position closer to the image sensor than the current in-focus position. Focusing control can thus be performed by the simple operation of a counterclockwise drag on the split image.
- When the result of the third determination is instead that the first divided image is positioned above the boundary line as viewed by the operator, a clockwise circular drag detected via the second and third detection units may cause the focusing control unit to control the moving unit to move the focus lens in the direction that brings the in-focus position closer to the image sensor than the current in-focus position. Focusing control can thus be performed on the split image by the simple operation of a clockwise drag.
- Similarly, when the first divided image is determined to be above the boundary line as viewed by the operator observing the display unit, a counterclockwise circular drag detected via the second and third detection units may cause the focusing control unit to control the moving unit to move the focus lens in the direction that takes the in-focus position farther from the image sensor than the current in-focus position. Focusing control can thus be performed on the split image by the simple operation of a counterclockwise drag.
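All four circular-drag variants reduce to inferring the rotation sense from two successive horizontal strokes on opposite sides of the boundary line; which lens direction a given sense maps to then depends on which divided image lies above the boundary, as described above. A hedged sketch (names invented for illustration):

```python
def rotation_sense(upper_stroke: str, lower_stroke: str) -> str:
    """Infer the rotation sense of a circular drag from the horizontal
    stroke over the upper half and the stroke over the lower half:
    right-over-the-top then left-along-the-bottom traces a clockwise
    circle, and the reverse traces a counterclockwise one."""
    if (upper_stroke, lower_stroke) == ("right", "left"):
        return "clockwise"
    if (upper_stroke, lower_stroke) == ("left", "right"):
        return "counterclockwise"
    return "none"
```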
- In the image processing apparatus according to the present invention, when the selection operation detected by the first detection unit is a selection operation on both the first and second divided images, and the movement operations detected by the second detection unit proceed in mutually different directions along the crossing direction in the two divided images, the focusing control unit may determine the movement direction of the focus lens based on the movement directions of the operations on the first and second divided images and control the moving unit to move the focus lens.
- Here, the first image may be a right-eye image and the second image a left-eye image. When the selection operation detected by the first detection unit covers both divided images and, as viewed by the operator observing the display unit, the movement operation on the right-eye image is rightward while the movement operation on the left-eye image is leftward, the focus lens may be moved in the direction that brings the in-focus position closer to the image sensor than the current in-focus position; conversely, when the movement operation on the right-eye image is leftward and that on the left-eye image is rightward, the moving unit may be controlled to move the focus lens in the direction that takes the in-focus position farther from the image sensor. Since both the right-eye and left-eye images must be designated, erroneous operation is prevented, and focusing control using the split image can be performed by an intuitive operation.
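The two-image gesture above resembles a spread/squeeze along the parallax direction; a hypothetical decision function (labels are illustrative):

```python
def two_image_gesture(right_eye_dir: str, left_eye_dir: str) -> str:
    """When both divided images are dragged in opposite horizontal
    directions, spreading them apart (right-eye image dragged right,
    left-eye image dragged left) moves the in-focus position nearer the
    sensor; the reverse squeeze moves it farther. Same-direction drags
    are ignored, which is what suppresses accidental operation."""
    if (right_eye_dir, left_eye_dir) == ("right", "left"):
        return "nearer"
    if (right_eye_dir, left_eye_dir) == ("left", "right"):
        return "farther"
    return "ignored"
```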
- In the image processing apparatus according to the present invention, while the contact operation on the touch panel in the movement operation detected by the second detection unit is continued, the focusing control unit may control the moving unit to move the focus lens in the optical axis direction according to the contact position of the contact operation. The focal position of the imaging lens can thereby be finely adjusted by an intuitive operation.
- Alternatively, the focusing control unit may control the moving unit to move the focus lens continuously in the optical axis direction in the movement direction corresponding to the movement operation. Focusing control using the split image can thereby be performed easily by an intuitive operation.
- In the image processing apparatus according to the present invention, the focusing control unit may control the moving unit to move the focus lens at a moving speed corresponding to the operation speed of the movement operation. The moving speed of the focus lens can thereby be adjusted by an intuitive and simple operation.
- The focusing control unit may also control the moving unit to move the focus lens by a movement amount corresponding to the operation movement amount of the movement operation. The movement amount of the focus lens can thereby be adjusted by an intuitive and simple operation.
- Further, the focusing control unit may refrain from controlling the moving unit to move the focus lens when the operation speed of the movement operation is less than a predetermined first threshold. Erroneous operation can thereby be prevented.
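The proportional-movement and dead-zone behavior described in the last few paragraphs can be sketched as follows; the gain and threshold values are invented for illustration, not taken from the patent:

```python
def lens_step(drag_pixels: float, drag_seconds: float,
              gain: float = 0.5, speed_threshold: float = 20.0) -> float:
    """Convert a drag into a lens movement amount proportional to the
    drag distance, ignoring drags slower than speed_threshold (px/s);
    this plays the role of the 'first threshold' above, suppressing
    accidental touches."""
    if drag_seconds <= 0:
        return 0.0
    speed = abs(drag_pixels) / drag_seconds
    if speed < speed_threshold:
        return 0.0
    return gain * drag_pixels
```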
- The image processing apparatus according to the present invention may further include a fourth detection unit that detects, via the touch panel, that a contact operation has been performed at an arbitrary position in the display area after the selection operation on the touch panel has been released while the focus lens is being moved by the moving unit. When the contact operation is detected by the fourth detection unit, the focusing control unit may control the moving unit to stop the movement of the focus lens. The movement of the focus lens can thereby be stopped by a simple operation.
- In this case, the focusing control unit may control the moving unit to decelerate the moving speed of the focus lens before stopping it. The focus position of the focus lens can thereby be finely adjusted by a simple operation.
- In the image processing apparatus according to the present invention, when the movement time of the movement operation detected by the second detection unit is equal to or greater than a predetermined second threshold, the focusing control unit may control the moving unit to move the focus lens in the optical axis direction according to the contact position while the contact operation on the touch panel is continued; when the movement time is less than the second threshold, it may control the moving unit to move the focus lens continuously in the optical axis direction in the movement direction corresponding to the movement operation.
- The image processing apparatus according to the present invention may further include a fifth detection unit that detects the in-focus state of the display image while the focus lens is being moved under the focusing control unit, and the focusing control unit may perform control to stop the movement of the focus lens when the fifth detection unit detects that focus has been achieved. The focus lens can thereby be moved to the in-focus position by a simple operation.
- Alternatively, the image processing apparatus may include the fifth detection unit, which detects the in-focus state of the display image, and a notification unit that notifies the user when the fifth detection unit detects that focus has been achieved. The notification unit may notify the in-focus state by vibrating the part of the touch panel where the touch operation is performed. The user can thereby be notified reliably and promptly that focus has been achieved.
- The fifth detection unit may detect the in-focus state of the display image based on the contrast of the display image; whether the captured image is in focus can then be determined at high speed.
- Alternatively, the fifth detection unit may detect the in-focus state based on the phase difference between the first divided image and the second divided image in the display image; whether the captured image is in focus can then be determined with high accuracy.
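The two in-focus checks just mentioned, contrast and phase difference, can be sketched with generic focus measures; these are textbook formulations, not the patent's specific method, and all names are illustrative:

```python
import numpy as np

def contrast_score(img: np.ndarray) -> float:
    """Contrast-based focus measure: mean squared finite difference in
    both axes. The score peaks when the image is sharpest."""
    f = img.astype(float)
    return float((np.diff(f, axis=0) ** 2).mean()
                 + (np.diff(f, axis=1) ** 2).mean())

def phase_shift(row_a: np.ndarray, row_b: np.ndarray, max_shift: int = 4) -> int:
    """Estimate the horizontal displacement between corresponding rows
    of the two divided images by minimizing the mean absolute
    difference over candidate shifts; a shift of zero indicates focus."""
    a, b = row_a.astype(float), row_b.astype(float)
    n = len(a)
    best, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        seg_a = a[max(0, s):n + min(0, s)]
        seg_b = b[max(0, -s):n - max(0, s)]
        err = float(np.abs(seg_a - seg_b).mean())
        if err < best_err:
            best, best_err = s, err
    return best
```

Contrast measures are cheap (hence "high speed"), while the phase-difference estimate directly reports how far and in which direction the lens must move (hence "high accuracy").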
- Further, in the image processing apparatus according to the present invention, the first divided image and the second divided image may be arranged adjacent to each other in the division direction in the display image, a third detection unit may detect, via the touch panel, a movement operation that passes through the boundary line between the first divided image and the second divided image, discontinuously with the movement operation detected by the second detection unit, and the display control unit may perform control to enlarge or reduce the display image according to the operation direction of the movement operation detected by the third detection unit. As a result, the subject image is easy to see, and the in-focus state can be easily confirmed.
- Further, in the image processing apparatus according to the present invention, the first divided image and the second divided image may be arranged adjacent to each other in the division direction in the display image, and a third detection unit may detect, via the touch panel, a movement operation that passes through the boundary line between the first divided image and the second divided image, discontinuously with the movement operation detected by the second detection unit. In this case, the third detection unit may detect two contact positions of the touch operation on the touch panel together with a movement operation in a direction in which the two contact positions are separated from each other, and the display control unit may perform control to reduce the display image. Thereby, malfunction can be suppressed compared with the case where the contact position is a single point.
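As a non-limiting sketch, detecting whether two contact positions move apart or toward each other can be done by comparing their distance at the start and end of the movement operation. The names and the threshold are hypothetical, and the mapping of each gesture to the enlarge or reduce control is left to the design described above.

```python
import math

def classify_two_point_move(p0_start, p1_start, p0_end, p1_end, threshold=10.0):
    """Classify a two-finger movement operation by the change in distance
    between the two contact positions (pixel units; threshold assumed)."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    delta = dist(p0_end, p1_end) - dist(p0_start, p1_start)
    if delta > threshold:
        return "apart"      # the two contact positions separated
    if delta < -threshold:
        return "approach"   # the two contact positions approached
    return "none"
```

Requiring two simultaneous contact points, as in the aspect above, makes an accidental single-finger swipe unable to trigger the resize control.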
- Further, in the image processing apparatus according to the present invention, the display control unit may stop the control for enlarging the display image when, as a result of that control, the size of the display image becomes equal to the size of the entire display area. Further, in the image processing apparatus according to the present invention, when the size of the display image becomes larger than the entire display area as a result of the control for enlarging the display image, the display control unit may perform control to display a part of the display image in the display area. Thereby, the in-focus state of the split image can be easily confirmed.
- Further, in the image processing apparatus according to the present invention, the display control unit may stop the control for reducing the display image when, while the display image is being reduced after having been enlarged, the size of the display image returns to its size before the enlargement. Thereby, the split image can be prevented from becoming difficult to see.
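The stopping rules of the two preceding paragraphs amount to clamping the display scale. A minimal sketch, assuming a normalized scale where 1.0 is the pre-enlargement size and an illustrative upper bound stands in for "fills the entire display area":

```python
def next_scale(scale, step, enlarging, min_scale=1.0, max_scale=4.0):
    """One step of the enlarge/reduce control: reduction stops at the
    pre-enlargement size (min_scale); enlargement stops at an assumed
    upper bound (max_scale), e.g. where the image fills the display area."""
    scale = scale + step if enlarging else scale - step
    return max(min_scale, min(max_scale, scale))
```

Repeated calls therefore saturate at the two limits instead of shrinking the split image below its original size or enlarging it without bound.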
- In order to achieve the above object, an imaging apparatus according to the present invention includes the image processing apparatus according to the present invention, an imaging lens, an imaging element having the first and second pixel groups, and a memory that stores an image generated based on the image signals output from the imaging element.
- The imaging apparatus according to the present invention operates in the same manner as the image processing apparatus according to the present invention. Therefore, as with the image processing apparatus according to the present invention, focus control using the split image can be performed by an intuitive operation.
- In order to achieve the above object, a program according to the present invention causes a computer to function as: a generation unit that generates, based on a first image based on the image signal output from the first pixel group and a second image based on the image signal output from the second pixel group in an imaging element having first and second pixel groups on which a subject image that has passed through first and second regions of an imaging lens including a focus lens is pupil-divided and formed, a display image used for focusing confirmation in which a first divided image selected from a plurality of divided images obtained by dividing the first image in a predetermined division direction and a second divided image selected from a plurality of divided images obtained by dividing the second image in the division direction, excluding the divided image representing the divided region corresponding to the first divided image, are arranged; a display control unit that performs control to display the display image generated by the generation unit on a display unit having a touch panel on the surface of its display area; a first detection unit that detects, via the touch panel, that a selection operation of the first divided image or the second divided image has been performed on the display image displayed on the display unit; a second detection unit that detects, via the touch panel, that a movement operation in an intersecting direction that intersects the division direction has been performed on the display image; and a focus control unit that, when the second detection unit detects the movement operation following the selection operation detected by the first detection unit, performs control to move the focus lens according to the movement operation with respect to a moving unit that moves the focus lens in the optical axis direction.
- According to the program of the present invention, focus control using the split image can be performed by an intuitive operation, as in the image processing apparatus according to the present invention.
- In order to achieve the above object, an image processing method according to the present invention includes: a generation step of generating, based on a first image based on the image signal output from the first pixel group and a second image based on the image signal output from the second pixel group in an imaging element having first and second pixel groups on which a subject image that has passed through first and second regions of an imaging lens including a focus lens is pupil-divided and formed, a display image used for focusing confirmation in which a first divided image selected from a plurality of divided images obtained by dividing the first image in a predetermined division direction and a second divided image selected from a plurality of divided images obtained by dividing the second image in the division direction, excluding the divided image representing the divided region corresponding to the first divided image, are arranged; a display control step of performing control to display the display image generated by the generation step on a display unit having a touch panel on the surface of its display area; and a focus control step of performing control, in a state where the display image is displayed on the display unit, to move the focus lens according to the movement operation with respect to a moving unit that moves the focus lens in the optical axis direction.
- The image processing method according to the present invention operates in the same manner as the image processing apparatus according to the present invention. Therefore, as with the image processing apparatus according to the present invention, focus control using the split image can be performed by an intuitive operation.
- FIG. 3 is a schematic arrangement diagram illustrating an example of the arrangement of color filters and light shielding members provided in the image sensor of the imaging apparatus according to the first embodiment. A schematic configuration diagram showing an example of the configuration of the phase difference pixels (first pixel and second pixel) of the image sensor of the imaging apparatus according to the first embodiment.
- A graph showing, for the imaging apparatus according to the first embodiment, the relationship between the position of the focus lens and the elapsed time normalized by the time from when the focus lens starts moving until it stops, when focusing control is performed based on the phase difference between the right-eye image and the left-eye image.
- A graph showing, for the imaging apparatus according to the first embodiment, the relationship between the position of the focus lens and the elapsed time normalized by the time from when the focus lens starts moving until it stops, when focusing control is performed based on the contrast of the right-eye image and the left-eye image. A flowchart showing the flow of processing of the reciprocal movement processing routine program according to the first embodiment. A front view showing an example of the display state of the split image when a reciprocal movement operation is performed in the imaging apparatus according to the first embodiment. A front view showing another example of the display state of the split image when a reciprocal movement operation is performed in the imaging apparatus according to the first embodiment.
- FIG. 3 is a schematic arrangement diagram illustrating an example of an arrangement of color filters and an arrangement of light shielding members provided in an imaging element of the imaging apparatus according to the first embodiment.
- FIG. 14 is a front view for explaining an example of a split image display state when a split image enlargement operation is performed in the imaging apparatus according to the fourth embodiment.
- FIG. 15 is a front view for explaining an example of a split image display state when a split image reduction operation is performed in the imaging apparatus according to the fourth embodiment.
- A flowchart showing the flow of processing of another example (Example 2) of the enlargement/reduction control processing routine program according to the fourth embodiment.
- The imaging apparatus 100 is a lens-interchangeable camera. As shown in FIG. 1, the imaging apparatus 100 is a digital camera that includes a camera body 200 and an interchangeable lens 258 (a zoom lens replaceably attached to the camera body 200), and in which the reflex mirror is omitted.
- the interchangeable lens 258 includes an imaging lens 16 having a focus lens 302 movable in the optical axis direction, a focus ring 260, a slide mechanism 303, and a motor 304 (see FIG. 3, details will be described later).
- the camera body 200 is provided with a hybrid finder (registered trademark) 220.
- the hybrid viewfinder 220 here refers to a viewfinder in which, for example, an optical viewfinder (hereinafter referred to as “OVF”) and an electronic viewfinder (hereinafter referred to as “EVF”) are selectively used.
- OVF optical viewfinder
- EVF electronic viewfinder
- the camera body 200 and the interchangeable lens 258 are interchangeably mounted by combining a mount 256 provided in the camera body 200 and a mount 346 (see FIG. 3) on the interchangeable lens 258 side corresponding to the mount 256.
- the lens barrel of the interchangeable lens 258 is provided with a focus ring 260 that is used in the manual focus mode.
- The imaging apparatus 100 can move the focus lens 302 in the optical axis direction in accordance with the rotation operation of the focus ring 260, so that subject light is imaged on the imaging element 20 (see FIG. 3), described later, at a focus position corresponding to the subject distance.
- The front surface of the camera body 200 is provided with an OVF viewfinder window 241 included in the hybrid viewfinder 220.
- A finder switching lever 214 is provided on the front surface of the camera body 200. When the finder switching lever 214 is rotated in the direction of the arrow SW, the image displayed in the viewfinder is switched between an optical image visible by the OVF and an electronic image (live view image) visible by the EVF (described later).
- the OVF optical axis L2 is an optical axis different from the optical axis L1 of the interchangeable lens 258.
- a release switch 211 and a dial 212 for setting a shooting mode, a playback mode, and the like are mainly provided on the upper surface of the camera body 200.
- the release switch 211 serving as a shooting preparation instruction unit and a shooting instruction unit is configured to detect a two-stage pressing operation between a shooting preparation instruction state and a shooting instruction state.
- The shooting preparation instruction state refers to a state where the release switch 211 is pressed from the standby position to an intermediate position (half-pressed position), and the shooting instruction state refers to a state where the release switch 211 is pressed down beyond the intermediate position to the final pressed position (fully pressed position).
- half-pressed state a state where the button is pressed from the standby position to the half-pressed position
- full-pressed state a state where the button is pressed from the standby position to the fully-pressed position
- the shooting mode and the playback mode are selectively set as the operation mode in accordance with a user instruction.
- a manual focus mode and an autofocus mode are selectively set according to a user instruction.
- In the autofocus mode, a shooting control process described later is executed when the release switch 211 is pressed halfway, and then exposure (shooting) is performed when the release switch 211 is fully pressed.
- An OVF viewfinder eyepiece 242, a display input unit 216, a cross key 222, a MENU/OK key 224, and a BACK/DISP button 225 are provided on the back of the camera body 200.
- The display input unit 216 is a so-called touch-panel display that displays images (still images and moving images), character information, and the like to visually convey information to the user, and that detects user operations on the displayed information. That is, the display input unit 216 includes a display unit 213 and a touch panel 215.
- the display unit 213 is realized by an LCD (Liquid Crystal Display), an organic EL (Organic Electroluminescence) display, or the like.
- the display unit 213 is used to display a continuous frame image (such as a live view image) obtained by capturing continuous frames in the shooting mode.
- the display unit 213 is also used to display a single frame image (such as a still image) obtained by capturing a single frame when a still image shooting instruction is given. Furthermore, the display unit 213 is also used for displaying a reproduced image and a menu screen in the reproduction mode.
- The touch panel 215 is a device that is stacked on the display unit 213 so that an image displayed on the display surface of the display unit 213 remains visible, and that detects coordinates indicating the position touched by a user's finger, a stylus, or the like.
- When the touch panel 215 is operated with a user's finger or a stylus, the touch panel 215 outputs a detection signal indicating the touched position generated by the operation to the CPU 12.
- The size of the display area of the display unit 213 and the size of the touch panel 215 may be completely matched, but the two do not necessarily have to match.
- Examples of the position detection method employed in the touch panel 215 include a matrix switch method, a resistive film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, and a capacitance method; any of these methods may be adopted.
- the cross key 222 functions as a multi-function key that outputs various command signals such as menu selection, zoom and frame advance.
- The MENU/OK key 224 is an operation key having both a function as a menu button for instructing display of a menu on the screen of the display unit 213 and a function as an OK button for instructing confirmation and execution of the selected content.
- the BACK / DISP button 225 is used to delete a desired object such as a selection item, cancel a designated content, or return to the previous operation state.
- FIG. 3 is a block diagram illustrating an example of a hardware configuration of the imaging apparatus 100 according to the present embodiment.
- the interchangeable lens 258 includes a slide mechanism 303 and a motor 304 which are an example of a moving unit according to the present invention.
- the slide mechanism 303 moves the focus lens 302 in the direction of the optical axis L1 by operating the focus ring 260.
- a focus lens 302 is attached to the slide mechanism 303 so as to be slidable in the direction of the optical axis L1.
- a motor 304 is connected to the slide mechanism 303, and the slide mechanism 303 receives the power of the motor 304 and slides the focus lens 302 along the direction of the optical axis L1.
- the motor 304 is connected to the camera body 200 via mounts 256 and 346, and driving is controlled according to a command from the camera body 200.
- a stepping motor is applied as an example of the motor 304. Therefore, the motor 304 operates in synchronization with the pulse power according to a command from the camera body 200.
- the imaging apparatus 100 is a digital camera that records captured still images and moving images, and the overall operation of the camera is controlled by a CPU (Central Processing Unit) 12 shown in FIG. As shown in the figure, the imaging apparatus 100 includes an operation unit 14, an interface unit 24, a memory 26, and an encoder 34 in addition to the CPU 12.
- the imaging apparatus 100 includes a display control unit 36, an eyepiece detection unit 37, and an external interface (I / F) 39, which are examples of the display control unit according to the present invention.
- the imaging apparatus 100 includes an image processing unit 28.
- the CPU 12, the operation unit 14, the interface unit 24, the memory 26, the image processing unit 28, the encoder 34, the display control unit 36, the eyepiece detection unit 37, and the external interface (I / F) 39 are connected to each other via the bus 40.
- the memory 26 includes a non-volatile storage area (such as an EEPROM) that stores parameters, programs, and the like, and a volatile storage area (such as an SDRAM) that temporarily stores various information such as images.
- the CPU 12 performs focusing control by driving and controlling the focus adjustment motor so that the contrast value of the image obtained by imaging is maximized. Further, the CPU 12 calculates AE information that is a physical quantity indicating the brightness of an image obtained by imaging. When the release switch 211 is half-pressed, the CPU 12 derives the shutter speed and F value corresponding to the brightness of the image indicated by the AE information. Then, the exposure state is set by controlling each related part so that the derived shutter speed and F value are obtained.
- the operation unit 14 is a user interface operated by the operator when giving various instructions to the imaging apparatus 100. Various instructions received by the operation unit 14 are output as operation signals to the CPU 12, and the CPU 12 executes processing according to the operation signals input from the operation unit 14.
- the operation unit 14 includes a release switch 211, a dial 212, a display unit 213, a viewfinder switching lever 214, a cross key 222, a MENU / OK key 224, and a BACK / DISP button 225.
- the camera body 200 includes a position detection unit 23.
- the position detection unit 23 is connected to the CPU 12.
- the position detection unit 23 is connected to the focus ring 260 via mounts 256 and 346, detects the rotation angle of the focus ring 260, and outputs rotation angle information indicating the rotation angle as a detection result to the CPU 12.
- the CPU 12 executes processing according to the rotation angle information input from the position detection unit 23.
- Image light representing the subject is imaged on the light receiving surface of the color image sensor (for example, a CMOS sensor) 20 via the imaging lens 16, which includes the manually movable focus lens 302, and the shutter 18.
- the signal charge accumulated in the image sensor 20 is sequentially read out as a digital signal corresponding to the signal charge (voltage) by a read signal applied from the device control unit 22.
- the image sensor 20 has a so-called electronic shutter function, and controls the charge accumulation time (shutter speed) of each photosensor according to the timing of the read signal by using the electronic shutter function.
- the image sensor 20 according to the first embodiment is a CMOS image sensor, but is not limited thereto, and may be a CCD image sensor.
- the image sensor 20 is provided with a color filter 21 as shown in FIG.
- The color filter 21 includes a first filter G corresponding to G (green), which contributes most to obtaining a luminance signal, a second filter R corresponding to R (red), and a third filter B corresponding to B (blue).
- A G filter, an R filter, and a B filter are arranged with a predetermined periodicity in each of the row direction (horizontal direction) and the column direction (vertical direction) for the pixels of the image sensor 20. Therefore, the imaging apparatus 100 can perform processing according to the repetitive pattern when performing synchronization (interpolation) processing of the R, G, and B signals.
- the synchronization process is a process for calculating all color information for each pixel from a mosaic image corresponding to a color filter array of a single-plate color image sensor.
- the synchronization process means a process for calculating color information of all RGB for each pixel from a mosaic image made of RGB.
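As a non-limiting sketch, the synchronization (demosaicing) process can be illustrated by averaging, for each pixel and each color, the nearest pixels that carry that color. The 2 × 2 pattern used here is an assumption for illustration; the patent's actual filter array and repetition-pattern-specific processing are more elaborate.

```python
import numpy as np

def synchronize(mosaic, pattern):
    """Minimal demosaic sketch: `pattern` is a 2x2 tile of 'R'/'G'/'B'
    labels describing the color filter array; each output channel is the
    mean of same-color pixels in the 3x3 neighborhood."""
    mosaic = np.asarray(mosaic, dtype=float)
    h, w = mosaic.shape
    colors = np.empty((h, w), dtype="<U1")
    for y in range(h):
        for x in range(w):
            colors[y, x] = pattern[y % 2][x % 2]
    out = np.zeros((h, w, 3))
    for y in range(h):
        for x in range(w):
            ys = slice(max(0, y - 1), y + 2)
            xs = slice(max(0, x - 1), x + 2)
            for c, name in enumerate("RGB"):
                mask = colors[ys, xs] == name
                out[y, x, c] = mosaic[ys, xs][mask].mean()
    return out
```

Every pixel thus receives full RGB information, which is the defining property of the synchronization process described above.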
- the imaging apparatus 100 has a phase difference AF function.
- the image sensor 20 according to the present embodiment includes a plurality of phase difference detection pixels that are used when the phase difference AF function is activated.
- the plurality of phase difference detection pixels are arranged according to a predetermined pattern.
- the pixel for phase difference detection is either the first pixel L in which the left half in the horizontal direction is shielded or the second pixel R in which the right half in the horizontal direction is shielded.
- Hereinafter, when it is not necessary to distinguish between the first pixel L and the second pixel R, they are referred to as "phase difference pixels".
- the first pixel L has a light shielding member 20A
- the second pixel R has a light shielding member 20B.
- the light shielding member 20A is provided on the front side (microlens M side) of the photodiode PD, and shields the left half of the light receiving surface of the photodiode PD.
- the light shielding member 20B is provided on the front side of the photodiode PD, and shields the right half of the light receiving surface of the photodiode PD.
- The microlens M and the light shielding members 20A and 20B function as a pupil division unit: the first pixel L receives only the left side of the optical axis of the light flux passing through the exit pupil of the imaging lens 16, and the second pixel R receives only the right side. In this way, the light flux passing through the exit pupil is divided into left and right by the microlens M and the light shielding members 20A and 20B serving as pupil dividing portions, and enters the first pixel L and the second pixel R, respectively.
- Of the subject image (left-eye image) corresponding to the left half of the light flux passing through the exit pupil of the imaging lens 16 and the subject image (right-eye image) corresponding to the right half, the regions that are in focus (in an in-focus state) are imaged at the same position (in-focus position) on the image sensor 20. In contrast, a so-called front-focused region, which is in focus at a position closer to the imaging apparatus 100 than the subject, and a so-called rear-focused region, which is in focus at a position farther from the imaging apparatus 100 than the subject, are each incident at different positions on the image sensor 20 (the phase is shifted).
- the split image refers to an image in which a right eye divided image and a left eye divided image are arranged adjacent to each other in a predetermined direction (for example, a direction orthogonal to the parallax generation direction).
- the right-eye divided image refers to a partial divided image selected from a plurality of divided images obtained by dividing the right-eye image in a predetermined division direction (for example, a direction orthogonal to the parallax generation direction).
- The left-eye divided image refers to a divided image selected from a plurality of divided images obtained by dividing the left-eye image in the division direction, excluding the images representing the divided regions corresponding to the right-eye divided image. In FIGS. 7, 8, 13, 14, 23, 24, 25A, 25B, 36, 37, 38, 46, and 47, for convenience of explanation, a column image indicating a column included in the right-eye divided image is referred to as a right-eye image, and a column image indicating a column included in the left-eye divided image is referred to as a left-eye image.
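As a non-limiting sketch, composing a split image from the two pupil-divided images can be illustrated by stacking alternating bands taken along the division direction. The band count and the use of full-width horizontal bands are assumptions for illustration.

```python
import numpy as np

def make_split_image(left_eye, right_eye, n_bands=4):
    """Compose a split image by stacking alternating horizontal bands
    from the left-eye and right-eye images (division direction assumed
    orthogonal to the horizontal parallax direction)."""
    assert left_eye.shape == right_eye.shape
    h = left_eye.shape[0]
    edges = np.linspace(0, h, n_bands + 1, dtype=int)  # band boundaries
    bands = []
    for i in range(n_bands):
        src = left_eye if i % 2 == 0 else right_eye    # alternate sources
        bands.append(src[edges[i]:edges[i + 1]])
    return np.vstack(bands)
```

Out of focus, features in adjacent bands appear laterally offset at each boundary; in focus, they line up, which is what the operator confirms on the display unit 213.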
- In the rear-focused region (when, of two columns, the subject with the longer subject distance, i.e., the one farther from the imaging lens 16, is in focus), as viewed from the operator observing the display unit 213, the right-eye image, which is the column image indicating the column with the shorter subject distance of the two columns, is shifted to the left, and the corresponding left-eye image is shifted to the right. Conversely, in the front-focused region (when the subject with the shorter subject distance, i.e., the one closer to the imaging lens 16, is in focus), the right-eye image, which is the column image indicating the column with the longer subject distance of the two columns, is shifted to the right, and the corresponding left-eye image is shifted to the left.
- the imaging device 100 according to the present embodiment can acquire parallax images having different parallaxes between the right-eye image and the left-eye image.
- the imaging apparatus 100 detects a phase shift amount based on the pixel value of the first pixel L and the pixel value of the second pixel R. Then, the adjustment of the focal position of the focus lens 302 by the user operation is assisted by presenting the detected phase shift amount.
- Hereinafter, when it is not necessary to distinguish between the light shielding members 20A and 20B, they are referred to as "light shielding members" without reference numerals.
- the image sensor 20 is classified into a first pixel group, a second pixel group, and a third pixel group.
- the first pixel group refers to a plurality of first pixels L, for example.
- the second pixel group refers to a plurality of second pixels R, for example.
- the third pixel group refers to a plurality of normal pixels, for example.
- the “normal pixel” here refers to, for example, a pixel other than the phase difference pixel (for example, a pixel having no light shielding members 20A and 20B).
- the RAW image indicated by the image signal output from the first pixel group is referred to as a “first image”.
- the RAW image indicated by the image signal output from the second pixel group is referred to as a “second image”.
- the RAW image indicated by the image signal output from the third pixel group is referred to as a “third image”.
- each pixel included in the first pixel group and the second pixel group is disposed at a position where the positions in the horizontal direction are aligned within one pixel between the first pixel group and the second pixel group.
- each pixel included in the first pixel group and the second pixel group is disposed at a position where the positions in the vertical direction are aligned within the one pixel between the first pixel group and the second pixel group.
- The first pixels L and the second pixels R are alternately arranged linearly in each of the horizontal direction and the vertical direction at intervals of a plurality of pixels (in this embodiment, two pixels). That is, first pixels L and second pixels R are included in the same column and the same row, respectively.
- In the present embodiment, the positions of the pixels included in the first and second pixel groups are aligned within one pixel in each of the horizontal direction and the vertical direction, but it suffices that at least one of the horizontal and vertical positions falls within a predetermined number of pixels (for example, within two pixels). However, it is preferable that the position of each pixel included in the first and second pixel groups be aligned within one pixel in each of the horizontal direction and the vertical direction.
- The phase difference pixels are provided for pixels of the G filters in a square array corresponding to 2 × 2 pixels (for example, the pixels in the third row, third column; third row, fourth column; fourth row, third column; and fourth row, fourth column from the upper left in the front view of FIG. 4). In the present embodiment, the pixel in the lower right corner of each 2 × 2 pixel G-filter group in the front view of FIG. 4 is assigned to the phase difference pixel.
- In this way, the light shielding member is provided for the pixel in the lower right corner of each 2 × 2 pixel G-filter group, and the phase difference pixels are spaced apart by a plurality of pixels in both the vertical direction and the horizontal direction.
- Therefore, the interpolation accuracy in the case of interpolating the pixel values of the phase difference pixels from the pixel values of the normal pixels can be improved. Furthermore, since the pixels included in the first to third pixel groups are arranged so that the normal pixels used for interpolation do not overlap between phase difference pixels, a further improvement in interpolation accuracy can be expected.
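A non-limiting sketch of this interpolation follows: each phase difference pixel's value is replaced by the mean of the surrounding normal pixels. The neighborhood size and averaging rule are assumptions for illustration; the patent does not fix the interpolation method here.

```python
import numpy as np

def interpolate_phase_pixels(image, is_phase_pixel):
    """Replace each phase-difference pixel's value with the mean of the
    normal pixels in its 3x3 neighborhood (a simple illustrative rule)."""
    img = np.asarray(image, dtype=float).copy()
    for y, x in zip(*np.nonzero(is_phase_pixel)):
        ys = slice(max(0, y - 1), y + 2)
        xs = slice(max(0, x - 1), x + 2)
        normal = ~is_phase_pixel[ys, xs]      # exclude phase-difference pixels
        img[y, x] = img[ys, xs][normal].mean()
    return img
```

Because the phase difference pixels are spaced several pixels apart, each one's 3 × 3 neighborhood contains only normal pixels, which is what makes this kind of interpolation accurate.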
- The image sensor 20 outputs an image signal indicating the first image (a digital signal indicating the pixel value of each first pixel) from the first pixel group, and outputs an image signal indicating the second image (a digital signal indicating the pixel value of each second pixel) from the second pixel group. Furthermore, the image sensor 20 outputs an image signal indicating the third image (a digital signal indicating the pixel value of each normal pixel) from the third pixel group.
- the third image is a chromatic image, for example, a color image having the same color array as the normal pixel array.
- the image data indicating the first image, the second image, and the third image are temporarily stored in a volatile storage area in the memory 26 via the interface unit 24.
- the image processing unit 28 includes a normal processing unit 30.
- The normal processing unit 30 processes the R, G, and B signals corresponding to the third pixel group to generate a chromatic normal image, which is an example of the first display image.
- the image processing unit 28 includes a split image processing unit 32.
- The split image processing unit 32 processes the G signals corresponding to the first pixel group and the second pixel group to generate an achromatic split image, which is an example of the second display image.
- the image processing unit 28 according to the present embodiment is realized by an ASIC (Application Specific Integrated Circuit) that is an integrated circuit in which circuits for realizing a plurality of functions related to image processing are combined into one.
- the CPU 12 executes a shooting control processing program, which will be described later, and causes the split image processing unit 32 to generate a split image and controls the display unit 213 to display the generated split image.
- the hardware configuration of the image processing unit 28 is not limited to the ASIC, and may be another hardware configuration such as a computer including a programmable logic device, CPU, ROM, and RAM.
- the encoder 34 converts the input signal into a signal of another format and outputs it.
- the hybrid viewfinder 220 has an LCD 247 that displays an electronic image.
- the number of pixels in a predetermined direction on the LCD 247 (for example, the number of pixels in the horizontal direction, which is a parallax generation direction) is smaller than the number of pixels in the same direction on the display unit 213.
- the display control unit 36 is connected to the display unit 213 and the LCD 247, respectively, and an image is displayed on the LCD 247 or the display unit 213 by selectively controlling the LCD 247 and the display unit 213.
- the display unit 213 and the LCD 247 are referred to as “display devices” when it is not necessary to distinguish between them.
- the imaging apparatus 100 is configured to be able to switch between the manual focus mode and the autofocus mode described above using the dial 212.
- the display control unit 36 causes the display device to display a live view image obtained by combining the split images.
- the CPU 12 operates as a phase difference detection unit and an automatic focus adjustment unit.
- the phase difference detection unit detects a phase difference between the first image output from the first pixel group and the second image output from the second pixel group.
- the automatic focus adjustment unit controls the motor 304 via the device control unit 22 and the mounts 256 and 346 so that the defocus amount of the focus lens 302 becomes zero based on the detected phase difference, thereby moving the focus lens 302 to the in-focus position.
- the above “defocus amount” refers to, for example, the amount of phase shift between the first image and the second image.
- the eyepiece detection unit 37 detects that the user has looked into the viewfinder eyepiece unit 242, and outputs the detection result to the CPU 12. Therefore, the CPU 12 can grasp whether or not the finder eyepiece unit 242 is used based on the detection result of the eyepiece detection unit 37.
- the external I / F 39 is connected to a communication network such as a LAN (Local Area Network) or the Internet, and controls transmission / reception of various information between the external device (for example, a printer) and the CPU 12 via the communication network. Therefore, when a printer is connected as an external device, the imaging apparatus 100 can output a captured still image to the printer for printing. Further, when a display is connected as an external device, the imaging apparatus 100 can output and display a captured still image or live view image on the display.
- the normal processing unit 30 and the split image processing unit 32 each have a WB gain unit, a gamma correction unit, and a synchronization processing unit (not shown), and sequentially perform signal processing in each of these processing units on the original digital signals (RAW image) temporarily stored in the memory 26. That is, the WB gain unit executes white balance (WB) correction by adjusting the gains of the R, G, and B signals.
- the gamma correction unit performs gamma correction on each of the R, G, and B signals that have been subjected to WB in the WB gain unit.
- the synchronization processing unit performs color interpolation processing corresponding to the arrangement of the color filters 21 of the image sensor 20, and generates synchronized R, G, B signals.
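The signal chain described above (WB gain followed by gamma correction, with the synchronization processing supplying full R, G, B values per pixel) can be sketched as follows. This is an illustrative sketch only, not the patent's implementation; the gain values and the gamma value are assumptions.

```python
# Sketch of the per-pixel signal chain: WB gain -> gamma correction.
# Operates on normalized RGB values in [0.0, 1.0]. Gains and gamma
# are illustrative assumptions, not values from the patent.

def apply_wb_gain(rgb, gains=(1.8, 1.0, 1.5)):
    """White balance: scale each of R, G, B by its gain, clipping to 1.0."""
    return tuple(min(c * g, 1.0) for c, g in zip(rgb, gains))

def apply_gamma(rgb, gamma=2.2):
    """Gamma correction: encode linear values for display."""
    return tuple(c ** (1.0 / gamma) for c in rgb)

def process_pixel(rgb):
    # The synchronization (color interpolation) step would fill in the
    # missing colors per the color filter array; here each pixel is
    # assumed to already carry full RGB.
    return apply_gamma(apply_wb_gain(rgb))

out = process_pixel((0.2, 0.4, 0.3))
```

The order mirrors the text: gains first on the raw signals, then gamma on the WB-corrected signals.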
- the normal processing unit 30 and the split image processing unit 32 perform image processing on the RAW image in parallel every time a RAW image for one screen is acquired by the image sensor 20.
- the normal processing unit 30 receives the R, G, and B RAW images from the interface unit 24, and interpolates the light-shielded phase difference pixels of the first and second pixel groups among the R, G, and B pixels of the third pixel group, using peripheral pixels of the same color (for example, adjacent G pixels).
- the normal processing unit 30 may use the image obtained by the interpolation as a normal image for display and a normal image for recording.
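The interpolation described above can be illustrated with a minimal one-dimensional sketch, assuming the phase difference pixel is simply replaced by the average of its same-color neighbors. The function name and the 1-D neighborhood are simplifications for illustration.

```python
# Sketch: fill a light-shielded phase difference pixel from the average
# of its same-color neighboring normal pixels (1-D simplification).

def interpolate_pd_pixel(row, idx):
    """Return the average of the in-bounds left/right neighbors of row[idx]."""
    neighbors = [row[i] for i in (idx - 1, idx + 1) if 0 <= i < len(row)]
    return sum(neighbors) / len(neighbors)

row = [100, None, 108]                  # None marks the phase difference pixel
row[1] = interpolate_pd_pixel(row, 1)   # average of the two normal neighbors
```

A real implementation would select neighbors of the same color in the two-dimensional filter array, as the text states.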
- the normal processing unit 30 outputs the generated image data of the normal image for recording to the encoder 34.
- the R, G, and B signals processed by the normal processing unit 30 are converted (encoded) into recording signals by the encoder 34 and recorded in the recording unit 42 (see FIG. 9).
- the normal processing unit 30 outputs the generated image data of the normal image for display to the display control unit 36.
- hereinafter, when it is not necessary to distinguish between the normal image for recording and the normal image for display, the terms “for recording” and “for display” are omitted and both are referred to as the “normal image”.
- the image sensor 20 can change the exposure conditions (for example, the shutter speed of an electronic shutter) of each of the first pixel group and the second pixel group, and can thereby simultaneously acquire images having different exposure conditions. Therefore, the image processing unit 28 can generate an image with a wide dynamic range based on the images with different exposure conditions. In addition, a plurality of images can be simultaneously acquired under the same exposure condition; by adding these images, a highly sensitive image with little noise can be generated, or a high-resolution image can be generated.
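The same-exposure addition mentioned above can be sketched as simple frame averaging, which reduces random noise across simultaneously acquired frames. This is an illustrative sketch; the frame representation and function name are assumptions.

```python
# Sketch: average several same-exposure frames pixel by pixel to reduce
# random noise. Frames are modeled as flat lists of pixel values.

def add_frames(frames):
    """Average corresponding pixels across frames of equal size."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

frames = [[0.52, 0.41], [0.48, 0.39], [0.50, 0.40]]
merged = add_frames(frames)
```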
- the split image processing unit 32 extracts the G signals of the first pixel group and the second pixel group from the RAW image temporarily stored in the memory 26, and generates an achromatic split image based on these G signals.
- each of the first pixel group and the second pixel group extracted from the RAW image is, as described above, a pixel group consisting of G filter pixels. Therefore, the split image processing unit 32 can generate an achromatic left parallax image (left-eye image) and an achromatic right parallax image (right-eye image) based on the G signals of the first pixel group and the second pixel group.
- the split image processing unit 32 divides the right-eye image in a predetermined division direction (the front view vertical direction in FIGS. 10A and 10B) into a plurality (two in the present embodiment) of divided images, and obtains the right-eye divided image by selecting some of these divided images.
- similarly, the split image processing unit 32 divides the left-eye image in the division direction into a plurality (two in the present embodiment) of divided images, and selects as the left-eye divided image the divided images other than those representing the divided regions corresponding to the right-eye divided image.
- the split image processing unit 32 generates a split image by arranging the right eye divided image and the left eye divided image in the corresponding regions. Note that the image data of the split image generated in this way is output to the display control unit 36.
- the right-eye image 300A and the left-eye image 300B are alternately arranged in the division direction.
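As a rough illustration (not the patent's implementation), the alternating arrangement described above can be sketched with images modeled as lists of rows; the function name and strip selection are assumptions.

```python
# Sketch: assemble a split image by dividing the right-eye and left-eye
# images into strips along the division direction and taking alternating
# strips from each.

def make_split_image(right_eye, left_eye, num_divisions=2):
    rows = len(right_eye)
    strip = rows // num_divisions
    out = []
    for i in range(num_divisions):
        # Even strips come from the right-eye image, odd from the left-eye.
        src = right_eye if i % 2 == 0 else left_eye
        out.extend(src[i * strip:(i + 1) * strip])
    return out

right = [["R"] * 4 for _ in range(4)]   # toy 4x4 right-eye image
left = [["L"] * 4 for _ in range(4)]    # toy 4x4 left-eye image
split = make_split_image(right, left)
```

When the lens is out of focus, the strips taken from the two parallax images appear shifted in the crossing direction, which is what the user reads off the split image.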
- in the split image 300 when the focus of the imaging lens 16 deviates from the in-focus position, as shown in FIG. 10A, the right-eye image 300A and the left-eye image 300B are each shifted, in the direction intersecting the above-described division direction (the front view left-right direction in FIGS. 10A and 10B, hereinafter referred to as the “crossing direction”), by an amount corresponding to the amount of deviation from the in-focus position.
- display area information indicating the display area of the split image 300 in the display unit 213 is stored in the memory 26 in advance.
- This display area information is information indicating the range of the display area (in the present embodiment, the central portion of the normal image 301) indicated by a predetermined coordinate system in the display area of the display unit 213.
- the memory 26 stores division direction information indicating the division direction of the split image 300 and division number information indicating the number of divisions of the split image 300 in advance.
- the split image processing unit 32 reads out the division direction information from the memory 26 to determine the division direction of the right eye image 300A and the left eye image 300B. Further, the split image processing unit 32 reads the division number information from the memory 26 to determine the division number of the split image 300. Then, the split image processing unit 32 generates the split image 300 based on the division direction and the number of divisions obtained by the above processing.
- the display control unit 36 generates display image data based on the recording image data corresponding to the third pixel group input from the normal processing unit 30 and the image data of the split image corresponding to the first and second pixel groups input from the split image processing unit 32. For example, the display control unit 36 combines the split image 300 indicated by the image data input from the split image processing unit 32 into the display area of the normal image indicated by the recording image data corresponding to the third pixel group input from the normal processing unit 30. The image data obtained by the combination is then output to the display device.
- when the split image 300 is combined with the normal image 301, the split image 300 is inserted in place of a part of the normal image 301.
- the combining method is not limited to this.
- a composition method in which the split image 300 is superimposed on the normal image 301 may be used.
- the composition method is a method of superimposing the split image 300 by appropriately adjusting the transmittance of a part of the normal image 301 corresponding to the region on which the split image 300 is superimposed.
- the normal image 301 and the split image 300 may be displayed by different layers. Thereby, the split image 300 can be displayed in the display area of the normal image in the live view image.
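The superimposition variant described above, in which the transmittance of part of the normal image is adjusted, amounts to a simple per-pixel blend. The following is an illustrative sketch; the pixel representation and transmittance value are assumptions.

```python
# Sketch: superimpose a split-image pixel over a normal-image pixel with
# an adjustable transmittance, instead of replacing the pixel outright.

def superimpose(normal_pixel, split_pixel, transmittance=0.4):
    """Blend one RGB pixel; transmittance is the fraction of the normal
    image that remains visible under the split image."""
    return tuple(
        n * transmittance + s * (1.0 - transmittance)
        for n, s in zip(normal_pixel, split_pixel)
    )

blended = superimpose((0.8, 0.8, 0.8), (0.2, 0.2, 0.2), transmittance=0.5)
```

With transmittance 0 this reduces to the replacement combining method; with transmittance 1 the normal image shows through unchanged.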
- the live view image is an image in which continuously captured subject images are displayed continuously on the screen of the display device.
- the hybrid finder 220 includes an OVF 240 and an EVF 248.
- the OVF 240 is an inverse Galileo finder having an objective lens 244 and an eyepiece 246, and the EVF 248 has an LCD 247, a prism 245, and an eyepiece 246.
- a liquid crystal shutter 243 is disposed in front of the objective lens 244, and the liquid crystal shutter 243 shields light so that an optical image does not enter the objective lens 244 when the EVF 248 is used.
- the prism 245 reflects an electronic image or various information displayed on the LCD 247 and guides it to the eyepiece 246, and combines the optical image and information (electronic image and various information) displayed on the LCD 247.
- each time the finder switching lever is rotated, the mode is switched alternately between an OVF mode in which an optical image can be visually recognized by the OVF 240 and an EVF mode in which an electronic image can be visually recognized by the EVF 248.
- in the OVF mode, the display control unit 36 controls the liquid crystal shutter 243 to be in a non-light-shielding state so that the optical image can be viewed from the viewfinder eyepiece 242, and displays only the split image 300 on the LCD 247. Thereby, a finder image in which the split image 300 is superimposed on a part of the optical image can be displayed.
- in the EVF mode, the display control unit 36 controls the liquid crystal shutter 243 to be in a light-shielding state so that only the electronic image displayed on the LCD 247 can be visually recognized from the viewfinder eyepiece unit 242.
- image data equivalent to the combined image data of the split image 300 that is output to the display unit 213 is input to the LCD 247. Accordingly, the display control unit 36 can display an electronic image in which the split image 300 is combined with a part of the normal image 301, in the same manner as the display unit 213.
- an image signal indicating each of the normal image 301 and the split image 300 is input to the display device.
- the display device displays the split image 300 indicated by the input image signal in the display area of the split image 300 having a rectangular shape at the center of the screen. Further, the display device displays the normal image 301 indicated by the input image signal in the outer peripheral area of the split image 300.
- when an image signal indicating only the normal image 301 is input, the display device displays the normal image 301 indicated by the input image signal in the entire display area of the display device.
- when an image signal indicating only the split image 300 is input, the display device displays the split image 300 indicated by the input image signal in its display area, and leaves the remaining area blank.
- when the split image 300 is displayed on the display unit 213, the imaging apparatus 100 detects a user operation on the split image 300 via the touch panel 215, and performs focus adjustment according to the detection result.
- a scroll operation, a flick operation, and a reciprocal movement operation are applied as operations for the split image 300.
- the scrolling operation is an operation in which a desired position on the touch panel 215 is designated with a finger and a moving operation for moving the designated position is continuously performed for a first time (100 msec in this embodiment) or more.
- the flick operation here refers to an operation in which a desired position on the touch panel 215 is designated with a finger, a moving operation is performed, and the finger is released from the touch panel 215 before a second time (100 msec in the present embodiment) elapses from the start of the moving operation.
- in the scroll operation and the flick operation, a selection operation of designating one point in the region of the right-eye image 300A or the left-eye image 300B with a finger or the like to select either the right-eye image 300A or the left-eye image 300B is performed immediately beforehand.
- the reciprocal movement operation referred to here is an operation in which, after a selection operation of designating one point in each of the areas of the right-eye image 300A and the left-eye image 300B to select both images, the designated positions are moved in opposite directions along the crossing direction by a scroll operation or a flick operation.
- FIG. 11 is a flowchart showing the flow of processing of the shooting control processing program executed by the CPU 12 when the imaging apparatus 100 is set to the manual focus mode and the dial 212 is set to the shooting mode.
- the program is stored in advance in a predetermined storage area of the memory 26.
- in step S401, image data indicating the normal image 301 based on the image signal output from the third pixel group is acquired via the interface unit 24, and control is performed to display the normal image 301 on the display unit 213.
- the imaging apparatus 100 acquires the image data indicating the normal image 301 from the third pixel group, but the method of acquiring the image data is not limited to this.
- for example, image data indicating the normal image 301 may be generated based on the image data indicating the right-eye image 300A based on the image signal output from the first pixel group and the image data indicating the left-eye image 300B based on the image signal output from the second pixel group.
- as a method of generating the image data indicating the normal image 301 in this case, for example, the image data indicating the right-eye image 300A or the left-eye image 300B may be used as it is as the image data indicating the normal image 301. Alternatively, interpolation pixels may be arranged between adjacent pixels in the image data indicating the right-eye image 300A or the left-eye image 300B, and image data indicating the normal image 301 may be generated with the average value of the pixel values surrounding each interpolation pixel used as its pixel value. The normal image 301 may also be generated by combining the image data indicating the right-eye image 300A and the image data indicating the left-eye image 300B.
- in step S403, the split image processing unit 32 is controlled to generate the split image 300, and the generated split image 300 is displayed on the display unit 213.
- at this time, the CPU 12 displays the split image 300 superimposed on the central portion of the normal image 301.
- in the present embodiment, the CPU 12 displays both the normal image 301 and the split image 300 on the display unit 213, but the display is not limited to this; only the split image 300 may be displayed on the display unit 213.
- step S405 it is determined whether or not a touch operation is detected by the touch panel 215. If the determination in step S405 is affirmative, the process proceeds to step S407. If the determination is negative, the process proceeds to step S425 described later.
- step S407 it is determined whether or not the touch operation detected by the process in step S405 is a touch operation on the right-eye image 300A or the left-eye image 300B. If an affirmative determination is made in step S407, the process proceeds to step S411 described later.
- if a negative determination is made in step S407, the process proceeds to step S409; after processing according to the touch operation is performed, the process proceeds to step S425 described later.
- examples of the process corresponding to the touch operation include a process of performing automatic focusing control on a peripheral area that includes and surrounds the position designated by the touch operation, and a process of displaying that peripheral area in an enlarged manner.
- step S411 it is determined whether or not the touch operation detected by the process in step S405 is a touch operation for designating one point. If the determination in step S411 is affirmative, the process proceeds to step S412. If the determination is negative, the process proceeds to step S421 described later.
- step S412 it is determined whether or not the designated position by the touch operation has moved. In the imaging apparatus 100 according to the present embodiment, when the designated position moves more than a predetermined distance (10 pixels in the present embodiment), it is determined that the designated position has moved. If an affirmative determination is made in step S412, the process proceeds to step S413. If a negative determination is made, the process proceeds to step S409.
- step S413 it is determined whether or not the touch operation detected by the process in step S405 is a scroll operation.
- specifically, the CPU 12 determines that the operation is a scroll operation when the moving operation in the touch operation continues for the first time or longer and the designated position is moved in the crossing direction by a predetermined first distance (10 pixels in the present embodiment) or more by the moving operation.
- in the present embodiment, whether or not the operation is a scroll operation is determined based on the duration of the touch operation, but the determination method of the scroll operation is not limited to this.
- for example, whether or not the operation is a scroll operation may be determined based on whether a predetermined operation assigned to the scroll operation is performed via the operation unit 14, the touch operation continues for the first time or longer, and the designated position is moved by the first distance or more by the moving operation.
- if the determination in step S413 is affirmative, the process proceeds to step S415, and a scroll process described later is executed. If the determination is negative, the process proceeds to step S417.
- step S417 it is determined whether or not the touch operation detected by the process in step S405 is a flick operation.
- specifically, the CPU 12 determines that the operation is a flick operation when the moving operation in the touch operation does not continue for the first time or longer and the designated position is moved in the crossing direction by the first distance or more by the moving operation.
- in the present embodiment, whether the touch operation is a flick operation is determined based on the duration of the touch operation and the movement distance of the designated position, but the determination method of the flick operation is not limited to this.
- for example, whether or not the operation is a flick operation may be determined based on whether a predetermined operation assigned to the flick operation is performed via the operation unit 14, the touch operation does not continue for the first time or longer, and the designated position is moved by the first distance or more by the moving operation.
- if the determination in step S417 is affirmative, the process proceeds to step S419, and a flick process described later is executed. If the determination is negative, the process proceeds to step S425 described later.
- step S421 it is determined whether or not the touch operation detected by the process in step S405 is a touch operation designating two points. If the determination in step S421 is affirmative, the process proceeds to step S423, and a reciprocal movement process described later is executed. If the determination is negative, the process proceeds to step S425.
- step S425 it is determined whether or not an instruction to end the shooting control processing program has been input by determining whether or not the dial 212 has been set to a mode other than the shooting mode. If the determination in step S425 is affirmative, the process proceeds to S427. If the determination is negative, the process proceeds to step S431 described later.
- step S427 control is performed to stop the display of the normal image 301 displayed on the display unit 213 by the process in step S401.
- step S429 control for stopping the display of the split image 300 displayed on the display unit 213 by the process of step S403 is performed, and the photographing control processing program is terminated.
- step S431 it is determined whether or not a shooting instruction has been input by determining whether or not a full press operation on the release switch 211 has been detected. If a negative determination is made in step S431, the process returns to step S405 described above. If an affirmative determination is made, the process proceeds to step S433.
- step S433 a photographing process for recording image data indicating the normal image 301 in the memory 26 is performed, and the photographing control processing program is terminated. Note that the photographing process is a commonly performed process, and thus detailed description thereof is omitted here.
- whether the operation is a scroll operation or a flick operation is determined based on the duration of the touch operation, but the determination method is not limited to this.
- for example, whether the operation is a scroll operation or a flick operation may be determined based on the movement distance of the moving operation. In this case, the operation may be determined to be a scroll operation when the movement distance is equal to or greater than a predetermined distance, and a flick operation when it is less than the predetermined distance.
- alternatively, whether the operation is a scroll operation or a flick operation may be determined based on the moving speed of the moving operation. In this case, the operation may be determined to be a flick operation when the moving speed is equal to or higher than a predetermined speed, and a scroll operation when it is lower than the predetermined speed.
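The duration- and distance-based determination described above can be sketched as a small classifier. The thresholds mirror the embodiment's example values (a first time of 100 msec and a first distance of 10 pixels); the function and constant names are illustrative assumptions.

```python
# Sketch: classify a touch moving operation as a scroll or a flick.
# Thresholds follow the example values in the embodiment.

FIRST_TIME_MS = 100       # minimum duration for a scroll operation
FIRST_DISTANCE_PX = 10    # minimum movement in the crossing direction

def classify_touch(duration_ms, distance_px):
    """Return 'scroll', 'flick', or None for a moving operation."""
    if distance_px < FIRST_DISTANCE_PX:
        return None  # movement too short to count as a moving operation
    if duration_ms >= FIRST_TIME_MS:
        return "scroll"  # movement continued for the first time or longer
    return "flick"  # finger released before the first time elapsed

gesture = classify_touch(250, 40)   # long press-and-drag: scroll
quick = classify_touch(60, 40)      # short, fast movement: flick
```

The distance-based and speed-based alternatives in the text would swap the duration test for a distance or speed threshold.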
- FIG. 12 is a flowchart showing a flow of processing of a scroll processing routine program executed by the CPU 12 during execution of the photographing control processing program.
- the program is stored in advance in a predetermined storage area of the memory 26.
- in step S501, it is determined whether or not the touch operation detected by the process in step S405 is a touch operation on the right-eye image 300A. If the determination in step S501 is affirmative, the process proceeds to step S503. If the determination is negative, the touch operation detected by the process in step S405 is regarded as being performed on the left-eye image 300B, and the process proceeds to step S519 described later.
- step S503 it is determined whether or not the movement operation in the touch operation detected by the process in step S405 is the rightward movement operation. If the determination in step S503 is affirmative, the process proceeds to step S505. If the determination is negative, the movement operation is regarded as a leftward movement, and the process proceeds to step S513 described later.
- in step S505, the movement distance in the crossing direction of the designated position in the touch operation detected by the process in step S405 is acquired.
- when the CPU 12 executes the process of step S505 for the first time after the scroll processing routine program starts, it acquires, as the movement distance, the distance in the crossing direction from the designated position at which the touch operation was started to the current designated position.
- when the process of step S505 is executed for the second time or later, the distance from the previous designated position to the current designated position is acquired as the movement distance.
- in step S507, the focus lens 302 is moved, in the direction in which a subject closer than the current in-focus position is brought into focus along the optical axis direction of the focus lens 302 (the direction in which the focal position approaches the imaging apparatus 100), by a distance corresponding to the movement distance acquired in step S505.
- a value obtained by multiplying the moving distance of the designated position by a predetermined first coefficient is set as the moving distance of the focus lens 302.
- examples of the first coefficient include a value obtained by dividing the movable length of the focus lens 302 in the optical axis direction by the length of the detectable region on the touch panel 215 in the intersecting direction.
- the first coefficient is not limited to this. Note that the focus lens 302 can be moved more precisely as the first coefficient is decreased, and the focus lens 302 can be moved faster as the first coefficient is increased.
- information indicating the first coefficient may be input by the user via the operation unit 14.
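The first-coefficient mapping described above can be sketched as follows: the lens movement is the touch movement distance multiplied by the coefficient, and the example coefficient divides the lens's movable length by the touch panel's detectable length in the crossing direction. The concrete units and numbers are illustrative assumptions.

```python
# Sketch: map a touch movement distance to a focus lens movement distance
# via the first coefficient. Units (mm, pixels) are assumptions.

def first_coefficient(lens_travel_mm, panel_width_px):
    """Example coefficient: movable lens length / detectable panel length."""
    return lens_travel_mm / panel_width_px

def lens_move_distance(touch_move_px, coeff):
    """Lens movement = touch movement distance x first coefficient."""
    return touch_move_px * coeff

coeff = first_coefficient(lens_travel_mm=8.0, panel_width_px=640)
step = lens_move_distance(touch_move_px=160, coeff=coeff)
```

As the text notes, a smaller coefficient gives finer lens control per unit of finger movement, and a larger coefficient moves the lens faster.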
- in this manner, the CPU 12 moves the focus lens 302 along the optical axis direction in the direction in which the focal position approaches the imaging apparatus 100, so that the focus of the focus lens 302 approaches the in-focus position.
- in the present embodiment, the movement distance of the designated position 310 in the crossing direction is acquired in step S505, and the focus lens 302 is moved by a distance corresponding to that movement distance in step S507, but the present invention is not limited to this.
- for example, the moving speed of the designated position 310 in the crossing direction may be acquired in step S505, and the focus lens 302 may be moved by a distance corresponding to that moving speed (for example, a distance proportional to the moving speed).
- step S509 it is determined whether or not the touch operation detected by the process in step S405 continues to touch the touch panel 215. If an affirmative determination is made in step S509, the process returns to step S503. On the other hand, if a negative determination is made, the scroll process routine program is terminated, and the process proceeds to step S425 of the photographing control process program (main routine).
- in step S513, similarly to the process of step S505, the movement distance in the crossing direction of the designated position 310 in the touch operation detected by the process in step S405 is acquired.
- in step S515, the focus lens 302 is moved, in the direction in which a subject farther than the current in-focus position is brought into focus along the optical axis direction of the focus lens 302 (the direction in which the focal position moves away from the imaging apparatus 100), by a distance corresponding to the movement distance acquired in step S513.
- a value obtained by multiplying the moving distance of the designated position 310 by the first coefficient is set as the moving distance of the focus lens 302.
- in this manner, the CPU 12 moves the focus lens 302 along the optical axis direction of the focus lens 302 in the direction in which the focal position moves away from the imaging apparatus 100, so that the focus of the focus lens 302 approaches the in-focus position.
- step S517 it is determined whether or not the touch operation detected by the process in step S405 is continued. If the determination in step S517 is affirmative, the process returns to step S503. If the determination is negative, the process proceeds to step S425.
- step S519 it is determined whether or not the moving operation in the touch operation detected by the process in step S405 is a leftward moving operation. If the determination in step S519 is affirmative, the process proceeds to step S521. If the determination is negative, the moving operation is regarded as a rightward moving operation, and the process proceeds to step S529 described later.
- in step S521, similarly to the process of step S505, the movement distance in the crossing direction of the designated position 310 in the touch operation detected by the process in step S405 is acquired.
- in step S523, similarly to the process of step S507, the focus lens 302 is moved, in the direction in which the focal position in the optical axis direction of the focus lens 302 approaches the imaging apparatus 100, by a distance corresponding to the movement distance acquired in step S521.
- a value obtained by multiplying the moving distance of the designated position 310 by the first coefficient is set as the moving distance of the focus lens 302.
- step S525 it is determined whether or not the touch operation detected by the process in step S405 is continued. If the determination in step S525 is affirmative, the process returns to step S519. If the determination is negative, the process proceeds to step S425.
- in step S529, similarly to the process of step S505, the movement distance in the crossing direction of the designated position 310 in the touch operation detected by the process in step S405 is acquired.
- in step S531, similarly to the process of step S515, the focus lens 302 is moved, in the direction in which the focal position in the optical axis direction of the focus lens 302 moves away from the imaging apparatus 100, by a distance corresponding to the movement distance acquired in step S529.
- step S533 it is determined whether or not the touch operation detected by the process in step S405 is continued. If the determination in step S533 is affirmative, the process returns to step S519. If the determination is negative, the process proceeds to step S425.
- FIG. 15 is a flowchart showing the flow of processing of the flick processing routine program executed by the CPU 12 during the execution of the photographing control processing program.
- the program is stored in advance in a predetermined storage area of the memory 26.
- the memory 26 stores correspondence information indicating the relationship between the moving speed of the designated position 310 and the moving speed of the focus lens 302 in the moving operation at the time of the flick operation as shown in FIG. 16 as an example.
- the horizontal axis indicates the moving speed of the designated position 310 in the moving operation during the flick operation, and the vertical axis indicates the moving speed of the focus lens 302. On the horizontal axis, the right direction is indicated as a positive direction and the left direction as a negative direction.
- the moving speed of the focus lens 302 increases as the moving speed of the designated position 310 increases.
- when the moving speed of the designated position 310 is between a predetermined first threshold value (−25 pixels/second in the present embodiment) and a predetermined second threshold value (25 pixels/second in the present embodiment), the moving speed of the focus lens 302 becomes 0 (zero). This is because, when the moving speed of the moving operation is between the first threshold value and the second threshold value, it is more likely that the designated position 310 has moved against the user's intention, for example due to finger tremor, than by the user's intention.
- the moving speed of the focus lens 302 is determined according to the moving speed of the moving operation in the flick operation.
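The dead-zone behavior described above can be sketched as follows (a minimal illustration, not the embodiment's actual correspondence information: only the ±25 pixels/second thresholds come from the text, while the function name, the linear gain outside the dead zone, and the units of the lens speed are assumptions):

```python
def lens_speed(pointer_speed, lower=-25.0, upper=25.0, gain=2.0):
    """Map the moving speed of the designated position (pixels/second,
    signed: rightward positive, leftward negative) to a focus-lens
    moving speed.

    Speeds between the first threshold (lower) and the second
    threshold (upper) are treated as unintentional movement (e.g.
    finger tremor) and yield a lens speed of 0. The linear gain
    outside the dead zone is a hypothetical choice; FIG. 16 only
    requires that the lens speed increase with the pointer speed.
    """
    if lower <= pointer_speed <= upper:
        return 0.0
    return gain * pointer_speed

# Usage: speeds inside the dead zone do not move the lens.
print(lens_speed(10.0))   # 0.0
print(lens_speed(50.0))   # 100.0
print(lens_speed(-40.0))  # -80.0
```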
- step S601 it is determined whether or not the touch operation detected by the process in step S405 is a touch operation on the right-eye image 300A. If the determination in step S601 is affirmative, the process proceeds to step S603. If the determination is negative, the touch operation detected by the process in step S405 is regarded as being for the left-eye image 300B, and the process proceeds to step S617 described later.
- step S603 it is determined whether or not the moving operation in the touch operation detected by the process in step S405 is the rightward moving operation. If the determination in step S603 is affirmative, the process proceeds to step S605; if the determination is negative, the process proceeds to step S611 described later.
- step S605 the moving speed of the specified position 310 in the touch operation detected by the process in step S405 with respect to the intersecting direction is acquired.
- the CPU 12 acquires a value obtained by dividing the moving distance of the designated position 310 in the moving operation by the time from the start to the end of the movement as the moving speed.
- step S607 the moving speed of the focus lens 302 is derived. Specifically, the correspondence information is read from the memory 26, and the moving speed corresponding to the moving speed acquired in step S605 is derived from the correspondence information.
- in the present embodiment, the moving speed of the focus lens 302 is derived based on the moving speed of the designated position 310 and the correspondence information, but the present invention is not limited to this. The moving speed of the focus lens 302 may instead be derived based on the moving speed of the designated position 310 by using a predetermined calculation formula that takes the moving speed of the designated position 310 as a variable.
- step S609 control is performed to start the movement of the focus lens 302 at a speed corresponding to the movement speed derived in step S607 in the direction in which the focal position in the optical axis direction of the focus lens 302 approaches the imaging apparatus 100, and the process proceeds to step S631 described later.
- step S611 similarly to the process of step S605, the moving speed of the designated position 310 in the touch operation detected by the process of step S405 in the intersecting direction is acquired.
- step S613 the moving speed of the focus lens 302 is derived as in the process of step S607.
- step S615 control for starting the movement of the focus lens 302 at a speed corresponding to the movement speed derived in step S613 in the direction in which the focal position in the optical axis direction of the focus lens 302 moves away from the imaging device 100 is performed. Then, the process proceeds to step S631 described later.
- step S617 it is determined whether or not the movement operation in the touch operation detected by the process in step S405 is the leftward movement operation. If an affirmative determination is made in step S617, the process proceeds to step S619. On the other hand, if a negative determination is made, the moving operation is regarded as a rightward movement, and the process proceeds to step S625 described later.
- step S619 similarly to the process in step S605, the moving speed of the designated position 310 in the touch operation detected by the process in step S405 in the intersecting direction is acquired.
- step S621 the moving speed of the focus lens 302 is derived as in the process of step S607.
- step S623 control is performed to start the movement of the focus lens 302 at a speed corresponding to the moving speed derived in step S621 in the direction in which the focal position in the optical axis direction of the focus lens 302 approaches the imaging apparatus 100, and the process proceeds to step S631 described later.
- step S625 similarly to the process of step S605, the moving speed at which the designated position 310 in the touch operation detected by the process of step S405 moves in the intersecting direction is acquired.
- step S627 the moving speed of the focus lens 302 is derived as in the process of step S607.
- step S629 control is performed to start the movement of the focus lens 302 at a speed corresponding to the movement speed derived in step S627 in the direction in which the focal position in the optical axis direction of the focus lens 302 moves away from the imaging device 100, and the process proceeds to step S631.
- step S631 the movement of the focus lens 302 started by any one of steps S609, S615, S623, and S629 is continued until a touch operation is detected again, and then the process proceeds to step S633.
- in the imaging apparatus 100, when any position in the entire area of the touch panel 215 is designated again after the flick operation is performed, it is determined that the touch operation has been detected again.
- step S633 control to stop the movement of the focus lens 302 is performed, and this flick processing routine program is terminated.
- as described above, the imaging apparatus 100 moves the focus lens 302 based on the flick operation being performed in the flick process, and stops the focus lens 302 based on the touch operation being performed again.
- the timing at which the focus lens 302 is stopped is not limited to the timing at which the touch operation is performed again; the focus lens 302 may instead be stopped by gradually decreasing its moving speed.
- another example 1 of the flick process will be described.
- FIG. 17 is a flowchart showing the flow of processing of the flick processing routine program executed by the CPU 12 during the execution of the shooting control processing program.
- the program is stored in advance in a predetermined storage area of the memory 26.
- the memory 26 stores in advance stop time information indicating a stop time that is a time from when the movement of the focus lens 302 is started until the focus lens 302 is stopped.
- step S701 it is determined whether or not the touch operation detected by the process in step S405 is a touch operation on the right-eye image 300A. If the determination in step S701 is affirmative, the process proceeds to step S703. If the determination is negative, the touch operation detected by the process in step S405 is regarded as being for the left-eye image 300B, and the process proceeds to step S721 described later.
- step S703 it is determined whether or not the moving operation in the touch operation detected by the process in step S405 is the rightward moving operation. If the determination in step S703 is affirmative, the process proceeds to step S705. If the determination is negative, the movement operation is regarded as a leftward movement, and the process proceeds to step S713 described later.
- step S705 similarly to the process of step S605, the moving speed of the designated position 310 in the touch operation detected by the process of step S405 with respect to the intersecting direction is acquired.
- step S707 the moving speed that is the initial speed of the focus lens 302 is derived as in the process of step S607.
- step S709 a deceleration speed for every predetermined time (1 second in the present embodiment) for decelerating the moving speed of the focus lens 302 is derived. Specifically, the stop time information is read from the memory 26, and in the imaging apparatus 100 according to the present embodiment, a deceleration speed is derived such that, when the movement of the focus lens 302 is started at the movement speed derived in step S707, the focus lens 302 stops at the timing when the stop time has elapsed from the start timing.
- in the present embodiment, a value obtained by dividing the movement speed derived in step S707 by the stop time is used as the deceleration speed.
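The derivation above amounts to a linear deceleration: dividing the initial speed by the stop time gives the per-second deceleration, so the lens speed reaches 0 exactly when the stop time elapses. A minimal sketch under that reading (function names are hypothetical):

```python
def deceleration_per_second(initial_speed, stop_time_s):
    """Deceleration speed applied each second so that movement started
    at initial_speed stops exactly after stop_time_s seconds."""
    return initial_speed / stop_time_s

def speed_after(initial_speed, stop_time_s, elapsed_s):
    """Remaining lens speed after elapsed_s seconds of linear deceleration."""
    decel = deceleration_per_second(initial_speed, stop_time_s)
    return max(0.0, initial_speed - decel * elapsed_s)

# With an initial speed of 100 and a stop time of 4 s, the speed
# falls linearly and reaches zero at the 4-second mark.
print(speed_after(100.0, 4.0, 0.0))  # 100.0
print(speed_after(100.0, 4.0, 2.0))  # 50.0
print(speed_after(100.0, 4.0, 4.0))  # 0.0
```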
- step S711 control is performed to start the movement of the focus lens 302 at a speed corresponding to the movement speed derived in step S707 in the direction in which the focus position approaches the imaging device 100 in the optical axis direction of the focus lens 302. Then, the process proceeds to step S739 described later.
- step S713 similarly to the process of step S705, the moving speed of the designated position 310 in the touch operation detected by the process of step S405 in the intersecting direction is acquired.
- step S715 similarly to the process in step S707, the moving speed that is the initial speed of the focus lens 302 is derived.
- step S717 similarly to the process in step S709, the deceleration speed of the focus lens 302 every predetermined time is derived.
- step S719 control is performed to start the movement of the focus lens 302 at a speed corresponding to the movement speed derived in step S715 in a direction in which the focal position moves away from the imaging device 100 in the optical axis direction of the focus lens 302. Then, the process proceeds to step S739 described later.
- step S721 it is determined whether or not the moving operation in the touch operation detected by the process in step S405 is the leftward moving operation. If an affirmative determination is made in step S721, the process proceeds to step S723. If a negative determination is made, the movement operation is regarded as a rightward movement, and the process proceeds to step S731 described later.
- step S723 similarly to the process in step S705, the moving speed of the designated position 310 in the touch operation detected by the process in step S405 with respect to the intersecting direction is acquired.
- step S725 the moving speed that is the initial speed of the focus lens 302 is derived in the same manner as in the process of step S707.
- step S727 similarly to the process in step S709, the deceleration speed of the focus lens 302 for each predetermined time is derived.
- step S729 control is performed to start the movement of the focus lens 302 at a speed corresponding to the movement speed derived in step S725 in a direction in which the focal position approaches the imaging device 100 in the optical axis direction of the focus lens 302. Then, the process proceeds to step S739 described later.
- step S731 similarly to the processing in step S705, the moving speed of the designated position 310 in the touch operation detected by the processing in step S405 in the intersecting direction is acquired.
- step S733 similarly to the process in step S707, the moving speed that is the initial speed of the focus lens 302 in the touch operation detected by the process in step S405 is derived.
- step S735 similarly to the process in step S709, the deceleration speed of the focus lens 302 for each predetermined time is derived.
- step S737 control is performed to start the movement of the focus lens 302 at a speed corresponding to the movement speed derived in step S733 in a direction in which the focal position moves away from the imaging device 100 in the optical axis direction of the focus lens 302.
- the process proceeds to step S739.
- step S739 after waiting for a predetermined time (1 second in the present embodiment) to elapse, the process proceeds to step S741.
- step S741 control is performed to decelerate the speed corresponding to the deceleration speed derived by any one of steps S709, S717, S727, and S735 from the moving speed of the focus lens 302.
- step S743 it is determined whether or not the movement of the focus lens 302 is stopped. If a negative determination is made in step S743, the process returns to step S739. If an affirmative determination is made, the flick processing routine program is terminated.
- FIG. 18 is a graph showing an example of the relationship between the elapsed time from the start of the movement of the focus lens 302 and the moving speed of the focus lens 302 in the imaging apparatus 100 according to the present embodiment.
- the horizontal axis indicates the elapsed time since the movement of the focus lens 302 was started, and the vertical axis indicates the moving speed of the focus lens 302.
- the elapsed time is standardized so that the stop time from when the movement of the focus lens 302 is started until it stops is 1.
- as shown in FIG. 18, in the imaging apparatus 100 the moving speed of the focus lens 302 becomes slower as the elapsed time progresses, and the moving speed becomes 0 (zero) at the timing when the stop time has elapsed since the movement of the focus lens 302 was started. That is, the moving speed of the focus lens 302 gradually decreases as the elapsed time progresses, and the focus lens 302 eventually stops.
- as described above, the imaging apparatus 100 moves the focus lens 302 based on the flick operation performed in the flick process, and may stop the focus lens 302 by gradually reducing its moving speed.
- the method of stopping the focus lens 302 is not limited to gradually decelerating its moving speed; the focus lens 302 may instead be stopped after being moved by a movement amount corresponding to the movement distance of the designated position 310.
- for example, when the focus lens 302 is driven by a pulse motor and 1000 pulses are assigned to the entire drive range of the focus lens 302, if the movement amount of the designated position 310 is 10% of the entire screen range of the display unit 312 (for example, 64 pixels in the longitudinal direction in the case of VGA (640×480)), the focus lens 302 is stopped after being moved by 100 pulses.
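The pulse-count example above is a simple proportion between the swipe distance and the screen size. A minimal sketch (the function name and the use of rounding are assumptions):

```python
def pulses_for_swipe(swipe_pixels, screen_pixels, total_pulses=1000):
    """Map a swipe distance to pulse-motor steps, in proportion to the
    fraction of the screen traversed. With 1000 pulses assigned to the
    entire drive range, a swipe of 10% of the screen yields 100 pulses."""
    return round(total_pulses * swipe_pixels / screen_pixels)

# 64 pixels out of 640 (10% of the screen) -> 100 of 1000 pulses.
print(pulses_for_swipe(64, 640))  # 100
```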
- timing at which the focus lens 302 is stopped in the flick process is not limited to these methods, and the movement of the focus lens 302 may be stopped when an image captured by the focus lens 302 is in focus.
- another example 2 of the flick process will be described.
- FIG. 19 is a flowchart showing the flow of processing of the flick processing routine program executed by the CPU 12 during the execution of the shooting control processing program.
- the program is stored in advance in a predetermined storage area of the memory 26.
- step S801 it is determined whether or not the touch operation detected by the process in step S405 is a touch operation on the right-eye image 300A. If the determination in step S801 is affirmative, the process proceeds to step S803. If the determination is negative, the touch operation detected by the process in step S405 is regarded as being for the left-eye image 300B, and the process proceeds to step S817 described later.
- step S803 it is determined whether or not the moving operation in the touch operation detected by the process in step S405 is the rightward moving operation. If an affirmative determination is made in step S803, the process proceeds to step S805. If a negative determination is made, the moving operation is regarded as a leftward movement, and the process proceeds to step S811 described later.
- step S805 an in-focus determination area is determined as an area for confirming the in-focus state on the right-eye image 300A.
- in the present embodiment, the in-focus determination area is a circular area having a radius of a predetermined length (500 pixels as an example in the present embodiment) centered on the designated position 310 (see FIGS. 13 and 14) at the time the selection operation is performed.
- the determination method of the focus determination area is not limited to this; the area may be a circular area, a rectangular area, or the like located in the center of the right-eye image 300A, and the length of the radius is also not limited to the above value.
- the area may also be a circular area having a radius of a predetermined length centered on the intersection of the boundary line 311 between the right-eye image 300A and the left-eye image 300B and the perpendicular drawn from the designated position 310 to the boundary line 311.
- alternatively, the area may be a circular area having a radius that includes the right-eye image 300A (a column image showing the column with the shorter subject distance of the two columns that are subjects) and the left-eye image 300B (a column image showing the column with the longer subject distance of the two columns that are subjects).
- step S807 control is performed to start the movement of the focus lens 302 at a predetermined moving speed (50 pixels/second in the present embodiment) in the direction in which the focal position in the optical axis direction of the focus lens 302 approaches the imaging device 100.
- step S809 in the imaging apparatus 100, for example, the phase difference for each pixel between the right-eye image 300A in the focus determination region (in the example illustrated in the upper diagram of FIG. 13, the column image showing the column with the shorter subject distance of the two columns that are subjects) and the corresponding left-eye image 300B is derived, and when the derived phase difference is within a predetermined range including 0 (zero), it is determined that the in-focus state is obtained.
- in the imaging apparatus 100, when a main subject exists in the focus determination area, it is determined that the image is in focus when the main subject in the focus determination area (if a plurality of main subjects are included, any one of them; for example, in the example shown in the upper diagram of FIG. 13, the column with the shorter subject distance of the two columns, which are an example of a plurality of main subjects) is in focus in both the right-eye image 300A and the left-eye image 300B.
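The phase-difference check described above can be sketched as follows (a hypothetical illustration: the per-pixel phase differences are assumed to be already available as a list, and aggregating them by their mean as well as the tolerance value are assumptions; the text only requires the derived phase difference to fall within a predetermined range including 0):

```python
def in_focus(phase_differences, tolerance=1.0):
    """Determine the in-focus state of the focus determination area
    from per-pixel phase differences between the right-eye and
    left-eye images. The area is judged in focus when the mean phase
    difference lies within a predetermined range including 0 (zero).
    The mean aggregation and tolerance are hypothetical choices."""
    if not phase_differences:
        return False
    mean = sum(phase_differences) / len(phase_differences)
    return -tolerance <= mean <= tolerance

# Small residual phase differences -> in focus; large ones -> not.
print(in_focus([0.2, -0.1, 0.4]))  # True
print(in_focus([5.0, 6.2, 4.8]))   # False
```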
- FIG. 20 is a graph showing an example of the relationship between the elapsed time from the start of the movement of the focus lens 302 and the relative position of the focus lens 302 in the imaging apparatus 100 according to the present embodiment.
- the horizontal axis indicates the elapsed time since the movement of the focus lens 302 is started, and the vertical axis indicates the relative position of the focus lens 302.
- the elapsed time from the start of the movement of the focus lens 302 is normalized so that the time from the start of the movement of the focus lens 302 to the stop is 1.
- the relative position of the focus lens 302 is a relative position when the position of the focus lens 302 when the designated position 310 is selected and operated is 0 (zero).
- the phase difference between the right-eye image 300A and the left-eye image 300B is reduced by moving the focus lens 302.
- the movement of the focus lens 302 is stopped at the position where the phase difference between the right-eye image 300A and the left-eye image 300B becomes 0 (zero).
- the method for determining whether or not the subject is in focus is not limited to this.
- FIG. 21 is a graph showing an example of the relationship between the elapsed time from the start of the movement of the focus lens 302 and the relative position of the focus lens 302 in the imaging apparatus 100 according to the present embodiment, as in FIG.
- the horizontal axis indicates the elapsed time from the start of the movement of the focus lens 302, and the vertical axis indicates the relative position of the focus lens 302.
- the elapsed time is normalized so that the time from the start of the movement of the focus lens 302 to the stop is 1.
- the relative position of the focus lens 302 is the relative position when the position of the focus lens 302 at the time the designated position 310 was selected is 0 (zero).
- the contrast value of the split image 300 is derived while moving the focus lens 302, and the movement of the focus lens 302 is stopped at the position where the contrast value is maximized.
- the focus determination is performed based on the contrast value of the split image 300, but the present invention is not limited to this, and the focus determination may be performed based on the contrast value of the normal image 301.
- when confirming the in-focus state of the normal image 301, using the mutual phase difference between the right-eye image 300A and the left-eye image 300B allows the in-focus state to be confirmed at a higher speed, while using the contrast value allows the in-focus state to be confirmed more accurately.
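The contrast-based alternative can be sketched as a scan over candidate lens positions that keeps the position with the maximum contrast value (a hypothetical illustration; the embodiment only states that the focus lens 302 stops at the position where the contrast value is maximized, so the scan strategy and the contrast metric are assumptions):

```python
def best_focus_position(positions, contrast_at):
    """Scan candidate lens positions, evaluate a contrast value at
    each via the supplied contrast_at callable, and return the
    position where the contrast is maximized."""
    best_pos, best_contrast = None, float("-inf")
    for pos in positions:
        contrast = contrast_at(pos)
        if contrast > best_contrast:
            best_pos, best_contrast = pos, contrast
    return best_pos

# Toy contrast curve peaking at position 3.
print(best_focus_position(range(7), lambda p: -(p - 3) ** 2))  # 3
```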
- step S811 as in the process of step S805, an in-focus determination area is determined as an area for confirming the in-focus state on the right-eye image 300A.
- step S813 control is performed to start the movement of the focus lens 302 at a predetermined moving speed (50 pixels/second in the present embodiment) in the direction in which the focal position in the optical axis direction of the focus lens 302 moves away from the imaging device 100.
- step S815 similarly to the process in step S809, it is repeatedly determined whether or not the normal image 301 is in focus. If it is determined that the normal image 301 is in focus, the process proceeds to step S831 described later.
- step S817 it is determined whether or not the moving operation in the touch operation detected by the process in step S405 is the leftward moving operation. If an affirmative determination is made in step S817, the process proceeds to step S819. If a negative determination is made, the movement operation is regarded as a rightward movement, and the process proceeds to step S825 described later.
- step S819 as in the process of step S805, an in-focus determination area is determined as an area for confirming the in-focus state on the left-eye image 300B.
- step S821 control is performed to start the movement of the focus lens 302 at a predetermined moving speed (50 pixels/second in the present embodiment) in the direction in which the focal position in the optical axis direction of the focus lens 302 approaches the imaging device 100.
- step S823 similarly to the processing in step S809, it is repeatedly determined whether or not the normal image 301 is in focus. If it is determined that the normal image 301 is in focus, the process proceeds to step S831 described later.
- step S825 as in the process of step S805, an in-focus determination area is determined as an area for confirming the in-focus state on the left-eye image 300B.
- step S827 control is performed to start the movement of the focus lens 302 at a predetermined moving speed (50 pixels/second in the present embodiment) in the direction in which the focal position in the optical axis direction of the focus lens 302 moves away from the imaging device 100.
- step S829 similarly to the process in step S809, it is repeatedly determined whether or not the normal image 301 is in focus. If it is determined that the normal image 301 is in focus, the process proceeds to step S831.
- step S831 the movement of the focus lens 302 started by any one of steps S807, S813, S821 and S827 is stopped, and this flick processing routine program is terminated.
- FIG. 22 is a flowchart showing the flow of processing of the reciprocal movement processing routine program executed by the CPU 12 during the execution of the imaging control processing program.
- the program is stored in advance in a predetermined storage area of the memory 26.
- step S901 it is determined whether or not the moving operation on the right eye image 300A in the touch operation detected by the processing in step S405 is the moving operation in the right direction. If the determination in step S901 is affirmative, the process proceeds to step S903. If the determination is negative, the moving operation is regarded as a leftward movement, and the process proceeds to step S909 described later.
- step S903 it is determined whether or not the moving operation on the left-eye image 300B in the touch operation detected by the process in step S405 is the leftward moving operation. If the determination in step S903 is affirmative, the process proceeds to step S905. If the determination is negative, the process proceeds to step S907 described later.
- step S905 as in the process of step S507, the focus lens 302 is moved in the direction in which the focus position approaches the imaging device 100 in the optical axis direction of the focus lens 302.
- that is, the designated position 310 on the right-eye image 300A is moved rightward and the designated position 310 on the left-eye image 300B is moved leftward by the user operation via the touch panel 215. In this case, the CPU 12 moves the focus lens 302 in the direction in which the focal position approaches the imaging apparatus 100 along the optical axis direction, so that the focal position of the focus lens 302 approaches the in-focus position.
- step S907 it is determined whether or not the touch operation detected by the process in step S405 continues to touch the touch panel 215. If an affirmative determination is made in step S907, the process returns to step S901. On the other hand, if a negative determination is made, the reciprocal movement process routine program ends, and the process proceeds to step S425 of the imaging control process program (main routine). .
- step S909 it is determined whether or not the moving operation on the left-eye image 300B in the touch operation detected in the process of step S405 is the moving operation in the right direction. If the determination in step S909 is affirmative, the process proceeds to step S911. If the determination is negative, the process proceeds to step S913 described later.
- step S911 as in the process of step S515, the focus lens 302 is moved in the direction away from the imaging apparatus 100 in the optical axis direction of the focus lens 302.
- that is, the designated position 310 on the right-eye image 300A is moved leftward and the designated position 310 on the left-eye image 300B is moved rightward by the user operation via the touch panel 215. In this case, the CPU 12 moves the focus lens 302 in the direction in which the focal position moves away from the imaging device 100 along the optical axis direction, so that the focal position of the focus lens 302 approaches the in-focus position.
- step S913 it is determined whether or not the touch operation detected in the process of step S405 continues to touch the touch panel 215. If an affirmative determination is made in step S913, the process returns to step S901. On the other hand, if a negative determination is made, the reciprocal movement process routine program ends, and the process proceeds to step S425 of the imaging control process program (main routine). .
- in this way, the user touches each of the right-eye image 300A and the left-eye image 300B with two fingers and moves them so that their positions in the intersecting direction coincide with each other.
- the focus lens 302 moves according to the mutual positional relationship between the right-eye image 300A and the left-eye image 300B, and when the positions coincide, the image captured through the focus lens 302 is in a focused state. In this way, the user can perform focusing control by an intuitive operation using the split image 300.
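The direction logic of the reciprocal movement process (steps S901 to S911) can be sketched as follows (function and value names are hypothetical; only the pairing of gesture directions with lens directions follows the text):

```python
def lens_direction(right_eye_move, left_eye_move):
    """Map the two-finger gesture on the split image to a focus-lens
    direction. The arguments are the movement directions ('right' or
    'left') of the designated positions on the right-eye image 300A
    and the left-eye image 300B, respectively.

    Returns 'approach' (the focal position moves toward the imaging
    device), 'away' (it moves away), or None when the two fingers do
    not move in opposite directions.
    """
    if right_eye_move == "right" and left_eye_move == "left":
        return "approach"   # steps S901 -> S903 -> S905
    if right_eye_move == "left" and left_eye_move == "right":
        return "away"       # steps S901 -> S909 -> S911
    return None

print(lens_direction("right", "left"))   # approach
print(lens_direction("left", "right"))   # away
print(lens_direction("right", "right"))  # None
```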
- in the above embodiment, a single phase difference pixel is arranged for each 2×2-pixel G filter, but the present invention is not limited to this; a pair of the first pixel L and the second pixel R may be arranged for a 2×2-pixel G filter.
- for example, a pair of the first pixel L and the second pixel R adjacent to each other in the row direction may be arranged for a 2×2-pixel G filter.
- alternatively, a pair of the first pixel L and the second pixel R adjacent to each other in the column direction may be arranged for a 2×2-pixel G filter.
- in either case, it is preferable that the positions of the first pixel L and the second pixel R be aligned within a predetermined number of pixels in at least one of the column direction and the row direction between the first pixel group and the second pixel group.
- FIGS. 26 and 27 show an example in which the first pixel L and the second pixel R are arranged at positions aligned within one pixel in each of the column direction and the row direction between the first pixel group and the second pixel group.
- the color filter 21 is exemplified, but the present invention is not limited to this.
- the arrangement of the primary colors (R filter, G filter, B filter) of the color filter may be a Bayer arrangement.
- phase difference pixels are arranged for the G filter.
- a phase difference pixel is arranged at the center of an array pattern G1 in which the four corners and the center of a 3×3-pixel square matrix are G filters. The first pixel L and the second pixel R are alternately arranged in each of the row direction and the column direction, skipping the G filter for one pixel (that is, with the G filter for one pixel in between). Further, the first pixel L and the second pixel R are arranged at positions aligned within one pixel in the column direction and the row direction between the first pixel group and the second pixel group.
- with this configuration, an image based on the phase difference pixel at the center of the array pattern G1 can be interpolated using an image based on the normal pixels at the four corners of the array pattern G1, so that the interpolation accuracy can be improved compared to the case without this configuration.
- moreover, the positions of the array patterns G1 do not overlap each other. That is, the first pixel L and the second pixel R are arranged at positions where the pixels included in the third pixel group, which are adjacent to the pixels included in the first and second pixel groups and are used to interpolate the first image and the second image with the third image, do not overlap in pixel units. Therefore, it is possible to avoid an image based on a phase difference pixel being interpolated with an image based on a normal pixel that is also used in the interpolation of an image based on another phase difference pixel, so that further improvement in interpolation accuracy can be expected.
- the color filter 21B shown in FIG. 29 has a phase difference pixel arranged at the center of the array pattern G1 and at the lower right corner in front view in the figure.
- the first pixel L and the second pixel R are alternately arranged in each of the row direction and the column direction, skipping the G filters for two pixels (that is, with the G filters for two pixels in between).
- the first pixel L and the second pixel R are arranged at positions where the positions in the column direction and the row direction are aligned within one pixel between the first pixel group and the second pixel group.
- the first pixel L and the second pixel R can be adjacent to each other. Therefore, it is possible to suppress the occurrence of image shift due to factors other than focus shift.
- the interpolation accuracy can be improved.
- the positions of the array patterns G1 do not overlap each other. That is, the first pixel L and the second pixel R are interpolated using the third image produced by the pixels of the third pixel group that are adjacent to the pixels of the first and second pixel groups.
- the pixels included in the second image are arranged at positions that do not overlap in a pair of pixel units.
- the “pair of pixels” refers to, for example, a first pixel L and a second pixel R (a pair of phase difference pixels) included in each array pattern G1. Therefore, it is possible to avoid an image based on a pair of phase difference pixels being interpolated by an image based on normal pixels used in the interpolation of an image based on another pair of phase difference pixels. Therefore, further improvement in interpolation accuracy can be expected.
- the first pixel L is arranged at the center of the array pattern G1, and the second pixel R is arranged at the lower right corner when viewed from the front in the figure.
- the first pixel L is arranged with the G filters for two pixels skipped in each of the row direction and the column direction.
- the second pixel R is likewise arranged with the G filters for two pixels skipped in each of the row direction and the column direction.
- the first pixel L and the second pixel R are arranged at positions where the positions in the column direction and the row direction are aligned within the two pixels between the first pixel group and the second pixel group.
- the first pixel L and the second pixel R can be adjacent to each other. Therefore, it is possible to suppress the occurrence of image shift due to factors other than focus shift.
- the positions of the array patterns G1 do not overlap each other. Therefore, it is possible to avoid an image based on a phase difference pixel being interpolated with an image based on a normal pixel used in the interpolation of an image based on another phase difference pixel. Therefore, further improvement in interpolation accuracy can be expected.
- FIG. 31 schematically shows an example of the arrangement of the primary colors (R filter, G filter, B filter) of a color filter 21D provided in the image sensor 20 and the arrangement of the light-shielding members.
- the first to fourth row arrangements are repeatedly arranged in the column direction.
- the first row arrangement refers to an arrangement in which B filters and G filters are alternately arranged along the row direction.
- the second row arrangement refers to an arrangement obtained by shifting the first row arrangement by a half pitch (half pixel) in the row direction.
- the third row arrangement refers to an arrangement in which G filters and R filters are alternately arranged along the row direction.
- the fourth row arrangement refers to an arrangement obtained by shifting the third row arrangement by a half pitch in the row direction.
- the first row arrangement and the second row arrangement are adjacent to each other with a half pitch shift in the column direction.
- the second row arrangement and the third row arrangement are also adjacent to each other with a half-pitch shift in the column direction.
- the third row arrangement and the fourth row arrangement are adjacent to each other with a half-pitch shift in the column direction.
- the fourth row arrangement and the first row arrangement are also adjacent to each other with a half-pitch shift in the column direction. Accordingly, each of the first to fourth row arrays repeatedly appears every two pixels in the column direction.
- the first pixel L and the second pixel R are assigned to the third and fourth row arrangements, as shown in FIG. 31 as an example. That is, the first pixel L is assigned to the third row arrangement, and the second pixel R is assigned to the fourth row arrangement. Further, the first pixel L and the second pixel R are arranged adjacent to each other (at the minimum pitch) as a pair. In the example shown in FIG. 31, the first pixel L is assigned every six pixels in each of the row direction and the column direction, and the second pixel R is likewise assigned every six pixels in each of the row direction and the column direction. Thereby, the phase difference between the first pixel group and the second pixel group is calculated with higher accuracy than in the case without this configuration.
- G filters are assigned to the first pixel L and the second pixel R. Since the pixel provided with the G filter is more sensitive than the pixel provided with the filter of other colors, the interpolation accuracy can be increased. In addition, since the G filter has continuity compared to other color filters, pixels to which the G filter is assigned are more easily interpolated than pixels to which the other color filter is assigned.
- the split image divided in the vertical direction is exemplified.
- the present invention is not limited to this, and an image divided into a plurality of parts in the horizontal direction or the diagonal direction may be applied as the split image.
- the split image 66a shown in FIG. 32 is divided into odd lines and even lines by a plurality of boundary lines 63a parallel to the row direction.
- a line-shaped (e.g., strip-shaped) phase difference image 66La generated based on the output signal output from the first pixel group is displayed on the odd lines (the even lines are also acceptable).
- a line-shaped (e.g., strip-shaped) phase difference image 66Ra generated based on the output signal output from the second pixel group is displayed on the even lines.
- the split image 66b shown in FIG. 33 is divided into two by a boundary line 63b inclined with respect to the row direction (for example, a diagonal line of the split image 66b).
- the phase difference image 66Lb generated based on the output signal output from the first pixel group is displayed in one area.
- the phase difference image 66Rb generated based on the output signal output from the second pixel group is displayed in the other region.
- the split image 66c shown in FIGS. 34A and 34B is divided by a grid-like boundary line 63c parallel to the row direction and the column direction, respectively.
- the phase difference image 66Lc generated based on the output signal output from the first pixel group is displayed in a checkered pattern (checker pattern).
- the phase difference image 66Rc generated based on the output signal output from the second pixel group is displayed in a checkered pattern.
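The checkered display described above can be sketched roughly as follows. The function name, the equal-sized 2D lists, and the fixed cell size are illustrative assumptions; the actual composition performed in the image processing unit is not specified at this level of detail:

```python
def checker_split(left, right, cell=8):
    """Compose a checkered focus-confirmation image from the two
    phase-difference images (equal-sized 2D lists), alternating
    cell x cell blocks between them."""
    height, width = len(left), len(left[0])
    out = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            use_left = ((y // cell) + (x // cell)) % 2 == 0
            out[y][x] = left[y][x] if use_left else right[y][x]
    return out
```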
- the image is not limited to the split image; another focus-confirmation image may be generated from the two phase difference images and displayed.
- two phase difference images may be superimposed and displayed as a composite image. When out of focus, the composite image is displayed as a double image; when in focus, the image is displayed clearly.
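The superimposed (composite) display can be sketched as a simple per-pixel average of the two phase-difference images; misaligned (out-of-focus) regions then visibly double. The averaging rule is an assumption for illustration:

```python
def overlay(left, right):
    """Per-pixel average of the two phase-difference images: regions whose
    left/right images are misaligned (out of focus) appear doubled, while
    aligned (in-focus) regions appear sharp."""
    return [[(lv + rv) / 2 for lv, rv in zip(lrow, rrow)]
            for lrow, rrow in zip(left, right)]
```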
- since the present embodiment includes the same configuration and operation as those of the first embodiment, the same configuration and operation are denoted as such and detailed description thereof is omitted.
- FIG. 35 is a flowchart illustrating an example of a process flow of a scroll process routine program according to the present embodiment that is executed by the CPU 12 during the shooting control process.
- the program is stored in advance in a predetermined storage area of the memory 26.
- in step S1001, it is determined whether or not the touch operation detected by the process in step S405 is a touch operation near the boundary line 311 (see FIG. 36) between the right-eye image 300A and the left-eye image 300B.
- a range having a width in the division direction centered on the boundary line 311 between the right-eye image 300A and the left-eye image 300B (a predetermined range straddling the boundary line 311) is determined in advance. If the touched position is within the range, it is determined that the touch operation detected by the process of step S405 is a touch operation near the boundary line 311.
- the method for determining whether or not the touch operation is near the boundary line 311 is not limited to this. For example, as described in the first embodiment, an in-focus determination area may be determined, and the determination may be made based on whether or not the boundary line 311 is included in the in-focus determination area.
- if the determination in step S1001 is affirmative, the process proceeds to step S1003. If the determination is negative, the scroll processing routine program is terminated, and the process proceeds to step S425 of the photographing control processing program (main routine).
- in step S1003, it is determined whether or not the contact position (designated position) exceeds the boundary line 311. If the determination is negative, the process enters a standby state; if the determination is affirmative, the process proceeds to step S1005.
- FIG. 36 is a front view for explaining an example of a split image display state when a scroll operation is performed in the imaging apparatus 100 according to the present embodiment.
- FIG. 36 shows a case where an image divided into a plurality of parts is applied as the split image 300, in the same manner as the split image 66a shown in FIG. 32 of the first embodiment.
- in the example shown in FIG. 36, when the contact position is the designated position 310B, the left-eye image 300B is located in the area above the boundary line 311.
- if the determination in step S1005 is affirmative, the process proceeds to step S1007. If the determination is negative, the process proceeds to step S1015 described later.
- in step S1007, it is determined whether or not the operation draws a clockwise circle.
- the locus of the designated position 310 (the contact position of the user) is detected, and whether the operation draws a circle, and whether the circle is clockwise or counterclockwise, is detected based on the locus. If the determination is affirmative based on the detection result, the process proceeds to step S1009; if the determination is negative, the process proceeds to step S1011. Note that FIG. 36 shows a trajectory when an operation of drawing a clockwise circle is performed.
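One common way to decide whether a closed trajectory is clockwise or counterclockwise is the signed (shoelace) area of the sampled locus. The passage above does not specify the detection method, so this is only a plausible sketch, assuming screen coordinates with y increasing downward:

```python
def gesture_direction(points):
    """Classify a sampled touch locus as 'clockwise' or 'counterclockwise'
    via the shoelace signed area. With screen coordinates (y grows downward),
    a positive sum corresponds to a visually clockwise circle."""
    signed = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]  # wrap around to close the loop
        signed += x1 * y2 - x2 * y1
    return "clockwise" if signed > 0 else "counterclockwise"
```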
- in step S1009, the focus lens 302 is moved by a predetermined distance in the direction of focusing on a subject farther than the current in-focus position in the optical axis direction of the focus lens 302 (the direction in which the focal position moves away from the imaging apparatus 100).
- the distance by which the focus lens 302 is moved per one rotation operation is determined in advance. As described above, in the imaging apparatus 100 of the present embodiment, when a scroll operation drawing a clockwise circle is performed while the left-eye image 300B is located in the area above the boundary line 311, it is assumed that the same operation as when the left-eye image 300B is scrolled to the right in the first embodiment has been performed.
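The fixed-distance-per-rotation behavior for the case where the left-eye image 300B is above the boundary line can be sketched as below. The distance value and the sign convention (positive means the focal position moves away from the camera) are illustrative assumptions:

```python
DIST_PER_ROTATION = 0.5  # assumed lens travel (arbitrary units) per full circle

def lens_displacement(rotations, clockwise=True):
    """Fixed, predetermined travel per rotation operation. For the left-eye
    image above the boundary line, clockwise moves the focal position away
    from the camera (positive); counterclockwise brings it nearer (negative)."""
    sign = 1 if clockwise else -1
    return sign * rotations * DIST_PER_ROTATION
```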
- FIG. 37 is a front view for explaining an example of the split image display state when the scroll operation of drawing a circle (clockwise) is performed repeatedly. When the circle-drawing scroll operation (clockwise) is repeated as in the example shown in FIG. 37, the determination in step S1025 is affirmative and the process returns to step S1003; if the determination is negative, the scroll processing routine program is terminated, and the process proceeds to step S425 of the photographing control processing program (main routine).
- in step S1011, contrary to the process in step S1007, it is determined whether or not the operation draws a counterclockwise circle.
- the locus of the designated position 310 (the contact position of the user) is detected, and whether the operation draws a circle, and whether the circle is clockwise or counterclockwise, is detected based on the locus. If an affirmative determination is made based on the detection result, the process proceeds to step S1013; if a negative determination is made, the scroll processing routine program is terminated, and the process proceeds to step S425 of the photographing control processing program (main routine).
- in step S1013, the focus lens 302 is moved by a predetermined distance in the direction of focusing on a subject nearer than the current in-focus position in the optical axis direction of the focus lens 302 (the direction in which the focal position approaches the imaging apparatus 100), and then the process proceeds to step S1025.
- the distance by which the focus lens 302 is moved per one rotation operation is determined in advance. As described above, in the imaging apparatus 100 according to the present embodiment, when a scroll operation drawing a counterclockwise circle is performed while the left-eye image 300B is on the boundary line 311, it is assumed that the same operation as when the left-eye image 300B is scrolled to the left in the first embodiment has been performed.
- in step S1015, it is determined whether or not the right-eye image 300A is located in the area above the boundary line 311.
- in the example shown in FIG. 36, when the contact position is the designated position 310A or the designated position 310C, the right-eye image 300A is located in the area above the boundary line 311.
- if the determination in step S1015 is affirmative, the process proceeds to step S1017. If the determination is negative, the scroll processing routine program is terminated, and the process proceeds to step S425 of the photographing control processing program (main routine).
- in step S1017, as in the process of step S1007, it is determined whether or not the operation draws a clockwise circle. If the determination in step S1017 is affirmative, the process proceeds to step S1019; if the determination is negative, the process proceeds to step S1021.
- in step S1019, as in the process of step S1013, the focus lens 302 is moved by a predetermined distance in the direction of focusing on a subject nearer than the current in-focus position in the optical axis direction of the focus lens 302 (the direction in which the focal position approaches the imaging apparatus 100), and then the process proceeds to step S1025.
- in the imaging apparatus 100, when a scroll operation drawing a clockwise circle is performed while the right-eye image 300A is on the boundary line 311, it is assumed that the same operation as when the right-eye image 300A is scrolled to the right in the first embodiment has been performed.
- when the designated position 310 is rotated clockwise by the user's operation via the touch panel 215 while the right-eye image 300A is on the boundary line 311, the CPU 12 moves the focus lens 302 along the optical axis direction in the direction in which the focal position approaches the imaging apparatus 100, thereby bringing the focus lens 302 closer to the in-focus position.
- in step S1021, as in the process of step S1011, it is determined whether or not the operation draws a counterclockwise circle. If the determination in step S1021 is affirmative, the process proceeds to step S1023; if the determination is negative, the scroll processing routine program is terminated, and the process proceeds to step S425 of the photographing control processing program (main routine).
- in step S1023, as in the process of step S1009, the focus lens 302 is moved by a predetermined distance in the direction of focusing on a subject farther than the current in-focus position in the optical axis direction of the focus lens 302 (the direction in which the focal position moves away from the imaging apparatus 100), and then the process proceeds to step S1025.
- in the imaging apparatus 100, when a scroll operation drawing a counterclockwise circle is performed while the right-eye image 300A is on the boundary line 311, it is assumed that the same operation as when the right-eye image 300A is scrolled to the left in the first embodiment has been performed.
- the imaging apparatus 100 moves the focus lens 302 based on the operation of drawing a circle in the scroll process.
- the moving speed for moving the focus lens 302 may be changed according to the speed of the circle drawing operation.
- another example 1 of the scroll process will be described.
- FIG. 39 is a flowchart showing the flow of processing of a scroll processing routine program of another example executed by the CPU 12 during the execution of the photographing control processing program.
- the program is stored in advance in a predetermined storage area of the memory 26.
- since the scroll process of another example 1 includes the same processes as the scroll process of the present embodiment, the same processes are denoted as such and detailed description thereof is omitted.
- steps S1101 to S1107 correspond to the processes of steps S1001 to S1007 of the scroll process, respectively.
- when the left-eye image 300B is located in the area above the boundary line 311, if the operation draws a clockwise circle in step S1107, the determination is affirmative and the process proceeds to step S1109; otherwise, the process proceeds to step S1117.
- in step S1109, the angular velocity of the circular motion of the designated position 310 is acquired.
- the method of acquiring the angular velocity is not particularly limited; for example, it may be acquired in the same manner as the moving speed in the first embodiment.
- the moving speed of the focus lens 302 is derived.
- a correspondence relationship between the angular velocity of the circular motion at which the designated position 310 moves and the moving velocity of the focus lens 302 is stored in a predetermined storage area of the memory 26 in advance.
- FIG. 40 is a graph illustrating an example of a correspondence relationship between the angular speed of the circular motion and the moving speed of the focus lens 302 in the imaging apparatus 100 according to the present embodiment.
- in FIG. 40, the horizontal axis indicates the angular velocity of the circular motion, and the vertical axis indicates the moving speed of the focus lens 302.
- the moving speed of the focus lens 302 increases as the angular speed of the circular motion increases.
- when the angular velocity of the circular motion is between a predetermined third threshold value (−40 deg/sec in the present embodiment) and a predetermined fourth threshold value (40 deg/sec in the present embodiment), the moving speed of the focus lens 302 is 0 (zero).
- the memory 26 stores in advance a table corresponding to the graph shown in FIG. 40 (a table showing the correspondence between the angular velocity and the moving velocity).
- the moving speed of the focus lens 302 is gradually increased as the angular speed of the circular motion is increased.
- the moving speed of the focus lens 302 decreases as the angular speed of the circular motion decreases.
- in step S1111, the table indicating the correspondence illustrated in FIG. 40 is read from the memory 26, and the moving speed corresponding to the angular velocity acquired in step S1109 is derived from the correspondence.
- in the present embodiment, the moving speed of the focus lens 302 is derived based on the table indicating the correspondence between the angular velocity of the circular motion of the designated position 310 and the moving speed, but the present invention is not limited to this.
- for example, the moving speed of the focus lens 302 may be derived by calculation based on the angular velocity of the circular motion of the designated position 310.
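The correspondence of FIG. 40, with its dead zone between the third and fourth thresholds (−40 and 40 deg/sec), could be approximated by a piecewise function like the following. The linear gain outside the dead zone is an assumption standing in for the stored table:

```python
THRESH_CCW = -40.0  # third threshold, deg/sec (counterclockwise side)
THRESH_CW = 40.0    # fourth threshold, deg/sec (clockwise side)
GAIN = 0.05         # assumed speed units per deg/sec outside the dead zone

def lens_speed(angular_velocity):
    """Dead zone between the two thresholds yields speed 0; outside it the
    speed grows with angular velocity (the sign encodes the direction)."""
    if THRESH_CCW <= angular_velocity <= THRESH_CW:
        return 0.0
    if angular_velocity > THRESH_CW:
        return (angular_velocity - THRESH_CW) * GAIN
    return (angular_velocity - THRESH_CCW) * GAIN
```

The dead zone keeps the lens still during slow, incidental finger motion, which matches the graph's flat region around zero angular velocity.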
- in step S1113, the focus lens 302 is moved by a predetermined distance, at the moving speed derived in step S1111, in the direction of focusing on a subject farther than the current in-focus position in the optical axis direction of the focus lens 302 (the direction in which the focal position moves away from the imaging apparatus 100).
- step S1115 corresponds to the process of step S1025 of the scroll process, and it is determined whether or not contact with the touch panel 215 is continued. If the determination is affirmative, the process returns to step S1103; if the determination is negative, the scroll processing routine program is terminated, and the process proceeds to step S425 of the photographing control processing program (main routine).
- step S1117 corresponds to the process of step S1011 of the scroll process. If the operation draws a counterclockwise circle in step S1117, the determination is affirmative and the process proceeds to step S1119; if the determination is negative, the scroll processing routine program is terminated, and the process proceeds to step S425 of the photographing control processing program (main routine).
- in step S1119, as in the process of step S1109, the angular velocity of the circular motion of the designated position 310 is acquired.
- in step S1121, the moving speed of the focus lens 302 is derived in the same manner as the process of step S1111.
- in step S1123, the focus lens 302 is moved by a predetermined distance, at the moving speed derived in step S1121, in the direction of focusing on a subject nearer than the current in-focus position in the optical axis direction of the focus lens 302 (the direction in which the focal position approaches the imaging apparatus 100), and then the process proceeds to step S1115.
- step S1125, to which the process proceeds upon a negative determination in step S1105 as to whether or not the left-eye image 300B is located in the area above the boundary line 311, and step S1127 correspond to the processes of step S1015 and step S1017 of the scroll process, respectively.
- when the right-eye image 300A is located in the area above the boundary line 311, if the operation draws a clockwise circle in step S1127, the determination is affirmative and the process proceeds to step S1129; otherwise, the process proceeds to step S1135.
- in step S1129, as in the process of step S1109, the angular velocity of the circular motion of the designated position 310 is acquired.
- in step S1131, the moving speed of the focus lens 302 is derived in the same manner as the process of step S1111.
- in step S1133, the focus lens 302 is moved by a predetermined distance, at the moving speed derived in step S1131, in the direction of focusing on a subject nearer than the current in-focus position in the optical axis direction of the focus lens 302 (the direction in which the focal position approaches the imaging apparatus 100), and then the process proceeds to step S1115.
- step S1135 corresponds to the process of step S1021 of the scroll process. If the operation draws a counterclockwise circle in step S1135, the determination is affirmative and the process proceeds to step S1137; if the determination is negative, the scroll processing routine program is terminated, and the process proceeds to step S425 of the photographing control processing program (main routine).
- in step S1137, as in the process of step S1109, the angular velocity of the circular motion of the designated position 310 is acquired.
- in step S1139, the moving speed of the focus lens 302 is derived in the same manner as the process of step S1111.
- in step S1141, the focus lens 302 is moved by a predetermined distance, at the moving speed derived in step S1139, in the direction of focusing on a subject farther than the current in-focus position in the optical axis direction of the focus lens 302 (the direction in which the focal position moves away from the imaging apparatus 100), and then the process proceeds to step S1115.
- as described above, the imaging apparatus 100 moves the focus lens 302 based on the operation of drawing a circle in the scroll process of another example 1. Further, the moving speed of the focus lens 302 may be set to a moving speed corresponding to the angular velocity of the circular motion.
- a circular motion may be performed so as to draw a circle over a plurality of divided images.
- each scroll process may be performed every time the boundary line 311 is crossed.
- the present invention is not limited to this; it is only necessary that the movement locus of the designated position 310 can be divided into a movement operation in the crossing direction straddling the boundary line 311 and a movement operation in the division direction. Note that, as in the scroll process of this embodiment, moving the designated position 310 so as to draw a circle can give the user an operational feeling similar to manually rotating the focus ring 260 to adjust the focus (so-called manual focus).
- FIG. 41 is a block diagram illustrating an example of the configuration of the electrical system of the imaging apparatus 100 according to the present embodiment. As shown in FIG. 41, in addition to the configuration of the imaging apparatus 100 of each of the embodiments described above, the imaging apparatus 100 of the present embodiment includes a notification control unit 38 and a vibration member 217 for notifying the user that the camera is in focus.
- the notification control unit 38 is connected to the vibration member 217 and controls the vibration member 217 to vibrate when the right-eye image 300A and the left-eye image 300B are in focus.
- the vibration member 217 of the imaging apparatus 100 according to the present embodiment is provided at a portion of the touch panel 215 where the user performs the contact operation.
- FIG. 42 is a flowchart showing the flow of processing of the focus notification processing routine program executed by the CPU 12 in combination with the execution of the shooting control processing program.
- the program is stored in advance in a predetermined storage area of the memory 26.
- in step S1201, it is determined whether or not the focus lens 302 is moving. If the determination in step S1201 is affirmative, the process proceeds to step S1203. If the determination is negative, the focus notification processing routine program ends.
- in step S1203, it is determined whether or not the subject image (the right-eye image 300A and the left-eye image 300B) near the contact position (designated position 310) is in focus. When the split image 300 includes a plurality of subject images (right-eye images 300A and left-eye images 300B), not all of the subject images may be in focus. For example, in the imaging apparatus 100 of the present embodiment, only one of the subject image of the far-side subject and the subject image of the near-side subject is in focus with respect to the imaging apparatus 100. Therefore, in the imaging apparatus 100 of the present embodiment, it is determined whether or not the subject images (the right-eye image 300A and the left-eye image 300B) in the vicinity of the designated position 310 (the contact position where the user is in contact) in the above embodiment are in focus.
- a predetermined area centered on the intersection with the right-eye image 300A is provided, and it is determined whether the right-eye image 300A and the left-eye image 300B included in the area are in focus. Note that the method for determining whether or not the subject is in focus may be the same as in the shooting control process of each of the above embodiments.
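One simple stand-in for the in-focus determination within the predetermined area is to threshold the mean absolute phase shift between the right-eye and left-eye images there. The criterion, names, and threshold value are illustrative assumptions, not the device's actual method:

```python
def is_in_focus(phase_shifts, threshold=1.0):
    """phase_shifts: per-pixel horizontal shifts between the right-eye and
    left-eye images inside the determination area. Treat the area as in
    focus when the mean absolute shift falls below the threshold."""
    mean_abs = sum(abs(s) for s in phase_shifts) / len(phase_shifts)
    return mean_abs < threshold
```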
- if the determination in step S1203 is affirmative, the process proceeds to step S1205. If the determination is negative, the process returns to step S1201.
- in step S1205, the vibration member 217 is vibrated to notify the in-focus state, and then the focus notification processing routine program is terminated.
- the vibration member 217 vibrates the portion of the touch panel 215 where the user performs the contact operation, specifically, the portion corresponding to the designated position 310 as described above.
- the portion to be vibrated by the vibration member 217 is not limited thereto, and may be, for example, the entire touch panel 215 or the entire imaging apparatus 100 (camera body 200).
- as described above, in the focus notification process, the imaging apparatus 100 notifies the user that the subject image (the right-eye image 300A and the left-eye image 300B) is in focus during the movement of the focus lens 302.
- the movement of the focus lens 302 may be stopped.
- another example 1 of the focusing notification process will be described.
- FIG. 43 is a flowchart showing the flow of processing of a focus notification processing routine program of another example executed by the CPU 12 in combination with execution of the shooting control processing program.
- the program is stored in advance in a predetermined storage area of the memory 26.
- since the focus notification process of another example 1 includes the same processes as the focus notification process of the present embodiment, the same processes are denoted as such and detailed description thereof is omitted.
- steps S1301 to S1305 correspond to the processes of steps S1201 to S1205 of the focus notification process, respectively.
- in step S1305, when the subject image (the right-eye image 300A and the left-eye image 300B) near the contact position comes into focus (affirmative determination in step S1303) while the focus lens 302 is moving (affirmative determination in step S1301), the vibration member 217 is vibrated to notify the in-focus state, and then the process proceeds to step S1307.
- in step S1307, the movement of the focus lens 302 is stopped, and then this process is terminated.
- as described above, in the focus notification process of another example 1, when the subject images (the right-eye image 300A and the left-eye image 300B) come into focus during the movement of the focus lens 302, the user is notified of this, and the movement of the focus lens 302 is stopped.
- as described above, the imaging apparatus 100 moves the focus lens 302 based on the operation of drawing a circle in the scroll process of another example 1, and the moving speed of the focus lens 302 may be set according to the angular velocity of the circular motion.
- in the above, the notification is performed by vibrating the contact portion of the touch panel 215 with the vibration member 217, but the notification method is not limited to this method.
- an audio output unit such as a speaker may be provided, and notification may be performed by outputting sound.
- the colors of the right-eye image 300A, the left-eye image 300B, the split image 300, and the like may be changed.
- a display notifying that the subject is in focus may be performed on the display unit 213.
- the focus notification process is described as another process that is preferably combined with the shooting control process described in the first and second embodiments.
- as another such process, an enlargement/reduction control process of the split image 300 will be described in detail below with reference to the drawings.
- FIG. 44 is a flowchart showing the flow of processing of the enlargement / reduction control processing routine program executed by the CPU 12 in combination with the execution of the shooting control processing program.
- the program is stored in advance in a predetermined storage area of the memory 26.
- in step S1501, it is determined whether or not a scroll operation is being performed in the vertical direction (division direction). If the determination in step S1501 is affirmative, the process proceeds to step S1503; if the determination is negative, the enlargement/reduction control processing routine program is terminated.
- in step S1503, the split image 300 displayed on the display unit 213 of the display input unit 216 is enlarged or reduced according to the scroll direction (the moving direction of the designated position 310, upward or downward), and then this enlargement/reduction control processing routine program is terminated.
- In the present embodiment, the split image 300 is enlarged or reduced according to the scroll direction: the split image 300 is enlarged when the scroll direction is upward, and reduced when the scroll direction is downward.
- A predetermined enlargement rate and reduction rate corresponding to one scroll operation may be determined in advance.
- Alternatively, the enlargement ratio and the reduction ratio may be determined in advance according to the scroll distance (the movement distance of the designated position 310); in this case, for example, a larger enlargement ratio or reduction ratio may be set for a longer scroll distance.
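The scroll-direction and scroll-distance rules above can be sketched as follows. The step ratio and the per-pixel rate are illustrative assumptions of this sketch, not values from the disclosure.

```python
# Sketch of the scroll-driven zoom of FIG. 44 (steps S1501/S1503).
# BASE_RATE and RATE_PER_PIXEL are assumed values for illustration.

BASE_RATE = 1.25        # assumed zoom step for one scroll operation
RATE_PER_PIXEL = 0.002  # assumed extra rate per pixel of scroll distance

def zoom_factor(direction: str, distance_px: float = 0.0) -> float:
    """Return a multiplicative zoom factor for one vertical scroll.

    direction: "up" enlarges, "down" reduces; anything else leaves the
    size unchanged, mirroring the negative determination in step S1501.
    distance_px: longer scrolls give a proportionally larger change.
    """
    if direction not in ("up", "down"):
        return 1.0  # not a vertical scroll: routine ends without zooming
    rate = BASE_RATE + RATE_PER_PIXEL * distance_px
    return rate if direction == "up" else 1.0 / rate
```

A scroll of 100 pixels upward would then scale the split image by 1.45 under these assumed constants.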
- As described above, in the enlargement / reduction control process, the imaging apparatus 100 of the present embodiment enlarges or reduces the split image 300 in accordance with the scroll direction when a scroll operation in the vertical direction (division direction) is performed.
- the subject images (the right-eye image 300A and the left-eye image 300B) are easy to see, and the user can easily check the in-focus state.
- The movement operation of the designated position 310 performed for controlling the in-focus state in the shooting control process is a movement operation in the horizontal direction. Therefore, in the imaging apparatus 100 of the present embodiment, since the direction of the movement operation (scroll operation) differs between the control of the in-focus state in the shooting control process and the enlargement / reduction control process, the two can be clearly distinguished from each other.
- In some cases, however, the movement operation of the designated position 310 performed for controlling the in-focus state in the shooting control process includes both a movement operation in the horizontal direction and a movement operation in the vertical direction (division direction).
- the enlargement / reduction control processing is not limited to the processing described in the above embodiments.
- another example of the enlargement / reduction control process will be described.
- FIG. 45 is a flowchart showing the flow of processing of the enlargement / reduction control processing routine program of another example 1 executed by the CPU 12 in combination with the execution of the photographing control processing program.
- the program is stored in advance in a predetermined storage area of the memory 26.
- Since the enlargement / reduction control process of the different example 1 includes the same processes as the enlargement / reduction control process of the above-described embodiment, those processes are noted as such and a detailed description thereof is omitted.
- In step S1601, it is determined whether two points in the split image 300 have been touched, that is, whether there are two designated positions 310. If the determination in step S1601 is affirmative, the process proceeds to step S1603; if the determination is negative, the enlargement / reduction control processing routine program is terminated.
- In step S1603, it is determined whether or not a scroll operation in which the two designated positions 310 move apart in the vertical direction (a so-called pinch open operation (pinch-out operation)) has been performed. If the determination in step S1603 is affirmative, the process proceeds to step S1605; if the determination is negative, the process proceeds to step S1607.
- FIG. 46 is a front view for explaining an example of a split image display state when a split image enlargement operation is performed in the imaging apparatus 100 according to the present embodiment.
- In step S1605, after the split image 300 displayed on the display unit 213 of the display input unit 216 is enlarged, the enlargement / reduction control processing routine program is terminated.
- FIG. 47, described later, shows a state in which the split image 300 is displayed enlarged on the display unit 213.
- the enlargement rate may be the same as that in step S1503 of the enlargement / reduction control process.
- In step S1607, it is determined whether or not a scroll operation in which the two designated positions 310 move in a direction approaching each other vertically (a so-called pinch close operation (pinch-in operation)) has been performed. If the determination in step S1607 is affirmative, the process proceeds to step S1609; if the determination is negative, the enlargement / reduction control processing routine program is terminated.
- FIG. 47 is a front view for explaining an example of the split image display state when a split image reduction operation is performed in the imaging apparatus 100 according to the present embodiment. FIG. 47 illustrates a case where the size of the enlarged split image 300 is larger than the size of the entire display area of the display unit 213.
- In step S1609, after the split image 300 displayed on the display unit 213 of the display input unit 216 is reduced, the enlargement / reduction control processing routine program is terminated.
- the reduction ratio may be the same as that in step S1503 of the enlargement / reduction control process.
- As described above, when two designated positions 310 are designated in the enlargement / reduction control process of the different example 1, the split image 300 is enlarged or reduced depending on whether the designated positions 310 move apart from or approach each other in the vertical direction (division direction).
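The pinch-open / pinch-close determination of steps S1603 and S1607 can be sketched as a comparison of the vertical separation of the two touch points before and after the gesture. The point representation and the threshold are assumptions of this sketch.

```python
# Illustrative classification of a two-finger gesture into pinch-open
# (enlarge, step S1605) or pinch-close (reduce, step S1609). Points are
# (x, y) tuples; the 10-pixel threshold is an assumed dead zone.

def classify_pinch(start_a, start_b, end_a, end_b, threshold=10.0):
    """Return "open", "close", or None based on how the vertical
    distance between the two designated positions changed."""
    d0 = abs(start_a[1] - start_b[1])  # initial vertical separation
    d1 = abs(end_a[1] - end_b[1])      # final vertical separation
    if d1 - d0 > threshold:
        return "open"    # positions moved apart vertically -> enlarge
    if d0 - d1 > threshold:
        return "close"   # positions approached vertically -> reduce
    return None          # negative determination: routine ends
```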
- FIG. 48 is a flowchart showing the flow of processing of the enlargement / reduction control processing routine program of another example 2 executed by the CPU 12 in combination with the execution of the photographing control processing program.
- the program is stored in advance in a predetermined storage area of the memory 26.
- Since the enlargement / reduction control process of the different example 2 includes the same processes as the enlargement / reduction control process of the present embodiment, those processes are noted as such and a detailed description thereof is omitted.
- step S1701 corresponds to the process of step S1601 of the enlargement / reduction control process of the different example 1.
- In step S1701, it is determined whether or not two points in the split image 300 are touch-operated (that is, whether there are two designated positions 310). If the determination is affirmative, the process proceeds to step S1703; if the determination is negative, the enlargement / reduction control processing routine program is terminated.
- In step S1703, it is determined whether or not the two designated positions 310 sandwich the boundary line 311. For example, the determination is made based on whether or not a straight line connecting the two designated positions 310 intersects the boundary line 311. FIGS. 46 and 47 show a case where the designated positions 310 sandwich the boundary line 311.
- The distance from the boundary line 311 to each designated position 310 is not particularly limited, and the distances of the two designated positions 310 from the boundary line 311 may differ. If the determination in step S1703 is affirmative, the process proceeds to step S1705; if the determination is negative, the process proceeds to step S1713.
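Because the boundary lines 311 are horizontal, the intersection test suggested for step S1703 reduces to checking that the boundary's y-coordinate lies between the y-coordinates of the two designated positions. The function and parameter names below are assumptions of this sketch.

```python
# Sketch of the determination in step S1703: do the two designated
# positions sandwich a horizontal boundary line? p1 and p2 are (x, y)
# touch coordinates; boundary_y is the y-coordinate of the boundary.

def sandwiches_boundary(p1, p2, boundary_y: float) -> bool:
    """True if the segment p1-p2 crosses the horizontal line
    y = boundary_y (endpoints touching the line count as crossing)."""
    lo, hi = sorted((p1[1], p2[1]))
    return lo <= boundary_y <= hi
```

Note that this test is independent of the x-coordinates, matching the statement above that the distances of the two positions from the boundary line may differ.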
- step S1705 corresponds to the process of step S1603 of the enlargement / reduction control process of the different example 1.
- In step S1705, it is determined whether or not a scroll operation in which the designated positions 310 move apart in the vertical direction has been performed. If the determination is affirmative, the process proceeds to step S1707; if the determination is negative, the process proceeds to step S1709.
- In step S1707, the split image 300 displayed on the display unit 213 of the display input unit 216 is enlarged, with the boundary line 311 sandwiched between the designated positions 310 as the center in the vertical direction (division direction) and the designated positions 310 as the center in the left-right direction (horizontal direction); then the enlargement / reduction control processing routine program ends.
- When the left and right positions of the two designated positions 310 differ, for example, the middle point of the two designated positions 310 in the left-right direction may be used as the center, or one of the designated positions 310 may be used as the center.
- Alternatively, the boundary line 311 close to the center of the vertical interval between the two designated positions 310 may be used as the center.
- The process of step S1709 corresponds to the process of step S1607 of the enlargement / reduction control process of the different example 1.
- In step S1709, it is determined whether or not a scroll operation in which the designated positions 310 move in a direction approaching each other vertically has been performed. If the determination in step S1709 is affirmative, the process proceeds to step S1711; if the determination is negative, the enlargement / reduction control processing routine program is terminated.
- In step S1711, the split image 300 displayed on the display unit 213 of the display input unit 216 is reduced, with the boundary line 311 sandwiched between the designated positions 310 as the center in the vertical direction (division direction) and the designated positions 310 as the center in the left-right direction (horizontal direction); then the enlargement / reduction control processing routine program is terminated.
- As in step S1707, when the left and right positions of the two designated positions 310 differ, for example, the middle point of the two designated positions 310 in the left-right direction may be used as the center, or one of the designated positions 310 may be used as the center.
- Also as in step S1707, when the two designated positions 310 sandwich a plurality of boundary lines 311, for example, the boundary line 311 near the center of the vertical interval between the two designated positions 310 may be used as the center.
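The centering rules described for steps S1707 and S1711 can be sketched as follows: the horizontal center is taken as the midpoint of the two designated positions, and the vertical center is the sandwiched boundary line (the one nearest the vertical midpoint when several are sandwiched). All names, and the choice of the midpoint over one designated position, are assumptions of this sketch.

```python
# Sketch of choosing the enlargement/reduction center for steps
# S1707/S1711. p1, p2 are (x, y) designated positions; boundary_ys is
# the list of y-coordinates of the boundary lines 311.

def zoom_center(p1, p2, boundary_ys):
    """Return (center_x, center_y), or None if no boundary line is
    sandwiched (corresponding to the branch to step S1713)."""
    mid_x = (p1[0] + p2[0]) / 2.0   # horizontal midpoint of the touches
    mid_y = (p1[1] + p2[1]) / 2.0
    lo, hi = sorted((p1[1], p2[1]))
    sandwiched = [y for y in boundary_ys if lo <= y <= hi]
    if not sandwiched:
        return None
    # pick the sandwiched boundary nearest the vertical midpoint
    center_y = min(sandwiched, key=lambda y: abs(y - mid_y))
    return (mid_x, center_y)
```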
- In step S1713, as in step S1705, it is determined whether or not a scroll operation in which the designated positions 310 move apart vertically has been performed. If the determination is affirmative, the process proceeds to step S1715; if the determination is negative, the process proceeds to step S1717.
- The process of step S1715 corresponds to the process of step S1605 of the enlargement / reduction control process of the different example 1.
- In step S1715, after the entire split image 300 displayed on the display unit 213 of the display input unit 216 is enlarged, the enlargement / reduction control processing routine program ends.
- In step S1717, as in step S1709, it is determined whether or not a scroll operation in which the designated positions 310 move in a direction approaching each other vertically has been performed. If the determination in step S1717 is affirmative, the process proceeds to step S1719; if the determination is negative, the enlargement / reduction control processing routine program is terminated.
- The process of step S1719 corresponds to the process of step S1609 of the enlargement / reduction control process of the different example 1.
- In step S1719, the entire split image 300 displayed on the display unit 213 of the display input unit 216 is reduced, and then the enlargement / reduction control processing routine program ends.
- As described above, when two designated positions 310 are designated with the boundary line 311 sandwiched between them and a scroll operation in the vertical direction (division direction) is performed, the imaging apparatus 100 of the different example 2 enlarges or reduces the split image 300 with the sandwiched boundary line 311 as the center in the vertical direction and the designated positions 310 as the center in the left-right direction (horizontal direction).
- Since the center of enlargement / reduction in the up / down direction (division direction) is the boundary line 311 sandwiched between the designated positions 310 and the center in the left / right direction (horizontal direction) is determined by the designated positions 310, the user can enlarge or reduce the portion of the split image 300 (subject image) whose in-focus state is to be confirmed. Thereby, the imaging device 100 of this embodiment can realize the operation intended by the user.
- FIG. 49 is a flowchart showing the flow of processing of the enlargement / reduction control processing routine program executed by the CPU 12 in combination with the execution of each of the enlargement / reduction control processing programs.
- the program is stored in advance in a predetermined storage area of the memory 26.
- The enlargement / reduction control process shown in FIG. 49 is executed while the split image 300 is being enlarged or reduced.
- In step S1801, it is determined whether or not the split image 300 is being enlarged. If the determination in step S1801 is affirmative, the process proceeds to step S1803; if the determination is negative, the process proceeds to step S1807.
- In step S1803, it is determined whether or not the split image 300 has been enlarged to the size of the entire display area of the display unit 213. If a negative determination is made in step S1803, the process returns to step S1801; if an affirmative determination is made, the process proceeds to step S1805.
- In step S1805, after the enlargement of the split image 300 is stopped, the enlargement / reduction control processing routine program is terminated.
- In step S1807, it is determined whether or not the split image 300 is being reduced. If the determination in step S1807 is affirmative, the process proceeds to step S1809; if the determination is negative, the enlargement / reduction control processing routine program is terminated.
- In step S1809, it is determined whether or not the split image 300 has been reduced to the original size.
- Here, the “original size” is the size of the split image 300 that is predetermined for the imaging apparatus 100 as the initial state; for example, the size of the split image 300 displayed on the display unit 213 when the imaging apparatus 100 is turned on is used. The “original size” is not limited to this, and may be the size before enlargement when the split image 300 is reduced after being enlarged. If a negative determination is made in step S1809, the process returns to step S1807; if an affirmative determination is made, the process proceeds to step S1811.
- In step S1811, after the reduction of the split image 300 is stopped, the enlargement / reduction control processing routine program is terminated.
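The size limits enforced by the flow of FIG. 49 amount to clamping the split image size between the predetermined initial size and the full display area. A minimal sketch, with assumed (width, height) tuples and names:

```python
# Sketch of the limits of FIG. 49: enlargement stops at the full
# display area (S1803/S1805) and reduction stops at the predetermined
# initial size (S1809/S1811). Sizes are (width, height) tuples.

def clamp_split_image_size(requested, initial, display):
    """Clamp a requested size so it never exceeds the display area and
    never shrinks below the initial ("original") size."""
    w = max(initial[0], min(requested[0], display[0]))
    h = max(initial[1], min(requested[1], display[1]))
    return (w, h)
```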
- As described above, in the enlargement / reduction control process, the imaging apparatus 100 of the present embodiment sets the maximum enlargement of the split image 300 to the size of the entire display area of the display unit 213, and sets the maximum reduction to the size of the split image 300 predetermined for the imaging apparatus 100 as the initial state.
- By enlarging the split image 300 in this way, when a part of the subject image displayed in the normal image 301 is shown as the split image 300, the split image 300 can be displayed at the maximum enlargement ratio; conversely, by reducing it slightly, the user can check the subject image shown in the periphery of the angle of view.
- Each process included in the imaging control process described in each of the above embodiments may be realized by a software configuration in which a computer executes a program, by another hardware configuration, or by a combination of a hardware configuration and a software configuration.
- The program may be stored in a predetermined storage area (for example, the memory 26) in advance, but it is not always necessary to store it in the memory 26 from the beginning.
- For example, the program may first be stored in an arbitrary “portable storage medium” connected to the computer, such as an SSD (Solid State Drive), a CD-ROM, a DVD disk, a magneto-optical disk, or an IC card. The computer may then acquire the program from the portable storage medium and execute it.
- Alternatively, each program may be stored in another computer or server device connected to the computer via the Internet, a LAN (Local Area Network), or the like, and the computer may acquire and execute the program from these.
- In each of the above embodiments, the imaging device 100 is illustrated, but the present invention may also be applied to mobile terminal devices that are modifications of the imaging device 100, for example, a mobile phone or smartphone having a camera function, a PDA (Personal Digital Assistant), a portable game machine, and the like.
- Hereinafter, a smartphone will be taken as an example and described in detail with reference to the drawings.
- FIG. 50 is a perspective view showing an example of the appearance of the smartphone 500.
- The smartphone 500 shown in FIG. 50 includes a flat housing 502, and a display input unit 520 in which a display panel 521 as a display unit and an operation panel 522 as an input unit are integrated on one surface of the housing 502.
- the housing 502 includes a speaker 531, a microphone 532, an operation unit 540, and a camera unit 541.
- the configuration of the housing 502 is not limited thereto, and for example, a configuration in which the display unit and the input unit are independent may be employed, or a configuration having a folding structure or a slide structure may be employed.
- FIG. 51 is a block diagram showing an example of the configuration of the smartphone 500 shown in FIG.
- The main components of the smartphone 500 include a wireless communication unit 510, a display input unit 520, a communication unit 530, an operation unit 540, a camera unit 541, a storage unit 550, and an external input / output unit 560.
- The smartphone 500 also includes a GPS (Global Positioning System) receiving unit 570, a motion sensor unit 580, a power supply unit 590, and a main control unit 501.
- Furthermore, the smartphone 500 has a wireless communication function for performing mobile wireless communication via a base station device and a mobile communication network.
- the wireless communication unit 510 performs wireless communication with a base station apparatus accommodated in the mobile communication network in accordance with an instruction from the main control unit 501. Using this wireless communication, transmission and reception of various file data such as audio data and image data, e-mail data, and reception of Web data and streaming data are performed.
- The display input unit 520 is a so-called touch panel, and includes a display panel 521 and an operation panel 522. Under the control of the main control unit 501, the display input unit 520 displays images (still images and moving images), character information, and the like to visually convey information to the user, and detects user operations on the displayed information. Note that when viewing a generated 3D image, the display panel 521 is preferably a 3D display panel.
- the display panel 521 uses an LCD, OELD (Organic Electro-Luminescence Display), or the like as a display device.
- the operation panel 522 is a device that is placed so that an image displayed on the display surface of the display panel 521 is visible and detects one or a plurality of coordinates operated by a user's finger or stylus. When such a device is operated by a user's finger or stylus, a detection signal generated due to the operation is output to the main control unit 501. Next, the main control unit 501 detects an operation position (coordinates) on the display panel 521 based on the received detection signal.
- The display panel 521 and the operation panel 522 of the smartphone 500 integrally form the display input unit 520, and the operation panel 522 is disposed so as to completely cover the display panel 521.
- the operation panel 522 may have a function of detecting a user operation even in an area outside the display panel 521.
- In other words, the operation panel 522 may include a detection area (hereinafter referred to as a display area) for the overlapping portion that overlaps the display panel 521 and a detection area (hereinafter referred to as a non-display area) for the outer edge portion that does not overlap the display panel 521.
- Note that the operation panel 522 may include two sensitive regions, the outer edge portion and the inner portion, and the width of the outer edge portion is appropriately designed according to the size of the housing 502 and the like. Furthermore, examples of the position detection method employed in the operation panel 522 include a matrix switch method, a resistance film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, and a capacitance method, and any of these methods may be adopted.
- the communication unit 530 includes a speaker 531 and a microphone 532.
- the communication unit 530 converts the user's voice input through the microphone 532 into voice data that can be processed by the main control unit 501, and outputs the voice data to the main control unit 501. Further, the communication unit 530 decodes the audio data received by the wireless communication unit 510 or the external input / output unit 560 and outputs it from the speaker 531.
- the speaker 531 and the microphone 532 can be mounted on the same surface as the surface on which the display input unit 520 is provided.
- the operation unit 540 is a hardware key using a key switch or the like, and receives an instruction from the user.
- For example, the operation unit 540 is a push-button switch mounted on the side surface of the housing 502 of the smartphone 500, which is turned on when pressed by a finger or the like and turned off by the restoring force of a spring or the like when the finger is released.
- the storage unit 550 stores the control program and control data of the main control unit 501, application software, address data that associates the name and telephone number of the communication partner, and transmitted / received e-mail data.
- the storage unit 550 stores Web data downloaded by Web browsing and downloaded content data.
- the storage unit 550 temporarily stores streaming data and the like.
- The storage unit 550 includes an internal storage unit 551 built into the smartphone and an external storage unit 552 having a removable external memory slot.
- Each of the internal storage unit 551 and the external storage unit 552 constituting the storage unit 550 is realized using a storage medium such as a flash memory type or a hard disk type.
- Other examples of the storage medium include a multimedia card micro type, a card type memory (for example, a MicroSD (registered trademark) memory), a RAM (Random Access Memory), and a ROM (Read Only Memory).
- The external input / output unit 560 serves as an interface with all external devices connected to the smartphone 500, and is used to connect directly or indirectly to other external devices through communication or the like, or through a network. Examples of communication with other external devices include universal serial bus (USB), IEEE 1394, and the like. Examples of the network include the Internet, wireless LAN, Bluetooth (registered trademark), RFID (Radio Frequency Identification), and infrared communication (Infrared Data Association: IrDA) (registered trademark). Other examples of the network include UWB (Ultra Wideband) (registered trademark), ZigBee (registered trademark), and the like.
- Examples of the external device connected to the smartphone 500 include a wired / wireless headset, wired / wireless external charger, wired / wireless data port, and a memory card connected via a card socket.
- Other examples of external devices include SIM (Subscriber Identity Module Card) / UIM (User Identity Module Card) cards, and external audio / video devices connected via audio / video I/O (Input / Output) terminals.
- an external audio / video device that is wirelessly connected can be used.
- The external input / output unit may transmit data received from such an external device to each component inside the smartphone 500, and may transmit data inside the smartphone 500 to the external device.
- The GPS receiving unit 570 receives GPS signals transmitted from the GPS satellites ST1 to STn in accordance with instructions from the main control unit 501, performs positioning calculation processing based on the received GPS signals, and detects the position of the smartphone 500 consisting of latitude, longitude, and altitude.
- When the GPS receiving unit 570 can acquire position information from the wireless communication unit 510 or the external input / output unit 560 (for example, a wireless LAN), it can also detect the position using that position information.
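The fallback just described, preferring a GPS fix and otherwise using position information obtained over the wireless LAN, can be sketched as follows. The function name, argument shapes, and return format are assumptions of this sketch, not part of the disclosure.

```python
# Illustrative position-source fallback: use the GPS fix when
# available, otherwise position information from the wireless LAN.

def detect_position(gps_fix, wlan_position):
    """gps_fix: (lat, lon, alt) or None; wlan_position: (lat, lon) or
    None. Returns a dict naming the source used, or None."""
    if gps_fix is not None:
        return {"source": "gps", "position": gps_fix}
    if wlan_position is not None:
        return {"source": "wlan", "position": wlan_position}
    return None  # no position information available
```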
- the motion sensor unit 580 includes a triaxial acceleration sensor, for example, and detects the physical movement of the smartphone 500 in accordance with an instruction from the main control unit 501. By detecting the physical movement of the smartphone 500, the moving direction and acceleration of the smartphone 500 are detected. This detection result is output to the main control unit 501.
- the power supply unit 590 supplies power stored in a battery (not shown) to each unit of the smartphone 500 in accordance with an instruction from the main control unit 501.
- the main control unit 501 includes a microprocessor, operates according to a control program and control data stored in the storage unit 550, and controls each unit of the smartphone 500 in an integrated manner. Further, the main control unit 501 includes a mobile communication control function for controlling each unit of the communication system and an application processing function in order to perform voice communication and data communication through the wireless communication unit 510.
- the application processing function is realized by the main control unit 501 operating according to the application software stored in the storage unit 550.
- Examples of the application processing function include an infrared communication function that controls the external input / output unit 560 to perform data communication with a counterpart device, an e-mail function that transmits and receives e-mails, and a web browsing function that browses web pages.
- the main control unit 501 has an image processing function such as displaying video on the display input unit 520 based on image data (still image data or moving image data) such as received data or downloaded streaming data.
- the image processing function is a function in which the main control unit 501 decodes the image data, performs image processing on the decoding result, and displays an image on the display input unit 520.
- the main control unit 501 executes display control for the display panel 521 and operation detection control for detecting a user operation through the operation unit 540 and the operation panel 522.
- By executing the display control, the main control unit 501 displays an icon for starting application software, a software key such as a scroll bar, or a window for creating an e-mail.
- Note that the scroll bar refers to a software key for accepting an instruction to move the displayed portion of an image, such as a large image that does not fit in the display area of the display panel 521.
- By executing the operation detection control, the main control unit 501 detects a user operation through the operation unit 540, accepts an operation on the icon or an input of a character string in the input field of the window through the operation panel 522, or accepts a display image scroll request through the scroll bar.
- Furthermore, by executing the operation detection control, the main control unit 501 determines whether the operation position on the operation panel 522 is in the overlapping portion (display area) that overlaps the display panel 521 or in the outer edge portion (non-display area) that does not overlap the display panel 521, and has a touch panel control function for controlling the sensitive area of the operation panel 522 and the display position of the software keys.
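The display-area / non-display-area determination described above can be sketched as a point-in-rectangle test against the region of the operation panel that overlaps the display panel 521. The rectangle representation and names are assumptions of this sketch.

```python
# Illustrative classification of an operation position into the display
# area (overlapping the display panel) or the non-display outer edge.
# panel_rect = (x, y, width, height) of the display panel within the
# operation panel's coordinate system.

def classify_touch(pos, panel_rect):
    """Return "display" if pos falls inside panel_rect, otherwise
    "non-display" (the outer edge portion)."""
    x, y = pos
    px, py, pw, ph = panel_rect
    inside = px <= x < px + pw and py <= y < py + ph
    return "display" if inside else "non-display"
```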
- the main control unit 501 can also detect a gesture operation on the operation panel 522 and execute a preset function according to the detected gesture operation.
- A gesture operation is not a conventional simple touch operation, but an operation that draws a trajectory with a finger or the like, designates a plurality of positions at the same time, or, by combining these, draws a trajectory for at least one of a plurality of positions.
- the camera unit 541 is a digital camera that captures an image using an image sensor such as a CMOS or a CCD, and has the same function as the image capturing apparatus 100 shown in FIG.
- the camera unit 541 can switch between a manual focus mode and an autofocus mode.
- the focus lens 302 of the camera unit 541 can be focused by operating a focus icon button or the like displayed on the operation unit 540 or the display input unit 520.
- In the manual focus mode, a live view image obtained by combining the split image is displayed on the display panel 521 so that the in-focus state during manual focus can be confirmed.
- The camera unit 541 converts the image data obtained by imaging into compressed image data such as JPEG (Joint Photographic Experts Group) under the control of the main control unit 501.
- the converted image data can be recorded in the storage unit 550 or output through the external input / output unit 560 or the wireless communication unit 510.
- the camera unit 541 is mounted on the same surface as the display input unit 520.
- the mounting position of the camera unit 541 is not limited to this, and the camera unit 541 may be mounted on the back surface of the display input unit 520.
- Further, a plurality of camera units 541 may be mounted. When a plurality of camera units 541 are mounted, imaging may be performed with a single camera unit 541 while switching the camera unit 541 used for imaging, or imaging may be performed using a plurality of camera units 541 simultaneously.
- the camera unit 541 can be used for various functions of the smartphone 500.
- an image acquired by the camera unit 541 can be displayed on the display panel 521, or the image of the camera unit 541 can be used as one of operation inputs of the operation panel 522.
- When the GPS receiving unit 570 detects the position, the position can also be detected with reference to an image from the camera unit 541.
- Furthermore, the optical axis direction of the camera unit 541 of the smartphone 500 can be determined without using the triaxial acceleration sensor, or in combination with the triaxial acceleration sensor, and the current usage environment can also be determined.
- an image from the camera unit 541 can be used in the application software.
- In addition, various kinds of information can be added to still image or moving image data and recorded in the storage unit 550, or output through the external input / output unit 560 or the wireless communication unit 510.
- Examples of the “various kinds of information” here include position information acquired by the GPS receiving unit 570 and audio information acquired by the microphone 532 (which may be converted into text information by speech-to-text conversion by the main control unit or the like) for the still image or moving image.
- Posture information acquired by the motion sensor unit 580 may also be used.
- In each of the above embodiments, the image pickup device 20 having the first to third pixel groups is illustrated, but the present invention is not limited to this, and an image pickup device consisting of only the first pixel group and the second pixel group may be used. A digital camera having this type of image pickup device can generate a three-dimensional image (3D image) based on the first image output from the first pixel group and the second image output from the second pixel group, and can also generate a two-dimensional image (2D image). In this case, the generation of the two-dimensional image is realized, for example, by performing interpolation processing between pixels of the same color in the first image and the second image. Moreover, the first image or the second image may also be employed as the two-dimensional image.
- the mode in which both the normal image and the split image are simultaneously displayed on the moving image surface of the display device when the first to third images are input to the image processing unit 28 is exemplified.
- the display control unit 36 controls the display device to continuously display the normal image as a moving image and, at the same time, to continuously display the split image as a moving image.
- “suppressing the display of a normal image” refers, for example, to not displaying a normal image on the display device. Specifically, this means either generating the normal image but not outputting it to the display device, or not generating the normal image at all, so that the normal image is not displayed on the display device.
- “split image” refers to an image based on the images output from the phase difference pixel groups (for example, when a specific image sensor is used, a split image based on the first image output from the first pixel group and the second image output from the second pixel group).
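To make the notion of a split image concrete, the composition step can be sketched as follows. This is a simplified illustration under our own assumptions (not the patented implementation itself): each phase-difference image is divided into horizontal bands in the division direction, and alternating bands are taken from each image, so that any lateral misalignment at the band boundaries reveals defocus.

```python
# Hypothetical sketch of split-image composition: take alternating
# horizontal bands from the first and second phase-difference images.
# Images are row-lists of equal size; `bands` divides the height evenly.

def make_split_image(first_image, second_image, bands):
    height = len(first_image)
    band_height = height // bands
    out = []
    for i in range(bands):
        # even bands come from the first image, odd bands from the second
        src = first_image if i % 2 == 0 else second_image
        out.extend(src[i * band_height:(i + 1) * band_height])
    return out

a = [["A"] * 4 for _ in range(4)]  # rows of the first (e.g. right-eye) image
b = [["B"] * 4 for _ in range(4)]  # rows of the second (e.g. left-eye) image
split = make_split_image(a, b, bands=2)
print([row[0] for row in split])  # ['A', 'A', 'B', 'B']
```

When the scene is in focus, the band contents line up across each boundary; out of focus, the bands from the two images shift in opposite horizontal directions.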
- Examples of “when using a specific image sensor” include a case where an image sensor consisting only of phase difference pixel groups (for example, a first pixel group and a second pixel group) is used, and a case where an image sensor in which phase difference pixels (for example, a first pixel group and a second pixel group) are arranged at a predetermined ratio with respect to normal pixels is used.
- various conditions are conceivable as conditions for suppressing the normal image display and displaying the split image.
- for example, when a predetermined condition is satisfied, the display control unit 36 may perform control to display the split image without displaying the normal image on the display device. Further, for example, when the face detection function for detecting the face image of the subject is activated, the display control unit 36 may perform control to display the split image without displaying the normal image on the display device.
- although the case where the display control unit 36 suppresses the display of the normal image has been described, the present invention is not limited to this; for example, the display control unit 36 may perform control to overwrite a full-screen split image on the normal image.
- in the above embodiments, the CPU 12 or the main control unit 501 included in each device performs the above-described shooting control process in the imaging device 100 or the smartphone 500 including the imaging lens 16 having the focus lens 302, but the present invention is not limited to this. For example, the above-described shooting control process may be performed by remotely operating, from an external apparatus such as a personal computer or a smartphone, an imaging apparatus including an imaging lens with a focus lens.
- the shooting control process in each of the above embodiments may be applied not only when bringing the subject image into focus but also when deliberately placing the subject image in a so-called out-of-focus state in which the subject image is not in focus.
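The shooting control process described above combines two touch-panel determinations: which half-image was selected, and the direction of the subsequent swipe. The decision rule stated for the right-eye and left-eye images can be sketched as below; the function and string names are illustrative, not from the source.

```python
# Hypothetical sketch of the focus-direction decision described for the
# focus control unit: the selected half-image (first determination) and
# the swipe direction (second determination) fix whether the in-focus
# position is moved toward or away from the image sensor.

def lens_direction(selected, swipe):
    """Return 'approach' (in-focus position moves toward the image
    sensor) or 'recede' (in-focus position moves away from it)."""
    if selected == "right_eye":
        return "approach" if swipe == "right" else "recede"
    if selected == "left_eye":
        return "recede" if swipe == "right" else "approach"
    raise ValueError(f"unknown selection: {selected}")

print(lens_direction("right_eye", "right"))  # approach
print(lens_direction("left_eye", "right"))   # recede
```

Note the symmetry: the two half-images respond oppositely to the same swipe, mirroring the opposite parallax shifts of the pupil-divided images.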
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Studio Devices (AREA)
- Automatic Focus Adjustment (AREA)
- Lens Barrels (AREA)
- Camera Bodies And Camera Details Or Accessories (AREA)
- Indication In Cameras, And Counting Of Exposures (AREA)
- Focusing (AREA)
- Eye Examination Apparatus (AREA)
Abstract
Description
The imaging device 100 according to the present embodiment is an interchangeable-lens camera. As shown in FIG. 1, the imaging device 100 is a digital camera that includes a camera body 200 and an interchangeable lens 258, which is a zoom lens interchangeably mounted on the camera body 200, and in which the reflex mirror is omitted. The interchangeable lens 258 includes the imaging lens 16 having a focus lens 302 movable in the optical axis direction, a focus ring 260, a slide mechanism 303, and a motor 304 (see FIG. 3, described in detail later). The camera body 200 is provided with a hybrid finder (registered trademark) 220. The hybrid finder 220 here refers to, for example, a finder in which an optical viewfinder (hereinafter "OVF") and an electronic viewfinder (hereinafter "EVF") are selectively used.
In the first embodiment described above, the scroll process was exemplified by moving the designated position in the parallax direction (horizontal direction) of the right-eye image and the left-eye image of the split image, but the movement direction is not limited to this. In the present embodiment, as a modification of the movement direction, a case where the designated position is moved so as to draw a circle (circular operation) will be described in detail with reference to the drawings.
The shooting control processes shown in the above embodiments may be combined with other processes for achieving focus using the split image. In the present embodiment, other processes that are preferably combined with the shooting control processes shown in the above embodiments will be described in detail with reference to the drawings.
In the third embodiment described above, the in-focus notification process was described as another process preferably combined with the shooting control processes shown in the first and second embodiments. In the present embodiment, an enlargement/reduction control process for the split image 300 will be described in detail as another such process, with reference to the drawings.
In the first embodiment described above, the imaging device 100 was exemplified, but mobile terminal devices that are modifications of the imaging device 100 include, for example, mobile phones and smartphones having a camera function, PDAs (Personal Digital Assistants), and portable game machines. Hereinafter, a smartphone will be taken as an example and described in detail with reference to the drawings.
(hard disk type) or a similar storage medium. Other examples of the storage medium include a multimedia card micro type, a card type memory (for example, MicroSD (registered trademark) memory), RAM (Random Access Memory), and ROM (Read Only Memory).
20 image sensor
26 memory
28 image processing unit
30 normal processing unit
32 split image processing unit
36 display control unit
38 notification control unit
100 imaging device
213 display unit
217 vibration member
241 LCD
Claims (35)
- A generation unit that generates a display image used for focus confirmation, on the basis of a first image based on an image signal output from a first pixel group and a second image based on an image signal output from a second pixel group of an image sensor having first and second pixel groups on which subject images that have passed through first and second regions of an imaging lens including a focus lens are formed after being pupil-divided, the display image being formed by arranging a first divided image selected from a plurality of divided images obtained by dividing the first image in a predetermined division direction, and a second divided image selected from a plurality of divided images obtained by dividing the second image in the division direction, excluding a divided image representing the divided region corresponding to the first divided image;
a display unit having a display area and provided with a touch panel on the surface of the display area;
a display control unit that performs control to display, on the display unit, the display image generated by the generation unit;
a first detection unit that detects, in a state where the display image is displayed on the display unit, that a selection operation of the first divided image or the second divided image on the display image has been performed via the touch panel;
a second detection unit that detects that a movement operation in an intersecting direction intersecting the division direction has been performed on the display image via the touch panel; and
a focus control unit that, when the movement operation is detected by the second detection unit following detection of the selection operation by the first detection unit, controls a moving unit that moves the focus lens in the optical axis direction so as to move the focus lens in accordance with the movement operation,
an image processing device comprising the above. - The generation unit further generates, based on the image signal output from the image sensor, a second display image used for confirming the imaging range, and
the display control unit further performs control to display, on the display unit, the second display image generated by the generation unit.
The image processing device according to claim 1. - The image sensor further includes a third pixel group that outputs a third image on which a subject image transmitted through the imaging lens is formed without being pupil-divided, and
the generation unit generates the second display image based on the third image output from the third pixel group.
The image processing device according to claim 2. - The focus control unit performs a first determination of whether the selection operation detected by the first detection unit is a selection operation of the first divided image or of the second divided image, and a second determination of whether the movement operation detected by the second detection unit is a movement operation in a first direction or in a second direction along the intersecting direction, determines the movement direction of the focus lens based on the result of the first determination and the result of the second determination, and controls the moving unit to move the focus lens.
The image processing device according to any one of claims 1 to 3. - The first image is a right-eye image,
the second image is a left-eye image, and
the focus control unit controls the moving unit so that, when the result of the first determination is a selection operation of the right-eye image and the result of the second determination is a movement operation in the rightward direction as seen from the operator observing the display unit, the focus lens is moved in a direction in which the in-focus position approaches the image sensor relative to the current in-focus position, and, when the result of the first determination is a selection operation of the right-eye image and the result of the second determination is a movement operation in the leftward direction as seen from the operator observing the display unit, the focus lens is moved in a direction in which the in-focus position moves away from the image sensor relative to the current in-focus position.
The image processing device according to claim 4. - The first image is a right-eye image,
the second image is a left-eye image, and
the focus control unit controls the moving unit so that, when the result of the first determination is a selection operation of the left-eye image and the result of the second determination is a movement operation in the rightward direction as seen from the operator observing the display unit, the focus lens is moved in a direction in which the in-focus position moves away from the image sensor relative to the current in-focus position, and, when the result of the first determination is a selection operation of the left-eye image and the result of the second determination is a movement operation in the leftward direction as seen from the operator observing the display unit, the focus lens is moved in a direction in which the in-focus position approaches the image sensor relative to the current in-focus position.
The image processing device according to claim 4. - The first divided image and the second divided image are arranged adjacent to each other in the division direction within the display image, the device further comprises a third detection unit that detects that a movement operation passing through the boundary line between the first divided image and the second divided image has been performed via the touch panel, and
the focus control unit controls the moving unit to move the focus lens in accordance with the movement operation when movement operations are detected by the second detection unit and the third detection unit following detection of the selection operation by the first detection unit.
The image processing device according to any one of claims 1 to 3. - The focus control unit performs:
a third determination of determining the position of at least one of the first divided image and the second divided image with respect to the boundary line;
a fourth determination of determining whether the movement operation detected by the second detection unit is a movement operation in a first direction or in a second direction along the intersecting direction; and
a fifth determination of determining whether the movement operation detected by the third detection unit is a movement operation in a third direction or in a fourth direction along the division direction,
determines the movement direction of the focus lens based on the results of the third determination, the fourth determination, and the fifth determination, and controls the moving unit to move the focus lens.
The image processing device according to claim 7. - The first image is a right-eye image,
the second image is a left-eye image, and
when it is determined as a result of the third determination that the second divided image is positioned above the boundary line as seen from the operator observing the display unit,
the focus control unit,
in a case where a movement operation is detected by the third detection unit following detection of a movement operation by the second detection unit, and the result of the fourth determination is a movement operation in the rightward direction as seen from the operator observing the display unit and the result of the fifth determination is a movement operation in the downward direction as seen from the operator observing the display unit,
in a case where a movement operation is detected by the third detection unit following detection of a movement operation by the second detection unit, and the result of the fourth determination is a movement operation in the leftward direction as seen from the operator observing the display unit and the result of the fifth determination is a movement operation in the upward direction as seen from the operator observing the display unit,
in a case where a movement operation is detected by the second detection unit following detection of a movement operation by the third detection unit, and the result of the fourth determination is the leftward movement operation and the result of the fifth determination is the downward movement operation,
or in a case where a movement operation is detected by the second detection unit following detection of a movement operation by the third detection unit, and the result of the fourth determination is the rightward movement operation and the result of the fifth determination is the upward movement operation,
controls the moving unit to move the focus lens in a direction in which the in-focus position moves away from the image sensor relative to the current in-focus position.
The image processing device according to claim 8. - The first image is a right-eye image,
the second image is a left-eye image, and
when it is determined as a result of the third determination that the second divided image is positioned above the boundary line as seen from the operator observing the display unit,
the focus control unit,
in a case where a movement operation is detected by the third detection unit following detection of a movement operation by the second detection unit, and the result of the fourth determination is a movement operation in the leftward direction as seen from the operator observing the display unit and the result of the fifth determination is a movement operation in the downward direction as seen from the operator observing the display unit,
in a case where a movement operation is detected by the third detection unit following detection of a movement operation by the second detection unit, and the result of the fourth determination is a movement operation in the rightward direction as seen from the operator observing the display unit and the result of the fifth determination is a movement operation in the upward direction as seen from the operator observing the display unit,
in a case where a movement operation is detected by the second detection unit following detection of a movement operation by the third detection unit, and the result of the fourth determination is the leftward movement operation and the result of the fifth determination is the upward movement operation,
or in a case where a movement operation is detected by the second detection unit following detection of a movement operation by the third detection unit, and the result of the fourth determination is the rightward movement operation and the result of the fifth determination is the downward movement operation,
controls the moving unit to move the focus lens in a direction in which the in-focus position approaches the image sensor relative to the current in-focus position.
The image processing device according to claim 8. - The first image is a right-eye image,
the second image is a left-eye image, and
when it is determined as a result of the third determination that the first divided image is positioned above the boundary line as seen from the operator observing the display unit,
the focus control unit,
in a case where a movement operation is detected by the third detection unit following detection of a movement operation by the second detection unit, and the result of the fourth determination is a movement operation in the rightward direction as seen from the operator observing the display unit and the result of the fifth determination is a movement operation in the downward direction as seen from the operator observing the display unit,
in a case where a movement operation is detected by the third detection unit following detection of a movement operation by the second detection unit, and the result of the fourth determination is a movement operation in the leftward direction as seen from the operator observing the display unit and the result of the fifth determination is a movement operation in the upward direction as seen from the operator observing the display unit,
in a case where a movement operation is detected by the second detection unit following detection of a movement operation by the third detection unit, and the result of the fourth determination is the leftward movement operation and the result of the fifth determination is the downward movement operation,
or in a case where a movement operation is detected by the second detection unit following detection of a movement operation by the third detection unit, and the result of the fourth determination is the rightward movement operation and the result of the fifth determination is the upward movement operation,
controls the moving unit to move the focus lens in a direction in which the in-focus position approaches the image sensor relative to the current in-focus position. The image processing device according to claim 8. - The first image is a right-eye image,
the second image is a left-eye image, and
when it is determined as a result of the third determination that the first divided image is positioned above the boundary line as seen from the operator observing the display unit,
the focus control unit,
in a case where a movement operation is detected by the third detection unit following detection of a movement operation by the second detection unit, and the result of the fourth determination is a movement operation in the leftward direction as seen from the operator observing the display unit and the result of the fifth determination is a movement operation in the downward direction as seen from the operator observing the display unit,
in a case where a movement operation is detected by the third detection unit following detection of a movement operation by the second detection unit, and the result of the fourth determination is a movement operation in the rightward direction as seen from the operator observing the display unit and the result of the fifth determination is a movement operation in the upward direction as seen from the operator observing the display unit,
in a case where a movement operation is detected by the second detection unit following detection of a movement operation by the third detection unit, and the result of the fourth determination is the leftward movement operation and the result of the fifth determination is the upward movement operation,
or in a case where a movement operation is detected by the second detection unit following detection of a movement operation by the third detection unit, and the result of the fourth determination is the rightward movement operation and the result of the fifth determination is the downward movement operation,
controls the moving unit to move the focus lens in a direction in which the in-focus position moves away from the image sensor relative to the current in-focus position.
The image processing device according to claim 8. - The focus control unit, when the selection operation detected by the first detection unit is a selection operation of both the first divided image and the second divided image and the movement operations detected by the second detection unit in the first divided image and the second divided image are in mutually different directions along the intersecting direction, determines the movement direction of the focus lens based on the respective movement directions of the movement operation of the first divided image and the movement operation of the second divided image, and controls the moving unit to move the focus lens.
請求項1乃至3の何れか1項記載の画像処理装置。 - 前記第1の画像は、右眼用画像であり、
前記第2の画像は、左眼用画像であり、
前記合焦制御部は、前記第1検出部により検出された選択操作が前記第1の分割画像及び第2の分割画像の双方の選択操作であり、かつ前記第2検出部により検出された前記右眼用画像の移動操作が、前記交差方向に沿った方向かつ前記表示部を観察する操作者から見て右方向への移動操作であり、かつ前記第2検出部により検出された前記左眼用画像の移動操作が、前記交差方向に沿った方向かつ前記表示部を観察する操作者から見て左方向への移動操作であった場合、合焦位置が現在の合焦位置よりも前記撮像素子に近付く方向に前記フォーカスレンズを移動させ、前記第1検出部により検出された選択操作が前記第1の分割画像及び第2の分割画像の双方の選択操作であり、かつ前記第2検出部により検出された前記右眼用画像の移動操作が前記左方向への移動操作であり、かつ前記第2検出部により検出された前記左眼用画像の移動操作が前記右方向への移動操作であった場合、合焦位置が現在の合焦位置よりも前記撮像素子から遠ざかる方向に前記フォーカスレンズを移動させる制御を前記移動部に対して行う
請求項13記載の画像処理装置。 - 前記合焦制御部は、前記第2検出部により検出された移動操作における前記タッチパネルに対する接触操作が継続されている間、前記フォーカスレンズを前記接触操作における接触位置の移動に応じて前記光軸方向に移動させる制御を前記移動部に対して行う
請求項1乃至14の何れか1項記載の画像処理装置。 - 前記合焦制御部は、前記フォーカスレンズを前記光軸方向における当該移動操作に応じた移動方向に継続的に移動させる制御を前記移動部に対して行う
請求項1乃至14の何れか1項記載の画像処理装置。 - 前記合焦制御部は、前記移動操作における操作速度に応じた移動速度によって前記フォーカスレンズを移動させる制御を前記移動部に対して行う
請求項1乃至16の何れか1項記載の画像処理装置。 - 前記合焦制御部は、前記移動操作における操作移動量に応じた移動量によって前記フォーカスレンズを移動させる制御を前記移動部に対して行う
請求項1乃至16の何れか1項記載の画像処理装置。 - 前記合焦制御部は、前記操作速度が予め定められた第1閾値未満であった場合、前記移動部に対して前記フォーカスレンズを移動させる制御を行わない
請求項17記載の画像処理装置。 - 前記移動部により前記フォーカスレンズが移動されている状態において、前記タッチパネルに対する選択操作が一旦解除された後、前記表示領域の何れかの位置における接触操作が前記タッチパネルを介して行われたことを検出する第4検出部を更に備え、
前記合焦制御部は、前記第4検出部により前記接触操作が検出された場合、前記フォーカスレンズの移動を停止させる制御を前記移動部に対して行う
請求項16記載の画像処理装置。 - 前記合焦制御部は、前記フォーカスレンズの移動を開始させた後、当該移動に伴って当該フォーカスレンズの移動速度を減速させて当該フォーカスレンズを停止させる制御を前記移動部に対して行う
請求項16記載の画像処理装置。 - 前記合焦制御部は、前記第2検出部により検出された前記移動操作における移動時間が予め定められた第2閾値以上であった場合、前記第2検出部により検出された移動操作における前記タッチパネルに対する接触操作が継続されている間、前記フォーカスレンズを前記接触操作における接触位置に応じて前記光軸方向に移動させる制御を前記移動部に対して行い、前記移動時間が前記第2閾値未満であった場合、前記フォーカスレンズを前記光軸方向における当該接触位置に応じた移動方向に継続的に移動させる制御を前記移動部に対して行う
請求項1乃至14の何れか1項記載の画像処理装置。 - 前記合焦制御部により前記フォーカスレンズが移動されている状態において、前記表示用画像の合焦状況を検出する第5検出部を更に備え、
the focus control unit controls the moving unit to stop the movement of the focus lens when the fifth detection unit detects that focus has been achieved.
The image processing device according to any one of claims 1 to 22. - A fifth detection unit that detects the in-focus state of the display image in a state where the focus lens is being moved by the focus control unit; and
a notification unit that, when the fifth detection unit detects that focus has been achieved, notifies that focus has been achieved,
are further provided.
The image processing device according to any one of claims 1 to 23. - The notification unit notifies that focus has been achieved by vibrating the portion where the contact operation on the touch panel is being performed.
The image processing device according to claim 24. - The fifth detection unit detects the in-focus state of the display image based on the contrast of the display image.
The image processing device according to any one of claims 23 to 25. - The fifth detection unit detects the in-focus state of the display image based on the phase difference between the first divided image and the second divided image in the display image.
The image processing device according to any one of claims 23 to 26. - The first divided image and the second divided image are arranged adjacent to each other in the division direction within the display image,
the device comprises a third detection unit that detects that a movement operation passing through the boundary line between the first divided image and the second divided image has been performed via the touch panel, and
the display control unit performs control to enlarge or reduce the display image in accordance with the operation direction of the movement operation detected by the third detection unit when the third detection unit detects a movement operation discontinuously with the detection of a movement operation by the second detection unit.
The image processing device according to any one of claims 1 to 6 or 13. - The first divided image and the second divided image are arranged adjacent to each other in the division direction within the display image,
the device comprises a third detection unit that detects that a movement operation passing through the boundary line between the first divided image and the second divided image has been performed via the touch panel, and
in a case where the third detection unit detects a movement operation discontinuously with the detection of a movement operation by the second detection unit,
when the third detection unit detects two contact positions of contact operations on the touch panel together with a movement operation in a direction in which the two contact positions move apart, the display control unit performs control to enlarge the display image, and
when the third detection unit detects two contact positions of contact operations on the touch panel together with a movement operation in a direction in which the two contact positions approach each other, the display control unit performs control to reduce the display image.
The image processing device according to any one of claims 1 to 6 or 13. - The display control unit stops the control to enlarge the display image when, in the course of the control to enlarge the display image, the size of the display image becomes equal to the size of the entire display area.
The image processing device according to claim 28 or 29. - The display control unit performs control to display a portion of the display image in the display area when, in the course of the control to enlarge the display image, the size of the display image becomes larger than the entire display area.
The image processing device according to claim 28 or 29. - The display control unit stops the control to reduce the display image when, in the case of reducing the display image after enlargement, the size of the display image reaches the size of the display image before enlargement.
The image processing device according to any one of claims 28 to 30. - The image processing device according to any one of claims 1 to 32;
the imaging lens;
an image sensor having the first and second pixel groups; and
a storage unit that stores an image generated based on the image signal output from the image sensor,
an imaging device including the above. - A program for causing a computer to function as:
a generation unit that generates a display image used for focus confirmation, on the basis of a first image based on an image signal output from a first pixel group and a second image based on an image signal output from a second pixel group of an image sensor having first and second pixel groups on which subject images that have passed through first and second regions of an imaging lens including a focus lens are formed after being pupil-divided, the display image being formed by arranging a first divided image selected from a plurality of divided images obtained by dividing the first image in a predetermined division direction, and a second divided image selected from a plurality of divided images obtained by dividing the second image in the division direction, excluding a divided image representing the divided region corresponding to the first divided image;
a display control unit that performs control to display the display image generated by the generation unit on a display unit having a display area and provided with a touch panel on the surface of the display area;
a first detection unit that detects, in a state where the display image is displayed on the display unit, that a selection operation of the first divided image or the second divided image on the display image has been performed via the touch panel;
a second detection unit that detects that a movement operation in an intersecting direction intersecting the division direction has been performed on the display image via the touch panel; and
a focus control unit that, when the movement operation is detected by the second detection unit following detection of the selection operation by the first detection unit, controls a moving unit that moves the focus lens in the optical axis direction so as to move the focus lens in accordance with the movement operation,
the program causing the computer to function as the above units. - A generation step of generating a display image used for focus confirmation, on the basis of a first image based on an image signal output from a first pixel group and a second image based on an image signal output from a second pixel group of an image sensor having first and second pixel groups on which subject images that have passed through first and second regions of an imaging lens including a focus lens are formed after being pupil-divided, the display image being formed by arranging a first divided image selected from a plurality of divided images obtained by dividing the first image in a predetermined division direction, and a second divided image selected from a plurality of divided images obtained by dividing the second image in the division direction, excluding a divided image representing the divided region corresponding to the first divided image;
a display control step of performing control to display the display image generated in the generation step on a display unit having a display area and provided with a touch panel on the surface of the display area;
a first detection step of detecting, in a state where the display image is displayed on the display unit, that a selection operation of the first divided image or the second divided image on the display image has been performed via the touch panel;
a second detection step of detecting that a movement operation in an intersecting direction intersecting the division direction has been performed on the display image via the touch panel; and
a focus control step of, when the movement operation is detected in the second detection step following detection of the selection operation in the first detection step, controlling a moving unit that moves the focus lens in the optical axis direction so as to move the focus lens in accordance with the movement operation,
an image processing method comprising the above steps.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015508570A JP5981026B2 (ja) | 2013-03-29 | 2014-03-26 | 画像処理装置、撮像装置、プログラム及び画像処理方法 |
CN201480010818.XA CN105026976B (zh) | 2013-03-29 | 2014-03-26 | 图像处理装置、摄像装置及图像处理方法 |
US14/844,776 US9456129B2 (en) | 2013-03-29 | 2015-09-03 | Image processing device, imaging device, program, and image processing method |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-074278 | 2013-03-29 | ||
JP2013074278 | 2013-03-29 | ||
JP2013174876 | 2013-08-26 | ||
JP2013-174876 | 2013-08-26 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/844,776 Continuation US9456129B2 (en) | 2013-03-29 | 2015-09-03 | Image processing device, imaging device, program, and image processing method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014157270A1 true WO2014157270A1 (ja) | 2014-10-02 |
Family
ID=51624236
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/058407 WO2014157270A1 (ja) | 2013-03-29 | 2014-03-26 | 画像処理装置、撮像装置、プログラム及び画像処理方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9456129B2 (ja) |
JP (1) | JP5981026B2 (ja) |
CN (1) | CN105026976B (ja) |
WO (1) | WO2014157270A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019188035A1 (ja) * | 2018-03-28 | 2019-10-03 | ソニー株式会社 | 撮像装置及び撮像装置における通知制御方法、並びに情報処理装置 |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014069228A1 (ja) * | 2012-11-05 | 2014-05-08 | 富士フイルム株式会社 | 画像処理装置、撮像装置、画像処理方法及びプログラム |
JP6231869B2 (ja) * | 2013-12-12 | 2017-11-15 | キヤノン株式会社 | 撮像装置、通信機器およびそれらの制御方法、プログラム |
JPWO2015097892A1 (ja) * | 2013-12-27 | 2017-03-23 | パイオニア株式会社 | 端末装置、キャリブレーション方法及びキャリブレーションプログラム |
JP6587380B2 (ja) | 2014-09-12 | 2019-10-09 | キヤノン株式会社 | 画像処理装置、撮像装置、画像処理方法、プログラム、記憶媒体 |
US10983680B2 (en) * | 2016-06-28 | 2021-04-20 | Nikon Corporation | Display device, program, display method and control device |
US10387477B2 (en) * | 2017-05-30 | 2019-08-20 | Qualcomm Incorporated | Calibration for phase detection auto focus (PDAF) camera systems |
CN107390685B (zh) * | 2017-07-14 | 2020-10-16 | 深圳市优必选科技有限公司 | 一种机器人的回充控制方法、机器人及机器人系统 |
JP6971696B2 (ja) * | 2017-08-04 | 2021-11-24 | キヤノン株式会社 | 電子機器およびその制御方法 |
JP7158841B2 (ja) * | 2017-11-08 | 2022-10-24 | キヤノン株式会社 | 撮像装置、撮像方法、プログラム、記録媒体および画像処理装置 |
TWI632527B (zh) * | 2017-11-22 | 2018-08-11 | 東友科技股份有限公司 | 影像擷取與輸出方法 |
US10735640B2 (en) * | 2018-02-08 | 2020-08-04 | Facebook Technologies, Llc | Systems and methods for enhanced optical sensor devices |
JP7049163B2 (ja) * | 2018-04-09 | 2022-04-06 | キヤノン株式会社 | 電子機器およびその制御方法、プログラム並びに記憶媒体 |
JP2020020991A (ja) * | 2018-08-02 | 2020-02-06 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | 制御装置、方法およびプログラム |
CN109862243B (zh) * | 2019-01-31 | 2020-10-09 | 维沃移动通信有限公司 | 终端设备及终端设备的控制方法 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007079929A (ja) * | 2005-09-14 | 2007-03-29 | Make Softwear:Kk | 写真シール作成装置、写真シール作成装置の制御方法及び写真シール作成装置の制御プログラム。 |
JP2009237214A (ja) * | 2008-03-27 | 2009-10-15 | Canon Inc | 撮像装置 |
JP2009276426A (ja) * | 2008-05-13 | 2009-11-26 | Canon Inc | 撮像装置 |
JP2012173531A (ja) * | 2011-02-22 | 2012-09-10 | Sony Corp | 撮像装置、およびフォーカス制御方法、並びにプログラム |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5950671A (ja) * | 1982-09-16 | 1984-03-23 | Olympus Optical Co Ltd | テレビジヨンカメラ等の焦点情報表示方法 |
US7099575B2 (en) * | 2002-07-08 | 2006-08-29 | Fuji Photo Film Co., Ltd. | Manual focus device and autofocus camera |
JP2004191629A (ja) * | 2002-12-11 | 2004-07-08 | Canon Inc | 焦点検出装置 |
JP4182117B2 (ja) * | 2006-05-10 | 2008-11-19 | キヤノン株式会社 | 撮像装置及びその制御方法及びプログラム及び記憶媒体 |
JP5043626B2 (ja) * | 2007-12-13 | 2012-10-10 | キヤノン株式会社 | 撮像装置 |
US8279318B2 (en) * | 2007-12-14 | 2012-10-02 | Canon Kabushiki Kaisha | Image pickup apparatus and display control method for the same |
JP4868075B2 (ja) * | 2009-10-09 | 2012-02-01 | 株式会社ニコン | 撮像装置 |
JP2011119930A (ja) * | 2009-12-02 | 2011-06-16 | Seiko Epson Corp | 撮像装置、撮像方法および撮像プログラム |
JP2011151728A (ja) | 2010-01-25 | 2011-08-04 | Canon Inc | 撮像装置及びその制御方法 |
JP5459031B2 (ja) * | 2010-04-13 | 2014-04-02 | ソニー株式会社 | 情報処理装置、情報処理方法及びプログラム |
US8588600B2 (en) * | 2010-07-27 | 2013-11-19 | Texas Instruments Incorporated | Stereoscopic auto-focus based on coordinated lens positions |
KR101662726B1 (ko) | 2010-12-29 | 2016-10-14 | 삼성전자주식회사 | 전자 기기의 스크롤 방법 및 장치 |
JP5848561B2 (ja) * | 2011-09-20 | 2016-01-27 | キヤノン株式会社 | 撮像装置及びその制御方法、プログラム、並びに記憶媒体 |
CN104396227B (zh) * | 2012-06-01 | 2016-10-12 | 富士胶片株式会社 | 摄像装置 |
JP5681329B2 (ja) * | 2012-06-07 | 2015-03-04 | 富士フイルム株式会社 | 撮像装置及び画像表示方法 |
CN104662886B (zh) * | 2012-09-19 | 2017-11-10 | 富士胶片株式会社 | 摄像装置及对焦确认显示方法 |
WO2014045739A1 (ja) * | 2012-09-19 | 2014-03-27 | 富士フイルム株式会社 | 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム |
JP5937690B2 (ja) * | 2012-09-19 | 2016-06-22 | 富士フイルム株式会社 | 撮像装置及びその制御方法 |
CN104838313B (zh) * | 2012-09-19 | 2018-01-05 | 富士胶片株式会社 | 摄像装置及其控制方法 |
CN104782110B (zh) * | 2012-09-19 | 2018-09-14 | 富士胶片株式会社 | 图像处理装置、摄像装置及图像处理方法 |
CN104641626B (zh) * | 2012-09-19 | 2018-02-27 | 富士胶片株式会社 | 摄像装置及对焦确认显示方法 |
WO2014045740A1 (ja) * | 2012-09-19 | 2014-03-27 | 富士フイルム株式会社 | 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム |
JP5960286B2 (ja) * | 2012-12-19 | 2016-08-02 | 富士フイルム株式会社 | 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム |
CN104885440B (zh) * | 2013-01-04 | 2017-12-08 | 富士胶片株式会社 | 图像处理装置、摄像装置及图像处理方法 |
WO2014106935A1 (ja) * | 2013-01-04 | 2014-07-10 | 富士フイルム株式会社 | 画像処理装置、撮像装置、プログラム及び画像処理方法 |
WO2014155812A1 (ja) * | 2013-03-27 | 2014-10-02 | 富士フイルム株式会社 | 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム |
WO2014155813A1 (ja) * | 2013-03-29 | 2014-10-02 | 富士フイルム株式会社 | 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム |
-
2014
- 2014-03-26 WO PCT/JP2014/058407 patent/WO2014157270A1/ja active Application Filing
- 2014-03-26 CN CN201480010818.XA patent/CN105026976B/zh active Active
- 2014-03-26 JP JP2015508570A patent/JP5981026B2/ja not_active Expired - Fee Related
-
2015
- 2015-09-03 US US14/844,776 patent/US9456129B2/en not_active Expired - Fee Related
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007079929A (ja) * | 2005-09-14 | 2007-03-29 | Make Softwear:Kk | 写真シール作成装置、写真シール作成装置の制御方法及び写真シール作成装置の制御プログラム。 |
JP2009237214A (ja) * | 2008-03-27 | 2009-10-15 | Canon Inc | 撮像装置 |
JP2009276426A (ja) * | 2008-05-13 | 2009-11-26 | Canon Inc | 撮像装置 |
JP2012173531A (ja) * | 2011-02-22 | 2012-09-10 | Sony Corp | 撮像装置、およびフォーカス制御方法、並びにプログラム |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019188035A1 (ja) * | 2018-03-28 | 2019-10-03 | ソニー株式会社 | 撮像装置及び撮像装置における通知制御方法、並びに情報処理装置 |
Also Published As
Publication number | Publication date |
---|---|
CN105026976A (zh) | 2015-11-04 |
JP5981026B2 (ja) | 2016-08-31 |
US20150381883A1 (en) | 2015-12-31 |
US9456129B2 (en) | 2016-09-27 |
JPWO2014157270A1 (ja) | 2017-02-16 |
CN105026976B (zh) | 2017-05-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5981026B2 (ja) | 画像処理装置、撮像装置、プログラム及び画像処理方法 | |
KR102145542B1 (ko) | 촬영 장치, 복수의 촬영 장치를 이용하여 촬영하는 촬영 시스템 및 그 촬영 방법 | |
JP5931206B2 (ja) | 画像処理装置、撮像装置、プログラム及び画像処理方法 | |
JP5901801B2 (ja) | 画像処理装置、撮像装置、プログラム及び画像処理方法 | |
JP5960286B2 (ja) | 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム | |
JP6302564B2 (ja) | 動画編集装置、動画編集方法及び動画編集プログラム | |
JP5833254B2 (ja) | 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム | |
JP5955417B2 (ja) | 画像処理装置、撮像装置、プログラム及び画像処理方法 | |
WO2014155813A1 (ja) | 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム | |
JP2012015619A (ja) | 立体表示装置及び立体撮影装置 | |
US10015405B2 (en) | Image processing device, imaging device, image processing method, and image processing program | |
JP2022184712A (ja) | 情報処理装置、撮像装置、制御方法、プログラム、および記憶媒体 | |
JP2022183845A (ja) | 情報処理装置、制御方法、プログラム、および記憶媒体 | |
JP6257255B2 (ja) | 表示制御装置及び表示制御装置の制御方法 | |
JP5934844B2 (ja) | 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム | |
US20240236503A1 (en) | Electronic device, control method thereof and non-transitory computer-readable medium | |
JP7389662B2 (ja) | 撮像装置、制御方法、プログラムならびに記憶媒体 | |
WO2014045741A1 (ja) | 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム | |
JP2016015602A (ja) | 撮像装置、その制御方法及びプログラム並びに記録媒体 | |
JP2023117615A (ja) | 電子機器、電子機器の制御方法、プログラム、および記憶媒体 | |
JP2022183846A (ja) | 電子機器及びその制御方法及びプログラム及び記録媒体 | |
KR20240143821A (ko) | 촬상장치 | |
JP2022183656A (ja) | 電子機器、制御方法、プログラム、および記憶媒体 | |
US20130162689A1 (en) | Display control apparatus and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201480010818.X Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14773408 Country of ref document: EP Kind code of ref document: A1 |
|
DPE2 | Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101) | ||
ENP | Entry into the national phase |
Ref document number: 2015508570 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14773408 Country of ref document: EP Kind code of ref document: A1 |