US20010045989A1 - Image sensing apparatus - Google Patents
- Publication number
- US20010045989A1 (application US09/170,056)
- Authority
- US
- United States
- Prior art keywords
- image
- photoelectric conversion
- rangefinding
- image forming
- optical
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/672—Focus control based on electronic image sensor signals based on the phase difference signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
Definitions
- This invention relates to an image sensing apparatus having an automatic focusing function for focusing the image of a subject.
- TTL (Through The Lens) secondary image forming phase-difference detection: optical images that have been formed by passage through different pupil areas of an image sensing optical system are formed again, as a pair of secondary images, on a focus detection sensor via a secondary image forming optical system, and the state of focus of the image sensing optical system is detected from the spacing between the two secondary images.
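The relationship described above reduces to a simple calculation once the spacing between the two secondary images has been measured: the departure from the known in-focus spacing gives a defocus estimate. The function below is an illustrative sketch only, not the patent's implementation; the function name and the sensitivity constant are hypothetical.

```python
def defocus_from_spacing(v_measured, v_in_focus, sensitivity):
    """Estimate defocus of the image sensing optical system from the
    spacing between the pair of secondary images.

    v_measured  -- measured spacing between the two secondary images
    v_in_focus  -- known spacing when the system is in focus (V0)
    sensitivity -- conversion factor from spacing error to defocus;
                   depends on the secondary image forming optics
                   (hypothetical value supplied by the caller)
    """
    return sensitivity * (v_measured - v_in_focus)
```

A positive result would indicate one direction of focal shift and a negative result the other, so the focusing lens group can be driven accordingly.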
- active triangulation: a rangefinding pattern formed on an object by a light projection system is received by light-receiving systems spaced apart by a predetermined baselength, and the absolute distance to the object is sensed based upon outputs from the light-receiving systems.
- the following types of finders are used as monitors for verifying the photographic area of the above-mentioned digital still cameras:
- a photoelectric finder, whereby an output obtained by photoelectrically converting the image of an object is displayed on a monitor such as a liquid crystal display.
- the secondary image forming phase-difference detection method and the passive triangulation method require the use of a photoelectric converting sensor for focus detection in addition to the image sensing device for acquisition of the photographic image. This raises the cost of the focus detection mechanism and increases the size of the image sensing device.
- the active triangulation method, besides having the drawback set forth above, requires a special-purpose light-receiving device for receiving projected light. This raises the cost of the focus detection mechanism.
- the optical TTL finder requires a mechanism such as a quick-return mirror or half-mirror for separating and switching between a photographic light flux and a finder light flux. This results in an apparatus of large size.
- the optical rangefinder uses a double-image coincidence mechanism in order to display the state of focusing; the result is a complex, costly structure.
- the photoelectric finder displays images at a low resolution and makes it difficult to confirm the state of focusing accurately.
- an object of the present invention is to provide an image sensing apparatus that is capable of performing autofocusing highly accurately through a simple structure.
- an image sensing apparatus comprising photoelectric conversion means for photoelectrically converting an image of an object obtained through image forming optics; optical path changeover means provided on an optical path between the image forming optics and the photoelectric conversion means and movable between first and second positions for changing over the optical path in such a manner that a first image of the object is formed on the photoelectric conversion means when the optical path changeover means is at the first position and a second image of the object is formed on the photoelectric conversion means when the optical path changeover means is at the second position; focus detection means for detecting state of focus of the image forming optics using the first image when the optical path changeover means is at the first position; and image sensing means for picking up the second image using the photoelectric conversion means when the optical path changeover means is at the second position.
- focus detection and image pick-up can be performed by a single photoelectric conversion means.
- the apparatus utilizes an image forming optical system for image pick-up and the photoelectric conversion means thereof effectively to make possible rangefinding by TTL secondary image phase-difference detection or passive triangulation.
- it is unnecessary to separately provide the apparatus with costly photoelectric conversion means for focus detection, thus making it possible to provide a small-size, inexpensive image sensing apparatus capable of highly accurate rangefinding through a simple structure.
- the optical path changeover means of the image sensing apparatus has focus detection optics for forming two secondary images, as the first image of the object, from a pair of light fluxes obtained by passing the image of the object through different pupil areas; the apparatus further comprising means for detecting a phase difference between the two secondary images, which have been formed on the photoelectric conversion means, when the optical path changeover means is at the first position.
- the state of focus of the image forming optical system is detected by TTL secondary image forming phase-difference detection. This makes it possible to perform accurate detection of focusing in a short period of time even in a case where the object is greatly out of focus.
- the optical path changeover means of the image sensing apparatus includes: a first mirror for deflecting an image forming light flux in a direction different from an image forming optic axis connecting the image forming optics and the photoelectric conversion means; and a second mirror for returning the deflected light flux to the image forming optic axis.
- the optical path changeover means of the image sensing apparatus includes a beam splitter for splitting an image forming light flux into light fluxes in first and second directions at a predetermined ratio of light quantities; the apparatus having finder means for viewing the image of the object along the second direction.
- the optical path changeover means of the image sensing apparatus has lens means for making image forming power of the image forming optics different at the first and second positions.
- a focus detection optical system having a prescribed optical path can be accommodated in a small space, thus making it possible to reduce the size of the image sensing apparatus and to obtain a wide range of focus detection.
- the image sensing apparatus further comprises release operating means, wherein the focus detection means is activated in response to a first operation of the release operating means, and the optical path changeover means is switched from the first position to the second position in response to a second operation of the release operating means.
- the image sensing apparatus further comprises display means for displaying the first image of the object when focus detection is performed by the focus detection means and the second image of the object when pick-up is performed by the image sensing means.
- the image sensing apparatus further comprises focusing control means for performing focusing based upon results of focus detection by the focus detection means.
- TTL focus detection is performed with a coarse image projected before image pick-up, and automatic focusing is carried out highly accurately in a short period based upon the result, thereby making it possible to focus the image of the object.
- a high-definition, focused image can subsequently be acquired.
- the image sensing apparatus further comprises display means for selectively displaying the first and second images of the object; wherein the optical path changeover means has image magnification changing means for forming the first image on the photoelectric conversion means at a first magnification at the first position and forming the second image on the photoelectric conversion means at a second magnification at the second position, whereby sizes of the first and second images displayed on the display means are made substantially the same.
- the normal image of an object and the reduced image of the object obtained by projection can be displayed with their sizes equalized, even though the same subject is imaged with different optical characteristics. This improves the ability to visually confirm an image of reduced size.
- the optical path changeover means of the image sensing apparatus forms a plurality of images of the object on the photoelectric conversion means at the first position and forms a single image of the object on the photoelectric conversion means at the second position.
- both an image for focus detection and an image for photography can be obtained through a simple structure by a single image sensing means.
- the optical path changeover means of the image sensing apparatus forms the first image of the object at a location displaced from the center of a light-receiving portion of the photoelectric conversion means at the first position and forms the second image of the object at the center of the light-receiving portion of the photoelectric conversion means at the second position.
- the second optical image forming means of the image sensing apparatus has an image forming power different from that of the first optical image forming means.
- the image sensing apparatus further comprises image signal recording means for recording the output of the first photoelectric conversion means.
- the first optical image forming means of the image sensing apparatus includes a zoom lens and the rangefinding means has image magnification correction means for correcting a fluctuation in image magnification that accompanies a zooming operation of the zoom lens.
- This arrangement is such that when the image forming optical means for image pick-up is used for rangefinding, a parameter correction conforming to power fluctuation is carried out to perform a rangefinding calculation. This makes it possible to perform accurate rangefinding at all times even when power fluctuates.
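As an illustration of such a parameter correction, a measured image offset can be normalized by the current image magnification (as reported by the zoom encoder) before the rangefinding calculation is performed. The helper below is a hypothetical sketch; all names are invented for illustration.

```python
def correct_for_magnification(raw_offset, current_magnification,
                              reference_magnification):
    """Normalize a measured image offset to a reference zoom setting so
    that the same rangefinding calculation remains valid as the zoom
    lens moves (i.e., as image magnification fluctuates)."""
    if current_magnification <= 0:
        raise ValueError("magnification must be positive")
    return raw_offset * reference_magnification / current_magnification
```

The corrected offset can then be fed to the ordinary triangulation formula regardless of the current zoom position.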
- the image sensing apparatus further comprises display means for displaying the first image.
- the image of a subject for image pick-up is displayed for monitoring.
- the state of subject focus can be checked and it is possible to prevent the taking of a photograph that is out of focus.
- An image sensing apparatus comprises projection means for projecting a rangefinding light flux toward an object to form a rangefinding pattern on the object; optical image forming means, which is spaced away from the projection means by a predetermined baselength, for selectively forming the image of the rangefinding pattern and the image of the object; photoelectric conversion means for photoelectrically converting the image of the rangefinding pattern and the image of the object; and rangefinding means for sensing distance between the optical image forming means and the object based upon an output from the photoelectric conversion means when the image of the rangefinding pattern has been received by the photoelectric conversion means.
- both a projection pattern for rangefinding in active triangulation and an image of the object can be acquired by a single image sensing system, as a result of which the apparatus can be reduced in size and lowered in cost.
- the optical image forming means of the image sensing apparatus includes a zoom lens and the rangefinding means has image magnification correction means for correcting a fluctuation in image magnification that accompanies a zooming operation of the zoom lens.
- the image sensing apparatus further comprises image signal recording means for recording the output of the photoelectric conversion means when the image of the subject has been received by the photoelectric conversion means.
- the image sensing apparatus further comprises wavelength region selecting means, which is interposed between the optical image forming means and the photoelectric conversion means, for passing a first wavelength region when the image of the rangefinding pattern is photoelectrically converted and passing a second wavelength region when the image of the object is photoelectrically converted.
- a wavelength selection suited to acquisition of a pattern image for rangefinding is performed at the time of rangefinding, and a wavelength selection suited to acquisition of the image of the object is performed at the time of image pick-up.
- the image sensing apparatus further comprises focusing means for focusing the first optical image forming means based upon an output of the rangefinding means; focal shift discrimination means for discriminating state of focus of the image of the object based upon outputs from the rangefinding means and the focusing means; image signal combining means for combining outputs from the first and second photoelectric conversion means; display means for displaying an output image from the image signal combining means; and combining control means for changing operation of the image signal combining means based upon an output from the focal shift discrimination means.
- the combining control means of the image sensing apparatus varies relative amount of offset between display positions, on the display means, of outputs from the first and second photoelectric conversion means in dependence upon a focal shift signal output by the focal shift discriminating means.
- the amount of focal shift of the image of an object can be checked visually and clearly from the amount of relative offset between two images displayed in superposition. This makes it possible to prevent the taking of a photograph that is out of focus.
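One way to realize such a display is to offset the second image relative to the first by an amount proportional to the focal-shift signal, clamped to the displayable range. This is a sketch under assumed names; the patent specifies only the behavior, not this code, and the gain and limit parameters are hypothetical.

```python
def display_offset(focal_shift, gain, max_offset):
    """Relative offset, in display pixels, between the two superposed
    images: proportional to the focal-shift signal and clamped so that
    both images remain on screen."""
    offset = gain * focal_shift
    return max(-max_offset, min(max_offset, offset))
```

At zero focal shift the two images coincide, giving the photographer an immediate visual confirmation of focus.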
- An image sensing apparatus comprises optical image forming means for capturing a light flux from an object and forming an image of the object; photoelectric conversion means for photoelectrically converting the image of the object; rangefinding means for sensing distance between the optical image forming means and the object; focusing means for focusing the optical image forming means based upon an output from the rangefinding means; display means for displaying the image of the object from the photoelectric conversion means; focal shift discrimination means for discriminating state of focus of the image of the object based upon outputs from the rangefinding means and the focusing means; and display control means for varying form of display of the image of the object on the display means based upon an output from the focal shift discrimination means.
- the amount of focal shift of the image of an object can be checked visually and clearly based upon images combined and displayed. This makes it possible to prevent the taking of a photograph that is out of focus.
- the display means of the image sensing apparatus has first and second display areas, and the display control means varies position of a displayed image in the second display area relative to a displayed image in the first display area in dependence upon a focal shift signal from the focal shift discrimination means.
- the amount of focal shift of the image of an object can be checked visually and clearly from the amount of relative offset between two images displayed in superposition. This makes it possible to prevent the taking of a photograph that is out of focus.
- the focal shift discrimination means of the image sensing apparatus detects amount of focal shift from outputs from the rangefinding means and the focusing means.
- a finder display or the like can be presented using information relating to the amount of focal shift sensed. This makes it possible to prevent the taking of a photograph that is out of focus.
- FIG. 1 is a block diagram illustrating the construction of an image sensing apparatus according to a first embodiment of the present invention, this diagram showing the apparatus at the time of focus detection;
- FIG. 2 is a diagram showing formation of images of an object when the object is in focus at the time of focus detection according to the first embodiment
- FIG. 3 is a diagram showing formation of the images of an object when the object is not in focus at the time of focus detection according to the first embodiment
- FIG. 4 is a diagram showing the image sensing apparatus at the time of image pick-up according to the first embodiment
- FIG. 5 is a diagram showing formation of the image of an object at the time of image pick-up according to the first embodiment
- FIG. 6 is a flowchart showing a procedure for controlling a camera according to the first embodiment
- FIG. 7 is a flowchart showing a procedure for controlling a lens according to the first embodiment
- FIG. 8 is a diagram illustrating part of a focus detection optical system according to a second embodiment of the present invention.
- FIG. 9 is a diagram showing formation of the images of an object when the object is in focus at the time of focus detection according to the second embodiment
- FIG. 10 is a diagram illustrating part of a focus detection optical system according to a third embodiment of the present invention.
- FIG. 11 is a diagram showing formation of the images of an object when the object is in focus at the time of focus detection according to the third embodiment
- FIG. 12 is a block diagram illustrating the construction of an image sensing apparatus according to a fourth embodiment of the present invention, this diagram showing the apparatus at the time of focus detection;
- FIG. 13 is a diagram showing formation of the images of an object when the object is in focus at the time of focus detection according to the fourth embodiment
- FIG. 14 is a diagram useful in describing the state of a display on a display unit at the time of focus detection according to the fourth embodiment
- FIG. 15 is a diagram showing the image sensing apparatus at the time of image pick-up according to the fourth embodiment.
- FIG. 16 is a diagram showing formation of the image of an object at the time of image pick-up according to the fourth embodiment
- FIG. 17 is a flowchart showing a procedure for controlling a camera according to the fourth embodiment.
- FIG. 18 is a block diagram illustrating the construction of an image sensing apparatus according to a fifth embodiment of the present invention, this diagram showing the apparatus at the time of focus detection;
- FIG. 19 is a diagram showing the construction of an image sensing apparatus at the time of image pick-up according to the fifth embodiment
- FIG. 20 is a block diagram illustrating the construction of an image sensing apparatus according to a sixth embodiment of the present invention.
- FIGS. 21A, 21B are diagrams useful in describing the state of image formation when rangefinding is performed according to the sixth embodiment
- FIGS. 22A, 22B are diagrams useful in describing the principle of image magnification correction according to the sixth embodiment
- FIG. 24 is a flowchart showing a procedure for controlling an image sensing apparatus according to the sixth embodiment.
- FIG. 25 is a diagram showing the construction of an image sensing apparatus at the time of rangefinding according to the seventh embodiment
- FIG. 26 is a diagram useful in describing the state of formation of a spot image for rangefinding according to the seventh embodiment
- FIG. 27 is a diagram useful in describing the concept of an image signal when a rangefinding calculation is performed according to the seventh embodiment
- FIG. 28 is a diagram showing the construction of an image sensing apparatus at the time of image pick-up according to the seventh embodiment
- FIG. 29 is a diagram useful in describing the state of image formation when image pick-up is performed according to the seventh embodiment
- FIG. 30 is a diagram useful in describing the state of an image display after image pick-up according to the seventh embodiment
- FIG. 31 is a flowchart showing a procedure for controlling an image sensing apparatus according to the seventh embodiment
- FIG. 32 is a block diagram illustrating the construction of an image sensing apparatus according to an eighth embodiment of the present invention.
- FIG. 33 is a diagram useful in describing the concept of an image signal when a rangefinding calculation is performed according to the eighth embodiment
- FIG. 34 is a diagram useful in describing the state of an image display when rangefinding is performed according to the eighth embodiment.
- FIG. 35 is a flowchart showing a procedure for controlling an image sensing apparatus according to the eighth embodiment.
- FIG. 36 is a block diagram illustrating the construction of an image sensing apparatus according to a ninth embodiment of the present invention.
- FIG. 37 is a diagram useful in describing the state of image formation when rangefinding is performed according to the ninth embodiment.
- FIG. 38 is a diagram useful in describing the state of an image display when rangefinding is performed according to the ninth embodiment.
- FIG. 39 is a flowchart showing a procedure for controlling an image sensing apparatus according to the ninth embodiment.
- FIGS. 1 through 7 are diagrams relating to a first embodiment of the present invention.
- FIG. 1 is a block diagram showing the construction of an image sensing apparatus according to a first embodiment.
- This embodiment relates to a so-called single-lens reflex camera in which an interchangeable lens having an image sensing optical system is capable of being removably mounted on a camera body having an image sensing device.
- the camera includes a camera body 101 having a mount (not shown) joining various functional portions for image pick-up and an interchangeable lens 151 , described later.
- An object is indicated at OBJ.
- the camera is internally provided with a single-chip microcomputer 102 having a ROM, a RAM and A/D, D/A conversion functions.
- the microcomputer 102 implements a series of camera operations such as automatic exposure control (AE), autofocus (AF) and image sensing.
- the microcomputer 102 controls various circuits and lens operation by communicating with peripheral circuitry within the camera body 101 and with the interchangeable lens 151 .
- the ROM constitutes a storage medium and can be a semiconductor memory, an optical disk, a magneto-optic device or a magnetic medium, etc.
- a power supply 103 supplies the camera circuits and actuators with power-supply voltage and supplies the interchangeable lens 151 with power via a line Vcc.
- a line DCL transmits a signal from the microcomputer 102 to a microcomputer 161 (described later) inside the lens.
- a line DLC transmits a signal from the microcomputer 161 inside the lens to the microcomputer 102 inside the camera body.
- the camera body 101 controls the interchangeable lens 151 via these two lines.
- the camera body 101 and interchangeable lens 151 are connected to ground via a line GND.
- the camera body 101 has a display unit 104 , such as a liquid crystal panel, having a display function for displaying photographic conditions and a monitor function for monitoring a sensed image.
- a driver 105 drives and controls an image sensing device 111 , described later.
- the driver 105 controls the storage of charge in the image sensing device 111 , charge transfer, CDS (Correlated Double Sampling), AGC (Automatic Gain Control), A/D conversion, gamma correction and AWB (Automatic White Balance), etc.
- a memory 106 records and preserves image signal data representing a sensed image and can be a semiconductor memory, magnetic disk or optical disk, etc.
- a terminal 107 for outputting a recorded image to external equipment is connected to a personal computer or printer.
- the image sensing device 111 , such as a CCD, is a two-dimensional photoelectric sensor for photoelectrically converting the image of the object formed by an image sensing optical system 152 - 154 .
- the camera body has a main switch 120 .
- When this switch is turned on (closed), the microcomputer 102 allows the execution of a prescribed program relating to preparations for photography, namely exposure metering and focus detection, etc.
- Switches 121 (SW1) and 122 (SW2) are linked to the camera release button and are turned on (closed) by pressing the release button through first and second stroke lengths, respectively. More specifically, the switch 121 is for preparing for image pick-up. When this switch is turned on, preparatory photographic operations such as exposure metering, focus detection and focusing are executed.
- the switch 122 is a photography switch. When this switch is turned on, a photographic image that has been formed on the image sensing device 111 is acquired and recorded in the image memory 106 .
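The two-stage sequencing of SW1 and SW2 can be summarized in a small sketch; the action names below are invented for illustration and do not appear in the patent.

```python
def release_actions(sw1_closed, sw2_closed):
    """Actions triggered by the two-stage release button.

    SW1 (first stroke) runs photographic preparation; SW2 (second
    stroke, which implies SW1 is also closed) captures the image formed
    on the image sensing device and records it in the image memory.
    """
    actions = []
    if sw1_closed:
        # first stroke: preparatory photographic operations
        actions += ["meter_exposure", "detect_focus", "drive_focus"]
        if sw2_closed:
            # second stroke: retract the mirror unit, pick up and record
            actions += ["retract_mirror_unit", "capture_image",
                        "record_to_memory"]
    return actions
```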
- An AF mode switch 123 is used to select the autofocus mode.
- a display switch 124 is used to designate a display for monitoring a photographic image.
- the image of the object formed by the image sensing optical system is formed again by a focus detection module 130 using the various optical elements set forth below.
- the focus detection module 130 includes a first mirror 131 for fully reflecting the photographic light flux upward in FIG. 1; a semi-transparent second mirror 132 for passing about 70% of the fully reflected light flux and reflecting the remaining 30% of the light flux rightward in FIG. 1; a third mirror 133 for fully reflecting the light flux downward in FIG. 1; and a fourth mirror 134 for fully reflecting the fully reflected light flux rightward in FIG. 1.
- a field lens 135 placed in a first predetermined focal plane of the image sensing optical system, with a primary image IM 1 of the object OBJ being formed in this predetermined focal plane by the image sensing optical system; a field mask 136 which decides a focus detection area; and a pair of secondary image forming lenses 137 for forming the images of the primary image IM 1 again.
- a movable mirror unit 138 is capable of moving the first mirror 131 , fourth mirror 134 and secondary image forming lenses 137 in unison to advance and retract the same into and out of the photographic light flux.
- a quick-return (QR) actuator 139 drives the movable mirror unit 138 to advance and retract the same.
- a focusing screen 141 is placed in a second predetermined focal plane that is in a conjugate relationship with the first predetermined focal plane mentioned above.
- a second primary image IM2, resulting from light flux reflected by the first mirror 131 and passed by the second mirror 132 , is formed on the focusing screen 141 .
- a pentagonal prism 142 and an eyepiece 143 construct an optical finder that makes it possible for the photographer to view the second primary image IM2.
- the interchangeable lens 151 is capable of being removably mounted on the camera body 101 and includes a focusing lens group 152 for performing focusing by being advanced and retracted along the direction of the optic axis; a zoom lens group 153 for performing zooming by being advanced and retracted along the direction of the optic axis; and a relay lens group 154 for performing a prescribed image forming operation together with the lens groups 152 and 153 .
- the lens groups 152 , 153 and 154 together construct the image sensing optical system.
- A stop 155 decides the entrant light flux of the image sensing optical system, and an actuator 156 drives the stop 155 .
- the microcomputer 161 inside the lens is a single-chip microcomputer having a ROM, a RAM and A/D, D/A conversion functions.
- the microcomputer 161 controls the driving of a focus actuator and zoom actuator, described later, as well as the driving of the actuator mentioned above.
- Various operating states of the lens and parameters specific to the lens are transmitted to the microcomputer 102 by the signal line DLC.
- a focus actuator 162 drives the focusing lens group 152 to advance and retract the same, and a focus encoder 163 senses position information indicative of the position of the focusing lens group 152 , namely object distance information.
- a zoom actuator 164 drives the zoom lens group 153 to advance and retract the same, and a zoom encoder 165 senses position information indicative of the position of the zoom lens group 153 , namely focal length information.
- the interchangeable lens 151 forms the image of the object OBJ on the image sensing device 111 of the camera and performs focusing, zooming and control of entrant light quantity based upon a control instruction from the camera.
- a light flux from the object OBJ passes through the lens groups 152 , 153 , 154 and stop 155 constructing the image sensing optical system and is reflected by the first and second mirrors 131 , 132 , respectively, to form the first primary image IM 1 on a first image forming plane.
- the light flux is then reflected by the third mirror 133 , after which the light flux impinges upon the two secondary image forming lenses 137 .
- Each of the lenses 137 functions as a pupil.
- the lenses 137 form two images by pupil-splitting or -separating (referred to simply as “splitting” hereinafter).
- two light fluxes are obtained and these are reflected by the fourth mirror 134 , after which the reflected light fluxes are projected onto the light-receiving surface of the image sensing device 111 as the two secondary images IMA and IMB.
- FIG. 2 is a diagram showing the state of image formation of the secondary images IMA, IMB on the image sensing device 111 .
- a light-receiving portion 112 of the image sensing device 111 has m × n light-receiving pixels and a charge transfer portion (a vertical transfer CCD, CCDV; not shown) for transferring the electric charge that has accumulated in these pixels.
- a horizontal transfer portion (CCDH) 113 stores the electric charge transferred in the direction of arrow TRV by the vertical transfer portion CCDV in the light-receiving portion 112 , then transfers the charge in the direction of arrow TRH and outputs an image signal from a signal output portion 114 to the image sensor driver 105 .
- Two areas ARA, ARB on the light-receiving portion 112 are images of the field mask 136 projected by the secondary image forming lenses 137 of FIG. 1.
- the secondary images IMA, IMB of the first primary image IM 1 are formed in the areas ARA, ARB, respectively.
- Let V0 represent the spacing between the two images IMA and IMB when the image sensing optical system is in focus with respect to the object OBJ.
- FIG. 3 is a diagram showing the state of image formation of the secondary images on the image sensing device 111 when the image sensing optical system is not in focus with respect to the object OBJ.
- the spacing between the two images IMA and IMB is V1 (≠ V0).
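The spacing V1 itself must be recovered from the sensor output. A common approach is to correlate one-dimensional projections of the two areas ARA and ARB and take the best-matching shift. The sketch below uses a sum-of-absolute-differences search and is illustrative only; a practical implementation would interpolate to sub-pixel precision, and the function name is an assumption.

```python
def image_spacing(signal_a, signal_b):
    """Estimate the relative spacing of the two secondary images from
    one-dimensional projections of areas ARA and ARB (equal-length
    sequences). Returns the integer shift of signal_b that best matches
    signal_a under a sum-of-absolute-differences criterion."""
    n = len(signal_a)
    best_shift, best_score = 0, float("inf")
    for shift in range(-(n - 1), n):
        total, count = 0.0, 0
        for i in range(n):
            j = i + shift
            if 0 <= j < n:
                total += abs(signal_a[i] - signal_b[j])
                count += 1
        # require a reasonable overlap so extreme shifts cannot win by default
        if count >= n // 2 and total / count < best_score:
            best_score = total / count
            best_shift = shift
    return best_shift
```

The difference between this measured shift and its in-focus value then plays the role of V1 - V0 in the focus detection calculation.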
- FIG. 4 is a diagram showing the camera when the photography switch 122 of the camera body 101 is closed upon the completion of the autofocus operation, thereby establishing the photographic state.
- When the photography switch 122 is closed, the movable mirror unit 138 is withdrawn from the optical path of photography (i.e., upward in FIG. 4) by the quick-return actuator 139. When this is done, the mirrors are removed from between the image sensing optical system and the image sensing device 111 so that a primary image IM 3 produced by the image sensing optical system is formed on the image sensing device 111.
- FIG. 5 is a diagram showing formation of the image of the object on the image sensing device 111 at the time of photography.
- the primary image IM 3 of the object OBJ is projected onto the light-receiving portion 112 . Accordingly, the image signal prevailing under these conditions is accepted and recorded in the image memory 106 of the camera body 101 , whereby the image is sensed.
- FIGS. 6 and 7 are flowcharts illustrating the control flows of the microcomputers 102 , 161 when focus detection, focusing and photography are performed by the camera body 101 and interchangeable lens 151 according to the first embodiment of the present invention.
- When the main switch 120 of the camera body 101 is closed (turned on), the microcomputer 102 is activated from the sleep state and control proceeds from step S 101 to step S 102, at which the states of the switches 121-124 inside the camera body 101 are sensed.
- The state of the photography preparation switch 121 (SW 1), which is turned on by pressing the release button through its first stroke length, is sensed at step S 103. Control returns to step S 102 when the switch 121 is off (open) and proceeds to step S 104 when the switch 121 is on.
- The fact that preparation for shutter release has been executed is communicated to the microcomputer 161 inside the interchangeable lens 151 at step S 104.
- This is followed by step S 105, at which parameters are communicated to the microcomputer 161.
- the communication of parameters involves transmitting lens-specific information such as the lens f-number, focal length and focusing sensitivity to the camera.
- At step S 106, the image sensing device 111 is activated to acquire an image signal.
- Processing of the image signal acquired at step S 106 is executed at step S 107 . More specifically, processing such as A/D conversion of the image signal, white balance adjustment and gamma correction is executed.
- This is followed by step S 108, at which object brightness information is calculated from the image signal processed at step S 107. Further, in accordance with a predetermined exposure control program, a control value for narrowing the stop 155 and the exposure time (charge storage time) of the image sensing device 111 are calculated.
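The "predetermined exposure control program" can be illustrated with a toy APEX-style split. Everything here is an assumption for illustration (the function name, the even Av/Tv split, and the Av clamp are not from the patent): the brightness is expressed as an exposure value Ev = Av + Tv, which the program divides between the stop control value and the charge storage time.

```python
def exposure_settings(ev):
    """Toy exposure program: split Ev = Av + Tv evenly, clamping the
    aperture value Av to a plausible range, then convert Av to an
    f-number and Tv to a charge storage time in seconds."""
    av = min(max(ev / 2.0, 1.0), 8.0)   # aperture value (APEX Av)
    tv = ev - av                        # time value (APEX Tv)
    f_number = 2.0 ** (av / 2.0)        # N = 2^(Av/2)
    storage_time = 2.0 ** (-tv)         # t = 2^(-Tv) seconds
    return f_number, storage_time
```

A brighter scene (larger Ev) thus yields a smaller stop opening and a shorter charge storage time.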
- The setting of the AF mode switch 123 is discriminated at step S 109 to determine if the autofocus (AF) mode is in effect. Control proceeds to step S 112 if the mode is not the AF mode and to step S 110 if the mode is the AF mode.
- The defocus quantity ΔV of the object is calculated from the spacing between the two secondary images IMA, IMB at step S 110 in the manner illustrated in FIG. 3.
- At step S 111, the defocus quantity ΔV calculated at step S 110 is transmitted to the microcomputer 161 inside the lens.
- The state of the photography switch (SW 2) 122 is discriminated at step S 112. If the switch 122 is off, control returns to step S 102 so that the processing of steps S 102-S 111 is executed again. If the photography switch 122 is found to be on at step S 112, it is judged that release has been performed and control shifts to step S 121.
- the transition to the release operation is communicated to the microcomputer 161 in the lens at step S 121 .
- The stop control value that was calculated at step S 108 is transmitted to the microcomputer 161 at step S 122.
- At step S 123, the movable mirror unit 138 is withdrawn from the photographic optical path of FIG. 1 to the attitude shown in FIG. 4, where the mirror unit 138 is outside the optical path.
- Control of charge accumulation and charge transfer in the image sensing device is performed at step S 124 for the purpose of photography. This is followed by step S 125, at which processing of the image signal acquired at step S 124 is executed in the same manner as at step S 107. More specifically, the image signal is subjected to A/D conversion, white balance adjustment, gamma correction, compression processing, etc.
- At step S 126, the signal processed at step S 125 is recorded and preserved in the image memory 106.
- the withdrawn movable mirror unit 138 is driven at step S 127 so as to be returned to the optical path for photography.
- An instruction for restoring the stop 155 is transmitted to the microcomputer 161 at step S 128 .
- the image recorded at step S 126 is displayed on the display unit 104 at step S 129 , thereby allowing the photographer to view the image that is the result of photography.
- Control returns to step S 102 when the above-described photographic operation is completed.
- FIG. 7 is a flowchart illustrating control of the microcomputer 161 inside the lens.
- Power is supplied to the interchangeable lens by turning on the main switch 120 on the camera side, whereupon control proceeds from step S 151 to step S 152.
- At step S 152, the microcomputer 161 is in the sleep state waiting for communication from the camera body. Control remains at step S 152 if there is no communication from the camera body.
- If communication relating to release preparation, corresponding to step S 104 in FIG. 6, is received from the microcomputer 102, control proceeds from step S 152 to step S 153.
- At step S 153, the microcomputer 161 inside the lens senses the states of the focus encoder 163 and zoom encoder 165 and judges the current status of the lens. Communication of parameters corresponding to step S 105 in FIG. 6 is performed at step S 154 based upon the lens status sensed at step S 153.
- a signal representing the amount of defocusing corresponding to step S 111 in FIG. 6 is received at step S 155 .
- the amount of focusing lens drive necessary is calculated at step S 156 based upon the amount of defocusing received at step S 155 and the lens status sensed at step S 153 .
- The focusing lens is then driven to perform focusing based upon the amount of drive calculated at step S 156.
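The drive-amount calculation at step S 156 can be sketched as follows, assuming (as the parameter communication at step S 105 suggests) that the lens reports a focusing sensitivity, i.e., image-plane displacement per unit of focusing-lens travel. The function name and the pulses-per-millimetre encoder scaling are illustrative assumptions, not taken from the patent.

```python
def focus_drive_pulses(defocus, sensitivity, pulses_per_mm):
    """Convert an image-plane defocus (mm) into a focusing-lens drive.

    sensitivity: image-plane shift per unit of lens travel (lens-specific,
    received during parameter communication).
    Returns the drive amount quantized to focus-encoder pulses (signed)."""
    travel_mm = defocus / sensitivity        # required lens travel
    return round(travel_mm * pulses_per_mm)  # signed pulse count
```

The sign of the result determines the drive direction of the focus actuator 162.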
- It is determined at step S 158 whether a release interrupt corresponding to step S 121 of FIG. 6 has occurred. If the decision is "NO", control returns to step S 152. If a release interrupt occurs, however, control proceeds from step S 158 to step S 171, at which driving of the focusing lens is halted.
- a stop control value is received from the microcomputer 102 at step S 172 .
- the stop actuator 156 is driven in accordance with the stop control value received at step S 172 .
- Photography is performed on the camera side.
- a stop restoration instruction is received at step S 174 .
- the stop is restored to the open state at step S 175 in accordance with this instruction.
- control returns to step S 152 .
- the photographic light flux is split and projected upon the image sensing device 111 via the focus detection optical system in the manner shown in FIG. 1.
- the camera calculates the defocus quantity ΔV of the object from the spacing between the two secondary images projected upon the image sensing device 111, as shown in FIG. 3, and transmits ΔV to the interchangeable lens 151.
- the interchangeable lens 151 drives the focusing lens in accordance with the defocus quantity ΔV, thereby performing focusing.
- When the photography switch 122 is turned on, the movable mirror unit 138 is withdrawn from the photographic light flux, as shown in FIG. 4, so that the in-focus image of the object is projected upon the image sensing device 111, as shown in FIG. 5.
- the camera acquires the image of the object and records the image in the image memory 106 .
- the focus detection module 130 composed of the four mirrors ( 131 - 134 ) is placed in the optical path between the optical lens system and the image sensing device, and the pair of optical lenses 137 is provided in the optical path of the focus detection module 130 .
- The lenses 137 form two images of the object on the image sensing device 111; the spacing between the two images equals the reference value when the object is in focus and deviates from it when the object is not in focus.
- The defocus quantity ΔV can be obtained based upon the difference between the measured spacing V 1 of the two images and the in-focus spacing V 0. As a result, it is unnecessary to separately provide photoelectric conversion means for focus detection, and both focus detection and focusing can be performed using the light flux that has passed through the photographic lens. This makes it possible to realize an inexpensive, small-size camera exhibiting highly accurate automatic focusing.
- the focus detection module 130 is disposed in back of the final lens group (the lens group 154 in the example of FIG. 1) of the image sensing optical system. As a result, the module is suited to a digital still camera of the single-lens reflex type having a long back focus.
- the optical path of the optical finder system and the optical path of the focus detection module 130 may be separated completely.
- For this purpose, a half-mirror such as the mirror 132 is provided between the lens 154 and the mirror 131, and the mirror 132 is made a fully reflective mirror 132′.
- the optical finder is disposed above the mirror 132′.
- the lenses 137 for splitting the optic axis into two portions should ideally be provided between the mirror 133 and the mirror 134 . Theoretically, however, it is possible to provide the lenses 137 between the lens 135 and mirror 133 or between the mirror 132 and lens 135 .
- The pair of secondary image forming lenses 137 is arranged parallel to the light incident direction onto the photographic screen 111 (that is, in a right-to-left direction with respect to the FIG. 1 orientation) so that the two secondary images (IMA and IMB) are disposed one above the other on the photographic screen 111.
- In the second embodiment, a pair of secondary image forming lenses is arranged in a direction perpendicular to the plane of the FIG. 1 drawing so that the two secondary images are juxtaposed side by side on the photographic screen and are vertically offset from the optic axis.
- The construction of the second embodiment is substantially identical to that of the first embodiment except for the arrangement of the secondary image forming lenses.
- FIG. 8 is a perspective view showing a development of part of the focus detection optical system according to the second embodiment
- FIG. 9 is a plan view showing the disposition of the secondary images on the image sensing device 111 .
- FIG. 8 schematically illustrates an optical path from the field mask 136 to the image sensing device 111 .
- The third mirror 133 and the fourth mirror 134 of the focus detection module 130 in FIG. 1 are omitted for the sake of simplified illustration.
- The pair of secondary image forming lenses 237 is disposed horizontally, i.e., side by side with respect to the photographic screen, between the field lens 135 (having the field mask 136) and the image sensing device 111.
- The line connecting the centers of the two secondary image forming lenses 237 is offset by a distance OFS to a position below the optic axis C. Accordingly, the projected images of the field mask 136 formed by the secondary image forming lenses 237 become downwardly offset areas ARC, ARD on the light-receiving portion 112 of the image sensing device 111.
- FIG. 9 illustrates the disposition of images on the image sensing device 111 in the arrangement of FIG. 8.
- the images ARC, ARD of the field mask 136 are projected as areas on the light-receiving portion 112 , and secondary images IMC, IMD of the object OBJ are formed in the areas ARC, ARD, respectively.
- Let H 0 represent the spacing between the two images when the object is in focus, and
- let H 1 represent the spacing between the two images when the object is not in focus. If H 1 is measured and compared with H 0, then a defocus quantity ΔH (=H 1 −H 0) of the object OBJ can be detected.
- The two secondary image projection areas ARC, ARD are disposed at the lower part of the light-receiving portion 112, namely on the side near the exit in the transfer direction of the vertical-transfer CCD in the light-receiving portion.
- Only the image signal on the lower half of the light-receiving portion 112 need be read for the purpose of detecting focus.
- If the image signal on the lower half of the light-receiving portion 112 is used in image processing for focus detection, then the image signal of the upper half of the light-receiving portion 112 may be discarded without being read. In the first focus detection cycle, therefore, the time needed to read out the image signal is shortened.
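The partial readout described above can be modeled as a simple slice. This sketch assumes the frame is available as a 2-D array whose last rows are the ones nearest the vertical-transfer exit; only those rows are kept for the first focus-detection cycle.

```python
import numpy as np

def lower_half_signal(frame):
    """Keep only the lower half of the light-receiving portion
    (the rows nearest the transfer exit, where the areas ARC and
    ARD lie); the upper half is discarded without further use."""
    rows = frame.shape[0]
    return frame[rows // 2:, :]
```

Halving the number of rows read out roughly halves the readout time of the first focus-detection cycle.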
- the second embodiment has the following advantage in addition to the advantages (AD 1 )-(AD 3 ) of the first embodiment:
- focus detection is carried out based upon one pair of secondary images.
- the optical system is so adapted that focus detection is carried out by forming two pairs of secondary images.
- FIG. 10 is a perspective view showing a development of part of the focus detection optical system according to the third embodiment
- FIG. 11 is a plan view showing the disposition of the secondary images on the image sensing device 111 .
- two pairs of secondary image forming lenses 337 are disposed between the field lens 135 (having the field mask 136 ) and the image sensing device 111 .
- the projected images of the field mask 136 formed by the secondary image forming lenses 337 become four areas ARA, ARB, ARC and ARD on the light-receiving portion 112 of the image sensing device 111 .
- FIG. 11 illustrates the disposition of images on the image sensing device 111 in the arrangement of FIG. 10.
- the images ARA, ARB, ARC, ARD of the field mask 136 are projected on the light-receiving portion 112, and secondary images IMA, IMB, IMC, IMD of the object OBJ are formed in the areas ARA, ARB, ARC, ARD, respectively.
- Let V 0 represent the spacing between the two images IMA, IMB when the object is in focus, and
- let H 0 represent the spacing between the two images IMC, IMD when the object is in focus. If the spacings between the images when the object is not in focus are measured and compared with the spacings V 0 and H 0, then a defocus quantity of the object OBJ can be detected.
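A simple way to exploit the two orthogonal pairs, offered here as an illustrative assumption rather than the patent's stated method, is to average the vertical and horizontal spacing deviations; a subject with contrast in only one direction still yields a usable estimate from the other pair.

```python
def combined_defocus(v1, v0, h1, h0):
    """Average the deviations of the measured spacings (V1, H1)
    from their in-focus reference values (V0, H0)."""
    return ((v1 - v0) + (h1 - h0)) / 2.0
```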
- the third embodiment has the following advantage in addition to the advantages (AD 1 )-(AD 3 ) of the first embodiment:
- Since focus detection is carried out based upon images of the object OBJ that are offset vertically and horizontally, focus detection can be performed with greater accuracy.
- the secondary image forming optical system for focus detection uses mirrors for deflecting the light flux.
- In the fourth embodiment described below, use is made of a reducing lens instead of mirrors.
- FIGS. 12 - 17 are diagrams relating to the fourth embodiment.
- FIG. 12 is a diagram showing the construction of an image sensing apparatus according to the present invention. This shows the apparatus when detection of focus is carried out.
- the focus detection module 130 of the first embodiment shown in FIG. 1 is replaced by a focus detection module 430 .
- Although the optical finder composed of such elements as the pentagonal prism is eliminated, the other elements are the same as those shown in FIG. 1. The construction and operation of this embodiment will now be described.
- the focus detection module 430 includes a reducing lens 431 , a field mask 436 , a field lens 435 and two secondary image forming lenses 437 .
- the stop 155 of the interchangeable lens 151 and the entrance pupil of the pair of secondary image forming lenses 437 are in a projection relationship owing to the field lens 435.
- a quick-return (QR) actuator 439 is provided for advancing and retracting the focus detection module 430 into and out of the projected light flux.
- the image of the object OBJ is formed as a primary image IM 4 on the primary image forming surface in the field lens 435 via the image sensing optical system, which is constructed by the lens groups 152 - 154 and stop 155 , and the above-mentioned reducing lens 431 .
- the primary image IM 4 has a size different from that of the first primary image IM 1 or IM 2 of the first embodiment owing to the intervention of the reducing lens 431 .
- the primary image IM 4 is split by the two secondary image forming lenses 437 disposed one above the other, whereby the image is formed again. These secondary images are projected upon the image sensing device 111 as IMA and IMB.
- FIG. 13 is a diagram showing formation of the secondary images on the image sensing device 111 .
- this embodiment detects the amount of defocusing of the object OBJ based upon a change in the difference between the two images with respect to the reference spacing value V 0 between the images.
- FIG. 14 illustrates the state of the display presented on a display unit 404 when focus detection is performed.
- One of the two secondary images, e.g., IMB, projected upon the image sensing device 111 in FIG. 13 is subjected to enlargement processing and displayed on the display unit 404 as IMBL, thereby making it possible for the photographer to check the composition of the photographic area as well as the state of focusing.
- FIG. 15 is a diagram showing the camera when the photography switch 122 of the camera body 101 is closed upon the completion of the autofocus operation, thereby establishing the photographic state.
- FIG. 16 is a diagram showing formation of the image of the object on the image sensing device 111 at the time of photography.
- the primary image IM 3 of the object OBJ is projected onto the light-receiving portion 112 in a manner similar to that of the first embodiment. Accordingly, the image signal prevailing under these conditions is accepted and recorded in the image memory 106 of the camera body 401 , whereby the image is sensed.
- FIG. 17 is a flowchart illustrating the control flow of a microcomputer 402 inside the camera body. This flowchart differs from that of FIG. 6 only in the addition of an operation for displaying a finder image at the time of focus detection.
- When the main switch 120 of the camera body 401 is turned on, the microcomputer 402 is activated from the sleep state and control proceeds from step S 401 to steps S 402, S 403 and S 404.
- The fact that preparation for shutter release has been executed is communicated to the microcomputer 161 inside the interchangeable lens 151 at step S 404. This is followed by step S 405, at which parameters are communicated to the microcomputer 161.
- At step S 406, the image sensing device 111 is activated to acquire an image signal.
- Processing of the image signal acquired at step S 406 is executed at step S 407. More specifically, processing such as A/D conversion of the image signal, white balance adjustment and gamma correction is executed.
- This is followed by step S 408, at which the image to be displayed on the display unit 404, namely the image in the area ARB of FIG. 13, is enlarged and then rotated by 180° about its center. In comparison with the image that prevails at the time of imaging, therefore, the image is turned upside down when focus detection is performed.
- The image for viewing purposes obtained at step S 408 is displayed on the display unit 404 at step S 409.
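The enlargement and 180° rotation at step S 408 can be sketched with array operations. The nearest-neighbour scaling and the factor of two are assumptions for illustration; the patent specifies only that the area-ARB image is enlarged and rotated about its center.

```python
import numpy as np

def finder_image(area_b, scale=2):
    """Enlarge the secondary image by nearest-neighbour repetition,
    then rotate it 180 degrees so it appears upright on the display."""
    enlarged = np.repeat(np.repeat(area_b, scale, axis=0), scale, axis=1)
    return np.rot90(enlarged, 2)  # two 90-degree turns = 180 degrees
```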
- This is followed by step S 410, at which object brightness information is calculated from the image signal processed at step S 407. Further, in accordance with a predetermined exposure control program, a control value for narrowing the stop 155 and the exposure time (charge storage time) of the image sensing device 111 are calculated.
- The setting of the AF mode switch 123 is discriminated at step S 411 to determine if the autofocus (AF) mode is in effect. Control proceeds to step S 414 if the mode is not the AF mode and to step S 412 if the mode is the AF mode.
- The amount of defocus of the object is calculated from the spacing between the two secondary images at step S 412 in the manner illustrated in FIG. 13. Next, at step S 413, the amount of defocus calculated at step S 412 is transmitted to the microcomputer 161 inside the lens.
- The state of the photography switch 122 is discriminated at step S 414. If the switch 122 is off, control returns to step S 402. If the photography switch 122 is found to be on, it is judged that release has been performed and control shifts to step S 421.
- The processing of steps S 421-S 429 is the same as the processing of steps S 121-S 129 in FIG. 6 and need not be described again. Further, the flow for controlling the interchangeable lens 151 is the same as that of the first embodiment shown in FIG. 7 and need not be described again.
- the photographic light flux is split and projected upon the image sensing device 111 via the focus detection module 430 in the manner shown in FIG. 12.
- the camera enlarges one of the two secondary images, which are projected upon the image sensing device 111 in the manner shown in FIG. 13, and displays the enlarged image on the display unit 404 , as illustrated in FIG. 14.
- the camera calculates the amount of object defocus from the spacing between the two secondary images and transmits this to the interchangeable lens 151 .
- the interchangeable lens 151 drives the focusing lens in accordance with the defocus quantity, thereby performing focusing.
- the focus detection module 430 is withdrawn from the photographic light flux, as shown in FIG. 15, so that the in-focus image of the object is projected upon the image sensing device 111 , as shown in FIG. 16.
- the camera acquires the image of the object and records the image in the image memory 106 .
- the fourth embodiment has the following advantages in addition to the advantages (AD 1 ), (AD 2 ) of the first embodiment:
- (AD 6) A mirror for deflecting the optical path is not required in the focus detection module 430, thereby making it possible to reduce the size of the module and simplify it.
- (AD 7) Since the image for purposes of focus detection is displayed on the monitor screen of the display unit, an optical finder is unnecessary. This makes it possible to reduce the size and lower the cost of the apparatus.
- In the fourth embodiment, the reducing lens is used in the optical system for focus detection.
- a relay lens is inserted into the optical system for focus detection and the reducing lens is not employed.
- FIGS. 18 and 19 are diagrams relating to the fifth embodiment.
- FIG. 18 is a diagram showing the construction of an image sensing apparatus according to the present invention. This shows the apparatus when detection of focus is carried out.
- the focus detection module 430 of FIG. 12 is replaced by a focus detection module 530 , and a relay lens module 540 is additionally provided.
- a lens 531 is provided at the rearmost portion of the image forming optical system inside an interchangeable lens 551 .
- Other components are the same as shown in FIG. 12.
- the focus detection module 530 includes a field mask 536 , a field lens 535 and a pair of secondary image forming lenses 537 .
- the stop 155 of the interchangeable lens 551 and the entrance pupil of the pair of secondary image forming lenses 537 are in a projection relationship owing to the field lens 535 .
- the relay lens module 540 is provided internally with a concave relay lens 541 .
- A quick-return actuator (QR) 539 is provided for selectively moving either the focus detection module 530 or the relay lens module 540 into the photographic light flux.
- the image of the object OBJ is formed as a primary image IM 5 on the primary image forming surface in the field lens 535 via the image sensing optical system, which is constructed by the lens groups 152 - 154 , stop 155 and lens 531 .
- the arrangement is such that the primary image IM 5 has a size substantially the same as that of the image IM 4 of the fourth embodiment.
- The primary image IM 5 is split by the two secondary image forming lenses 537, whereby the image is formed again. These secondary images are projected upon the image sensing device 111 as IMA and IMB. The projected images are the same as those shown in FIG. 13. In addition, the viewing image displayed on a display unit 504 is similar to that shown in FIG. 14.
- FIG. 19 is a diagram showing the camera when the photography switch 122 of the camera body 501 is closed upon the completion of the autofocus operation, thereby establishing the photographic state.
- the entire focus detection module 530 is withdrawn away from the optical path of photography (i.e., upward in FIG. 19) by the quick-return actuator 539 .
- the relay lens module 540 is inserted into the photographic optical path in place of the focus detection module 530 .
- the primary image IM 3 formed by the image sensing optical system in the interchangeable lens 551 and the relay lens 541 in the camera body 501 is formed on the image sensing device 111 .
- the state of the formed image is the same as that of the primary image shown in FIG. 16. Accordingly, the image signal prevailing under these conditions is accepted and recorded in the image memory 106 of the camera body 501 , whereby the image is sensed.
- the fifth embodiment has the following advantages in addition to the advantages (AD 1 ), (AD 2 ) of the first embodiment and the advantages (AD 6 ), (AD 7 ) of the fourth embodiment:
- the focus detection optical system of the second or third embodiment may be applied to the fourth or fifth embodiment.
- a half-mirror may be placed in front of the focus detection module of the fourth or fifth embodiment to extract part of the photographic light flux and introduce this flux to an optical finder.
- the invention may be applied not only to an image sensing apparatus of interchangeable lens type but also to an image sensing apparatus having a fixed lens.
- FIGS. 20 through 24 are diagrams relating to the sixth embodiment.
- FIG. 20 is a block diagram showing the construction of an image sensing apparatus according to a sixth embodiment.
- Numeral 601 denotes a camera body having various functional components for forming the image of an object OBJ, detecting focus and sensing the image.
- the camera body includes the focusing lens group 152 for performing focusing by being advanced and retracted along the direction of the optic axis; the zoom lens group 153 for performing zooming by being advanced and retracted along the direction of the optic axis; and the relay lens group 154 for performing a prescribed image forming operation together with the lens groups 152 and 153 .
- The stop 155 determines the light flux entering the image sensing optical system.
- An infrared blocking filter 606 blocks infrared light from the object OBJ and passes only visible light.
- the lens groups 152 , 153 , 154 , the stop 155 and the infrared blocking filter 606 together construct the image sensing optical system.
- A first image IM 1 of the object OBJ is formed on a main image sensing device 111.
- the main image sensing device 111 is a two-dimensional photoelectric sensor, such as a CCD, for photoelectrically converting the first image IM 1 .
- the camera body further includes a rangefinding module 621 having a light-receiving lens 622 for forming the image of the object OBJ whose range is to be measured, an infrared blocking filter 623 for blocking infrared light and passing only visible light of the light flux that has passed through the light-receiving lens 622 , and a subordinate image sensing device 624 .
- The rangefinding optical system, which includes the light-receiving lens 622 and the infrared blocking filter 623, has an image forming power different from that of the above-mentioned image sensing optical system and forms a second image IM 2 of the object OBJ on the subordinate image sensing device 624, described later.
- The subordinate image sensing device 624, such as a CCD, is a two-dimensional photoelectric sensor for photoelectrically converting the second image IM 2.
- The module 621 including these elements is so disposed that its optic axis is spaced away from the optic axis of the image sensing optical system by a distance equivalent to a baseline length BL.
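With the two optic axes separated by the baseline length BL, object distance follows from ordinary triangulation. The sketch below is an illustration under simplifying assumptions (thin-lens geometry, disparity already referred to a common image scale), not the patent's rangefinding procedure:

```python
def object_distance(disparity, baseline, focal_length):
    """Triangulation: an object at distance D appears shifted between
    the two optical axes by disparity = baseline * focal_length / D,
    so D = baseline * focal_length / disparity (consistent units)."""
    if disparity <= 0.0:
        return float("inf")  # object at or beyond the infinity position
    return baseline * focal_length / disparity
```

A larger baseline BL or longer focal length increases the disparity for a given distance and hence the rangefinding resolution.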
- a microcomputer 631 is a single-chip microcomputer having a ROM, a RAM and A/D, D/A conversion functions.
- the microcomputer 631 implements a series of camera operations such as automatic exposure control (AE), autofocus (AF) and image sensing.
- the microcomputer 631 controls the operation of peripheral circuits and actuators inside the camera body 601 .
- the ROM constitutes a storage medium and can be a semiconductor memory, an optical disk, a magneto-optic disk or a magnetic medium, etc.
- the power supply 103 supplies the camera circuits and actuators with power.
- the driver 105 drives and controls the main image sensing device 111 .
- the driver 105 controls the storage of charge in the image sensing device 111 , charge-transfer, CDS (Correlated Double Sampling), AGC (Automatic Gain Control), A/D conversion, gamma correction and AWB (Automatic White Balance), etc.
- A driver 634 drives and controls the subordinate image sensing device 624 and, like the driver 105 of the main image sensing device, controls the storage of charge in the image sensing device 624, charge transfer, CDS, AGC, A/D conversion, gamma correction and AWB, etc.
- the memory 106 records and preserves image signal data representing an image sensed by the main image sensing device 111 and can be a semiconductor memory, optical disk, magneto-optical disk or magnetic medium, etc.
- the terminal 107 for outputting a recorded image to external equipment is connected to a personal computer or printer.
- the camera body has a display unit 104 , such as a liquid crystal panel, having a display function for displaying photographic conditions and a monitor function for monitoring a photographic image.
- the camera body has the main switch 120 .
- When this switch is turned on, the microcomputer 631 allows the execution of a prescribed program relating to preparations for photography, namely exposure metering and focus detection, etc.
- the switches 121 and 122 are linked to the camera release button and are turned on by pressing the release button through first and second stroke lengths, respectively. More specifically, the switch 121 is for preparing for picture taking. When this switch is turned on, preparatory photographic operations such as exposure metering, focus detection and focusing are executed.
- the switch 122 is a photography switch. When this switch is turned on, a photographic image that has been formed on the image sensing device 111 is acquired and recorded in the image memory 106 .
- the AF mode switch 123 is used to select the autofocus mode.
- the display switch 124 is used to designate a display for monitoring a photographic image.
- the focus actuator 162 drives the focusing lens group 152 to advance and retract the same, and the focus encoder 163 senses position information indicative of the position of the focusing lens group 152 , namely object distance information.
- the zoom actuator 164 drives the zoom lens group 153 to advance and retract the same, and the zoom encoder 165 senses position information indicative of the position of the zoom lens group 153 , namely focal length information.
- the stop actuator 156 controls the stopping down of the stop 155 and restores the stop 155 to the open state.
- the camera body 601 acquires the first image IM 1 and second image IM 2 of the object OBJ and performs rangefinding, focusing, and image sensing through methods described later.
- a light flux from the object OBJ passes through the image sensing optical system comprising the lens groups 152 , 153 , 154 and is formed on the main image sensing device 111 as the first image IM 1 . Further, the second image IM 2 is formed on the subordinate image sensing device 624 inside the rangefinding module 621 .
- FIGS. 21A, 21B are diagrams illustrating the two image sensing devices 111 , 624 and the dispositions of two images formed on these image sensing devices.
- the light-receiving portion 112 of the image sensing device 111 comprises m 1 ⁇ n 1 light-receiving pixels and a charge transfer portion (vertical transfer CCD) for transferring electric charge that has accumulated in these pixels.
- the horizontal transfer CCD 113 stores electric charge transferred in the direction of arrow TRV by the vertical transfer CCD in the light-receiving portion 112 , then transfers the charge in the direction of arrow TRH and outputs an image signal from the signal output portion 114 to the image sensor driver 105 .
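The vertical-then-horizontal charge transfer described above can be sketched as follows. This is a hypothetical illustration only; the function name and the list-based charge model are assumptions, not part of the patent:

```python
def ccd_readout(charge):
    # charge: 2-D list of accumulated pixel charges (rows of the
    # light-receiving portion); returns the serialized image signal.
    signal = []
    # Vertical transfer (arrow TRV): shift one row at a time into
    # the horizontal transfer CCD, nearest row first.
    for row in reversed(charge):
        horizontal_ccd = list(row)
        # Horizontal transfer (arrow TRH): clock pixels out serially
        # through the signal output portion.
        while horizontal_ccd:
            signal.append(horizontal_ccd.pop(0))
    return signal

print(ccd_readout([[1, 2], [3, 4]]))  # [3, 4, 1, 2]
```

The nested loops mirror the two transfer directions: each pass of the outer loop is one vertical transfer step, and the inner loop serializes that row through the signal output portion.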
- IM 1 T represents the image of the object OBJ when the image sensing optical system has been set to the maximum telescopic mode
- IM 1 W represents the image of the object OBJ when the image sensing optical system has been set to the maximum wide-angle mode.
- the size of the first image IM 1 of the object varies depending upon the state of the image sensing optical system.
- a light-receiving portion 625 of the subordinate image sensing device 624 comprises m 2 ⁇ n 2 light-receiving pixels and a charge transfer portion (vertical transfer CCD) for transferring electric charge that has accumulated in these pixels.
- a horizontal transfer CCD 626 stores electric charge transferred in the direction of arrow TRV by the vertical transfer CCD in the light-receiving portion 625 , then transfers the charge in the direction of arrow TRH and outputs an image signal from the signal output portion 627 to the image sensor driver 634 .
- IM 2 INF represents the image obtained when the object OBJ is at infinity
- IM 2 DEF represents the image obtained when the object OBJ is at a finite distance.
- the position of the second image IM 2 of the object varies depending upon the distance of the object OBJ.
- FIGS. 22A, 22B are diagrams useful in describing the principle of image magnification correction for detecting object distance from the first and second images IM 1 and IM 2 , respectively, of the object.
- a disparity in regard to the object is detected from the relative positions of two images formed by two image forming systems spaced apart by a predetermined baselength, and the object distance is found from this disparity. In this case, it is required that the sizes of the two images be equalized.
- the first image IM 1 varies in size depending upon the zoom setting of the image sensing optical system.
- the number of pixels (or pixel size) of the light-receiving portion 112 of image sensing device 111 and the number of pixels (or pixel size) of the light-receiving portion 625 of image sensing device 624 differ. Consequently, if the image signal is processed digitally, it is necessary to subject the image to a size correction based upon the difference in the numbers of pixels.
- the image forming characteristics of the image sensing optical system are recognized from the results of detection from the focus encoder 163 and zoom encoder 165 inside the camera body 601 , and the size of the first image IM 1 is made equal to the size of the second image IM 2 based upon the results of recognition.
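The size correction described above can be sketched as a resampling of the image signal by the magnification correction coefficient. The patent only states that coefficients are stored in ROM; the linear-interpolation resampling below, and all names in it, are assumptions for illustration:

```python
def magnification_correct(signal, coeff, out_len):
    # Resample a 1-D image signal by magnification coefficient
    # `coeff` using linear interpolation, producing `out_len`
    # samples, so the first image matches the second in size.
    out = []
    for i in range(out_len):
        x = i / coeff  # corresponding position in the source signal
        j = int(x)
        if j >= len(signal) - 1:
            out.append(signal[-1])  # clamp at the signal edge
        else:
            frac = x - j
            out.append(signal[j] * (1 - frac) + signal[j + 1] * frac)
    return out

print(magnification_correct([0, 2, 4, 6], 0.5, 2))  # [0.0, 4.0]
```

A coefficient below 1 shrinks the first image (as when the zoom is at the telephoto end) and a coefficient above 1 enlarges it; in the camera, the coefficient would be looked up from the ROM matrix indexed by the focus and zoom encoder states.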
- FIGS. 22A, 22B are diagrams illustrating the respective images after application of the above-described size correction.
- FIGS. 21A, 21B illustrate the sizes of the optical images on the image sensing devices 624 , 111
- FIGS. 22A, 22B conceptually illustrate the image signals in image computation memories (not shown in FIG. 20) within the microcomputer 631 .
- IM 2 DEF represents the image signal read out of the subordinate image sensing device 624 and stored in a second computation memory RAM 2
- IM 1 0 represents the image signal read out of the main image sensing device 111 and stored in a first computation memory RAM 1 .
- the signal IM 1 0 is an image signal that has undergone the size correction described above.
- the image IM 2 DEF regarding the second image IM 2 of the object and the image IM 1 0 regarding the first image IM 1 of the object are the same in size and differ only in terms of their relative positions, as illustrated in FIG. 23.
- FIG. 23 is a schematic view showing spacing V DEF of image signals stored in the two computation memories RAM 1 , RAM 2 mentioned above.
- the reference spacing V 0 should be zero when the object is at infinity.
- the optical systems and image sensing elements develop positional offsets in the camera manufacturing process. According to this embodiment, therefore, information representing the reference spacing V 0 conforming to the positions of the focusing lens group 152 and zoom lens group 153 is stored in the ROM (not shown) of the microcomputer 631 .
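The spacing V DEF of FIG. 23 and the distance calculation can be sketched as below. Equations (EQ.1) and (EQ.2) are not reproduced in this excerpt, so the distance formula here is the standard passive-triangulation relation (distance = baselength × focal length / disparity, with disparity measured from the infinity reference V 0) and is an assumption; the sum-of-absolute-differences matching is likewise one common way to measure the spacing, not necessarily the patent's:

```python
def image_spacing(ram1, ram2, max_shift):
    # Spacing V_DEF of FIG. 23: the displacement of the second image
    # signal (RAM2) relative to the first (RAM1) that minimizes the
    # mean absolute difference over the overlapping samples.
    n = len(ram1)
    best_shift, best_sad = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        lo, hi = max(0, s), min(n, n + s)
        sad = sum(abs(ram2[i] - ram1[i - s]) for i in range(lo, hi)) / (hi - lo)
        if sad < best_sad:
            best_shift, best_sad = s, sad
    return best_shift

def object_distance(v_def, v_0, baselength, focal):
    # Assumed triangulation form for (EQ.1)/(EQ.2): the reference
    # spacing V_0 (infinity position) is subtracted first.
    return baselength * focal / (v_def - v_0)

ram1 = [0, 0, 1, 3, 1, 0, 0, 0]
ram2 = [0, 0, 0, 0, 1, 3, 1, 0]   # same pattern shifted by 2 pixels
print(image_spacing(ram1, ram2, 3))   # 2
print(object_distance(5, 1, 50, 10))  # 125.0
```

Storing the per-lens-state value of V 0 in ROM, as the embodiment does, absorbs the manufacturing positional offsets so that only the residual shift carries distance information.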
- FIG. 24 is a flowchart showing the control flow of the microcomputer 631 when focus detection, focusing and photography are performed in the camera body 601 according to the sixth embodiment.
- step S 501 When the main switch 120 of the camera body 601 is turned on, the microcomputer 631 is activated from the sleep state and control proceeds from step S 501 to step S 502 , at which the states of the switches 121 - 124 inside the camera body 601 are sensed.
- step S 503 The state of the photography preparation switch 121 (SW 1 ), which is turned on by pressing the release button through its first stroke length, is sensed at step S 503 . Control returns to step S 502 when the switch 121 is off and proceeds to step S 504 when the switch 121 is on.
- step S 504 the main image sensing device 111 is activated to acquire an image signal.
- Processing of the image signal acquired at step S 504 is executed at step S 505 . More specifically, processing such as A/D conversion of the image signal, white balance adjustment and gamma correction is executed.
- step S 506 Object brightness information is calculated at step S 506 from the image signal processed at step S 505 . Further, in accordance with a predetermined exposure control program, a control value for stopping down the stop 155 and the exposure time (charge storage time) of the image sensing device 111 are calculated.
- the image signal produced by the image sensing device 111 at steps S 504 and S 505 , namely the image signal IM 1 W or IM 1 T in FIG. 21B, is displayed on the display unit 104 at step S 507 .
- step S 508 the setting of the AF mode switch 123 is discriminated at step S 508 to determine if the autofocus (AF) mode is in effect. Control jumps to step S 520 if the mode is not the AF mode and proceeds to step S 511 if the mode is the AF mode.
- the autofocus operation described below is then executed.
- the microcomputer 631 senses the state of the focus encoder 163 at step S 511 and senses the state of the zoom encoder 165 at step S 512 to judge the current optical status of the lens.
- a coefficient for making the size of the first image IM 1 of the object equal to that of the second image of the object, namely an image magnification correction coefficient, is read out of the ROM (not shown) of microcomputer 631 at step S 513 in the manner described above in connection with FIGS. 22A, 22B. Coefficients are stored in the ROM as matrix data corresponding to the states of the focus encoder 163 and zoom encoder 165 .
- a position offset correction quantity V 0 is read out of ROM at step S 514 in the same manner as the image magnification correction coefficient.
- the subordinate image sensing device 624 is activated at step S 515 to obtain an image signal.
- Processing of the image signal acquired at step S 515 is executed at step S 516 . More specifically, processing such as A/D conversion of the image signal, white balance adjustment and gamma correction is executed.
- step S 517 Image size is corrected at step S 517 by multiplying the image signal of the first image IM 1 acquired at step S 505 by the image magnification correction coefficient read out at step S 513 .
- step S 518 the distance to the object is calculated in accordance with Equations (EQ.1), (EQ.2) using the first image IM 1 whose size has been corrected at step S 517 , the second image IM 2 obtained at step S 516 and the position offset correction quantity V 0 obtained at step S 514 .
- the focusing lens group 152 is driven at step S 519 based upon the result of the above-described calculation to bring the first image IM 1 , which is for image sensing purposes, into focus.
- step S 520 The state of the photography switch 122 (SW 2 ) is discriminated at step S 520 . If the switch 122 is off, control returns to step S 502 so that the processing of steps S 502 -S 519 is executed again. If the photography switch 122 is found to be on at step S 520 , it is judged that release has been performed and control shifts to step S 521 .
- the stop actuator 166 is driven at step S 521 in accordance with the stop control value calculated at step S 506 .
- step S 522 Charge accumulation and charge transfer of the main image sensing device 111 for photography are controlled at step S 522 . This is followed by step S 523 , at which processing of the image signal acquired at step S 522 is executed in the same manner as performed at step S 505 . More specifically, the image signal is subjected to A/D conversion, white balance adjustment, gamma correction and compression processing, etc.
- step S 524 the signal processed at step S 523 is recorded and preserved in the image memory 106 .
- the image recorded at step S 524 is displayed on the display unit 104 at step S 525 , thereby allowing the photographer to check the image that is the result of photography.
- the first image IM 1 of the object is formed on the image sensing device 111 via the image sensing optical system and the second image IM 2 of the object is formed on the subordinate image sensing device 624 via the light-receiving lens 622 , as illustrated in FIG. 20 and FIGS. 21A, 21B.
- the camera obtains the two above-mentioned images, performs the image magnification correction, as shown in FIGS. 22A, 22B, and calculates the distance to the object OBJ by calculating the spacing between the two images in the manner shown in FIG. 23.
- the focusing lens group 152 is driven based upon the calculated value, whereby focusing is achieved. Continuous focusing is performed by executing this operation repeatedly.
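The continuous-focusing loop summarized above can be sketched as follows. This is a hypothetical control skeleton, not the patent's implementation; the callback names are assumptions:

```python
def continuous_focus(sw1_pressed, measure_distance, drive_lens):
    # While the preparation switch SW1 is held, repeat the
    # rangefinding calculation (steps S511-S518) and the lens
    # drive (step S519), giving continuous focusing.
    cycles = 0
    while sw1_pressed():
        drive_lens(measure_distance())
        cycles += 1
    return cycles

presses = iter([True, True, False])  # SW1 held for two cycles
driven = []
print(continuous_focus(lambda: next(presses),
                       lambda: 2.5,          # stub distance, metres
                       driven.append))       # 2
```

Each pass of the loop corresponds to one traversal of steps S 502 - S 519 in FIG. 24; releasing SW1 breaks the loop and returns control to the switch-sensing step.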
- the first image IM 1 of the object is displayed on the display unit 104 to inform the photographer of the composition of the picture taken and of the state of focusing.
- the sixth embodiment provides the following advantages:
- the rangefinding module can be simplified to make possible an autofocus camera that is compact and low in price.
- the image sensing optical system serves also as the rangefinding optical system.
- the image of the object for rangefinding purposes is also projected in enlarged size. This makes it possible to achieve a rangefinding accuracy that conforms to the state of the image sensing optical system.
- (AD 14 ) Since the automatically focused image of the object is displayed on the display unit 104 such as a liquid crystal monitor, the state of focus of the image of the object can be verified visually and accurately in real time.
- the sixth embodiment concerns a passive-triangulation-type rangefinding device composed of a single image sensing system and a single rangefinding module.
- a seventh embodiment described below provides an active-triangulation-type rangefinding device (i.e., a device in which infrared light is projected upon an object and rangefinding is performed based upon the reflected light) comprising a single image sensing system and a single projection system.
- FIGS. 25 through 31 are diagrams for describing the construction and operation of the seventh embodiment.
- FIG. 25 illustrates the disposition of the image sensing apparatus when rangefinding is performed according to the seventh embodiment.
- Components in FIGS. 25 through 31 that perform actions identical with those of the sixth embodiment are designated by like reference characters and need not be described again in detail.
- Numeral 701 denotes a camera body having various functional components for forming the image of an object OBJ, detecting focus and sensing the image.
- the image sensing optical system is composed of the elements 152 - 155 of the sixth embodiment.
- An infrared blocking filter 706 blocks infrared light from the object OBJ and passes only visible light. Since the filter 706 is used for ordinary image sensing, it is withdrawn from the light flux of the sensed image at the time of rangefinding (shown in FIG. 25).
- An infrared passing filter 707 blocks visible light and passes only infrared light from the object OBJ. Since the filter 707 is used for rangefinding, the filter is inserted into the light flux of the sensed image only at the time of rangefinding.
- the lens groups 152 , 153 , 154 , stop 155 and filters 706 , 707 together construct the image sensing optical system.
- Numeral 711 denotes an image sensing device such as a CCD. This is a two-dimensional photoelectric sensor for photoelectrically converting the object image, which is for image sensing purposes, or the image of an infrared spot which is for rangefinding, described later.
- the image sensing device 711 is sensitive to light from the visible to infrared wavelengths.
- a projection module 721 includes a light-emitting element 724 which emits infrared light from a light-emitting portion 723 , and a projecting lens 722 for projecting the emitted infrared light onto the object OBJ.
- the projection module 721 having these elements is spaced away from the optic axis of the image sensing optical system by the baselength BL. As a result, a rangefinding pattern that corresponds to the projected image of the light-emitting portion 723 , namely an infrared spot SPT, is formed on the object OBJ.
- the infrared spot SPT is formed, via the image sensing optical system, as an infrared spot image SPT 1 on the main image sensing device 711 at a position spaced a predetermined distance away from the center thereof. Since the infrared passing filter 707 has been inserted into the light flux of the sensed image, the light flux of the object OBJ per se does not pass through the filter; only the light flux from the infrared spot SPT arrives at the main image sensing device 711 .
- a driver 734 drives the light-emitting element 724 so that the latter emits rangefinding infrared light at the time of a rangefinding operation in accordance with an instruction from a microcomputer 731 .
- the microcomputer 731 is a single-chip microcomputer having a ROM, a RAM and A/D, D/A conversion functions. In accordance with a camera sequence program stored in the ROM serving as a storage medium, the microcomputer 731 implements a series of camera operations such as automatic exposure control (AE), autofocus (AF) and image sensing in a manner similar to that of the sixth embodiment. To this end, the microcomputer 731 controls the operation of peripheral circuits and actuators inside the camera body 701 .
- the power supply 103 , driver 105 , memory 106 , terminal 107 , display unit 104 and switches 120 - 124 , 162 - 166 are similar to those of the sixth embodiment.
- An optical finder 761 is composed of a lens group 762 , a zoom lens 763 , an erecting prism 764 such as a Porro prism, a field mask 765 and an eyepiece 766 .
- a zoom linkage member 767 mechanically connects the zoom lens 153 with the zoom lens 763 .
- the magnification of the optical finder 761 is automatically adjusted by the zoom linkage member 767 in operative association with the zooming operation of the image sensing optical system.
- An exposure metering element 768 is disposed in the vicinity of the optical finder 761 .
- the exposure metering element 768 splits the light flux within the optical finder 761 by a beam splitter (not shown) and measures the brightness of the object before a picture is taken.
- An erect real image IM 2 of the object OBJ is projected onto the field mask 765 by the optical finder 761 so that the photographer can verify the zone of photography by viewing the finder image IM 2 through the eyepiece 766 .
- FIG. 26 is a diagram illustrating the disposition of the infrared spot image SPT 1 formed on the main image sensing device at the time of rangefinding.
- a light-receiving portion 212 of the main image sensing device 711 comprises m 1 ⁇ n 1 light-receiving pixels and a charge transfer portion (vertical transfer CCD) for transferring electric charge that has accumulated in these pixels.
- a horizontal transfer CCD 213 stores electric charge transferred in the direction of arrow TRV by the vertical transfer CCD in the light-receiving portion 212 , then transfers the charge in the direction of arrow TRH and outputs an image signal from a signal output portion 214 to the image sensor driver 105 .
- SPT 1 T represents the image of the infrared spot SPT when the image sensing optical system has been set to the maximum telescopic mode
- SPT 1 W represents the image of the infrared spot when the image sensing optical system has been set to the maximum wide-angle mode.
- the size and projected position of the image vary depending upon the state of the image sensing optical system.
- FIG. 27 illustrates the result of subjecting the image SPT 1 T or SPT 1 W to processing similar to that of the sixth embodiment and normalizing size and position.
- the normalized image signal is indicated at SPT 1 0 .
- Spacing V DEF between the position of the center of gravity of the signal SPT 1 0 and a predetermined reference position C is obtained.
- Distance DST to the object OBJ can be detected in accordance with Equations (EQ.1) and (EQ.2), in a manner similar to that of the sixth embodiment, using the reference spacing V 0 , normalized focal length f 0 of the image sensing optical system and baselength BL of the optical finder 761 .
- f 0 represents the focal length of the image sensing optical system after normalization of the size and position of the spot image. This is an imaginary focal length for obtaining the normalized image signal SPT 1 0 of FIG. 27 at all times even if there is a change in the zoom state.
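The center-of-gravity measurement used for the spot image can be sketched as below. The centroid itself is stated in the text; the function names and the 1-D intensity profile are assumptions for illustration:

```python
def spot_centroid(signal):
    # Center of gravity of the normalized spot image SPT1_0:
    # the intensity-weighted mean pixel position.
    total = sum(signal)
    return sum(i * v for i, v in enumerate(signal)) / total

def spot_spacing(signal, ref_pos):
    # Spacing V_DEF between the spot's center of gravity and the
    # predetermined reference position C.
    return spot_centroid(signal) - ref_pos

print(spot_centroid([0, 1, 2, 1, 0]))  # 2.0 (symmetric spot)
```

The resulting spacing, together with the reference spacing V 0 , the normalized focal length f 0 and the baselength BL, would feed the same triangulation relation as in the sixth embodiment.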
- the object OBJ is brought into focus automatically if the focusing lens group 152 is driven based upon the distance DST to the object OBJ calculated in accordance with Equations (EQ.1), (EQ.2).
- FIG. 28 is a diagram showing the camera when the photography switch 122 of the camera body 701 is closed upon the completion of the autofocus operation, thereby establishing the image sensing state.
- When the photography switch 122 is turned on, the light-emitting element 724 stops emitting infrared light.
- the filter actuator 708 is then actuated to withdraw the infrared passing filter 707 from the photographic light flux and insert the infrared blocking filter 706 into the light flux.
- the image IM 1 of the object OBJ is formed on the main image sensing device 711 via the image sensing optical system. If FIGS. 25 and 28 are compared, it will be seen that the positions of the infrared blocking filter 706 and infrared passing filter 707 are reversed.
- FIG. 29 is a diagram showing formation of the image of the object on the main image sensing device 711 at the time of photography.
- the primary image IM 1 of the object OBJ is projected upon the light-receiving portion 212 . Accordingly, the image signal is acquired under these conditions and recorded in the image memory 106 of the camera body 701 , whereby the image is sensed.
- FIG. 30 is a diagram showing the state of the display on the display unit 104 after image sensing.
- the image IM 1 acquired in FIG. 29 is displayed on the display screen of the display unit 104 as an image IM 1 L resulting from photography. This allows the photographer to determine whether photography has been performed correctly.
- FIG. 31 is a flowchart showing the control flow of the microcomputer 731 when focus detection, focusing and photography are performed in the camera body 701 according to the seventh embodiment. The control flowchart of FIG. 31 will be described with reference to FIGS. 25 through 30.
- step S 601 When the main switch 120 of the camera body 701 is turned on, the microcomputer 731 is activated from the sleep state and control proceeds from step S 601 to step S 602 , at which the states of the switches 121 - 124 inside the camera body 701 are sensed.
- step S 603 The state of the photography preparation switch 121 (SW 1 ), which is turned on by pressing the release button through its first stroke length, is sensed at step S 603 . Control returns to step S 602 when the switch 121 is off and proceeds to step S 604 when the switch 121 is on.
- step S 604 at which the output of the exposure metering element 768 is read out, object brightness information is calculated and, in accordance with a predetermined exposure control program, a control value for narrowing the stop 155 and the exposure time (charge storage time) of the image sensing device 711 are calculated.
- step S 605 the setting of the AF mode switch 123 is discriminated at step S 605 to determine if the autofocus (AF) mode is in effect. Control jumps to step S 619 if the mode is not the AF mode and proceeds to step S 611 if the mode is the AF mode.
- the microcomputer 731 senses the state of the zoom encoder 165 at step S 611 to judge the current optical status of the lens. It should be noted that when rangefinding is performed according to this embodiment, the focusing lens 152 is always at an initial position that corresponds to infinity. The state of the focus encoder 163 , therefore, is not sensed.
- the image position offset correction quantity V 0 is read out of ROM at step S 613 in the same manner as the image magnification correction coefficient.
- the light-emitting element 724 is activated at step S 614 to project rangefinding infrared light toward the object OBJ.
- the main image sensing device 711 is activated at step S 615 to obtain the signal representing the infrared spot image SPT 1 .
- Processing of the image signal acquired at step S 615 is executed at step S 616 . More specifically, the image signal is converted from an analog to a digital quantity.
- step S 617 Image size is corrected at step S 617 by multiplying the image signal of the infrared spot image SPT 1 acquired at step S 616 by the image magnification correction coefficient read out at step S 612 .
- the resulting signal is converted to the signal SPT 1 0 normalized in the manner shown in FIG. 27.
- step S 618 The distance DST to the object is calculated at step S 618 , upon calculating V DEF in accordance with Equations (EQ.1), (EQ.2), using the normalized signal SPT 1 0 obtained at step S 617 and the positional offset correction quantity V 0 obtained at step S 613 .
- step S 619 The state of the photography switch 122 (SW 2 ) is discriminated at step S 619 . If the switch 122 is off, control returns to step S 602 so that the processing of steps S 602 -S 618 , namely the rangefinding operation, is executed again. If the photography switch 122 is found to be on at step S 619 , it is judged that release has been performed and control shifts to step S 621 .
- the focusing lens 152 is driven at step S 621 based upon the result of the calculation at step S 618 to bring the image IM 1 into focus.
- the filter actuator 708 is driven at step S 622 to withdraw the infrared passing filter 707 from the photographic light flux and insert the infrared blocking filter 706 into the photographic light flux instead.
- the stop actuator 166 is driven at step S 623 in accordance with the stop control value calculated at step S 604 .
- step S 624 Charge accumulation and charge transfer of the main image sensing device for photography are controlled at step S 624 .
- step S 625 Processing of the image signal acquired at the above-mentioned step is executed at step S 625 . More specifically, the image signal is subjected to A/D conversion, white balance adjustment, gamma correction and compression processing, etc.
- step S 626 the signal processed at step S 625 is recorded and preserved in the image memory 106 .
- the image recorded at step S 626 is displayed on the display unit 104 at step S 627 , thereby allowing the photographer to check the image that is the result of photography.
- the stop actuator 166 is restored to open the stop 155 at step S 628 .
- the infrared blocking filter 706 and infrared passing filter 707 are interchanged, i.e., restored to the positions that prevail at the time of the rangefinding operation, at step S 629 .
- the focusing lens group 152 is restored to its initial position at step S 630 .
- Control returns to step S 602 when the above-described photographic operation is completed.
- infrared light is projected toward the object OBJ from the projection module 721 to form the infrared spot SPT on the object, as shown in FIG. 25.
- the image sensing optical system forms the image of the infrared spot SPT on the main image sensing device 711 via the infrared passing filter 707 , and the distance to the object OBJ is detected based upon the amount of shift of the spot image from the reference position.
- (AD 15 ) An active-triangulation-type rangefinding device that projects infrared light can be provided. This makes it possible to sense distance accurately even in a dark field.
- (AD 16 ) Since an infrared spot image is obtained by an image sensing optical system having a large aperture, it is possible to measure distance even to a distant object.
- the image from the main image sensing device 111 is displayed on the display unit 104 , such as a liquid crystal monitor, as is when rangefinding is performed.
- a liquid crystal monitor provides a display of low resolution and, though it makes it possible to roughly ascertain the focused state, accurate verification of focusing is difficult.
- a second image of an object is displayed superimposed on a first image of the object after being shifted by an amount proportional to the amount of defocusing. In other words, the eighth embodiment provides a finder of double-image coincidence type.
- FIGS. 32 through 35 are diagrams relating to the eighth embodiment.
- FIG. 32 is a block diagram showing the structure of a camera body 801 used in the eighth embodiment.
- the components are the same as those of the sixth embodiment, the only difference being the manner of control at the time of rangefinding and the manner in which an image is displayed.
- the reference numerals 831 and 804 of the microcomputer and display unit, respectively, are different from those of the sixth embodiment. All other components are the same as those of the sixth embodiment, operate in the same fashion, and need not be described again.
- FIG. 33 corresponds to FIG. 23 of the sixth embodiment and illustrates a first object image IM 31 0 for imaging formed in the computation memory and a second object image IM 32 DEF within the rangefinding module 621 .
- computation is performed to make the size of the first image IM 1 of the object conform to the size of the second image of the object.
- these images are used in presenting a display. The second image, therefore, is made to conform to the first image, which is for picture-taking purposes, and it is so arranged that the sizes and limits of the images displayed on the display unit 804 coincide with the imaging area.
- An image IM 31 L from the main image sensing device 111 is displayed over the entire display area.
- a rectangular area AR centered on the display area is a twin-image display area in which an image IM 32 L, which is obtained by extracting part of the image from the subordinate image sensing device 624 , is displayed in superposition on the image IM 31 L.
- the images IM 31 L and IM 32 L are displayed in a form offset from each other by an amount DELTA calculated in accordance with Equation (EQ.3), where:
- V DEF represents a quantity relating to the distance to the object OBJ
- D FOCUS a quantity relating to the amount of feed of the focusing lens 152
- K a coefficient for improving visibility by enlarging the display offset quantity
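Equation (EQ.3) is not reproduced in this excerpt; the form DELTA = K × (V DEF − D FOCUS) below is an assumption consistent with the variable definitions above (the residual defocus, enlarged by K for visibility). The 1-D superposition is likewise an illustration of the twin-image area AR, not the patent's display pipeline:

```python
def display_offset(v_def, d_focus, k):
    # Assumed form of (EQ.3): residual defocus enlarged by K. At
    # exact focus the lens feed cancels the measured spacing and
    # DELTA is zero, so the twin images coincide.
    return k * (v_def - d_focus)

def superpose(im31, im32, delta, start, length):
    # Overlay the shifted second image IM32L on the first image
    # IM31L inside the twin-image area AR (1-D average blend,
    # purely for illustration).
    out = list(im31)
    for i in range(start, start + length):
        j = i + delta
        if 0 <= j < len(im32):
            out[i] = (im31[i] + im32[j]) / 2
    return out

print(display_offset(6, 5, 3))  # 3
print(display_offset(5, 5, 3))  # 0 -> twin images coincide
```

A K greater than 1 exaggerates small focusing errors on the low-resolution monitor, which is the stated purpose of the coefficient.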
- FIG. 35 is a flowchart illustrating the flow of control by the microcomputer 831 in a case where focus detection, focusing and photography are carried out in the camera body 801 of the eighth embodiment.
- step S 701 When the main switch 120 of the camera body 801 is turned on, the microcomputer 831 is activated from the sleep state and control proceeds from step S 701 to step S 702 , at which the states of the switches 121 - 124 inside the camera body 801 are sensed.
- step S 703 The state of the photography preparation switch 121 (SW 1 ), which is turned on by pressing the release button through its first stroke length, is sensed at step S 703 . Control returns to step S 702 when the switch 121 is off and proceeds to step S 704 when the switch 121 is on.
- step S 704 the main image sensing device 111 is activated to acquire an image signal.
- Processing of the image signal acquired at step S 704 is executed at step S 705 . More specifically, processing such as A/D conversion of the image signal, white balance adjustment and gamma correction is executed.
- step S 706 Object brightness information is calculated at step S 706 from the image signal processed at step S 705 . Further, in accordance with a predetermined exposure control program, a control value for stopping down the stop 155 and the exposure time (charge storage time) of the image sensing device 111 are calculated.
- step S 707 the setting of the AF mode switch 123 is discriminated at step S 707 to determine if the autofocus (AF) mode is in effect. Control jumps to step S 722 if the mode is not the AF mode and proceeds to step S 711 if the mode is the AF mode.
- the microcomputer 831 senses the state of the focus encoder 163 at step S 711 and senses the state of the zoom encoder 165 at step S 712 to judge the current optical status of the lens.
- a coefficient for making the size of the first image IM 1 of the object equal to that of the second image of the object, namely an image magnification correction coefficient, is read out of the ROM of microcomputer 831 at step S 713 in the manner described above in connection with FIGS. 22A, 22B. Coefficients are stored in the ROM as matrix data corresponding to the states of the focus encoder 163 and zoom encoder 165 .
- a position offset correction quantity V 0 is read out of the ROM at step S 714 in the same manner as the image magnification correction coefficient.
- the subordinate image sensing device 624 is activated at step S 715 to obtain an image signal.
- Processing of the image signal acquired at step S 715 is executed at step S 716 . More specifically, processing such as A/D conversion of the image signal, white balance adjustment and gamma correction is executed.
- step S 717 Image size is corrected at step S 717 by multiplying the image signal of the second image IM 2 acquired at step S 715 by the reciprocal of the image magnification correction coefficient read out at step S 713 .
- step S 718 the distance to the object is calculated in accordance with Equations (EQ.1), (EQ.2) using the second image IM 2 that was subjected to the image magnification correction at step S 717 , the first image IM 1 obtained at step S 705 and the position offset correction quantity V 0 obtained at step S 714 .
- the focusing lens group 152 is driven at step S 719 based upon the result of the above-described calculation to bring the first image IM 1 into focus.
- At step S 720 , the offset DELTA between the two images for display shown in FIG. 34 is calculated in accordance with Equation (EQ.3) and processing for superposing the two images is executed.
- At step S 721 , the image signal obtained at step S 720 , namely the split-image coincidence image signal, is displayed on the display unit 804 .
- The state of the photography switch 122 (SW 2 ) is discriminated at step S 722 . If the switch 122 is off, control returns to step S 702 so that the processing of steps S 702 -S 721 is executed again. If the photography switch 122 is found to be on at step S 722 , it is judged that release has been performed and control shifts to step S 731 .
- Steps S 731 -S 736 are for an image sensing operation identical with that of steps S 521 -S 526 of FIG. 24 according to the sixth embodiment.
- control returns to step S 702 .
- the camera performs the rangefinding calculation and carries out autofocusing by driving the focusing lens in a manner similar to that of the sixth embodiment.
- the state of focusing of the object is displayed as the amount of shift between two images on the double-image coincidence display device. Continuous focusing is performed by repeatedly executing this operation and the photographer is notified of the results of focusing in the form of the amount of offset between the twin images.
- When the photography switch 122 is turned on, the image of the object projected upon the main image sensing device is recorded in the image memory and the image of the picture taken is displayed on the display unit 804 .
- the eighth embodiment provides the following advantage in addition to the advantages (AD 10 )-(AD 14 ) according to the sixth embodiment.
- a photoelectric coincidence finder is realized using the rangefinding device of the sixth embodiment.
- a ninth embodiment described below illustrates a case where the photoelectric coincidence finder is realized using the conventional passive- or active-type rangefinding device.
- FIGS. 36 through 39 are diagrams relating to the ninth embodiment.
- FIG. 36 is a diagram showing the construction of a camera body 901 according to the ninth embodiment. Components other than those described below operate in the same manner as set forth in connection with the sixth embodiment of FIG. 20. Only the components that differ will be described.
- the camera body 901 has various functional components for forming the image of an object OBJ, detecting focus and sensing the image.
- a rangefinding module 921 has two light-receiving lenses 922 of the same power spaced apart by a predetermined baselength BL for forming images IM 2 , IM 3 of the object OBJ whose distance is to be measured, an infrared blocking filter 923 for blocking infrared light and passing only the visible portion of the light flux that has passed through the light-receiving lenses 922 , and a subordinate image sensing device 924 .
- the rangefinding optical system which includes the light-receiving lenses 922 and the infrared blocking filter 923 , forms a second image IM 2 of the object OBJ and a third image IM 3 of the object OBJ on the subordinate image sensing device 924 , described later.
- the subordinate image sensing device 924 such as a CCD, is a two-dimensional photoelectric sensor for photoelectrically converting the second and third images IM 2 , IM 3 of the object.
- the distance to the object OBJ can be detected from the spacing between images IM 2 , IM 3 and the baselength BL using a prescribed calculation formula.
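The "prescribed calculation formula" is not reproduced in this passage; a common thin-lens approximation for passive triangulation is distance = baselength × focal_length / disparity, where the disparity is the image spacing in excess of its at-infinity value. The sketch below assumes that form, with hypothetical numeric values:

```python
def triangulate_distance(baselength_mm, focal_length_mm, disparity_mm):
    """Estimate object distance by passive triangulation.

    For an object at infinity the two images IM2, IM3 sit exactly one
    baselength-scaled spacing apart; as the object approaches, the
    spacing grows.  Under the thin-lens approximation assumed here:

        distance = baselength * focal_length / disparity
    """
    if disparity_mm <= 0:
        raise ValueError("disparity must be positive for a finite distance")
    return baselength_mm * focal_length_mm / disparity_mm

# Hypothetical example: 50 mm baselength, 10 mm focal length,
# 0.25 mm disparity -> 2000 mm (2 m) to the object.
print(triangulate_distance(50.0, 10.0, 0.25))  # -> 2000.0
```

Note the reciprocal relationship: halving the disparity doubles the estimated distance, which is why rangefinding accuracy degrades for distant objects when the baselength BL is small.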
- a microcomputer 931 performs rangefinding and presents a display on a coincidence finder in accordance with a flowchart described below.
- FIG. 37 is a diagram showing the subordinate image sensing device 924 and the disposition of two images formed on the image sensing device.
- a light-receiving portion 925 of the subordinate image sensing device 924 comprises m 2 × n 2 light-receiving pixels and a charge transfer portion (vertical transfer CCD) for transferring electric charge that has accumulated in these pixels.
- a horizontal transfer CCD 926 stores electric charge transferred in the direction of arrow TRV by the vertical transfer CCD in the light-receiving portion 925 , then transfers the charge in the direction of arrow TRH and outputs an image signal from a signal output portion 927 to the image sensor driver 134 .
- FIG. 38 is a diagram illustrating the form of the display presented on a display unit 904 .
- An image IM 1 L 0 from the main image sensing device 111 is displayed over the entire display area.
- a rectangular area AR centered on the display area is a twin-image display area.
- An image IM 1 L DEF , obtained by extracting the central portion of a copy of the image IM 1 L 0 captured by the image sensing device 111 , is displayed in the area AR in superposition on the image IM 1 L 0 .
- the images IM 1 L 0 and IM 1 L DEF are displayed in a form offset from each other by an amount DELTA calculated in accordance with the following equation:
- DELTA = K × (V DEF − D FOCUS ) . . . (EQ.6)
- where V DEF represents a quantity relating to the distance to the object OBJ, D FOCUS a quantity relating to the amount of feed of the focusing lens 152 , and K a coefficient for improving visibility by enlarging the display offset quantity.
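Assuming the display offset takes the form DELTA = K × (V DEF − D FOCUS) — the exact expression of (EQ.6) is not legible in this text, so that form is an assumption here — the twin-image coincidence behaviour can be sketched as follows; the numeric values are hypothetical:

```python
def display_offset(v_def, d_focus, k):
    """Offset DELTA between the twin finder images, assumed to be
    K * (V_DEF - D_FOCUS): zero when the lens feed matches the
    measured object distance (in focus), nonzero and magnified by
    K (for visibility) otherwise."""
    return k * (v_def - d_focus)

# In focus: the quantities cancel and the twin images coincide.
print(display_offset(1.2, 1.2, 8.0))  # -> 0.0
# Out of focus: a visible, K-times-enlarged offset appears.
print(display_offset(1.5, 1.2, 8.0))
```

This mirrors a mechanical coincidence rangefinder: the photographer drives the focusing lens (changing D FOCUS) until the displayed offset collapses to zero.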
- the construction of the rangefinding module 921 and the images superposed on each other in the twin-image display area of the display unit 904 differ from those of the eighth embodiment.
- the offset quantity DELTA corresponds to the amount of focal shift of the image sensing system with respect to the object OBJ, and a finder of twin-image coincidence type similar to that of the eighth embodiment can be implemented.
- FIG. 39 is a flowchart illustrating the flow of control by the microcomputer 931 in a case where focus detection, focusing and photography are carried out in the camera body 901 of the ninth embodiment.
- At step S 801 , the states of the switches 121 - 124 inside the camera body 901 are sensed.
- The state of the photography preparation switch 121 (SW 1 ), which is turned on by pressing the release button through its first stroke length, is sensed at step S 803 . Control returns to step S 802 when the switch 121 is off and proceeds to step S 804 when the switch 121 is on.
- At step S 804 , the main image sensing device 111 is activated to acquire an image signal.
- Processing of the image signal acquired at step S 804 is executed at step S 805 . More specifically, processing such as A/D conversion of the image signal, white balance adjustment and gamma correction is executed.
- At step S 806 , object brightness information is calculated from the image signal processed at step S 805 . Further, in accordance with a predetermined exposure control program, a control value for stopping down the stop 155 and the exposure time (charge storage time) of the image sensing device 111 are calculated.
- At step S 807 , the setting of the AF mode switch 123 is discriminated to determine whether the autofocus (AF) mode is in effect. Control jumps to step S 820 if the mode is not the AF mode and proceeds to step S 811 if it is.
- the microcomputer 931 senses the state of the focus encoder 163 at step S 811 and senses the state of the zoom encoder 165 at step S 812 to judge the current optical status of the lens.
- the subordinate image sensing device 924 is activated at step S 813 to obtain image signals for rangefinding purposes.
- the image signals acquired at step S 813 are subjected to processing such as A/D conversion at step S 814 .
- At step S 815 , the position offset quantity between the digital image signals of the images IM 2 and IM 3 obtained at step S 814 is calculated, and so is the distance to the object.
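The position offset between the digital signals of IM 2 and IM 3 is typically found by correlation. As an illustrative sketch (the patent does not specify the matching criterion), the following minimises the sum of absolute differences (SAD) over candidate shifts of two hypothetical one-dimensional signal rows:

```python
def best_shift(signal_a, signal_b, max_shift):
    """Return the integer shift of signal_b relative to signal_a that
    minimises the mean absolute difference over the overlap region."""
    best, best_sad = 0, float("inf")
    n = len(signal_a)
    for s in range(-max_shift, max_shift + 1):
        pairs = [(signal_a[i], signal_b[i + s])
                 for i in range(n) if 0 <= i + s < n]
        sad = sum(abs(a - b) for a, b in pairs) / len(pairs)
        if sad < best_sad:
            best, best_sad = s, sad
    return best

# Toy rows: IM3 is IM2 shifted right by 2 pixels, so the offset is 2.
im2 = [0, 0, 5, 9, 5, 0, 0, 0]
im3 = [0, 0, 0, 0, 5, 9, 5, 0]
print(best_shift(im2, im3, 3))  # -> 2
```

The recovered shift, together with the baselength BL, feeds the triangulation formula to give the object distance.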
- the focusing lens group 152 is driven at step S 816 based upon the result of the above-described calculation to bring the first image IM 1 into focus.
- the offset quantity DELTA between the two images for display purposes shown in FIG. 38 is calculated in accordance with Equation (EQ.6) at step S 817 .
- At step S 818 , processing for superposing the two images in the manner shown in FIG. 38 is executed.
- At step S 819 , the image signal obtained at step S 818 , namely the split-image coincidence image signal, is displayed on the display unit 904 .
- The state of the photography switch 122 (SW 2 ) is discriminated at step S 820 . If the switch 122 is off, control returns to step S 802 so that the processing of steps S 802 -S 819 , namely automatic focusing and display of images on the display unit, is executed again. If the photography switch 122 is found to be on at step S 820 , it is judged that release has been performed and control shifts to step S 831 .
- Steps S 831 -S 836 are for an image sensing operation identical with that of steps S 731 -S 736 of FIG. 35 according to the eighth embodiment.
- control returns to step S 802 .
- the camera performs the rangefinding calculation using the image signals obtained from the rangefinding module 921 and carries out automatic focusing by driving the focusing lens 152 based upon the results of rangefinding calculation.
- the state of focal shift of the image on the image sensing device 111 is calculated.
- the image of the object obtained from the image sensing device 111 and an image obtained by extracting the central portion of an image which is a copy of the first-mentioned image are superposed and displayed on the display unit 904 with an offset between them that depends upon the amount of focal shift.
- Continuous focusing is performed by repeatedly executing this operation and the photographer is notified of the results of focusing in the form of the amount of offset between the twin images.
- When the photography switch 122 is turned on, the image of the object projected upon the main image sensing device is recorded in the image memory and the image of the picture taken is displayed on the display unit 904 .
- the ninth embodiment provides the following advantage in addition to the advantages (AD 10 )-(AD 14 ) according to the sixth embodiment.
- (AD 20 ) The state of focusing is displayed as a coincidence finder image even in an image sensing apparatus having the conventional rangefinding device and an electronic finder such as a liquid crystal monitor. As a result, the status of focus of the image of the object is made much more discernible through a simple, inexpensive arrangement.
- the rangefinding device in the ninth embodiment uses a passive triangulation rangefinder according to the prior art.
- However, it is also possible to employ a conventional active triangulation rangefinding device, or a so-called sonar-type rangefinding device which measures distance based upon the length of time required to receive reflected ultrasonic waves projected toward an object.
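The sonar-type measurement mentioned above reduces to a time-of-flight calculation: the pulse travels out to the object and back, so the one-way distance is half the path length. A minimal Python sketch, with an assumed speed of sound in air:

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # in air at roughly 20 degrees C (assumed)

def sonar_distance(echo_time_s, speed=SPEED_OF_SOUND_M_PER_S):
    """Distance to the object from the round-trip time of an
    ultrasonic pulse: halve the total path travelled."""
    return speed * echo_time_s / 2.0

# A 10 ms round trip corresponds to roughly 1.7 m.
print(sonar_distance(0.010))
```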
Abstract
Disclosed is a camera having a focus detection module 130 inserted into or withdrawn from an optical path between an image sensing optical system (152-155) and a CCD image sensing device (111). An image pick-up signal from the CCD image sensing device (111) is used as ordinary image data for recording when the module (130) is withdrawn from the optical path and as image data for rangefinding when the module (130) is inserted into the optical path. The focus detection module (130) is internally provided with first through fourth mirrors (131, 132, 133, 134), a field lens (135) provided between the second and third mirrors and secondary image forming lenses (137) provided between the third and fourth mirrors, whereby two secondary images are formed on the CCD image sensing device (111). The two secondary images are used for rangefinding or focusing.
Description
- This invention relates to an image sensing apparatus having an automatic focusing function for focusing the image of a subject.
- So-called digital still cameras photoelectrically convert the image of an object, which has been formed by an image sensing optical system, as a still image using an image sensing device, and record the converted image in a memory or the like. Focus detection devices for automatic focusing used in such digital still cameras usually rely upon one of the following four methods:
- (1) TTL (Through The Lens) secondary image forming phase-difference detection: Optical images that have been formed by passage through different pupil areas of an image sensing optical system are formed again as a pair of secondary images on a focus detection lens via a secondary image forming optical system and the state of focus of the image sensing optical system is detected from the spacing between the two secondary images.
- (2) Passive triangulation: Two images of an object are formed on a focus detection sensor by two optical systems spaced apart by a predetermined baselength, and the absolute distance to the object is sensed from the spacing between the two images formed.
- (3) Active triangulation: A rangefinding pattern formed on an object by a light projection system is received by light-receiving systems spaced apart by a predetermined baselength, and absolute distance to the object is sensed based upon outputs from the light-receiving systems.
- (4) Hill-climbing sharpness detection: Part of an image sensing optical system, or the image sensing device, is oscillated minutely along the direction of the optic axis and the state of focus of the image sensing optical system is detected from the degree of fluctuation of high-frequency components (which are in synchronization with the oscillation) of the object image formed on the image sensing device.
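The hill-climbing method in item (4) can be sketched as a search for the lens position maximising a high-frequency sharpness measure. This Python sketch uses a toy lens model and a sum-of-squared-differences contrast metric; both are illustrative assumptions, not the patent's circuitry:

```python
def sharpness(pixels):
    """High-frequency content proxy: sum of squared adjacent differences."""
    return sum((b - a) ** 2 for a, b in zip(pixels, pixels[1:]))

def hill_climb_focus(image_at, positions):
    """Return the lens position whose image maximises sharpness.
    image_at(p) yields the pixel row formed at lens position p."""
    return max(positions, key=lambda p: sharpness(image_at(p)))

# Toy lens model: the stripe pattern is sharpest at position 3 and
# progressively blurred (wider stripes, fewer edges) away from it.
def image_at(p):
    blur = abs(p - 3) + 1
    return [(i // blur) % 2 * 10 for i in range(16)]

print(hill_climb_focus(image_at, range(7)))  # -> 3
```

The drawback noted in the text is visible in this formulation: far from focus the sharpness surface is nearly flat, so the search must step through many positions rather than jump directly to the in-focus point.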
- The following finders are used as monitors for verifying the photographic area of the above-mentioned digital still cameras:
- (a) an optical TTL finder, which allows the photographer to view the image of the object formed by the image sensing optical system;
- (b) an optical rangefinder, which allows the photographer to view an image formed by a finder optical system that is different from the image sensing optical system; and
- (c) a photoelectric finder, whereby an output obtained by photoelectrically converting the image of an object is displayed on a monitor such as a liquid crystal display.
- The prior art described has a number of shortcomings, which will now be set forth.
- The secondary image forming phase-difference detection method and the passive triangulation method require the use of a photoelectric converting sensor for focus detection in addition to the image sensing device for acquisition of the photographic image. This raises the cost of the focus detection mechanism and increases the size of the image sensing device.
- With the passive triangulation method and active triangulation method, the focal length and baselength of the rangefinding optical system cannot be made very large. As a result, it is required that the dimensional precision of the component parts be very high in order to assure measurement accuracy.
- The active triangulation method, besides having the drawback set forth above, requires a special-purpose light receiving device for receiving projected light. This raises the cost of the focus detection mechanism.
- With the hill-climbing sharpness detection method, the in-focus position cannot be detected instantaneously when the object is greatly out of focus. Though this is not a major obstacle in a movie camera, it does make a digital still camera difficult to use and can result in lost photo opportunities.
- The conventional finders have the following drawbacks:
- The optical TTL finder requires a mechanism such as a quick-return mirror or half-mirror for separating and switching between a photographic light flux and a finder light flux. This results in an apparatus of large size.
- The optical rangefinder uses a double-image coincidence mechanism in order to display the state of focusing. The result is a complex, costly structure.
- The photoelectric finder displays images at a low resolution and makes it difficult to confirm the state of focusing accurately.
- Accordingly, an object of the present invention is to provide an image sensing apparatus that is capable of performing autofocusing highly accurately through a simple structure.
- According to the present invention, the foregoing object is attained by providing an image sensing apparatus comprising photoelectric conversion means for photoelectrically converting an image of an object obtained through image forming optics; optical path changeover means provided on an optical path between the image forming optics and the photoelectric conversion means and movable between first and second positions for changing over the optical path in such a manner that a first image of the object is formed on the photoelectric conversion means when the optical path changeover means is at the first position and a second image of the object is formed on the photoelectric conversion means when the optical path changeover means is at the second position; focus detection means for detecting state of focus of the image forming optics using the first image when the optical path changeover means is at the first position; and image sensing means for picking up the second image using the photoelectric conversion means when the optical path changeover means is at the second position.
- In accordance with this image sensing apparatus, focus detection and image pick-up can be performed by a single photoelectric conversion means. As a result, it is unnecessary to separately provide costly photoelectric conversion means for focus detection, thus making it possible to provide a small-size, inexpensive image sensing apparatus. In addition, a low-resolution image for focusing and a high-quality image for photography can be obtained using the same image forming optical system. Specifically, the apparatus utilizes an image forming optical system for image pick-up and the photoelectric conversion means thereof effectively to make possible rangefinding by TTL secondary image phase-difference detection or passive triangulation. As a result, it is unnecessary to separately provide costly photoelectric conversion means for focus detection, thus making it possible to provide a small-size, inexpensive image sensing apparatus capable of highly accurate rangefinding through a simple structure.
- According to a preferred aspect of the present invention, the optical path changeover means of the image sensing apparatus has focus detection optics for forming two secondary images as the second image of the object from a pair of light fluxes obtained by passing the image of the object through different pupil areas; the apparatus further comprising means for detecting a phase difference between the two secondary images, which have been formed on the photoelectric conversion means, when the optical path changeover means is at the first position.
- As a result, the state of focus of the image forming optical system is detected by TTL secondary image forming phase-difference detection. This makes it possible to perform accurate detection of focusing in a short period of time even in a case where the object is greatly out of focus.
- According to a preferred aspect of the present invention, the optical path changeover means of the image sensing apparatus includes: a first mirror for deflecting an image forming light flux in a direction different from an image forming optic axis connecting the image forming optics and the photoelectric conversion means; and a second mirror for returning the deflected light flux to the image forming optic axis.
- As a result, a focus detection optical system having a prescribed optical path can be accommodated in a small space, thus making it possible to reduce the size of the image sensing apparatus.
- According to a preferred aspect of the present invention, the optical path changeover means of the image sensing apparatus includes a beam splitter for splitting an image forming light flux into light fluxes in first and second directions at a predetermined ratio of light quantities; the apparatus having finder means for viewing the image of the object along the second direction.
- The photographic area of the field and the state of focus of the object can be confirmed visually in accurate fashion by a TTL optical finder. This makes it possible to prevent failures when taking pictures.
- According to a preferred aspect of the present invention, the optical path changeover means of the image sensing apparatus has lens means for making image forming power of the image forming optics different at the first and second positions.
- As a result, a focus detection optical system having a prescribed optical path can be accommodated in a small space, thus making it possible to reduce the size of the image sensing apparatus and to obtain a wide range of focus detection.
- According to a preferred aspect of the present invention, the image sensing apparatus further comprises release operating means, wherein the focus detection means is activated in response to a first operation of the release operating means, and the optical path changeover means is switched from the first position to the second position in response to a second operation of the release operating means.
- As a result, the transition from a focus detection operation to an image pick-up operation can be achieved quickly, thereby making it possible to perform focusing and image pick-up operations in a short period of time and to prevent the loss of photo opportunities.
- According to a preferred aspect of the present invention, the image sensing apparatus further comprises display means for displaying the first image of the object when focus detection is performed by the focus detection means and the second image of the object when pick-up is performed by the image sensing means.
- As a result, the state of focus of an object undergoing focus detection and the image of the object at the time of image pick-up can be confirmed visually in the form of an electronic image even if there is no optical finder provided. This makes it possible to prevent failures when taking pictures.
- According to a preferred aspect of the present invention, the image sensing apparatus further comprises focusing control means for performing focusing based upon results of focus detection by the focus detection means.
- As a result, TTL focus detection is performed with a coarse image projected before image pick-up, and automatic focusing is carried out highly accurately in a short period based upon the result, thereby making it possible to focus the image of the object. A high-definition, focused image can subsequently be acquired.
- According to a preferred aspect of the present invention, the image sensing apparatus further comprises display means for selectively displaying the first and second images of the object; wherein the optical path changeover means has image magnification changing means for forming the first image on the photoelectric conversion means at a first magnification at the first position and forming the second image on the photoelectric conversion means at a second magnification at the second position, whereby sizes of the first and second images displayed on the display means are made substantially the same.
- In accordance with this arrangement, the normal image of an object and the reduced image of the object obtained by projection can be displayed with their sizes equalized in regard to the same subject imaged at different optical characteristics. This improves the ability to visually confirm an image of reduced size.
- According to a preferred aspect of the present invention, the optical path changeover means of the image sensing apparatus forms a plurality of images of the object on the photoelectric conversion means at the first position and forms a single image of the object on the photoelectric conversion means at the second position.
- In accordance with this arrangement, both an image for focus detection and an image for photography can be obtained through a simple structure by a single image sensing means.
- According to a preferred aspect of the present invention, the optical path changeover means of the image sensing apparatus forms the first image of the object at a location displaced from the center of a light-receiving portion of the photoelectric conversion means at the first position and forms the second image of the object at the center of the light-receiving portion of the photoelectric conversion means at the second position.
- In accordance with this arrangement, only a small image signal that has been formed on part of the light-receiving area of photoelectric conversion means is read out in a short period of time to acquire the first image of the object, and the entire image signal of the light-receiving area of the photoelectric conversion means is read out to acquire the second image of the subject.
- According to a preferred aspect of the present invention, the second optical image forming means of the image sensing apparatus has an image forming power different from that of the first optical image forming means.
- As a result, rangefinding based upon passive triangulation can be carried out.
- According to a preferred aspect of the present invention, the image sensing apparatus further comprises image signal recording means for recording the output of the first photoelectric conversion means.
- As a result, an image obtained from an image forming optical system for photography and an image obtained from the photoelectric conversion means thereof can be recorded and preserved.
- According to a preferred aspect of the present invention, the first optical image forming means of the image sensing apparatus includes a zoom lens and the rangefinding means has image magnification correction means for correcting a fluctuation in image magnification that accompanies a zooming operation of the zoom lens.
- This arrangement is such that when the image forming optical means for image pick-up is used for rangefinding, a parameter correction conforming to power fluctuation is carried out to perform a rangefinding calculation. This makes it possible to perform accurate rangefinding at all times even when power fluctuates.
- According to a preferred aspect of the present invention, the image sensing apparatus further comprises display means for displaying the first image.
- In accordance with this arrangement, the image of a subject for image pick-up is displayed for monitoring. As a result, the state of subject focus can be checked and it is possible to prevent the taking of a photograph that is out of focus.
- An image sensing apparatus according to a preferred aspect of the present invention comprises projection means for projecting a rangefinding light flux toward an object to form a rangefinding pattern on the object; optical image forming means, which is spaced away from the projection means by a predetermined baselength, for selectively forming the image of the rangefinding pattern and the image of the object; photoelectric conversion means for photoelectrically converting the image of the rangefinding pattern and the image of the object; and rangefinding means for sensing distance between the optical image forming means and the object based upon an output from the photoelectric conversion means when the image of the rangefinding pattern has been received by the photoelectric conversion means.
- In accordance with this arrangement, both a projection pattern for rangefinding in active triangulation and an image of the object can be acquired by a single image sensing system, as a result of which the apparatus can be reduced in size and lowered in cost.
- According to a preferred aspect of the present invention, the optical image forming means of the image sensing apparatus includes a zoom lens and the rangefinding means has image magnification correction means for correcting a fluctuation in image magnification that accompanies a zooming operation of the zoom lens.
- In accordance with this arrangement, when the optical image forming means for image pick-up is used for rangefinding, a parameter correction conforming to power fluctuation is carried out to perform a rangefinding calculation. This makes it possible to perform accurate rangefinding at all times even when power fluctuates.
- According to a preferred aspect of the present invention, the image sensing apparatus further comprises image signal recording means for recording the output of the photoelectric conversion means when the image of the subject has been received by the photoelectric conversion means.
- As a result, an image obtained from an image forming optical system for image pick-up and an image obtained from the photoelectric conversion means thereof can be recorded and preserved.
- According to a preferred aspect of the present invention, the image sensing apparatus further comprises wavelength region selecting means, which is interposed between the optical image forming means and the photoelectric conversion means, for passing a first wavelength region when the image of the rangefinding pattern is photoelectrically converted and passing a second wavelength region when the image of the object is photoelectrically converted.
- In accordance with this arrangement, a wavelength selection suited to acquisition of a pattern image for rangefinding is performed at the time of rangefinding and a wavelength selection suited to acquisition of the image of the object at the time of image pick-up. As a result, highly accurate rangefinding can be performed and it is possible to obtain a highly precise image that is free of unnecessary light rays.
- According to a preferred aspect of the present invention, the image sensing apparatus further comprises focusing means for focusing the first optical image forming means based upon an output of the rangefinding means; focal shift discrimination means for discriminating state of focus of the image of the object based upon outputs from the rangefinding means and the focusing means; image signal combining means for combining outputs from the first and second photoelectric conversion means; display means for displaying an output image from the image signal combining means; and combining control means for changing operation of the image signal combining means based upon an output from the focal shift discrimination means.
- In accordance with this arrangement, the extent to which the image of an object is out of focus can be checked visually and clearly by images combined and displayed. This makes it possible to prevent the taking of a photograph that is out of focus.
- According to a preferred aspect of the present invention, the combining control means of the image sensing apparatus varies relative amount of offset between display positions, on the display means, of outputs from the first and second photoelectric conversion means in dependence upon a focal shift signal output by the focal shift discriminating means.
- In accordance with this arrangement, the amount of focal shift of the image of an object can be checked visually and clearly from the amount of relative offset between two images displayed in superposition. This makes it possible to prevent the taking of a photograph that is out of focus.
- An image sensing apparatus according to a preferred aspect of the present invention comprises optical image forming means for capturing a light flux from an object and forming an image of the object; photoelectric conversion means for photoelectrically converting the image of the object; rangefinding means for sensing distance between the optical image forming means and the object; focusing means for focusing the image forming optical means based upon an output from the rangefinding means; display means for displaying the image of the object from the photoelectric conversion means; focal shift discrimination means for discriminating state of focus of the image of the object based upon outputs from the rangefinding means and the focusing means; and display control means for varying form of display of the image of the object on the display means based upon an output from the focal shift discrimination means.
- In accordance with this arrangement, the amount of focal shift of the image of an object can be checked visually and clearly based upon images combined and displayed. This makes it possible to prevent the taking of a photograph that is out of focus.
- According to a preferred aspect of the present invention, the display means of the image sensing apparatus has first and second display areas, and the display control means varies position of a displayed image in the second display area relative to a displayed image in the first display area in dependence upon a focal shift signal from the focal shift discrimination means.
- In accordance with this arrangement, the amount of focal shift of the image of an object can be checked visually and clearly from the amount of relative offset between two images displayed in superposition. This makes it possible to prevent the taking of a photograph that is out of focus.
- According to a preferred aspect of the present invention, the focal shift discrimination means of the image sensing apparatus detects amount of focal shift from outputs from the rangefinding means and the focusing means.
- As a result, a finder display or the like can be presented using information relating to the amount of focal shift sensed. This makes it possible to prevent the taking of a photograph that is out of focus.
- Other features and advantages of the present invention will be apparent from the following description taken in conjunction with the accompanying drawings, in which like reference characters designate the same or similar parts throughout the figures thereof.
- FIG. 1 is a block diagram illustrating the construction of an image sensing apparatus according to a first embodiment of the present invention, this diagram showing the apparatus at the time of focus detection;
- FIG. 2 is a diagram showing formation of images of an object when the object is in focus at the time of focus detection according to the first embodiment;
- FIG. 3 is a diagram showing formation of the images of an object when the object is not in focus at the time of focus detection according to the first embodiment;
- FIG. 4 is a diagram showing the image sensing apparatus at the time of image pick-up according to the first embodiment;
- FIG. 5 is a diagram showing formation of the image of an object at the time of image pick-up according to the first embodiment;
- FIG. 6 is a flowchart showing a procedure for controlling a camera according to the first embodiment;
- FIG. 7 is a flowchart showing a procedure for controlling a lens according to the first embodiment;
- FIG. 8 is a diagram illustrating part of a focus detection optical system according to a second embodiment of the present invention;
- FIG. 9 is a diagram showing formation of the images of an object when the object is in focus at the time of focus detection according to the second embodiment;
- FIG. 10 is a diagram illustrating part of a focus detection optical system according to a third embodiment of the present invention;
- FIG. 11 is a diagram showing formation of the images of an object when the object is in focus at the time of focus detection according to the third embodiment;
- FIG. 12 is a block diagram illustrating the construction of an image sensing apparatus according to a fourth embodiment of the present invention, this diagram showing the apparatus at the time of focus detection;
- FIG. 13 is a diagram showing formation of the images of an object when the object is in focus at the time of focus detection according to the fourth embodiment;
- FIG. 14 is a diagram useful in describing the state of a display on a display unit at the time of focus detection according to the fourth embodiment;
- FIG. 15 is a diagram showing the image sensing apparatus at the time of image pick-up according to the fourth embodiment;
- FIG. 16 is a diagram showing formation of the image of an object at the time of image pick-up according to the fourth embodiment;
- FIG. 17 is a flowchart showing a procedure for controlling a camera according to the fourth embodiment;
- FIG. 18 is a block diagram illustrating the construction of an image sensing apparatus according to a fifth embodiment of the present invention, this diagram showing the apparatus at the time of focus detection;
- FIG. 19 is a diagram showing the construction of an image sensing apparatus at the time of image pick-up according to the fifth embodiment;
- FIG. 20 is a block diagram illustrating the construction of an image sensing apparatus according to a sixth embodiment of the present invention;
- FIGS. 21A, 21B are diagrams useful in describing the state of image formation when rangefinding is performed according to the sixth embodiment;
- FIGS. 22A, 22B are diagrams useful in describing the principle of image magnification correction according to the sixth embodiment;
- FIG. 23 is a diagram useful in describing the concept of an image signal when a rangefinding calculation is performed according to the sixth embodiment;
- FIG. 24 is a flowchart showing a procedure for controlling an image sensing apparatus according to the sixth embodiment;
- FIG. 25 is a diagram showing the construction of an image sensing apparatus at the time of rangefinding according to a seventh embodiment of the present invention;
- FIG. 26 is a diagram useful in describing the state of formation of a spot image for rangefinding according to the seventh embodiment;
- FIG. 27 is a diagram useful in describing the concept of an image signal when a rangefinding calculation is performed according to the seventh embodiment;
- FIG. 28 is a diagram showing the construction of an image sensing apparatus at the time of image pick-up according to the seventh embodiment;
- FIG. 29 is a diagram useful in describing the state of image formation when image pick-up is performed according to the seventh embodiment;
- FIG. 30 is a diagram useful in describing the state of an image display after image pick-up according to the seventh embodiment;
- FIG. 31 is a flowchart showing a procedure for controlling an image sensing apparatus according to the seventh embodiment;
- FIG. 32 is a block diagram illustrating the construction of an image sensing apparatus according to an eighth embodiment of the present invention;
- FIG. 33 is a diagram useful in describing the concept of an image signal when a rangefinding calculation is performed according to the eighth embodiment;
- FIG. 34 is a diagram useful in describing the state of an image display when rangefinding is performed according to the eighth embodiment;
- FIG. 35 is a flowchart showing a procedure for controlling an image sensing apparatus according to the eighth embodiment;
- FIG. 36 is a block diagram illustrating the construction of an image sensing apparatus according to a ninth embodiment of the present invention;
- FIG. 37 is a diagram useful in describing the state of image formation when rangefinding is performed according to the ninth embodiment;
- FIG. 38 is a diagram useful in describing the state of an image display when rangefinding is performed according to the ninth embodiment; and
- FIG. 39 is a flowchart showing a procedure for controlling an image sensing apparatus according to the ninth embodiment.
- Embodiments of the present invention will now be described with reference to the drawings.
- FIGS. 1 through 7 are diagrams relating to a first embodiment of the present invention.
- FIG. 1 is a block diagram showing the construction of an image sensing apparatus according to a first embodiment. This embodiment relates to a so-called single-lens reflex camera in which an interchangeable lens having an image sensing optical system is capable of being removably mounted on a camera body having an image sensing device.
- The camera includes a camera body 101 having a mount (not shown) joining various functional portions for image pick-up and an interchangeable lens 151, described later. An object is indicated at OBJ.
- The camera is internally provided with a single-chip microcomputer 102 having a ROM, a RAM and A/D, D/A conversion functions. In accordance with a camera sequence program stored in the ROM, the microcomputer 102 implements a series of camera operations such as automatic exposure control (AE), autofocus (AF) and image sensing. The microcomputer 102 controls various circuits and lens operation by communicating with peripheral circuitry within the camera body 101 and with the interchangeable lens 151. According to the present invention, the ROM constitutes a storage medium and can be a semiconductor memory, an optical disk, a magneto-optic device or a magnetic medium, etc.
- The mount joining the camera body 101 and the interchangeable lens 151 is provided with four connection terminals. A power supply 103 supplies the camera circuits and actuators with power-supply voltage and supplies the interchangeable lens 151 with power via a line Vcc.
- A line DCL transmits a signal from the microcomputer 102 to a microcomputer 161 (described later) inside the lens. A line DLC transmits a signal from the microcomputer 161 inside the lens to the microcomputer 102 inside the camera body. The camera body 101 controls the interchangeable lens 151 via these two lines. The camera body 101 and interchangeable lens 151 are connected to ground via a line GND.
- The camera body 101 has a display unit 104, such as a liquid crystal panel, having a display function for displaying photographic conditions and a monitor function for monitoring a sensed image.
- A driver 105 drives and controls an image sensing device 111, described later. The driver 105 controls the storage of charge in the image sensing device 111, charge transfer, CDS (Correlated Double Sampling), AGC (Automatic Gain Control), A/D conversion, gamma correction and AWB (Automatic White Balance), etc.
- A memory 106 records and preserves image signal data representing a sensed image and can be a semiconductor memory, magnetic disk or optical disk, etc.
- A terminal 107 for outputting a recorded image to external equipment is connected to a personal computer or printer.
- The image sensing device 111, such as a CCD, is a two-dimensional photoelectric sensor for photoelectrically converting the image of the object formed by an image sensing optical system 152-154.
- The camera body has a main switch 120. When this switch is turned on (closed), the microcomputer 102 allows the execution of a prescribed program relating to preparations for photography, namely exposure metering and focus detection, etc.
switch 121 is for preparing for image pick-up. When this switch is turned on, preparatory photographic operations such as exposure metering, focus detection and focusing are executed. Theswitch 122 is a photography switch. When this switch is turned on, a photographic image that has been formed on theimage sensing device 111 is acquired and recorded in theimage memory 106. - An
AF mode switch 123 is used to select the autofocus mode. Adisplay switch 124 is used to designate a display for monitoring a photographic image. - The image of the object formed by the image sensing optical system is formed again by a
focus detection module 130 using the various optical elements set forth below. Specifically, thefocus detection module 130 includes afirst mirror 131 for fully reflecting the photographic light flux upward in FIG. 1; a semi-transparentsecond mirror 132 for passing about 70% of the fully reflected light flux and reflecting the remaining 30% of the light flux rightward in FIG. 1; athird mirror 133 for fully reflecting the light flux downward in FIG. 1; and afourth mirror 134 for fully reflecting the fully reflected light flux rightward in FIG. 1; afield lens 135 placed in a first predetermined focal plane of the image sensing optical system, with a primary image IM1 of the object OBJ being formed in this predetermined focal plane by the image sensing optical system; afield mask 136 which decides a focus detection area; and a pair of secondaryimage forming lenses 137 for forming the images of the primary image IM1 again. - The entrance pupil of the pair of secondary
image forming lenses 137 and the exit pupil of astop 155, described later, are placed in a projection relationship by thefield lens 135. Consequently, two secondary images IMA and IMB resulting from the light flux that has passed through different pupil areas (the exit pupil of thestop 155 and the entrance pupil of the pair of secondary image forming lenses 137) of the image sensing optical system are formed on theimage sensing device 111. - A
movable mirror unit 138 is capable of moving thefirst mirror 131,fourth mirror 134 and secondaryimage forming lenses 137 in unison to advance and retract the same into and out of the photographic light flux. - A quick-return (QR)
actuator 139 drives themovable mirror unit 138 to advance and retract the same. - A focusing
screen 141 is placed in a second predetermined focal plane that is in a conjugate relationship with the first predetermined focal plane mentioned above. A secondary primary image IM2 resulting from light flux reflected by thefirst mirror 131 and passed by thesecond mirror 132 is formed on the focusingscreen 141. - A
pentagonal prism 142 and aneyepiece 143 construct an optical finder that makes it possible for the photographer to view the secondary primary image IM2. - The components on the side of the lens will now be described.
- The
interchangeable lens 151 is capable of being removably mounted on thecamera body 101 and includes a focusinglens group 152 for performing focusing by being advanced and retracted along the direction of the optic axis; azoom lens group 153 for performing zooming by being advanced and retracted along the direction of the optic axis; and arelay lens group 154 for performing a prescribed image forming operation together with thelens groups lens groups - The
stop 155 decides the entrant light flux of the image sensing optical system, and anactuator 156 drives thestop 155. - Like the
microcomputer 102, themicrocomputer 161 inside the lens is a single-chip microcomputer having a ROM, a RAM and A/D, D/A conversion functions. In accordance with an instruction sent from themicrocomputer 102 via the signal line DCL, themicrocomputer 161 controls the driving of a focus actuator and zoom actuator, described later, as well as the driving of the actuator mentioned above. Various operating states of the lens and parameters specific to the lens are transmitted to themicrocomputer 102 by the signal line DLC. - A
focus actuator 162 drives the focusinglens group 152 to advance and retract the same, and afocus encoder 163 senses position information indicative of the position of the focusinglens group 152, namely object distance information. Azoom actuator 164 drives thezoom lens group 153 to advance and retract the same, and azoom encoder 165 senses position information indicative of the position of thezoom lens group 153, namely focal length information. - By virtue of the construction described above, the
interchangeable lens 151 forms the image of the object OBJ on theimage sensing device 111 of the camera and performs focusing, zooming and control of entrant light quantity based upon a control instruction from the camera. - The state of image formation of the object OBJ at the time of focus detection prior to preparations for photography will now be described.
- A light flux from the object OBJ passes through the
lens groups second mirrors third mirror 133, after which the light flux impinges upon the two secondaryimage forming lenses 137. Each of thelenses 137 function as pupil. Thus, thelenses 137 form two images by pupil-splitting or -separating (referred to as simply “splitting”, hereinafter). In other words, two light fluxes are obtained and these are reflected by thefourth mirror 134, after which the reflected light fluxes are projected onto the light-receiving surface of theimage sensing device 111 as the two secondary images IMA and IMB. - FIG. 2 is a diagram showing the state of image formation of the secondary images IMA, IMB on the
image sensing device 111. - A light-receiving
portion 112 of theimage sensing device 111 has m×n light-receiving pixels and a charge transfer portion (CCDV for vertical transfer, not shown) for transferring electric charge that has accumulated in these pixels. A horizontal transfer portion (CCDH) 113 stores electric charge transferred in the direction of arrow TRV by the vertical transfer portion CCDV in the light-receivingportion 112, then transfers the charge in the direction of arrow TRH and outputs an image signal from asignal output portion 114 to theimage sensor driver 105. - Two areas ARA, ARB on the light-receiving
portion 112 are images of thefield mask 136 projected by the secondaryimage forming lenses 137 of FIG. 1. The secondary images IMA, IMB of the first primary image IM1 are formed in the areas ARA, ARB, respectively. Let V0 represent the spacing between the two images IMA, IMB when the image sensing optical system is in focus with respect to the object OBJ. - FIG. 3 is a diagram showing the state of image formation of the secondary images on the
image sensing device 111 when the image sensing optical system is not in focus with respect to the object OBJ. In this case the spacing between the two images IMA and IMB is V1 (≠V0). - When the
switch 121 which prepares thecamera body 101 for photography is closed, stored electric charge in the light-receivingportion 112 is read out, converted from an analog to a digital value and transmitted to themicrocomputer 102. In accordance with a well-known correlation algorithm, the spacing V1 between the two above-mentioned images is calculated and the difference between V1 and the in-focus spacing V0 i.e., - Δv=V 1 —V 0
- is obtained, thereby making it possible to detect the extent to which the object OBJ is out of focus. The amount of such defocus is transmitted to the
interchangeable lens 151 and the focusinglens group 152 is driven accordingly to perform an autofocus operation. - FIG. 4 is a diagram showing the camera when the
photography switch 122 of thecamera body 101 is closed upon the completion of the autofocus operation, thereby establishing the photographic state. - When the
photography switch 122 is closed, themovable mirror unit 138 is withdrawn away from the optical path of photography (i.e., upward in FIG. 4) by the quick-return actuator 139. When this is done, the mirrors are removed from between the image sensing optical system andimage sensing device 111 so that a primary image IM3 produced by the image sensing optical system is formed on theimage sensing device 111. - FIG. 5 is a diagram showing formation of the image of the object on the
image sensing device 111 at the time of photography. Here the primary image IM3 of the object OBJ is projected onto the light-receivingportion 112. Accordingly, the image signal prevailing under these conditions is accepted and recorded in theimage memory 106 of thecamera body 101, whereby the image is sensed. - FIGS. 6 and 7 are flowcharts illustrating the control flows of the
microcomputers camera body 101 andinterchangeable lens 151 according to the first embodiment of the present invention. - The control flow of the
microcomputer 102 inside the camera will be described first in accordance with FIG. 6 while making reference to FIG. 1. - When the
main switch 120 of thecamera body 101 is closed (turned on), themicrocomputer 102 is activated from the sleep state and control proceeds from step S101 to step S102, at which the states of the switches 121-124 inside thecamera body 101 are sensed. - The state of the photography preparation switch121 (SW1), which is turned on by pressing the release button through its first stroke length, is sensed at step S103. Control returns to step S102 when the
switch 121 is off (open) and proceeds to step S104 when theswitch 121 is on. - The fact that preparation for shutter release has been executed is communicated to the
microcomputer 161 inside theinterchangeable lens 151 at step S104. This is followed by step S105, at which parameters are communicated to themicrocomputer 161. The communication of parameters involves transmitting lens-specific information such as the lens f-number, focal length and focusing sensitivity to the camera. - Next, at step S106, the
image sensing device 111 is activated to acquire an image signal. - Processing of the image signal acquired at step S106 is executed at step S107. More specifically, processing such as A/D conversion of the image signal, white balance adjustment and gamma correction is executed.
- This is followed by step S108, at which object brightness information is calculated from the image signal processed at step S107. Further, in accordance with a predetermined exposure control program, a control value for narrowing the
stop 155 and the exposure time (charge storage time) of theimage sensing device 111 are calculated. - The setting of the
AF mode switch 123 is discriminated at step S109 to determine if the autofocus (AF) mode is in effect. Control proceeds to step S112 if the mode is not the AF mode and to step S110 if the mode is the AF mode. - The defocus quantity ΔV of the object is calculated from the spacing between the two secondary images IMA, IMB at step S110 in the manner illustrated in FIG. 3. Next, at step S111, the defocus quantity ΔV calculated at step S110 is transmitted to the
microcomputer 161 inside the lens. - The state of the photography switch (SW2) 122 is discriminated at step S112. If the
switch 122 is off, control returns to step S102 so that the processing of steps S102-S111 is executed again. If thephotography switch 122 is found to be on at step S112, it is judged that release has been performed and control shifts to step S121. - The transition to the release operation is communicated to the
microcomputer 161 in the lens at step S121. The stop control value that was calculated at step S106 is transmitted to themicrocomputer 161 at step S122. - Next, at step S123, the
movable mirror unit 138 is withdrawn from the photographic optical path of FIG. 1 to the attitude shown in FIG. 4, where the mirror unit 128 is outside the optical path. - Control of charge accumulation and charge transfer in the image sensing device is performed at step S124 for the purpose of photography. This is followed by step S125, at which processing of the image signal acquired at step S124 is executed in the same manner as performed at step S107. More specifically, the image signal is applied to A/D conversion, white balance adjustment, gamma correction and compression processing, etc.
- Next, at step S126, the signal processed at step S125 is recorded and preserved in the
image memory 106. The withdrawnmovable mirror unit 138 is driven at step S127 so as to be returned to the optical path for photography. An instruction for restoring thestop 155 is transmitted to themicrocomputer 161 at step S128. - The image recorded at step S126 is displayed on the
display unit 104 at step S129, thereby allowing the photographer to view the image that is the result of photography. - Control returns to step S102 when the above-described photographic operation is completed.
- FIG. 7 is a flowchart illustrating control of the
microcomputer 161 inside the lens. - Power is supplied to the interchangeable lens by turning on the
main switch 120 on the camera side, whereupon control proceeds from step S151 to step S152. Themicrocomputer 161 is in the sleep state waiting for communication from the camera body. Control stops at step S152 if there is no communication from the camera body. - If communication relating to release preparation corresponding to step S104 in FIG. 6 is received from the
microcomputer 102, control proceeds from step S152 to step S153. - This is followed by step S153, at which the
microcomputer 161 inside the lens senses the states of thefocus encoder 163 andzoom encoder 165 and judges the current status of the lens. Communication of parameters corresponding to step S105 in FIG. 6 is performed at step S154 based upon the lens status sensed at step S153. - A signal representing the amount of defocusing corresponding to step S111 in FIG. 6 is received at step S155.
- The amount of focusing lens drive necessary is calculated at step S156 based upon the amount of defocusing received at step S155 and the lens status sensed at step S153. Next, at step S157, the focusing lens is driven to performing focusing based upon the amount of drive calculated at step S156.
- It is determined at step S158 whether a release interrupt corresponding to step S121 of FIG. 6 has occurred. If the decision is “NO”, control returns to step S152. If a release interrupt occurs, however, control proceeds from step S158 to step S171, at which driving of the focusing lens is halted.
- A stop control value is received from the
microcomputer 102 at step S172. Next, at step S173, thestop actuator 156 is driven in accordance with the stop control value received at step S172. Photography is performed on the camera side. - When the photographic operation on the camera side is completed, a stop restoration instruction is received at step S174. The stop is restored to the open state at step S175 in accordance with this instruction.
- When the photographic operation on the lens side is completed, control returns to step S152.
- The operation of the camera and lens according to the foregoing flowcharts will now be summarized.
- At the photography preparation stage, the photographic light flux is split and projected upon the
image sensing device 111 via the focus detection optical system in the manner shown in FIG. 1. When themain switch 120 andphotography preparation switch 121 are turned on by the photographer, the camera calculates the defocus quantity ΔV of the object from the spacing between the two secondary images projected upon theimage sensing device 111, as shown in FIG. 3, and transmits ΔV to theinterchangeable lens 151. In response, theinterchangeable lens 151 drives the focusing lens in accordance with the defocus quantity ΔV, thereby performing focusing. Next, when thephotography switch 122 is turned on, themovable mirror unit 138 is withdrawn from the photographic light flux, as shown in FIG. 4, so that the in-focus image of the object is projected upon theimage sensing device 111, as shown in FIG. 5. The camera acquires the image of the object and records the image in theimage memory 106. - (AD1) The
focus detection module 130 composed of the four mirrors (131-134) is placed in the optical path between the optical lens system and the image sensing device, and the pair ofoptical lenses 137 is provided in the optical path of thefocus detection module 130. Theoptical lens 137 forms two images on theimage sensing device 111 when the object is not in focus and forms one image on theimage sensing device 111 when the object is in focus. The defocus quantity ΔV can be obtained based upon the spacing V1 between the two images. As a result, it is unnecessary to separately provide photoelectric conversion means for focus detection, and both focus detection and focusing can be performed using the light flux that has passed through the photographic lens. This makes it possible to realize an inexpensive, small-size camera exhibiting highly accurate automatic focusing. - (AD2) The
focus detection module 130 is disposed in back of the final lens group (thelens group 154 in the example of FIG. 1) of the image sensing optical system. As a result, the module is suited to a digital still camera of the single-lens reflex type having a long back focus. - (AD3) Part of the photographic light flux is split by a beam splitter (the mirror 132) before being introduced to the optical finder system (the
pentagonal prism 142 and eyepiece 143). This makes it possible to obtain a high-quality optical finder and to visually confirm the state of focus of an object image in a highly accurate manner. The optical path of the optical finder system and part of the optical path of thefocus detection module 130 can be made to coincide, thereby making it possible to reduce the size of the overall camera. - (M1) If there is no limitation upon the size of the camera overall, the optical path of the optical finder system and the optical path of the
focus detection module 130 may be separated completely. In such case a half-mirror (132′, not shown in figures) such as themirror 132 is provided between thelens 154 and themirror 121, and themirror 132 is made a fully reflective mirror. The optical finder is disposed above themirror 132′. - (M2) The
lenses 137 for splitting the optic axis into two portions should ideally be provided between themirror 133 and themirror 134. Theoretically, however, it is possible to provide thelenses 137 between thelens 135 andmirror 133 or between themirror 132 andlens 135. - According to the first embodiment set forth above, the pair of secondary
image forming lenses 137 are arranged paralely to the light incident direction onto the photographic screen 111 (that is, in a right-to-left direction with respect to the FIG. 1 orientation) so that the two secondary images (IMA and IMB) are disposed one above the other on thephotographic screen 111. In the second embodiment described below, a pair of secondary image forming lenses are adapted (in a direction vertical to the surface of FIG. 1 drawing) to be arranged so that two secondary images are juxtaposed side by side on the photographic screen and are vertically offset from the optic axis. Specifically, the construction of the second embodiment is substantially identical to that of the first embodiment except for the arrangement of secondary forming lenses. - FIG. 8 is a perspective view showing a development of part of the focus detection optical system according to the second embodiment, and FIG. 9 is a plan view showing the disposition of the secondary images on the
image sensing device 111. The construction and operation of this embodiment will now be described. - Elements other than those shown in FIG. 8 are identical with those of the first embodiment illustrated in FIG. 1.
- FIG. 8 schematically illustrates an optical path from the
field mask 136 to theimage sensing device 111. In FIG. 8, thethird mirror 133 and thefourth mirror 134 are omitted from thefocus detection module 130 in FIG. 1 for the sake of simplified illustration purpose. - According to the second embodiment as shown in FIG. 8, the pair of two secondary
image forming lenses 237 are disposed horizontally, i.e., side by side with respect to the photographic screen, between the field lens 135 (having the field mask 136) and theimage sensing device 111. The centers connecting the two secondaryimage forming lenses 237 are offset by a distance OFS to a position below the optic axis C. Accordingly, the projected images of thefield mask 136 formed by the secondaryimage forming lenses 237 become downwardly offset areas ARC, ARD on the light-receivingportion 112 of theimage sensing device 111. - FIG. 9 illustrates the disposition of images on the
image sensing device 111 in the arrangement of FIG. 8. The images ARC, ARD of thefield mask 136 are projected as areas on the light-receivingportion 112, and secondary images IMC, IMD of the object OBJ are formed in the areas ARC, ARD, respectively. Let H0 represent the spacing between the two images when the object is in focus, and let H1 represent the spacing between the two images when the object is not in focus. If H1 is measured and H0 and H1 can be compared, then a defocus quantity ΔH (-H1-H0) of the object OBJ can be detected. - In FIG. 9, the two secondary image projection areas ARC, ARD are disposed at the lower part the light-receiving
portion 112, namely on the side near the exit in the transfer direction of the vertical-transfer CCD in the light-receiving portion. As a result, only the image signal on the lower half of the light-receiving portion 112 need be read for the purpose of detecting focus. In other words, if the image signal on the lower half of the light-receiving portion 112 is used in image processing for focus detection, then the image signal of the upper half of the light-receiving portion 112 may be discarded without being read. In the first focus detection cycle, therefore, the time needed to read out the image signal is shortened. - The flowcharts for controlling the camera and interchangeable lens in this embodiment are the same as the control flowcharts of the first embodiment shown in FIGS. 6 and 7 and need not be described again.
- The second embodiment has the following advantage in addition to the advantages (AD1)-(AD3) of the first embodiment:
- Since the image-signal readout time for detecting focus is shortened, it is possible to speed up the focus detection operation or autofocus operation.
- In the first and second embodiments, focus detection is carried out based upon one pair of secondary images. In the third embodiment described below, the optical system is so adapted that focus detection is carried out by forming two pairs of secondary images.
- FIG. 10 is a perspective view showing a development of part of the focus detection optical system according to the third embodiment, and FIG. 11 is a plan view showing the disposition of the secondary images on the
image sensing device 111. The construction and operation of this embodiment will now be described. - In FIG. 10, two pairs of secondary
image forming lenses 337 are disposed between the field lens 135 (having the field mask 136) and the image sensing device 111. The projected images of the field mask 136 formed by the secondary image forming lenses 337 become four areas ARA, ARB, ARC and ARD on the light-receiving portion 112 of the image sensing device 111. - Elements other than those shown in FIG. 10 are identical with those of the first embodiment illustrated in FIG. 1.
- FIG. 11 illustrates the disposition of images on the
image sensing device 111 in the arrangement of FIG. 10. The images ARA, ARB, ARC, ARD of the field mask 136 are projected on the light-receiving portion 112, and secondary images IMA, IMB, IMC, IMD of the object OBJ are formed in the areas ARA, ARB, ARC, ARD, respectively. Let V0 represent the spacing between the two images IMA, IMB when the object is in focus, and let H0 represent the spacing between the two images IMC, IMD when the object is in focus. If the spacings between the images when the object is not in focus are measured and compared with the spacings V0 and H0, then a defocus quantity of the object OBJ can be detected. - Except for the fact that two sets of operations for detecting the amount of defocusing of the image of the object are provided in the third embodiment, the flowcharts for controlling the camera and interchangeable lens in this embodiment are the same as the control flowcharts of the first embodiment shown in FIGS. 6 and 7 and need not be described again.
- The third embodiment has the following advantage in addition to the advantages (AD1)-(AD3) of the first embodiment:
- Since focus detection is carried out based upon images of the object OBJ that are offset vertically and horizontally, focus detection can be performed with greater accuracy.
- In the first through third embodiments described above, the secondary image forming optical system for focus detection uses mirrors for deflecting the light flux. In the fourth embodiment described below, use is made of a reducing lens instead of mirrors.
- FIGS.12-17 are diagrams relating to the fourth embodiment.
- FIG. 12 is a diagram showing the construction of an image sensing apparatus according to the present invention. This shows the apparatus when detection of focus is carried out. According to this embodiment, the
focus detection module 130 of the first embodiment shown in FIG. 1 is replaced by a focus detection module 430. The optical finder composed of such elements as the pentagonal prism is eliminated; the other elements are the same as those shown in FIG. 1. The construction and operation of this embodiment will now be described. - As shown in FIG. 12, the
focus detection module 430 includes a reducing lens 431, a field mask 436, a field lens 435 and two secondary image forming lenses 437. The stop 155 of the interchangeable lens 151 and the entrance pupil of the pair of secondary image forming lenses 437 are in a projection relationship owing to the field lens 435. A quick-return (QR) actuator 439 is provided for advancing and retracting the focus detection module 430 into and out of the projected light flux. - The image of the object OBJ is formed as a primary image IM4 on the primary image forming surface in the
field lens 435 via the image sensing optical system, which is constructed by the lens groups 152-154 and stop 155, and the above-mentioned reducing lens 431. It should be noted that the primary image IM4 has a size different from that of the primary image IM1 or IM2 of the first embodiment owing to the intervention of the reducing lens 431. - The primary image IM4 is split by the two secondary
image forming lenses 437 disposed one above the other, whereby the image is formed again. These secondary images are projected upon the image sensing device 111 as IMA and IMB. - FIG. 13 is a diagram showing formation of the secondary images on the
image sensing device 111. In a manner similar to that of the first embodiment, this embodiment detects the amount of defocusing of the object OBJ based upon the change in the spacing between the two images with respect to the reference spacing value V0. - FIG. 14 illustrates the state of the display presented on a
display unit 404 when focus detection is performed. One of the two secondary images, e.g., IMB, projected upon the image sensing device 111 in FIG. 13 is subjected to enlargement processing and displayed on the display unit 404 as IMBL, thereby making it possible for the photographer to check the composition of the photographic area as well as the state of focusing. - FIG. 15 is a diagram showing the camera when the
photography switch 122 of the camera body 401 is closed upon the completion of the autofocus operation, thereby establishing the photographic state. - When the
photography switch 122 is closed, the entire focus detection module 430 is withdrawn from the optical path of photography (i.e., upward in FIG. 15) by the quick-return actuator 439. - FIG. 16 is a diagram showing formation of the image of the object on the
image sensing device 111 at the time of photography. Here the primary image IM3 of the object OBJ is projected onto the light-receiving portion 112 in a manner similar to that of the first embodiment. Accordingly, the image signal prevailing under these conditions is accepted and recorded in the image memory 106 of the camera body 401, whereby the image is sensed. - FIG. 17 is a flowchart illustrating the control flow of a
microcomputer 402 inside the camera body. This flowchart differs from that of FIG. 6 only in the addition of an operation for displaying a finder image at the time of focus detection. - When the
main switch 120 of the camera body 401 is turned on, the microcomputer 402 is activated from the sleep state and control proceeds from step S401 to steps S402, S403 and S404. - The fact that preparation for shutter release has been executed is communicated to the
microcomputer 161 inside the interchangeable lens 151 at step S404. This is followed by step S405, at which parameters are communicated to the microcomputer 161. - Next, at step S406, the
image sensing device 111 is activated to acquire an image signal. - Processing of the image signal acquired at step S406 is executed at step S407. More specifically, processing such as A/D conversion of the image signal, white balance adjustment and gamma correction is executed.
- This is followed by step S408, at which the image to be displayed on the
display unit 404, namely the image in the area ARB of FIG. 13, is enlarged and then rotated by 180° about its center. In comparison with the image that prevails at the time of imaging, therefore, the image is turned upside down when focus detection is performed. - The image for viewing purposes obtained at step S408 is displayed on the
display unit 404 at step S409. - This is followed by step S410, at which object brightness information is calculated from the image signal processed at step S407. Further, in accordance with a predetermined exposure control program, a control value for narrowing the
stop 155 and the exposure time (charge storage time) of the image sensing device 111 are calculated. - The setting of the
AF mode switch 123 is discriminated at step S411 to determine if the autofocus (AF) mode is in effect. Control proceeds to step S414 if the mode is not the AF mode and to step S412 if the mode is the AF mode. - The amount of defocus of the object is calculated from the spacing between the two secondary images at step S412 in the manner illustrated in FIG. 13. Next, at step S413, the amount of defocus calculated at step S412 is transmitted to the
microcomputer 161 inside the lens. - The state of the
photography switch 122 is discriminated at step S414. If the switch 122 is off, control returns to step S402. If the photography switch 122 is found to be on, it is judged that release has been performed and control shifts to step S421.
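The enlargement and 180° rotation performed at step S408 may be sketched as follows (a minimal sketch in Python operating on an image represented as a list of rows; the function names and the nearest-neighbor 2x enlargement factor are assumptions made for the sketch):

```python
def enlarge_2x(img):
    """Nearest-neighbor 2x enlargement: each pixel is doubled
    horizontally and each row is doubled vertically."""
    out = []
    for row in img:
        wide = [p for p in row for _ in range(2)]
        out.append(wide)
        out.append(list(wide))
    return out

def rotate_180(img):
    """Rotate the image by 180 degrees about its center, so that the
    inverted secondary image appears upright on the display."""
    return [list(reversed(row)) for row in reversed(img)]
```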
interchangeable lens 151 is the same as that of the first embodiment shown in FIG. 7 and need not be described again. - The operation of the camera and lens according to the foregoing flow will now be summarized.
- At the photography preparation stage, the photographic light flux is split and projected upon the
image sensing device 111 via the focus detection module 430 in the manner shown in FIG. 12. When the main switch 120 and photography preparation switch 121 are turned on by the photographer, the camera enlarges one of the two secondary images, which are projected upon the image sensing device 111 in the manner shown in FIG. 13, and displays the enlarged image on the display unit 404, as illustrated in FIG. 14. Next, the camera calculates the amount of object defocus from the spacing between the two secondary images and transmits this to the interchangeable lens 151. In response, the interchangeable lens 151 drives the focusing lens in accordance with the defocus quantity, thereby performing focusing. Next, when the photography switch 122 is turned on, the focus detection module 430 is withdrawn from the photographic light flux, as shown in FIG. 15, so that the in-focus image of the object is projected upon the image sensing device 111, as shown in FIG. 16. The camera acquires the image of the object and records the image in the image memory 106. - The fourth embodiment has the following advantages in addition to the advantages (AD1), (AD2) of the first embodiment:
- (AD6) A mirror for deflecting the optical path is not required in the
focus detection module 430, thereby making it possible to reduce the size of the module and simplify the same. - (AD7) Since the image for purposes of focus detection is displayed on the monitor screen of the display unit, an optical finder is unnecessary. This makes it possible to reduce the size and lower the cost of the apparatus.
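The focusing sequence summarized above, in which the camera repeatedly measures the spacing between the two secondary images, converts the deviation from the reference spacing into a defocus quantity and drives the focusing lens, may be sketched as follows (a minimal sketch in Python; the conversion factor k, the tolerance and the function names are assumptions made for the sketch, not part of the disclosed apparatus):

```python
def autofocus_loop(read_spacing, drive_lens, v0, k, tolerance=0.5, max_iters=20):
    """Repeatedly measure the secondary-image spacing, convert its
    deviation from the reference spacing v0 into a defocus quantity,
    and drive the focusing lens until the defocus is within tolerance.
    Returns True when focus is achieved."""
    for _ in range(max_iters):
        defocus = k * (read_spacing() - v0)
        if abs(defocus) <= tolerance:
            return True  # in focus
        drive_lens(defocus)
    return False
```

In the camera, read_spacing would correspond to the image-signal readout and spacing measurement, and drive_lens to the defocus quantity transmitted to the interchangeable lens.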
- In the fourth embodiment described above, the reducing lens is used in the optical system for focus detection. In a fifth embodiment described below, however, a relay lens is inserted into the optical system for focus detection and the reducing lens is not employed.
- FIGS. 18 and 19 are diagrams relating to the fifth embodiment.
- FIG. 18 is a diagram showing the construction of an image sensing apparatus according to the present invention. This shows the apparatus when detection of focus is carried out. According to this embodiment, the
focus detection module 430 of FIG. 12 is replaced by a focus detection module 530, and a relay lens module 540 is additionally provided. A lens 531 is provided at the rearmost portion of the image forming optical system inside an interchangeable lens 551. Other components are the same as shown in FIG. 12. - As shown in FIG. 18, the
focus detection module 530 includes a field mask 536, a field lens 535 and a pair of secondary image forming lenses 537. The stop 155 of the interchangeable lens 551 and the entrance pupil of the pair of secondary image forming lenses 537 are in a projection relationship owing to the field lens 535. - The
relay lens module 540 is provided internally with a concave relay lens 541. A quick-return actuator (QR) 539 is provided for moving the focus detection module 530 and the relay lens module 540 into the photographic light flux alternately. - The image of the object OBJ is formed as a primary image IM5 on the primary image forming surface in the
field lens 535 via the image sensing optical system, which is constructed by the lens groups 152-154, stop 155 and lens 531. The arrangement is such that the primary image IM5 has a size substantially the same as that of the image IM4 of the fourth embodiment. - The primary image IM5 is split by the two secondary
image forming lenses 537, whereby the image is formed again. These secondary images are projected upon the image sensing device 111 as IMA and IMB. The projected images are the same as those shown in FIG. 13. In addition, the viewing image displayed on a display unit 504 is similar to that shown in FIG. 14. - FIG. 19 is a diagram showing the camera when the
photography switch 122 of the camera body 501 is closed upon the completion of the autofocus operation, thereby establishing the photographic state. - When the
photography switch 122 is closed, the entire focus detection module 530 is withdrawn from the optical path of photography (i.e., upward in FIG. 19) by the quick-return actuator 539. The relay lens module 540 is inserted into the photographic optical path in place of the focus detection module 530. When this is done, the primary image IM3 formed by the image sensing optical system in the interchangeable lens 551 and the relay lens 541 in the camera body 501 is formed on the image sensing device 111. The state of the formed image is the same as that of the primary image shown in FIG. 16. Accordingly, the image signal prevailing under these conditions is accepted and recorded in the image memory 106 of the camera body 501, whereby the image is sensed. - The control flow of this embodiment is the same as that of the fourth embodiment and need not be described again.
- The fifth embodiment has the following advantages in addition to the advantages (AD1), (AD2) of the first embodiment and the advantages (AD6), (AD7) of the fourth embodiment:
- (AD8) Since use is made of the
relay lens 541, which is advanced and retracted at the time of photography, greater freedom is provided in terms of optical design and the optical system can be reduced in size and improved in performance. - (AD9) The optical structure of the
focus detection module 530 is simplified and optical aberration of this module can be reduced to improve the accuracy of focus detection. - It should be noted that the focus detection optical system of the second or third embodiment may be applied to the fourth or fifth embodiment. Further, a half-mirror may be placed in front of the focus detection module of the fourth or fifth embodiment to extract part of the photographic light flux and introduce this flux to an optical finder. Furthermore, the invention may be applied not only to an image sensing apparatus of interchangeable lens type but also to an image sensing apparatus having a fixed lens.
- FIGS. 20 through 24 are diagrams relating to the sixth embodiment.
- FIG. 20 is a block diagram showing the construction of an image sensing apparatus according to a sixth embodiment.
-
Numeral 601 denotes a camera body having various functional components for forming the image of an object OBJ, detecting focus and sensing the image. - The camera body includes the focusing
lens group 152 for performing focusing by being advanced and retracted along the direction of the optic axis; the zoom lens group 153 for performing zooming by being advanced and retracted along the direction of the optic axis; and the relay lens group 154 for performing a prescribed image forming operation together with the lens groups 152 and 153. The stop 155 decides the entrant light flux of the image sensing optical system. - An
infrared blocking filter 606 blocks infrared light from the object OBJ and passes only visible light. The lens groups 152-154, the stop 155 and the infrared blocking filter 606 together construct the image sensing optical system. A first image IM1 of the object OBJ is formed on a main image sensing device 111. - As in the first and other embodiments, the main
image sensing device 111 is a two-dimensional photoelectric sensor, such as a CCD, for photoelectrically converting the first image IM1. - The camera body further includes a
rangefinding module 621 having a light-receiving lens 622 for forming the image of the object OBJ whose range is to be measured, an infrared blocking filter 623 for blocking infrared light and passing only visible light of the light flux that has passed through the light-receiving lens 622, and a subordinate image sensing device 624. The rangefinding optical system, which includes the light-receiving lens 622 and the infrared blocking filter 623, has an image forming power different from that of the above-mentioned image sensing optical system and forms a second image IM2 of the object OBJ on the subordinate image sensing device 624, described later. - The subordinate
image sensing device 624, such as a CCD, is a two-dimensional photoelectric sensor for photoelectrically converting the second image IM2. The module 621 including these elements is so disposed that its optic axis is spaced away from the optic axis of the image sensing optical system by a distance equivalent to a baselength BL. - A
microcomputer 631 is a single-chip microcomputer having a ROM, a RAM and A/D, D/A conversion functions. In accordance with a camera sequence program stored in the ROM (not shown), the microcomputer 631 implements a series of camera operations such as automatic exposure control (AE), autofocus (AF) and image sensing. To this end, the microcomputer 631 controls the operation of peripheral circuits and actuators inside the camera body 601. According to the present invention, the ROM constitutes a storage medium and can be a semiconductor memory, an optical disk, a magneto-optic disk or a magnetic medium, etc. - The
power supply 103 supplies the camera circuits and actuators with power. - The
driver 105 drives and controls the main image sensing device 111. The driver 105 controls the storage of charge in the image sensing device 111, charge transfer, CDS (Correlated Double Sampling), AGC (Automatic Gain Control), A/D conversion, gamma correction and AWB (Automatic White Balance), etc. - A
driver 634 drives and controls the subordinate image sensing device 624 and, like the driver 105 of the main image sensing device, controls the storage of charge in the image sensing device 624, charge transfer, CDS, AGC, A/D conversion, gamma correction and AWB, etc. - The
memory 106 records and preserves image signal data representing an image sensed by the main image sensing device 111 and can be a semiconductor memory, optical disk, magneto-optical disk or magnetic medium, etc. - The
terminal 107 for outputting a recorded image to external equipment is connected to a personal computer or printer. - The camera body has a
display unit 104, such as a liquid crystal panel, having a display function for displaying photographic conditions and a monitor function for monitoring a photographic image. - As in the first embodiment, the camera body has the
main switch 120. When this switch is turned on, the microcomputer 631 allows the execution of a prescribed program relating to preparations for photography, namely exposure metering and focus detection, etc. - The
switches 121 and 122 operate as follows. The switch 121 is for preparing for picture taking. When this switch is turned on, preparatory photographic operations such as exposure metering, focus detection and focusing are executed. The switch 122 is a photography switch. When this switch is turned on, a photographic image that has been formed on the image sensing device 111 is acquired and recorded in the image memory 106. - The
AF mode switch 123 is used to select the autofocus mode. The display switch 124 is used to designate a display for monitoring a photographic image. - The
focus actuator 162 drives the focusing lens group 152 to advance and retract the same, and the focus encoder 163 senses position information indicative of the position of the focusing lens group 152, namely object distance information. The zoom actuator 164 drives the zoom lens group 153 to advance and retract the same, and the zoom encoder 165 senses position information indicative of the position of the zoom lens group 153, namely focal length information. - The stop actuator 166 controls the stopping down of the
stop 155 and restores the stop 155 to the open state. - By virtue of the construction described above, the
camera body 601 acquires the first image IM1 and second image IM2 of the object OBJ and performs rangefinding, focusing, and image sensing through methods described later. - The state of image formation of the object OBJ at the time of focus detection prior to preparations for photography will now be described.
- A light flux from the object OBJ passes through the image sensing optical system comprising the
lens groups 152-154 and is formed on the main image sensing device 111 as the first image IM1. Further, the second image IM2 is formed on the subordinate image sensing device 624 inside the rangefinding module 621. - FIGS. 21A, 21B are diagrams illustrating the two
image sensing devices 111 and 624. - The light-receiving
portion 112 of the image sensing device 111 comprises m1×n1 light-receiving pixels and a charge transfer portion (vertical transfer CCD) for transferring electric charge that has accumulated in these pixels. The horizontal transfer CCD 113 stores electric charge transferred in the direction of arrow TRV by the vertical transfer CCD in the light-receiving portion 112, then transfers the charge in the direction of arrow TRH and outputs an image signal from the signal output portion 114 to the image sensor driver 105. - In FIG. 21B, IM1 T represents the image of the object OBJ when the image sensing optical system has been set to the maximum telescopic mode, and IM1 W represents the image of the object OBJ when the image sensing optical system has been set to the maximum wide-angle mode. Thus, the size of the first image IM1 of the object varies depending upon the state of the image sensing optical system.
- A light-receiving
portion 625 of the subordinate image sensing device 624 comprises m2×n2 light-receiving pixels and a charge transfer portion (vertical transfer CCD) for transferring electric charge that has accumulated in these pixels. A horizontal transfer CCD 626 stores electric charge transferred in the direction of arrow TRV by the vertical transfer CCD in the light-receiving portion 625, then transfers the charge in the direction of arrow TRH and outputs an image signal from the signal output portion 627 to the image sensor driver 634. - In FIG. 21A, IM2 INF represents the image obtained when the object OBJ is at infinity, and IM2 DEF represents the image obtained when the object OBJ is at a finite distance. Thus, the position of the second image IM2 of the object varies depending upon the distance of the object OBJ.
- FIGS. 22A, 22B are diagrams useful in describing the principle of image magnification correction for detecting object distance from the first and second images IM1 and IM2, respectively, of the object.
- According to the principle of rangefinding by triangulation, a disparity in regard to the object is detected from the relative positions of two images formed by two image forming systems spaced apart by a predetermined baselength, and the object distance is found from this disparity. In this case, it is required that the sizes of the two images be equalized. However, as described above in connection with FIGS. 21A, 21B, the first image IM1 varies in size depending upon the zoom setting of the image sensing optical system. Even if the two images have the same optical size, the number of pixels (or pixel size) of the light-receiving
portion 112 of image sensing device 111 and the number of pixels (or pixel size) of the light-receiving portion 625 of image sensing device 624 differ. Consequently, if the image signal is processed digitally, it is necessary to subject the image to a magnitude correction based upon the difference in the numbers of pixels. - In this embodiment, the image forming characteristics of the image sensing optical system are recognized from the results of detection by the
focus encoder 163 and zoom encoder 165 inside the camera body 601, and the size of the first image IM1 is made equal to the size of the second image IM2 based upon the results of recognition. - FIGS. 22A, 22B are diagrams illustrating the respective images after application of the above-described size correction. FIGS. 21A, 21B illustrate the sizes of the optical images on the
image sensing devices, whereas FIGS. 22A, 22B illustrate sizes in terms of image signal processing executed by the microcomputer 631. In FIG. 22A, IM2 DEF represents the image signal read out of the subordinate image sensing device 624 and stored in a second computation memory RAM2, and IM1 0 represents the image signal read out of the main image sensing device 111 and stored in a first computation memory RAM1. The signal IM1 0 is an image signal that has undergone the size correction described above. As the result of the size correction, the image IM2 DEF regarding the second image IM2 of the object and the image IM1 0 regarding the first image IM1 of the object are the same in size and differ only in terms of their relative positions, as illustrated in FIG. 23.
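The image magnification correction described above amounts to resampling one image signal so that the two signals to be compared have the same number of pixels. A minimal one-dimensional sketch in Python (the function name and the nearest-neighbor method are assumptions made for the sketch):

```python
def resample(signal, target_len):
    """Nearest-neighbor resampling of a 1-D image signal, so that signals
    read out of sensors with different pixel counts can be compared
    sample for sample."""
    n = len(signal)
    return [signal[min(n - 1, int(i * n / target_len))] for i in range(target_len)]
```

Applying the image magnification correction coefficient corresponds here to choosing target_len as the pixel count of the other sensor.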
-
- DST=BL·f/ΔV (EQ.1)
- where DST is the distance to the object, BL is the baselength, f is the focal length of the rangefinding optical system, and the following holds:
- Ideally, the reference spacing V0 should be zero when the object is at infinity. In general, however, the optical systems and image sensing elements develop positional offsets in the camera manufacturing process. According to this embodiment, therefore, information representing the reference spacing V0 conforming to the positions of the focusing
lens group 152 and zoom lens group 153 is stored in the ROM (not shown) of the microcomputer 631. - FIG. 24 is a flowchart showing the control flow of the
microcomputer 631 when focus detection, focusing and photography are performed in the camera body 601 according to the sixth embodiment.
- When the
main switch 120 of the camera body 601 is turned on, the microcomputer 631 is activated from the sleep state and control proceeds from step S501 to step S502, at which the states of the switches 121-124 inside the camera body 601 are sensed. - The state of the photography preparation switch 121 (SW1), which is turned on by pressing the release button through its first stroke length, is sensed at step S503. Control returns to step S502 when the
switch 121 is off and proceeds to step S504 when the switch 121 is on. - Next, at step S504, the main
image sensing device 111 is activated to acquire an image signal. - Processing of the image signal acquired at step S504 is executed at step S505. More specifically, processing such as A/D conversion of the image signal, white balance adjustment and gamma correction is executed.
- This is followed by step S506, at which object brightness information is calculated from the image signal processed at step S505. Further, in accordance with a predetermined exposure control program, a control value for stopping down the
stop 155 and the exposure time (charge storage time) of the image sensing device 111 are calculated. - The image signal produced by the
image sensing device 111 at steps S504 and S505, namely the image signal IM1 W or IM1 T in FIG. 21B, is displayed on the display unit 104 at step S507. - Next, the setting of the
AF mode switch 123 is discriminated at step S508 to determine if the autofocus (AF) mode is in effect. Control jumps to step S520 if the mode is not the AF mode and proceeds to step S511 if the mode is the AF mode. The autofocus operation, described below, is then executed. - The
microcomputer 631 senses the state of the focus encoder 163 at step S511 and senses the state of the zoom encoder 165 at step S512 to judge the current optical status of the lens. - A coefficient for making the size of the first image IM1 of the object equal to that of the second image of the object, namely an image magnification correction coefficient, is read out of the ROM (not shown) of
microcomputer 631 at step S513 in the manner described above in connection with FIGS. 22A, 22B. Coefficients are stored in the ROM as matrix data corresponding to the states of the focus encoder 163 and zoom encoder 165.
- The subordinate
image sensing device 624 is activated at step S515 to obtain an image signal. - Processing of the image signal acquired at step S515 is executed at step S516. More specifically, processing such as A/D conversion of the image signal, white balance adjustment and gamma correction is executed.
- This is followed by step S517, at which image size is corrected by multiplying the image signal of the first image IM1 acquired at step S505 by the image magnification correction coefficient read out at step S513.
- Next, at step S518, the distance to the object is calculated in accordance with Equations (EQ.1), (EQ.2) using the first image IM1 whose size has been corrected at step S517, the second image IM2 obtained at step S516 and the position offset correction quantity V0 obtained at step S514.
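The distance computation of step S518 may be sketched as follows, using the triangulation principle described above together with the disparity of equation (EQ.2); the function name and the explicit baselength and focal-length parameters are assumptions made for the sketch, not the literal computation disclosed:

```python
def object_distance(v_def, v0, baselength, focal_length):
    """Distance by triangulation: the disparity is ΔV = VDEF - V0 (EQ.2);
    by similar triangles the object distance is proportional to
    baselength * focal_length / ΔV, all quantities in consistent units."""
    delta_v = v_def - v0
    if delta_v == 0:
        return float("inf")  # V0 corresponds to an object at infinity
    return baselength * focal_length / delta_v
```

A larger disparity thus corresponds to a nearer object, and a disparity equal to the stored reference V0 corresponds to an object at infinity.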
- The focusing
lens group 152 is driven at step S519 based upon the result of the above-described calculation to bring the first image IM1, which is for image sensing purposes, into focus. - The state of the photography switch122 (SW2) is discriminated at step S520. If the
switch 122 is off, control returns to step S502 so that the processing of steps S502-S519 is executed again. If the photography switch 122 is found to be on at step S520, it is judged that release has been performed and control shifts to step S521. - The
stop actuator 166 is driven at step S521 in accordance with the stop control value calculated at step S506. - Charge accumulation and charge transfer of the main
image sensing device 111 for photography are controlled at step S522. This is followed by step S523, at which processing of the image signal acquired at step S522 is executed in the same manner as performed at step S505. More specifically, the image signal is subjected to A/D conversion, white balance adjustment, gamma correction, compression processing, etc. - Next, at step S524, the signal processed at step S523 is recorded and preserved in the
image memory 106. The image recorded at step S524 is displayed on the display unit 104 at step S525, thereby allowing the photographer to check the image that is the result of photography. - The
stop actuator 166 is restored to open the stop 155 at step S526.
- The operation of the camera according to the foregoing flowchart will now be summarized.
- At the photography preparation stage, the first image IM1 of the object is formed on the
image sensing device 111 via the image sensing optical system and the second image IM2 of the object is formed on the subordinate image sensing device 624 via the light-receiving lens 622, as illustrated in FIG. 20 and FIGS. 21A, 21B. - When the
main switch 120 and photography preparation switch 121 are turned on by the photographer, the camera obtains the two above-mentioned images, performs the image magnification correction, as shown in FIGS. 22A, 22B, and calculates the distance to the object OBJ by calculating the spacing between the two images in the manner shown in FIG. 23. The focusing lens group 152 is driven based upon the calculated value, whereby focusing is achieved. Continuous focusing is performed by executing this operation repeatedly. Meanwhile, the first image IM1 of the object is displayed on the display unit 104 to inform the photographer of the composition of the picture to be taken and of the state of focusing. - When the
photography switch 122 is turned on, the image of the object projected upon the image sensing device 111 is recorded in the image memory 106 and the image of the picture taken is displayed on the display unit 104. - The sixth embodiment provides the following advantages:
- (AD10) The rangefinding module can be simplified to make possible an autofocus camera that is compact and low in price.
- (AD11) The image sensing optical system serves also as the rangefinding optical system. As a result, when telescopic photography requiring more accurate rangefinding is performed, the image of the object for rangefinding purposes is also projected in enlarged size. This makes it possible to achieve a rangefinding accuracy that conforms to the state of the image sensing optical system.
- (AD12) Parameter correction, e.g., correction of image magnification, at the time of rangefinding computation is carried out using a rangefinding parameter that conforms to the state of the image sensing optical system. Even if the state of the image sensing optical system changes, therefore, accurate detection of object distance is possible at all times.
- (AD13) The autofocus operation is executed repeatedly at the photography preparation stage. This makes it possible to shorten the release time lag between the issuance of a photography start instruction and the actual image sensing.
- (AD14) Since the automatically focused image of the object is displayed on the
display unit 104, such as a liquid crystal monitor, the state of focus of the image of the object can be verified visually and accurately in real time. - The sixth embodiment concerns a passive-triangulation-type rangefinding device composed of a single image sensing system and a single rangefinding module. A seventh embodiment described below provides an active-triangulation-type rangefinding device (i.e., a device in which infrared light is projected upon an object and rangefinding is performed based upon the reflected light) comprising a single image sensing system and a single projection system.
- FIGS. 25 through 31 are diagrams for describing the construction and operation of the seventh embodiment. FIG. 25 illustrates the disposition of the image sensing apparatus when rangefinding is performed according to the seventh embodiment. Components in FIGS. 25 through 31 that perform actions identical with those of the sixth embodiment are designated by like reference characters and need not be described again in detail.
-
Numeral 701 denotes a camera body having various functional components for forming the image of an object OBJ, detecting focus and sensing the image. The image sensing optical system is composed of the elements 152-155 of the sixth embodiment. - An
infrared blocking filter 706 blocks infrared light from the object OBJ and passes only visible light. Since the filter 706 is used for ordinary image sensing, it is withdrawn from the light flux of the sensed image at the time of rangefinding (as shown in FIG. 25). - An
infrared passing filter 707 blocks visible light and passes only infrared light from the object OBJ. Since the filter 707 is used for rangefinding, the filter is inserted into the light flux of the sensed image only at the time of rangefinding. The lens groups and filters -
Numeral 711 denotes an image sensing device such as a CCD. This is a two-dimensional photoelectric sensor for photoelectrically converting the object image, which is for image sensing purposes, or the image of an infrared spot used for rangefinding, described later. The image sensing device 711 is sensitive to light from visible to infrared wavelengths. - A
projection module 721 includes a light-emitting element 724 which emits infrared light from a light-emitting portion 723, and a projecting lens 722 for projecting the emitted infrared light onto the object OBJ. The projection module 721 having these elements is spaced away from the optic axis of the image sensing optical system by the baselength BL. As a result, a rangefinding pattern that corresponds to the projected image of the light-emitting portion 723, namely an infrared spot SPT, is formed on the object OBJ. The infrared spot SPT is formed, via the image sensing optical system, as an infrared spot image SPT1 on the main image sensing device 711 at a position spaced a predetermined distance away from the center thereof. Since the infrared passing filter 707 has been inserted into the light flux of the sensed image, the light flux of the object OBJ per se does not pass through the filter; only the light flux from the infrared spot SPT arrives at the main image sensing device 711. - A
driver 734 drives the light-emitting element 724 so that the latter emits rangefinding infrared light at the time of a rangefinding operation in accordance with an instruction from a microcomputer 731. - The
microcomputer 731 is a single-chip microcomputer having a ROM, a RAM and A/D and D/A conversion functions. In accordance with a camera sequence program stored in the ROM serving as a storage medium, the microcomputer 731 implements a series of camera operations such as automatic exposure control (AE), autofocus (AF) and image sensing in a manner similar to that of the sixth embodiment. To this end, the microcomputer 731 controls the operation of peripheral circuits and actuators inside the camera body 701. - The
power supply 103, driver 105, memory 106, terminal 107, display unit 104 and switches 120-124, 162-166 are similar to those of the sixth embodiment. - An
optical finder 761 is composed of a lens group 762, a zoom lens 763, an erecting prism 764 such as a Porro prism, a field mask 765 and an eyepiece 766. A zoom linkage member 767 mechanically connects the zoom lens 153 with the zoom lens 763. The magnification of the optical finder 761 is automatically adjusted by the zoom linkage member 767 in operative association with the zooming operation of the image sensing optical system. An exposure metering element 768 is disposed in the vicinity of the optical finder 761. The exposure metering element 768 splits the light flux within the optical finder 761 by a beam splitter (not shown) and measures the brightness of the object before a picture is taken. - An erect real image IM2 of the object OBJ is projected into the
field mask 765 by the optical finder 761 so that the photographer can verify the zone of photography by viewing the finder image IM2 through the eyepiece 766. - The state of image formation of the infrared spot image SPT1 at the time of rangefinding, which is a preparation for photography, will now be described.
- FIG. 26 is a diagram illustrating the disposition of the infrared spot image SPT1 formed on the main image sensing device at the time of rangefinding.
- A light-receiving
portion 212 of the main image sensing device 711 comprises m1×n1 light-receiving pixels and a charge transfer portion (vertical transfer CCD) for transferring electric charge that has accumulated in these pixels. A horizontal transfer CCD 213 stores electric charge transferred in the direction of arrow TRV by the vertical transfer CCD in the light-receiving portion 212, then transfers the charge in the direction of arrow TRH and outputs an image signal from a signal output portion 214 to the image sensor driver 105. - Further, SPT1T represents the image of the infrared spot SPT when the image sensing optical system has been set to the maximum telescopic mode, and SPT1W represents the image of the infrared spot when the image sensing optical system has been set to the maximum wide-angle mode. Thus, the size and projected position of the image vary depending upon the state of the image sensing optical system.
- FIG. 27 illustrates the result of subjecting the image SPT1T or SPT1W to processing similar to that of the sixth embodiment and normalizing size and position. The normalized image signal is indicated at SPT10. Spacing VDEF between the position of the center of gravity of the signal SPT10 and a predetermined reference position C is obtained. Distance DST to the object OBJ can be detected in accordance with Equations (EQ.1) and (EQ.2), in a manner similar to that of the sixth embodiment, using the reference spacing V0, the normalized focal length f0 of the image sensing optical system and the baselength BL of the
projection module 721. As in the sixth embodiment, information relating to the reference spacing V0 conforming to the positions of the focusing lens group 152 and zoom lens group 153 is stored in the ROM of the microcomputer 731. Further, f0 represents the focal length of the image sensing optical system after normalization of the size and position of the spot image. This is an imaginary focal length for obtaining the normalized image signal SPT10 of FIG. 27 at all times even if there is a change in the zoom state. - The object OBJ is brought into focus automatically if the focusing
lens group 152 is driven based upon the distance DST to the object OBJ calculated in accordance with Equations (EQ.1), (EQ.2). - FIG. 28 is a diagram showing the camera when the
photography switch 122 of the camera body 701 is closed upon the completion of the autofocus operation, thereby establishing the image sensing state. - When the
photography switch 122 is turned on, the light-emitting element 724 stops emitting infrared light. The filter actuator 708 is then actuated to withdraw the infrared passing filter 707 from the photographic light flux and insert the infrared blocking filter 706 into the light flux. When this is done, the image IM1 of the object OBJ is formed on the main image sensing device 711 via the image sensing optical system. If FIGS. 25 and 28 are compared, it will be seen that the positions of the infrared blocking filter 706 and infrared passing filter 707 are reversed. - FIG. 29 is a diagram showing formation of the image of the object on the main
image sensing device 711 at the time of photography. The primary image IM1 of the object OBJ is projected upon the light-receiving portion 212. Accordingly, the image signal is acquired under these conditions and recorded in the image memory 106 of the camera body 701, whereby the image is sensed. - FIG. 30 is a diagram showing the state of the display on the
display unit 104 after image sensing. The image IM1 acquired in FIG. 29 is displayed on the display screen of the display unit 104 as an image IM1L resulting from photography. This allows the photographer to determine whether photography has been performed correctly. - FIG. 31 is a flowchart showing the control flow of the
microcomputer 731 when focus detection, focusing and photography are performed in the camera body 701 according to the seventh embodiment. The control flowchart of FIG. 31 will be described with reference to FIGS. 25 through 30. - When the
main switch 120 of the camera body 701 is turned on, the microcomputer 731 is activated from the sleep state and control proceeds from step S601 to step S602, at which the states of the switches 121-124 inside the camera body 701 are sensed. - The state of the photography preparation switch 121 (SW1), which is turned on by pressing the release button through its first stroke length, is sensed at step S603. Control returns to step S602 when the
switch 121 is off and proceeds to step S604 when the switch 121 is on. - This is followed by step S604, at which the output of the
exposure metering element 768 is read out, object brightness information is calculated and, in accordance with a predetermined exposure control program, a control value for narrowing the stop 155 and the exposure time (charge storage time) of the image sensing device 711 are calculated. - Next, the setting of the
AF mode switch 123 is discriminated at step S605 to determine if the autofocus (AF) mode is in effect. Control jumps to step S619 if the mode is not the AF mode and proceeds to step S611 if the mode is the AF mode. - The
microcomputer 731 senses the state of the zoom encoder 165 at step S611 to judge the current optical status of the lens. It should be noted that when rangefinding is performed according to this embodiment, the focusing lens 152 is always at an initial position that corresponds to infinity. The state of the focus encoder 163, therefore, is not sensed. - A coefficient for making the size of the infrared spot image SPT1 conform to the reference value, namely an image magnification correction coefficient, is read out of the ROM (not shown) of the
microcomputer 731 at step S612 in the manner described above in connection with FIG. 27. Coefficients are stored in the ROM as matrix data corresponding to the state of the zoom encoder 165.
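The coefficient read-out of step S612 is effectively a table lookup indexed by the zoom encoder state. A minimal sketch, assuming a one-dimensional table and invented coefficient values (the specification stores matrix data whose actual contents are not given):

```python
# Hypothetical correction table: one magnification coefficient per
# zoom-encoder step, from maximum wide-angle (index 0) to maximum
# telephoto. The values themselves are invented for illustration.
MAG_COEFF_ROM = [1.00, 0.83, 0.71, 0.56, 0.42]


def read_mag_coefficient(zoom_encoder_state):
    """Mimic the ROM lookup of step S612: the encoder state is used
    directly as an index into the stored table."""
    if not 0 <= zoom_encoder_state < len(MAG_COEFF_ROM):
        raise ValueError("zoom encoder state out of range")
    return MAG_COEFF_ROM[zoom_encoder_state]
```

Storing precomputed coefficients rather than recomputing them from lens formulas keeps the rangefinding computation fast on a single-chip microcomputer.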
- The light-emitting
element 724 is activated at step S614 to project rangefinding infrared light toward the object OBJ. - The main
image sensing device 711 is activated at step S615 to obtain the signal representing the infrared spot image SPT1. - Processing of the image signal acquired at step S615 is executed at step S616. More specifically, the image signal is converted from an analog to a digital quantity.
- This is followed by step S617, at which image size is corrected by multiplying the image signal of the infrared spot image SPT1 acquired at step S616 by the image magnification correction coefficient read out at step S612. The resulting signal is converted to the signal SPT10 normalized in the manner shown in FIG. 27.
- Next, at step S618, the distance DST to the object is calculated after calculating VDEF in accordance with Equations (EQ.1), (EQ.2) using the normalized signal SPT10 obtained at step S617 and the positional offset correction quantity V0 obtained at step S613.
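Step S618 thus reduces to finding the centre of gravity of the normalized spot signal, taking its offset from the reference position as VDEF, and converting that offset into a distance. Equations (EQ.1) and (EQ.2) are defined earlier in the specification and are not reproduced in this passage, so the Python sketch below substitutes the textbook active-triangulation relation DST = f0·BL/VDEF; treat that formula, and all the numerical values, as assumptions.

```python
def centroid(signal):
    """Centre of gravity (in pixel units) of a 1-D intensity signal."""
    total = sum(signal)
    return sum(i * v for i, v in enumerate(signal)) / total


def spot_distance(signal, ref_pos, f0, bl):
    """Distance to the object from the offset of the infrared spot image.

    VDEF is the spacing between the spot centroid and the reference
    position C; the triangulation formula below is the standard one and
    merely stands in for the patent's (EQ.1)/(EQ.2).
    """
    vdef = centroid(signal) - ref_pos
    return f0 * bl / vdef


# A spot centred on pixel 30 with the reference position C at pixel 10
# gives VDEF = 20 pixels.
sig = [0.0] * 64
sig[29], sig[30], sig[31] = 1.0, 2.0, 1.0
```

Because the signal has already been normalized at step S617, a single stored f0 and V0 pair suffices regardless of the current zoom state.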
- The state of the photography switch 122 (SW2) is discriminated at step S619. If the
switch 122 is off, control returns to step S602 so that the processing of steps S602-S618, namely the rangefinding operation, is executed again. If the photography switch 122 is found to be on at step S619, it is judged that release has been performed and control shifts to step S621. - The focusing
lens 152 is driven at step S621 based upon the result of the calculation at step S618 to bring the image IM1 into focus. - The
filter actuator 708 is driven at step S622 to withdraw the infrared passing filter 707 from the photographic light flux and insert the infrared blocking filter 706 into the photographic light flux instead. - The
stop actuator 166 is driven at step S623 in accordance with the stop control value calculated at step S604. - Charge accumulation and charge transfer of the main image sensing device for photography are controlled at step S624. This is followed by step S625, at which processing of the image signal acquired at the above-mentioned steps is executed. More specifically, the image signal is applied to A/D conversion, white balance adjustment, gamma correction and compression processing, etc.
- Next, at step S626, the signal processed at step S625 is recorded and preserved in the
image memory 106. The image recorded at step S626 is displayed on the display unit 104 at step S627, thereby allowing the photographer to check the image that is the result of photography. - The
stop actuator 166 is restored to open the stop 155 at step S628. The infrared blocking filter 706 and infrared passing filter 707 are interchanged, i.e., restored to the positions that prevail at the time of the rangefinding operation, at step S629. The focusing lens group 152 is restored to its initial position at step S630. - Control returns to step S602 when the above-described photographic operation is completed.
- The operation of the camera according to the foregoing flowchart will now be summarized.
- When a switch operation in preparation for photography is performed by the photographer, infrared light is projected toward the object OBJ from the
projection module 721 to form the infrared spot SPT on the object, as shown in FIG. 25. As a result, the image sensing optical system forms the image of the infrared spot SPT on the main image sensing device 711 via the infrared passing filter 707, and the distance to the object OBJ is detected based upon the amount of shift of the spot image from the reference position. - Next, when a picture is taken, the projection of infrared light is halted and the focusing lens is driven based upon the results of rangefinding. Next, photography is performed upon changing over the filter in front of the main
image sensing device 711 to the infrared blocking filter 706, namely to the filter that passes visible light, the image acquired is stored and preserved in the image memory 106 and the image of the picture taken is displayed on the display unit 104. - In accordance with the seventh embodiment, the following advantages are obtained in addition to the advantages (AD10)-(AD14) of the sixth embodiment:
- (AD15) An active-triangulation-type rangefinding device that projects infrared light can be provided. This makes it possible to sense distance accurately even in a dark field.
- (AD16) Since an infrared spot image is obtained by an image sensing optical system having a large aperture, it is possible to measure distance even to a distant object.
- (AD17) Since a changeover is made between an infrared passing filter used for rangefinding and an infrared blocking filter used for imaging, rangefinding can be performed without the influence of external light even when the field is bright. In addition, degradation of the photographic image by infrared light can be prevented at the time of imaging, making it possible to obtain a high-quality image.
- (AD18) Since the camera has an optical finder, it is unnecessary to present an image display by a liquid crystal monitor or the like at the time of photographic preparations such as rangefinding. This makes it possible to conserve power.
- In the sixth embodiment described above, the image from the main
image sensing device 111 is displayed on thedisplay unit 104, such as a liquid crystal monitor, as is when rangefinding is performed. However, a liquid crystal monitor provides a display of low resolution and, though it makes it possible to roughly ascertain the focused state, accurate verification of focusing is difficult. In an eighth embodiment described below, a second image of an object is displayed superimposed on a first image of the object after being shifted by an amount proportional to the amount of defocusing. In other words, the eighth embodiment provides a finder of double-image coincidence type. - FIGS. 32 through 35 are diagrams relating to the eighth embodiment.
- FIG. 32 is a block diagram showing the structure of a
camera body 801 used in the eighth embodiment. The components are the same as those of the sixth embodiment, the only difference being the manner of control at the time of rangefinding and the manner in which an image is displayed. In the camera body 801 of FIG. 32, the reference numerals - FIG. 33 corresponds to FIG. 23 of the sixth embodiment and illustrates a first object image IM310 for imaging formed in the computation memory and a second object image IM32DEF within the
rangefinding module 621. In the sixth embodiment, computation is performed to make the size of the first image IM1 of the object conform to the size of the second image of the object. According to the eighth embodiment, however, these images are used in presenting a display. The second image, therefore, is made to conform to the first image, which is for picture-taking purposes, and it is so arranged that the sizes and limits of the images displayed on the display unit 804 will coincide with the imaging area. - FIG. 34 is a diagram illustrating the state of a display on the
display unit 804 displaying the image signals of FIG. 33. - An image IM31L from the main
image sensing device 111 is displayed over the entire display area. A rectangular area AR centered on the display area is a twin-image display area in which an image IM32L, which is obtained by extracting part of the image from the subordinate image sensing device 624, is displayed in superposition on the image IM31L. The images IM31L and IM32L are displayed in a form offset from each other by an amount DELTA calculated in accordance with the following equation:
- where VDEF represents a quantity relating to the distance to the object OBJ, DFOCUS a quantity relating to the amount of feed of the focusing
lens 152 and K a coefficient for improving visibility by enlarging the display offset quantity. - For example, if we assume that the distance to the object OBJ is infinity and that the amount of feed of the focusing
lens 152 for focusing the object is zero, then we have - DELTA=K·(V DEF −D FOCUS)=K·(0−0)=0 (EQ4)
- If the distance to the object is finite, then values of VDEF and DFOCUS conforming thereto are used. However, if the object is in focus owing to feed of the focusing
lens 152, then DELTA=0 holds at this time as well. - In other words, if the offset quantity DELTA corresponds to the amount of focal shift of the image sensing system with respect to the object OBJ and the image IM1 of the object is in focus, then DELTA will always be equal to zero. Accordingly, a coincidence finder can be implemented by the arrangement described above.
- FIG. 35 is a flowchart illustrating the flow of control by the
microcomputer 831 in a case where focus detection, focusing and photography are carried out in thecamera body 801 of the eighth embodiment. - The control flow of FIG. 35 will be described while making reference to FIGS. 32 and 34.
- When the
main switch 120 of the camera body 801 is turned on, the microcomputer 831 is activated from the sleep state and control proceeds from step S701 to step S702, at which the states of the switches 121-124 inside the camera body 801 are sensed. - The state of the photography preparation switch 121 (SW1), which is turned on by pressing the release button through its first stroke length, is sensed at step S703. Control returns to step S702 when the
switch 121 is off and proceeds to step S704 when the switch 121 is on. - Next, at step S704, the main
image sensing device 111 is activated to acquire an image signal. - Processing of the image signal acquired at step S704 is executed at step S705. More specifically, processing such as A/D conversion of the image signal, white balance adjustment and gamma correction is executed.
- This is followed by step S706, at which object brightness information is calculated from the image signal processed at step S705. Further, in accordance with a predetermined exposure control program, a control value for stopping down the
stop 155 and the exposure time (charge storage time) of the image sensing device 111 are calculated. - Next, the setting of the
AF mode switch 123 is discriminated at step S707 to determine if the autofocus (AF) mode is in effect. Control jumps to step S722 if the mode is not the AF mode and proceeds to step S711 if the mode is the AF mode. - The
microcomputer 831 senses the state of the focus encoder 163 at step S711 and senses the state of the zoom encoder 165 at step S712 to judge the current optical status of the lens. - A coefficient for making the size of the first image IM1 of the object equal to that of the second image of the object, namely an image magnification correction coefficient, is read out of the ROM of the
microcomputer 831 at step S713 in the manner described above in connection with FIGS. 22A, 22B. Coefficients are stored in the ROM as matrix data corresponding to the states of the focus encoder 163 and zoom encoder 165.
- The subordinate
image sensing device 624 is activated at step S715 to obtain an image signal. - Processing of the image signal acquired at step S715 is executed at step S716. More specifically, processing such as A/D conversion of the image signal, white balance adjustment and gamma correction is executed.
- This is followed by step S717, at which image size is corrected by multiplying the image signal of the second image IM2 acquired at step S715 by the reciprocal of the image magnification correction coefficient read out at step S713.
- Next, at step S718, the distance to the object is calculated in accordance with Equations (EQ.1), (EQ.2) using the second image IM2 that was subjected to the image magnification correction at step S717, the first image IM1 obtained at step S705 and the position offset correction quantity V0 obtained at step S714.
- The focusing
lens group 152 is driven at step S719 based upon the result of the above-described calculation to bring the first image IM1 into focus. - This is followed by step S720, at which the offset DELTA between the two images for display shown in FIG. 34 is calculated in accordance with Equation (EQ.3) and processing for superposing the two images is executed.
- Next, at step S721, the image signal obtained at step S720, namely the split-image coincidence image signal, is displayed on the
display unit 804. - The state of the photography switch 122 (SW2) is discriminated at step S722. If the
switch 122 is off, control returns to step S702 so that the processing of steps S702-S721 is executed again. If the photography switch 122 is found to be on at step S722, it is judged that release has been performed and control shifts to step S731.
- The operation of the camera according to the foregoing flowchart will now be summarized.
- When the
main switch 120 and photography preparation switch 121 are turned on by the photographer, the camera performs the rangefinding calculation and carries out autofocusing by driving the focusing lens in a manner similar to that of the sixth embodiment. On the basis of the results of the rangefinding calculation and the results of driving the focusing lens, the state of focusing of the object is displayed as the amount of shift between two images on the double-image coincidence display device. Continuous focusing is performed by repeatedly executing this operation and the photographer is notified of the results of focusing in the form of the amount of offset between the twin images. Next, when the photography switch 122 is turned on, the image of the object projected upon the main image sensing device is recorded in the image memory and the image of the picture taken is displayed on the display unit 804.
- (AD19) The state of focusing is displayed as a coincidence finder image on an electronic finder such as a liquid crystal monitor. As a result, the status of focus of the image of the object is made much more discernible.
- According to the eighth embodiment, a photoelectric coincidence finder is realized using the rangefinding device of the sixth embodiment. A ninth embodiment described below illustrates a case where the photoelectric coincidence finder is realized using the conventional passive- or active-type rangefinding device.
- FIGS. 36 through 39 are diagrams relating to the ninth embodiment.
- FIG. 36 is a diagram showing the construction of a
camera body 901 according to the ninth embodiment. Components other than those described below operate in the same manner as set forth in connection with the sixth embodiment of FIG. 20. Only the components that differ will be described. - The
camera body 901 has various functional components for forming the image of an object OBJ, detecting focus and sensing the image. - A
rangefinding module 921 has two light-receiving lenses 922 of the same power spaced apart by a predetermined baselength BL for forming images IM2, IM3 of the object OBJ whose distance is to be measured, an infrared blocking filter 923 for blocking infrared light and passing only visible light of the light flux that has passed through the light-receiving lens 922, and a subordinate image sensing device 924. The rangefinding optical system, which includes the light-receiving lenses 922 and the infrared blocking filter 923, forms a second image IM2 of the object OBJ and a third image IM3 of the object OBJ on the subordinate image sensing device 924, described later. - The subordinate
image sensing device 924, such as a CCD, is a two-dimensional photoelectric sensor for photoelectrically converting the second and third images IM2, IM3 of the object. The distance to the object OBJ can be detected from the spacing between images IM2, IM3 and the baselength BL using a prescribed calculation formula. - A
microcomputer 931 performs rangefinding and presents a display on a coincidence finder in accordance with a flowchart described below. - FIG. 37 is a diagram showing the subordinate
image sensing device 924 and the disposition of two images formed on the image sensing device. - A light-receiving
portion 925 of the subordinate image sensing device 924 comprises m2×n2 light-receiving pixels and a charge transfer portion (vertical transfer CCD) for transferring electric charge that has accumulated in these pixels. A horizontal transfer CCD 926 stores electric charge transferred in the direction of arrow TRV by the vertical transfer CCD in the light-receiving portion 925, then transfers the charge in the direction of arrow TRH and outputs an image signal from a signal output portion 927 to the image sensor driver 134.
-
display unit 904. An image IM1L0 from the mainimage sensing device 111 is displayed over the entire display area. A rectangular area AR centered on the display area is a twin-image display area. An image IM1LDEF, which is obtained by extracting the central portion of an image which is the copy of the image IM1L0 obtained by theimage sensing device 111, is displayed in the area AR in superposition on the image IM1L0. The images IM1L0 and IM1LDEF are displayed in a form offset from each other by an amount DELTA calculated in accordance with the following equation: - DELTA=K×(V DEF −D FOCUS) (EQ. 6)
- where VDEF represents a quantity relating to the distance to the object OBJ, DFOCUS a quantity relating to the amount of feed of the focusing
lens 152 and K a coefficient for improving visibility by enlarging the display offset quantity. - For example, if we assume that the distance to the object OBJ is infinity and that the amount of feed of the focusing
lens 152 for focusing on the object is zero, then the display image offset quantity DELTA will be represented by the following equation, which is similar to Equation (EQ.4) of the eighth embodiment: - DELTA=K×(VDEF−DFOCUS)=K×(0−0)=0 (EQ.7)
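As a sketch, Equations (EQ. 6) and (EQ. 7) translate directly into code; the value of K below is an arbitrary illustration, not a value from the patent:

```python
def display_offset(v_def, d_focus, k=4.0):
    """EQ. 6: DELTA = K x (VDEF - DFOCUS).
    K enlarges the displayed offset for visibility; 4.0 is an
    arbitrary illustrative value."""
    return k * (v_def - d_focus)

# EQ. 7: object at infinity (VDEF = 0) with zero lens feed (DFOCUS = 0)
# yields DELTA = 0, i.e. the twin images coincide.
print(display_offset(0.0, 0.0))  # 0.0
```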
- If the distance to the object is finite, then values of VDEF and DFOCUS conforming thereto are used. However, if the object is in focus owing to feed of the focusing
lens 152, then DELTA=0 holds at this time as well. - In other words, according to the ninth embodiment, the construction of the
rangefinding module 921 and the images superposed on each other in the twin-image display area of the display unit 904 differ from those of the eighth embodiment. However, as in the eighth embodiment, the offset quantity DELTA corresponds to the amount of focal shift of the image sensing system with respect to the object OBJ, and a finder of twin-image coincidence type similar to that of the eighth embodiment can be implemented. - FIG. 39 is a flowchart illustrating the flow of control by the
microcomputer 931 in a case where focus detection, focusing and photography are carried out in the camera body 901 of the ninth embodiment. - The control flow of FIG. 39 will be described while making reference to FIGS. 36 through 38.
- When the
main switch 120 of the camera body 901 is turned on, the microcomputer 931 is activated from the sleep state and control proceeds from step S801 to step S802, at which the states of the switches 121-124 inside the camera body 901 are sensed. - The state of the photography preparation switch 121 (SW1), which is turned on by pressing the release button through its first stroke length, is sensed at step S803. Control returns to step S802 when the
switch 121 is off and proceeds to step S804 when the switch 121 is on. - Next, at step S804, the main
image sensing device 111 is activated to acquire an image signal. - Processing of the image signal acquired at step S804 is executed at step S805. More specifically, processing such as A/D conversion of the image signal, white balance adjustment and gamma correction is executed.
- This is followed by step S806, at which object brightness information is calculated from the image signal processed at step S805. Further, in accordance with a predetermined exposure control program, a control value for stopping down the
stop 155 and the exposure time (charge storage time) of the image sensing device 111 are calculated. - Next, the setting of the
AF mode switch 123 is discriminated at step S807 to determine if the autofocus (AF) mode is in effect. Control jumps to step S820 if the mode is not the AF mode and proceeds to step S811 if the mode is the AF mode. - The
microcomputer 931 senses the state of the focus encoder 163 at step S811 and senses the state of the zoom encoder 165 at step S812 to judge the current optical status of the lens. - The subordinate
image sensing device 924 is activated at step S813 to obtain image signals for rangefinding purposes. - Next, the image signals acquired at step S813 are subjected to processing such as A/D conversion at step S814.
- Next, at step S815, the position offset quantity between the digital image signals of the images IM2 and IM3 obtained at step S814 is calculated and so is the distance to the object.
- The focusing
lens group 152 is driven at step S816 based upon the result of the above-described calculation to bring the first image IM1 into focus. - The offset quantity DELTA between the two images for display purposes shown in FIG. 38 is calculated in accordance with Equation (EQ.6) at step S817.
- This is followed by step S818, at which processing for superposing the two images in the manner shown in FIG. 38 is executed.
- Next, at step S819, the image signal obtained at step S818, namely the split-image coincidence image signal, is displayed on the
display unit 904. - The state of the photography switch 122 (SW2) is discriminated at step S820. If the
switch 122 is off, control returns to step S802 so that the processing of steps S802-S819, namely automatic focusing and display of images on the display unit, is executed again. If the photography switch 122 is found to be on at step S820, it is judged that release has been performed and control shifts to step S831. - Steps S831-S836 are for an image sensing operation identical with that of steps S731-S736 of FIG. 35 according to the eighth embodiment. When the processing of step S836 is completed, control returns to step S802.
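The flow of steps S801-S836 can be summarized as the loop below. This is only a schematic sketch: the `cam` object and every method name on it are hypothetical stand-ins for the hardware operations named in the flowchart.

```python
def camera_loop(cam):
    """Schematic of FIG. 39: repeat AF and display until SW2 fires."""
    while cam.main_switch_on():               # S801-S802: poll switches
        if not cam.sw1_pressed():             # S803: SW1 (half-press)
            continue
        image = cam.read_main_sensor()        # S804-S805: acquire + process
        cam.compute_exposure(image)           # S806: brightness / exposure
        if cam.af_mode():                     # S807: AF mode selected?
            pair = cam.read_rangefinder()     # S811-S814: rangefinding signals
            dist = cam.rangefind(pair)        # S815: image offset -> distance
            cam.drive_focus_lens(dist)        # S816: focus the first image
            cam.show_twin_image(image, dist)  # S817-S819: coincidence display
        if cam.sw2_pressed():                 # S820: SW2 (full press)
            cam.capture_and_record()          # S831-S836: expose and record
```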
- The operation of the camera according to the foregoing flowchart will now be summarized.
- When the
main switch 120 and photography preparation switch 121 are turned on by the photographer, the camera performs the rangefinding calculation using the image signals obtained from the rangefinding module 921 and carries out automatic focusing by driving the focusing lens 152 based upon the results of rangefinding calculation. On the basis of the results of rangefinding calculation and the results of driving the focusing lens, the state of focal shift of the image on the image sensing device 111 is calculated. The image of the object obtained from the image sensing device 111 and an image obtained by extracting the central portion of a copy of the first-mentioned image are superposed and displayed on the display unit 904 with an offset between them that depends upon the amount of focal shift. Continuous focusing is performed by repeatedly executing this operation and the photographer is notified of the results of focusing in the form of the amount of offset between the twin images. Next, when the photography switch 122 is turned on, the image of the object projected upon the main image sensing device is recorded in the image memory and the image of the picture taken is displayed on the display unit 904. - The ninth embodiment provides the following advantage in addition to the advantages (AD10)-(AD14) according to the sixth embodiment.
- (AD20) The state of focusing is displayed as a coincidence finder image even in an image sensing apparatus having the conventional rangefinding device and an electronic finder such as a liquid crystal monitor. As a result, the status of focus of the image of the object is made much more discernible through a simple, inexpensive arrangement.
- The rangefinding device in the ninth embodiment uses a passive triangulation rangefinder according to the prior art. However, it is possible to use a conventional active triangulation rangefinding device or a so-called sonar-type rangefinding device, which measures distance based upon the length of time required to receive reflected ultrasonic waves projected toward an object.
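The sonar-type alternative reduces to the round-trip relation distance = v·t/2; a one-function sketch (the speed-of-sound value is a nominal room-temperature figure, an assumption):

```python
def sonar_distance(echo_time_s, speed_of_sound_m_s=343.0):
    """Distance from the round-trip time of a reflected ultrasonic
    pulse: the pulse travels to the object and back, hence the /2."""
    return speed_of_sound_m_s * echo_time_s / 2.0
```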
- As many apparently widely different embodiments of the present invention can be made without departing from the spirit and scope thereof, it is to be understood that the invention is not limited to the specific embodiments thereof except as defined in the appended claims.
Claims (33)
1. An image sensing apparatus comprising:
photoelectric conversion means for photoelectrically converting an image of an object obtained through image forming optics;
optical path changeover means provided on an optical path between the image forming optics and said photoelectric conversion means and movable between first and second positions for changing over the optical path in such a manner that a first image of the object is formed on said photoelectric conversion means when said optical path changeover means is at the first position and a second image of the object is formed on said photoelectric conversion means when said optical path changeover means is at the second position;
focus detection means for detecting state of focus of the image forming optics using the first image when said optical path changeover means is at the first position; and
image sensing means for picking up the second image using the photoelectric conversion means when said optical path changeover means is at the second position.
2. The apparatus according to claim 1, wherein said optical path changeover means has focus detection optics for forming two secondary images as the second image of the object from a pair of light fluxes obtained by passing the image of the object through different pupil areas;
said apparatus further comprising means for detecting a phase difference between the two secondary images, which have been formed on said photoelectric conversion means, when said optical path changeover means is at the first position.
3. The apparatus according to claim 1, wherein said optical path changeover means includes
a first mirror for deflecting an image forming light flux in a direction different from an image forming optic axis connecting said image forming optics and said photoelectric conversion means; and
a second mirror for returning the deflected light flux to the image forming optic axis.
4. The apparatus according to claim 1, wherein said optical path changeover means includes a beam splitter for splitting an image forming light flux into light fluxes in first and second directions at a predetermined ratio of light quantities;
said apparatus having finder means for viewing the image of the object along the second direction.
5. The apparatus according to claim 1, wherein said optical path changeover means has lens means for making image forming power of the image forming optics different at the first and second positions.
6. The apparatus according to claim 1, further comprising release operating means, wherein said focus detection means is activated in response to a first operation of said release operating means, and said optical path changeover means is switched from the first position to the second position in response to a second operation of said release operating means.
7. The apparatus according to claim 1, further comprising display means for displaying the first image of the object when focus detection is performed by said focus detection means and the second image of the object when pick-up is performed by said image sensing means.
8. The apparatus according to claim 1, further comprising focusing control means for performing focusing based upon results of focus detection by said focus detection means.
9. The apparatus according to claim 1, further comprising display means for selectively displaying the first and second images of the object;
wherein said optical path changeover means has image magnification changing means for forming the first image on said photoelectric conversion means at a first magnification at the first position and forming the second image on said photoelectric conversion means at a second magnification at the second position, whereby sizes of the first and second images displayed on said display means are made substantially the same.
10. The apparatus according to claim 1, wherein said optical path changeover means forms a plurality of images of the object on said photoelectric conversion means at the first position and forms a single image of the object on said photoelectric conversion means at the second position.
11. The apparatus according to claim 1, wherein said optical path changeover means forms the first image of the object at a location displaced from the center of a light-receiving portion of said photoelectric conversion means at the first position and forms the second image of the object at the center of the light-receiving portion of the photoelectric conversion means at the second position.
12. An image sensing method comprising the steps of:
photoelectrically converting an image of an object, which is obtained through image forming optics, using photoelectric conversion means;
forming a first image of the object on said photoelectric conversion means by moving optical path changeover means, which is movably disposed between the image forming optics and said photoelectric conversion means, to a first position, and forming a second image of the object on said photoelectric conversion means by moving said optical path changeover means to a second position;
detecting state of focus of the image forming optics using the first image; and
picking up the second image.
13. A computer readable storage medium storing a program executable by a computer, said storage medium storing:
program code for a procedure for converting an image of an object, which is obtained through image forming optics, using photoelectric conversion means;
program code for a procedure for forming a first image of the object on said photoelectric conversion means by moving optical path changeover means, which is movably disposed between the image forming optics and said photoelectric conversion means, to a first position, and forming a second image of the object on said photoelectric conversion means by moving said optical path changeover means to a second position;
program code for a procedure for detecting state of focus of the image forming optics using the first image; and
program code for a procedure for picking up the second image.
14. An image sensing apparatus comprising:
first optical image forming means for capturing a light flux from an object and forming a first image of the object;
first photoelectric conversion means for photoelectrically converting the first image;
second optical image forming means, which is spaced away from said first optical image forming means by a predetermined baselength, for forming a second image of the object;
second photoelectric conversion means for photoelectrically converting the second image; and
rangefinding means for sensing distance between said first optical image forming means and the object based upon outputs from said first and second photoelectric conversion means.
15. The apparatus according to claim 14, wherein said second optical image forming means has an image forming power different from that of said first optical image forming means.
16. The apparatus according to claim 14, further comprising image signal recording means for recording the output of said first photoelectric conversion means.
17. The apparatus according to claim 14, wherein said first optical image forming means includes a zoom lens and said rangefinding means has image magnification correction means for correcting a fluctuation in image magnification that accompanies a zooming operation of said zoom lens.
18. The apparatus according to claim 14, further comprising display means for displaying the first image.
19. An image sensing apparatus comprising:
projection means for projecting a rangefinding light flux toward an object to form a rangefinding pattern on the object;
optical image forming means, which is spaced away from said projection means by a predetermined baselength, for selectively forming the image of the rangefinding pattern and the image of the object;
photoelectric conversion means for photoelectrically converting the image of the rangefinding pattern and the image of the object; and
rangefinding means for sensing distance between said optical image forming means and the object based upon an output from said photoelectric conversion means when the image of the rangefinding pattern has been received by said photoelectric conversion means.
20. The apparatus according to claim 19, wherein said optical image forming means includes a zoom lens and said rangefinding means has image magnification correction means for correcting a fluctuation in image magnification that accompanies a zooming operation of said zoom lens.
21. The apparatus according to claim 19, further comprising image signal recording means for recording the output of said photoelectric conversion means when the image of the object has been received by said photoelectric conversion means.
22. The apparatus according to claim 19, further comprising wavelength region selecting means, which is interposed between said optical image forming means and said photoelectric conversion means, for passing a first wavelength region when the image of the rangefinding pattern is photoelectrically converted and passing a second wavelength region when the image of the object is photoelectrically converted.
23. The apparatus according to claim 14, further comprising:
focusing means for focusing said first optical image forming means based upon an output of said rangefinding means;
focal shift discrimination means for discriminating state of focus of the image of the object based upon outputs from said rangefinding means and said focusing means;
image signal combining means for combining outputs from said first and second photoelectric conversion means;
display means for displaying an output image from said image signal combining means; and
combining control means for changing operation of said image signal combining means based upon an output from said focal shift discrimination means.
24. The apparatus according to claim 23, wherein said combining control means varies relative amount of offset between display positions, on said display means, of outputs from said first and second photoelectric conversion means in dependence upon a focal shift signal output by said focal shift discrimination means.
25. An image sensing apparatus comprising:
optical image forming means for capturing a light flux from an object and forming an image of the object;
photoelectric conversion means for photoelectrically converting the image of the object;
rangefinding means for sensing distance between said optical image forming means and the object;
focusing means for focusing said optical image forming means based upon an output from said rangefinding means;
display means for displaying the image of the object from said photoelectric conversion means;
focal shift discrimination means for discriminating state of focus of the image of the object based upon outputs from said rangefinding means and said focusing means; and
display control means for varying form of display of the image of the object on said display means based upon an output from said focal shift discrimination means.
26. The apparatus according to claim 25, wherein said display means has first and second display areas, and said display control means varies position of a displayed image in the second display area relative to a displayed image in the first display area in dependence upon a focal shift signal from said focal shift discrimination means.
27. The apparatus according to claim 25, wherein said focal shift discrimination means detects amount of focal shift from outputs from said rangefinding means and said focusing means.
28. An image sensing method comprising the steps of:
capturing a light flux from an object via first optical image forming means and forming a first image of the object;
photoelectrically converting the first image using first photoelectric conversion means;
forming a second image of the object via second optical image forming means, which is spaced away from said first optical image forming means by a predetermined baselength;
photoelectrically converting the second image using second photoelectric conversion means; and
measuring distance between said first optical image forming means and the object based upon outputs from said first and second photoelectric conversion means.
29. An image sensing method comprising the steps of:
projecting a rangefinding light flux toward an object using projection means to form a rangefinding pattern on the object;
selectively forming the image of the rangefinding pattern and the image of the object via optical image forming means spaced away from said projection means by a predetermined baselength;
photoelectrically converting the image of the rangefinding pattern and the image of the object using photoelectric conversion means; and
measuring distance between said optical image forming means and the object based upon an output from said photoelectric conversion means when the image of the rangefinding pattern has been received by said photoelectric conversion means.
30. An image sensing method comprising the steps of:
capturing a light flux from an object via optical image forming means and forming an image of the object;
photoelectrically converting the image of the object using photoelectric conversion means;
measuring distance between said optical image forming means and the object;
focusing said optical image forming means based upon the distance measured;
displaying the image of the object from said photoelectric conversion means;
discriminating focal shift of the image of the object based upon an output of measured distance and an output of focusing; and
varying form of the display based upon the focal shift.
31. A computer readable storage medium storing a program executable by a computer, said storage medium storing:
program code for capturing a light flux from an object via first optical image forming means and forming a first image of the object;
program code for photoelectrically converting the first image using first photoelectric conversion means;
program code for forming a second image of the object via second optical image forming means, which is spaced away from said first optical image forming means by a predetermined baselength;
program code for photoelectrically converting the second image using second photoelectric conversion means; and
program code for measuring distance between said first optical image forming means and the object based upon outputs from said first and second photoelectric conversion means.
32. A computer readable storage medium storing a program executable by a computer, said storage medium storing:
program code for projecting a rangefinding light flux toward an object using projection means to form a rangefinding pattern on the object;
program code for selectively forming the image of the rangefinding pattern and the image of the object via optical image forming means spaced away from said projection means by a predetermined baselength;
program code for photoelectrically converting the image of the rangefinding pattern and the image of the object using photoelectric conversion means; and
program code for measuring distance between said optical image forming means and the object based upon an output from said photoelectric conversion means when the image of the rangefinding pattern has been received by said photoelectric conversion means.
33. A computer readable storage medium storing a program executable by a computer, said storage medium storing:
program code for capturing a light flux from an object via optical image forming means and forming an image of the object;
program code for photoelectrically converting the image of the object using photoelectric conversion means;
program code for measuring distance between said optical image forming means and the object;
program code for focusing said optical image forming means based upon the distance measured;
program code for displaying the image of the object from said photoelectric conversion means;
program code for discriminating focal shift of the image of the object based upon an output of measured distance and an output of focusing; and
program code for varying form of the display based upon the focal shift.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP9-278593 | 1997-10-13 | ||
JP9-278594 | 1997-10-13 | ||
JP9278594A JPH11122517A (en) | 1997-10-13 | 1997-10-13 | Image pickup device and storage medium read by computer |
JP9278593A JPH11119089A (en) | 1997-10-13 | 1997-10-13 | Image pickup device and storage medium readable by computer |
Publications (2)
Publication Number | Publication Date |
---|---|
US20010045989A1 true US20010045989A1 (en) | 2001-11-29 |
US6377305B2 US6377305B2 (en) | 2002-04-23 |
Family
ID=26552939
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/170,056 Expired - Lifetime US6377305B2 (en) | 1997-10-13 | 1998-10-13 | Image sensing apparatus |
Country Status (1)
Country | Link |
---|---|
US (1) | US6377305B2 (en) |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030227655A1 (en) * | 2002-06-06 | 2003-12-11 | Tsung-Yin Chen | Lens module with adjustable magnification |
US6781632B1 (en) * | 1999-04-20 | 2004-08-24 | Olympus Corporation | Image pick-up apparatus capable of focus detection |
US6882369B1 (en) * | 1999-05-19 | 2005-04-19 | Olympus Optical Co., Ltd. | Electronic still camera with capability to perform optimal focus detection according to selected mode |
US6897899B1 (en) * | 1999-05-10 | 2005-05-24 | Olympus Optical Co., Ltd. | Electronic image pickup apparatus |
EP1533999A2 (en) * | 2003-11-21 | 2005-05-25 | Fujinon Corporation | Autofocus system |
US20050157180A1 (en) * | 2000-07-04 | 2005-07-21 | Koji Takahashi | Image sensing system and its control method |
US20050174464A1 (en) * | 2004-02-09 | 2005-08-11 | Olympus Corporation | Camera |
US6933978B1 (en) * | 1999-10-28 | 2005-08-23 | Canon Kabushiki Kaisha | Focus detecting device with photoelectric conversion portion having microlens and with light blocking portion having first and second openings |
US20070211351A1 (en) * | 2006-02-20 | 2007-09-13 | Lg Electronics Inc. | Apparatus and method for performing offset compensation, and computer-readable recording medium for performing the offset compensation |
US20080266652A1 (en) * | 2007-04-30 | 2008-10-30 | General Electric Company | Microscope with dual image sensors for rapid autofocusing |
US20090128657A1 (en) * | 2004-03-25 | 2009-05-21 | Ozluturk Fatih M | Method and apparatus to correct digital image blur due to motion of subject or imaging device |
US7702229B2 (en) | 2006-10-18 | 2010-04-20 | Eastman Kodak Company | Lens array assisted focus detection |
US20100171871A1 (en) * | 2009-01-07 | 2010-07-08 | Canon Kabushiki Kaisha | Image sensing apparatus and control method therefor |
US20120014611A1 (en) * | 2010-07-19 | 2012-01-19 | Dean Bruce H | System and method for determining phase retrieval sampling from the modulation transfer function |
US20130308933A1 (en) * | 2012-05-18 | 2013-11-21 | Canon Kabushikiki Kaisha | Lens apparatus and image pickup system |
US20140063234A1 (en) * | 2012-08-31 | 2014-03-06 | Canon Kabushiki Kaisha | Distance calculating apparatus, image pick-up apparatus using the same, distance calculating method, program for realizing distance calculation, and storage medium having the program stored thereon |
US20140362277A1 (en) * | 2013-06-07 | 2014-12-11 | Canon Kabushiki Kaisha | Imaging apparatus and control method for same |
US9826159B2 (en) | 2004-03-25 | 2017-11-21 | Clear Imaging Research, Llc | Method and apparatus for implementing a digital graduated filter for an imaging apparatus |
US20180316846A1 (en) * | 2017-04-28 | 2018-11-01 | Canon Kabushiki Kaisha | Focus detection device and imaging system |
US10721405B2 (en) | 2004-03-25 | 2020-07-21 | Clear Imaging Research, Llc | Method and apparatus for implementing a digital graduated filter for an imaging apparatus |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3461275B2 (en) * | 1997-12-25 | 2003-10-27 | キヤノン株式会社 | Photoelectric conversion device and camera using the same |
FR2782809B1 (en) * | 1998-05-15 | 2005-08-26 | Asahi Optical Co Ltd | DETECTION DEVICE FOR DEVELOPING STATUS |
FR2782808B1 (en) * | 1998-05-15 | 2003-11-28 | Asahi Optical Co Ltd | IMAGE FORMING DEVICE ARRANGEMENT |
JP2000350086A (en) * | 1999-06-03 | 2000-12-15 | Fuji Photo Film Co Ltd | Image processor, image output device and camera |
JP3548539B2 (en) * | 2001-03-09 | 2004-07-28 | キヤノン株式会社 | Observation optics and binoculars |
JP3800052B2 (en) * | 2001-08-20 | 2006-07-19 | ソニー株式会社 | Still image capturing apparatus and method |
TWI232349B (en) * | 2003-07-07 | 2005-05-11 | Benq Corp | Method for adjusting relative position of lens module by using uniform light source |
US7099555B2 (en) * | 2003-08-20 | 2006-08-29 | Canon Kabushiki Kaisha | Light amount adjusting apparatus, optical equipment, optical filter and image-taking apparatus |
JP4532865B2 (en) * | 2003-09-09 | 2010-08-25 | キヤノン株式会社 | Imaging device and focus control method of imaging device |
JP4865284B2 (en) * | 2005-09-13 | 2012-02-01 | キヤノン株式会社 | Focusing optical system |
JP4770560B2 (en) * | 2006-04-11 | 2011-09-14 | 株式会社ニコン | Imaging apparatus, camera, and image processing method |
DE102008028486A1 (en) * | 2008-06-13 | 2009-12-31 | Siemens Aktiengesellschaft | Device and method for generating a locating signal |
JP5381510B2 (en) * | 2008-09-05 | 2014-01-08 | 株式会社リコー | Imaging device |
JP5637895B2 (en) * | 2011-02-24 | 2014-12-10 | キヤノン株式会社 | Image projection device |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6037511A (en) * | 1983-08-09 | 1985-02-26 | Minolta Camera Co Ltd | Focus detector of camera |
JPH02210974A (en) | 1989-02-09 | 1990-08-22 | Minolta Camera Co Ltd | Still video camera |
JPH02210976A (en) | 1989-02-09 | 1990-08-22 | Minolta Camera Co Ltd | Automatic focus adjustment device for still video camera |
JP2756351B2 (en) * | 1990-07-06 | 1998-05-25 | キヤノン株式会社 | Focus detection device |
JPH07131721A (en) | 1993-11-01 | 1995-05-19 | Konica Corp | Digital still camera |
JP3491343B2 (en) * | 1994-06-29 | 2004-01-26 | 株式会社ニコン | Focus detection device and focus detection method |
JPH0875986A (en) * | 1994-09-06 | 1996-03-22 | Nikon Corp | Focus detector |
- 1998-10-13: US application US09/170,056 granted as patent US6377305B2 (status: Expired - Lifetime)
US11589138B2 (en) | 2004-03-25 | 2023-02-21 | Clear Imaging Research, Llc | Method and apparatus for using motion information and image data to correct blurred images |
US11490015B2 (en) | 2004-03-25 | 2022-11-01 | Clear Imaging Research, Llc | Method and apparatus for capturing digital video |
US9154699B2 (en) | 2004-03-25 | 2015-10-06 | Fatih M. Ozluturk | Method and apparatus to correct blur in all or part of a digital image by combining plurality of images |
US9167162B2 (en) | 2004-03-25 | 2015-10-20 | Fatih M. Ozluturk | Method and apparatus to correct digital image blur due to motion of subject or imaging device by adjusting image sensor |
US9294674B2 (en) * | 2004-03-25 | 2016-03-22 | Fatih M. Ozluturk | Method and apparatus to correct digital image blur due to motion of subject or imaging device |
US9338356B2 (en) | 2004-03-25 | 2016-05-10 | Fatih M. Ozluturk | Method and apparatus to correct digital video to counteract effect of camera shake |
US9392175B2 (en) | 2004-03-25 | 2016-07-12 | Fatih M. Ozluturk | Method and apparatus for using motion information and image data to correct blurred images |
US11165961B2 (en) | 2004-03-25 | 2021-11-02 | Clear Imaging Research, Llc | Method and apparatus for capturing digital video |
US9774785B2 (en) | 2004-03-25 | 2017-09-26 | Clear Imaging Research, Llc | Method and apparatus to correct blur in all or part of a digital image by combining plurality of images |
US9800788B2 (en) | 2004-03-25 | 2017-10-24 | Clear Imaging Research, Llc | Method and apparatus for using motion information and image data to correct blurred images |
US9800787B2 (en) | 2004-03-25 | 2017-10-24 | Clear Imaging Research, Llc | Method and apparatus to correct digital video to counteract effect of camera shake |
US9826159B2 (en) | 2004-03-25 | 2017-11-21 | Clear Imaging Research, Llc | Method and apparatus for implementing a digital graduated filter for an imaging apparatus |
US9860450B2 (en) | 2004-03-25 | 2018-01-02 | Clear Imaging Research, Llc | Method and apparatus to correct digital video to counteract effect of camera shake |
US11108959B2 (en) | 2004-03-25 | 2021-08-31 | Clear Imaging Research Llc | Method and apparatus for implementing a digital graduated filter for an imaging apparatus |
US10171740B2 (en) | 2004-03-25 | 2019-01-01 | Clear Imaging Research, Llc | Method and apparatus to correct blur in all or part of a digital image by combining plurality of images |
US10341566B2 (en) | 2004-03-25 | 2019-07-02 | Clear Imaging Research, Llc | Method and apparatus for implementing a digital graduated filter for an imaging apparatus |
US10880483B2 (en) | 2004-03-25 | 2020-12-29 | Clear Imaging Research, Llc | Method and apparatus to correct blur in all or part of an image |
US10389944B2 (en) | 2004-03-25 | 2019-08-20 | Clear Imaging Research, Llc | Method and apparatus to correct blur in all or part of an image |
US10721405B2 (en) | 2004-03-25 | 2020-07-21 | Clear Imaging Research, Llc | Method and apparatus for implementing a digital graduated filter for an imaging apparatus |
US20070211351A1 (en) * | 2006-02-20 | 2007-09-13 | Lg Electronics Inc. | Apparatus and method for performing offset compensation, and computer-readable recording medium for performing the offset compensation |
US7692712B2 (en) * | 2006-02-20 | 2010-04-06 | Lg Electronics Inc. | Apparatus and method for performing offset compensation, and computer-readable recording medium for performing the offset compensation |
US7702229B2 (en) | 2006-10-18 | 2010-04-20 | Eastman Kodak Company | Lens array assisted focus detection |
US7576307B2 (en) * | 2007-04-30 | 2009-08-18 | General Electric Company | Microscope with dual image sensors for rapid autofocusing |
US20080266652A1 (en) * | 2007-04-30 | 2008-10-30 | General Electric Company | Microscope with dual image sensors for rapid autofocusing |
US8300137B2 (en) * | 2009-01-07 | 2012-10-30 | Canon Kabushiki Kaisha | Image sensing apparatus providing driving direction of focus lens for attaining in-focus state and control method therefor |
US20100171871A1 (en) * | 2009-01-07 | 2010-07-08 | Canon Kabushiki Kaisha | Image sensing apparatus and control method therefor |
US20120014611A1 (en) * | 2010-07-19 | 2012-01-19 | Dean Bruce H | System and method for determining phase retrieval sampling from the modulation transfer function |
US8781313B2 (en) * | 2012-05-18 | 2014-07-15 | Canon Kabushiki Kaisha | Lens apparatus and image pickup system |
US20130308933A1 (en) * | 2012-05-18 | 2013-11-21 | Canon Kabushiki Kaisha | Lens apparatus and image pickup system |
US20140063234A1 (en) * | 2012-08-31 | 2014-03-06 | Canon Kabushiki Kaisha | Distance calculating apparatus, image pick-up apparatus using the same, distance calculating method, program for realizing distance calculation, and storage medium having the program stored thereon |
US9451216B2 (en) * | 2012-08-31 | 2016-09-20 | Canon Kabushiki Kaisha | Distance calculating apparatus, image pick-up apparatus using the same, distance calculating method, program for realizing distance calculation, and storage medium having the program stored thereon |
US20140362277A1 (en) * | 2013-06-07 | 2014-12-11 | Canon Kabushiki Kaisha | Imaging apparatus and control method for same |
US10455140B2 (en) * | 2017-04-28 | 2019-10-22 | Canon Kabushiki Kaisha | Focus detection device and imaging system |
US20180316846A1 (en) * | 2017-04-28 | 2018-11-01 | Canon Kabushiki Kaisha | Focus detection device and imaging system |
Also Published As
Publication number | Publication date |
---|---|
US6377305B2 (en) | 2002-04-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6377305B2 (en) | 2002-04-23 | Image sensing apparatus |
EP2340455B1 (en) | | Camera and camera system |
CN101662588B (en) | | Image sensing apparatus, image sensing system and focus detection method |
US6954233B1 (en) | | Electronic image pick-up apparatus and method of adjusting the focal position thereof |
JP5168798B2 (en) | | Focus adjustment device and imaging device |
JP4972960B2 (en) | | Focus adjustment device and imaging device |
JP4963569B2 (en) | | Imaging system and lens unit |
JP5168797B2 (en) | | Imaging device |
CN103837959A (en) | | Focus detection apparatus, focus detection method, and image capturing apparatus |
JP4995002B2 (en) | | Imaging device, focusing device, imaging method, and focusing method |
US8644698B2 (en) | | Focusing-state detection apparatus, imaging apparatus, and its control method |
JPH11122517A (en) | | Image pickup device and storage medium read by computer |
JP5157073B2 (en) | | Focus adjustment device and imaging device |
JP4863370B2 (en) | | Imaging device |
JP2006065080A (en) | | Imaging device |
JP5240591B2 (en) | | Imaging device, focusing device, imaging method, and focusing method |
JP2004109864A (en) | | Imaging apparatus and imaging system provided with it |
JP2005308960A (en) | | Imaging apparatus provided with automatic focusing device |
JP4560420B2 (en) | | Imaging device |
JP2013122494A (en) | | Focus detector and camera |
JP2000258683A (en) | | Focus detector and range-finding device |
JP4865275B2 (en) | | Focus detection apparatus and imaging apparatus |
US20110050981A1 (en) | | Focusing apparatus |
JP2000075201A (en) | | Focus detector, focusing device and camera |
US6532343B1 (en) | | Camera finder device |
Legal Events
Date | Code | Title | Description
---|---|---|---
| AS | Assignment | Owner name: CANON KABUSHIKI KAISHA, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ONUKI, ICHIRO;REEL/FRAME:009517/0591. Effective date: 19981005
| STCF | Information on status: patent grant | Free format text: PATENTED CASE
| CC | Certificate of correction |
| FPAY | Fee payment | Year of fee payment: 4
| FPAY | Fee payment | Year of fee payment: 8
| FPAY | Fee payment | Year of fee payment: 12