WO2015144973A1 - Digital imaging with pulsed illumination to generate a depth map
- Publication number
- WO2015144973A1 (PCT/FI2014/050227)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- images
- pulse
- image
- camera unit
- illumination
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G01S17/10—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
- G01S17/18—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves wherein range gates are used
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/507—Depth or shape recovery from shading
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/18—Focusing aids
- G03B13/20—Rangefinders coupled with focusing arrangements, e.g. adjustment of rangefinder automatically focusing camera
Abstract
A method, an apparatus and a computer program control a pulse illuminator and a camera unit to capture a plurality of images of a scene with pulsed illumination that is variably timed in relation to the exposure period of the camera, and produce a depth map based on brightness variation in the captured images.
Description
DIGITAL IMAGING WITH PULSED ILLUMINATION TO GENERATE A DEPTH MAP
TECHNICAL FIELD
[0001] The present application generally relates to digital imaging.
BACKGROUND
[0002] This section illustrates useful background information without an admission that any technique described herein is representative of the state of the art.
[0003] Digital imaging is a technical field that relies on long-known physics, particularly optics, but that also involves numerous further considerations deriving from the capture and processing of digital image information. While film photography is a relatively old art, digital imaging has developed very rapidly during the last decades. Focusing of a camera objective is one of the oldest challenges in camera imaging, with both film and digital cameras. While fixed focusing typically produces crisp and clear images with a short focal length, it is often desirable to blur the background around the desired object or subject of an image. For such images, a longer focal length and/or a larger shutter aperture can be used to produce a suitably compressed depth of field (DOF), i.e. the distance range within which objects appear sharp. In modern digital cameras, the focal depth is typically automatically adjustable with an auto-focus (AF) function.
[0004] Digital cameras can implement AF by making use of the digital information of image objects. For example, the distance to a foreground object can be determined using focus bracketing, in which a set of digital images is taken with different focusing settings. Objects appear blurred when out of focus, and correspondingly the contrasts between adjacent pixels are smaller than when the same objects are imaged with suitable focusing, i.e. appear within the DOF. It is also possible to use specialized equipment and circuitries, such as multiple cameras, to produce inherently three-dimensional images and to estimate distance by computation similar to the way a human brain combines the information from two eyes. A light-field camera is capable of simultaneously forming numerous images with different focal lengths and is thus also suited for computational determination of the distance to image objects.
[0005] Distance information can be used for auto-focus, but post-processing can also benefit from depth information. A depth map can be produced at the time of photographing or after image capture by post-processing. With a depth map, desired image objects can be subjected to different processing, such as recoloring or blurring. However, any technique that requires changing focus settings requires physically adjusting the optics, multi-objective (3D) imaging is optically complex, and post-processing typically depends on the visibility of regular shapes or clear edges in the captured images and requires a fair amount of processing. Dedicated image sensors such as phase detectors also require dedicated and often complex circuitries.
SUMMARY
[0006] Various aspects of examples of the invention are set out in the claims.
[0007] According to a first example aspect of the present invention, there is provided an apparatus, comprising:
[0008] a camera unit;
[0009] a pulse illuminator;
[0010] a controller configured to control the pulse illuminator and the camera unit to capture a set of a plurality of images of a scene with pulsed illumination of variable pulse timing; and
[0011] a processor configured to produce a depth map based on brightness variation in the captured images.
[0012] According to a second example aspect of the present invention, there is provided a method, comprising:
[0013] controlling a pulse illuminator and a camera unit to capture a set of a plurality of images of a scene with pulsed illumination of variable pulse timing; and
[0014] producing a depth map based on brightness variation in the captured images.
[0015] According to a third example aspect of the present invention, there is provided a computer program, comprising:
[0016] code for controlling a pulse illuminator and a camera unit to capture a set of a plurality of images of a scene with pulsed illumination of variable pulse timing; and
[0017] code for producing a depth map based on brightness variation in the captured images;
[0018] when the computer program is run on a processor.
[0019] According to a fourth example aspect of the present invention, there is provided a computer-readable medium encoded with instructions that, when executed by a computer, perform:
[0020] controlling a pulse illuminator and a camera unit to capture a set of a plurality of images of a scene with pulsed illumination of variable pulse timing; and
[0021] producing a depth map based on brightness variation in the captured images.
[0022] Any foregoing memory medium may comprise a digital data storage such as a data disc or diskette, optical storage, magnetic storage, holographic storage, opto-magnetic storage, phase-change memory, resistive random access memory, magnetic random access memory, solid-electrolyte memory, ferroelectric random access memory, organic memory or polymer memory. The memory medium may be formed into a device without substantial functions other than storing data, or it may be formed as part of a device with other functions, including but not limited to a memory of a computer, a chip set, and a sub-assembly of an electronic device.
[0023] Different non-binding example aspects and embodiments of the present invention have been illustrated in the foregoing. The embodiments in the foregoing are used merely to explain selected aspects or steps that may be utilized in implementations of the present invention. Some embodiments may be presented only with reference to certain example aspects of the invention. It should be appreciated that corresponding embodiments may apply to other example aspects as well.
BRIEF DESCRIPTION OF THE DRAWINGS
[0024] For a more complete understanding of example embodiments of the present invention, reference is now made to the following descriptions taken in connection with the accompanying drawings, in which:
[0025] Fig. 1 shows a schematic system for use as a reference with which some example embodiments of the invention can be explained;
[0026] Fig. 2 shows a block diagram of an imaging apparatus of an example embodiment of the invention;
[0027] Fig. 3 shows a block diagram of an imaging unit of an example embodiment of the invention;
[0028] Fig. 4 shows an example scene with three objects;
[0029] Fig. 5 shows a timing diagram of an example embodiment that illustrates timing of an illumination pulse and of an image capture period for a first image;
[0030] Fig. 6 shows a timing diagram of an example embodiment that illustrates timing of an illumination pulse and of an image capture period for a second image;
[0031] Fig. 7 shows a timing chart of an example embodiment that illustrates again the timing of the illumination pulse and of the exposure period; and
[0032] Figs. 8a to 8c show a flow chart illustrating a method of an example embodiment.
DETAILED DESCRIPTION OF THE DRAWINGS
[0033] An example embodiment of the present invention and its potential advantages are understood by referring to Figs. 1 through 8 of the drawings. In this document, like reference signs denote like parts or steps.
[0034] Fig. 1 shows a schematic system 100 for use as a reference with which some example embodiments of the invention can be explained. The system 100 comprises a device 110 such as a camera phone, gaming device, security camera device, personal digital assistant, tablet computer or a digital camera having an imaging unit 120 with a field of view 130. The device 110 further comprises a display 140. Fig. 1 also shows a user 105, an image object 150 that is being imaged by the imaging unit 120, and a background 160.
[0035] In Fig. 1, the image object 150 is relatively small in comparison to the field of view at the image object 150. Next to the image object 150, there is a continuous background 160 and a secondary object 155. While this setting is not by any means necessary, it serves to simplify Fig. 1 and the description of some example embodiments of the invention. The objects and the background collectively form a scene that is seen by the imaging unit 120 in its field of view 130.
[0036] Fig. 2 shows a block diagram of an imaging apparatus 200 of an example embodiment of the invention. The imaging apparatus 200 is suited for operating as the device 110. The apparatus 200 comprises a communication interface 220, a host processor 210 coupled to the communication interface module 220, and a user interface 230 and a memory 240 coupled to the host processor 210.
[0037] The memory 240 comprises a work memory and a non-volatile memory such as a read-only memory, flash memory, optical or magnetic memory. In the memory 240, typically at least initially in the non-volatile memory, there is stored software 250 operable to be loaded and executed by the host processor 210. The software 250 may comprise one or more software modules and can be in the form of a computer program product that is software stored in a memory medium. The imaging apparatus 200 further comprises a digital image capture unit 260 and a viewfinder 270 each coupled to the host processor 210. The viewfinder 270 is implemented in an example embodiment by using a display configured to show a live camera view. The digital image capture unit 260 and the processor 210 are connected via a camera interface 280.
[0038] The term host processor refers to a processor in the apparatus 200 as distinct from the one or more processors in the digital image capture unit 260, referred to as camera processor(s) 330 in Fig. 3. Depending on the implementation, different example embodiments of the invention divide the processing of image information and the control of the imaging unit 300 differently. Also, the processing is performed on the fly in one example embodiment and with off-line processing in another example embodiment. It is also possible that a given amount of images or image information is processed on the fly and that the off-line operation mode is used thereafter, as in one example embodiment. On-the-fly operation refers e.g. to real-time or near real-time operation that keeps pace with the taking of images and that typically also completes before the next image can be taken.
[0039] In an example embodiment, the camera processor 330 is referred to as a controller and the host processor is simply referred to as a processor.
[0040] It shall be understood that any coupling in this document refers to functional or operational coupling; there may be intervening components or circuitries in between coupled elements.
[0041] The communication interface module 220 is configured to provide local communications over one or more local links. The links may be wired and/or wireless links. The communication interface 220 may further or alternatively implement telecommunication links suited for establishing links with other users or for data transfer (e.g. using the Internet). Such telecommunication links may be links using any of: wireless local area network links, Bluetooth, ultra-wideband, cellular or satellite communication links. The communication interface 220 may be integrated into the apparatus 200 or into an adapter, card or the like that may be inserted into a suitable slot or port of the apparatus 200. While Fig. 2 shows one communication interface 220, the apparatus may comprise a plurality of communication interfaces 220.
[0042] Any processor mentioned in this document is selected, for instance, from a group consisting of at least one of the following: a central processing unit (CPU), a microprocessor, a digital signal processor (DSP), a graphics processing unit, an application-specific integrated circuit (ASIC), a field-programmable gate array, a microcontroller, and any number and combination thereof. Fig. 2 shows one host processor 210, but the apparatus 200 may comprise a plurality of host processors.
[0043] As mentioned in the foregoing, the memory 240 may comprise a volatile and a non-volatile memory, such as a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), a random-access memory (RAM), a flash memory, a data disk, an optical storage, a magnetic storage, a smart card, or the like. In some example embodiments, only volatile or non-volatile memory is present in the apparatus 200. Moreover, in some example embodiments, the apparatus comprises a plurality of memories. In some example embodiments, various elements are integrated. For instance, the memory 240 can be constructed as a part of the apparatus 200 or inserted into a slot, port, or the like. Further still, the memory 240 may serve the sole purpose of storing data, or it may be constructed as a part of an apparatus serving other purposes, such as processing data. Similar options are conceivable for various other elements.
[0044] A skilled person appreciates that in addition to the elements shown in Fig. 2, the apparatus 200 may comprise other elements, such as microphones and displays, as well as additional circuitry such as further input/output (I/O) circuitries, memory chips, application-specific integrated circuits (ASIC), and processing circuitry for specific purposes such as source coding/decoding circuitry, channel coding/decoding circuitry, ciphering/deciphering circuitry, and the like. Additionally, the apparatus 200 may comprise a disposable or rechargeable battery (not shown) for powering the apparatus when an external power supply is not available.
[0045] It is also useful to realize that the term apparatus is used in this document with varying scope. In some of the broader claims and examples, the apparatus may refer to only a subset of the features presented in Fig. 2 or even be implemented without any one of the features of Fig. 2. In one example embodiment, the term apparatus refers to the processor 210, an input of the processor 210 configured to receive information from the digital image capture unit 260, and an output of the processor 210 configured to provide information to the viewfinder. For instance, the image processor may comprise the processor 210, and the device in question may comprise the camera processor 330 and the camera interface 280 shown in Fig. 3.
[0046] Fig. 3 shows a block diagram of an imaging unit 300 of an example embodiment of the invention. The digital image capture unit 300 comprises an objective 310, a pulse illuminator 315, an image sensor 320 corresponding to the objective 310, a camera processor 330, and a memory 340 comprising data such as user settings 344 and software 342 with which the camera processor 330 can manage operations of the imaging unit 300. The camera processor 330 operates as an image processing circuitry of an example embodiment. An input/output or camera interface 280 is also provided to enable the exchange of information between the imaging unit 300 and the host processor 210. The image sensor 320 is, for instance, a CCD or CMOS unit. In case of a CMOS unit, the image sensor 320 can also contain built-in analog-to-digital conversion implemented on a common silicon chip with the image sensor 320. In an alternative example embodiment, a separate A/D conversion is provided between the image sensor 320 and the camera processor 330. In an example embodiment, the image sensor 320 comprises a global shutter.
[0047] The camera processor 330 takes care, in particular example embodiments, of one or more of the following functions: digital image stabilization; pixel color interpolation; white balance correction; edge enhancement; aspect ratio control; vignetting correction; combining of subsequent images for high dynamic range imaging; Bayer reconstruction filtering; chromatic aberration correction; dust effect compensation; downscaling of images; and pulsed illumination of the scene.
[0048] In an example embodiment, the camera processor 330 performs little or no processing at all. The camera processor 330 is entirely omitted in an example embodiment in which the imaging unit 300 merely forms digitized images for subsequent processing e.g. by the host processor 210. For most of the following description, the processing can be performed using the camera processor 330, the host processor 210, their combination or any other processor or processors.
[0049] Fig. 4 shows an example scene 400 with three objects: a first object 410 nearest to the camera, a second object 420 a little farther than the first object 410, and a third object 430 still farther. While different example embodiments can be used over a large range of different distances, in an example embodiment the scene is divided into three distance ranges that are, from nearest to farthest: a first range of 0 to 2 m; a second range for objects farther than the first range and up to 5 m; and a third range behind the second range.
[0050] In an example embodiment, the pulsed light source or pulse illuminator is synchronized to the global reset of a camera exposure sequence produced by the camera processor 330. The pulsed light source is located, for example, next to the camera unit 260. The start and the stop of the light pulse can be synchronized to the global reset. A light pulse can also overlap with the exposure.
[0051] In an example embodiment, light pulse driving is controlled with a signal from the image sensor 320. This signal is typically available in sensors supporting a mechanical shutter. In an example embodiment, this signal can be used directly. In another example embodiment, where that is not possible or the timing resolution is not fine enough, an external timer circuit is used. Such an external timer circuit is configured to take the signal from the sensor as an input and to generate the start and stop light pulse signals with adequately fine accuracy.
[0052] Fig. 5 shows a timing diagram that illustrates timing of an illumination pulse 510 and of an image capture or exposure period 520 for a first image and Fig. 6 shows a timing diagram that illustrates timing of an illumination pulse 610 and of an image capture period 620 for a second image, according to an example embodiment. It is assumed that the camera unit 260 has a rolling shutter sensor and thus the reading of pixel lines extends over a period marked with successive vertical lines at the end of the exposure period.
[0053] Fig. 7 shows a timing chart of an example embodiment that illustrates again the timing of the illumination pulse 510 and of the exposure period 520. Fig. 7 further shows as an oblique band 515 the time at which light of the illumination pulse reflects back to the camera as a function of distance to an object from which reflection occurs. For example, it is seen that most light of the illumination pulse reflects from an object 2 m away after the transmission of the illumination pulse has lapsed and during the exposure period 520.
[0054] Fig. 7 demonstrates how the range of effective illumination of image objects, i.e. the range in which the pulse illuminates objects, depends on the mutual timing of the pulse and of the exposure of an image. For example, by delaying the beginning of the image exposure, the range is shifted farther away, and by shortening the pulse, the range is compressed.
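The geometry behind Fig. 7 can be made concrete with a small worked sketch, assuming a simple overlap model that the text does not spell out: light emitted during the pulse interval returns from distance d delayed by the round trip 2d/c, and an object is recorded only if its return interval overlaps the exposure.

```python
# Light emitted during [pulse_start, pulse_stop] returns from distance d
# during [pulse_start + 2d/c, pulse_stop + 2d/c]; an object contributes
# to the image only if that interval overlaps the exposure period.

C = 299_792_458.0  # speed of light in m/s

def illuminated_range(pulse_start_ns: float, pulse_stop_ns: float,
                      exp_start_ns: float, exp_stop_ns: float):
    """Distance interval (in metres) from which any pulse light is captured."""
    d_min = max(0.0, (exp_start_ns - pulse_stop_ns) * 1e-9 * C / 2)
    d_max = max(0.0, (exp_stop_ns - pulse_start_ns) * 1e-9 * C / 2)
    return d_min, d_max

# A 10 ns pulse followed by an exposure of 13..50 ns captures pulse
# light only from objects roughly 0.45 m to 7.5 m away; an object 2 m
# away (round trip ~13.3 ns) reflects into the exposure, as in Fig. 7.
print(illuminated_range(0.0, 10.0, 13.0, 50.0))
```

In this model, delaying the exposure start raises the near limit of the band, while shortening the pulse narrows it, matching the behavior described above.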
[0055] It is understood that in real life, scenes are not formed of equally reflective objects. According to an example embodiment, two or more images are captured with different pulses. One or more of the images can be taken without any pulse illumination at all. According to an example embodiment, the images with differently timed pulses are used to determine brightness variations that indicate depth variations in the scene. Because of the time of flight of the pulses, objects at a given distance are exposed more than other objects, and by comparing such different images, the differences in reflectance, color or texture can be compensated.
[0056] In an example embodiment, an electronic global shutter or a mechanical shutter is used to cut the exposure. A global shutter or mechanical shutter reduces the amount of external light that might affect the measurement.
[0057] In an example embodiment, the image sensor has a partial reset or a partial electronic global shutter. Using such a partial reset or partial global shutter, it may be possible to take the measurement image and the reference image at the same time and to avoid a registration step.
[0058] In an example embodiment, the timing of the light pulse and the exposure overlap. This increases the signal-to-noise level at the cost of depth resolution. The better signal-to-noise level may help to compute the depth map with a higher accuracy.
[0059] An example embodiment is carried out by an apparatus that comprises: a camera unit; a pulse illuminator; a controller configured to control the pulse illuminator and the camera unit to capture a set of a plurality of images of a scene with pulsed illumination of variable pulse timing; and a processor configured to produce a depth map based on brightness variation in the captured images.
[0060] As shown in Fig. 3, the camera unit can be considered to comprise the pulse illuminator 315 and the controller (e.g. the camera processor 330). However, the components can be freely implemented using blocks that perform more than one function or by distributing functions over two or more blocks.
[0061] In an example embodiment, the controller is configured to control the pulse illuminator and the camera unit to capture the set of images with variably timed pulses. For example, the controller can be configured to synchronize the pulse illumination unit and the camera unit to capture a set of two or more images, each with illumination by one pulse of variable timing, so as to vary the illumination of different portions of the scene in the images of the set.
[0062] In an example embodiment, the pulse illuminator is configured to cast the illumination of each pulse over the entire scene area. For example, the pulse illuminator can be implemented using a flash light. In another example embodiment, the pulse illuminator is formed of a pulsed laser diode. If the laser diode emits light in a range partly blocked by an infrared filter of the camera unit 260, the infrared filter can be temporarily removed entirely or in part.
[0063] As for the hardware, with a pixel in which photon capture and storage are separated, it may be possible to store several light pulse sequences. In an example embodiment, this is done to increase the signal-to-noise ratio. For instance, the integration of an exposure can be interrupted after each pulse using a mechanical shutter, or by electrically adjusting the image sensor, so that each further pulse is sent with the same timing in relation to the exposure period.
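A hypothetical numerical sketch of this accumulation idea follows; the noise figures and the 64-pulse count are illustrative assumptions, not from the patent text.

```python
import numpy as np

# Simulated accumulation of several identically timed pulse exposures in
# a single frame: the signal grows linearly with the number of pulses
# while uncorrelated noise grows only as its square root.

rng = np.random.default_rng(0)

def accumulate_pulses(capture_once, n_pulses: int) -> np.ndarray:
    """Sum n_pulses identically timed sub-exposures (simulated)."""
    frame = capture_once()
    for _ in range(n_pulses - 1):
        frame = frame + capture_once()
    return frame

signal = 2.0  # assumed pulse contribution per sub-exposure
capture = lambda: signal + rng.normal(0.0, 4.0, size=(4, 4))
stacked = accumulate_pulses(capture, n_pulses=64)
# SNR improves by about sqrt(64) = 8; the per-pulse mean is near 2.0.
print(stacked.mean() / 64)
```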
[0064] In an example embodiment, the variable timing of pulses refers to pulses differing from one another by at least one of: the beginning moment of the pulse; and the ending moment of the pulse.
[0065] In an example embodiment, the pulses have equal duration.
[0066] In an example embodiment, the controller is configured to control the pulse illuminator to illuminate different images captured by the camera unit at different portions of the exposure period so as to selectively illuminate objects at different distances from the camera unit. See e.g. Figs. 5 and 6, which make it clear that the scene is illuminated by the pulses so that corresponding images are captured by the camera unit with different objects being illuminated. For example, the first object 410 of Fig. 4 could be illuminated by the pulse of Fig. 5, the second object 420 could be illuminated by the pulse of Fig. 6, and neither pulse would illuminate the third object 430 farther back.
[0067] In an example embodiment, the processor is configured to:
receive the set of images and propagation information indicative of the propagation of the light of the pulses of the pulsed illumination in relation to the exposure periods of the images of the set of images; and
produce the depth map based on the propagation information and the brightness variation in the captured images.
[0068] In an example embodiment, the processor is configured to:
compare the brightness of the images of the set of images and to produce, based on the comparison, a difference image indicative of the brightness variation; and
produce the depth map based on the propagation information and the brightness variation indicated by the difference image.
[0069] In an example embodiment, the difference image defines the relative brightness difference between the compared images. For example, brightness can be expressed as a numeric value from 0 to 255 for each pixel. By dividing the brightness values between two images (e.g. those of Figs. 5 and 6), relative brightness changes, or a relative image, can be created. From such an image, the image objects can be automatically classified by their range, e.g. into those in the close range, those in the mid-range and those farther away; a sketch of this is given below.
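The following sketch illustrates this division-based classification under stated assumptions: 8-bit brightness values, two differently pulsed frames, and illustrative ratio thresholds of 0.8 and 1.25 that the patent does not specify.

```python
import numpy as np

def relative_image(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    """Per-pixel ratio of brightness values (0..255) of two images."""
    a = img_a.astype(np.float64)
    b = img_b.astype(np.float64)
    return a / np.clip(b, 1.0, None)  # avoid division by zero

def classify_ranges(ratio: np.ndarray) -> np.ndarray:
    """0 = close (image A brighter), 1 = mid, 2 = far (image B brighter)."""
    labels = np.full(ratio.shape, 1, dtype=np.uint8)
    labels[ratio > 1.25] = 0
    labels[ratio < 0.80] = 2
    return labels

img_a = np.array([[200, 120], [90, 60]], dtype=np.uint8)   # earlier pulse
img_b = np.array([[100, 110], [95, 150]], dtype=np.uint8)  # later pulse
ratio = relative_image(img_a, img_b)
# Per paragraph [0070], the comparison could also use a subsample,
# e.g. ratio[::2, ::2], instead of every pixel.
print(classify_ranges(ratio))  # -> [[0 1]
                               #     [1 2]]
```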
[0070] In some example embodiments, the comparison is simplified by not performing it on all the image pixels but only on a fraction of them, such as one pixel from each group of three or four pixels.
[0071] In an example embodiment, the set comprises three or more images, and the processor is configured to form two or more difference images and to form the depth map based on the two or more difference images and the propagation information. By using more than two images, a finer resolution can be achieved in depth mapping the scene.
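One possible reading of this multi-image scheme, sketched under the assumption that each pixel is assigned to the consecutive-image difference where its brightness changed the most:

```python
import numpy as np

def depth_from_differences(images: list[np.ndarray]) -> np.ndarray:
    """Index of the consecutive difference image with the largest change."""
    stack = np.stack([im.astype(np.float64) for im in images], axis=0)
    diffs = np.abs(np.diff(stack, axis=0))  # two or more difference images
    return np.argmax(diffs, axis=0).astype(np.uint8)

# Three captures with progressively shifted pulse timing (toy values).
early = np.array([[210, 55], [190, 52]], dtype=np.uint8)
mid   = np.array([[80,  60], [70,  54]], dtype=np.uint8)
late  = np.array([[60, 200], [65,  53]], dtype=np.uint8)
# Pixels whose differences are all tiny (e.g. the background at (1, 1))
# are ambiguous and would be masked by a threshold in practice.
print(depth_from_differences([early, mid, late]))  # -> [[0 1]
                                                   #     [0 0]]
```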
[0072] In an example embodiment, the controller is configured to control the pulse illuminator and the camera unit to capture a non-pulsed image in series with the set of the plurality of images of the scene, so that the non-pulsed image is not illuminated by the pulse illuminator. It is not even necessary to capture the non-pulsed image first and then the pulsed images, or vice versa: the non-pulsed image can also be taken between two pulsed images. Moreover, the pulsed images need not be taken with progressively changing mutual timing between the pulses and the exposure periods. In an example embodiment, the processor is configured to use the non-pulsed image together with the set of images to refine the depth map. In an example embodiment, the processor is configured to use the non-pulsed image to produce a photograph of the scene. In an example embodiment, the processor is configured to control the camera unit to adjust focusing based on the depth map before controlling the camera unit to capture the non-pulsed image.
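A minimal sketch of one way the non-pulsed image could refine the result, assuming it is used as an ambient/reflectance reference; the patent does not fix an exact formula.

```python
import numpy as np

def pulse_response(pulsed: np.ndarray, non_pulsed: np.ndarray) -> np.ndarray:
    """Fraction of extra brightness contributed by the pulse per pixel."""
    ref = np.clip(non_pulsed.astype(np.float64), 1.0, None)
    return (pulsed.astype(np.float64) - ref) / ref

ambient = np.array([[40.0, 160.0], [80.0, 20.0]])  # non-pulsed reference
pulsed  = np.array([[90.0, 170.0], [82.0, 55.0]])  # same scene with pulse
# A dark but near object (20 -> 55) shows a strong relative response,
# a bright but far object (160 -> 170) a weak one, so reflectance and
# color differences are largely cancelled out.
print(pulse_response(pulsed, ambient))
```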
[0073] In an example embodiment, the controller is configured to cause adjusting focusing of the camera unit while producing the set of images, in anticipation of the final picture being taken for a user. It is understood that changing focus does not change the amount of exposure received on any sub-part of an image, whereas the sharpness of that sub-part may change. In an example embodiment, the comparison is performed on a block-by-block basis. The block size may correspond to 2, 4, 8, 12, 16, 20, 25, 32, 36 or any other number of pixels.
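A block-by-block comparison could be sketched as follows, assuming 8-bit grayscale frames; averaging brightness over blocks makes the comparison insensitive to the sharpness changes caused by refocusing:

```python
import numpy as np

def block_means(img, block=16):
    """Mean brightness of non-overlapping block x block tiles; only
    whole tiles are used, so a ragged border is cropped away."""
    h = img.shape[0] // block * block
    w = img.shape[1] // block * block
    tiles = img[:h, :w].astype(np.float32)
    return tiles.reshape(h // block, block, w // block, block).mean(axis=(1, 3))

# The per-block ratio then takes the place of the per-pixel ratio image:
# ratio_blocks = (block_means(img_a) + 1.0) / (block_means(img_b) + 1.0)
```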
[0074] In an example embodiment, the set of images is used by the controller or processor in determining the focusing, exposure and white balance.
[0075] In an example embodiment, the controller is configured to cause producing a first pulse so that the first pulse begins before an exposure period of a first image, wherein the first image is being illuminated by the first pulse. The controller can be configured to cause producing the first pulse so that the first pulse ends on or before the start of the exposure period of the first image. In another example embodiment, the first pulse ends after the start of the exposure period of the first image.
[0076] In an example embodiment, the controller is configured to cause producing a second pulse so that the second pulse begins after the beginning of an exposure period of a second image, wherein the second image is being illuminated by the second pulse. In an example embodiment, the controller is configured to cause producing the second pulse so that the second pulse ends on or before the end of the exposure period of the second image.
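With hypothetical timings, the complementary effect of the two pulse placements can be worked through numerically (values below are illustrative, not from the disclosure):

```python
C = 299_792_458.0                     # speed of light, m/s
expo_start, expo_end = 0e-9, 40e-9    # hypothetical 40 ns exposure

# First pulse ends 10 ns before the exposure opens: only light with a
# round trip of at least 10 ns (at least 1.5 m one way) is recorded.
pulse1_end = -10e-9
print("pulse 1 visible from", C * (expo_start - pulse1_end) / 2, "m")   # ~1.5 m

# Second pulse begins 10 ns into the exposure, which closes 30 ns later:
# only round trips of at most 30 ns (at most about 4.5 m) are recorded.
pulse2_start = 10e-9
print("pulse 2 visible up to", C * (expo_end - pulse2_start) / 2, "m")  # ~4.5 m
```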
[0077] In an example embodiment, the processor is configured to determine regions of the scene of given distance range from difference images formed from pairs of subsequently captured images taken with different timing of pulses.
[0078] In an example embodiment, the controller is configured to cause producing a second pulse so that the second pulse begins after beginning of an exposure period of a second image, wherein the second image is being illuminated by the second pulse.
[0079] In an example embodiment, the set of images consists of two images.
[0080] Figs. 8a to 8c show a flow chart illustrating a method of an example embodiment. The method comprises:
801. controlling a pulse illuminator and the camera unit to capture a set of a plurality of images of a scene with variable pulsed illumination; and
802. producing a depth map based on brightness variation in the captured images.
[0081] The method comprises, in various example embodiments, also one or more of the following steps; an illustrative code sketch combining the steps follows the list:
803. controlling the pulse illuminator and the camera unit to capture the set of images with variably timed pulses;
804. synchronizing the pulse illuminator and the camera unit to capture a set of two or more images each with illumination of one pulse of variable timing for varying illumination of different portions of the scene in the images of the set of images;
805. casting illumination with each pulse in the entire scene area;
806. controlling the pulse illuminator and the camera unit to capture a set of a plurality of images of a scene with variable pulsed illumination in which one image is captured without any pulse;
807. controlling the pulse illuminator to illuminate different images captured by the camera unit at different portions of exposure period so as to selectively illuminate objects at different distances from the camera unit;
808. receiving the set of images and propagation information indicative of propagation of light of the pulses of the pulsed illumination with relation to exposure periods of the images of the set of images; and
809. producing the depth map based on the propagation information and the brightness variation in the captured images;
810. comparing brightness of the images of the set of images and producing, based on the comparing, a difference image indicative of the brightness variation; and
811. producing the depth map based on the propagation information and the brightness variation indicated by the difference image;
812. causing forming of two or more difference images and forming the depth map based on the two or more difference images and the propagation information;
813. controlling the pulse illuminator and the camera unit to capture a non-pulsed image in series with the set of the plurality of images of the scene so that the non-pulsed image is not illuminated by the pulse illuminator;
814. using the non-pulsed image with the set of images to refine the depth map;
815. using the non-pulsed image to produce a photograph of the scene;
816. controlling the camera unit to adjust focusing based on the depth map before controlling the camera unit to capture the non-pulsed image;
817. causing producing of a first pulse so that the first pulse begins before an exposure period of a first image, wherein the first image is being illuminated by the first pulse;
818. causing producing the first pulse so that the first pulse ends on or before starting of the exposure period of the first image;
819. causing producing a second pulse so that the second pulse begins after beginning of an exposure period of a second image, wherein the second image is being illuminated by the second pulse;
820. causing producing of the second pulse so that the second pulse ends on or before ending of the exposure period of the second image;
821 . determining regions of the scene of given distance range from difference images formed from pairs of subsequently captured images taken with different timing of pulses;
822. causing producing of a second pulse so that the second pulse begins after beginning of an exposure period of a second image, wherein the second image is being illuminated by the second pulse;
823. controlling the camera unit to adjust focusing during producing of the set of images;
824. controlling the camera unit to estimate white balance during producing of the set of images;
825. controlling the camera unit to estimate exposure time during producing of the set of images.
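By way of non-limiting illustration, one possible control flow combining steps 801 to 811 is sketched below; the camera and illuminator objects, the threshold, and the crude band-centre formula are assumptions of this sketch, not part of the method as claimed:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def capture_and_map(camera, illuminator, pulse_windows):
    """One possible realization of steps 801-811: capture one frame per
    pulse timing plus one non-pulsed frame, form a relative image of each
    pulsed frame against the non-pulsed frame, and bin pixels into
    distance bands. camera and illuminator are hypothetical drivers."""
    frames, eps = [], 1.0
    for start_s, end_s in pulse_windows:              # steps 801, 803, 807
        illuminator.arm(start_s, end_s)
        frames.append(camera.expose().astype(np.float32))
    ambient = camera.expose().astype(np.float32)      # steps 806 and 813
    depth = np.full(ambient.shape, np.nan, np.float32)
    for (start_s, end_s), frame in zip(pulse_windows, frames):
        ratio = (frame + eps) / (ambient + eps)       # steps 810 and 811
        band = C * (start_s + end_s) / 4.0            # crude band centre (808-809)
        depth[np.isnan(depth) & (ratio > 1.25)] = band
    return depth                                      # steps 802 and 809
```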
[0082] Without in any way limiting the scope, interpretation, or application of the claims appearing below, a technical effect of one or more of the example embodiments disclosed herein is that normal image sensors can be used as such for producing depth maps. Another technical effect of one or more of the example embodiments disclosed herein is that little processing is required for producing the depth maps. Another technical effect of one or more of the example embodiments disclosed herein is that interpretation of the results is simple. Yet another technical effect of one or more of the example embodiments disclosed herein is that the produced depth maps are very reliable and producing them does not necessarily require changing the focus setting at all: the brightness variation can be detected even in regions that are out of focus, without the delay of focusing. Still another technical effect of one or more of the example embodiments disclosed herein is that the pulses can be spread evenly over the entire scene without a need to produce any patterns. Still another technical effect of one or more of the example embodiments disclosed herein is that the detection of distances based on the brightness comparison does not depend on the shapes of surfaces or even on the continuity of the shapes.
[0083] Embodiments of the present invention may be implemented in software, hardware, application logic or a combination of software, hardware and application logic. The software, application logic and/or hardware may reside on memory 240 or memory 340, for example. In an example embodiment, the application logic, software or an instruction set is maintained on any one of various conventional computer-readable media. In the context of this document, a "computer-readable medium" may be any non-transitory media or means that can contain, store, communicate, propagate or transport the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer, with one example of a computer described and depicted in Fig. 2. A computer-readable medium may comprise a computer-readable storage medium that may be any media or means that can contain or store the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer.
[0084] If desired, the different functions discussed herein may be performed in a different order and/or concurrently with each other. Furthermore, if desired, one or more of the before-described functions may be optional or may be combined.
[0085] Although various aspects of the invention are set out in the independent claims, other aspects of the invention comprise other combinations of features from the described embodiments and/or the dependent claims with the features of the independent claims, and not solely the combinations explicitly set out in the claims.
[0086] It is also noted herein that while the foregoing describes example embodiments of the invention, it should not be viewed in a limiting sense. Rather, there are several variations and modifications which may be made without departing from the scope of the present invention as defined in the appended claims.
Claims
1. An apparatus, comprising:
a camera unit;
a pulse illuminator;
a controller configured to control the pulse illuminator and the camera unit to capture a set of a plurality of images of a scene with variable pulsed illumination; and
a processor configured to produce a depth map based on brightness variation in the captured images.
2. The apparatus of claim 1, wherein the controller is configured to control the pulse illuminator and the camera unit to capture the set of images with variably timed pulses.
3. The apparatus of claim 1 or 2, wherein the controller is configured to synchronize the pulse illuminator and the camera unit to capture a set of two or more images each with illumination of one pulse of variable timing for varying illumination of different portions of the scene in the images of the set of images.
4. The apparatus of any of preceding claims, wherein the pulse illuminator is configured to cast illumination with each pulse in the entire scene area.
5. The apparatus of any of preceding claims, wherein the controller is configured to control the pulse illuminator and the camera unit to capture a set of a plurality of images of a scene with variable pulsed illumination in which one image is captured without any pulse.
6. The apparatus of any of preceding claims, wherein the variable timing of pulses refers to pulses differing from one another by at least one of: beginning moment of time of the pulse; and ending moment of time of the pulse.
7. The apparatus of any of preceding claims, wherein the controller is configured to control the pulse illuminator to illuminate different images captured by the camera unit at different portions of exposure period so as to selectively illuminate objects at different distances from the camera unit.
8. The apparatus of any of preceding claims, wherein the processor is configured to:
receive the set of images and propagation information indicative of propagation of light of the pulses of the pulsed illumination with relation to exposure periods of the images of the set of images; and
produce the depth map based on the propagation information and the brightness variation in the captured images.
9. The apparatus of claim 8, wherein the set comprises three or more images and the processor is configured to form two or more difference images and to form the depth map based on the two or more difference images and the propagation information.
10. The apparatus of claim 8 or 9, wherein the processor is configured to:
compare brightness of the images of the set of images and to produce, based on the comparing, a difference image indicative of the brightness variation; and
produce the depth map based on the propagation information and the brightness variation indicated by the difference image.
11. The apparatus of claim 10, wherein the difference image defines relative brightness difference between compared images.
12. The apparatus of claim 10 or 11, wherein the difference image defines the difference of brightness of a plurality of image pixels.
13. The apparatus of claim 12, wherein the difference image defines the difference of brightness of at least 25% of the image pixels of the compared images.
14. The apparatus of any of preceding claims, wherein the controller is configured to control the pulse illuminator and the camera unit to capture a non-pulsed image in series with the set of the plurality of images of the scene so that the non-pulsed image is not illuminated by the pulse illuminator.
15. The apparatus of claim 14, wherein the processor is configured to use the non- pulsed image with the set of images to refine the depth map.
16. The apparatus of claim 14 or 15, wherein the processor is configured to use the non-pulsed image to produce a photograph of the scene.
17. The apparatus of any of claims 14 to 16, wherein the processor is configured to control the camera unit to adjust focusing based on the depth map before controlling the camera unit to capture the non-pulsed image.
18. The apparatus of any of preceding claims, wherein the controller is configured to cause producing a first pulse so that the first pulse begins before an exposure period of a first image, wherein the first image is being illuminated by the first pulse.
19. The apparatus of claim 18, wherein the controller is configured to cause producing the first pulse so that the first pulse ends on or before starting of the exposure period of the first image.
20. The apparatus of any of preceding claims, wherein the controller is configured to cause producing a second pulse so that the second pulse begins after beginning of an exposure period of a second image, wherein the second image is being illuminated by the second pulse.
21. The apparatus of claim 20, wherein the controller is configured to cause producing the second pulse so that the second pulse ends on or before the ending of the exposure period of the second image.
22. The apparatus of any of preceding claims, wherein the processor is configured to determine regions of the scene of given distance range from difference images formed from pairs of subsequently captured images taken with different timing of pulses.
23. The apparatus of any of preceding claims, wherein the controller is configured to cause producing a second pulse so that the second pulse begins after beginning of an exposure period of a second image, wherein the second image is being illuminated by the second pulse.
24. The apparatus of any of preceding claims, wherein the set of images consists of two images.
25. The apparatus of any of preceding claims, wherein the camera unit has a global shutter.
26. The apparatus of any of preceding claims, wherein the controller is configured to control the camera unit to adjust focusing during producing of the set of images.
27. The apparatus of any of preceding claims, wherein the controller is configured to control the camera unit to estimate white balance during producing of the set of images.
28. The apparatus of any of preceding claims, wherein the controller is configured to control the camera unit to estimate exposure time during producing of the set of images.
29. A method, comprising:
controlling a pulse illuminator and a camera unit to capture a set of a plurality of images of a scene with variable pulsed illumination; and
producing a depth map based on brightness variation in the captured images.
30. The method of claim 29, comprising controlling the pulse illuminator and the camera unit to capture the set of images with variably timed pulses.
31. The method of claim 29 or 30, comprising synchronizing the pulse illuminator and the camera unit to capture a set of two or more images each with illumination of one pulse of variable timing for varying illumination of different portions of the scene in the images of the set of images.
32. The method of claim 31, wherein the variable timing of pulses refers to pulses differing from one another by at least one of: beginning moment of time of the pulse; and ending moment of time of the pulse.
33. The method of any of claims 29 to 32, comprising casting illumination with each pulse in an entire area of the scene.
34. The method of any of claims 29 to 33, comprising controlling the pulse illuminator and the camera unit to capture a set of a plurality of images of a scene with variable pulsed illumination in which one image is captured without any pulse.
35. The method of any of claims 29 to 34, comprising controlling the pulse illuminator to illuminate different images captured by the camera unit at different portions of exposure period so as to selectively illuminate objects at different distances from the camera unit.
36. The method of any of claims 29 to 35, comprising:
receiving the set of images and propagation information indicative of propagation of light of the pulses of the pulsed illumination with relation to exposure periods of the images of the set of images; and
producing the depth map based on the propagation information and the brightness variation in the captured images.
37. The method of claim 36, comprising:
comparing brightness of the images of the set of images and producing, based on the comparing, a difference image indicative of the brightness variation; and
producing the depth map based on the propagation information and the brightness variation indicated by the difference image.
38. The method of claim 37, wherein the difference image defines relative brightness difference between compared images.
39. The method of claim 37, wherein the difference image defines the difference of brightness of a plurality of image pixels.
40. The method of claim 39, wherein the difference image defines the difference of brightness of at least 25% of the image pixels of the compared images.
41. The method of any of claims 36 to 40, wherein the set comprises three or more images and the method comprises causing forming of two or more difference images and forming the depth map based on the two or more difference images and the propagation information.
42. The method of any of claims 29 to 41, comprising controlling the pulse illuminator and the camera unit to capture a non-pulsed image in series with the set of the plurality of images of the scene so that the non-pulsed image is not illuminated by the pulse illuminator.
43. The method of claim 42, comprising using the non-pulsed image with the set of images to refine the depth map.
44. The method of claim 42 or 43, comprising using the non-pulsed image to produce a photograph of the scene.
45. The method of any of claims 42 to 44, comprising controlling the camera unit to adjust focusing based on the depth map before controlling the camera unit to capture the non-pulsed image.
46. The method of any of claims 29 to 45, comprising causing producing of a first pulse so that the first pulse begins before an exposure period of a first image, wherein the first image is being illuminated by the first pulse.
47. The method of claim 46, comprising causing producing the first pulse so that the first pulse ends on or before starting of the exposure period of the first image.
48. The method of any of claims 29 to 47, comprising causing producing a second pulse so that the second pulse begins after beginning of an exposure period of a second image, wherein the second image is being illuminated by the second pulse.
49. The method of claim 48, comprising causing producing of the second pulse so that the second pulse ends on or before ending of the exposure period of the second image.
50. The method of any of claims 29 to 49, comprising determining regions of the scene of given distance range from difference images formed from pairs of subsequently captured images taken with different timing of pulses.
51. The method of any of claims 29 to 50, comprising causing producing of a second pulse so that the second pulse begins after beginning of an exposure period of a second image, wherein the second image is being illuminated by the second pulse.
52. The method of any of claims 29 to 51, wherein the set of images consists of two images.
53. The method of any of claims 29 to 52, comprising controlling the camera unit to adjust focusing during producing of the set of images.
54. The method of any of claims 29 to 53, comprising controlling the camera unit to estimate white balance during producing of the set of images.
55. The method of any of claims 29 to 54, comprising controlling the camera unit to estimate exposure time during producing of the set of images.
56. A computer program, comprising:
code for controlling a pulse illuminator and a camera unit to capture a set of a plurality of images of a scene with variable pulsed illumination; and code for producing a depth map based on brightness variation in the captured images;
when the computer program is run on a processor.
57. The computer program according to claim 56, wherein the computer program is a computer program product comprising a computer-readable medium bearing computer program code embodied therein for use with a computer.
58. A computer-readable medium encoded with instructions that, when executed by a computer, perform:
controlling a pulse illuminator and a camera unit to capture a set of a plurality of images of a scene with variable pulsed illumination; and
producing a depth map based on brightness variation in the captured images.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/FI2014/050227 WO2015144973A1 (en) | 2014-03-28 | 2014-03-28 | Digital imaging with pulsed illumination to generate a depth map |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015144973A1 true WO2015144973A1 (en) | 2015-10-01 |
Family
ID=54194013
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/FI2014/050227 WO2015144973A1 (en) | 2014-03-28 | 2014-03-28 | Digital imaging with pulsed illumination to generate a depth map |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2015144973A1 (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5081530A (en) * | 1987-06-26 | 1992-01-14 | Antonio Medina | Three dimensional camera and range finder |
US6091905A (en) * | 1995-06-22 | 2000-07-18 | 3Dv Systems, Ltd | Telecentric 3D camera and method |
US20100301193A1 (en) * | 2008-02-01 | 2010-12-02 | Commissariat A L' Energie Atomique Et Aux Energies Alternatives | 3d active imaging device |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14887083; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | |
122 | Ep: pct application non-entry in european phase | Ref document number: 14887083; Country of ref document: EP; Kind code of ref document: A1 |