
CN102812697A - Imager For Constructing Color And Depth Images - Google Patents

Imager For Constructing Color And Depth Images

Info

Publication number
CN102812697A
Authority
CN
China
Prior art keywords
detector
visible light
filter
double
imaging device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2010800437795A
Other languages
Chinese (zh)
Other versions
CN102812697B (en)
Inventor
S. McEldowney
E. Giaimo
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp
Publication of CN102812697A
Application granted
Publication of CN102812697B
Status: Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01J: MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00: Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02: Details
    • G01J3/0205: Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows
    • G01J3/0224: Optical elements using polarising or depolarising elements
    • G01J3/0229: Optical elements using masks, aperture plates, spatial light modulators or spatial filters, e.g. reflective filters
    • G01J3/0235: Optical elements using means for replacing an element by another, for replacing a filter or a grating
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/30: Transforming light or analogous information into electric information
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20: Image signal generators
    • H04N13/204: Image signal generators using stereoscopic image cameras
    • H04N13/254: Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10: Cameras or camera modules for generating image signals from different wavelengths
    • H04N23/11: Cameras or camera modules for generating image signals from visible and infrared light wavelengths

Landscapes

  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Electromagnetism (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Studio Devices (AREA)
  • Measurement Of Optical Distance (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

A dual-mode imager includes a light source configured to project a structured illumination from which visible light can be filtered. The dual-mode imager also includes a detector configured to capture both the structured illumination and visible light from the scene. A temporal or spatial filter is used to selectively block visible light from one or more portions of the detector while passing the structured illumination to the one or more portions of the detector.

Description

Imager for Constructing Color and Depth Images
Background
A camera can be used to capture still images of a scene. Several still images taken in rapid succession can be used to generate a movie comprising a plurality of frames, each frame corresponding to a different still image. While such images are very useful in a variety of applications, they are not well suited for some purposes. In particular, conventional still images and movies do not provide information sufficient to accurately assess the relative depths of the various surfaces captured in the scene.
Summary
A dual-mode imager for imaging a scene illuminated by visible light is disclosed. The dual-mode imager includes a light source configured to project a structured illumination from which visible light can be filtered. The dual-mode imager also includes a detector configured to capture both the structured illumination and visible light from the scene. A temporal or spatial filter is used to selectively block visible light from one or more portions of the detector while passing the structured illumination to those one or more portions of the detector.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Brief Description of the Drawings
Fig. 1 shows an example scene that may be processed to construct a color image and a depth image.
Fig. 2 schematically shows an example dual-mode imager in accordance with an embodiment of the present disclosure.
Fig. 3 schematically shows an example wheel filter in accordance with an embodiment of the present disclosure.
Fig. 4 schematically shows a time sequence in which the wheel filter of Fig. 3 cooperates with a detector to construct color and depth images.
Fig. 5 schematically shows an example spatial filter cooperating with a detector to construct color and depth images.
Fig. 6 shows an example method of constructing a color image and a depth image using a shared image sensor.
Fig. 7 schematically shows an example computing system that may use a shared image sensor to construct color and depth images.
Detailed Description
A dual-mode imager that uses a shared image sensor to process depth information and color information is disclosed. The dual-mode imager temporally and/or spatially filters the light delivered to the image sensor (i.e., the detector), so that a subset of pixels at a subset of times is exposed to light characterized by a first parameter (e.g., a wavelength band), while the same or a different subset of pixels at the same or different times is exposed to light characterized by a second parameter different from the first (e.g., a different wavelength band). In this way, light characterized by the first parameter can be used to construct a color image, and light characterized by the second parameter can be used to construct a depth image. Both images can be constructed using the same image sensor, for example by temporally alternating between reading color information and depth information using all of the pixels, or by using selected pixels to read color information while using other pixels to read depth information.
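To make the sharing scheme concrete, the following sketch (hypothetical Python/NumPy; the function and variable names are ours, not the patent's) demultiplexes raw frames from one sensor under the two schemes just described: temporally, by routing alternate exposures, and spatially, by routing alternate pixel rows.

```python
import numpy as np

def demux_temporal(frames):
    """Split an interleaved exposure sequence from a single shared sensor:
    even-indexed frames were taken while visible light was blocked
    (structured IR only); odd-indexed frames while visible light was passed."""
    depth_frames = frames[0::2]   # inputs for the depth image
    color_frames = frames[1::2]   # inputs for the color image
    return depth_frames, color_frames

def demux_spatial(frame):
    """Split one exposure row-wise: even rows sit behind the IR-pass portion
    of a spatial filter, odd rows behind the visible-pass portion."""
    ir_rows = frame[0::2, :]
    visible_rows = frame[1::2, :]
    return ir_rows, visible_rows

# Example: eight 480x640 exposures from the shared sensor.
frames = [np.zeros((480, 640), dtype=np.uint16) for _ in range(8)]
depth_in, color_in = demux_temporal(frames)
ir_rows, visible_rows = demux_spatial(frames[0])
```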
The construction of color images and depth images using a shared image sensor is described below by way of example and with reference to certain illustrated embodiments. It will be noted that the drawings included in this disclosure are schematic. Views of the illustrated embodiments are generally not drawn to scale; aspect ratios, feature sizes, and numbers of features may be purposely distorted to make selected features or relationships easier to understand.
Fig. 1 shows a simplified perspective view of an example scene 10 that may be processed to construct a color image and a depth image. The scene includes various objects and surfaces arranged at different depths, that is, at different distances from the viewpoint of an observer located in front of the scene. Surface 12 is deepest in the scene (farthest from the observer's viewpoint). Surface 14 is arranged forward of surface 12 (closer to the observer's viewpoint), and surfaces 16, 18, and 20 are arranged forward of surface 14. The surfaces considered so far are macro surfaces, having dimensions of the same order of magnitude as the dimensions of the scene. It will be noted, however, that the systems and methods disclosed herein are not limited to such surfaces; they also allow the examination of much smaller regions of a structured macro surface, for example to interrogate rough or irregular topologies. Moreover, although Fig. 1 shows a static scene, the concepts described herein may be used to image dynamic scenes as well (such as scenes that include one or more moving people or objects).
In addition to being arranged at different depths within the scene, the surfaces shown in Fig. 1 are oriented differently with respect to each other and to the observer. Surfaces 12 and 14 are oriented normal to the observer's line of sight, while surface 16 is tilted with respect to the observer's line of sight. Moreover, curved surfaces 18 and 20 present a continuous range of orientations relative to the observer's line of sight.
The surfaces shown in Fig. 1 may also present various textures. For example, surface 20 may be relatively smooth compared to underlying surface 18. Optically, the different textures of the scene may exhibit different light-reflecting properties. For example, surface 20 may be largely specularly reflective, while surface 18 may be largely scattering.
Finally, the various objects in the scene may be of different colors. Although black lines and white lines are used to schematically depict scene 10, those skilled in the art will understand that the light-absorbing and light-reflecting properties of the various surfaces may differ from one another, and that the colors of the various surfaces may therefore differ as well.
In some applications, only color information from the scene is processed, to form a color image of the scene (e.g., a digital photograph or digital movie). In other applications, only depth information from the scene is processed, to form a depth image. As described herein, both color information and depth information may be processed so that a color image and a depth image can be formed. Rather than using two separate cameras (one camera to generate the color image and another camera to generate the depth image), the present disclosure relates to a single dual-mode imager that generates both kinds of images.
Fig. 2 shows a cross-sectional plan view of scene 10. Also shown, in one example embodiment, is a dual-mode imager 22. The dual-mode imager is an optical system for imaging the scene; it includes a controller 23, a light source 24, a detector 26, and a filter 28.
Controller 23 may be any control device configured to control light source 24, detector 26, and/or filter 28, for example to trigger, coordinate, and/or synchronize the operation of these components. The controller may include a logic subsystem and/or a data-holding subsystem as described below. In some embodiments, the controller may include a depth analyzer. In other embodiments, the depth analyzer may be in operative communication with the controller but may itself be a separate system.
The controller may coordinate the timing of the filter and the detector so that the images captured by the detector while the filter is blocking visible light from the detector are sequenced to construct a depth image. The controller may likewise coordinate the timing of the filter and the detector so that the images captured by the detector while the filter is passing visible light to the detector are sequenced to construct a color image. The depth analyzer may then construct the depth image based on one or more images of the structured illumination captured by the detector (i.e., one or more images captured by the detector while the filter is blocking visible light from the detector). This is described in more detail below.
Light source 24 may be any suitable light source configured to project a filterable illumination onto the various surfaces of scene 10. In particular, light source 24 is configured to project light having one or more characteristics different from the corresponding characteristics of visible light, thereby allowing visible light to be filtered relative to the projected light (e.g., filtering may occur via wavelength and/or polarization state).
In the embodiment shown in Fig. 2, the light source includes a laser 30 and a disperser 32. The laser may provide an intense, collimated, coherent, and substantially monochromatic beam of light in a known polarization state.
As used herein, the term 'polarization state' includes any non-trivial indication of the direction or sense of rotation of light as it propagates; the indication may be precise or approximate, complete or incomplete. One example of a complete polarization state is the full Stokes-vector representation comprising the components $S_0$, $S_1$, $S_2$, and $S_3$, which are defined as

$$S_0 = |E_x|^2 + |E_y|^2$$
$$S_1 = |E_x|^2 - |E_y|^2$$
$$S_2 = |E_a|^2 - |E_b|^2$$
$$S_3 = |E_l|^2 - |E_r|^2,$$

where $E_x$ and $E_y$ are the complex amplitude components of the electric field in the standard Cartesian basis $(\hat{x}, \hat{y})$, $(\hat{a}, \hat{b})$ is the Cartesian basis rotated 45°, and $(\hat{l}, \hat{r})$ is the circular basis defined so that $\hat{l} = (\hat{x} + i\hat{y})/\sqrt{2}$. Examples of incomplete polarization states are the degree of polarization $p$, defined by

$$p = \frac{\sqrt{S_1^2 + S_2^2 + S_3^2}}{S_0},$$

and the direction of linear polarization $\psi$, defined by

$$2\psi = \arctan(S_2/S_1).$$
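As a numerical illustration (a minimal Python sketch, not part of the disclosure), the incomplete polarization states above follow directly from measured Stokes components:

```python
import numpy as np

def polarization_state(s0, s1, s2, s3):
    """Degree of polarization p and linear polarization direction psi
    (in radians) from the Stokes components S0..S3 defined above."""
    p = np.sqrt(s1**2 + s2**2 + s3**2) / s0
    psi = 0.5 * np.arctan2(s2, s1)  # from 2*psi = arctan(S2/S1)
    return p, psi

# Horizontally polarized light, S = (1, 1, 0, 0): p = 1.0, psi = 0.0.
p, psi = polarization_state(1.0, 1.0, 0.0, 0.0)
```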
Continuing with Fig. 2, in some embodiments laser 30 may be a continuous-wave (CW) laser; in other embodiments, the laser may be pulsed, mode-locked, Q-switched, etc. The power of the laser included in light source 24 may be selected based on the scene to be imaged, with a more powerful laser used for more distant and expansive scenes and a less powerful laser used for closer, more compact scenes. In addition to its power, the lasing wavelength of the laser may be selected based on the scene to be imaged. In particular, the lasing wavelength may be selected to overlap minimally with visible light. In one embodiment, the lasing wavelength may be a near-infrared wavelength.
Disperser 32 may be any device configured to disperse the collimated beam from laser 30 among a range of projection angles and thereby illuminate the scene with a plurality of light features separated from one another. In the embodiment shown in Fig. 2, the light features form a patterned or otherwise structured illumination 33 from laser 30. Fig. 2 shows the laser beam dispersed over a range of deflection angles confined to a horizontal plane. In the illustrated embodiment, the deflection angles assume discrete values separated by a constant increment, e.g., -20°, -15°, ..., +20°. In other embodiments, the discrete values may be separated by random increments. In still other embodiments, the laser beam may be dispersed horizontally over a continuous range of deflection angles. It will be understood that the numerical ranges given here are merely examples, and that other ranges fall fully within the scope of this disclosure.
Disperser 32 may further disperse the laser beam over a range of deflection angles confined to a vertical plane. Like the horizontal dispersion described above, the vertical dispersion may be discrete or continuous. If the horizontal and vertical dispersions are both discrete, the scene will be illuminated by a constellation of dots. If the vertical dispersion is discrete and the horizontal dispersion is continuous, the scene will be illuminated by a series of horizontal stripes. And if the horizontal dispersion is discrete and the vertical dispersion is continuous, the scene will be illuminated by a series of vertical stripes, as referenced further below. These or other structured light patterns may be used without departing from the scope of this disclosure.
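The geometry of such a pattern is easy to model. The sketch below (hypothetical Python; the 5° increment matches the example above) computes where vertical stripes produced by discrete horizontal deflections land on a flat surface at a given distance; the depth-dependent spacing is what a depth analyzer later exploits:

```python
import numpy as np

# Discrete horizontal deflection angles: -20, -15, ..., +20 degrees.
ANGLES_DEG = np.arange(-20, 21, 5)

def stripe_positions(distance_m):
    """Horizontal offsets (in meters) at which the vertical stripes of the
    structured illumination intersect a flat surface at the given distance."""
    return distance_m * np.tan(np.radians(ANGLES_DEG))

# A surface twice as far away spreads the stripes twice as wide.
near = stripe_positions(1.0)
far = stripe_positions(2.0)   # elementwise 2x near
```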
To disperse the laser beam, disperser 32 may include various optical components: lenses, diffractive optics, diffusers, mirrors, waveguides, masks, and the like. In some embodiments, the disperser may further comprise various active components, e.g., electromechanical actuators, choppers, piezoelectric devices, and liquid-crystal light valves.
Continuing with Fig. 2, detector 26 may be any device configured to capture an image of the scene by detecting light from the scene. Moreover, as shown in Fig. 2, the detector may be oriented so that the captured image includes at least a portion of the scene illuminated by light source 24 and/or by visible light. In this way, portions of the illumination reflected from the various surfaces of the scene can be detected by the detector. Detector 26 is configured to capture both the structured illumination and visible light from the scene.
Detector 26 may include virtually any combination of optical components for collecting and/or focusing light onto an image sensor 40.
Image sensor 40 may be any sensor configured to detect the relative intensity of visible light and the relative intensity of structured illumination 33. In embodiments in which light source 24 includes a near-infrared-emitting laser, for example, the image sensor may comprise a complementary metal-oxide-semiconductor (CMOS) sensor configured to detect light with wavelengths of approximately 380 to 1000 nanometers. Moreover, the image sensor may be configured to represent the captured image as an array of pixels. Each pixel of the captured image may thus encode, for one or more color channels, an intensity of the light reflected from a different region of the scene. Those skilled in the art will appreciate that a variety of different image sensors capable of detecting visible light, the light of illumination 33, and/or modulated illumination may be used without departing from the scope of this disclosure. Moreover, it will be appreciated that image sensors may be incorporated into a variety of devices having different optical configurations.
Filter 28 may be a temporal filter or a spatial filter. One non-limiting example of a temporal filter is a wheel filter. An example wheel filter 50 is schematically shown in Fig. 3. Wheel filter 50 includes a first portion 52 configured to block visible light while passing near-infrared light. The first portion is schematically identified with vertical lines. The wheel filter also includes a second portion 54 configured to pass visible light and, optionally, to block near-infrared light. The second portion is schematically identified with horizontal lines.
Fig. 4 schematically shows wheel filter 50 at four different times, t0, t1, t2, and t3, corresponding to four successive exposures of a detector 26a (i.e., four successive times at which the detector captures an image). Although the drawing is schematic in nature, it shows light traveling from left to right: light from the scene travels through the bottom of the filter to the detector. As shown in this example, the first portion of wheel filter 50 is optically intermediate the scene and detector 26a during every other image capture (e.g., times t0 and t2). Likewise, the second portion of wheel filter 50 is optically intermediate the scene and detector 26a during the alternate image captures (e.g., times t1 and t3). As discussed above, the controller may set the period of the temporal filter to twice the period of the detector. In other words, detector 26a captures two images during each rotation of the wheel filter: a visible-light image and a near-infrared image. That is, on each rotation, wheel filter 50 is configured to block visible light for approximately half of the rotation and to pass visible light for approximately half of the rotation.
As schematically shown in Fig. 4, the infrared light passed to the detector may be used to generate a depth image (i.e., at times t0 and t2). As used herein, a depth image is any image in which a position depth value (e.g., a z-coordinate) is recorded for each pixel. Likewise, the visible light passed to the detector may be used to generate a color image (i.e., at times t1 and t3). As used herein, a color image is any image in which one or more intensity values are recorded for each pixel (e.g., a single intensity value for a black-and-white or grayscale image, or two or more intensity values corresponding to the different color or luminance channels of a multi-color image).
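A minimal sketch of the timing relationship just described (hypothetical Python; the names are ours): with the wheel's rotation period set to twice the exposure period, even exposures fall behind the IR-pass half and feed the depth sequence, while odd exposures fall behind the visible-pass half and feed the color sequence:

```python
def sequence_exposures(num_exposures, exposure_period_s=1.0 / 60.0):
    """Label each exposure as a depth-image or color-image input. Because
    the wheel filter rotates once per two exposures, even exposures
    (t0, t2, ...) see only the structured IR and odd exposures
    (t1, t3, ...) see visible light."""
    return [(n * exposure_period_s, "depth" if n % 2 == 0 else "color")
            for n in range(num_exposures)]

# The four exposures of Fig. 4: depth, color, depth, color.
for t, stream in sequence_exposures(4):
    print(f"t = {t:.4f} s -> {stream}")
```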
The configuration described above is one non-limiting example of a temporal filter. Other wheel filters, with different numbers of visible-light-blocking and visible-light-passing portions, may be used. For example, a wheel filter may include two quarters configured to block visible light alternating with two quarters configured to pass visible light. The visible-light-blocking and visible-light-passing portions may be sized and shaped in any suitable manner. Moreover, it will be appreciated that temporal filters other than wheel filters may be used. In general, any filter configured to alternate in time between blocking visible light from the detector and passing visible light to the detector may be used.
While the wheel filter described above is provided as a non-limiting example to illustrate the concept of filtering the light for the color image from the light for the depth image, it will be appreciated that other arrangements are also within the scope of this disclosure. For example, other active devices for modulating polarization (e.g., a photoelastic modulator) may be used in combination with a wavelength-sensitive polarization rotator to vary the signal at the image sensor over time.
Fig. 5 schematically shows an example spatial filter 60 (shown in part). Spatial filter 60 is configured to block visible light from some portions of a detector 62 (shown in part) while allowing visible light to pass to other portions of the detector. For example, the spatial filter may be configured to block visible light from every other pixel group of a plurality of spatially alternating pixel groups of the detector (e.g., spatially alternating rows of pixels, spatially alternating columns of pixels, a checkerboard pattern of pixels, etc.). In the illustrated embodiment, spatial filter 60 is cooperatively arranged and aligned with detector 62 so that the even rows of pixels are exposed to near-infrared light and the odd rows of pixels are exposed to visible light.
As schematically shown in Fig. 5, the infrared light passed to the detector may be used to generate a depth image (i.e., using the even rows of pixels). Likewise, the visible light passed to the detector may be used to generate a color image (i.e., using the odd rows of pixels).
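Since each image here uses only half of the sensor rows, full-resolution images are typically recovered by interpolating across the missing rows. The patent does not prescribe any interpolation scheme; the sketch below (hypothetical Python/NumPy) simply duplicates the nearest captured row as the simplest stand-in:

```python
import numpy as np

def split_and_upsample(frame):
    """Separate the IR rows (even) from the visible rows (odd), then fill
    each image's missing rows by duplicating the nearest captured row."""
    ir = frame[0::2, :].repeat(2, axis=0)[: frame.shape[0]]
    visible = frame[1::2, :].repeat(2, axis=0)[: frame.shape[0]]
    return ir, visible

raw = np.zeros((480, 640), dtype=np.uint16)     # one exposure of detector 62
ir_full, visible_full = split_and_upsample(raw)  # both 480x640
```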
Another approach to separating visible light from infrared light is to use a color rotator. In this case, light from the scene passes through a linear polarizer and then through a filter that rotates the polarization state of some wavelengths while keeping other wavelengths in the same polarization state. The emerging light has its infrared component in one polarization state and its visible component in a different state. Such light can be spatially separated by using a patterned polarizer in which the polarization direction varies spatially.
Fig. 6 shows an example method 70 of constructing a color image and a depth image using a shared image sensor. At 72, method 70 includes projecting a structured illumination onto a scene. The structured illumination may be characterized by a near-infrared wavelength or by any other characteristic that allows visible light to be filtered while leaving the structured illumination substantially unfiltered. The structured illumination may be projected from any suitable light source, such as one including a near-infrared laser.
At 74, method 70 includes alternating in time between blocking visible light from the scene so that it does not reach the detector and passing visible light from the scene so that it reaches the detector. In some embodiments, this may be accomplished using a temporal filter such as a wheel filter.
At 76, method 70 includes a logic branch. If visible light is blocked, the method moves to 78. If visible light is not blocked, the method moves to 80. At 78, method 70 includes capturing the structured illumination with the detector. The detector may be any suitable detector capable of capturing both the structured illumination and visible light.
At 82, method 70 includes generating a depth image based on the structured illumination captured with the detector. The depth image may be constructed, at least in part, by a depth analyzer configured to assess depth values for pixel positions based on the relative positions of the spatially separated light features that constitute the structured illumination. To facilitate such image processing, the image sensor may include a pixel-array structure in which both monochromatic light and RGB light can be detected.
At 80, method 70 includes a logic branch. If visible light is passed, the method moves to 84. If visible light is not cleanly passed (e.g., light from the scene is passing through both the visible-light-filtering portion and the near-infrared-filtering portion of the filter), the method loops back to 76.
At 84, method 70 includes capturing visible light with the detector. The same detector used at 78 to capture the structured illumination is used at 84 to capture the visible light.
At 86, method 70 includes generating a color image based on the visible light captured with the detector. The color image may be a multi-color image, a black-and-white image, or a grayscale image.
While the filtering mechanisms above are described in the context of filtering between infrared structured illumination used to form a depth image and visible light used to form a color image, it will be appreciated that the filtering mechanisms described herein may be used to filter out other types of depth-imaging illumination having one or more distinguishing characteristics (e.g., wavelength, polarization, etc.) from other types of light, or vice versa. In general, as described herein, one or more types of light having a first characteristic may be filtered out from different types of light having a different characteristic. Non-limiting examples of light types that may be filtered relative to one another include visible light, infrared light, near-infrared light, ultraviolet light, structured light, and/or modulated light.
As described below with reference to Fig. 7, a variety of computing systems may be used without departing from the spirit of this disclosure. The operating environment described with reference to Fig. 2 is provided as an example, but is not meant to be limiting in any way. To the contrary, the illustrated operating environment is intended to demonstrate a general concept that may be applied to a variety of different operating environments without departing from the scope of this disclosure.
The methods and processes described herein may be tied to a variety of different types of computing systems. Fig. 7 schematically shows a computing system 90 that may perform one or more of the color image construction and depth image construction methods described herein. Computing system 90 may take a variety of forms, including but not limited to gaming consoles, personal computing systems, military tracking and/or targeting systems, and character-acquisition systems offering green-screen or motion-capture functionality.
Computing system 90 may include a logic subsystem 92, a data-holding subsystem 94 operatively connected to the logic subsystem, a display subsystem 96, and/or a dual-mode imager 98. The computing system may optionally include components not shown in Fig. 7, and/or some components shown in Fig. 7 may be peripheral components that are not integrated into the computing system.
Logic subsystem 92 may include one or more physical devices configured to execute one or more instructions. For example, the logic subsystem may be configured to execute one or more instructions that are part of one or more programs, routines, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more devices, or otherwise arrive at a desired result. The logic subsystem may include one or more processors configured to execute software instructions. Additionally or alternatively, the logic subsystem may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. The logic subsystem may optionally include individual components that are distributed across two or more devices, which in some embodiments may be remotely located.
Data-holding subsystem 94 may include one or more physical devices configured to hold data and/or instructions executable by the logic subsystem to implement the methods and processes described herein. When such methods and processes are implemented, the state of data-holding subsystem 94 may be transformed (e.g., to hold different data). Data-holding subsystem 94 may include removable media and/or built-in devices. Data-holding subsystem 94 may include optical memory devices, semiconductor memory devices (e.g., RAM, EEPROM, flash, etc.), and/or magnetic memory devices, among others. Data-holding subsystem 94 may include devices with one or more of the following characteristics: volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and content-addressable. In some embodiments, logic subsystem 92 and data-holding subsystem 94 may be integrated into one or more common devices, such as an application-specific integrated circuit or a system on a chip.
Fig. 7 also shows an aspect of the data-holding subsystem in the form of computer-readable removable media 100, which may be used to store and/or transfer data and/or instructions executable to implement the methods and processes described herein.
Display subsystem 96 may be used to present a visual representation of data held by data-holding subsystem 94. As the methods and processes described herein change the data held by the data-holding subsystem, and thus transform the state of the data-holding subsystem, the state of display subsystem 96 may likewise be transformed to visually represent the changes in the underlying data (e.g., the display subsystem may display a visual representation of a constructed color image, a constructed depth image, and/or a virtual model based on a constructed depth image). Display subsystem 96 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic subsystem 92 and/or data-holding subsystem 94 in a shared enclosure, or they may be peripheral display devices.
Computing system 90 further includes a dual-mode imager 98 configured to obtain depth images and color images of one or more targets and/or scenes. A depth image may be constructed using structured-light analysis. In such an analysis, patterned light (i.e., light displayed as a known pattern, such as a grid or stripe pattern) is projected onto the scene. On the surfaces of the scene the pattern may become deformed, and this deformation of the pattern can be studied to determine the physical distance from the dual-mode imager to a particular location in the scene. The dual-mode imager may include a temporal or spatial filter used to selectively block visible light, thus facilitating the capture of both depth images and color images with the same dual-mode imager.
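As an illustration of the triangulation underlying such structured-light analysis (a hypothetical Python sketch assuming a simple pinhole model with a horizontal baseline between projector and imager; the baseline and focal-length values are illustrative, not from the patent), the lateral shift of a projected feature maps directly to depth:

```python
def depth_from_disparity(disparity_px, baseline_m=0.075, focal_px=580.0):
    """Structured-light triangulation: a projected feature observed
    disparity_px pixels away from its calibrated reference position lies
    at depth z = f * b / d (pinhole model, horizontal baseline)."""
    if disparity_px <= 0:
        raise ValueError("feature at or beyond the calibration reference")
    return focal_px * baseline_m / disparity_px

# A feature shifted by 29 px lies at roughly 1.5 m.
z_m = depth_from_disparity(29.0)
```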
It will be appreciated that at least some depth-analysis operations may be performed by the logic machine of one or more dual-mode imagers. A dual-mode imager may include one or more onboard processing units configured to perform one or more depth-analysis functions. A dual-mode imager may include firmware to facilitate updating such onboard processing logic.
It is to be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, the various acts illustrated may be performed in the sequence illustrated, in other sequences, in parallel, or in some cases omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems, and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.

Claims (15)

1. A dual-mode imager for imaging a scene illuminated by visible light, the dual-mode imager comprising:
a light source configured to project onto the scene a structured illumination from which visible light can be filtered;
a detector configured to capture both the structured illumination and visible light from the scene; and
a filter to selectively block visible light from one or more portions of the detector while passing the structured illumination to the one or more portions of the detector.
2. The dual-mode imager of claim 1, wherein the filter comprises a temporal filter configured to alternate in time between blocking visible light from the detector and passing visible light to the detector.
3. The dual-mode imager of claim 2, wherein the filter comprises a wheel filter including one or more portions configured to block visible light and one or more portions configured to pass visible light.
4. The dual-mode imager of claim 3, wherein the wheel filter is configured to block visible light for approximately half of a rotation and to pass visible light for approximately half of the rotation.
5. The dual-mode imager of claim 2, further comprising a controller to coordinate the temporal filter and the detector such that images captured by the detector while the temporal filter is blocking visible light from the detector are sequenced to construct a depth image, and images captured by the detector while the temporal filter is passing visible light to the detector are sequenced to construct a color image.
6. The dual-mode imager of claim 5, wherein the temporal filter comprises a polarizer and a photoelastic modulator to alternate in time between blocking visible light from the detector and passing visible light to the detector without moving parts.
7. The dual-mode imager of claim 1, wherein the filter comprises a spatial filter configured to block visible light from some portions of the detector while allowing visible light to pass to other portions of the detector.
8. The dual-mode imager of claim 7, wherein the detector comprises a plurality of spatially alternating pixel groups, and the spatial filter is configured to block visible light from every other pixel group of the plurality of spatially alternating pixel groups.
9. The dual-mode imager of claim 8, wherein the plurality of spatially alternating pixel groups are spatially alternating rows of pixels.
10. The dual-mode imager of claim 7, wherein the spatial filter comprises a polarizer, a wavelength-sensitive color rotator, and a patterned polarizer having spatially varying polarization directions.
11. The dual-mode imager of claim 1, wherein the detector comprises a complementary metal-oxide-semiconductor (CMOS) sensor.
12. The dual-mode imager of claim 1, wherein the light source comprises a laser having a near-infrared lasing wavelength.
13. The dual-mode imager of claim 12, wherein the light source comprises a disperser configured to disperse a collimated beam from the laser among a range of projection angles, so as to illuminate the scene with a plurality of spatially separated light features having the near-infrared wavelength.
14. The dual-mode imager of claim 1, further comprising a depth analyzer configured to construct a depth image based on one or more images of the structured illumination captured by the detector.
15. A method of constructing a color image and a depth image, the method comprising:
projecting a structured illumination having a near-infrared wavelength onto a scene;
alternating in time between blocking visible light from the scene so that it does not reach a detector and passing visible light from the scene to the detector;
capturing the structured illumination with the detector while blocking the visible light from the scene;
capturing visible light with the detector while passing the visible light from the scene;
generating the depth image based on the structured illumination captured with the detector; and
generating the color image based on the visible light captured with the detector.
CN201080043779.5A 2009-10-01 2010-09-01 Imager for constructing color and depth images Active CN102812697B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US12/572,082 US8723118B2 (en) 2009-10-01 2009-10-01 Imager for constructing color and depth images
US12/572,082 2009-10-01
PCT/US2010/047564 WO2011041066A2 (en) 2009-10-01 2010-09-01 Imager for constructing color and depth images

Publications (2)

Publication Number Publication Date
CN102812697A (en) 2012-12-05
CN102812697B CN102812697B (en) 2014-10-15

Family

ID=43822470

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201080043779.5A Active CN102812697B (en) 2009-10-01 2010-09-01 Imager for constructing color and depth images

Country Status (6)

Country Link
US (2) US8723118B2 (en)
EP (1) EP2484107A4 (en)
JP (1) JP2013506868A (en)
KR (1) KR101719388B1 (en)
CN (1) CN102812697B (en)
WO (1) WO2011041066A2 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104519338A * 2013-09-30 2015-04-15 Samsung Electronics Co., Ltd. Method and apparatus generating color and depth images
CN106382920A * 2016-11-25 2017-02-08 Shenzhen Xiluo Robot Co., Ltd. Multifunctional visual sensor, mobile robot and control method of mobile robot
CN111855621A * 2015-02-24 2020-10-30 The University of Tokyo Dynamic high-speed high-sensitivity imaging device and imaging method
CN112399028A * 2016-03-01 2021-02-23 Magic Leap, Inc. Depth sensing system and method
CN112954153A * 2021-01-26 2021-06-11 Vivo Mobile Communication Co., Ltd. Camera device, electronic equipment, depth of field detection method and depth of field detection device

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6168383B2 (en) * 2012-12-27 2017-07-26 三星電子株式会社Samsung Electronics Co.,Ltd. Defect inspection apparatus and defect inspection method
US9407837B2 (en) * 2013-02-28 2016-08-02 Google Inc. Depth sensor using modulated light projector and image sensor with color and IR sensing
US9398287B2 (en) 2013-02-28 2016-07-19 Google Technology Holdings LLC Context-based depth sensor control
WO2016029283A1 (en) * 2014-08-27 2016-03-03 Muniz Samuel System for creating images, videos and sounds in an omnidimensional virtual environment from real scenes using a set of cameras and depth sensors, and playback of images, videos and sounds in three-dimensional virtual environments using a head-mounted display and a movement sensor
US9553423B2 (en) 2015-02-27 2017-01-24 Princeton Optronics Inc. Miniature structured light illuminator
US9953428B2 (en) 2015-03-03 2018-04-24 Microsoft Technology Licensing, Llc Digital camera unit with simultaneous structured and unstructured illumination
US9936151B2 (en) * 2015-10-16 2018-04-03 Capsovision Inc Single image sensor for capturing mixed structured-light images and regular images
US10547830B2 (en) * 2015-11-16 2020-01-28 Samsung Electronics Co., Ltd Apparatus for and method of illumination control for acquiring image information and depth information simultaneously
US20170366773A1 (en) * 2016-06-21 2017-12-21 Siemens Aktiengesellschaft Projection in endoscopic medical imaging
US10466036B2 (en) 2016-10-07 2019-11-05 Arizona Board Of Regents On Behalf Of The University Of Arizona Attachable depth and orientation tracker device and method of depth and orientation tracking using focal plane polarization and color camera
CA3055079A1 (en) * 2017-04-07 2018-10-11 Toi Labs, Inc. Biomonitoring devices, methods, and systems for use in a bathroom setting
CN112740666A (en) 2018-07-19 2021-04-30 艾科缇弗外科公司 System and method for multi-modal depth sensing in an automated surgical robotic vision system
WO2020210168A1 (en) 2019-04-08 2020-10-15 Activ Surgical, Inc. Systems and methods for medical imaging
US11213194B2 (en) * 2019-06-20 2022-01-04 Cilag Gmbh International Optical fiber waveguide in an endoscopic system for hyperspectral, fluorescence, and laser mapping imaging
US11102400B2 (en) * 2019-06-20 2021-08-24 Cilag Gmbh International Pulsed illumination in a fluorescence imaging system
EP4017340A4 (en) 2019-08-21 2023-12-13 Activ Surgical, Inc. Systems and methods for medical imaging
DE102020201119A1 (en) 2020-01-30 2021-08-05 Robert Bosch Gesellschaft mit beschränkter Haftung Detector element for a time-of-flight sensor
US11245877B2 (en) 2020-06-11 2022-02-08 Viavi Solutions Inc. Scrolling spectral filter
CN112070065B (en) * 2020-10-01 2024-06-04 奥比中光科技集团股份有限公司 Method, device and face recognition system for detecting infrared image and depth image

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050285966A1 (en) * 2004-01-28 2005-12-29 Canesta, Inc. Single chip red, green, blue, distance (RGB-Z) sensor
US7274393B2 (en) * 2003-02-28 2007-09-25 Intel Corporation Four-color mosaic pattern for depth and image capture
CN101179742A (en) * 2006-11-10 2008-05-14 三洋电机株式会社 Imaging apparatus and image signal processing device
US20090114799A1 (en) * 2007-11-07 2009-05-07 Fujifilm Corporation Image capturing system, image capturing method, and recording medium

Family Cites Families (189)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3848129A (en) * 1973-08-24 1974-11-12 Sanders Associates Inc Spectral discriminating radiation detection apparatus
US4288078A (en) * 1979-11-20 1981-09-08 Lugo Julio I Game apparatus
US4349277A (en) * 1980-06-11 1982-09-14 General Electric Company Non-contact measurement of surface profile
US4695953A (en) 1983-08-25 1987-09-22 Blair Preston E TV animation interactively controlled by the viewer
US4630910A (en) 1984-02-16 1986-12-23 Robotic Vision Systems, Inc. Method of measuring in three-dimensions at high speed
JPH0646977B2 (en) * 1984-06-09 1994-06-22 Olympus Optical Co., Ltd. Measuring endoscope
US4627620A (en) 1984-12-26 1986-12-09 Yang John P Electronic athlete trainer for improving skills in reflex, speed and accuracy
US4645458A (en) 1985-04-15 1987-02-24 Harald Phillip Athletic evaluation and training apparatus
US4679068A (en) * 1985-07-25 1987-07-07 General Electric Company Composite visible/thermal-infrared imaging system
US4702475A (en) 1985-08-16 1987-10-27 Innovating Training Products, Inc. Sports technique and reaction training system
US4843568A (en) 1986-04-11 1989-06-27 Krueger Myron W Real time perception of and response to the actions of an unencumbered participant/user
US4711543A (en) 1986-04-14 1987-12-08 Blair Preston E TV animation interactively controlled by the viewer
US4796997A (en) 1986-05-27 1989-01-10 Synthetic Vision Systems, Inc. Method and system for high-speed, 3-D imaging of an object at a vision station
US5184295A (en) 1986-05-30 1993-02-02 Mann Ralph V System and method for teaching physical skills
US4751642A (en) 1986-08-29 1988-06-14 Silva John M Interactive sports simulation system with physiological sensing and psychological conditioning
US4809065A (en) 1986-12-01 1989-02-28 Kabushiki Kaisha Toshiba Interactive system and related method for displaying data to produce a three-dimensional image of an object
US4817950A (en) 1987-05-08 1989-04-04 Goo Paul E Video game control unit and attitude sensor
US5239463A (en) 1988-08-04 1993-08-24 Blair Preston E Method and apparatus for player interaction with animated characters and objects
US5239464A (en) 1988-08-04 1993-08-24 Blair Preston E Interactive video system providing repeated switching of multiple tracks of actions sequences
US4901362A (en) 1988-08-08 1990-02-13 Raytheon Company Method of recognizing patterns
US4893183A (en) 1988-08-11 1990-01-09 Carnegie-Mellon University Robotic vision system
JPH02199526A (en) 1988-10-14 1990-08-07 David G Capper Control interface apparatus
US4925189A (en) 1989-01-13 1990-05-15 Braeunig Thomas F Body-mounted video game exercise device
US5229756A (en) 1989-02-07 1993-07-20 Yamaha Corporation Image control apparatus
US5469740A (en) 1989-07-14 1995-11-28 Impulse Technology, Inc. Interactive video testing and training system
JPH03103822U (en) 1990-02-13 1991-10-29
US5101444A (en) 1990-05-18 1992-03-31 Panacea, Inc. Method and apparatus for high speed object location
US5148154A (en) 1990-12-04 1992-09-15 Sony Corporation Of America Multi-dimensional user interface
US5534917A (en) 1991-05-09 1996-07-09 Very Vivid, Inc. Video image based control system
US5417210A (en) 1992-05-27 1995-05-23 International Business Machines Corporation System and method for augmentation of endoscopic surgery
US5295491A (en) 1991-09-26 1994-03-22 Sam Technology, Inc. Non-invasive human neurocognitive performance capability testing method and system
US6054991A (en) 1991-12-02 2000-04-25 Texas Instruments Incorporated Method of modeling player position and movement in a virtual reality system
EP0590101B1 (en) 1991-12-03 1999-06-23 French Sportech Corporation Interactive video testing and training system
US5875108A (en) 1991-12-23 1999-02-23 Hoffberg; Steven M. Ergonomic man-machine interface incorporating adaptive pattern recognition based control system
JPH07325934A (en) 1992-07-10 1995-12-12 The Walt Disney Co Method and equipment for provision of graphics enhanced to virtual world
US5999908A (en) 1992-08-06 1999-12-07 Abelow; Daniel H. Customer-based product design module
US5320538A (en) 1992-09-23 1994-06-14 Hughes Training, Inc. Interactive aircraft training system and method
IT1257294B (en) 1992-11-20 1996-01-12 DEVICE SUITABLE TO DETECT THE CONFIGURATION OF A PHYSIOLOGICAL-DISTAL UNIT, TO BE USED IN PARTICULAR AS AN ADVANCED INTERFACE FOR MACHINES AND CALCULATORS.
US5495576A (en) 1993-01-11 1996-02-27 Ritchey; Kurtis J. Panoramic image based virtual reality/telepresence audio-visual system and method
US5690582A (en) 1993-02-02 1997-11-25 Tectrix Fitness Equipment, Inc. Interactive exercise apparatus
JP2799126B2 (en) 1993-03-26 1998-09-17 Namco Ltd Video game device and game input device
US5405152A (en) 1993-06-08 1995-04-11 The Walt Disney Company Method and apparatus for an interactive video game with physical feedback
US5454043A (en) 1993-07-30 1995-09-26 Mitsubishi Electric Research Laboratories, Inc. Dynamic and static hand gesture recognition through low-level image analysis
US5423554A (en) 1993-09-24 1995-06-13 Metamedia Ventures, Inc. Virtual reality game method and apparatus
US5980256A (en) 1993-10-29 1999-11-09 Carmein; David E. E. Virtual reality system with enhanced sensory apparatus
JP3419050B2 (en) 1993-11-19 2003-06-23 Hitachi, Ltd. Input device
US5347306A (en) 1993-12-17 1994-09-13 Mitsubishi Electric Research Laboratories, Inc. Animated electronic meeting place
JP2552427B2 (en) 1993-12-28 1996-11-13 Konami Co., Ltd. TV play system
US5577981A (en) 1994-01-19 1996-11-26 Jarvik; Robert Virtual reality exercise machine and computer controlled video system
US5580249A (en) 1994-02-14 1996-12-03 Sarcos Group Apparatus for simulating mobility of a human
US5597309A (en) 1994-03-28 1997-01-28 Riess; Thomas Method and apparatus for treatment of gait problems associated with parkinson's disease
US5385519A (en) 1994-04-19 1995-01-31 Hsu; Chi-Hsueh Running machine
US5524637A (en) 1994-06-29 1996-06-11 Erickson; Jon W. Interactive system for measuring physiological exertion
US5563988A (en) 1994-08-01 1996-10-08 Massachusetts Institute Of Technology Method and system for facilitating wireless, full-body, real-time user interaction with a digitally represented visual environment
US6714665B1 (en) 1994-09-02 2004-03-30 Sarnoff Corporation Fully automated iris recognition system utilizing wide and narrow fields of view
US5516105A (en) 1994-10-06 1996-05-14 Exergame, Inc. Acceleration activated joystick
US5638300A (en) 1994-12-05 1997-06-10 Johnson; Lee E. Golf swing analysis system
JPH08161292A (en) 1994-12-09 1996-06-21 Matsushita Electric Ind Co Ltd Method and system for detecting congestion degree
US5594469A (en) 1995-02-21 1997-01-14 Mitsubishi Electric Information Technology Center America Inc. Hand gesture machine control system
US5682229A (en) 1995-04-14 1997-10-28 Schwartz Electro-Optics, Inc. Laser range camera
US5913727A (en) 1995-06-02 1999-06-22 Ahdoot; Ned Interactive movement and contact simulation game
JP3481631B2 (en) 1995-06-07 2003-12-22 The Trustees of Columbia University in the City of New York Apparatus and method for determining a three-dimensional shape of an object using relative blur in an image due to active illumination and defocus
US5682196A (en) 1995-06-22 1997-10-28 Actv, Inc. Three-dimensional (3D) video presentation system providing interactive 3D presentation with personalized audio responses for multiple viewers
US5702323A (en) 1995-07-26 1997-12-30 Poulton; Craig K. Electronic exercise enhancer
US6098458A (en) 1995-11-06 2000-08-08 Impulse Technology, Ltd. Testing and training system for assessing movement and agility skills without a confining field
US6430997B1 (en) 1995-11-06 2002-08-13 Trazer Technologies, Inc. System and method for tracking and assessing movement skills in multidimensional space
US6073489A (en) 1995-11-06 2000-06-13 French; Barry J. Testing and training system for assessing the ability of a player to complete a task
US6308565B1 (en) 1995-11-06 2001-10-30 Impulse Technology Ltd. System and method for tracking and assessing movement skills in multidimensional space
US6176782B1 (en) 1997-12-22 2001-01-23 Philips Electronics North America Corp. Motion-based command generation technology
US5933125A (en) 1995-11-27 1999-08-03 Cae Electronics, Ltd. Method and apparatus for reducing instability in the display of a virtual environment
US5641288A (en) 1996-01-11 1997-06-24 Zaenglein, Jr.; William G. Shooting simulating process and training device using a virtual reality display screen
WO1997041925A1 (en) 1996-05-08 1997-11-13 Real Vision Corporation Real time simulation using position sensing
US6173066B1 (en) 1996-05-21 2001-01-09 Cybernet Systems Corporation Pose determination and tracking by matching 3D objects to a 2D sensor
JPH1051668A (en) * 1996-08-06 1998-02-20 Iseki & Co Ltd Image pickup device for agricultural robot
US5989157A (en) 1996-08-06 1999-11-23 Walton; Charles A. Exercising system with electronic inertial game playing
CN1168057C (en) 1996-08-14 2004-09-22 Nurakhmed Nurislamovich Latypov Method for tracking and displaying the position and orientation of a user in space, method for presenting a virtual environment to a user and system for implementing these methods
JP3064928B2 (en) 1996-09-20 2000-07-12 NEC Corporation Subject extraction method
EP0849697B1 (en) 1996-12-20 2003-02-12 Hitachi Europe Limited A hand gesture recognition system and method
US6081612A (en) * 1997-02-28 2000-06-27 Electro Optical Sciences Inc. Systems and methods for the multispectral imaging and characterization of skin tissue
US6009210A (en) 1997-03-05 1999-12-28 Digital Equipment Corporation Hands-free interface to a virtual reality environment using head tracking
US6100896A (en) 1997-03-24 2000-08-08 Mitsubishi Electric Information Technology Center America, Inc. System for designing graphical multi-participant environments
US5877803A (en) 1997-04-07 1999-03-02 Tritech Microelectronics International, Ltd. 3-D image detector
US6215898B1 (en) 1997-04-15 2001-04-10 Interval Research Corporation Data processing system and method
JP3077745B2 (en) 1997-07-31 2000-08-14 NEC Corporation Data processing method and apparatus, information storage medium
US6188777B1 (en) 1997-08-01 2001-02-13 Interval Research Corporation Method and apparatus for personnel detection and tracking
US6289112B1 (en) 1997-08-22 2001-09-11 International Business Machines Corporation System and method for determining block direction in fingerprint images
US6720949B1 (en) 1997-08-22 2004-04-13 Timothy R. Pryor Man machine interfaces and applications
JPH1173491A (en) * 1997-08-29 1999-03-16 Namco Ltd Foreground image generating device
AUPO894497A0 (en) 1997-09-02 1997-09-25 Xenotech Research Pty Ltd Image processing method and apparatus
EP0905644A3 (en) 1997-09-26 2004-02-25 Matsushita Electric Industrial Co., Ltd. Hand gesture recognizing device
US6141463A (en) 1997-10-10 2000-10-31 Electric Planet Interactive Method and system for estimating jointed-figure configurations
US6101289A (en) 1997-10-15 2000-08-08 Electric Planet, Inc. Method and apparatus for unencumbered capture of an object
WO1999019828A1 (en) 1997-10-15 1999-04-22 Electric Planet, Inc. Method and apparatus for performing a clean background subtraction
WO1999019840A1 (en) 1997-10-15 1999-04-22 Electric Planet, Inc. A system and method for generating an animatable character
US6072494A (en) 1997-10-15 2000-06-06 Electric Planet, Inc. Method and apparatus for real-time gesture recognition
US6130677A (en) 1997-10-15 2000-10-10 Electric Planet, Inc. Interactive computer vision system
US7155363B1 (en) * 1997-12-01 2006-12-26 MKS Instruments, Inc. Thermal imaging for semiconductor process monitoring
US6181343B1 (en) 1997-12-23 2001-01-30 Philips Electronics North America Corp. System and method for permitting three-dimensional navigation through a virtual reality environment using camera-based gesture inputs
US6159100A (en) 1998-04-23 2000-12-12 Smith; Michael D. Virtual reality game
US6077201A (en) 1998-06-12 2000-06-20 Cheng; Chau-Yang Exercise bicycle
US6801637B2 (en) 1999-08-10 2004-10-05 Cybernet Systems Corporation Optical body tracker
US6681031B2 (en) 1998-08-10 2004-01-20 Cybernet Systems Corporation Gesture-controlled interfaces for self-service machines and other applications
US6950534B2 (en) 1998-08-10 2005-09-27 Cybernet Systems Corporation Gesture-controlled interfaces for self-service machines and other applications
US20010008561A1 (en) 1999-08-10 2001-07-19 Paul George V. Real-time object tracking system
US7036094B1 (en) 1998-08-10 2006-04-25 Cybernet Systems Corporation Behavior recognition system
US7121946B2 (en) 1998-08-10 2006-10-17 Cybernet Systems Corporation Real-time head tracking system for computer games and other applications
IL126284A (en) 1998-09-17 2002-12-01 Netmor Ltd System and method for three dimensional positioning and tracking
EP0991011B1 (en) 1998-09-28 2007-07-25 Matsushita Electric Industrial Co., Ltd. Method and device for segmenting hand gestures
US7323634B2 (en) * 1998-10-14 2008-01-29 Patterning Technologies Limited Method of forming an electronic device
WO2000034919A1 (en) 1998-12-04 2000-06-15 Interval Research Corporation Background estimation and segmentation based on range and color
US6147678A (en) 1998-12-09 2000-11-14 Lucent Technologies Inc. Video hand image-three-dimensional computer interface with multiple degrees of freedom
EP2026035A2 (en) 1998-12-16 2009-02-18 3DV Systems Ltd. 3D camera for distance measurements
US6570555B1 (en) 1998-12-30 2003-05-27 Fuji Xerox Co., Ltd. Method and apparatus for embodied conversational characters with multimodal input/output in an interface device
US6363160B1 (en) 1999-01-22 2002-03-26 Intel Corporation Interface using pattern recognition and tracking
US7003134B1 (en) 1999-03-08 2006-02-21 Vulcan Patents LLC Three dimensional object pose estimation which employs dense depth information
US6299308B1 (en) 1999-04-02 2001-10-09 Cybernet Systems Corporation Low-cost non-imaging eye tracker system for computer control
US6503195B1 (en) 1999-05-24 2003-01-07 University Of North Carolina At Chapel Hill Methods and systems for real-time structured light depth extraction and endoscope using real-time structured light depth extraction
US6476834B1 (en) 1999-05-28 2002-11-05 International Business Machines Corporation Dynamic creation of selectable items on surfaces
US6959869B2 (en) * 1999-06-07 2005-11-01 Metrologic Instruments, Inc. Automatic vehicle identification (AVI) system employing planar laser illumination and imaging (PLIIM) based subsystems
US6873723B1 (en) 1999-06-30 2005-03-29 Intel Corporation Segmenting three-dimensional video images using stereo
US6738066B1 (en) 1999-07-30 2004-05-18 Electric Planet, Inc. System, method and article of manufacture for detecting collisions between video images generated by a camera and an object depicted on a display
US7113918B1 (en) 1999-08-01 2006-09-26 Electric Planet, Inc. Method for video enabled electronic commerce
US7050606B2 (en) 1999-08-10 2006-05-23 Cybernet Systems Corporation Tracking and gesture recognition system particularly suited to vehicular control applications
US6663491B2 (en) 2000-02-18 2003-12-16 Namco Ltd. Game apparatus, storage medium and computer program that adjust tempo of sound
US6633294B1 (en) 2000-03-09 2003-10-14 Seth Rosenthal Method and apparatus for using captured high density motion for animation
EP1152261A1 (en) 2000-04-28 2001-11-07 CSEM Centre Suisse d'Electronique et de Microtechnique SA Device and method for spatially resolved photodetection and demodulation of modulated electromagnetic waves
US6889075B2 (en) * 2000-05-03 2005-05-03 Rocky Mountain Biosystems, Inc. Optical imaging of subsurface anatomical structures and biomolecules
US6640202B1 (en) 2000-05-25 2003-10-28 International Business Machines Corporation Elastic sensor mesh system for 3-dimensional measurement, mapping and kinematics applications
US6731799B1 (en) 2000-06-01 2004-05-04 University Of Washington Object segmentation with background extraction and moving boundary techniques
US6788809B1 (en) 2000-06-30 2004-09-07 Intel Corporation System and method for gesture recognition in three dimensions using stereo imaging and color vision
AU2001277964A1 (en) 2000-07-21 2002-02-05 The Trustees Of Columbia University In The City Of New York Method and apparatus for image mosaicing
US7227526B2 (en) 2000-07-24 2007-06-05 Gesturetek, Inc. Video-based image control system
JP2002084451A (en) * 2000-09-11 2002-03-22 Minolta Co Ltd Digital image pickup device, image processing system, recording medium, and digital image pickup method
US7058204B2 (en) 2000-10-03 2006-06-06 Gesturetek, Inc. Multiple camera control system
US7039676B1 (en) 2000-10-31 2006-05-02 International Business Machines Corporation Using video image analysis to automatically transmit gestures over a network in a chat or instant messaging session
US6539931B2 (en) 2001-04-16 2003-04-01 Koninklijke Philips Electronics N.V. Ball throwing assistant
US8035612B2 (en) 2002-05-28 2011-10-11 Intellectual Ventures Holding 67 LLC Self-contained interactive video display system
US7259747B2 (en) 2001-06-05 2007-08-21 Reactrix Systems, Inc. Interactive video display system
JP3420221B2 (en) 2001-06-29 2003-06-23 Konami Computer Entertainment Tokyo, Inc. Game device and program
US6937742B2 (en) 2001-09-28 2005-08-30 Bellsouth Intellectual Property Corporation Gesture activated home appliance
US6825928B2 (en) * 2001-12-19 2004-11-30 Wisconsin Alumni Research Foundation Depth-resolved fluorescence instrument
JP3972654B2 (en) * 2001-12-28 2007-09-05 Matsushita Electric Works, Ltd. Solid-state image sensor camera and door phone with camera
JP2005526971A (en) 2002-04-19 2005-09-08 IEE International Electronics & Engineering S.A. Vehicle safety device
US7170492B2 (en) 2002-05-28 2007-01-30 Reactrix Systems, Inc. Interactive video display system
US7710391B2 (en) 2002-05-28 2010-05-04 Matthew Bell Processing an image utilizing a spatially varying pattern
US7348963B2 (en) 2002-05-28 2008-03-25 Reactrix Systems, Inc. Interactive video display system
US7489812B2 (en) 2002-06-07 2009-02-10 Dynamic Digital Depth Research Pty Ltd. Conversion and encoding techniques
WO2003105289A2 (en) 2002-06-07 2003-12-18 University Of North Carolina At Chapel Hill Methods and systems for laser based real-time structured light depth extraction
US7257437B2 (en) * 2002-07-05 2007-08-14 The Regents Of The University Of California Autofluorescence detection and imaging of bladder cancer realized through a cystoscope
US7576727B2 (en) 2002-12-13 2009-08-18 Matthew Bell Interactive directed light/sound system
US7154157B2 (en) 2002-12-30 2006-12-26 Intel Corporation Stacked semiconductor radiation sensors having color component and infrared sensing capability
JP4235729B2 (en) 2003-02-03 2009-03-11 National University Corporation Shizuoka University Distance image sensor
DE602004006190T8 (en) 2003-03-31 2008-04-10 Honda Motor Co., Ltd. Device, method and program for gesture recognition
US8072470B2 (en) 2003-05-29 2011-12-06 Sony Computer Entertainment Inc. System and method for providing a real-time three-dimensional interactive environment
WO2004107266A1 (en) 2003-05-29 2004-12-09 Honda Motor Co., Ltd. Visual tracking using depth data
JP4546956B2 (en) 2003-06-12 2010-09-22 Honda Motor Co., Ltd. Target orientation estimation using depth detection
WO2005034747A1 (en) * 2003-09-15 2005-04-21 Beth Israel Deaconess Medical Center Medical imaging systems
EP1830585B1 (en) * 2003-10-21 2008-12-17 Barco N.V. Method and device for performing stereoscopic image display based on color selective filters
WO2005041579A2 (en) 2003-10-24 2005-05-06 Reactrix Systems, Inc. Method and system for processing captured image information in an interactive video display system
US8134637B2 (en) 2004-01-28 2012-03-13 Microsoft Corporation Method and system to increase X-Y resolution in a depth (Z) camera using red, blue, green (RGB) sensing
US7289211B1 (en) * 2004-04-09 2007-10-30 Walsh Jr Joseph T System and method for imaging sub-surface polarization-sensitive material structures
JP4708422B2 (en) 2004-04-15 2011-06-22 Gesturetek, Inc. Tracking of two-hand movement
US7308112B2 (en) 2004-05-14 2007-12-11 Honda Motor Co., Ltd. Sign based human-machine interaction
US7704135B2 (en) 2004-08-23 2010-04-27 Harrison Jr Shelton E Integrated game system, method, and device
US20080039715A1 (en) * 2004-11-04 2008-02-14 Wilson David F Three-dimensional optical guidance for catheter placement
US8548570B2 (en) * 2004-11-29 2013-10-01 Hypermed Imaging, Inc. Hyperspectral imaging of angiogenesis
KR20060070280A (en) 2004-12-20 2006-06-23 Electronics and Telecommunications Research Institute Apparatus and its method of user interface using hand gesture recognition
CN101198964A (en) 2005-01-07 2008-06-11 Gesturetek, Inc. Creating 3D images of objects by illuminating with infrared patterns
US7853041B2 (en) 2005-01-07 2010-12-14 Gesturetek, Inc. Detecting and tracking objects in images
BRPI0606477A2 (en) 2005-01-07 2009-06-30 Gesturetek, Inc. Optical flow based tilt sensor
US7598942B2 (en) 2005-02-08 2009-10-06 Oblong Industries, Inc. System and method for gesture based control system
US7317836B2 (en) 2005-03-17 2008-01-08 Honda Motor Co., Ltd. Pose estimation based on critical point analysis
US7560679B1 (en) 2005-05-10 2009-07-14 Siimpel, Inc. 3D camera
WO2006124935A2 (en) 2005-05-17 2006-11-23 Gesturetek, Inc. Orientation-sensitive signal output
US7541588B2 (en) * 2005-07-12 2009-06-02 Northrop Grumman Corporation Infrared laser illuminated imaging systems and methods
DE602005010696D1 (en) 2005-08-12 2008-12-11 Mesa Imaging AG Highly sensitive, fast pixel for use in an image sensor
US20080026838A1 (en) 2005-08-22 2008-01-31 Dunstan James E Multi-player non-role-playing virtual world games: method for two-way interaction between participants and multi-player virtual world games
US7450736B2 (en) 2005-10-28 2008-11-11 Honda Motor Co., Ltd. Monocular tracking of 3D human motion with a coordinated mixture of factor analyzers
US7701439B2 (en) 2006-07-13 2010-04-20 Northrop Grumman Corporation Gesture recognition simulation system and method
JP5395323B2 (en) 2006-09-29 2014-01-22 Brain Vision Co., Ltd. Solid-state image sensor
US7412077B2 (en) 2006-12-29 2008-08-12 Motorola, Inc. Apparatus and methods for head pose estimation and head gesture detection
US7729530B2 (en) 2007-03-03 2010-06-01 Sergey Antonov Method and apparatus for 3-D data input to a personal computer with a multimedia oriented operating system
US8239007B2 (en) * 2007-04-13 2012-08-07 Ethicon Endo-Surgery, Inc. Biocompatible nanoparticle compositions and methods
US20080309913A1 (en) * 2007-06-14 2008-12-18 James John Fallon Systems and methods for laser radar imaging for the blind and visually impaired
EP2017591A1 (en) * 2007-07-18 2009-01-21 Fujifilm Corporation Imaging apparatus
US7852262B2 (en) 2007-08-16 2010-12-14 Cybernet Systems Corporation Wireless mobile indoor/outdoor tracking system
US8421015B1 (en) * 2007-09-13 2013-04-16 Oceanit Laboratories, Inc. Position sensing detector focal plane array (PSD-FPA) event detection and classification system
CN201254344Y (en) 2008-08-20 2009-06-10 Grassland Research Institute, Chinese Academy of Agricultural Sciences Plant specimens and seed storage
US9107567B2 (en) * 2012-12-27 2015-08-18 Christie Digital Systems Usa, Inc. Spectral imaging with a color wheel

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7274393B2 (en) * 2003-02-28 2007-09-25 Intel Corporation Four-color mosaic pattern for depth and image capture
US20050285966A1 (en) * 2004-01-28 2005-12-29 Canesta, Inc. Single chip red, green, blue, distance (RGB-Z) sensor
CN101179742A (en) * 2006-11-10 2008-05-14 Sanyo Electric Co., Ltd. Imaging apparatus and image signal processing device
US20090114799A1 (en) * 2007-11-07 2009-05-07 Fujifilm Corporation Image capturing system, image capturing method, and recording medium

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104519338A (en) * 2013-09-30 2015-04-15 Samsung Electronics Co., Ltd. Method and apparatus for generating color and depth images
CN104519338B (en) * 2013-09-30 2018-03-27 Samsung Electronics Co., Ltd. Method and apparatus for generating color and depth images
CN111855621A (en) * 2015-02-24 2020-10-30 The University of Tokyo Dynamic high-speed high-sensitivity imaging device and imaging method
CN111855621B (en) * 2015-02-24 2023-11-10 The University of Tokyo Dynamic high-speed high-sensitivity imaging device and imaging method
CN112399028A (en) * 2016-03-01 2021-02-23 Magic Leap, Inc. Depth sensing system and method
CN106382920A (en) * 2016-11-25 2017-02-08 Shenzhen Xiluo Robot Co., Ltd. Multifunctional visual sensor, mobile robot and control method of mobile robot
CN112954153A (en) * 2021-01-26 2021-06-11 Vivo Mobile Communication Co., Ltd. Camera device, electronic equipment, depth of field detection method and depth of field detection device

Also Published As

Publication number Publication date
KR20120080591A (en) 2012-07-17
WO2011041066A2 (en) 2011-04-07
WO2011041066A3 (en) 2011-06-23
US8723118B2 (en) 2014-05-13
EP2484107A2 (en) 2012-08-08
EP2484107A4 (en) 2014-10-29
CN102812697B (en) 2014-10-15
US20140291520A1 (en) 2014-10-02
JP2013506868A (en) 2013-02-28
US20110079714A1 (en) 2011-04-07
KR101719388B1 (en) 2017-03-23

Similar Documents

Publication Publication Date Title
CN102812697B (en) Imager for constructing color and depth images
JP6260006B2 (en) Imaging device, imaging system using the same, electronic mirror system, and ranging device
Cao et al. A prism-mask system for multispectral video acquisition
US10992924B2 (en) Stereo-polarimetric compressed ultrafast photography (SP-CUP) systems and methods
US10425598B2 (en) Methods and systems for time-encoded multiplexed imaging
EP2518996B1 (en) Image capture device and method
US8532427B2 (en) System and method for image enhancement
RU2535640C2 (en) Forming multispectral images
JP4538766B2 (en) Imaging device, display device, and image processing device
EP3250881B1 (en) System and method for structured light pattern generation
US8264536B2 (en) Depth-sensitive imaging via polarization-state mapping
US20170075050A1 (en) Imaging device
JP2013506868A5 (en)
US10679370B2 (en) Energy optimized imaging system with 360 degree field-of-view
KR20120039440A (en) Method and apparatus for generating three-dimensional image information
CN104541193A (en) Light microscope and method of controlling the same
KR20170044524A (en) Apparatus and method for acquiring image
CN103004218B (en) Three-dimensional image pickup device, imaging apparatus, transmittance section and image processing apparatus
CN103052914A (en) Three-dimensional image pickup apparatus
CN102289329B (en) Method for enhancing resolution and image detecting system
CN102474649B (en) Three-dimensional imaging device and optical transmission plate
Jeon et al. Multisampling compressive video spectroscopy
CN107144348A (en) Polarization differential multispectral imaging device and method for real-time detection
Gupta Development of spectropolarimetric imagers for imaging of desert soils
JP2004085965A (en) Stereoscopic image pickup device and stereoscopic display device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
ASS Succession or assignment of patent right

Owner name: MICROSOFT TECHNOLOGY LICENSING LLC

Free format text: FORMER OWNER: MICROSOFT CORP.

Effective date: 20150505

C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20150505

Address after: Washington State

Patentee after: Microsoft Technology Licensing, LLC

Address before: Washington State

Patentee before: Microsoft Corp.