
US7162153B2 - Apparatus for generating a combined image - Google Patents

Apparatus for generating a combined image

Info

Publication number
US7162153B2
Authority
US
United States
Prior art keywords
reflector
camera
reflective
perspective views
perspective
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US10/603,540
Other versions
US20040263612A1 (en)
Inventor
Joseph E. Harter, Jr.
Gregory K. Scharenbroch
Ronald M. Taylor
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aptiv Technologies AG
Original Assignee
Delphi Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Delphi Technologies Inc
Priority to US10/603,540
Assigned to DELPHI TECHNOLOGIES, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: TAYLOR, RONALD M.; SCHARENBROCH, GREGORY K.; HARTER JR., JOSEPH E.
Priority to EP04076770A (EP1492357A3)
Publication of US20040263612A1
Application granted
Publication of US7162153B2
Assigned to APTIV TECHNOLOGIES LIMITED. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DELPHI TECHNOLOGIES INC.
Assigned to Aptiv Technologies AG. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: APTIV MANUFACTURING MANAGEMENT SERVICES S.À R.L.
Assigned to APTIV MANUFACTURING MANAGEMENT SERVICES S.À R.L. MERGER. Assignors: APTIV TECHNOLOGIES (2) S.À R.L.
Assigned to APTIV TECHNOLOGIES (2) S.À R.L. ENTITY CONVERSION. Assignors: APTIV TECHNOLOGIES LIMITED
Legal status: Active
Adjusted expiration

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10 Beam splitting or combining systems
    • G02B27/14 Beam splitting or combining systems operating by reflection only
    • G02B27/143 Beam splitting or combining systems operating by reflection only using macroscopically faceted or segmented reflective surfaces
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10 Beam splitting or combining systems
    • G02B27/1066 Beam splitting or combining systems for enhancing image performance, like resolution, pixel numbers, dual magnifications or dynamic range, by tiling, slicing or overlapping fields of view
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/207 Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N13/211 Image signal generators using stereoscopic image cameras using a single 2D image sensor using temporal multiplexing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/207 Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N13/218 Image signal generators using stereoscopic image cameras using a single 2D image sensor using spatial multiplexing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2213/00 Details of stereoscopic systems
    • H04N2213/001 Constructional or mechanical details

Definitions

  • A combined image system 10D may incorporate the combination of the embodiments illustrated and described in regard to FIGS. 3–4 and FIGS. 5–6.
  • As illustrated in FIG. 7, the system 10D includes the on-center-axis reflector 14C movable between a first and second position and off-center-axis reflectors 16A, 18A separately movable between a first and second position and a third and fourth position, respectively.
  • The operation of the reflector 14C is substantially similar to the operation of the reflector 14C described in regard to FIGS. 3–4, and the operation of the reflectors 16A, 18A is similar to the operation of the reflectors 16A, 18A described in regard to FIGS. 5–6.
  • The processing unit 50 coordinates the speed and positioning of each reflector 14C, 16A, 18A so as to render an image at the correct time point (i.e. when the reflectors 14C, 16A, 18A are each in the correct position). If the perspective views reflected by the reflectors 16A, 18A are different perspective views of an identical target area, the camera 12 receives a combined image which is a stereo image, and the processing unit 50 may analyze the stereo image using techniques similar to those discussed above in regard to FIG. 1. Conversely, if the perspective views reflected by the reflectors 16A, 18A are perspective views of different target areas, the camera 12 receives a combined image which is not a stereo image, and the processing unit 50 may analyze the combined image using techniques similar to those discussed above in regard to FIGS. 5–6.
  • A substrate 26B of a combined image system 10E may be movable between a first position and a second position as illustrated in FIGS. 8 a–b. Moving the substrate 26B alters the target view of the system 10E.
  • When in the first position, as shown in FIG. 8 a, the system 10E has a target view of a first target area 140. When in the second position, as shown in FIG. 8 b, the system 10E has a target view of a second target area 142. Therefore, a single system 10E can target a plurality of areas of interest.
  • For example, a single system 10E can monitor a driver occupying a first target area of a motor vehicle and a passenger occupying a second target area of the motor vehicle by alternating between the two target areas (a scheduling sketch follows this list).
  • The operation of the system 10E may be similar to any of the embodiments illustrated in FIGS. 1–7, and a combined image rendered by the system 10E may be analyzed using the techniques discussed above in regard to FIGS. 1, 3–4, 5–6, or 7.
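The alternating-target idea mentioned in the list above (for example, driver seat then passenger seat) can be sketched as a simple scheduling loop. The target labels, dwell time, and capture callback below are assumptions for illustration; the patent only states that the processing unit coordinates the reflector or substrate positions with image capture.

```python
import itertools
import time

TARGETS = ["driver", "passenger"]          # hypothetical labels for the two target areas

def monitor(capture_fn, dwell_s: float = 0.1, cycles: int = 3):
    """Alternate the system between two target areas, capturing one image per dwell."""
    for target in itertools.islice(itertools.cycle(TARGETS), cycles * len(TARGETS)):
        # In system 10E this step would move substrate 26B (or reflectors 14C/16A/18A).
        print(f"aiming at {target} target area")
        frame = capture_fn(target)
        time.sleep(dwell_s)                 # illustrative settling/exposure interval
        yield target, frame

for target, frame in monitor(lambda t: f"frame of {t}"):
    pass
```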

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Endoscopes (AREA)

Abstract

An apparatus for generating a combined image includes a first reflector, a second reflector, a third reflector, and a camera. In some embodiments, one or more of the reflectors may be movable between a first position and a second position. Additionally, the apparatus may change position to alter the target view of the apparatus.

Description

TECHNICAL FIELD
The present disclosure generally relates to the generation of combined images, and more specifically to the generation of combined images using a single camera.
BACKGROUND OF THE INVENTION
A combined image is a collection of images depicting different perspective views of the same or different target areas. One form of a combined image is a collection of images depicting perspective views of different target areas. Another form of a combined image is a collection of images depicting perspective views of the same target area produced at different points in time. A further form of a combined image is a stereo image. A stereo image is a collection of images depicting different perspective views of the same target area. Typically, a stereo image is formed from a pair of images which show a single target from two different perspectives. By analyzing relational elements of the two images, three dimensional information relating to the target or target area, among other useful data, can be extracted.
Combined imaging systems are used to produce combined images. A stereo imaging system is a form of a combined imaging system and is used to produce stereo images. A typical stereo imaging system includes a pair of imagers, for example cameras, which cooperate to render two images each depicting a different perspective view of the same target. The two images are analyzed by a processing unit to extract predetermined data, for example, three dimensional information. Among their many uses, stereo imaging systems serve numerous automotive applications including, for example, precrash warning, driver monitoring, and occupant positioning. In automotive applications, the size and expense of the stereo imaging system are important considerations.
SUMMARY OF THE INVENTION
In accordance with one illustrative embodiment, an apparatus for generating a combined image of an object includes a first reflector, a second reflector in reflective communication with the first reflector and with a first portion of the object corresponding to a first perspective view of the object, a third reflector in reflective communication with the first reflector and a second portion of the object corresponding to a second perspective view of the object, and a camera. The camera receives the first and second perspective views from the first reflector and forms the combined image from the first and second perspective views.
In accordance with another illustrative embodiment, an apparatus for generating a combined image of an object includes a first reflector movable between a first position and a second position, a second reflector in reflective communication with the first reflector when the first reflector is in the first position and with a first portion of the object corresponding to a first perspective view of the object, a third reflector in reflective communication with the first reflector when the first reflector is in the second position and a second portion of the object corresponding to a second perspective view of the object, a camera receiving the first and second perspective views from the first reflector, and a processing unit coupled to the camera. The processing unit receives the first and second perspective views from the camera and forms the combined image from the first and second perspective views.
In accordance with a further illustrative embodiment, an apparatus for generating a combined image of an object includes means for reflecting a first perspective view of the object, means for reflecting a second perspective view of the object, and a camera. The camera receives the first and second perspective views and forms the combined image from the first and second perspective views.
These and other features of the present invention will become more apparent from the following description of the illustrative embodiments.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a diagrammatic perspective view of one embodiment of a combined imaging system;
FIG. 2 is a diagrammatic perspective view of an alternative embodiment of a combined imaging system;
FIG. 3 is a top plan view of another alternative combined imaging system;
FIG. 4 is a cross-sectional view of the combined imaging system of FIG. 3 viewed along section lines 4-4;
FIG. 5 a is a top plan view of yet another embodiment of a combined imaging system illustrating rotatable off-frontal-axis reflectors rotated to a first position;
FIG. 5 b is a top plan view of the combined imaging system of FIG. 5 a illustrating the rotatable off-frontal-axis reflectors rotated to a second position;
FIG. 6 is a cross-sectional view of the combined imaging system of FIGS. 5 a and 5 b viewed along section lines 6-6;
FIG. 7 is a top plan view of still another embodiment of a combined imaging system;
FIG. 8 a is a top plan view of the combined imaging system of FIG. 1 positioned to have a perspective view of a target area; and
FIG. 8 b is a top plan view of the combined image system of FIG. 1 positioned to have a perspective view of a different target area than the target area of FIG. 8 a.
DESCRIPTION OF THE ILLUSTRATIVE EMBODIMENTS
One embodiment of a combined imaging system 10 includes a camera 12, an on-frontal-axis reflector 14, and a pair of off-frontal-axis reflectors 16, 18, as illustrated in FIG. 1. Each of the reflectors 14, 16, and 18 has a substantially reflective side 20, 22, 24, respectively, capable of reflecting an image. Illustratively, the reflectors 14, 16, and 18 may be formed of a glass substrate having a silver nitrate composite film applied to one side of the glass substrate so as to form a reflective opposite side. However, other reflectors having at least one substantially reflective surface or side may be used. For example, reflectors having a highly polished metal surface are contemplated.
In the embodiment illustrated in FIG. 1, the on-frontal-axis reflector 14 is formed from a rectangular panel having a first end 21, a second end 23, and a substantially reflective side 20 defined by the ends 21, 23. The ends 21, 23 are displaced toward each other to form a vertex 25 substantially within the center of the reflector 14. In the embodiment illustrated in FIG. 1, the vertex 25 is defined by an acute angle so as to form a substantially “V” shaped top profile of the reflector 14. In other embodiments the vertex 25 may be defined by a right or obtuse angle. Additionally, the vertex 25 may be rounded so as to form a substantially “U” shaped top profile of the reflector 14. Regardless, a first portion 27 of the reflective side 20 is defined by the vertex 25 and the first end 21. A second portion 29 of the reflective side 20 is defined by the vertex 25 and the second end 23.
The camera 12 and the reflector 14 are secured to a suitable substrate 26 using appropriate fasteners, such as screws, bolts, nuts, or clamps. In some embodiments, the substrate 26 may be part of a housing or enclosure. The substrate 26 is rigid and includes a flat area suitable for securing the camera 12 and reflectors 14, 16, 18. The substrate 26 may be formed from any suitable material; for example, a plastic or metallic material may be used. A frontal axis 31 is defined to extend perpendicularly away from the center of the lens 28 of the camera 12 in a forward direction. The frontal axis 31 defines the center of the field of view of the lens 28 under normal operating conditions. The reflector 14 is positioned on the substrate 26 in front of the lens 28 of the camera 12 so that the vertex 25 lies substantially on the frontal axis 31. In this configuration, the first and second portions 27, 29 of the reflective side 20 of the reflector 14 are both in communication with the lens 28 of the camera 12.
The reflectors 16, 18 are also secured to the substrate 26 using suitable fasteners. The reflector 16 is positioned on the substrate 26 in front of the camera 12 but off the frontal axis 31. The reflector 16 is oriented so that the reflective side 22 of the reflector 16 is in reflective communication with a target area 30 and the first portion 27 of the reflective side 20 of the reflector 14. In this orientation, the reflector 16 is not in reflective communication with the lens 28 of the camera 12. Similarly, the reflector 18 is positioned on the substrate 26 in front of the camera 12 but off the frontal axis 31. The reflector 18 is positioned on an opposite side of the frontal axis 31 relative to the reflector 16. The reflector 18 is oriented so that the reflective side 24 of the reflector 18 is in reflective communication with the target area 30 and the second portion 29 of the reflective side 20 of the reflector 14. In this orientation, the reflector 18 is not in reflective communication with the lens 28 of the camera 12. In the embodiment illustrated in FIG. 1, the reflectors 16, 18 are isometrically positioned from the reflector 14. However, in other embodiments, the reflectors 16, 18 may not be isometrically positioned from the reflector 14.
The combined imaging system 10 also includes a processing unit 50. The processing unit 50 is electrically coupled to the camera 12 by a plurality of electrical interconnects 48. The electrical interconnects 48 may include such interconnects as wires, cables, Red-Green-Blue (RGB) cables, Bayonet Neill-Concelman (BNC) connector cables, and other electrical interconnects useful in operably coupling the camera 12 to the processing unit 50.
The field of view of the combined imaging system 10 is defined by the portion of the target area 30 visible to the lens 28 of the camera 12. The field of view of the system 10 may be modified by altering the distance between the camera 12 and the reflector 14 and by altering the distance between the reflectors 14, 16. In the embodiment illustrated in FIG. 1, the reflector 14 is positioned about two inches away from the camera 12, as illustrated by an arrow 44, and the reflectors 14, 16 are displaced about six inches away from each other, as illustrated by an arrow 42. This configuration provides the system 10 with about a 28° field of view as illustrated in FIG. 1 by a number of arcs 46.
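As a rough illustration of how these distances interact with the camera optics, the Python sketch below estimates an angular field of view from an assumed sensor width and focal length (neither is specified in the patent) and treats the spacing between the off-frontal-axis reflectors as an approximate stereo baseline. The numbers are hypothetical and chosen only to land near the 28° figure quoted above.

```python
import math

def field_of_view_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal angular field of view of a simple pinhole/thin-lens model."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# Assumed values for illustration only; the patent does not specify the optics.
SENSOR_WIDTH_MM = 4.8     # hypothetical 1/3-inch imager
FOCAL_LENGTH_MM = 9.6     # hypothetical lens

fov = field_of_view_deg(SENSOR_WIDTH_MM, FOCAL_LENGTH_MM)
print(f"approximate field of view: {fov:.1f} degrees")   # about 28 degrees with these numbers

# The roughly six inch reflector spacing (arrow 42) separates the two virtual
# viewpoints and therefore sets the effective stereo baseline of the rig.
BASELINE_IN = 6.0
print(f"assumed effective baseline: {BASELINE_IN} inches")
```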
In the illustrative embodiment shown in FIG. 1, the camera 12 forms a combined image by receiving two perspective views of the target area 30 which are reflected to the lens 28 of the camera 12. The reflector 16 reflects a first perspective view of the target area 30, as illustrated in FIG. 1 by optical train lines 32 and 34, to the first portion 27 of the reflector 14. Similarly, the reflector 18 reflects a second perspective view of the target area 30, illustrated in FIG. 1 by optical train lines 36 and 40, to the second portion 29 of the reflector 14. The first and second portions 27, 29 of reflector 14 subsequently reflect the first and second perspective views to the lens 28 of the camera 12. The reflector 14 contemporaneously reflects both perspective views to the lens 28. The lens 28 of camera 12 receives a single image which is a combination of the first perspective view reflected by the first portion 27 of the reflector 14 and the second perspective view reflected by the second portion 29 of the reflector 14. The camera 12 renders the single combined image formed from the two perspective views. The first and second perspective views are perspective views of the target area 30 from opposite sides of the frontal axis 31. Due to the off frontal axis positioning of the reflectors 16, 18, the first and second perspective views are substantially different perspective views of the target area 30.
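The reflective communication described above can be modeled with the standard mirror-reflection rule r = d - 2(d·n)n. The minimal sketch below is not taken from the patent; the mirror orientation is an arbitrary assumption, and it only shows how the optical train lines 32, 34, 36, and 40 could be traced numerically.

```python
import numpy as np

def reflect(direction: np.ndarray, normal: np.ndarray) -> np.ndarray:
    """Reflect a ray direction vector across a mirror with the given normal."""
    n = normal / np.linalg.norm(normal)
    return direction - 2.0 * np.dot(direction, n) * n

# A ray from the target area toward an off-frontal-axis reflector (illustrative numbers).
incoming = np.array([-1.0, -0.2, 0.0])
mirror_normal = np.array([1.0, 1.0, 0.0])     # hypothetical 45-degree mirror orientation
toward_center = reflect(incoming, mirror_normal)
print(toward_center)  # direction after the first bounce; reflect again off reflector 14 toward lens 28
```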
In the embodiment illustrated in FIG. 1, the combined image rendered by camera 12 is a stereo image because the first and second perspective views are different perspective views of the same target area. The stereo image is formed by contemporaneously reflecting the first and second perspective views to the lens 28 of the camera 12. The stereo image is a juxtaposed combination of the first and second perspective views. In particular, the stereo image formed by the illustrative embodiment shown in FIG. 1 is a horizontal juxtaposed combination of the perspective views because the first and second portions 27, 29 of the reflector 14 lie on the same horizontal plane. In other configurations, the reflective portions of the reflector 14 may lie on separate horizontal planes so as to form a combined or stereo image having perspective views in alternative orientations, for example in a vertical juxtaposed configuration.
The combined (i.e. stereo) image formed from the first and second perspective views is transmitted to the processing unit 50 by the camera 12 and the interconnects 48. In the illustrative embodiments, the camera 12 is an interlaced scanning camera; however, other types of cameras or imagers may be used. The camera 12 divides the image into two fields of odd and even rows. Each field contains information from both the first and the second perspective views. The camera 12 transmits the first field, for example the odd field, to the processing unit 50 via the interconnects 48. The processing unit 50 captures and stores the first field in a memory buffer. Subsequent to the transmission of the first field, the camera 12 transmits the second field, for example the even field, to the processing unit 50 via the interconnects 48. The processing unit 50 captures and stores the second field in another memory buffer. The full combined or stereo image, composed of the odd and even fields, is stored in the memory buffers and available to the processing unit 50 for analysis.
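A minimal sketch of the field-splitting step described above, assuming the rendered frame arrives as a NumPy array; the frame size and buffer handling are illustrative, not taken from the patent.

```python
import numpy as np

def split_into_fields(frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Split an interlaced frame into its odd-row and even-row fields."""
    odd_field = frame[1::2, :]    # rows 1, 3, 5, ...
    even_field = frame[0::2, :]   # rows 0, 2, 4, ...
    return odd_field, even_field

frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)  # hypothetical frame
odd_buffer, even_buffer = split_into_fields(frame)
# Each buffer still contains pixels from both perspective views, as the description notes.
```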
In the embodiment illustrated in FIG. 1, typical stereo vision analyzing techniques may be used to extract information from the combined image because the combined image is a stereo image. For example, triangulation techniques may be used to determine depth information of objects or areas of interest. In particular, with a known three dimensional position of the combined image system 10, for example the position of the system 10 within a motor vehicle, disparities between the first and second perspectives can be analyzed to determine the three dimensional coordinates of areas of interest. Additionally, the first and second perspectives can be compared to determine whether an area of interest is rotating or translating, which may be valuable information in some applications. Other techniques may also be used to process the combined or stereo image. For example, the first and second fields may be stored in a single buffer or alternative analyzing algorithms may be employed. Additionally, in some embodiments, the camera 12 may be a progressive scanning camera. In these embodiments, the combined or stereo image is transmitted to the processing unit 50 by the camera 12, received by the unit 50, and divided into fields by the unit 50 prior to the application of the stereo vision analyzing techniques.
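The triangulation mentioned above follows the usual pinhole-stereo relation Z = f·B/d. The sketch below uses illustrative numbers only; in practice the focal length and the effective baseline of the mirror arrangement would have to be calibrated.

```python
def depth_from_disparity(disparity_px: float,
                         focal_length_px: float,
                         baseline_m: float) -> float:
    """Classic stereo triangulation: depth Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px

# Illustrative numbers only; the effective baseline of the mirror rig must be calibrated.
print(depth_from_disparity(disparity_px=12.0,
                           focal_length_px=800.0,
                           baseline_m=0.15))   # about 10 m
```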
In another embodiment of the disclosure, the on-frontal-axis reflector 14B may be formed by a first reflective portion 52, a second reflective portion 54, and a support member 56 as illustrated in FIG. 2. The support member 56 is secured to the substrate 26 in front of the camera 12 by suitable fasteners, such as screws, bolts, or clamps. The member 56 has a triangular cross section and vertically extends away from the substrate 26 at a substantially perpendicular angle. The member 56 is positioned on the substrate 26 so that one of the three vertexes which define the triangular cross section lies substantially on the axis 31. In this configuration, a first side 57 of the member 56 faces the off-axis-reflector 16 and the lens 28 of the camera 12. Similarly, a second side 59 of the member 56 faces the off-axis-reflector 18 and the lens 28 of the camera 12.
The first and second portions 52, 54 of the reflector 14B are rectangular panels each having a substantially reflective side 58, 60, respectively. The first portion 52 is secured to the first side 57 of the member 56 so that the reflective side 58 of the portion 52 is in reflective communication with the reflector 16 and the lens 28 of the camera 12. The second portion 54 is secured to the second side 59 of the member 56 so that the reflective side 60 of the portion 54 is in reflective communication with the reflector 18 and the lens 28 of the camera 12. The first portion 52 is secured to the member 56 on the first side 57 at a position which is vertically higher on the member 56 than the position of the portion 54 secured to the second side 59 of the member 56. This configuration allows parts of the portions 52, 54 to extend horizontally away from the member 56 and vertically cross each other. A camera support 62 may be used in some applications to vertically raise the camera 12 from the substrate 26 so that the lens 28 of the camera 12 is better positioned to be in reflective communication with both portions 52, 54. Although in the embodiment illustrated in FIG. 2 the support member has a triangular cross section, support members having other geometric cross sections may be used. For example, support members having square, rectangular, or hexagonal cross sections are contemplated.
In the embodiment illustrated in FIG. 2, the camera 12 forms a combined image in a manner similar to the embodiment illustrated in FIG. 1. In particular, the combined image is formed by reflecting two perspective views of the target area 30 to the lens 28. The reflector 16 reflects a first perspective view of the target area 30 , as illustrated in FIG. 2 by optical train lines 64 and 66, to the first portion 52 of the reflector 14B. Similarly, the reflector 18 reflects a second perspective view of the target area 30, illustrated in FIG. 2 by optical train lines 68 and 70, to the second portion 54 of the reflector 14B. The reflector 14B subsequently reflects the first and second perspective views to the lens 28 of the camera 12. The reflector 14B contemporaneously reflects both perspective views to the lens 28. The lens 28 of camera 12 receives a single image which is a combination of the first perspective view reflected by the first portion 52 of the reflector 14B and the second perspective view reflected by the second portion 54 of the reflector 14B. The camera 12 renders the single combined image formed from the two perspective views. The first and second perspective views are perspective views of the target area 30 from opposite sides of the frontal axis 31. Due to the off frontal axis positioning of the reflectors 16, 18, the first and second perspective views are substantially different perspective views of the target area 30.
The combined image formed in the embodiment illustrated in FIG. 2 is a stereo image because the first and second perspective views are different perspective views of the same target area. Due to the different vertical positions of the portions 52, 54 on the member 56, the stereo image formed by the embodiment illustrated in FIG. 2 is a vertical juxtaposed combination of the perspective views.
The camera 12 transmits the combined (i.e. stereo) image to the processing unit 50 via the interconnects 48 using, for example, the interlaced frame technique. The processing unit 50 analyzes the stereo image using techniques similar to those discussed above in regard to FIG. 1. The relative vertical or horizontal positioning of the first and second perspective views in the combined image does not substantially alter the processing techniques. In particular, the perspective views may be oriented in several different positions in the combined image. The areas analyzed by the processing unit may be predetermined depending upon the positioning of the first and second perspective views within the stereo image so as to allow use of typical processing techniques regardless of the orientation of the perspective views.
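A hedged sketch of the predetermined-area idea: depending on whether the combined image is a horizontal juxtaposition (FIG. 1) or a vertical one (FIG. 2), the processing unit can slice fixed regions before applying the same stereo routines. The midline split positions are assumptions for illustration only.

```python
import numpy as np

def split_combined_image(image: np.ndarray, layout: str) -> tuple[np.ndarray, np.ndarray]:
    """Return the two perspective views from a juxtaposed combined image."""
    h, w = image.shape[:2]
    if layout == "horizontal":          # FIG. 1 style: views side by side
        return image[:, : w // 2], image[:, w // 2 :]
    if layout == "vertical":            # FIG. 2 style: views stacked
        return image[: h // 2, :], image[h // 2 :, :]
    raise ValueError(f"unknown layout: {layout}")

combined = np.zeros((480, 640), dtype=np.uint8)       # placeholder frame
first_view, second_view = split_combined_image(combined, "horizontal")
```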
In a further embodiment of the disclosure, the on-frontal-axis reflector 14C is formed from a rectangular panel having a substantially reflective side 72. The reflector 14C is positioned on the substrate 26 in front of the lens 28 of the camera 12 and substantially on the axis 31. Additionally, the reflector 14C is movable between a first position and a second position using suitable means for movement. For example, as illustrated in FIGS. 3 and 4, the reflector 14C is rotatable about a center axis. When in the first position (solid line) the reflector 14C is in reflective communication with the reflector 16 and the lens 28 of the camera 12. When in the second position (phantom line) the reflector 14C is in reflective communication with the reflector 18 and the lens 28 of the camera 12. It should be noted that when the reflector 14C is in the first position, the reflector 14C is not in reflective communication with the reflector 18. Additionally, when in the second position, the reflector 14C is not in reflective communication with reflector 16.
An actuator 74, for example a galvanometer, is operably coupled to the reflector 14C via a drive shaft 76 so as to provide means to rotate the reflector 14C as illustrated in FIG. 4. The drive shaft 76 is coupled to the reflector 14C using suitable fasteners, for example screws, bolts, or clamps. Additionally, the actuator 74 may be secured to the same side of substrate 26 as the reflector 14C. Alternatively, the actuator 74 may be secured to an opposite side of substrate 26 as illustrated in FIG. 4. In the latter configuration, the drive shaft 76 of the actuator 74 is coupled to the reflector 14C through an access hole (not shown) in the substrate 26. In either configuration, a support bracket assembly 78 and screws or other suitable fasteners may be used to secure the actuator 74 to the substrate 26. In some applications, a gear assembly may be operably coupled between the drive shaft 76 and the reflector 14C. In addition, other means of moving reflector 14C between the first and second positions may be used. For example, reflector 14C may be pivoted at one end.
The actuator 74 is also coupled to the processing unit 50 via electrical interconnects 80. The electrical interconnects 80 may include such interconnects as wires, cables, and other electrical interconnects useful in operably coupling the actuator 74 to the processing unit 50. The processing unit 50 controls the rotation of the reflector 14C by controlling the operation of the actuator 74. The reflector 14C may be rotated by the cooperation of the unit 50 and the actuator 74 in a clockwise or counterclockwise direction as illustrated in FIG. 3 by an arrow 82.
In the embodiment illustrated in FIGS. 3–4, the processing unit 50 forms a combined image by receiving and processing two images each containing different perspective views of a target area. The processing unit 50 rotates the reflector 14C to the first position (solid line) in reflective communication with the reflector 16 and the lens 28 of the camera 12. A first image containing a first perspective view, illustrated by optical train lines 84 and 86, of the target area 30 is reflected to the reflector 14C by the reflector 16. The reflector 14C subsequently reflects the first image to the lens 28 of the camera 12. The camera 12 transmits the first image to the processing unit 50 via the interconnects 48 using, for example, the interlaced frame technique. The processing unit 50 subsequently rotates the reflector 14C to the second position (phantom line) in reflective communication with the reflector 18 and the lens 28 of the camera 12. A second image containing a second perspective view, illustrated by optical train lines 88 and 90, of the target area 30 is reflected to the reflector 14C by the reflector 18. The reflector 14C subsequently reflects the second image to the lens 28 of the camera 12. The camera 12 transmits the second image to the processing unit 50 via the interconnects 48 using, for example, the interlaced frame technique. The first and second perspective views contained in the first and second images, respectively, are perspective views of the target area 30 from opposite sides of the frontal axis 31. Due to the off frontal axis position of the reflectors 16, 18, the first and second perspective views are substantially different perspective views of the target area 30.
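A minimal control-loop sketch of the two-position capture sequence described above. The Actuator and Camera classes are hypothetical stand-ins for the galvanometer 74 and camera 12; the patent does not define any software interface, so all names and timings are assumptions.

```python
import time

class Actuator:
    """Hypothetical driver for the galvanometer turning reflector 14C."""
    def rotate_to(self, position: str) -> None:
        print(f"rotating reflector 14C to the {position} position")
        time.sleep(0.01)  # allow the mirror to settle (illustrative delay)

class Camera:
    """Hypothetical interface to camera 12."""
    def capture(self):
        return object()   # stand-in for a captured frame

def capture_stereo_pair(actuator: Actuator, camera: Camera):
    """Capture one perspective view per mirror position, as in FIGS. 3-4."""
    actuator.rotate_to("first")    # reflective communication with reflector 16
    first_image = camera.capture()
    actuator.rotate_to("second")   # reflective communication with reflector 18
    second_image = camera.capture()
    return first_image, second_image

pair = capture_stereo_pair(Actuator(), Camera())
```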
The processing unit 50 coordinates the camera 12 and the speed and positioning of the reflector 14C so as to render an image at a correct time point (i.e. when the reflector 14C is in the first and second positions). After receiving each of the first and second images, the processing unit 50 forms a combined image by storing and processing the first and second images. The combined image formed by the first and second images is a stereo image because each of the first and second images depicts a different perspective view of the same target area 30. The processing unit 50 captures, stores, and analyzes the combined (i.e. stereo) image using techniques similar to the techniques described above in regard to FIG. 1. The even and odd fields of the first image containing the first perspective view and the second image containing the second perspective view are stored in suitable memory locations or buffers of the processing unit 50. The processing unit 50 may subsequently use typical analyzing techniques to extract information from the combined image, such as those techniques described above in regard to FIG. 1. For example, the processing unit may analyze areas of interest by comparing the respective fields of the first and second images.
In yet a further embodiment of the disclosure, the off-frontal-axis reflector 16A is movable between a first position and a second position. Similarly, the off-frontal-axis reflector 18A is movable between a third position and a fourth position. For example, as illustrated in FIGS. 5–6, the reflectors 16A, 18A are rotatable around a center axis. In particular, the reflector 16A is movable between the first position (solid line in FIG. 5 a) which is in reflective communication with a first target area 92 and the second position (solid line in FIG. 5 b) which is in reflective communication with a second target area 112. The reflector 16A is also in reflective communication with an on-frontal-axis reflector 14D in both the first and second positions. The reflector 18A is movable between the third position (solid line in FIG. 5 a) which is in reflective communication with a third target area 94 and the fourth position (solid line in FIG. 5 b) which is in reflective communication with a fourth target area 114. Similar to reflector 16A, the reflector 18A is in reflective communication with the on-frontal-axis reflector 14D in both the third and fourth positions.
Each of the reflectors 16A, 18A is coupled to a separate actuator, for example a galvanometer, so as to provide means of motion. The coupling and operation of each actuator and respective reflector 16A, 18A are substantially similar. Therefore, the coupling and operation of the actuators and respective reflectors 16A, 18A are described in regard to reflector 16A only with the understanding that the coupling and operation of reflector 18A is substantially similar. As illustrated in FIG. 6, an actuator 108 is operably coupled to reflector 16A via a drive shaft 110. The drive shaft 110 is coupled to the reflector 16A using suitable fasteners, for example screws, bolts, or clamps. Additionally, the actuator 108 may be secured to the same side of substrate 26 as the reflector 16A. Alternatively, the actuator 108 may be secured to an opposite side of the substrate 26 as illustrated in FIG. 6. In the latter configuration, the drive shaft 110 of the actuator 108 is coupled to the reflector 16A through an access hole (not shown) in the substrate 26. In either configuration, a support bracket assembly 116 and screws or other suitable fasteners may be used to secure the actuator 108 to the substrate 26. In some applications, a gear assembly may be operably coupled between the drive shaft 110 and the reflector 16A. In addition, other means of moving reflector 16A between the first and second positions may be used. For example, reflector 16A may be pivoted at one end between the first and second positions.
The reflector 14D is formed from a first rectangular panel portion 100 having a first end 104 and a second rectangular panel portion 102 having a second end 106. The first and second portions 100, 102 each have a substantially reflective side 101, 103, respectively. The first end 104 of the first portion 100 is perpendicularly abutted to the second end 106 of the second portion 102 so as to form a vertex. The reflector 14D, formed from the first and second portions 100, 102, is secured to the substrate 26 in front of the camera 12 in a position so that the vertex lies substantially on the frontal axis 31. In this configuration, the reflective side 101 of the first portion 100 is in reflective communication with the reflector 16A and the lens 28 of the camera 12. The reflective side 103 of the second portion 102 is in reflective communication with the reflector 18A and the lens 28 of the camera 12. Although in the embodiment illustrated in FIGS. 5–6, the reflector 14D is formed from two portions, the reflector 14D may also be formed from a single reflective panel having two ends displaced toward each other similar to the reflector 14 of FIG. 1.
The actuator 108 is also coupled to the processing unit 50 via electrical interconnects 118. The interconnects 118 may include wires, cables, and other electrical interconnects useful in operably coupling the actuator 108 to the processing unit 50. The processing unit 50 controls the rotation of the reflectors 16A, 18A by controlling the operation of the actuators. The reflectors 16A, 18A may be rotated by the cooperation of the unit 50 and the actuators in a clockwise or counterclockwise direction, as illustrated in FIGS. 5 a–b by arrows 96, 98, respectively.
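The control relationship described above can be pictured with a short Python sketch, shown below. It is a minimal illustration only: the GalvoActuator interface, the class and method names, and the preset angles are assumptions introduced here for clarity and are not part of the disclosure.

# Minimal sketch (hypothetical interface): a processing unit commanding two
# galvanometer actuators that rotate reflectors 16A and 18A between preset
# positions. Angles and names are illustrative assumptions, not values from
# the disclosure.

class GalvoActuator:
    """Stand-in for a galvanometer driving one reflector via a drive shaft."""

    def __init__(self, name: str):
        self.name = name
        self.angle_deg = 0.0

    def rotate_to(self, angle_deg: float) -> None:
        # A real driver would ramp the coil current and wait for settling;
        # here we simply record the commanded angle.
        self.angle_deg = angle_deg
        print(f"{self.name} rotated to {angle_deg:.1f} deg")


class ProcessingUnitSketch:
    """Coordinates the two off-axis reflectors (cf. reflectors 16A, 18A)."""

    # Hypothetical preset angles for the first/second (16A) and
    # third/fourth (18A) positions.
    PRESETS = {
        "first": 30.0, "second": -30.0,   # reflector 16A
        "third": -30.0, "fourth": 30.0,   # reflector 18A
    }

    def __init__(self):
        self.actuator_16a = GalvoActuator("actuator-16A")
        self.actuator_18a = GalvoActuator("actuator-18A")

    def select_targets(self, pos_16a: str, pos_18a: str) -> None:
        """Move both reflectors so their reflected views reach reflector 14D."""
        self.actuator_16a.rotate_to(self.PRESETS[pos_16a])
        self.actuator_18a.rotate_to(self.PRESETS[pos_18a])


if __name__ == "__main__":
    unit = ProcessingUnitSketch()
    unit.select_targets("first", "third")    # view target areas 92 and 94
    unit.select_targets("second", "fourth")  # view target areas 112 and 114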
In the embodiment illustrated in FIGS. 5–6, a combined image is formed by reflecting two perspective views of a target area to the lens 28 of the camera 12. The processing unit 50 rotates the reflector 16A to the first position (solid line in FIG. 5 a) in reflective communication with the reflective side 101 of the reflector 14D and the first target area 92. The processing unit 50 also rotates the reflector 18A to the third position (solid line in FIG. 5 a) in reflective communication with the reflective side 103 of the reflector 14D and the third target area 94. When in the first position, the reflector 16A reflects a perspective view, illustrated by optical train lines 120 and 122 in FIG. 5 a, of the target area 92 to the reflective side 101 of the reflector 14D. Similarly, when in the third position, the reflector 18A subsequently reflects a perspective view, illustrated by optical train lines 124 and 126 in FIG. 5 a, of the target area 94 to the reflective side 103 of the reflector 14D. The reflective sides 101, 103 of the reflector 14D subsequently reflects the perspective views of the target areas 92, 94 to the lens 28 of the camera 12. The lens 28 of the camera 12, therefore, receives a single image which is a combination of the perspective view reflected by the reflective side 101 of the reflector 14D and the perspective view reflected by the reflective side 103 of the reflector 14D. The camera 12 receives the combined image formed from the two perspective views.
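Because the vertex reflector 14D folds both reflected views onto the single lens 28, each acquired frame contains the two perspectives together. The sketch below assumes, purely for illustration, that the two views occupy opposite halves of the sensor image; that split point, the function name, and the use of NumPy arrays are assumptions and not something specified in the disclosure.

# Minimal sketch: separate a combined frame into its two perspective views,
# assuming each half of the sensor sees one reflected view.
import numpy as np


def split_combined_frame(frame: np.ndarray):
    """Return (left_view, right_view) halves of a combined image.

    frame: H x W (or H x W x C) array captured by the single camera.
    """
    width = frame.shape[1]
    left_view = frame[:, : width // 2]    # view reflected via side 101 (assumed)
    right_view = frame[:, width // 2 :]   # view reflected via side 103 (assumed)
    return left_view, right_view


if __name__ == "__main__":
    # Stand-in for a captured 480 x 640 grayscale frame.
    combined = np.zeros((480, 640), dtype=np.uint8)
    left, right = split_combined_frame(combined)
    print(left.shape, right.shape)  # (480, 320) (480, 320)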
To view additional target areas, the processing unit 50 may rotate the reflector 16A to the second position and the reflector 18A to the fourth position. In the second position, reflector 16A reflects a perspective view, illustrated by optical train lines 130 and 132 in FIG. 5 b, of the second target area 112 to the reflector 14D. Similarly, in the fourth position, the reflector 18A reflects a perspective view, illustrated by optical train lines 134 and 136 in FIG. 5 b, of the fourth target area 114 to the reflector 14D. The perspective views from the reflectors 16A, 18A are subsequently reflected to the lens 28 of the camera 12 by the reflector 14D, as described above in regard to FIG. 5 a. The processing unit 50 may continue to rotate the reflectors 16A, 18A to various positions so as to reflect perspective views of different selected target areas. If the selected target areas are different, for example target areas 92, 94 as illustrated in FIG. 5 a, the two perspective views will be different perspective views of different target areas and will form a combined image when received by the camera 12. Alternatively, if the target areas are identical, for example if target area 92 and target area 94 are identical, the two perspective views will be different perspective views of the same target area and will form a combined image which is a stereo image when received by the camera 12.
The processing unit 50 coordinates the camera 12 and the speed and positioning of the reflectors 16A, 18A so as to render an image at the correct time point (i.e., when the reflectors 16A, 18A are in the correct positions). The combined image formed by the perspective views of target areas 92, 94 is transmitted from the camera 12 to the processing unit 50 using, for example, the interlaced frame technique. The processing unit 50 captures the combined image and analyzes it to extract desirable information. If the combined image is a stereo image, the processing unit 50 may analyze it using techniques similar to those discussed above in regard to FIG. 1. If the combined image is not a stereo image, three-dimensional information may not be obtainable; however, other processing techniques may be used to extract information from the combined image. For example, the occupancy of motor vehicle passengers in two different target areas may be detected.
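The timing requirement described here, capturing only once both reflectors have settled in their commanded positions and then dispatching the frame to the appropriate analysis, can be illustrated with the following Python sketch. The polling interface, the stub functions, and the timeout value are hypothetical placeholders introduced for illustration; they are not APIs or parameters from the disclosure.

# Minimal sketch: coordinate frame capture with reflector positioning, then
# route the combined image to stereo or occupancy analysis.
import time


def is_settled() -> bool:
    """Stand-in for querying both actuators for position feedback."""
    return True


def capture_frame() -> object:
    """Stand-in for reading one combined frame from the camera."""
    return object()


def analyze_stereo(frame) -> None:
    print("stereo analysis: recover depth from two views of one target area")


def analyze_occupancy(frame) -> None:
    print("occupancy analysis: detect occupants in two different target areas")


def capture_when_positioned(same_target_area: bool, timeout_s: float = 0.5):
    """Wait for the reflectors to reach their commanded positions, then capture."""
    deadline = time.monotonic() + timeout_s
    while not is_settled():
        if time.monotonic() > deadline:
            raise TimeoutError("reflectors did not settle in time")
        time.sleep(0.001)
    frame = capture_frame()
    # A stereo image results only when both views look at the same target area.
    if same_target_area:
        analyze_stereo(frame)
    else:
        analyze_occupancy(frame)
    return frame


if __name__ == "__main__":
    capture_when_positioned(same_target_area=False)  # e.g. target areas 92 and 94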
In some applications, a combined image system 10D may incorporate a combination of the embodiments illustrated and described in regard to FIGS. 3–4 and FIGS. 5–6. In particular, as shown in FIG. 7, the system 10D includes the on-center-axis reflector 14C, movable between a first and second position, and off-center-axis reflectors 16A, 18A, separately movable between a first and second position and a third and fourth position, respectively. The operation of the reflector 14C is substantially similar to the operation of the reflector 14C described in regard to FIGS. 3–4. The operation of the reflectors 16A, 18A is similar to the operation of the reflectors 16A, 18A described in regard to FIGS. 5–6. In the embodiment illustrated in FIG. 7, the processing unit 50 coordinates the speed and positioning of each reflector 14C, 16A, 18A so as to render an image at the correct time point (i.e., when the reflectors 14C, 16A, 18A are each in the correct position). If the perspective views reflected by the reflectors 16A, 18A are different perspective views of an identical target area, the camera 12 receives a combined image which is a stereo image. The processing unit 50 may analyze the stereo image using techniques similar to those discussed above in regard to FIG. 1. Conversely, if the perspective views reflected by the reflectors 16A, 18A are perspective views of different target areas, the camera 12 receives a combined image which is not a stereo image. The processing unit 50 may analyze this combined image using techniques similar to those discussed above in regard to FIGS. 5–6.
In yet a further embodiment, a substrate 26B of a combined image system 10E may be movable between a first position and a second position as illustrated in FIGS. 8 a–b. Moving the substrate 26B alters the target view of the system 10E. In particular, when in the first position, as shown in FIG. 8 a, the system 10E has a target view of a first target area 140. When in the second position, as shown in FIG. 8 b, the system 10E has a target view of a second target area 142. Therefore, a single system 10E can target a plurality of areas of interest. For example, a single system 10E can monitor a driver occupying a first target area of a motor vehicle and a passenger occupying a second target area of the motor vehicle by alternating between the two target areas. The operation of the system 10E may be similar to any of the embodiments illustrated in FIGS. 1–7. For example, a combined image rendered by the system 10E may be analyzed using the techniques discussed above in regard to FIGS. 1, 3–4, 5–6, or 7.
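A monitoring loop that alternates the movable substrate between the two cabin target areas might look like the sketch below. The move_substrate and process_target hooks, the position labels, and the dwell time are hypothetical values chosen only to illustrate the alternation; they are not specified by the disclosure.

# Minimal sketch: alternate the movable substrate between two target areas
# (e.g. driver and passenger positions).
import itertools
import time


def move_substrate(position: str) -> None:
    """Stand-in for driving the substrate to the first or second position."""
    print(f"substrate moved to {position} position")


def process_target(area: str) -> None:
    """Stand-in for capturing and analyzing a combined image of one area."""
    print(f"captured and analyzed combined image of {area} target area")


def monitor(cycles: int = 2, dwell_s: float = 0.0) -> None:
    """Alternate between the first and second target areas."""
    schedule = itertools.cycle([("first", "driver"), ("second", "passenger")])
    for _ in range(cycles * 2):
        position, area = next(schedule)
        move_substrate(position)
        process_target(area)
        time.sleep(dwell_s)


if __name__ == "__main__":
    monitor()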
While the invention has been illustrated and described in detail in the foregoing drawings and description, the same is to be considered as illustrative and not restrictive in character, it being understood that only illustrative embodiments thereof have been shown and described and that all changes and modifications that come within the spirit of the invention are desired to be protected.

Claims (17)

1. An apparatus for generating a combined image of an object, the apparatus comprising:
a first reflector;
a second reflector in reflective communication with the first reflector and with a first portion of the object corresponding to a first perspective view of the object;
a third reflector in reflective communication with the first reflector and a second portion of the object corresponding to a second perspective view of the object;
a camera receiving the first and second perspective views from the first reflector and forming the combined image from the first and second perspective views; and
a substrate, the first reflector, second reflector, third reflector, and camera being secured to the substrate, the substrate being movable between a first position and a second position.
2. The apparatus of claim 1, wherein the first reflector includes a first reflective portion in reflective communication with the second reflector and a second reflective portion in reflective communication with the third reflector.
3. The apparatus of claim 1, wherein the first reflector includes a reflective panel having a first end and a second end, the first and second ends being displaced toward each other to form a vertex substantially in the center of the reflective panel.
4. The apparatus of claim 1, further comprising an actuator coupled to the first reflector, the actuator moving the first reflector between a first position in reflective communication with the second reflector and a second position in reflective communication with the third reflector.
5. The apparatus of claim 4, wherein the actuator comprises a galvanometer.
6. The apparatus of claim 1, further comprising a first actuator coupled to the second reflector and a second actuator coupled to the third reflector, the first actuator moving the second reflector between a first position and a second position, the second actuator moving the third reflector between a first position and a second position.
7. The apparatus of claim 6, wherein the first and second actuators comprise galvanometers.
8. The apparatus of claim 1, wherein the apparatus is coupled to a motor vehicle.
9. The apparatus of claim 1, further comprising a processing unit coupled to the camera.
10. The apparatus of claim 1, wherein the combined image is a stereo image.
11. An apparatus for generating a combined image of an object, the apparatus comprising:
a first reflector movable between a first position and a second position;
a second reflector in reflective communication with the first reflector when the first reflector is in the first position and with a first portion of the object corresponding to a first perspective view of the object;
a third reflector in reflective communication with the first reflector when the first reflector is in the second position and a second portion of the object corresponding to a second perspective view of the object;
a camera receiving the first and second perspective views from the first reflector;
a processing unit coupled to the camera, the processing unit receiving the first and second perspective views from the camera and forming the combined image from the first and second perspective views; and
a substrate, the first reflector, second reflector, third reflector, and camera being secured to the substrate, the substrate being movable between a first position and a second position.
12. The apparatus of claim 11, further comprising an actuator coupled to the first reflector, the actuator moving the first reflector between the first and second positions.
13. The apparatus of claim 12, wherein the actuator is a galvanometer.
14. The apparatus of claim 11, wherein the first reflector comprises a panel having at least one substantially reflective side.
15. The apparatus of claim 11, further comprising a first actuator coupled to the second reflector and a second actuator coupled to the third reflector, the first actuator moving the second reflector between a first position and a second position, the second actuator moving the third reflector between a first position and a second position.
16. The apparatus of claim 15, wherein the first and second actuators comprise galvanometers.
17. The apparatus of claim 11, wherein the apparatus is coupled to a motor vehicle.
US10/603,540 2003-06-25 2003-06-25 Apparatus for generating a combined image Active 2025-04-05 US7162153B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US10/603,540 US7162153B2 (en) 2003-06-25 2003-06-25 Apparatus for generating a combined image
EP04076770A EP1492357A3 (en) 2003-06-25 2004-06-16 Apparatus for generating a combined image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US10/603,540 US7162153B2 (en) 2003-06-25 2003-06-25 Apparatus for generating a combined image

Publications (2)

Publication Number Publication Date
US20040263612A1 US20040263612A1 (en) 2004-12-30
US7162153B2 true US7162153B2 (en) 2007-01-09

Family

ID=33418663

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/603,540 Active 2025-04-05 US7162153B2 (en) 2003-06-25 2003-06-25 Apparatus for generating a combined image

Country Status (2)

Country Link
US (1) US7162153B2 (en)
EP (1) EP1492357A3 (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080019684A1 (en) * 2006-07-24 2008-01-24 Young Optics Inc. Camera module
US20100328424A1 (en) * 2004-10-06 2010-12-30 Walter Carl Thomas Method and apparatus for 3-d electron holographic visual and audio scene propagation in a video or cinematic arena, digitally processed, auto language tracking
US8786767B2 (en) * 2012-11-02 2014-07-22 Microsoft Corporation Rapid synchronized lighting and shuttering
US8854799B2 (en) 2012-03-02 2014-10-07 Microsoft Corporation Flux fountain
US20150037022A1 (en) * 2013-07-31 2015-02-05 Delphi Technologies, Inc. Camera system with rotating mirror
US9075566B2 (en) 2012-03-02 2015-07-07 Microsoft Technoogy Licensing, LLC Flexible hinge spine
US9354748B2 (en) 2012-02-13 2016-05-31 Microsoft Technology Licensing, Llc Optical stylus interaction
US9824808B2 (en) 2012-08-20 2017-11-21 Microsoft Technology Licensing, Llc Switchable magnetic lock
US9857575B2 (en) 2012-02-06 2018-01-02 Cognex Corporation System and method for expansion of field of view in a vision system
US9870066B2 (en) 2012-03-02 2018-01-16 Microsoft Technology Licensing, Llc Method of manufacturing an input device
US10120420B2 (en) 2014-03-21 2018-11-06 Microsoft Technology Licensing, Llc Lockable display and techniques enabling use of lockable displays
US20190077318A1 (en) * 2017-09-12 2019-03-14 Ford Global Technologies, Llc Vehicle sensor system
US10324733B2 (en) 2014-07-30 2019-06-18 Microsoft Technology Licensing, Llc Shutdown notifications
US10678743B2 (en) 2012-05-14 2020-06-09 Microsoft Technology Licensing, Llc System and method for accessory device architecture that passes via intermediate processor a descriptor when processing in a low power state
US11172112B2 (en) 2019-09-09 2021-11-09 Embedtek, LLC Imaging system including a non-linear reflector
US11966810B2 (en) 2012-02-06 2024-04-23 Cognex Corporation System and method for expansion of field of view in a vision system

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7791640B2 (en) * 2004-01-23 2010-09-07 Olympus Corporation Electronic camera and image generating apparatus generating stereo image
US7771054B2 (en) * 2004-09-10 2010-08-10 Hitachi, Ltd. Display system and camera system
IL166595A0 (en) * 2005-01-31 2006-01-15 Uri Neta Image acquisition system
KR101234078B1 (en) * 2006-09-26 2013-02-15 삼성전자주식회사 Adapter and three dimensional image photography apparatus having the same
CN101963750B (en) * 2009-07-21 2013-08-28 鸿富锦精密工业(深圳)有限公司 Three-dimensional image-taking camera module
US9892298B2 (en) 2012-02-06 2018-02-13 Cognex Corporation System and method for expansion of field of view in a vision system
US8646690B2 (en) 2012-02-06 2014-02-11 Cognex Corporation System and method for expansion of field of view in a vision system
US9678099B2 (en) * 2014-04-24 2017-06-13 Cubic Corporation Athermalized optics for laser wind sensing
US9791244B2 (en) 2014-11-17 2017-10-17 Cubic Corporation Rifle scope targeting display adapter mount
US10274286B2 (en) 2014-11-17 2019-04-30 Cubic Corporation Rifle scope targeting display adapter
US10443984B2 (en) 2014-11-17 2019-10-15 Cubic Corporation Low-cost rifle scope display adapter
DE102015218179A1 (en) * 2015-09-22 2017-03-23 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method and device for enlarging the field of view of a camera and camera equipped therewith
FR3075738B1 (en) * 2017-12-22 2020-11-27 Valeo Comfort & Driving Assistance VISUALIZATION SYSTEM, DASHBOARD INCLUDING SUCH A VISUALIZATION SYSTEM, AND CONSOLE FOR A VEHICLE INTERIOR INCLUDING SUCH VIEWING SYSTEM
US11281868B2 (en) 2020-03-10 2022-03-22 Cognex Corporation Modular vision system and methods
US11665410B2 (en) * 2020-03-10 2023-05-30 Cognex Corporation Modular vision systems and methods

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2663075B2 (en) * 1992-02-26 1997-10-15 本田技研工業株式会社 Ambient monitoring system for vehicles
JPH07181608A (en) * 1993-12-21 1995-07-21 Asahi Optical Co Ltd Camera having reflection optical system for stereophotography
US5532777A (en) * 1995-06-06 1996-07-02 Zanen; Pieter O. Single lens apparatus for three-dimensional imaging having focus-related convergence compensation
US6567682B1 (en) * 1999-11-16 2003-05-20 Carecord Technologies, Inc. Apparatus and method for lesion feature identification and characterization
JP3905736B2 (en) * 2001-10-12 2007-04-18 ペンタックス株式会社 Stereo image pickup device and automatic congestion adjustment device

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US1655850A (en) * 1921-12-15 1928-01-10 George E Watts Photographic camera
US4093364A (en) * 1977-02-04 1978-06-06 Miller Keith G Dual path photographic camera for use in motor vehicles
US5856888A (en) * 1996-05-13 1999-01-05 Ait Corporation Split beam optical character reader
US6356854B1 (en) 1999-04-05 2002-03-12 Delphi Technologies, Inc. Holographic object position and type sensing system and method
US6862140B2 (en) * 2000-02-01 2005-03-01 Canon Kabushiki Kaisha Stereoscopic image pickup system
US20020021354A1 (en) * 2000-06-07 2002-02-21 Katsushi Suzuki Imaging sensing apparatus
US6447132B1 (en) 2001-02-20 2002-09-10 Delphi Technologies, Inc. Day/night HUD backlighting system
US6574048B2 (en) 2001-02-28 2003-06-03 Delphi Technologies, Inc. Method and apparatus for attenuating solar flux in a head-up display
US6809451B1 (en) * 2002-08-14 2004-10-26 Gsi Lumonics Corporation Galvanometer motor with composite rotor assembly

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100328424A1 (en) * 2004-10-06 2010-12-30 Walter Carl Thomas Method and apparatus for 3-d electron holographic visual and audio scene propagation in a video or cinematic arena, digitally processed, auto language tracking
US8482601B2 (en) * 2004-10-06 2013-07-09 Presley Jordan Thomas-Wayne Method and apparatus for 3-D electron holographic visual and audio scene propagation in a video or cinematic arena, digitally processed, auto language tracking
US7643749B2 (en) * 2006-07-24 2010-01-05 Young Optics Inc. Camera module
US20080019684A1 (en) * 2006-07-24 2008-01-24 Young Optics Inc. Camera module
US11966810B2 (en) 2012-02-06 2024-04-23 Cognex Corporation System and method for expansion of field of view in a vision system
US9857575B2 (en) 2012-02-06 2018-01-02 Cognex Corporation System and method for expansion of field of view in a vision system
US9354748B2 (en) 2012-02-13 2016-05-31 Microsoft Technology Licensing, Llc Optical stylus interaction
US9904327B2 (en) 2012-03-02 2018-02-27 Microsoft Technology Licensing, Llc Flexible hinge and removable attachment
US9678542B2 (en) 2012-03-02 2017-06-13 Microsoft Technology Licensing, Llc Multiple position input device cover
US9075566B2 (en) 2012-03-02 2015-07-07 Microsoft Technoogy Licensing, LLC Flexible hinge spine
US9158384B2 (en) 2012-03-02 2015-10-13 Microsoft Technology Licensing, Llc Flexible hinge protrusion attachment
US9176900B2 (en) 2012-03-02 2015-11-03 Microsoft Technology Licensing, Llc Flexible hinge and removable attachment
US9176901B2 (en) 2012-03-02 2015-11-03 Microsoft Technology Licensing, Llc Flux fountain
US9268373B2 (en) 2012-03-02 2016-02-23 Microsoft Technology Licensing, Llc Flexible hinge spine
US9275809B2 (en) 2012-03-02 2016-03-01 Microsoft Technology Licensing, Llc Device camera angle
US10963087B2 (en) 2012-03-02 2021-03-30 Microsoft Technology Licensing, Llc Pressure sensitive keys
US9465412B2 (en) 2012-03-02 2016-10-11 Microsoft Technology Licensing, Llc Input device layers and nesting
US10013030B2 (en) 2012-03-02 2018-07-03 Microsoft Technology Licensing, Llc Multiple position input device cover
US9618977B2 (en) 2012-03-02 2017-04-11 Microsoft Technology Licensing, Llc Input device securing techniques
US9619071B2 (en) 2012-03-02 2017-04-11 Microsoft Technology Licensing, Llc Computing device and an apparatus having sensors configured for measuring spatial information indicative of a position of the computing devices
US8854799B2 (en) 2012-03-02 2014-10-07 Microsoft Corporation Flux fountain
US9870066B2 (en) 2012-03-02 2018-01-16 Microsoft Technology Licensing, Llc Method of manufacturing an input device
US8947864B2 (en) 2012-03-02 2015-02-03 Microsoft Corporation Flexible hinge and removable attachment
US10678743B2 (en) 2012-05-14 2020-06-09 Microsoft Technology Licensing, Llc System and method for accessory device architecture that passes via intermediate processor a descriptor when processing in a low power state
US9824808B2 (en) 2012-08-20 2017-11-21 Microsoft Technology Licensing, Llc Switchable magnetic lock
US9544504B2 (en) 2012-11-02 2017-01-10 Microsoft Technology Licensing, Llc Rapid synchronized lighting and shuttering
US8786767B2 (en) * 2012-11-02 2014-07-22 Microsoft Corporation Rapid synchronized lighting and shuttering
US9042717B2 (en) * 2013-07-31 2015-05-26 Delphi Technologies, Inc. Camera system with rotating mirror
US20150037022A1 (en) * 2013-07-31 2015-02-05 Delphi Technologies, Inc. Camera system with rotating mirror
US10120420B2 (en) 2014-03-21 2018-11-06 Microsoft Technology Licensing, Llc Lockable display and techniques enabling use of lockable displays
US10324733B2 (en) 2014-07-30 2019-06-18 Microsoft Technology Licensing, Llc Shutdown notifications
US20190077318A1 (en) * 2017-09-12 2019-03-14 Ford Global Technologies, Llc Vehicle sensor system
US10632926B2 (en) * 2017-09-12 2020-04-28 Ford Global Technologies, Llc Vehicle sensor system
US11172112B2 (en) 2019-09-09 2021-11-09 Embedtek, LLC Imaging system including a non-linear reflector

Also Published As

Publication number Publication date
EP1492357A3 (en) 2008-05-21
EP1492357A2 (en) 2004-12-29
US20040263612A1 (en) 2004-12-30

Similar Documents

Publication Publication Date Title
US7162153B2 (en) Apparatus for generating a combined image
CN110178369B (en) Imaging device, imaging system, and display system
CN1924514B (en) Obstacle detector for vehicle
WO2013022577A1 (en) Field of view matching video display system
US8199975B2 (en) System and method for side vision detection of obstacles for vehicles
US10198639B2 (en) System and method for providing image information around vehicle
WO2004036895A2 (en) Method for arranging cameras and mirrors to allow panoramic visualization
CN1333626A (en) Image synthesizing device and method
US11087438B2 (en) Merging of partial images to form an image of surroundings of a mode of transport
CN1335238A (en) Rear-viewing mirror of commercial vehicle with pickup camera and monitor II
WO2015182457A1 (en) Vehicle exterior observation device, and imaging device
CN1921621A (en) Panorama visible driving ancillary equipment
JP2002033943A (en) Omnidirectional visual sensor
JP6433684B2 (en) Imaging device for vehicle external monitoring device
US11055541B2 (en) Vehicle lane marking and other object detection using side fisheye cameras and three-fold de-warping
Gehrig Large-field-of-view stereo for automotive applications
US20040223077A1 (en) Imaging three-dimensional objects
EP3410705A1 (en) 3d vision system for a motor vehicle and method of controlling a 3d vision system
US20040071316A1 (en) Method for image recognition in motor vehicles
CN1825951A (en) Vehicle-periphery viewing apparatus
CN212220070U (en) Vehicle real-time positioning system based on visual semantic segmentation technology
CN111452726A (en) Far and near view combined panoramic imaging system for vehicle
US20230007190A1 (en) Imaging apparatus and imaging system
CN208585173U (en) Virtual image display device for vehicle
CN113039485A (en) Moving body periphery monitoring device

Legal Events

Date Code Title Description
AS Assignment

Owner name: DELPHI TECHNOLOGIES, INC., MICHIGAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HARTER JR., JOSEPH E.;SCHARENBROCH, GREGORY K.;TAYLOR, RONALD M.;REEL/FRAME:014240/0190;SIGNING DATES FROM 20030612 TO 20030618

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553)

Year of fee payment: 12

AS Assignment

Owner name: APTIV TECHNOLOGIES LIMITED, BARBADOS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DELPHI TECHNOLOGIES INC.;REEL/FRAME:047143/0874

Effective date: 20180101

AS Assignment

Owner name: APTIV TECHNOLOGIES (2) S.A R.L., LUXEMBOURG

Free format text: ENTITY CONVERSION;ASSIGNOR:APTIV TECHNOLOGIES LIMITED;REEL/FRAME:066746/0001

Effective date: 20230818

Owner name: APTIV MANUFACTURING MANAGEMENT SERVICES S.A R.L., LUXEMBOURG

Free format text: MERGER;ASSIGNOR:APTIV TECHNOLOGIES (2) S.A R.L.;REEL/FRAME:066566/0173

Effective date: 20231005

Owner name: APTIV TECHNOLOGIES AG, SWITZERLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:APTIV MANUFACTURING MANAGEMENT SERVICES S.A R.L.;REEL/FRAME:066551/0219

Effective date: 20231006