US20180061008A1 - Imaging system and method - Google Patents
Imaging system and method
- Publication number: US20180061008A1 (application US 15/253,159)
- Authority: US (United States)
- Prior art keywords: window, image, optical assembly, images, processor
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G06T5/001—
-
- G06K9/00791—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/147—Details of sensors, e.g. sensor lenses
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/20—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
-
- H04N5/2254—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/30—Transforming light or analogous information into electric information
- H04N5/33—Transforming infrared radiation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60S—SERVICING, CLEANING, REPAIRING, SUPPORTING, LIFTING, OR MANOEUVRING OF VEHICLES, NOT OTHERWISE PROVIDED FOR
- B60S1/00—Cleaning of vehicles
- B60S1/02—Cleaning windscreens, windows or optical devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20224—Image subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- the present invention generally relates to imaging systems. More specifically, the invention relates to infrared imaging systems utilized in automotive safety systems.
- a layer of moisture or water may form on an external optical or a protective window surface of a camera that is exposed to the environment.
- cameras such as infrared, long-wave, mid-wave, short-wave, near infrared, and visible cameras are used to detect objects in a scene from a moving vehicle. These objects may be of interest to the driver of an automobile or safety systems of the automobile, so as to prevent or minimize vehicle accidents.
- the layer of water and moisture that collects on the front window reduces the thermal energy that reaches the long-wave infrared sensor and reduces the camera's ability to properly see the scene.
- the image produced has low thermal contrast and a histogram with limited usable content for the driver of the vehicle or for other systems using detection algorithms.
- Prior art solutions have utilized a heater to eliminate moisture from a window.
- this solution generates major artifacts that reduce the usefulness of the camera system.
- One type of artifact is a rain/splash artifact. This occurs when wet snow or rain makes contact with a warm window. The moisture becomes warm, and it appears to the driver or detection algorithms as a flash or image burst. These artifacts reduce the effectiveness of the vision system to the driver and/or other automobile safety systems.
- the second issue commonly found is that when the moisture is heated by the heater, the moisture tends to linger on the window while the moisture evaporates. As a result, the scene develops bright or sometimes dark corners found in the corner of the image captured by the camera system.
- An imaging system and method includes an imaging sensor, a processor in communication with the imaging sensor, a protective window or other exposed optical surface located between the camera system and the scene to be captured by the camera system, and a heater system in thermal communication with the window or other exposed optical surface.
- the imaging sensor is configured to capture images of a scene, each of the captured images comprising a plurality of pixels.
- the processor is configured to receive information representing the images comprising the plurality of pixels captured by the imaging sensor, determine if rain splash artifacts or bright/dark non-uniformities are present, and remove artifacts in the information representing the images.
- FIG. 1 illustrates an environment having an automobile with an imaging system
- FIG. 2 illustrates a block diagram of the imaging system
- FIG. 3 illustrates an imaging method
- FIG. 4 illustrates an image captured by the imaging system of FIG. 2 and processed by the imaging method of FIG. 3 ;
- FIG. 5 illustrates another imaging method.
- an environment 10 including an imaging system 12 located in a vehicle 14 is shown.
- the environment 10 may be any type of environment.
- the environment 10 includes a road 16 on which the vehicle 14 is traveling.
- the environment 10 also includes a number of different objects.
- the environment 10 includes a building 18 , trees 20 and 22 .
- the environment 10 includes a number of moving objects, such as wildlife 24 and persons 26 .
- the environment 10 may vary significantly.
- the vehicle 14 may be alternatively traveling on a highway or off-road altogether.
- the environment 10 could be subject to any one of a number of different weather conditions, such as sunny, partly cloudy, rainy, foggy, snowy, or any other known weather conditions.
- the imaging system 12 may generally be mounted on or near a grill 28 of the vehicle 14 .
- the grill 28 is generally at the front of the car so as to capture a scene 30 forward of a vehicle 14 .
- the scene 30 varies so that the imaging system 12 can capture images of the building 18 , trees 20 and 22 , wildlife 24 , and persons 26 , if any of these objects are located within the scene 30 as the vehicle 14 moves along the road 16 or elsewhere.
- the imaging system 12 may not be able to see the objects located in the environment 10 . This, in turn, prevents this information from being presented to a driver or algorithms executed by any one of a number of different systems of the vehicle 14 .
- the imaging system 12 may be located and utilized in any one of a number of different applications.
- the imaging system 12 may be utilized on any other type of vehicle, such as a boat, plane, truck, construction equipment, tractors, and the like.
- the imaging system 12 could be utilized separate and apart from any other vehicle 14 shown or previously mentioned.
- the imaging system 12 could be mounted to a person, structure, and the like.
- the system 12 may be mounted such that it is removable, so that it can be utilized in any one of a number of different applications.
- the imaging system 12 includes an imaging sensor 102 , an optical assembly 104 , a shutter 106 , and a window 108 .
- the sensor 102 may be any type of sensor capable of capturing images.
- the sensor 102 is an infrared sensor capable of capturing infrared images. It should also be understood that the sensor 102 may be a sensor capable of capturing different wavelengths of light.
- the sensor 102 may be a long-wave sensor (7-15 microns wavelength), a mid-wave sensor (2.5-7 microns wavelength), a short-wave sensor (1.1-2.5 microns wavelength), and/or a near infrared sensor (0.75-1.1 microns wavelength).
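The wavelength bands above can be captured in a small lookup table. A minimal sketch, with the band boundaries taken directly from the ranges listed and the helper name being hypothetical:

```python
# Sensor bands and boundaries (in microns) as named in the text.
IR_BANDS = [
    ("near infrared", 0.75, 1.1),
    ("short-wave", 1.1, 2.5),
    ("mid-wave", 2.5, 7.0),
    ("long-wave", 7.0, 15.0),
]

def band_for_wavelength(microns: float) -> str:
    """Return the first listed sensor band covering the wavelength, if any."""
    for name, low, high in IR_BANDS:
        if low <= microns <= high:
            return name
    return "outside listed ranges"
```

Note that boundary wavelengths (for example 2.5 microns) fall in the first matching band; the text does not say how such boundaries are assigned.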
- the optical assembly 104 may include one or more optics for directing radiation from the scene 30 towards the sensor 102 and has a first side 129 facing towards the sensor 102 and a second side 127 generally facing towards the scene 30 to be captured.
- radiation could mean any type of radiation or visual information, such as light, capable of being received and detected by the sensor 102 .
- the system 12 may include a shutter 106 that allows light or radiation to pass for a predetermined period of time.
- the window 108 may be any one of a number of different windows capable of allowing the radiation or light to pass through.
- the window 108 may be a germanium or silicon window.
- any one of a number of different materials may be utilized in the manufacturing of the window 108 .
- the window 108 may not be present at all.
- the window 108 is generally located between the sensor 102 and the scene 30 to be captured.
- the optical assembly 104 is located between the window 108 and the imaging sensor 102 . If a shutter 106 is utilized, the shutter 106 may be located between the window 108 and the optical assembly 104 . Optionally, the shutter 106 may also be located behind the optical assembly 104 or in front of the window 108 —essentially anywhere between the imaging sensor 102 and the scene 30 .
- the sensor 102 and the optical assembly 104 generally form a camera system 110 that is configured to capture images of the scene 30 . Each of these captured images comprises a plurality of pixels.
- the imaging system 12 also includes a processor 112 configured to receive information representing the images comprising the plurality of pixels captured by the camera system 110 .
- the processor 112 may be a single processor or may be multiple processors working in concert.
- a memory device 114 may be in communication with the processor 112 .
- the memory device 114 may be configured to store instructions for executing an imaging method to be described later in this specification.
- the memory 114 may be configured to store information received from the camera system 110 regarding the captured images from the scene 30 .
- the memory 114 may be any type of memory capable of storing digital information, such as optical memories, magnetic memories, solid state memories, and the like. Additionally, it should be understood that the memory 114 may be integrated within the processor 112 or separate as shown.
- the processor 112 may be connected to a number of different devices that utilize the information representing the images captured from the scene 30 .
- the processor 112 may provide this information to a display device 116 having a display area 118 .
- the display device 116 displays captured images from the scene 30 to a user.
- the user may be an operator of the vehicle 14 of FIG. 1 . This allows the user to make adjustments in the operation of the vehicle 14 .
- the processor 112 may also be in communication with other vehicle systems 120 and 122 .
- vehicle systems 120 and 122 may be any one of a number of different vehicle systems found in a vehicle.
- the vehicle systems 120 and/or 122 may be vehicle safety systems, such as airbags, pre-tensioners, and the like.
- the safety systems may include accident avoidance systems, such as automatic braking, cruise control, automatic steering, and the like. It should be understood that the vehicle systems described are only examples and that the vehicle systems may include any system found in a vehicle. Further, while vehicle systems have been discussed in this specification, the systems 120 and 122 may not be related at all to a vehicle and may be related to some other application of the system 12 .
- These systems 120 and 122 utilize the information regarding the captured images from the scene 30 that have been processed by the processor 112 to perform any one of a number of different algorithms and functions.
- the system 12 may include the window 108 .
- the window 108 has a first side 124 facing towards the camera system 110 and a second side 126 generally facing towards the scene 30 to be captured.
- a heater 128 configured to heat the window 108 is located on the first side 124 .
- the heater 128 may be a heating wire or a heating mesh.
- the system 12 may also include a temperature sensing element 130 for determining the temperature of the window 108 .
- the temperature sensing element may be any one of a number of different temperature sensing elements. In this example, the temperature sensing element 130 is a thermistor.
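The patent does not specify how the thermistor reading of element 130 is converted to a window temperature; a common approach is the beta-parameter model. A sketch, assuming a generic 10 kOhm NTC thermistor (all part values are illustrative, not from the patent):

```python
import math

# Assumed NTC thermistor parameters (not specified in the patent).
B = 3950.0       # beta constant, kelvin
R0 = 10_000.0    # resistance at the reference temperature, ohms
T0 = 298.15      # reference temperature, kelvin (25 C)

def thermistor_temp_c(resistance_ohms: float) -> float:
    """Convert a measured NTC resistance to degrees Celsius using the
    beta-parameter equation: 1/T = 1/T0 + ln(R/R0)/B."""
    inv_t = 1.0 / T0 + math.log(resistance_ohms / R0) / B
    return 1.0 / inv_t - 273.15
```

For an NTC device, resistance falls as the window warms, so readings below R0 map to temperatures above 25 C.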
- the heater 128 may be positioned and configured so as to heat the optical assembly 104 .
- the heater 128 and temperature sensing element 130 may be located on the first side 129 of the optical assembly 104 .
- a processor 132 may be in communication with the heater 128 and the temperature sensing element 130 .
- the processor 132 may be configured to heat the window 108 or optical assembly 104 , in the case where the window 108 is not utilized, to a temperature less than or about equal to 100° Celsius.
- the processor 132 may be configured to heat the window 108 or optical assembly 104 to approximately 80° Celsius, which is less than 100° Celsius.
- the processor 132 may be configured so as to heat the window 108 or optical assembly 104 , in the case where the window 108 is not utilized, above the ambient temperature by a certain specified temperature.
- this certain specified temperature may be 40° Celsius above the ambient temperature.
- the processor 132 may be configured to activate the heater 128 at certain times or certain temperatures.
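The heater control described above can be sketched as setpoint selection plus simple bang-bang switching. The 80° C., 100° C., and 40° C. figures come from the text; the hysteresis band and all function names are assumptions:

```python
# Figures from the text: cap at 100 C, fixed target of 80 C, or a
# 40 C offset above ambient. The dead band is an added assumption
# to avoid rapid on/off cycling.
MAX_TEMP_C = 100.0
FIXED_SETPOINT_C = 80.0
AMBIENT_OFFSET_C = 40.0
HYSTERESIS_C = 2.0

def heater_setpoint(ambient_c: float, use_offset: bool = False) -> float:
    """Target window temperature, always capped at the 100 C limit."""
    target = ambient_c + AMBIENT_OFFSET_C if use_offset else FIXED_SETPOINT_C
    return min(target, MAX_TEMP_C)

def heater_on(window_temp_c: float, setpoint_c: float, currently_on: bool) -> bool:
    """Bang-bang control with hysteresis around the setpoint."""
    if window_temp_c < setpoint_c - HYSTERESIS_C:
        return True
    if window_temp_c > setpoint_c + HYSTERESIS_C:
        return False
    return currently_on
```

Processor 132 would call `heater_on` with each thermistor reading and drive heater 128 accordingly; a production controller might use PID instead of bang-bang switching.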
- the processor 132 may be a single processor or may be made of multiple processors working in concert. Also, it should be understood that the processor 112 and the processor 132 may, in fact, be the same processor or set of processors that are managing both the camera system 110 and the heater 128 . Further, a memory device 134 , similar to the memory device 114 may be in communication with the processor 132 . The memory device 134 may contain instructions for configuring the processor 132 regarding heating the window 108 or optical assembly 104 and receiving feedback information from the temperature sensing device 130 . Like before, the memory 134 may be any type of memory capable of storing digital information, such as an optical memory, magnetic memory, or solid state memory, and the like. Further, the memory 134 may be integrated within the processor 132 or separate from the processor 132 , as shown.
- the processor 112 is configured to determine if artifacts are present in the captured image. If this occurs, the processor is configured to remove artifacts from any information representing the images. The artifacts may be caused by moisture coming into physical contact with the second side 126 of the window 108 or the optical assembly 104 .
- a method 200 and example image 300 are shown, respectively.
- the method 200 may be executed by any one of the processors 112 or 132 as shown in FIG. 2 .
- the instructions for this method may be located in the memories 114 or 134 .
- the method begins by heating the window 108 or optical assembly 104 to a temperature less than or equal to 100° C.
- the method may heat the window 108 or optical assembly 104 to 80° C., which is less than 100° C.
- this temperature is maintained during the operation of the vehicle 14 of FIG. 1 .
- the temperature to which the window 108 or optical assembly 104 is heated may be set within a predetermined range or may be based on a set amount above an ambient temperature, for example, 40° C. above the ambient temperature.
- the camera system 110 captures images of the scene 30 .
- the processor 112 determines if artifacts are present in the images.
- the processor 112 may be configured to determine that artifacts are present in the images by determining which pixels in an image of the plurality of images have changed. Further, this determination can be made if the pixels that changed are contiguous and cover an area of a specific size.
- a series of flashes generally occurs, the flashes being the artifacts. These flashes are generally viewed as a significant change in the pixels. Further, these flashes are contiguous and cover an area of a specific size.
- artifacts can be filtered out by looking not only at which pixels have changed, but also if these pixels are contiguous in nature. If no artifacts are detected, the method returns to step 204 . However, if disturbances caused by moisture and heat are detected, the method continues to step 208 , wherein the processor 112 is configured to remove artifacts in the images.
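The detection test described above, changed pixels that are contiguous and cover a sufficient area, can be sketched as frame differencing followed by connected-region labeling. The thresholds and function names are illustrative, not from the patent:

```python
import numpy as np

# Illustrative thresholds (the patent does not give numeric values).
CHANGE_THRESHOLD = 30     # per-pixel intensity change treated as "changed"
MIN_ARTIFACT_AREA = 50    # contiguous changed pixels needed to flag a frame

def largest_changed_region(prev: np.ndarray, curr: np.ndarray) -> int:
    """Size in pixels of the largest 4-connected region of changed pixels."""
    changed = np.abs(curr.astype(int) - prev.astype(int)) > CHANGE_THRESHOLD
    visited = np.zeros_like(changed, dtype=bool)
    h, w = changed.shape
    best = 0
    for y in range(h):
        for x in range(w):
            if changed[y, x] and not visited[y, x]:
                # Iterative flood fill over the 4-neighborhood.
                size, stack = 0, [(y, x)]
                visited[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    size += 1
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and changed[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                best = max(best, size)
    return best

def has_splash_artifact(prev: np.ndarray, curr: np.ndarray) -> bool:
    """True when the changed pixels form a large enough contiguous region."""
    return largest_changed_region(prev, curr) >= MIN_ARTIFACT_AREA
```

This mirrors the two-part test in the text: isolated pixel noise fails the contiguity/area check, while a splash flash, a large contiguous bright burst, passes it.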
- the sample image 300 includes the road 16 and portions of the building 18 from FIG. 1 .
- the image also includes artifacts 310 A and 310 B caused by moisture coming into contact with the second side 126 of the window 108 or the second side 127 of the optical assembly 104 .
- These artifacts 310 A and 310 B essentially appear as a series of flashes but represent moisture coming into contact with the second side 126 of the window 108 or the second side 127 of the optical assembly 104 .
- the moisture may accumulate on the edges of the image 300 .
- the artifacts 310 A- 310 B and 312 A- 312 D may be removed by applying a low pass filter to the information representing the pixels that changed in the image.
- the pixels are located where the artifacts 310 A- 310 B and 312 A- 312 D are located.
- the processor 112 may also be further configured to remove the artifacts by determining a splash profile by subtracting a splash image, such as artifacts 310 A and 310 B, from a previously captured image or a low pass filtered version of a previously captured image.
- a splash image is the pixels that changed in the image.
- This splash pattern is removed from the sample image 300 , and the removal of the splash pattern is faded out over the plurality of images as the artifacts are no longer present in the captured images.
- the splash pattern may be located near the edges 314 A, 314 B, 314 C, and 314 D of the image 300 instead of, or in addition to, a central area 316 of the image 300 .
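The removal steps above (a low pass filter over the changed pixels, a splash profile obtained by subtraction from a low-pass-filtered previous frame, and a fade of the correction over subsequent frames) might be sketched as follows. The box filter, fade parameter, and function names are assumptions:

```python
import numpy as np

def box_blur(img: np.ndarray, k: int = 5) -> np.ndarray:
    """Simple separable box low pass filter (illustrative choice)."""
    kernel = np.ones(k) / k
    out = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    return np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode="same"), 0, out)

def splash_profile(curr: np.ndarray, prev: np.ndarray,
                   changed_mask: np.ndarray) -> np.ndarray:
    """Difference between the current frame and a low-pass version of the
    previous frame, restricted to the pixels that changed."""
    profile = np.zeros(curr.shape, dtype=float)
    diff = curr.astype(float) - box_blur(prev.astype(float))
    profile[changed_mask] = diff[changed_mask]
    return profile

def remove_splash(curr: np.ndarray, profile: np.ndarray,
                  fade: float = 1.0) -> np.ndarray:
    """Subtract the splash profile, scaled by a fade factor that would be
    ramped from 1.0 down to 0.0 over frames as the artifact disappears."""
    return np.clip(curr.astype(float) - fade * profile, 0, 255)
```

A caller would decrease `fade` toward zero over successive frames once artifacts are no longer detected, matching the fading behavior described in the text.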
- in step 402 , the camera system 110 captures images of the scene 30 .
- in step 404 , the processor 112 determines if artifacts are present in the images.
- the processor 112 may be configured to determine that artifacts are present in the images by determining which pixels in an image of the plurality of images have changed. Further, this determination can be made if the pixels that changed are contiguous and cover an area of a specific size.
- the processor 112 determines if moisture is the likely cause of the artifacts that are present in the captured images. This determination can be made by not only using the captured images but also using external data 407 .
- the external data 407 could include data from environmental sensors, such as rain-detecting windshield wipers, that can detect if the vehicle 14 is traveling in a location that is likely to have moisture. Further, the external data 407 could be data from a database that tracks the weather conditions of an area where the vehicle 14 is traveling. Additionally, the external data 407 could include information from other vehicle systems, such as a determination of whether the windshield wipers and/or defroster of the vehicle are being utilized. If the windshield wipers, defroster, and/or any other moisture-related system are being utilized, this information could be useful in determining if moisture is the likely cause of the artifacts.
- if moisture is determined to be the likely cause of the artifacts, the method 400 turns on the heater 128 in step 408 .
- the heater 128 is only on selectively if moisture is determined to be present. Otherwise, the method 400 is essentially always looking to remove artifacts but will only heat the window 108 or optical assembly 104 when moisture is determined to be present.
- in step 410 , the processor 112 is configured to remove artifacts from the captured images.
- the methodologies described in method 200 regarding removing artifacts from the captured images are equally applicable in this method and will not be described again. After artifacts are removed, the method 400 returns to step 402 .
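Method 400's capture/detect/moisture-check/selective-heat/remove loop can be sketched as a single step function. The external-data keys and helper names are hypothetical stand-ins for the components described above:

```python
def moisture_likely(artifacts_present: bool, external: dict) -> bool:
    """Combine image evidence with external data 407 (wipers, defroster,
    weather feed) to decide whether moisture caused the artifacts.
    The key names are assumed, not from the patent."""
    if not artifacts_present:
        return False
    return any(external.get(key, False)
               for key in ("wipers_on", "defroster_on", "rain_reported"))

def method_400_step(prev_frame, curr_frame, external: dict, detect, remove):
    """One pass of method 400: detect artifacts (step 404), check the
    moisture cause (step 406), selectively enable the heater (step 408),
    and remove artifacts (step 410). `detect` and `remove` stand in for
    the detection and removal routines described earlier.
    Returns (heater_on, output_frame)."""
    artifacts = detect(prev_frame, curr_frame)
    heater_on = moisture_likely(artifacts, external)
    cleaned = remove(curr_frame) if artifacts else curr_frame
    return heater_on, cleaned
```

This reflects the selective behavior in the text: artifact removal runs whenever artifacts are detected, but the heater is driven only when the external data points to moisture as the likely cause.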
- dedicated hardware implementations such as application specific integrated circuits, programmable logic arrays and other hardware devices, can be constructed to implement one or more of the methods described herein.
- Applications that may include the apparatus and systems of various embodiments can broadly include a variety of electronic and computer systems.
- One or more embodiments described herein may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules, or as portions of an application-specific integrated circuit. Accordingly, the present system encompasses software, firmware, and hardware implementations.
- the methods described herein may be implemented by software programs executable by a computer system.
- implementations can include distributed processing, component/object distributed processing, and parallel processing.
- virtual computer system processing can be constructed to implement one or more of the methods or functionality as described herein.
- computer-readable medium includes a single medium or multiple media, such as a centralized or distributed database, and/or associated caches and servers that store one or more sets of instructions.
- computer-readable medium shall also include any medium that is capable of storing, encoding or carrying a set of instructions for execution by a processor or that cause a computer system to perform any one or more of the methods or operations disclosed herein.
Description
- Further objects, features, and advantages of this invention will become readily apparent to persons skilled in the art after a review of the following description, with reference to the drawings and claims that are appended to and form a part of this specification.
-
FIG. 1 illustrates an environment having an automobile with an imaging system; -
FIG. 2 illustrates a block diagram of the imaging system; -
FIG. 3 illustrates an imaging method; -
FIG. 4 illustrates an image captured by the imaging system ofFIG. 2 and processed by the imaging method ofFIG. 3 ; and -
FIG. 5 illustrates another imaging method. - Referring now to
FIG. 1 , anenvironment 10 includes animaging system 12 that is located in avehicle 14 is shown. It should be understood that theenvironment 10 may be any type of environment. Here, theenvironment 10 includes aroad 16 wherein thevehicle 14 is traveling on. Theenvironment 10 also includes a number of different objects. For example, theenvironment 10 includes abuilding 18,trees 20 and 22. Further, theenvironment 10 includes a number of moving objects, such aswildlife 24 andpersons 26. Of course, it should be understood that theenvironment 10 may vary significantly. For example, thevehicle 14 may be alternatively traveling on a highway or off-road altogether. Further, theenvironment 10 could be subject to any which one of a number of different weather conditions, such as sunny, partly cloudy, rainy, foggy, snowy, or any other known weather conditions. - The
imaging system 12 may generally be mounted on or near agrill 28 of thevehicle 14. Thegrill 28 is generally at the front of the car so as to capture ascene 30 forward of avehicle 14. As thevehicle 14 travels along theroad 16, thescene 30 varies so that theimaging system 12 can capture images of thebuilding 18,trees 20 and 22,wildlife 24, andpersons 26, if any of these objects are located within thescene 30 as thevehicle 14 moves along theroad 16 or elsewhere. - As stated in the background section, if the weather conditions of the
environment 10 are adverse, such as rainy or snowy, moisture can develop on a window of theimaging system 12 that may reduce the usefulness of theimaging system 12, especially if theimaging system 12 utilizes an infrared sensor, as will be explained later in this specification. In such an occurrence, theimaging system 12 may not be able to see the objects located in theenvironment 10. This, in turn, prevents this information from being presented to a driver or algorithms executed by any one of a number of different systems of thevehicle 14. - Also, it should be understood that while the
imaging system 12 is shown as being located within a vehicle 14, the imaging system 12 may be located and utilized in any one of a number of different applications. For example, the imaging system 12 may be utilized on any other type of vehicle, such as a boat, plane, truck, piece of construction equipment, tractor, and the like. Further, it should be understood that the imaging system 12 could be utilized separate and apart from the vehicle 14 shown or any other vehicle previously mentioned. For example, the imaging system 12 could be mounted to a person, a structure, and the like. Further, it should be understood that the system 12 may be mounted so as to be removable, allowing it to be utilized in any one of a number of different applications. - Referring to
FIG. 2, a more detailed view of the imaging system 12 for capturing the scene 30 is shown. Here, the imaging system 12 includes an imaging sensor 102, an optical assembly 104, a shutter 106, and a window 108. The sensor 102 may be any type of sensor capable of capturing images. In this example, the sensor 102 is an infrared sensor capable of capturing infrared images. It should also be understood that the sensor 102 may be a sensor capable of capturing different wavelengths of light. For example, the sensor 102 may be a long-wave sensor (7-15 micron wavelength), a mid-wave sensor (2.5-7 micron wavelength), a short-wave sensor (1.1-2.5 micron wavelength), and/or a near-infrared sensor (0.75-1.1 micron wavelength). Of course, other sensors capable of capturing wavelengths outside the ranges mentioned may also be utilized. - The
optical assembly 104 may include one or more optics for directing radiation from the scene 30 towards the sensor 102 and has a first side 129 facing towards the sensor 102 and a second side 127 generally facing towards the scene 30 to be captured. It should be understood that the term radiation could mean any type of radiation or visual information, such as light, capable of being received and detected by the sensor 102. Optionally, the system 12 may include a shutter 106 that allows light or radiation to pass for a predetermined period of time. - The
window 108 may be any one of a number of different windows capable of allowing the radiation or light to pass through. In this example, the window 108 may be a germanium or silicon window. However, it should be understood that any one of a number of different materials may be utilized in the manufacture of the window 108. Further, it should be understood that the window 108 may not be present at all. - As for location, the
window 108 is generally located between the sensor 102 and the scene 30 to be captured. Similarly, the optical assembly 104 is located between the window 108 and the imaging sensor 102. If a shutter 106 is utilized, the shutter 106 may be located between the window 108 and the optical assembly 104. Optionally, the shutter 106 may also be located behind the optical assembly 104 or in front of the window 108, essentially anywhere between the imaging sensor 102 and the scene 30. The sensor 102 and the optical assembly 104 generally form a camera system 110 that is configured to capture images of the scene 30. Each of these captured images comprises a plurality of pixels. - The
imaging system 12 also includes a processor 112 configured to receive information representing the images, comprising the plurality of pixels, captured by the camera system 110. It should be understood that the processor 112 may be a single processor or multiple processors working in concert. A memory device 114 may be in communication with the processor 112. The memory device 114 may be configured to store instructions for executing an imaging method to be described later in this specification. The memory 114 may also be configured to store information received from the camera system 110 regarding the images captured of the scene 30. It should be understood that the memory 114 may be any type of memory capable of storing digital information, such as an optical memory, a magnetic memory, a solid state memory, and the like. Additionally, it should be understood that the memory 114 may be integrated within the processor 112 or separate, as shown. - The
processor 112 may be connected to a number of different devices that utilize the information representing the images captured of the scene 30. For example, the processor 112 may provide this information to a display device 116 having a display area 118. The display device 116 displays the captured images of the scene 30 to a user. In this case, the user may be an operator of the vehicle 14 of FIG. 1. This allows the user to make adjustments in the operation of the vehicle 14. - Also, the
processor 112 may also be in communication with other vehicle systems 120 and/or 122. The vehicle systems 120 and/or 122 may be vehicle safety systems, such as airbags, pre-tensioners, and the like. Further, the safety systems may include accident avoidance systems, such as automatic braking, cruise control, automatic steering, and the like. It should be understood that the vehicle systems described are only examples and that the vehicle systems may include any system found in a vehicle. Further, while vehicle systems have been discussed in this specification, the systems 120 and/or 122 may be any systems capable of utilizing information from the system 12. These systems 120 and/or 122 utilize the captured images of the scene 30 that have been processed by the processor 112 to perform any one of a number of different algorithms and functions. - As stated before, the
system 12 may include the window 108. The window 108 has a first side 124 facing towards the camera system 110 and a second side 126 generally facing towards the scene 30 to be captured. Located on the first side 124 is a heater 128 configured to heat the window 108. The heater 128 may be a heating wire or a heating mesh. The system 12 may also include a temperature sensing element 130 for determining the temperature of the window 108. The temperature sensing element 130 may be any one of a number of different temperature sensing elements. In this example, the temperature sensing element 130 is a thermistor. - Also, it should be understood that if the
system 12 does not include the window 108, the heater 128 may be positioned and configured so as to heat the optical assembly 104. For example, the heater 128 and the temperature sensing element 130 may be located on the first side 129 of the optical assembly 104. - A
processor 132 may be in communication with the heater 128 and the temperature sensing element 130. The processor 132 may be configured to heat the window 108, or the optical assembly 104 in the case where the window 108 is not utilized, to a temperature less than or about equal to 100° Celsius. As an example, the processor 132 may be configured to heat the window 108 or the optical assembly 104 to approximately 80° Celsius, which is less than 100° Celsius. - Further, the
processor 132 may be configured so as to heat the window 108, or the optical assembly 104 in the case where the window 108 is not utilized, above the ambient temperature by a certain specified amount. For example, this specified amount may be 40° Celsius above the ambient temperature. Also, the processor 132 may be configured to activate the heater 128 at certain times or at certain temperatures. - Like the
processor 112, the processor 132 may be a single processor or may be made of multiple processors working in concert. Also, it should be understood that the processor 112 and the processor 132 may, in fact, be the same processor or set of processors managing both the camera system 110 and the heater 128. Further, a memory device 134, similar to the memory device 114, may be in communication with the processor 132. The memory device 134 may contain instructions for configuring the processor 132 regarding heating the window 108 or the optical assembly 104 and receiving feedback information from the temperature sensing element 130. Like before, the memory 134 may be any type of memory capable of storing digital information, such as an optical memory, a magnetic memory, or a solid state memory, and the like. Further, the memory 134 may be integrated within the processor 132 or separate from the processor 132, as shown. - The
processor 112 is configured to determine if artifacts are present in the captured images. If artifacts are present, the processor 112 is configured to remove them from the information representing the images. The artifacts may be caused by moisture coming into physical contact with the second side 126 of the window 108 or with the optical assembly 104. - Referring to
FIGS. 3 and 4, a method 200 and an example image 300 are shown, respectively. The method 200 may be executed by any one of the processors 112 and/or 132 of FIG. 2. The instructions for this method may be located in the memories 114 and/or 134. In step 202, the method begins by heating the window 108 or the optical assembly 104 to a temperature less than or equal to 100° C. For example, the method may heat the window 108 or the optical assembly 104 to 80° C., which is less than 100° C. As described before, this temperature is maintained during the operation of the vehicle 14 of FIG. 1. As stated previously, the temperature to which the window 108 or the optical assembly 104 is heated may fall within a predetermined range or may be based on a set amount above an ambient temperature, for example, 40° C. above the ambient temperature. - In
step 204, the camera system 110 captures images of the scene 30. In step 206, the processor 112 determines if artifacts are present in the images. The processor 112 may be configured to determine that artifacts are present in the images by determining which pixels in an image of the plurality of images have changed. Further, this determination can be made if the pixels that changed in the image are contiguous and cover a specific area and size. As stated in the background section, when moisture comes into contact with the second side 126 of the window 108 or the optical assembly 104, a series of flashes generally occurs, the flashes being the artifacts. These flashes are generally viewed as a significant change in the pixels. Further, these flashes are contiguous and cover a specific area and size. - As such, artifacts can be filtered out by looking not only at which pixels have changed, but also at whether these pixels are contiguous in nature. If no artifacts are detected, the method returns to step 204. However, if disturbances caused by moisture and heat are detected, the method continues to step 208, wherein the
processor 112 is configured to remove the artifacts from the images. - Referring to
FIG. 4, a sample image 300 is shown. The sample image includes the road 16 and portions of the building 18 from FIG. 1. The image also includes artifacts 310A and 310B caused by moisture coming into contact with the second side 126 of the window 108 or the second side 127 of the optical assembly 104. In addition to these artifacts, there are also other artifacts 312A, 312B, 312C, and 312D located near the edges of the sample image 300. As the moisture collecting on the second side 126 of the window 108 or the second side 127 of the optical assembly 104 is heated, the moisture may accumulate at the edges of the image 300. The artifacts 310A-310B and 312A-312D may be removed by applying a low pass filter to the information representing the pixels that changed in the image, that is, the pixels located where the artifacts 310A-310B and 312A-312D appear. - The
processor 112 may also be further configured to remove the artifacts by determining a splash profile, found by subtracting a splash image, such as one containing the artifacts 310A and 310B, from the sample image 300. The removal of the splash pattern is then faded across the plurality of images as the artifacts are no longer present in the captured images. Further, the splash pattern may be located near the edges 314A, 314B, 314C, and 314D of the image 300 instead of, or in addition to, a central area 316 of the image 300. - Referring to
FIG. 5, another method 400 is shown that may be executed by any one of the processors 112 and/or 132 of FIG. 2. The instructions for this method may be located in the memories 114 and/or 134. In step 402, the camera system 110 captures images of the scene 30. In step 404, the processor 112 determines if artifacts are present in the images. The processor 112 may be configured to determine that artifacts are present in the images by determining which pixels in an image of the plurality of images have changed. Further, this determination can be made if the pixels that changed in the image are contiguous and cover a specific area and size. As stated in the background section, when moisture comes into contact with the second side 126 of the window 108 or the optical assembly 104, a series of flashes generally occurs, the flashes being the artifacts. These flashes are generally viewed as a significant change in the pixels. Further, these flashes are contiguous and cover a specific area and size. - In
step 406, the processor 112 determines if moisture is the likely cause of the artifacts that are present in the captured images. This determination can be made by using not only the captured images but also external data 407. The external data 407 could include data from other sensors, such as environmental sensors, for example rain-detecting windshield wipers, that can detect if the vehicle 14 is traveling in a location that is likely to have moisture. Further, the external data 407 could be data from a database that tracks the weather conditions of the area where the vehicle 14 is traveling. Additionally, the external data 407 could also include information from other vehicle systems, such as a determination of whether the windshield wipers and/or the defroster of the vehicle are being utilized. If the windshield wipers, the defroster, and/or any other moisture-related system are being utilized, this information could be useful in determining if moisture is the likely cause of the artifacts. - If moisture is determined to be the likely source of the artifacts in the captured images, the
method 400 turns on the heater 128 in step 408. As stated previously, the temperature to which the window 108 or the optical assembly 104 is heated may fall within a predetermined range or may be based on a set amount above an ambient temperature, for example, 40° C. above the ambient temperature. In this method, the heater 128 is turned on only selectively, when moisture is determined to be present. In other words, the method 400 is essentially always looking to remove artifacts but will only heat the window 108 or the optical assembly 104 when moisture is determined to be present. - In
step 410, the processor 112 is configured to remove the artifacts from the captured images. The methodologies described in the method 200 regarding removing artifacts from the captured images are equally applicable in this method and will not be described again. After the artifacts are removed, the method 400 returns to step 402. - In an alternative embodiment, dedicated hardware implementations, such as application specific integrated circuits, programmable logic arrays, and other hardware devices, can be constructed to implement one or more of the methods described herein. Applications that may include the apparatus and systems of various embodiments can broadly include a variety of electronic and computer systems. One or more embodiments described herein may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules, or as portions of an application-specific integrated circuit. Accordingly, the present system encompasses software, firmware, and hardware implementations.
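- The artifact test and selective heating of steps 402 through 408 can be sketched as follows. This is an illustrative sketch only, not the claimed implementation: the function names, the intensity-change threshold, and the minimum contiguous area are assumptions, and contiguity is checked here with a simple 4-connected flood fill.

```python
# Hypothetical sketch of the detection/selective-heating logic of
# method 400. Thresholds and names are illustrative assumptions.

def changed_mask(prev, curr, threshold=30):
    """Mark pixels whose intensity changed by more than `threshold`."""
    return [[abs(c - p) > threshold for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

def largest_region(mask):
    """Size of the largest 4-connected region of changed pixels."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    best = 0
    for sy in range(h):
        for sx in range(w):
            if not mask[sy][sx] or seen[sy][sx]:
                continue
            stack, size = [(sy, sx)], 0
            seen[sy][sx] = True
            while stack:
                y, x = stack.pop()
                size += 1
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                        seen[ny][nx] = True
                        stack.append((ny, nx))
            best = max(best, size)
    return best

def artifacts_present(prev, curr, threshold=30, min_area=4):
    """Step 404: flag a flash when a contiguous changed region is big enough."""
    return largest_region(changed_mask(prev, curr, threshold)) >= min_area

def heater_should_run(artifacts, wipers_on=False, defroster_on=False,
                      rain_reported=False):
    """Steps 406/408: heat only when artifacts coincide with a moisture cue."""
    return artifacts and (wipers_on or defroster_on or rain_reported)
```

Under these assumptions, a 2x2 block of pixels that brightens between consecutive frames would be flagged, while a single flickering pixel would not, matching the contiguous-area test described above.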
- In accordance with various embodiments of the present disclosure, the methods described herein may be implemented by software programs executable by a computer system. Further, in an exemplary, non-limited embodiment, implementations can include distributed processing, component/object distributed processing, and parallel processing. Alternatively, virtual computer system processing can be constructed to implement one or more of the methods or functionality as described herein.
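- As one example of such a software implementation, the artifact-removal step (step 208 of the method 200 and step 410 of the method 400) might be sketched as below. This is a simplified sketch, not the patented implementation: a 3x3 mean filter stands in for the low pass filter described above, and the splash-profile subtraction and fade factor are illustrative assumptions.

```python
# Sketch of artifact removal: a 3x3 neighborhood mean (a crude
# low-pass filter) applied only at flagged pixels, plus a scaled
# splash-profile subtraction that is faded frame by frame.

def low_pass_at(image, mask):
    """Replace each flagged pixel with its 3x3 neighborhood mean."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue
            vals = [image[ny][nx]
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) / len(vals)
    return out

def subtract_splash(frame, splash_profile, strength):
    """Subtract a scaled splash profile from a frame. Callers shrink
    `strength` toward zero over successive frames (e.g. strength *= 0.8)
    to fade the correction once the artifacts disappear."""
    return [[p - strength * s for p, s in zip(frow, srow)]
            for frow, srow in zip(frame, splash_profile)]
```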
- Further, the methods described herein may be embodied in a computer-readable medium. The term “computer-readable medium” includes a single medium or multiple media, such as a centralized or distributed database, and/or associated caches and servers that store one or more sets of instructions. The term “computer-readable medium” shall also include any medium that is capable of storing, encoding or carrying a set of instructions for execution by a processor or that cause a computer system to perform any one or more of the methods or operations disclosed herein.
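- The heating targets described earlier (a ceiling of about 100° C., a nominal 80° C. target, or a set amount such as 40° C. above ambient) can likewise be captured in a few lines of such a program. Only the temperature values come from the description; the clamp-and-hysteresis control style shown here is an assumption.

```python
# Illustrative sketch of the window-heating set point and a simple
# bang-bang decision; values from the description, control style assumed.

def heater_setpoint_c(ambient_c, offset_c=40.0, ceiling_c=100.0):
    """Target window temperature: ambient plus a fixed offset,
    never exceeding the 100 C ceiling."""
    return min(ambient_c + offset_c, ceiling_c)

def heater_command(window_c, setpoint_c, band_c=2.0):
    """Bang-bang decision with a small hysteresis band: True to heat,
    False to stop, None to leave the heater state unchanged."""
    if window_c < setpoint_c - band_c:
        return True
    if window_c > setpoint_c + band_c:
        return False
    return None
```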
- As a person skilled in the art will readily appreciate, the above description is meant as an illustration of the principles of this invention. This description is not intended to limit the scope or application of this invention in that the invention is susceptible to modification, variation, and change, without departing from the spirit of this invention, as defined in the following claims.
Claims (20)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/253,159 US20180061008A1 (en) | 2016-08-31 | 2016-08-31 | Imaging system and method |
PCT/US2017/048412 WO2018044681A1 (en) | 2016-08-31 | 2017-08-24 | Imaging system and method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/253,159 US20180061008A1 (en) | 2016-08-31 | 2016-08-31 | Imaging system and method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180061008A1 true US20180061008A1 (en) | 2018-03-01 |
Family
ID=61243153
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/253,159 Abandoned US20180061008A1 (en) | 2016-08-31 | 2016-08-31 | Imaging system and method |
Country Status (2)
Country | Link |
---|---|
US (1) | US20180061008A1 (en) |
WO (1) | WO2018044681A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110503609A (en) * | 2019-07-15 | 2019-11-26 | 电子科技大学 | A kind of image rain removing method based on mixing sensor model |
US10687000B1 (en) * | 2018-06-15 | 2020-06-16 | Rockwell Collins, Inc. | Cross-eyed sensor mosaic |
WO2020207850A1 (en) * | 2019-04-12 | 2020-10-15 | Connaught Electronics Ltd. | Image processing method |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6815680B2 (en) * | 2002-06-05 | 2004-11-09 | Raytheon Company | Method and system for displaying an image |
JP2005159710A (en) * | 2003-11-26 | 2005-06-16 | Nissan Motor Co Ltd | Display control apparatus and method for vehicle |
US20100073498A1 (en) * | 2005-11-04 | 2010-03-25 | Tobias Hoglund | Enhancement of images |
US7796168B1 (en) * | 2006-06-09 | 2010-09-14 | Flir Systems, Inc. | Methods and systems for detection and mitigation of image-flash in infrared cameras |
US20120008866A1 (en) * | 2010-06-28 | 2012-01-12 | Jad Halimeh | Method and device for detecting an interfering object in a camera image |
EP2589513A1 (en) * | 2011-11-03 | 2013-05-08 | Autoliv Development AB | Vision system and method for a motor vehicle |
US20130236116A1 (en) * | 2012-03-08 | 2013-09-12 | Industrial Technology Research Institute | Method and apparatus for single-image-based rain streak removal |
US20150332099A1 (en) * | 2014-05-15 | 2015-11-19 | Conti Temic Microelectronic Gmbh | Apparatus and Method for Detecting Precipitation for a Motor Vehicle |
US20160231527A1 (en) * | 2015-02-06 | 2016-08-11 | Flir Systems, Inc. | Lens heater to maintain thermal equilibrium in an infrared imaging system |
US20170240138A1 (en) * | 2016-02-19 | 2017-08-24 | Toyota Jidosha Kabushiki Kaisha | Imaging system |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE3706782A1 (en) * | 1987-03-03 | 1988-09-15 | Daimler Benz Ag | USE OF ORGANIC SILICON COMPOUNDS ON GLASS WINDSHIELDS TO REACH AN ANTI-FOGGING EFFECT AGAINST OILY ORGANIC SUBSTANCES |
US6681163B2 (en) * | 2001-10-04 | 2004-01-20 | Gentex Corporation | Moisture sensor and windshield fog detector |
US7276696B2 (en) * | 2003-07-15 | 2007-10-02 | Ford Global Technologies, Llc | Active night vision thermal control system using wavelength-temperature characteristic of light source |
KR101534973B1 (en) * | 2013-12-19 | 2015-07-07 | 현대자동차주식회사 | Image Processing Apparatus and Method for Removing Rain From Image Data |
2016
- 2016-08-31 US US15/253,159 patent/US20180061008A1/en not_active Abandoned
2017
- 2017-08-24 WO PCT/US2017/048412 patent/WO2018044681A1/en active Application Filing
Non-Patent Citations (1)
Title |
---|
"Image Segmentation." The University of Auckland, 6 Dec. 2013. Web. Accessed 20 Apr. 2018. * |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10687000B1 (en) * | 2018-06-15 | 2020-06-16 | Rockwell Collins, Inc. | Cross-eyed sensor mosaic |
WO2020207850A1 (en) * | 2019-04-12 | 2020-10-15 | Connaught Electronics Ltd. | Image processing method |
KR20210137534A (en) * | 2019-04-12 | 2021-11-17 | 코너트 일렉트로닉스 리미티드 | Image processing method |
JP2022527605A (en) * | 2019-04-12 | 2022-06-02 | コノート、エレクトロニクス、リミテッド | Image processing method |
JP7274603B2 (en) | 2019-04-12 | 2023-05-16 | コノート、エレクトロニクス、リミテッド | Image processing method |
KR102564134B1 (en) * | 2019-04-12 | 2023-08-04 | 코너트 일렉트로닉스 리미티드 | Image processing method |
DE102019109748B4 (en) | 2019-04-12 | 2024-10-02 | Connaught Electronics Ltd. | Image processing method operable in an image acquisition system of a vehicle as well as image acquisition system for carrying out the method as well as vehicle and non-transitory computer program product |
CN110503609A (en) * | 2019-07-15 | 2019-11-26 | 电子科技大学 | A kind of image rain removing method based on mixing sensor model |
Also Published As
Publication number | Publication date |
---|---|
WO2018044681A1 (en) | 2018-03-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3621850B1 (en) | Shutterless far infrared (fir) camera for automotive safety and driving systems | |
US11836989B2 (en) | Vehicular vision system that determines distance to an object | |
US9517679B2 (en) | Systems and methods for monitoring vehicle occupants | |
EP2351351B1 (en) | A method and a system for detecting the presence of an impediment on a lens of an image capture device to light passing through the lens of an image capture device | |
US11535158B2 (en) | Vehicular camera with automatic lens defogging feature | |
US9286512B2 (en) | Method for detecting pedestrians based on far infrared ray camera at night | |
JP2005515565A (en) | Visibility obstruction identification method and identification apparatus in image sensor system | |
JP2012228916A (en) | Onboard camera system | |
US9398227B2 (en) | System and method for estimating daytime visibility | |
US20170358190A1 (en) | Detection system and method featuring multispectral imaging device | |
US12101569B2 (en) | Techniques for correcting oversaturated pixels in shutterless FIR cameras | |
US20180061008A1 (en) | Imaging system and method | |
US10511793B2 (en) | Techniques for correcting fixed pattern noise in shutterless FIR cameras | |
US20190005625A1 (en) | Techniques for scene-based nonuniformity correction in shutterless fir cameras | |
JP2007293672A (en) | Photographing apparatus for vehicle and soiling detection method for photographing apparatus for vehicle | |
EP3336747A1 (en) | Rain detection with a camera | |
EP2589513A1 (en) | Vision system and method for a motor vehicle | |
US10650250B2 (en) | Determination of low image quality of a vehicle camera caused by heavy rain | |
US10073261B2 (en) | Vehicle vision system camera with enhanced water removal | |
EP2230496A1 (en) | Method and system for automatically detecting objects in front of a motor vehicle | |
EP2352013B1 (en) | A vision system and method for a motor vehicle | |
EP3306522A1 (en) | Device for determining a region of interest on and/or within a vehicle windscreen | |
KR101684782B1 (en) | Rain sensing type wiper apparatus | |
KR101745261B1 (en) | Sun visor control apparatus and method | |
US11252344B2 (en) | Method and system for generating multiple synchronized thermal video streams for automotive safety and driving systems |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: AUTOLIV ASP, INC., UTAH Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KORMOS, ALEXANDER LOUIS;MATHIEU, LOUIS-JOSEPH;REEL/FRAME:039897/0451 Effective date: 20160929 |
AS | Assignment |
Owner name: VEONEER US, INC., MICHIGAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AUTOLIV ASP, INC.;REEL/FRAME:046392/0039 Effective date: 20180518 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
AS | Assignment |
Owner name: VEONEER US, LLC, MICHIGAN Free format text: CHANGE OF NAME;ASSIGNOR:VEONEER US, INC.;REEL/FRAME:061048/0615 Effective date: 20220401 |