
US20180061008A1 - Imaging system and method - Google Patents

Imaging system and method

Info

Publication number
US20180061008A1
US20180061008A1 (application US15/253,159; US201615253159A)
Authority
US
United States
Prior art keywords
window
image
optical assembly
images
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/253,159
Inventor
Alexander L. Kormos
Louis Joseph Mathieu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Veoneer US LLC
Original Assignee
Autoliv ASP Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Autoliv ASP Inc filed Critical Autoliv ASP Inc
Priority to US15/253,159
Assigned to AUTOLIV ASP, INC. (Assignors: KORMOS, ALEXANDER LOUIS; MATHIEU, LOUIS-JOSEPH)
Priority to PCT/US2017/048412 (published as WO2018044681A1)
Publication of US20180061008A1
Assigned to VEONEER US, INC. (Assignor: AUTOLIV ASP, INC.)
Change of name to VEONEER US, LLC (formerly VEONEER US, INC.)
Legal status: Abandoned

Classifications

    • G06T 5/00: Image enhancement or restoration
    • G06T 5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 5/001
    • G06K 9/00791
    • G06V 10/147: Details of sensors, e.g. sensor lenses
    • G06V 20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • H04N 23/20: Cameras or camera modules comprising electronic image sensors, for generating image signals from infrared radiation only
    • H04N 23/55: Optical parts specially adapted for electronic image sensors; mounting thereof
    • H04N 5/2254
    • H04N 5/33: Transforming infrared radiation
    • H04N 7/181: Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources
    • B60S 1/02: Cleaning windscreens, windows or optical devices
    • G06T 2207/10016: Video; image sequence
    • G06T 2207/10048: Infrared image
    • G06T 2207/20212: Image combination
    • G06T 2207/20224: Image subtraction
    • G06T 2207/30252: Vehicle exterior; vicinity of vehicle

Definitions

  • the present invention generally relates to imaging systems. More specifically, the invention relates to infrared imaging systems utilized in automotive safety systems.
  • a layer of moisture or water may form on an external optical surface or a protective window surface of a camera that is exposed to the environment.
  • cameras such as infrared, long-wave, mid-wave, short-wave, near infrared, and visible cameras are used to detect objects in a scene from a moving vehicle. These objects may be of interest to the driver of an automobile or safety systems of the automobile, so as to prevent or minimize vehicle accidents.
  • the layer of water and moisture that collects on the front window reduces the thermal energy that reaches the infrared long-wave sensitive sensor and reduces the camera's ability to properly see a scene.
  • the image produced has low thermal contrast and a histogram with limited or reduced usable content for the driver of the vehicle or other systems using detection algorithms.
  • Prior art solutions have utilized a heater to eliminate moisture from a window.
  • this solution generates major artifacts that reduce the usefulness of the camera system.
  • One type of artifact is a rain/splash artifact. This occurs when wet snow or rain makes contact with a warm window. The moisture becomes warm, and it appears to the driver or detection algorithms as a flash or image burst. These artifacts reduce the effectiveness of the vision system to the driver and/or other automobile safety systems.
  • the second issue commonly found is that when the moisture is heated by the heater, the moisture tends to linger on the window while it evaporates. As a result, the scene develops bright or sometimes dark corners in the image captured by the camera system.
  • An imaging system and method includes an imaging sensor, a processor in communication with the imaging sensor, a protective window or other exposed optical surface located between the camera system and the scene to be captured by the camera system, and a heater system in thermal communication with the window or other exposed optical surface.
  • the imaging sensor is configured to capture images of a scene, each of the captured images comprising a plurality of pixels.
  • the processor is configured to receive information representing the images comprising the plurality of pixels captured by the imaging sensor, determine if rain splash artifacts or bright/dark non-uniformities are present, and remove artifacts in the information representing the images.
  • FIG. 1 illustrates an environment having an automobile with an imaging system
  • FIG. 2 illustrates a block diagram of the imaging system
  • FIG. 3 illustrates an imaging method
  • FIG. 4 illustrates an image captured by the imaging system of FIG. 2 and processed by the imaging method of FIG. 3;
  • FIG. 5 illustrates another imaging method.
  • an environment 10 that includes an imaging system 12 located in a vehicle 14 is shown.
  • the environment 10 may be any type of environment.
  • the environment 10 includes a road 16 on which the vehicle 14 is traveling.
  • the environment 10 also includes a number of different objects.
  • the environment 10 includes a building 18 , trees 20 and 22 .
  • the environment 10 includes a number of moving objects, such as wildlife 24 and persons 26 .
  • the environment 10 may vary significantly.
  • the vehicle 14 may be alternatively traveling on a highway or off-road altogether.
  • the environment 10 could be subject to any one of a number of different weather conditions, such as sunny, partly cloudy, rainy, foggy, snowy, or any other known weather conditions.
  • the imaging system 12 may generally be mounted on or near a grill 28 of the vehicle 14 .
  • the grill 28 is generally at the front of the car so as to capture a scene 30 forward of the vehicle 14.
  • the scene 30 varies so that the imaging system 12 can capture images of the building 18 , trees 20 and 22 , wildlife 24 , and persons 26 , if any of these objects are located within the scene 30 as the vehicle 14 moves along the road 16 or elsewhere.
  • the imaging system 12 may not be able to see the objects located in the environment 10 . This, in turn, prevents this information from being presented to a driver or algorithms executed by any one of a number of different systems of the vehicle 14 .
  • the imaging system 12 may be located and utilized in any one of a number of different applications.
  • the imaging system 12 may be utilized on any other type of vehicle, such as a boat, plane, truck, construction equipment, tractors, and the like.
  • the imaging system 12 could be utilized separate and apart from any other vehicle 14 shown or previously mentioned.
  • the imaging system 12 could be mounted to a person, structure, and the like.
  • the system 12 may be mounted so that it is removable and can be utilized in any one of a number of different applications.
  • the imaging system 12 includes an imaging sensor 102 , an optical assembly 104 , a shutter 106 , and a window 108 .
  • the sensor 102 may be any type of sensor capable of capturing images.
  • the sensor 102 is an infrared sensor capable of capturing infrared images. It should also be understood that the sensor 102 may be a sensor capable of capturing different wavelengths of light.
  • the sensor 102 may be a long-wave sensor (7-15 microns wavelength), a mid-wave sensor (2.5-7 microns wavelength), a short-wave sensor (1.1-2.5 microns wavelength), and/or a near infrared sensor (0.75-1.1 microns wavelength).
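The wavelength bands listed above can be expressed as a small lookup. This is a minimal sketch matching the ranges in the text; the function name and the inclusive-lower-bound convention at band boundaries are illustrative assumptions, not part of the patent.

```python
def infrared_band(wavelength_um):
    """Classify a wavelength (in microns) into the listed sensor bands."""
    bands = [
        (0.75, 1.1, "near infrared"),
        (1.1, 2.5, "short-wave"),
        (2.5, 7.0, "mid-wave"),
        (7.0, 15.0, "long-wave"),
    ]
    for lo, hi, name in bands:
        # Lower bound inclusive, upper bound exclusive (an arbitrary choice).
        if lo <= wavelength_um < hi:
            return name
    return "outside listed bands"
```

For example, a 10 micron wavelength falls in the long-wave band, which is the band most affected by a water layer on the window.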
  • the optical assembly 104 may include one or more optics for directing radiation from the scene 30 towards the sensor 102 and has a first side 129 facing towards the sensor 102 and a second side 127 generally facing towards the scene 30 to be captured.
  • radiation could mean any type of radiation or visual information, such as light, capable of being received and detected by the sensor 102 .
  • the system 12 may include a shutter 106 that allows light or radiation to pass for a predetermined period of time.
  • the window 108 may be any one of a number of different windows capable of allowing the radiation or light to pass through.
  • the window 108 may be a germanium or silicon window.
  • any one of a number of different materials may be utilized in the manufacturing of the window 108 .
  • the window 108 may not be present at all.
  • the window 108 is generally located between the sensor 102 and the scene 30 to be captured.
  • the optical assembly 104 is located between the window 108 and the imaging sensor 102 . If a shutter 106 is utilized, the shutter 106 may be located between the window 108 and the optical assembly 104 . Optionally, the shutter 106 may also be located behind the optical assembly 104 or in front of the window 108 —essentially anywhere between the imaging sensor 102 and the scene 30 .
  • the sensor 102 and the optical assembly 104 generally form a camera system 110 that is configured to capture images of the scene 30 . Each of these captured images comprises a plurality of pixels.
  • the imaging system 12 also includes a processor 112 configured to receive information representing the images comprising the plurality of pixels captured by the camera system 110.
  • the processor 112 may be a single processor or may be multiple processors working in concert.
  • a memory device 114 may be in communication with the processor 112 .
  • the memory device 114 may be configured to store instructions for executing an imaging method to be described later in this specification.
  • the memory 114 may be configured to store information received from the camera system 110 regarding the captured images from the scene 30 .
  • the memory 114 may be any type of memory capable of storing digital information, such as optical memories, magnetic memories, solid state memories, and the like. Additionally, it should be understood that the memory 114 may be integrated within the processor 112 or separate as shown.
  • the processor 112 may be connected to a number of different devices that utilize the information representing the images captured from the scene 30 .
  • the processor 112 may provide this information to a display device 116 having a display area 118 .
  • the display device 116 displays captured images from the scene 30 to a user.
  • the user may be an operator of the vehicle 14 of FIG. 1 . This allows the user to make adjustments in the operation of the vehicle 14 .
  • the processor 112 may also be in communication with other vehicle systems 120 and 122 .
  • vehicle systems 120 and 122 may be any one of a number of different vehicle systems found in a vehicle.
  • the vehicle systems 120 and/or 122 may be vehicle safety systems, such as airbags, pre-tensioners, and the like.
  • the safety systems may include accident avoidance systems, such as automatic braking, cruise control, automatic steering, and the like. It should be understood that the vehicle systems described are only examples and that the vehicle systems may include any system found in a vehicle. Further, while vehicle systems have been discussed in this specification, the systems 120 and 122 may not be related at all to a vehicle and may be related to some other application of the system 12 .
  • These systems 120 and 122 utilize the information regarding the captured images from the scene 30 that have been processed by the processor 112 to perform any one of a number of different algorithms and functions.
  • the system 12 may include the window 108 .
  • the window 108 has a first side 124 facing towards the camera system 110 and a second side 126 generally facing towards the scene 30 to be captured.
  • Located on the first side 124 is a heater 128 configured to heat the window 108.
  • the heater 128 may be a heating wire or a heating mesh.
  • the system 12 may also include a temperature sensing element 130 for determining the temperature of the window 108 .
  • the temperature sensing element may be any one of a number of different temperature sensing elements. In this example, the temperature sensing element 130 is a thermistor.
  • the heater 128 may be positioned and configured so as to heat the optical assembly 104 .
  • the heater 128 and temperature sensing element 130 may be located on the first side 129 of the optical assembly 104 .
  • a processor 132 may be in communication with the heater 128 and the temperature sensing element 130 .
  • the processor 132 may be configured to heat the window 108 or optical assembly 104 , in the case where the window 108 is not utilized, to a temperature less than or about equal to 100° Celsius.
  • the processor 132 may be configured to heat the window 108 or optical assembly 104 to approximately 80° Celsius, which is less than 100° Celsius.
  • the processor 132 may be configured so as to heat the window 108 or optical assembly 104 , in the case where the window 108 is not utilized, above the ambient temperature by a certain specified temperature.
  • this certain specified temperature may be 40° Celsius above the ambient temperature.
  • the processor 132 may be configured to activate the heater 128 at certain times or certain temperatures.
  • the processor 132 may be a single processor or may be made of multiple processors working in concert. Also, it should be understood that the processor 112 and the processor 132 may, in fact, be the same processor or set of processors managing both the camera system 110 and the heater 128. Further, a memory device 134, similar to the memory device 114, may be in communication with the processor 132. The memory device 134 may contain instructions for configuring the processor 132 regarding heating the window 108 or optical assembly 104 and receiving feedback information from the temperature sensing device 130. Like before, the memory 134 may be any type of memory capable of storing digital information, such as an optical memory, magnetic memory, or solid state memory, and the like. Further, the memory 134 may be integrated within the processor 132 or separate from the processor 132, as shown.
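The heating behavior described above (hold the window roughly 40° C above ambient, around 80° C in mild weather, never exceeding 100° C, using thermistor feedback) could be sketched as a simple on/off controller with hysteresis. The class name, the 2° C hysteresis band, and the clamping rule are illustrative assumptions; the patent does not specify a control law.

```python
class WindowHeaterController:
    """On/off heater control with hysteresis for an imaging window."""

    def __init__(self, offset_c=40.0, ceiling_c=100.0, band_c=2.0):
        self.offset_c = offset_c      # target offset above ambient
        self.ceiling_c = ceiling_c    # hard temperature ceiling
        self.band_c = band_c          # hysteresis band to avoid rapid cycling
        self.heater_on = False

    def update(self, window_temp_c, ambient_temp_c):
        """Return True to energize the heater, False to switch it off."""
        # Target tracks ambient but is clamped below the ceiling.
        target = min(ambient_temp_c + self.offset_c, self.ceiling_c)
        if window_temp_c < target - self.band_c:
            self.heater_on = True
        elif window_temp_c >= target:
            self.heater_on = False
        # Between the two thresholds the previous state is held.
        return self.heater_on
```

In a 20° C ambient, this holds the window near 60° C; in hot weather the ceiling clamp keeps the command off once the window reaches 100° C.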
  • the processor 112 is configured to determine if artifacts are present in the captured image. If this occurs, the processor is configured to remove artifacts from any information representing the images. The artifacts may be caused by moisture coming into physical contact with the second side 126 of the window 108 or the optical assembly 104 .
  • a method 200 and example image 300 are shown, respectively.
  • the method 200 may be executed by any one of the processors 112 or 132 as shown in FIG. 2 .
  • the instructions for this method may be located in the memories 114 or 134 .
  • the method begins by heating the window 108 or optical assembly 104 to a temperature less than or equal to 100° C.
  • the method may heat the window 108 or optical assembly 104 to 80° C., which is less than 100° C.
  • this temperature is maintained during the operation of the vehicle 14 of FIG. 1 .
  • the temperature to which the window 108 or optical assembly 104 is heated may be within a predetermined range or may be based on a set amount above an ambient temperature, for example, 40° C. above the ambient temperature.
  • the camera system 110 captures images of the scene 30 .
  • the processor 112 determines if artifacts are present in the images.
  • the processor 112 may be configured to determine that artifacts are present in the images by determining which pixels in an image of the plurality of images have changed. Further, this determination can be made if the pixels that change in the image are contiguous and cover a specific area and size.
  • a series of flashes generally occurs, the flashes being the artifacts. These flashes are generally viewed as a significant change in the pixels. Further, these flashes are contiguous and cover a specific area and size.
  • artifacts can be filtered out by looking not only at which pixels have changed, but also if these pixels are contiguous in nature. If no artifacts are detected, the method returns to step 204 . However, if disturbances caused by moisture and heat are detected, the method continues to step 208 , wherein the processor 112 is configured to remove artifacts in the images.
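The detection test described above (pixels that change significantly AND form a contiguous region of meaningful size) could be sketched as frame differencing followed by connected-component analysis. The thresholds, 4-connectivity, and function name are illustrative assumptions, not values from the patent.

```python
import numpy as np

def detect_splash_artifacts(prev, curr, diff_thresh=30, min_area=25):
    """Return a boolean mask of splash-artifact pixels between two frames.

    Pixels that change by more than diff_thresh AND belong to a
    contiguous (4-connected) region of at least min_area pixels are
    flagged; isolated changed pixels are treated as noise.
    """
    changed = np.abs(curr.astype(np.int32) - prev.astype(np.int32)) > diff_thresh
    artifact = np.zeros_like(changed)
    seen = np.zeros_like(changed)
    h, w = changed.shape
    for y in range(h):
        for x in range(w):
            if changed[y, x] and not seen[y, x]:
                # Flood-fill one 4-connected region of changed pixels.
                stack, region = [(y, x)], []
                seen[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    region.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and \
                                changed[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                if len(region) >= min_area:  # contiguous AND large enough
                    for ry, rx in region:
                        artifact[ry, rx] = True
    return artifact
```

A library routine such as a connected-component labeler would do the same job; the explicit flood fill just keeps the sketch self-contained.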
  • the sample image 300 includes the road 16 and portions of the building 18 from FIG. 1 .
  • the image also includes artifacts 310 A and 310 B caused by moisture coming into contact with the second side 126 of the window 108 or the second side 127 of the optical assembly 104 .
  • These artifacts 310 A and 310 B essentially appear as a series of flashes but represent moisture coming into contact with the second side 126 of the window 108 or the second side 127 of the optical assembly 104.
  • the moisture may accumulate on the edges of the image 300 .
  • the artifacts 310 A- 310 B and 312 A- 312 D may be removed by applying a low pass filter to the information representing the pixels that changed in the image.
  • the pixels are located where the artifacts 310 A- 310 B and 312 A- 312 D are located.
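The low pass filtering step above could be sketched as replacing only the flagged pixels with a local mean. The 5x5 kernel size and the choice to leave unflagged pixels untouched are illustrative assumptions.

```python
import numpy as np

def suppress_artifacts_lowpass(image, artifact_mask, kernel=5):
    """Rewrite artifact pixels with a low-pass (local mean) estimate.

    Only pixels flagged in artifact_mask are changed; the rest of the
    image passes through untouched.
    """
    out = image.astype(np.float64).copy()
    h, w = image.shape
    r = kernel // 2
    ys, xs = np.nonzero(artifact_mask)
    for y, x in zip(ys, xs):
        # Local mean over the surrounding window, clipped at the borders.
        patch = image[max(0, y - r):y + r + 1, max(0, x - r):x + r + 1]
        out[y, x] = patch.mean()
    return out.astype(image.dtype)
```

A bright splash spike is pulled down toward the surrounding background level, while every pixel outside the mask is returned unchanged.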
  • the processor 112 may also be further configured to remove the artifacts by determining a splash profile by subtracting a splash image, such as artifacts 310 A and 310 B, from a previously captured image or a low pass filtered version of a previously captured image.
  • a splash image is the pixels that changed in the image.
  • This splash pattern is removed from the sample image 300, and the removal of the splash pattern is faded out across the plurality of images as the artifacts cease to be present in the captured images.
  • the splash pattern may be located near the edges 314 A, 314 B, 314 C, and 314 D of the image 300 instead of, or in addition to, a central area 316 of the image 300.
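The splash-profile subtraction and fading described above could be sketched as follows. The linear fade, the frame count, and the name `clean_ref` (standing in for the previously captured or low pass filtered image) are illustrative assumptions.

```python
import numpy as np

def remove_splash_with_fade(frames, clean_ref, fade_frames=4):
    """Subtract a splash profile and fade the correction out over time.

    The profile is the difference between the first (splash) frame and a
    clean reference; it is subtracted from each subsequent frame with a
    weight that ramps linearly from 1.0 down to 0.0 over fade_frames.
    """
    profile = frames[0].astype(np.float64) - clean_ref.astype(np.float64)
    corrected = []
    for i, frame in enumerate(frames):
        weight = max(0.0, 1.0 - i / fade_frames)  # 1.0 -> 0.0
        fixed = frame.astype(np.float64) - weight * profile
        corrected.append(np.clip(fixed, 0, 255).astype(np.uint8))
    return corrected
```

On the splash frame the correction is total; a few frames later the correction has faded to zero, so a real scene change arriving after the splash is not suppressed.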
  • In step 402, the camera system 110 captures images of the scene 30.
  • In step 404, the processor 112 determines if artifacts are present in the images.
  • the processor 112 may be configured to determine that artifacts are present in the images by determining which pixels in an image of the plurality of images have changed. Further, this determination can be made if the pixels that change in the image are contiguous and cover a specific area and size.
  • the processor 112 determines if moisture is the likely cause of the artifacts that are present in the captured images. This determination can be made by not only using the captured images but also using external data 407 .
  • the external data 407 could include data from environmental sensors, such as rain-detecting windshield wipers, that can detect if the vehicle 14 is traveling in a location that is likely to have moisture. Further, the external data 407 could be data from a database that tracks the weather conditions of an area where the vehicle 14 is traveling. Additionally, the external data 407 could include information from other vehicle systems, such as a determination of whether the windshield wipers and/or defroster of the vehicle are being utilized. If the windshield wipers, defroster, or any other moisture-related system is being utilized, this information could be useful in determining if moisture is the likely cause of the artifacts.
  • If moisture is determined to be the likely cause, the method 400 turns on the heater 128 in step 408.
  • the heater 128 is thus only on selectively, when moisture is determined to be present. Otherwise, the method 400 is essentially always looking to remove artifacts but will only heat the window 108 or optical assembly 104 when moisture is determined to be present.
  • In step 410, the processor 112 is configured to remove artifacts from the captured images.
  • the methodologies described in method 200 regarding removing artifacts from the captured images are equally applicable in this method and will not be described again. After artifacts are removed, the method 400 returns to step 402 .
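The selective-heating logic of method 400 could be sketched as a small decision function. The particular external-data inputs (wipers, defroster, rain sensor, weather data) and the any-of fusion rule are illustrative assumptions about how the corroboration might be done.

```python
def moisture_likely(wipers_on, defroster_on, rain_sensor_wet, forecast_rainy):
    """Fuse external data sources into a moisture verdict.

    Any single positive indicator is taken as corroboration that
    moisture, rather than some other disturbance, caused the artifacts.
    """
    return any((wipers_on, defroster_on, rain_sensor_wet, forecast_rainy))


def method_400_step(artifacts_present, external_data):
    """One pass of the selective-heating loop.

    Returns (heater_on, remove_artifacts): the heater is energized only
    when artifacts are present AND moisture is the likely cause, while
    artifact removal runs whenever artifacts are detected.
    """
    if not artifacts_present:
        return False, False
    heater_on = moisture_likely(**external_data)
    return heater_on, True
```

This captures the key asymmetry of method 400: artifact removal is always attempted, but the window or optical assembly is heated only when moisture is corroborated.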
  • dedicated hardware implementations such as application specific integrated circuits, programmable logic arrays and other hardware devices, can be constructed to implement one or more of the methods described herein.
  • Applications that may include the apparatus and systems of various embodiments can broadly include a variety of electronic and computer systems.
  • One or more embodiments described herein may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules, or as portions of an application-specific integrated circuit. Accordingly, the present system encompasses software, firmware, and hardware implementations.
  • the methods described herein may be implemented by software programs executable by a computer system.
  • implementations can include distributed processing, component/object distributed processing, and parallel processing.
  • virtual computer system processing can be constructed to implement one or more of the methods or functionality as described herein.
  • The term "computer-readable medium" includes a single medium or multiple media, such as a centralized or distributed database, and/or associated caches and servers that store one or more sets of instructions.
  • The term "computer-readable medium" shall also include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by a processor or that cause a computer system to perform any one or more of the methods or operations disclosed herein.


Abstract

An imaging system and method includes an imaging sensor, a processor in communication with the imaging sensor, a protective window or other exposed optical surface located between the imaging sensor and the scene to be captured by the imaging sensor, and a heater system in thermal communication with the window or other exposed optical surface. The imaging sensor is configured to capture images of a scene, each of the captured images comprising a plurality of pixels. The processor is configured to receive information representing the images comprising the plurality of pixels captured by the imaging sensor, determine if rain splash artifacts or bright/dark non-uniformities are present, and remove artifacts in the information representing the images.

Description

    BACKGROUND
  • 1. Field of the Invention
  • The present invention generally relates to imaging systems. More specifically, the invention relates to infrared imaging systems utilized in automotive safety systems.
  • 2. Description of Related Art
  • In adverse weather, especially when rain, fog, or wet snow is present, a layer of moisture or water may form on an external optical surface or a protective window surface of a camera that is exposed to the environment. It is well known in the art that cameras, such as infrared, long-wave, mid-wave, short-wave, near infrared, and visible cameras, are used to detect objects in a scene from a moving vehicle. These objects may be of interest to the driver of an automobile or safety systems of the automobile, so as to prevent or minimize vehicle accidents. The layer of water and moisture that collects on the front window reduces the thermal energy that reaches the infrared long-wave sensitive sensor and reduces the camera's ability to properly see a scene. As a result, the image produced has low thermal contrast and a histogram with limited or reduced usable content for the driver of the vehicle or other systems using detection algorithms.
  • Prior art solutions have utilized a heater to eliminate moisture from a window. However, this solution generates major artifacts that reduce the usefulness of the camera system. One type of artifact is a rain/splash artifact. This occurs when wet snow or rain makes contact with a warm window. The moisture becomes warm, and it appears to the driver or detection algorithms as a flash or image burst. These artifacts reduce the effectiveness of the vision system to the driver and/or other automobile safety systems.
  • The second issue commonly found is that when the moisture is heated by the heater, the moisture tends to linger on the window while it evaporates. As a result, the scene develops bright or sometimes dark corners in the image captured by the camera system.
  • SUMMARY
  • An imaging system and method includes an imaging sensor, a processor in communication with the imaging sensor, a protective window or other exposed optical surface located between the camera system and the scene to be captured by the camera system, and a heater system in thermal communication with the window or other exposed optical surface. The imaging sensor is configured to capture images of a scene, each of the captured images comprising a plurality of pixels. The processor is configured to receive information representing the images comprising the plurality of pixels captured by the imaging sensor, determine if rain splash artifacts or bright/dark non-uniformities are present, and remove artifacts in the information representing the images.
  • Further objects, features, and advantages of this invention will become readily apparent to persons skilled in the art after a review of the following description, with reference to the drawings and claims that are appended to and form a part of this specification.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates an environment having an automobile with an imaging system;
  • FIG. 2 illustrates a block diagram of the imaging system;
  • FIG. 3 illustrates an imaging method;
  • FIG. 4 illustrates an image captured by the imaging system of FIG. 2 and processed by the imaging method of FIG. 3; and
  • FIG. 5 illustrates another imaging method.
  • DETAILED DESCRIPTION
  • Referring now to FIG. 1, an environment 10 that includes an imaging system 12 located in a vehicle 14 is shown. It should be understood that the environment 10 may be any type of environment. Here, the environment 10 includes a road 16 on which the vehicle 14 is traveling. The environment 10 also includes a number of different objects. For example, the environment 10 includes a building 18 and trees 20 and 22. Further, the environment 10 includes a number of moving objects, such as wildlife 24 and persons 26. Of course, it should be understood that the environment 10 may vary significantly. For example, the vehicle 14 may alternatively be traveling on a highway or off-road altogether. Further, the environment 10 could be subject to any one of a number of different weather conditions, such as sunny, partly cloudy, rainy, foggy, snowy, or any other known weather conditions.
  • The imaging system 12 may generally be mounted on or near a grill 28 of the vehicle 14. The grill 28 is generally at the front of the car so as to capture a scene 30 forward of the vehicle 14. As the vehicle 14 travels along the road 16, the scene 30 varies so that the imaging system 12 can capture images of the building 18, trees 20 and 22, wildlife 24, and persons 26, if any of these objects are located within the scene 30 as the vehicle 14 moves along the road 16 or elsewhere.
  • As stated in the background section, if the weather conditions of the environment 10 are adverse, such as rainy or snowy, moisture can develop on a window of the imaging system 12 that may reduce the usefulness of the imaging system 12, especially if the imaging system 12 utilizes an infrared sensor, as will be explained later in this specification. In such an occurrence, the imaging system 12 may not be able to see the objects located in the environment 10. This, in turn, prevents this information from being presented to a driver or algorithms executed by any one of a number of different systems of the vehicle 14.
  • Also, it should be understood that while the imaging system 12 is shown as being located within a vehicle 14, the imaging system 12 may be located and utilized in any one of a number of different applications. For example, the imaging system 12 may be utilized on any other type of vehicle, such as a boat, plane, truck, construction equipment, tractor, and the like. Further, it should be understood that the imaging system 12 could be utilized separate and apart from the vehicle 14 shown or previously mentioned. For example, the imaging system 12 could be mounted to a person, structure, and the like. Further, it should be understood that the system 12 may be removably mounted so that it can be utilized in any one of a number of different applications.
  • Referring to FIG. 2, a more detailed view of the imaging system 12 for capturing the scene 30 is shown. Here, the imaging system 12 includes an imaging sensor 102, an optical assembly 104, a shutter 106, and a window 108. The sensor 102 may be any type of sensor capable of capturing images. In this example, the sensor 102 is an infrared sensor capable of capturing infrared images. It should also be understood that the sensor 102 may be a sensor capable of capturing different wavelengths of light. For example, the sensor 102 may be a long-wave sensor (7-15 microns wavelength), a mid-wave sensor (2.5-7 microns wavelength), a short-wave sensor (1.1-2.5 microns wavelength), and/or a near infrared sensor (0.75-1.1 microns wavelength). Of course, other sensors capable of capturing other wavelengths outside the ranges mentioned may also be utilized.
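The wavelength bands listed above can be summarized with a small helper. This is an illustrative sketch only; the function name and the handling of band boundaries are assumptions, not part of the specification:

```python
def infrared_band(wavelength_um):
    """Classify a wavelength (in microns) into the infrared bands named in
    the specification; returns None outside the listed ranges."""
    bands = [
        ("near-infrared", 0.75, 1.1),   # 0.75-1.1 microns
        ("short-wave", 1.1, 2.5),       # 1.1-2.5 microns
        ("mid-wave", 2.5, 7.0),         # 2.5-7 microns
        ("long-wave", 7.0, 15.0),       # 7-15 microns
    ]
    for name, lo, hi in bands:
        if lo <= wavelength_um < hi:
            return name
    # Treat the upper endpoint of the last band as in-band.
    return "long-wave" if wavelength_um == 15.0 else None
```

For example, a sensor imaging at 10 microns would fall in the long-wave band, the typical operating range of automotive thermal cameras.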
  • The optical assembly 104 may include one or more optics for directing radiation from the scene 30 towards the sensor 102 and has a first side 129 facing towards the sensor 102 and a second side 127 generally facing towards the scene 30 to be captured. It should be understood that the term radiation could mean any type of radiation or visual information, such as light, capable of being received and detected by the sensor 102. Optionally, the system 12 may include a shutter 106 that allows light or radiation to pass for a predetermined period of time.
  • The window 108 may be any one of a number of different windows capable of allowing the radiation or light to pass through. In this example, the window 108 may be a germanium or silicon window. However, it should be understood that any one of a number of different materials may be utilized in the manufacturing of the window 108. Further, it should be understood that the window 108 may not be present at all.
  • As for location, the window 108 is generally located between the sensor 102 and the scene 30 to be captured. Similarly, the optical assembly 104 is located between the window 108 and the imaging sensor 102. If a shutter 106 is utilized, the shutter 106 may be located between the window 108 and the optical assembly 104. Optionally, the shutter 106 may also be located behind the optical assembly 104 or in front of the window 108—essentially anywhere between the imaging sensor 102 and the scene 30. The sensor 102 and the optical assembly 104 generally form a camera system 110 that is configured to capture images of the scene 30. Each of these captured images comprises a plurality of pixels.
  • The imaging system 12 also includes a processor 112 configured to receive information representing the images comprising the plurality of pixels captured by the camera system 110. It should be understood that the processor 112 may be a single processor or may be multiple processors working in concert. A memory device 114 may be in communication with the processor 112. The memory device 114 may be configured to store instructions for executing an imaging method to be described later in this specification. The memory 114 may also be configured to store information received from the camera system 110 regarding the captured images from the scene 30. It should be understood that the memory 114 may be any type of memory capable of storing digital information, such as optical memories, magnetic memories, solid state memories, and the like. Additionally, it should be understood that the memory 114 may be integrated within the processor 112 or separate as shown.
  • The processor 112 may be connected to a number of different devices that utilize the information representing the images captured from the scene 30. For example, the processor 112 may provide this information to a display device 116 having a display area 118. The display device 116 displays captured images from the scene 30 to a user. In this case, the user may be an operator of the vehicle 14 of FIG. 1. This allows the user to make adjustments in the operation of the vehicle 14.
  • The processor 112 may also be in communication with other vehicle systems 120 and 122. These other vehicle systems 120 and 122 may be any one of a number of different vehicle systems found in a vehicle. For example, the vehicle systems 120 and/or 122 may be vehicle safety systems, such as airbags, pre-tensioners, and the like. Further, the safety systems may include accident avoidance systems, such as automatic braking, cruise control, automatic steering, and the like. It should be understood that the vehicle systems described are only examples and that the vehicle systems may include any system found in a vehicle. Further, while vehicle systems have been discussed in this specification, the systems 120 and 122 may not be related to a vehicle at all and may be related to some other application of the system 12. These systems 120 and 122 utilize the information regarding the captured images from the scene 30 that has been processed by the processor 112 to perform any one of a number of different algorithms and functions.
  • As stated before, the system 12 may include the window 108. The window 108 has a first side 124 facing towards the camera system 110 and a second side 126 generally facing towards the scene 30 to be captured. Located on the first side 124 is a heater 128 configured to heat the window 108. The heater 128 may be a heating wire or a heating mesh. The system 12 may also include a temperature sensing element 130 for determining the temperature of the window 108. The temperature sensing element may be any one of a number of different temperature sensing elements. In this example, the temperature sensing element 130 is a thermistor.
  • Also, it should be understood that if the system 12 does not include the window 108, the heater 128 may be positioned and configured so as to heat the optical assembly 104. For example, the heater 128 and temperature sensing element 130 may be located on the first side 129 of the optical assembly 104.
  • A processor 132 may be in communication with the heater 128 and the temperature sensing element 130. The processor 132 may be configured to heat the window 108 or optical assembly 104, in the case where the window 108 is not utilized, to a temperature less than or about equal to 100° Celsius. As an example, the processor 132 may be configured to heat the window 108 or optical assembly 104 to approximately 80° Celsius, which is less than 100° Celsius.
  • Further, the processor 132 may be configured so as to heat the window 108 or optical assembly 104, in the case where the window 108 is not utilized, above the ambient temperature by a certain specified temperature. For example, this certain specified temperature may be 40° Celsius above the ambient temperature. Also, the processor 132 may be configured to activate the heater 128 at certain times or certain temperatures.
  • Like the processor 112, the processor 132 may be a single processor or may be made of multiple processors working in concert. Also, it should be understood that the processor 112 and the processor 132 may, in fact, be the same processor or set of processors managing both the camera system 110 and the heater 128. Further, a memory device 134, similar to the memory device 114, may be in communication with the processor 132. The memory device 134 may contain instructions for configuring the processor 132 regarding heating the window 108 or optical assembly 104 and receiving feedback information from the temperature sensing element 130. Like before, the memory 134 may be any type of memory capable of storing digital information, such as an optical memory, magnetic memory, solid state memory, and the like. Further, the memory 134 may be integrated within the processor 132 or separate from the processor 132, as shown.
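The heater-control behavior described above — heating the window or optical assembly to a set amount above ambient while staying at or below 100° C., using thermistor feedback — might be sketched as a simple bang-bang control loop. All names, the hysteresis band, and the control structure are illustrative assumptions, not the specification's implementation:

```python
def target_temperature(ambient_c, offset_c=40.0, max_c=100.0):
    """Target window temperature: a set amount above ambient, capped so the
    window is heated to a temperature less than or equal to 100 C."""
    return min(ambient_c + offset_c, max_c)

def heater_control_step(window_temp_c, ambient_c, heater_on, hysteresis_c=2.0):
    """One control step: return True to energize the heater, False to switch
    it off.  window_temp_c plays the role of the thermistor reading."""
    target = target_temperature(ambient_c)
    if window_temp_c < target - hysteresis_c:
        return True       # below the band: heat
    if window_temp_c >= target:
        return False      # at or above target: off
    return heater_on      # inside the band: hold the previous state
```

The hysteresis band keeps the heater from chattering on and off as the window temperature hovers near the target.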
  • The processor 112 is configured to determine if artifacts are present in the captured image. If this occurs, the processor is configured to remove artifacts from any information representing the images. The artifacts may be caused by moisture coming into physical contact with the second side 126 of the window 108 or the optical assembly 104.
  • Referring to FIGS. 3 and 4, a method 200 and example image 300 are shown, respectively. The method 200 may be executed by any one of the processors 112 or 132 as shown in FIG. 2. The instructions for this method may be located in the memories 114 or 134. In step 202, the method begins by heating the window 108 or optical assembly 104 to a temperature less than or equal to 100° C. For example, the method may heat the window 108 or optical assembly 104 to 80° C., which is less than 100° C. This temperature is maintained during the operation of the vehicle 14 of FIG. 1. As stated previously, the temperature to which the window 108 or optical assembly 104 is heated may be within a predetermined range or may be based on a set amount above an ambient temperature, for example, 40° C. above the ambient temperature.
  • In step 204, the camera system 110 captures images of the scene 30. In step 206, the processor 112 determines if artifacts are present in the images. The processor 112 may be configured to determine that artifacts are present in the images by determining which pixels in an image of the plurality of images have changed. Further, this determination can be made if the pixels that changed in the image are contiguous and cover a specific area size. As stated in the background section, when moisture comes into contact with the second side 126 of the window 108 or optical assembly 104, a series of flashes generally occurs, the flashes being the artifacts. These flashes are generally viewed as a significant change in the pixels. Further, these flashes are contiguous and cover a specific area size.
  • As such, artifacts can be filtered out by looking not only at which pixels have changed, but also at whether these pixels are contiguous in nature. If no artifacts are detected, the method returns to step 204. However, if disturbances caused by moisture and heat are detected, the method continues to step 208, wherein the processor 112 is configured to remove artifacts in the images.
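The detection logic of steps 204-206 — flag an artifact only when the changed pixels form a contiguous region covering a sufficient area — could be sketched as follows. The thresholds and the 4-connectivity choice are assumptions for illustration:

```python
from collections import deque

def detect_artifact(prev, curr, diff_thresh=50, min_area=4):
    """Flag an artifact if changed pixels form a contiguous region of at
    least min_area pixels (4-connectivity).  prev and curr are 2-D lists
    of equal shape holding pixel intensities from consecutive frames."""
    rows, cols = len(curr), len(curr[0])
    changed = [[abs(curr[r][c] - prev[r][c]) > diff_thresh for c in range(cols)]
               for r in range(rows)]
    seen = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if changed[r][c] and not seen[r][c]:
                # BFS over one contiguous region of changed pixels.
                area, queue = 0, deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and changed[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if area >= min_area:
                    return True  # contiguous change of sufficient area
    return False
```

Isolated single-pixel changes (sensor noise) fail the contiguity/area test, while a moisture flash, which changes a connected patch of pixels, passes it.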
  • Referring to FIG. 4, a sample image 300 is shown. The sample image includes the road 16 and portions of the building 18 from FIG. 1. The image also includes artifacts 310A and 310B caused by moisture coming into contact with the second side 126 of the window 108 or the second side 127 of the optical assembly 104. These artifacts 310A and 310B essentially appear as a series of flashes but represent moisture coming into contact with the second side 126 of the window 108 or the second side 127 of the optical assembly 104. In addition to these artifacts, there are also other artifacts 312A, 312B, 312C, and 312D. These artifacts represent moisture that has accumulated at the edges of the sample image 300. As the moisture collecting on the second side 126 of the window 108 or the second side 127 of the optical assembly 104 is heated, the moisture may accumulate at the edges of the image 300. The artifacts 310A-310B and 312A-312D may be removed by applying a low pass filter to the information representing the pixels that changed in the image. Here, those pixels are located where the artifacts 310A-310B and 312A-312D are located.
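A minimal version of the low-pass filtering step described above, applied only at the pixels flagged as changed, might look like the sketch below. The kernel size and the choice of a simple mean filter are illustrative assumptions:

```python
def low_pass_at(image, mask, k=1):
    """Replace each masked (changed) pixel with the mean of its
    (2k+1)x(2k+1) neighborhood -- a simple spatial low-pass filter
    applied only where artifacts were detected.  image and mask are
    2-D lists of equal shape."""
    rows, cols = len(image), len(image[0])
    out = [row[:] for row in image]  # unmasked pixels pass through
    for r in range(rows):
        for c in range(cols):
            if mask[r][c]:
                # Neighborhood clipped at the image borders.
                vals = [image[y][x]
                        for y in range(max(0, r - k), min(rows, r + k + 1))
                        for x in range(max(0, c - k), min(cols, c + k + 1))]
                out[r][c] = sum(vals) / len(vals)
    return out
```

Restricting the filter to the artifact mask preserves full sharpness everywhere the image was unaffected.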
  • The processor 112 may be further configured to remove the artifacts by determining a splash profile by subtracting a splash image, such as artifacts 310A and 310B, from a previously captured image or a low pass filtered version of a previously captured image. Generally, the splash image is the pixels that changed in the image. This splash pattern is removed from the sample image 300, and the removal of the splash pattern from the plurality of images is faded out as the artifacts are no longer present in the captured images. Further, the splash pattern may be located near the edges 314A, 314B, 314C, and 314D of the image 300 instead of, or in addition to, a central area 316 of the image 300.
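The splash-profile removal — subtracting the pixels that changed relative to a low-pass reference frame, weighting the correction more heavily near the edges where heated moisture accumulates, and fading the correction out over subsequent frames — could be sketched with NumPy. The weight map, the fade parameter, and all function names are illustrative assumptions:

```python
import numpy as np

def splash_profile(image, background):
    """Splash image: the pixels that changed relative to a low-pass
    (e.g., temporally averaged) version of previously captured frames."""
    return image.astype(float) - background.astype(float)

def edge_weight(shape, center_w=0.5, edge_w=1.0):
    """Weight map that applies the correction more strongly near the image
    borders than in the central area."""
    rows, cols = shape
    ry = np.minimum(np.arange(rows), np.arange(rows)[::-1]) / max(rows - 1, 1)
    rx = np.minimum(np.arange(cols), np.arange(cols)[::-1]) / max(cols - 1, 1)
    dist = np.minimum.outer(ry, rx) * 2.0  # 0 at the border, ~1 at the center
    return edge_w + (center_w - edge_w) * np.clip(dist, 0.0, 1.0)

def remove_splash(image, background, fade=1.0):
    """Subtract the edge-weighted splash profile; 'fade' is decayed toward 0
    over successive frames as the artifacts disappear."""
    w = edge_weight(image.shape)
    corrected = image - fade * w * splash_profile(image, background)
    return np.clip(corrected, 0, 255)
```

With `fade=1.0` a border splash pixel is pulled fully back to the reference value; as `fade` decays toward 0, the output converges back to the unmodified captured image.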
  • Referring to FIG. 5, another method 400 is shown that may be executed by any one of the processors 112 or 132 as shown in FIG. 2. The instructions for this method may be located in the memories 114 or 134. In step 402, the camera system 110 captures images of the scene 30. In step 404, the processor 112 determines if artifacts are present in the images. The processor 112 may be configured to determine that artifacts are present in the images by determining which pixels in an image of the plurality of images have changed. Further, this determination can be made if the pixels that changed in the image are contiguous and cover a specific area size. As stated in the background section, when moisture comes into contact with the second side 126 of the window 108 or optical assembly 104, a series of flashes generally occurs, the flashes being the artifacts. These flashes are generally viewed as a significant change in the pixels. Further, these flashes are contiguous and cover a specific area size.
  • In step 406, the processor 112 determines if moisture is the likely cause of the artifacts that are present in the captured images. This determination can be made by using not only the captured images but also external data 407. The external data 407 could include data from other sensors, such as environmental sensors, for example rain-detecting windshield wipers, that can detect if the vehicle 14 is traveling in a location that is likely to have moisture. Further, the external data 407 could be data from a database that tracks the weather conditions of an area where the vehicle 14 is traveling. Additionally, the external data 407 could also include information from other vehicle systems, such as a determination of whether the windshield wipers and/or defroster of the vehicle are being utilized. If the windshield wipers, defroster, and/or any other moisture-related system are being utilized, this information could be useful in determining if moisture is the likely cause of the artifacts.
  • If moisture is determined to be the likely source of the artifacts in the captured images, the method 400 turns on the heater 128 in step 408. As stated previously, the temperature to which the window 108 or optical assembly 104 is heated may be within a predetermined range or may be based on a set amount above an ambient temperature, for example, 40° C. above the ambient temperature. In this method, the heater 128 is only activated selectively when moisture is determined to be present. Otherwise, the method 400 is essentially always looking to remove artifacts but will only heat the window 108 or optical assembly 104 when moisture is determined to be present.
  • In step 410, the processor 112 is configured to remove artifacts from the captured images. The methodologies described in method 200 regarding removing artifacts from the captured images are equally applicable in this method and will not be described again. After artifacts are removed, the method 400 returns to step 402.
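One pass of method 400 — detect artifacts, fuse the result with external data to decide whether moisture is the likely cause, and heat only in that case while always removing detected artifacts — might be sketched as below. The fusion rule (an OR over corroborating external signals) and all names are assumptions for illustration:

```python
def moisture_likely(artifact_detected, wipers_on=False, defroster_on=False,
                    rain_sensor=False, weather_says_rain=False):
    """Step 406: fuse the image-based detection with external data 407.
    Any corroborating signal makes moisture the likely cause."""
    return artifact_detected and (wipers_on or defroster_on
                                  or rain_sensor or weather_says_rain)

def method_400_step(artifact_detected, external):
    """One pass of the capture/detect/heat/remove loop of FIG. 5.

    Returns (heater_on, remove_artifacts)."""
    if not artifact_detected:
        return (False, False)          # step 404: nothing to do
    if moisture_likely(artifact_detected, **external):
        return (True, True)            # step 408 (heat) then step 410
    return (False, True)               # remove artifacts, heater stays off
```

This reflects the selective behavior described above: artifact removal runs whenever artifacts are detected, but the heater is energized only when moisture is the likely cause.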
  • In an alternative embodiment, dedicated hardware implementations, such as application specific integrated circuits, programmable logic arrays and other hardware devices, can be constructed to implement one or more of the methods described herein. Applications that may include the apparatus and systems of various embodiments can broadly include a variety of electronic and computer systems. One or more embodiments described herein may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules, or as portions of an application-specific integrated circuit. Accordingly, the present system encompasses software, firmware, and hardware implementations.
  • In accordance with various embodiments of the present disclosure, the methods described herein may be implemented by software programs executable by a computer system. Further, in an exemplary, non-limited embodiment, implementations can include distributed processing, component/object distributed processing, and parallel processing. Alternatively, virtual computer system processing can be constructed to implement one or more of the methods or functionality as described herein.
  • Further, the methods described herein may be embodied in a computer-readable medium. The term “computer-readable medium” includes a single medium or multiple media, such as a centralized or distributed database, and/or associated caches and servers that store one or more sets of instructions. The term “computer-readable medium” shall also include any medium that is capable of storing, encoding or carrying a set of instructions for execution by a processor or that cause a computer system to perform any one or more of the methods or operations disclosed herein.
  • As a person skilled in the art will readily appreciate, the above description is meant as an illustration of the principles of this invention. This description is not intended to limit the scope or application of this invention in that the invention is susceptible to modification, variation, and change, without departing from the spirit of this invention, as defined in the following claims.

Claims (20)

1. An imaging system, the imaging system comprising
an imaging sensor, the imaging sensor configured to capture images of a scene, each of the captured images comprising a plurality of pixels;
a processor in communication with the imaging sensor, the processor configured to receive information representing the images comprising the plurality of pixels captured by the imaging sensor;
a window or optical assembly located between the imaging sensor and the scene to be captured by the imaging sensor, the window or optical assembly having a first side facing the imaging sensor and a second side facing the scene to be captured;
a heater system in thermal communication with the window or optical assembly;
wherein the heater system is configured to selectively heat the window or optical assembly to a temperature less than or equal to 100 degrees Celsius;
wherein the processor is configured to determine if one or more artifacts are present in the captured images; and
wherein the processor, after determining that one or more artifacts are present in the captured images, is configured to remove the one or more artifacts in the information representing the images.
2. The system of claim 1, wherein the processor is configured to determine if one or more artifacts are present in the captured images by determining which pixels have changed in an image of the plurality of images and if the pixels that changed in the image are contiguous and cover a specific area size.
3. The system of claim 1, wherein the processor is configured to remove the one or more artifacts by being configured to:
determine a splash profile by subtracting a splash image from a previously captured image or a low pass image, wherein the splash image is the pixels that changed in the image;
remove the splash pattern from the image; and
fade the removal of splash pattern from the plurality of images as moisture is removed from the second side of the window or the optical assembly.
4. The system of claim 3, wherein the removal of the splash pattern from the image is weighted higher near the edges of the image than in a center area of the image.
5. The system of claim 1, wherein the processor is configured to determine if moisture is present on the window or the optical assembly and instruct the heater system to heat the window or optical assembly to a temperature less than or equal to 100 degrees Celsius when moisture is present.
6. The system of claim 5, wherein the processor is configured to determine if moisture is present on the window or the optical assembly by analyzing the captured images.
7. The system of claim 6, wherein the processor is further configured to determine if moisture is present on the window or the optical assembly by analyzing external data.
8. The system of claim 1, wherein the heater system is configured to heat the window or the optical assembly to a temperature above the ambient temperature by a specific number of degrees Celsius.
9. The system of claim 8, wherein the specific number of degrees Celsius is 40 degrees Celsius.
10. The system of claim 1, wherein the window is a germanium window or a silicon window.
11. The system of claim 1, wherein the imaging sensor comprises at least one of a long-wave sensor, a mid-wave sensor, a short-wave sensor, and/or a near-infrared sensor.
12. The system of claim 1, wherein the heater system further comprises:
a heating element in thermal communication with the window or the optical assembly;
a temperature sensing element in thermal communication with the window or the optical assembly;
a control device in communication with the heating element and the temperature sensing element, the control device configured to measure the temperature of the window or the optical assembly via the temperature sensing element and provide a current to the heating element in response to the temperature of the window or the optical assembly.
13. The system of claim 1, wherein the system is located within an automobile.
14. An imaging method for an imaging system, the method comprising the steps of:
capturing images of a scene by an imaging sensor, each of the captured images comprising a plurality of pixels;
selectively heating a window or an optical assembly located between the imaging sensor and the scene to a temperature less than or equal to 100 degrees Celsius;
determining if one or more artifacts are present in the captured images; and
removing the one or more artifacts in the images.
15. The method of claim 14, wherein the step of determining if one or more artifacts are present in the captured images includes the step of determining which pixels have changed in an image of the plurality of images and if the pixels that changed in the image are contiguous and cover a specific area size.
16. The method of claim 14, further comprising the steps of:
determining a splash profile by subtracting a splash image from a previously captured image or a low pass image, wherein the splash image is the pixels that changed in the image;
removing the splash pattern from the image; and
fading the removal of splash pattern from the plurality of images as moisture is removed from the window or optical assembly.
17. The method of claim 16, wherein the step of removing the splash pattern from the image is weighted higher near the edges of the image than in a center area of the image.
18. The method of claim 16, further comprising the steps of:
determining if moisture is present on the window or the optical assembly; and
heating the window or optical assembly to a temperature less than or equal to 100 degrees Celsius when moisture is present on the window or the optical assembly.
19. The method of claim 18, further comprising the step of determining if moisture is present on the window or the optical assembly by analyzing the captured images.
20. The method of claim 14, wherein the imaging sensor comprises at least one of a long-wave sensor, a mid-wave sensor, a short-wave sensor, and/or a near-infrared sensor.
US15/253,159 2016-08-31 2016-08-31 Imaging system and method Abandoned US20180061008A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/253,159 US20180061008A1 (en) 2016-08-31 2016-08-31 Imaging system and method
PCT/US2017/048412 WO2018044681A1 (en) 2016-08-31 2017-08-24 Imaging system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US15/253,159 US20180061008A1 (en) 2016-08-31 2016-08-31 Imaging system and method

Publications (1)

Publication Number Publication Date
US20180061008A1 true US20180061008A1 (en) 2018-03-01

Family

ID=61243153

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/253,159 Abandoned US20180061008A1 (en) 2016-08-31 2016-08-31 Imaging system and method

Country Status (2)

Country Link
US (1) US20180061008A1 (en)
WO (1) WO2018044681A1 (en)


Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6815680B2 (en) * 2002-06-05 2004-11-09 Raytheon Company Method and system for displaying an image
JP2005159710A (en) * 2003-11-26 2005-06-16 Nissan Motor Co Ltd Display control apparatus and method for vehicle
US20100073498A1 (en) * 2005-11-04 2010-03-25 Tobias Hoglund Enhancement of images
US7796168B1 (en) * 2006-06-09 2010-09-14 Flir Systems, Inc. Methods and systems for detection and mitigation of image-flash in infrared cameras
US20120008866A1 (en) * 2010-06-28 2012-01-12 Jad Halimeh Method and device for detecting an interfering object in a camera image
EP2589513A1 (en) * 2011-11-03 2013-05-08 Autoliv Development AB Vision system and method for a motor vehicle
US20130236116A1 (en) * 2012-03-08 2013-09-12 Industrial Technology Research Institute Method and apparatus for single-image-based rain streak removal
US20150332099A1 (en) * 2014-05-15 2015-11-19 Conti Temic Microelectronic Gmbh Apparatus and Method for Detecting Precipitation for a Motor Vehicle
US20160231527A1 (en) * 2015-02-06 2016-08-11 Flir Systems, Inc. Lens heater to maintain thermal equilibrium in an infrared imaging system
US20170240138A1 (en) * 2016-02-19 2017-08-24 Toyota Jidosha Kabushiki Kaisha Imaging system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3706782A1 (en) * 1987-03-03 1988-09-15 Daimler Benz Ag USE OF ORGANIC SILICON COMPOUNDS ON GLASS WINDSHIELDS TO REACH AN ANTI-FOGGING EFFECT AGAINST OILY ORGANIC SUBSTANCES
US6681163B2 (en) * 2001-10-04 2004-01-20 Gentex Corporation Moisture sensor and windshield fog detector
US7276696B2 (en) * 2003-07-15 2007-10-02 Ford Global Technologies, Llc Active night vision thermal control system using wavelength-temperature characteristic of light source
KR101534973B1 (en) * 2013-12-19 2015-07-07 현대자동차주식회사 Image Processing Apparatus and Method for Removing Rain From Image Data


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Image Segmentation." The University of Auckland, 6 Dec. 2013. Web. Accessed 20 Apr. 2018. *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10687000B1 (en) * 2018-06-15 2020-06-16 Rockwell Collins, Inc. Cross-eyed sensor mosaic
WO2020207850A1 (en) * 2019-04-12 2020-10-15 Connaught Electronics Ltd. Image processing method
KR20210137534A (en) * 2019-04-12 2021-11-17 코너트 일렉트로닉스 리미티드 Image processing method
JP2022527605A (en) * 2019-04-12 2022-06-02 コノート、エレクトロニクス、リミテッド Image processing method
JP7274603B2 (en) 2019-04-12 2023-05-16 コノート、エレクトロニクス、リミテッド Image processing method
KR102564134B1 (en) * 2019-04-12 2023-08-04 코너트 일렉트로닉스 리미티드 Image processing method
DE102019109748B4 (en) 2019-04-12 2024-10-02 Connaught Electronics Ltd. Image processing method operable in an image acquisition system of a vehicle as well as image acquisition system for carrying out the method as well as vehicle and non-transitory computer program product
CN110503609A (en) * 2019-07-15 2019-11-26 电子科技大学 A kind of image rain removing method based on mixing sensor model

Also Published As

Publication number Publication date
WO2018044681A1 (en) 2018-03-08


Legal Events

Date Code Title Description
AS Assignment

Owner name: AUTOLIV ASP, INC., UTAH

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KORMOS, ALEXANDER LOUIS;MATHIEU, LOUIS-JOSEPH;REEL/FRAME:039897/0451

Effective date: 20160929

AS Assignment

Owner name: VEONEER US, INC., MICHIGAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AUTOLIV ASP, INC.;REEL/FRAME:046392/0039

Effective date: 20180518

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: VEONEER US, LLC, MICHIGAN

Free format text: CHANGE OF NAME;ASSIGNOR:VEONEER US, INC.;REEL/FRAME:061048/0615

Effective date: 20220401