CN116542893B - Water surface oil spill detection method and device, electronic equipment and storage medium - Google Patents
- Publication number
- CN116542893B (application number CN202310804037.9A)
- Authority
- CN
- China
- Prior art keywords
- image
- polarization
- infrared
- fusion
- pixel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06T5/50 — Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T5/73 — Deblurring; Sharpening
- G06T2207/10048 — Infrared image
- G06T2207/20192 — Edge enhancement; Edge preservation
- G06T2207/20221 — Image fusion; Image merging
- Y02A20/204 — Keeping clear the surface of open water from oil spills
Abstract
The embodiment of the application provides a water surface oil spill detection method and device, an electronic device, and a storage medium. The method comprises the following steps: acquiring an infrared intensity image, an infrared polarization image, and a visible light image collected for a target area; fusing the infrared intensity image and the infrared polarization image to obtain a first fused image; fusing the first fused image and the visible light image to obtain a second fused image; and performing oil spill detection on the target area based on the second fused image. By applying the embodiment of the application, the first fused image obtained by fusing the infrared intensity image and the infrared polarization image improves the contrast between the oil spill region and the background. Further, because visible light images contain rich detail information, fusing the visible light image with the first fused image yields a more accurate oil spill region and improves the accuracy of oil spill detection.
Description
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and apparatus for detecting water surface oil spill, an electronic device, and a storage medium.
Background
Petroleum enters the ocean from multiple sources, including various types of oil spill pollution caused by human activities and thin sea-surface oil films formed by natural hydrocarbon seepage from the seabed. Large oil spills can cause marine environmental disasters; sea surface oil spill monitoring is therefore an important part of marine environmental monitoring.
Sea surface oil spill monitoring observes the sea surface over a long period: images of a sea surface area are collected at a certain time interval, and each collected image is analysed to determine whether an oil spill has occurred in the current sea surface area. As an important part of sea surface oil spill monitoring, how to perform this per-image oil spill detection is a key problem.
Disclosure of Invention
The embodiment of the application aims to provide a method, a device, electronic equipment and a storage medium for detecting water surface oil spill so as to realize water surface oil spill detection. The specific technical scheme is as follows:
the application provides a water surface oil spill detection method, which comprises the following steps:
respectively acquiring an infrared intensity image, an infrared polarization image and a visible light image which are acquired for a target area;
fusing the infrared intensity image and the infrared polarized image to obtain a first fused image;
fusing the first fused image and the visible light image to obtain a second fused image;
and detecting oil spill of the target area based on the second fusion image.
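The four steps above can be sketched end to end in code. This is only an illustrative skeleton under stated assumptions: the `fuse` operator here is a stand-in per-pixel average (the embodiments below describe richer fusion rules), the threshold detector is hypothetical, and all function names are invented for illustration.

```python
import numpy as np

def fuse(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Stand-in fusion operator: per-pixel average of two registered images."""
    return (a.astype(np.float64) + b.astype(np.float64)) / 2.0

def detect_oil_spill(ir_intensity, ir_polarization, visible, threshold=0.5):
    """Pipeline of the claimed method: two-stage fusion, then detection."""
    first_fused = fuse(ir_intensity, ir_polarization)   # step: first fused image
    second_fused = fuse(first_fused, visible)           # step: second fused image
    return second_fused > threshold                     # step: boolean spill mask

h, w = 4, 4
mask = detect_oil_spill(np.zeros((h, w)), np.ones((h, w)), np.ones((h, w)))
```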
In one possible embodiment, the fusing the infrared intensity image and the infrared polarized image to obtain a first fused image includes:
obtaining a polarization degree characteristic image and a polarization angle characteristic image of the infrared polarization image based on the infrared polarization image; wherein, the pixel value of each pixel in the polarization degree characteristic image is inversely related to the total intensity of the reflected light of the pixel and is positively related to the square sum of the first linear polarization intensity and the second linear polarization intensity of the pixel; the pixel value of each pixel in the polarization angle characteristic image is determined based on the ratio of the first linear polarized light intensity to the second linear polarized light intensity of the pixel;
fusing the polarization degree characteristic image and the polarization angle characteristic image to obtain a polarization characteristic fused image;
and fusing the infrared intensity image and the polarization characteristic fusion image to obtain a first fusion image.
In a possible embodiment, the fusing the polarization degree feature image and the polarization angle feature image to obtain a polarization feature fused image includes:
determining fused pixel values of all target pixels in the infrared polarized image based on pixel values of all pixels in the polarization degree characteristic image and pixel values of all pixels in the polarization angle characteristic image to obtain a polarization characteristic fused image; the fused pixel value of the target pixel is inversely related to the sum of square values of all pixels in the polarization degree characteristic image and the sum of square values of all pixels in the polarization angle characteristic image, and is positively related to the sum of square values of all pixels in the polarization degree characteristic image, the sum of square values of all pixels in the polarization angle characteristic image, the pixel value of the target pixel in the polarization degree characteristic image and the pixel value of the target pixel in the polarization angle characteristic image.
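One reading of the relation described above (a fused value positively related to each image's global sum of squared pixel values and to the target pixel's value in that image, and inversely related to the combined sums of squares) is a global energy-weighted average. The sketch below implements that reading; the patent's exact formula may differ.

```python
import numpy as np

def fuse_polarization_features(dolp: np.ndarray, aop: np.ndarray) -> np.ndarray:
    """Energy-weighted fusion: each characteristic image is weighted by its
    global sum of squared pixel values (its 'energy'), and the result is
    normalised by the total energy, so the denominator is the combined sum of
    squares and each numerator term is an energy times a pixel value."""
    e_d = float(np.sum(dolp.astype(np.float64) ** 2))  # energy of degree image
    e_a = float(np.sum(aop.astype(np.float64) ** 2))   # energy of angle image
    return (e_d * dolp + e_a * aop) / (e_d + e_a)

dolp = np.full((2, 2), 2.0)   # energy 16
aop = np.full((2, 2), 1.0)    # energy 4
fused = fuse_polarization_features(dolp, aop)   # (16*2 + 4*1) / 20 = 1.8
```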
In a possible embodiment, the infrared polarized image is an infrared polarized image acquired by taking a first preset angle, a second preset angle, a third preset angle and a fourth preset angle as focal plane angles;
the obtaining the polarization degree characteristic image and the polarization angle characteristic image of the infrared polarization image based on the infrared polarization image comprises the following steps:
obtaining a Stokes vector of the infrared polarized image based on infrared intensity information on the first preset angle, the second preset angle, the third preset angle and the fourth preset angle in the infrared polarized image; the Stokes vector comprises total reflected light intensity, first linear polarized light intensity and second linear polarized light intensity;
and obtaining a polarization degree characteristic image and a polarization angle characteristic image of the infrared polarization image based on the Stokes vector.
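With the conventional Stokes formulation (an assumption; the patent does not fix the exact formulas), the two characteristic images can be computed as follows, where `s1` and `s2` play the roles of the first and second linear polarized light intensities:

```python
import numpy as np

def stokes_to_features(i0, i45, i90, i135):
    """Compute Stokes components from the four focal-plane intensity images,
    then the polarization degree (DoLP) and polarization angle (AoP) images.
    The 0/45/90/135 degree angle set and these standard formulas are
    assumptions; the patent only constrains the angles' pairwise differences."""
    s0 = (i0 + i45 + i90 + i135) / 2.0   # total reflected light intensity
    s1 = i0 - i90                        # first linear polarized light intensity
    s2 = i45 - i135                      # second linear polarized light intensity
    eps = 1e-12                          # avoid division by zero
    # DoLP: inversely related to s0, grows with the square sum s1^2 + s2^2
    dolp = np.sqrt(s1 ** 2 + s2 ** 2) / (s0 + eps)
    # AoP: determined by the ratio of the two linear polarization components
    aop = 0.5 * np.arctan2(s2, s1 + eps)
    return dolp, aop

i0 = np.ones((2, 2)); i90 = np.zeros((2, 2))
i45 = np.full((2, 2), 0.5); i135 = np.full((2, 2), 0.5)
dolp, aop = stokes_to_features(i0, i45, i90, i135)
```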
In a possible embodiment, the fusing the first fused image and the visible light image to obtain a second fused image includes:
performing wavelet decomposition on the first fusion image and the visible light image to obtain a first low-frequency coefficient and a first high-frequency coefficient of the first fusion image, and obtaining a second low-frequency coefficient and a second high-frequency coefficient of the visible light image;
fusing the first low-frequency coefficient and the second low-frequency coefficient to obtain a low-frequency fusion coefficient;
fusing the first high-frequency coefficient and the second high-frequency coefficient to obtain a high-frequency fusion coefficient;
and carrying out wavelet reconstruction on the visible light image and the first fusion image based on the low-frequency fusion coefficient and the high-frequency fusion coefficient to obtain a second fusion image.
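A minimal single-level Haar version of this decompose/fuse/reconstruct scheme can be sketched in NumPy. The fusion rules here (average for the low-frequency coefficients, larger magnitude for the high-frequency coefficients) are simple stand-ins, not necessarily those of the patent:

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar decomposition: low-frequency approximation plus
    three high-frequency detail bands (horizontal, vertical, diagonal)."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    ll = (a + b + c + d) / 4.0
    lh = (a + b - c - d) / 4.0
    hl = (a - b + c - d) / 4.0
    hh = (a - b - c + d) / 4.0
    return ll, (lh, hl, hh)

def haar_idwt2(ll, details):
    """Exact inverse of haar_dwt2 (wavelet reconstruction)."""
    lh, hl, hh = details
    out = np.empty((ll.shape[0] * 2, ll.shape[1] * 2))
    out[0::2, 0::2] = ll + lh + hl + hh
    out[0::2, 1::2] = ll + lh - hl - hh
    out[1::2, 0::2] = ll - lh + hl - hh
    out[1::2, 1::2] = ll - lh - hl + hh
    return out

def wavelet_fuse(img_a, img_b):
    """Decompose both images, average low-frequency coefficients, keep the
    larger-magnitude high-frequency coefficients, then reconstruct."""
    ll_a, det_a = haar_dwt2(img_a)
    ll_b, det_b = haar_dwt2(img_b)
    ll_f = (ll_a + ll_b) / 2.0
    det_f = tuple(np.where(np.abs(da) >= np.abs(db), da, db)
                  for da, db in zip(det_a, det_b))
    return haar_idwt2(ll_f, det_f)

x = np.arange(16, dtype=float).reshape(4, 4)
```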
In a possible embodiment, the fusing the first high-frequency coefficient and the second high-frequency coefficient to obtain a high-frequency fusion coefficient includes:
determining a correlation between the first high frequency coefficient and the second high frequency coefficient;
if the correlation is smaller than a preset correlation threshold, carrying out weighted fusion on the first high-frequency coefficient and the second high-frequency coefficient according to a preset weight to obtain a high-frequency fusion coefficient;
and if the correlation degree is not smaller than a preset correlation degree threshold value, taking the larger value of the first high-frequency coefficient and the second high-frequency coefficient as a high-frequency fusion coefficient.
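The correlation-gated high-frequency rule just described can be sketched directly. The correlation measure (Pearson correlation here), the threshold, and the weights are all hypothetical preset values, and "larger value" is read as the larger-magnitude coefficient, a common convention for signed high-frequency coefficients:

```python
import numpy as np

def fuse_high_freq(c1, c2, corr_threshold=0.6, w1=0.5):
    """If the two coefficient bands are weakly correlated they carry
    complementary detail and are blended with preset weights; if strongly
    correlated, the larger-magnitude coefficient is kept per position."""
    corr = np.corrcoef(c1.ravel(), c2.ravel())[0, 1]
    if corr < corr_threshold:
        return w1 * c1 + (1.0 - w1) * c2                  # weighted fusion
    return np.where(np.abs(c1) >= np.abs(c2), c1, c2)     # take larger value

c1 = np.array([[1.0, 2.0], [3.0, 4.0]])
strong = fuse_high_freq(c1, 2 * c1)    # corr = 1: larger value wins
weak = fuse_high_freq(c1, -c1)         # corr = -1: weighted blend (here 0)
```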
In one possible embodiment, the method further comprises:
detecting a potential oil spill area of the target area based on the first fusion image;
the detecting the oil spill of the target area based on the second fusion image includes:
and if the potential oil spilling region exists in the target region, detecting the oil spilling of the target region based on the second fusion image.
The embodiment of the application also provides a water surface oil spill detection device, which comprises:
the image acquisition module is used for respectively acquiring an infrared intensity image, an infrared polarization image and a visible light image which are acquired for the target area;
the first fusion image acquisition module is used for fusing the infrared intensity image and the infrared polarized image to obtain a first fusion image;
the second fusion image acquisition module is used for fusing the first fusion image and the visible light image to obtain a second fusion image;
and the oil spill detection module is used for detecting oil spill of the target area based on the second fusion image.
In one possible embodiment, the fusing the infrared intensity image and the infrared polarized image to obtain a first fused image includes:
obtaining a polarization degree characteristic image and a polarization angle characteristic image of the infrared polarization image based on the infrared polarization image; wherein, the pixel value of each pixel in the polarization degree characteristic image is inversely related to the total intensity of the reflected light of the pixel and is positively related to the square sum of the first linear polarization intensity and the second linear polarization intensity of the pixel; the pixel value of each pixel in the polarization angle characteristic image is determined based on the ratio of the first linear polarized light intensity to the second linear polarized light intensity of the pixel;
fusing the polarization degree characteristic image and the polarization angle characteristic image to obtain a polarization characteristic fused image;
and fusing the infrared intensity image and the polarization characteristic fusion image to obtain a first fusion image.
In a possible embodiment, the fusing the polarization degree feature image and the polarization angle feature image to obtain a polarization feature fused image includes:
determining fused pixel values of all target pixels in the infrared polarized image based on pixel values of all pixels in the polarization degree characteristic image and pixel values of all pixels in the polarization angle characteristic image to obtain a polarization characteristic fused image; the fused pixel value of the target pixel is inversely related to the sum of square values of all pixels in the polarization degree characteristic image and the sum of square values of all pixels in the polarization angle characteristic image, and is positively related to the sum of square values of all pixels in the polarization degree characteristic image, the sum of square values of all pixels in the polarization angle characteristic image, the pixel value of the target pixel in the polarization degree characteristic image and the pixel value of the target pixel in the polarization angle characteristic image.
In a possible embodiment, the infrared polarized image is an infrared polarized image acquired by taking a first preset angle, a second preset angle, a third preset angle and a fourth preset angle as focal plane angles;
the obtaining the polarization degree characteristic image and the polarization angle characteristic image of the infrared polarization image based on the infrared polarization image comprises the following steps:
obtaining a Stokes vector of the infrared polarized image based on infrared intensity information on the first preset angle, the second preset angle, the third preset angle and the fourth preset angle in the infrared polarized image; the Stokes vector comprises total reflected light intensity, first linear polarized light intensity and second linear polarized light intensity;
and obtaining a polarization degree characteristic image and a polarization angle characteristic image of the infrared polarization image based on the Stokes vector.
In a possible embodiment, the fusing the first fused image and the visible light image to obtain a second fused image includes:
performing wavelet decomposition on the first fusion image and the visible light image to obtain a first low-frequency coefficient and a first high-frequency coefficient of the first fusion image, and obtaining a second low-frequency coefficient and a second high-frequency coefficient of the visible light image;
fusing the first low-frequency coefficient and the second low-frequency coefficient to obtain a low-frequency fusion coefficient;
fusing the first high-frequency coefficient and the second high-frequency coefficient to obtain a high-frequency fusion coefficient;
and carrying out wavelet reconstruction on the visible light image and the first fusion image based on the low-frequency fusion coefficient and the high-frequency fusion coefficient to obtain a second fusion image.
In a possible embodiment, the fusing the first high-frequency coefficient and the second high-frequency coefficient to obtain a high-frequency fusion coefficient includes:
determining a correlation between the first high frequency coefficient and the second high frequency coefficient;
if the correlation is smaller than a preset correlation threshold, carrying out weighted fusion on the first high-frequency coefficient and the second high-frequency coefficient according to a preset weight to obtain a high-frequency fusion coefficient;
and if the correlation degree is not smaller than a preset correlation degree threshold value, taking the larger value of the first high-frequency coefficient and the second high-frequency coefficient as a high-frequency fusion coefficient.
In one possible embodiment, the apparatus further comprises:
the potential oil spill region detection module is used for detecting the potential oil spill region of the target region based on the first fusion image;
The oil spill detection module performs oil spill detection on the target area based on the second fusion image, and includes:
and if the potential oil spilling region exists in the target region, detecting the oil spilling of the target region based on the second fusion image.
The embodiment of the application also provides a water surface oil spill detection system, which comprises: a camera and a digital signal processing module;
the camera is used for respectively acquiring an infrared intensity image, an infrared polarization image and a visible light image which are acquired aiming at a target area, and transmitting the acquired images to the digital signal processing module;
the digital signal processing module is used for fusing the infrared intensity image and the infrared polarized image to obtain a first fused image; detecting a potential oil spill region of the target area based on the first fused image; if it is detected that a potential oil spill region exists in the target area, controlling the pan-tilt head to stop rotating, fusing the first fused image and the visible light image to obtain a second fused image, and performing oil spill detection on the target area based on the second fused image; if it is detected that an oil spill region exists in the target area, uploading an alarm picture; and if it is detected that no oil spill region exists in the target area, controlling the pan-tilt head to resume cruising.
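The control flow of the digital signal processing module described above can be sketched as a small decision function. The states and return convention are invented for illustration; the pan-tilt ("cradle head") is reduced to a two-state enum:

```python
from enum import Enum, auto

class PanTilt(Enum):
    """Hypothetical pan-tilt states for the described workflow."""
    CRUISING = auto()
    STOPPED = auto()

def dsp_cycle(potential_spill: bool, confirmed_spill: bool):
    """One monitoring cycle: a coarse check on the first fused image gates the
    more expensive second fusion, and the pan-tilt resumes cruising when no
    spill is confirmed. Returns (pan-tilt state, alarm picture uploaded)."""
    if not potential_spill:            # coarse detection on first fused image
        return PanTilt.CRUISING, False
    # potential spill found: stop rotating, run detection on second fused image
    if confirmed_spill:
        return PanTilt.STOPPED, True   # upload alarm picture
    return PanTilt.CRUISING, False     # false alarm: resume cruising
```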
The embodiment of the application also provides electronic equipment, which comprises:
a memory for storing a computer program;
and the processor is used for realizing any one of the above water surface oil spill detection methods when executing the program stored in the memory.
The embodiment of the application also provides a computer readable storage medium, wherein a computer program is stored in the computer readable storage medium, and the computer program realizes any one of the above water surface oil spill detection methods when being executed by a processor.
The embodiment of the application also provides a computer program product containing instructions, which when run on a computer, cause the computer to execute the water surface oil spill detection method.
The embodiment of the application has the beneficial effects that:
in the water surface oil spill detection method provided by the embodiment of the application, the infrared polarization image and the infrared intensity image collected for the target area are fused to obtain a first fused image, the first fused image and the visible light image collected for the target area are fused to obtain a second fused image, and oil spill detection is performed on the target area based on the second fused image. By applying the embodiment of the application, because the polarization information, temperature information, and the like carried by light reflected from an oil spill region differ from those of light reflected from the water body, oil spill detection of the target area can be realized based on the second fused image obtained by fusing the infrared intensity image, the infrared polarization image, and the visible light image. The first fused image, obtained by fusing the infrared intensity image and the infrared polarization image, enhances the edge contour and texture detail of the oil spill region and improves the contrast between the oil spill region and the background, making the oil spill region easier to detect. Moreover, because visible light contains rich detail information, fusing the visible light image with the first fused image yields more detailed oil spill information and hence a more accurate oil spill region, improving the accuracy of oil spill detection.
Of course, it is not necessary for any one product or method of practicing the application to achieve all of the advantages set forth above at the same time.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is apparent that the drawings in the following description are only some embodiments of the application; other drawings may be obtained by those skilled in the art from these drawings without creative effort.
Fig. 1 is a schematic flow chart of a water surface oil spill detection method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a wavelet transform tower structure;
FIG. 3 is a flow chart of a water surface oil spill detection method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a water surface oil spill detecting device according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a water surface oil spill detection system according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The following clearly and completely describes the embodiments of the present application with reference to the accompanying drawings. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the scope of protection of the present application.
In order to realize water surface oil spill detection, the embodiment of the application provides a water surface oil spill detection method, a device, electronic equipment and a storage medium. The following first describes an exemplary method for detecting water surface oil spill according to the embodiment of the present application.
The water surface oil spill detection method provided by the embodiment of the application can be applied to electronic equipment. The electronic device may be a computer, a server, a mobile terminal, or the like. The present application is not particularly limited thereto.
As shown in fig. 1, which is a schematic flow chart of a water surface oil spill detection method according to an embodiment of the present application, the method may specifically include the following steps:
step S101, respectively acquiring an infrared intensity image, an infrared polarization image and a visible light image which are acquired for a target area;
step S102, fusing the infrared intensity image and the infrared polarized image to obtain a first fused image;
step S103, fusing the first fused image and the visible light image to obtain a second fused image;
and step S104, detecting oil spill of the target area based on the second fusion image.
By applying the embodiment of the application, because the polarization information, temperature information, and the like carried by light reflected from an oil spill region differ from those of light reflected from the water body, oil spill detection of the target area can be realized based on the second fused image obtained by fusing the infrared intensity image, the infrared polarization image, and the visible light image. The first fused image, obtained by fusing the infrared intensity image and the infrared polarization image, enhances the edge contour and texture detail of the oil spill region and improves the contrast between the oil spill region and the background, making the oil spill region easier to detect. In addition, because visible light contains rich detail information, fusing the visible light image with the first fused image yields more detailed oil spill information and hence a more accurate oil spill region, improving the accuracy of oil spill detection.
The above steps S101 to S104 are exemplarily described below:
in step S101, the target area may be a water surface area of any size. The area may contain both a water body and an oil spill region, or only one of the two; here, the water body is the area not covered by oil, and the oil spill region is the area covered by oil.
The infrared intensity image and the infrared polarized image can be collected by an infrared camera and an infrared polarization camera respectively, or both by the same infrared polarization camera. The infrared camera and the infrared polarization camera can also be integrated into one multi-source camera so that the infrared intensity image and the infrared polarization image are acquired simultaneously. In the infrared camera, an infrared filter arranged between the lens and the sensor passes only infrared light in a certain wave band, and each photosensitive element of the sensor corresponds to one pixel in the acquired image.
In the infrared polarization camera, four polarizers with different angles are arranged over four adjacent photosensitive elements, and each group of four elements serves as one calculation unit corresponding to one pixel in the acquired image. Because a polarizer only passes light oscillating in one direction, polarization information in four different directions in the scene is collected through the four polarizers and directly reflected in the captured picture. Such a polarizer array is also called a division-of-focal-plane device, and the optical plane corresponding to each polarizer may be called a focal plane. The focal plane angles can be preset according to actual needs; for example, the four angles may be 0°, 90°, 45°, and 135°, where the 0° reference plane can be set as required. Of course, the four focal planes may take other angles, as long as two of the four focal plane angles differ by 45 degrees and the other two also differ by 45 degrees.
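Splitting such a division-of-focal-plane raw frame into four per-angle images amounts to strided slicing of each 2x2 calculation unit. The particular 2x2 layout below (0/45 on the top row, 90/135 on the bottom) is an assumption; real sensors vary:

```python
import numpy as np

def demosaic_dofp(raw: np.ndarray):
    """Split a division-of-focal-plane raw frame into four per-angle images,
    one value per 2x2 superpixel (calculation unit). The layout is assumed."""
    return {
        0:   raw[0::2, 0::2],   # 0 degree polarizer positions
        45:  raw[0::2, 1::2],   # 45 degree
        90:  raw[1::2, 0::2],   # 90 degree
        135: raw[1::2, 1::2],   # 135 degree
    }

raw = np.array([[10, 45, 10, 45],
                [90, 135, 90, 135],
                [10, 45, 10, 45],
                [90, 135, 90, 135]], dtype=float)
angles = demosaic_dofp(raw)   # each per-angle image is 2x2
```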
The infrared intensity image shows the infrared intensity information of the target area, and the infrared polarized image shows the polarized intensity information of the target area in different directions. The visible light image is acquired by a visible light camera. As a specific implementation manner, the infrared camera, the infrared polarization camera and the visible light camera can be integrated into the same camera, and the camera is a multi-source camera, so that the simultaneous acquisition of the infrared intensity image, the infrared polarization image and the visible light image can be realized through the multi-source camera.
The infrared camera, the infrared polarization camera and the visible light camera can be carried on the same device, and image acquisition can be performed on the same region at the same time. The device may be an aircraft such as an unmanned plane, a movable cradle head, etc., which is not particularly limited in the present application. Of course, after the oil spill detection is performed on the target area based on the first fused image, the visible light image of the target area may be acquired by a visible light camera.
As an embodiment, after the infrared polarization camera collects the images of the target area at the different angles, the images may be uploaded to a DSP (Digital Signal Processing) module, and the DSP module performs the above steps S102 to S104.
In step S102, the infrared intensity image and the infrared polarized image may be fused by any feasible image fusion method.
For example, for each pixel position, the average of the pixel values at the same position in the infrared intensity image and the infrared polarized image can be taken to obtain the fused image. Of course, the fused image can also be obtained by taking the per-pixel maximum, by weighted summation of the pixel values, and the like.
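Each of these simple per-pixel strategies is a one-liner in NumPy (the 0.7/0.3 weights below are arbitrary example values):

```python
import numpy as np

a = np.array([[0.2, 0.8]])   # e.g. infrared intensity image
b = np.array([[0.6, 0.4]])   # e.g. infrared polarization image

mean_fused = (a + b) / 2.0            # per-pixel average
max_fused = np.maximum(a, b)          # per-pixel maximum
weighted_fused = 0.7 * a + 0.3 * b    # weighted sum
```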
In one possible embodiment, a polarization degree feature image and a polarization angle feature image can be obtained from the infrared polarization image, where the pixel value of each pixel in the polarization degree feature image is inversely related to the total reflected light intensity of that pixel and positively related to the sum of squares of its first and second linear polarization intensities, and the pixel value of each pixel in the polarization angle feature image is determined from the ratio of the first linear polarization intensity to the second linear polarization intensity of that pixel. The polarization degree feature image and the polarization angle feature image are fused to obtain a polarization feature fused image, and the infrared intensity image and the polarization feature fused image are fused to obtain the first fused image.
In this embodiment, the polarization degree feature image is determined from the total reflected light intensity and the sum of squares of the first and second linear polarization intensities of each pixel, while the polarization angle feature image is determined from the ratio of the first to the second linear polarization intensity. That is, the two feature images are determined from the sum of squares and the ratio of the same data, respectively, and therefore reflect different aspects of that data; the information they contain is complementary. Fusing the polarization degree feature image and the polarization angle feature image thus enhances the polarization features of the infrared polarization image while retaining more texture information, so that the polarization information of the target area is highlighted and preserved in the fused image, improving the accuracy of oil spill detection.
The polarization degree characteristic image and the polarization angle characteristic image can be obtained based on the polarization intensity information of the target area in the infrared polarization image in different directions.
In one possible embodiment, the above-mentioned polarization degree feature image and polarization angle feature image may be obtained by:
Step S121, obtaining stokes vectors of the infrared polarized image based on the infrared intensity information on the first preset angle, the second preset angle, the third preset angle and the fourth preset angle in the infrared polarized image; the Stokes vector comprises total reflected light intensity, first linear polarized light intensity and second linear polarized light intensity;
and step S122, obtaining a polarization degree characteristic image and a polarization angle characteristic image of the infrared polarization image based on the Stokes vector.
In step S121, for each pixel, the Stokes vector of the pixel may be obtained from the infrared intensity information on the four focal planes.
In polarization detection, the Stokes vector (I, Q, U, V) is commonly used to characterize polarization information, where I denotes the total reflected light intensity, Q the intensity of linearly polarized light in the horizontal direction, U the intensity of linearly polarized light at 45° to the horizontal, and V the circularly polarized component. In reflections from surface targets, V tends to be small relative to instrument error and is therefore commonly approximated as zero.
Thus, in step S121 the values of I, Q and U in the Stokes vector can be obtained for each pixel from the infrared intensity information acquired at the four focal planes. Taking focal-plane angles of 0°, 90°, 45° and 135° as an example, I, Q and U can be calculated by the following formulas:

I = I(0°) + I(90°)
Q = I(0°) − I(90°)
U = I(45°) − I(135°)

where I(0°), I(45°), I(90°) and I(135°) denote the infrared intensities at the 0°, 45°, 90° and 135° focal planes, respectively.
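The Stokes computation above can be sketched with NumPy; the function name and the synthetic one-pixel channel images below are illustrative, not from the original:

```python
import numpy as np

def stokes_from_focal_planes(i0, i45, i90, i135):
    """Per-pixel Stokes components from the four polarization channels."""
    total = i0 + i90        # I: total reflected intensity
    q = i0 - i90            # Q: horizontal linear polarization
    u = i45 - i135          # U: 45-degree linear polarization
    return total, q, u

# synthetic one-pixel example: fully horizontally polarized light
i0, i90 = np.array([[2.0]]), np.array([[0.0]])
i45, i135 = np.array([[1.0]]), np.array([[1.0]])
I, Q, U = stokes_from_focal_planes(i0, i45, i90, i135)
```

Real focal-plane polarization cameras deliver the four channels as interleaved pixel mosaics; here they are passed as four separate arrays for clarity.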
For convenience of description, the intensity Q of the linear polarization in the horizontal direction is hereinafter referred to as a first linear polarization intensity, and the intensity U of the linear polarization in the direction forming an angle of 45 ° with the horizontal direction is hereinafter referred to as a second linear polarization intensity.
In step S122, for each pixel, the polarization degree P and the polarization angle θ of the pixel can be obtained based on the total reflected light intensity, the first linear polarization intensity, and the second linear polarization intensity in the stokes vector corresponding to the pixel.
As a specific embodiment, the polarization degree P and the polarization angle θ of each pixel can be obtained by the following formulas:

P = √(Q² + U²) / I
θ = (1/2)·arctan(U / Q)

Then, the pixel value of each pixel may be set from its polarization degree and polarization angle. As a specific embodiment, the polarization degree P of each pixel may be used as the pixel value of the corresponding pixel to form the polarization degree feature image, and the polarization angle θ of each pixel may be used as the pixel value of the corresponding pixel to form the polarization angle feature image.
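A minimal NumPy sketch of these two formulas; using `arctan2` for quadrant safety and a small epsilon against division by zero are implementation choices, not stated in the original:

```python
import numpy as np

def polarization_features(total, q, u, eps=1e-12):
    """P = sqrt(Q^2 + U^2) / I and theta = 0.5 * arctan(U / Q), per pixel."""
    p = np.sqrt(q**2 + u**2) / (total + eps)
    theta = 0.5 * np.arctan2(u, q)
    return p, theta

# horizontally polarized sample pixel: P should be 1, theta should be 0
total = np.array([[2.0]])
q = np.array([[2.0]])
u = np.array([[0.0]])
p, theta = polarization_features(total, q, u)
```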
After obtaining the polarization degree characteristic image and the polarization angle characteristic image of the infrared polarization image, the polarization degree characteristic image and the polarization angle characteristic image may be fused.
As can be seen from the above formulas, the polarization degree feature image is determined from the total reflected light intensity and the sum of squares of the first and second linear polarization intensities of each pixel, while the polarization angle feature image is determined from the ratio of the first to the second linear polarization intensity; the two reflect different aspects of the same data.
The information contained in the two feature images is therefore partly redundant and partly complementary. Fusing the polarization degree feature image and the polarization angle feature image thus strengthens the polarization features of the infrared polarization image while retaining more texture information.
For example, for each target pixel in the infrared polarization image, a fused pixel value of the target pixel may be determined based on the pixel values of the pixels in the polarization degree feature image and in the polarization angle feature image, so as to obtain the polarization feature fused image.
In a possible embodiment, the fused pixel value of a target pixel is inversely related to the sum of the squared pixel values over the polarization degree feature image and the polarization angle feature image, and positively related to the pixel values of that target pixel in the two feature images.
As a specific embodiment, the polarization feature fused image PI may be obtained from the pixel values of the two feature images by the following formula:

PI(i, j) = P(i, j)·θ(i, j) / √( (1/(W·H)) · Σₓ Σᵧ ( P(x, y)² + θ(x, y)² ) )

where W and H are the total numbers of pixels per row and per column of the image, P(x, y) denotes the polarization degree of a pixel, θ(x, y) its polarization angle, and PI(i, j) the fused pixel value.
Of course, in one possible embodiment, the right-hand side of the above formula may also be multiplied by a preset coefficient. The coefficient may be set according to actual conditions, and may be, for example, 0.95, 0.97, or the like.
As another specific embodiment, the average value of the pixel values of the target pixel in the polarization degree characteristic image and the polarization angle characteristic image may be used as the fused pixel value of the target pixel, which is not particularly limited in the present application.
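The fusion rule above can be sketched as follows; since the original formula is rendered as an image in the patent, the exact normalization used here (root-mean-square energy of both feature images) is an assumption consistent with the stated relations:

```python
import numpy as np

def fuse_polarization_features(p, theta, eps=1e-12):
    """Fused value proportional to P(i,j) * theta(i,j), normalized by the
    RMS energy of both feature images (assumed form of the patent formula)."""
    n_pixels = p.size  # W * H
    norm = np.sqrt((p**2 + theta**2).sum() / n_pixels)
    return (p * theta) / (norm + eps)

# uniform feature images: every fused pixel equals 1/sqrt(2)
p = np.ones((2, 2))
theta = np.ones((2, 2))
pi_img = fuse_polarization_features(p, theta)
```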
Since the polarization degree P and the polarization angle θ have different value ranges, in one possible embodiment the polarization degree and polarization angle of each pixel may be normalized. Illustratively, they may be min-max normalized by the following formulas:

P′ = (P − P_min) / (P_max − P_min)
θ′ = (θ − θ_min) / (θ_max − θ_min)

In another possible embodiment, the polarization degree and polarization angle of each pixel may instead be normalized by dividing by their respective maximum absolute values:

P′ = P / max|P|
θ′ = θ / max|θ|
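Both normalization variants can be sketched as below; the second variant is an assumption, since the original formula is not reproduced here:

```python
import numpy as np

def min_max_normalize(x):
    """Map an image into [0, 1]; a constant image maps to all zeros."""
    rng = x.max() - x.min()
    return (x - x.min()) / rng if rng > 0 else np.zeros_like(x)

def max_abs_normalize(x, eps=1e-12):
    """Scale an image by its maximum absolute value (assumed variant)."""
    return x / (np.abs(x).max() + eps)

x = np.array([[0.0, 2.0], [4.0, 8.0]])
y = min_max_normalize(x)
z = max_abs_normalize(x)
```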
any feasible image fusion method may be used in fusing the polarization feature fusion image and the infrared intensity image. By way of example, a laplacian pyramid algorithm, a max/min value based image fusion method, or a pixel weighted average based image fusion method, etc. may be used.
In one possible embodiment, a weighted fusion algorithm may be used. Because the calculation complexity of the weighted fusion algorithm is low, the speed of image fusion can be accelerated to a certain extent.
By way of example, the infrared intensity image and the polarization feature fused image may be fused by the following formula:

IF = α·I + (1 − α)·PI

where I denotes the infrared intensity image, PI the polarization feature fused image and IF the first fused image; α is a preset weight that can be set according to actual needs, for example 0.8 or 0.7.
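The weighted fusion is a one-liner in NumPy; α = 0.8 below matches the example values in the text:

```python
import numpy as np

def weighted_fuse(intensity_img, pi_img, alpha=0.8):
    """First fused image IF = alpha * I + (1 - alpha) * PI, per pixel."""
    return alpha * intensity_img + (1 - alpha) * pi_img

intensity_img = np.full((2, 2), 1.0)
pi_img = np.zeros((2, 2))
fused = weighted_fuse(intensity_img, pi_img)
```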
After the first fused image is obtained, oil spill detection can be performed on the target area based on it. For example, since the color difference between an oil spill region and a water region is generally large, that is, the difference between the corresponding pixel values is large, edge detection can be performed on the first fused image to determine whether it contains a region with a large pixel-value change. If the target area is all water, its pixel values will not vary greatly; accordingly, if no region with a large pixel-value change is found, it is determined that no oil spill has occurred in the target area, and if such a region is found, it is determined that an oil spill has occurred. As another specific implementation, because the temperature and polarization characteristics of an oil spill region differ from those of a normal water surface, and the pixel values of the first fused image reflect those characteristics, the pixel values of oil spill pixels and of water pixels fall into different value intervals. Whether an oil spill has occurred in the target area can therefore be judged by checking whether the first fused image contains a region whose pixel values are greater than, or smaller than, a preset pixel threshold.
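The interval-threshold check described above can be sketched as follows; the threshold values are illustrative only:

```python
import numpy as np

def detect_spill_by_threshold(fused, low=0.2, high=0.8):
    """Flag pixels whose fused values fall outside the interval occupied
    by normal water; any flagged pixel signals a potential spill."""
    mask = (fused < low) | (fused > high)
    return mask, bool(mask.any())

# one anomalously bright pixel among uniform water pixels
fused = np.array([[0.5, 0.5], [0.5, 0.9]])
mask, spill_detected = detect_spill_by_threshold(fused)
```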
The infrared intensity image and the infrared polarization image contain texture information, temperature information and the like of the target area, so oil spill detection can already be performed based on the image information contained in the first fused image. However, these images contain far less optical detail than a visible light image. Therefore, to further improve the accuracy of oil spill detection in the target area, the visible light image is also combined on the basis of the first fused image, that is, the aforementioned step S103 is executed.
In step S103, the first fused image and the visible light image may be fused by any feasible image fusion method.
For example, for each pixel position, the average of the pixel values at the same position in the first fused image and the visible light image may be calculated to obtain a fused image. Alternatively, the fused image can be obtained by selecting the per-pixel maximum, by weighted summation of the pixel values, and the like.
In one possible embodiment, the first fused image and the visible light image may be fused using a modulated polarization image fusion method based on wavelet transform.
As a specific embodiment, the first fused image and the visible light image may be fused by:
step S221, performing wavelet decomposition on the first fusion image and the visible light image respectively to obtain a first low-frequency coefficient and a first high-frequency coefficient of the first fusion image, and to obtain a second low-frequency coefficient and a second high-frequency coefficient of the visible light image;
step S222, fusing the first low-frequency coefficient and the second low-frequency coefficient to obtain a low-frequency fusion coefficient;
step S223, fusing the first high-frequency coefficient and the second high-frequency coefficient to obtain a high-frequency fusion coefficient;
and step S224, performing wavelet reconstruction on the visible light image and the first fused image based on the low-frequency fusion coefficient and the high-frequency fusion coefficient to obtain a second fused image.
In step S221, the first fused image and the visible light image may be wavelet-transformed by any feasible wavelet transform algorithm. The wavelet transform expands a given signal in a family of wavelet functions, i.e., the signal is represented as a linear combination of wavelet functions at different scales and time shifts, and the coefficient of each term is called a wavelet coefficient.
As a specific embodiment, the Mallat wavelet transform algorithm (a fast algorithm for tower multi-resolution analysis and reconstruction) may be used to perform N-layer wavelet decomposition on the images V and IF, and the tower structure of the wavelet transform is shown in fig. 2.
The Mallat algorithm first performs a one-dimensional wavelet transform on each row of the image, i.e., filters the image with a preset wavelet function to obtain a low-frequency part L and a high-frequency part H. A one-dimensional wavelet transform is then performed on each column of the L and H parts, yielding the low-low part LL1, the high-low part HL1, the low-high part LH1 and the high-high part HH1. This procedure constitutes one layer of wavelet decomposition. A two-layer decomposition repeats the above steps on the LL1 part obtained in the previous step, yielding LL2, LH2, HL2 and HH2 within LL1; a three-layer decomposition repeats them on LL2, yielding LL3, LH3, HL3 and HH3 within LL2. In general, an N-layer decomposition performs wavelet decomposition on the LL(N−1) part obtained in the previous step. The value of N may be selected according to practical needs, for example 2 or 3, which is not particularly limited in the present application.
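A one-level Mallat-style decomposition can be sketched with the Haar filter pair, the simplest choice; the patent does not fix a particular wavelet, and even image sides are assumed:

```python
import numpy as np

def haar_decompose(img):
    """One layer: filter rows into low (L) / high (H) halves, then filter
    the columns of each, yielding the LL1, HL1, LH1, HH1 sub-bands."""
    low = (img[:, 0::2] + img[:, 1::2]) / 2.0   # row low-pass (pair mean)
    high = (img[:, 0::2] - img[:, 1::2]) / 2.0  # row high-pass (pair diff)
    ll = (low[0::2, :] + low[1::2, :]) / 2.0
    lh = (low[0::2, :] - low[1::2, :]) / 2.0
    hl = (high[0::2, :] + high[1::2, :]) / 2.0
    hh = (high[0::2, :] - high[1::2, :]) / 2.0
    return ll, hl, lh, hh

# a constant image has all its energy in LL; the detail sub-bands are zero
img = np.array([[1.0, 1.0], [1.0, 1.0]])
ll, hl, lh, hh = haar_decompose(img)
```

In practice a library such as PyWavelets would be used for multi-level decomposition and reconstruction; the explicit version here only illustrates the row-then-column filtering structure.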
In one possible embodiment, one layer of wavelet decomposition is applied to the visible light image V and the first fused image IF using the Mallat wavelet transform algorithm, yielding eight sub-band coefficients: the low-frequency coefficients LL_V and LL_IF; the high-frequency coefficients HL_V, LH_V and HH_V of the visible light image; and the high-frequency coefficients HL_IF, LH_IF and HH_IF of the first fused image. For convenience of description, the low-frequency coefficient of the first fused image IF is referred to herein as the first low-frequency coefficient, the low-frequency coefficient of the visible light image V as the second low-frequency coefficient, the high-frequency coefficients of IF as the first high-frequency coefficients, and the high-frequency coefficients of V as the second high-frequency coefficients.
Because the low-frequency image reflects most of the information in the original image, a modulation fusion method can be adopted when fusing the low-frequency part, i.e., the low-frequency coefficients are directly superposed, so as to keep the information in the image as complete as possible.
Illustratively, for each pixel the first low-frequency coefficient C_IF^LL and the second low-frequency coefficient C_V^LL can be fused into the low-frequency fusion coefficient C_F^LL by the following formula:

C_F^LL(x, y) = n · ( C_V^LL(x, y) + C_IF^LL(x, y) )

where the value of n is related to the brightness of the source images and lies in the range 0 to 1, for example 0.7, 0.68 or 0.71.
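A sketch of the low-frequency rule under the reconstruction above (direct superposition scaled by the brightness-related factor n); the exact form of the patent's formula is an assumption:

```python
import numpy as np

def fuse_low_frequency(ll_visible, ll_infrared, n=0.7):
    """Modulated superposition of the two low-frequency sub-bands."""
    return n * (ll_visible + ll_infrared)

low_fused = fuse_low_frequency(np.array([[1.0]]), np.array([[1.0]]), n=0.5)
```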
For the high-frequency part, since the high-frequency coefficients reflect the edge detail features of the original images, a coefficient-weighted fusion rule can be applied along the horizontal, vertical and diagonal directions of the wavelet pyramid structure.
As a specific embodiment, the above-mentioned first high-frequency coefficient and second high-frequency coefficient may be fused by:
determining a correlation between the first high frequency coefficient and the second high frequency coefficient; if the correlation is smaller than a preset correlation threshold, carrying out weighted fusion on the first high-frequency coefficient and the second high-frequency coefficient according to a preset weight to obtain a high-frequency fusion coefficient; and if the correlation degree is not smaller than a preset correlation degree threshold value, taking the larger value of the first high-frequency coefficient and the second high-frequency coefficient as a high-frequency fusion coefficient.
The correlation between the first and second high-frequency coefficients can be determined by the following formula:

C(x, y) = 2 · D_V(x, y) · D_IF(x, y) / ( D_V(x, y)² + D_IF(x, y)² )

where D_IF denotes the first high-frequency coefficient, D_V the second high-frequency coefficient, and C(x, y) the correlation between them.
As described above, the first and second high-frequency coefficients each comprise several sub-band coefficients. When calculating the correlation by the above formula, the first and second high-frequency coefficients must be taken at corresponding positions in the wavelet pyramid structure; for example, when D_IF is the HL1 sub-band of the first fused image, D_V is the HL1 sub-band of the visible light image.
Of course, the correlation between the first and second high-frequency coefficients may also be calculated by other feasible methods, for example by measuring the Euclidean distance between the coefficients, which is not particularly limited in the present application.
If the correlation between the first and second high-frequency coefficients is not smaller than the preset correlation threshold T, the image detail information reflected by the two coefficients is similar, so the coefficient with the larger magnitude can be selected as the high-frequency fusion coefficient to highlight the image detail as much as possible. The correlation threshold may be set according to actual needs, for example 0.5 or 0.6.
Illustratively, the high-frequency fusion coefficient can be obtained by the following formula:

D_F(x, y) = D_V(x, y), if |D_V(x, y)| ≥ |D_IF(x, y)|
D_F(x, y) = D_IF(x, y), if |D_V(x, y)| < |D_IF(x, y)|

That is, if the visible light high-frequency coefficient is not smaller in magnitude than the high-frequency coefficient of the first fused image, the visible light high-frequency coefficient is taken as the high-frequency fusion coefficient; otherwise the high-frequency coefficient of the first fused image is taken.
When C(x, y) is smaller than T, the image detail information contained in the first and second high-frequency coefficients differs considerably, and to keep the image information as complete as possible the two coefficients can be fused by weighting to obtain the high-frequency fusion coefficient.
Illustratively, the high-frequency fusion coefficient can be obtained by the following formula:

D_F(x, y) = w₁ · D_V(x, y) + w₂ · D_IF(x, y)

where w₁ and w₂ are weighting coefficients that can be obtained by the following formulas:

w₁ = 1/2 − (1/2) · ( 1 − C(x, y) ) / ( 1 − T_m )
w₂ = 1 − w₁

where T_m is the preset matching threshold, for example 0.6 or 0.7.
The high-frequency coefficients retain most of the detail information of the images. When the correlation between the first and second high-frequency coefficients is high, the detail information contained in the visible light image and the first fused image is similar, so the coefficient with the larger magnitude can be considered to retain the more complete detail features; it can simply be selected as the high-frequency fusion coefficient, without weighted fusion. Conversely, when the correlation is low, the detail information contained in the two images differs greatly, so neither coefficient alone retains complete detail features; the two coefficients are therefore combined by weighted fusion to obtain the high-frequency fusion coefficient. By taking the correlation between the first and second high-frequency coefficients into account during fusion, the edge detail features are kept as prominent and complete as possible, the contrast between the water surface and the oil spill region in the resulting fused image becomes more obvious, and the accuracy of water surface oil spill detection is improved.
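The whole high-frequency rule can be sketched as follows. The pointwise match measure and the Burt-style weights (clipped into a valid range) follow the reconstruction above and are assumptions, as are the threshold values:

```python
import numpy as np

def fuse_high_frequency(d_visible, d_infrared, t=0.6, t_match=0.7):
    """Select the larger-magnitude coefficient where the two sub-bands
    correlate well; otherwise combine them by clipped Burt-style weights."""
    eps = 1e-12
    # pointwise correlation (match) measure, in [-1, 1]
    c = 2.0 * d_visible * d_infrared / (d_visible**2 + d_infrared**2 + eps)
    w_min = np.clip(0.5 - 0.5 * (1.0 - c) / (1.0 - t_match), 0.0, 0.5)
    w_max = 1.0 - w_min
    visible_larger = np.abs(d_visible) >= np.abs(d_infrared)
    selected = np.where(visible_larger, d_visible, d_infrared)   # pick larger
    weighted = np.where(visible_larger,
                        w_max * d_visible + w_min * d_infrared,
                        w_min * d_visible + w_max * d_infrared)
    return np.where(c >= t, selected, weighted)

same = fuse_high_frequency(np.array([[1.0]]), np.array([[1.0]]))
mixed = fuse_high_frequency(np.array([[1.0]]), np.array([[0.2]]))
```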
After the low-frequency fusion coefficient and the high-frequency fusion coefficient are obtained, wavelet reconstruction based on them can be performed on the visible light image and the first fused image to obtain the second fused image. How to perform oil spill detection on the target area based on a fused image has been described above and will not be repeated here.
When oil spill detection is performed on the target area in practice, the area usually needs to be detected periodically rather than only once. If every periodic detection were based on the second fused image, each detection round would have to fuse the infrared intensity image with the infrared polarization image to obtain the first fused image and then fuse the first fused image with the visible light image to obtain the second fused image, so the computational load of periodic oil spill detection would be large.
Based on this, in one possible embodiment, the target region may be subjected to potential oil spill region detection based on the first fused image; and if the potential oil spilling region exists in the target region, detecting the oil spilling of the target region based on the second fusion image.
With this embodiment, during periodic oil spill detection the target area is first screened once using the first fused image to detect whether a potential oil spill region exists. If no potential oil spill region is detected, the first fused image is not fused with the visible light image, so the second fused image does not have to be computed in every detection round, which reduces the computational load of periodic detection. If a potential oil spill region is detected, the first fused image and the visible light image are fused into the second fused image and the target area is judged a second time based on it, which improves the accuracy of the detection result. This embodiment therefore reduces the computation required for periodic oil spill detection while maintaining its accuracy, balancing accuracy against computational load.
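The two-stage screening logic can be sketched independently of any particular fusion or detector; both callables below are illustrative stand-ins:

```python
def two_stage_detection(first_fused, fuse_with_visible, detect):
    """Run the cheap first discrimination on the IR/polarization fusion;
    only on a hit compute the costlier visible-light fusion and re-check."""
    if not detect(first_fused):       # first discrimination
        return False                  # no potential oil spill region
    second_fused = fuse_with_visible(first_fused)
    return detect(second_fused)       # second discrimination

# toy detector: any value above 0.8 counts as a potential spill
calls = []
def fuse_with_visible(img):
    calls.append("fuse")
    return [0.5 * v for v in img]

hit = two_stage_detection([0.5, 0.9], fuse_with_visible,
                          lambda img: max(img) > 0.8)
miss = two_stage_detection([0.1, 0.2], fuse_with_visible,
                           lambda img: max(img) > 0.8)
```

Note that the expensive fusion runs only once across the two calls: the second call is rejected by the first discrimination without ever invoking `fuse_with_visible`.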
As shown in fig. 3, fig. 3 is a flowchart of a water surface oil spill detection method according to an embodiment of the present application, where the following steps may be executed by the DSP module, and may specifically include:
Step S301, visible light input and infrared focal plane polarization input are obtained.
In this step, the images acquired by the visible light camera and the infrared polarization camera are obtained. The two cameras are fixed on the same pan-tilt head so that they capture the target area from as close to the same angle as possible, reducing acquisition errors caused by the mounting positions of the equipment. The pan-tilt head can rotate continuously and collect images of the target area at a preset period so as to monitor it in real time. The preset period may be set according to actual needs, for example 5 minutes or 10 minutes.
Step S302, acquiring an infrared intensity image, a polarization degree characteristic image and a polarization angle characteristic image.
Step S303, carrying out first discrimination based on the infrared intensity image and the polarization characteristic fusion image, determining whether an oil spilling region exists, and if so, executing step S304; if not, step S306 is performed.
In the step, after the polarization degree feature image and the polarization angle feature image are fused to obtain a polarization feature fused image, the polarization feature fused image is fused with the infrared intensity image to obtain an infrared polarization fused image, and the first discrimination is performed based on the fused image.
Step S304, suspending rotation of the cradle head, carrying out secondary judgment based on fusion of the infrared polarization fusion image and the visible light image, determining whether oil spill occurs, executing step S305 if the oil spill occurs, and executing step S306 if the oil spill does not occur.
As a specific embodiment, a control signal may be sent to the pan-tilt control module to suspend rotation of the pan-tilt.
Step S305, uploading an alarm picture to perform manual determination, and executing step S306 when the manual determination is completed or the manual confirmation is overtime.
The alarm screen may include a visible light image, an infrared intensity image, a polarization degree characteristic image, a polarization angle characteristic image, and the like.
And step S306, the cradle head resumes cruising.
In summary, after the embodiments of the present application are applied, the polarization degree and polarization angle information is represented using the Stokes vector, and a set of image fusion rules is designed according to the difference in polarization information between an oil spill region and the water body and the ability of polarization images to suppress background noise. The fused image thus suppresses sea-surface ripple noise well while enhancing the edge contours and texture details of the oil spill region, raising the contrast between the oil spill region and the background and making the oil spill region easier to detect. When a suspected oil spill region is found, a second discrimination process is started: exploiting the rich detail information of visible light, the visible light image is fused with the infrared polarization fused image to decide whether an alarm is needed and to obtain more detailed and accurate oil spill information. By acquiring visible light, polarization angle, polarization degree and infrared intensity images, these mutually complementary sources are combined for sea-surface oil spill monitoring, integrating data from different origins and providing richer sea-surface monitoring reference information while maintaining real-time performance.
In the technical scheme of the application, the related operations of acquiring, storing, using, processing, transmitting, providing, disclosing and the like of the image information are all performed under the condition that the user authorization is obtained.
The embodiment of the application also provides a device for detecting the oil spill on the water surface, as shown in fig. 4, the device comprises:
an image acquisition module 401, configured to acquire an infrared intensity image, an infrared polarization image, and a visible light image acquired for a target area, respectively;
a first fused image acquisition module 402, configured to fuse the infrared intensity image and the infrared polarized image to obtain a first fused image;
a second fused image obtaining module 403, configured to fuse the first fused image and the visible light image to obtain a second fused image;
an oil spill detection module 404, configured to perform oil spill detection on the target area based on the second fused image.
In one possible embodiment, the fusing the infrared intensity image and the infrared polarized image to obtain a first fused image includes:
obtaining a polarization degree characteristic image and a polarization angle characteristic image of the infrared polarization image based on the infrared polarization image; wherein the pixel value of each pixel in the polarization degree characteristic image is inversely related to the total reflected light intensity of the pixel and positively related to the sum of the squares of the first linear polarized light intensity and the second linear polarized light intensity of the pixel; and the pixel value of each pixel in the polarization angle characteristic image is determined based on the ratio of the first linear polarized light intensity to the second linear polarized light intensity of the pixel;
fusing the polarization degree characteristic image and the polarization angle characteristic image to obtain a polarization characteristic fused image;
and fusing the infrared intensity image and the polarization characteristic fusion image to obtain a first fusion image.
In a possible embodiment, the fusing the polarization degree feature image and the polarization angle feature image to obtain a polarization feature fused image includes:
determining fused pixel values of all target pixels in the infrared polarized image based on pixel values of all pixels in the polarization degree characteristic image and pixel values of all pixels in the polarization angle characteristic image to obtain a polarization characteristic fused image; the fused pixel value of the target pixel is inversely related to the sum of square values of all pixels in the polarization degree characteristic image and the sum of square values of all pixels in the polarization angle characteristic image, and is positively related to the sum of square values of all pixels in the polarization degree characteristic image, the sum of square values of all pixels in the polarization angle characteristic image, the pixel value of the target pixel in the polarization degree characteristic image and the pixel value of the target pixel in the polarization angle characteristic image.
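Read literally, the rule above makes each fused pixel a global-energy-weighted combination of the two feature images. The sketch below shows one rule consistent with that description; the function name, the `eps` guard, and the exact weighting are assumptions, since the patent does not give a formula:

```python
import numpy as np

def fuse_polarization_features(dop_img, aop_img, eps=1e-12):
    """Energy-weighted fusion of the polarization degree (DoP) and
    polarization angle (AoP) feature images.

    Sketch only: each feature image is weighted by its global energy
    (the sum of its squared pixel values), so a fused pixel is positively
    related to the energy-pixel products and inversely related to the
    total energy in the denominator. The patent's exact rule may differ.
    """
    e_dop = np.sum(dop_img ** 2)   # sum of squared pixel values, DoP image
    e_aop = np.sum(aop_img ** 2)   # sum of squared pixel values, AoP image
    return (e_dop * dop_img + e_aop * aop_img) / (e_dop + e_aop + eps)
```

Under this rule the fused value always lies between the two input values, and two identical inputs fuse to themselves.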
In a possible embodiment, the infrared polarized image is an infrared polarized image acquired by taking a first preset angle, a second preset angle, a third preset angle and a fourth preset angle as focal plane angles;
the obtaining the polarization degree characteristic image and the polarization angle characteristic image of the infrared polarization image based on the infrared polarization image comprises the following steps:
obtaining a Stokes vector of the infrared polarized image based on infrared intensity information on the first preset angle, the second preset angle, the third preset angle and the fourth preset angle in the infrared polarized image; the Stokes vector comprises total reflected light intensity, first linear polarized light intensity and second linear polarized light intensity;
and obtaining a polarization degree characteristic image and a polarization angle characteristic image of the infrared polarization image based on the Stokes vector.
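The Stokes-based computation above can be sketched as follows. The four preset angles are not specified in the text, so the conventional division-of-focal-plane layout of 0°/45°/90°/135° is assumed here, as are the function names and the small `eps` stabilizers:

```python
import numpy as np

def stokes_from_dofp(i0, i45, i90, i135):
    """Stokes components from four focal-plane polarization channels.

    Assumption: the four preset angles are 0/45/90/135 degrees, the usual
    division-of-focal-plane arrangement.
    """
    s0 = 0.5 * (i0 + i45 + i90 + i135)  # total reflected light intensity
    s1 = i0 - i90                       # first linear polarized light intensity
    s2 = i45 - i135                     # second linear polarized light intensity
    return s0, s1, s2

def dop_aop(s0, s1, s2, eps=1e-6):
    # Polarization degree: positively related to the (root of the) sum of the
    # squares of s1 and s2, inversely related to the total intensity s0.
    dop = np.sqrt(s1 ** 2 + s2 ** 2) / (s0 + eps)
    # Polarization angle: determined by the ratio of the two linear components.
    aop = 0.5 * np.arctan2(s2, s1 + eps)
    return dop, aop
```

For fully horizontally polarized light (all intensity in the 0° channel), this yields a polarization degree close to 1 and a polarization angle of 0.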
In a possible embodiment, the fusing the first fused image and the visible light image to obtain a second fused image includes:
performing wavelet decomposition on the first fusion image and the visible light image to obtain a first low-frequency coefficient and a first high-frequency coefficient of the first fusion image, and obtaining a second low-frequency coefficient and a second high-frequency coefficient of the visible light image;
fusing the first low-frequency coefficient and the second low-frequency coefficient to obtain a low-frequency fusion coefficient;
fusing the first high-frequency coefficient and the second high-frequency coefficient to obtain a high-frequency fusion coefficient;
and carrying out wavelet reconstruction on the visible light image and the first fusion image based on the low-frequency fusion coefficient and the high-frequency fusion coefficient to obtain a second fusion image.
In a possible embodiment, the fusing the first high-frequency coefficient and the second high-frequency coefficient to obtain a high-frequency fusion coefficient includes:
determining a correlation between the first high frequency coefficient and the second high frequency coefficient;
if the correlation is smaller than a preset correlation threshold, carrying out weighted fusion on the first high-frequency coefficient and the second high-frequency coefficient according to a preset weight to obtain a high-frequency fusion coefficient;
and if the correlation is not smaller than the preset correlation threshold, taking the larger of the first high-frequency coefficient and the second high-frequency coefficient as the high-frequency fusion coefficient.
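The wavelet-fusion steps (decomposition, separate low- and high-frequency fusion, reconstruction) together with the correlation rule above can be sketched with a single-level Haar transform. The wavelet type, decomposition level, correlation threshold, and weights are not stated in the text, so the values below are illustrative only:

```python
import numpy as np

def haar_dwt2(img):
    """Single-level 2-D Haar decomposition (even-sized image assumed)."""
    a, b = img[0::2, 0::2], img[0::2, 1::2]
    c, d = img[1::2, 0::2], img[1::2, 1::2]
    ll = (a + b + c + d) / 4.0          # low-frequency approximation
    lh = (a - b + c - d) / 4.0          # horizontal detail
    hl = (a + b - c - d) / 4.0          # vertical detail
    hh = (a - b - c + d) / 4.0          # diagonal detail
    return ll, (lh, hl, hh)

def haar_idwt2(ll, highs):
    """Inverse of haar_dwt2 (exact reconstruction)."""
    lh, hl, hh = highs
    out = np.empty((2 * ll.shape[0], 2 * ll.shape[1]))
    out[0::2, 0::2] = ll + lh + hl + hh
    out[0::2, 1::2] = ll - lh + hl - hh
    out[1::2, 0::2] = ll + lh - hl - hh
    out[1::2, 1::2] = ll - lh - hl + hh
    return out

def wavelet_fuse(img_a, img_b, corr_thresh=0.7, weight_a=0.5):
    """Fuse two registered images in the wavelet domain.

    Low-frequency coefficients are averaged (one common choice; the patent's
    low-frequency rule is not detailed here). Each high-frequency sub-band is
    fused with the correlation rule described above: a weighted average when
    the correlation is below the threshold, otherwise the coefficient with
    the larger magnitude. Threshold and weight values are illustrative.
    """
    ll_a, highs_a = haar_dwt2(img_a)
    ll_b, highs_b = haar_dwt2(img_b)
    ll_f = 0.5 * (ll_a + ll_b)
    highs_f = []
    for ha, hb in zip(highs_a, highs_b):
        r = np.corrcoef(ha.ravel(), hb.ravel())[0, 1]
        if np.isnan(r) or r < corr_thresh:   # weakly correlated: weighted average
            highs_f.append(weight_a * ha + (1.0 - weight_a) * hb)
        else:                                # strongly correlated: keep stronger detail
            highs_f.append(np.where(np.abs(ha) >= np.abs(hb), ha, hb))
    return haar_idwt2(ll_f, tuple(highs_f))
```

The Haar pair reconstructs exactly, so fusing an image with itself returns the image unchanged, which is a convenient sanity check on the rule set.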
In one possible embodiment, the apparatus further comprises:
the potential oil spill region detection module is used for detecting the potential oil spill region of the target region based on the first fusion image;
The oil spill detection module performs oil spill detection on the target area based on the second fusion image, and includes:
and if the potential oil spilling region exists in the target region, detecting the oil spilling of the target region based on the second fusion image.
The embodiment of the application also provides a water surface oil spill detection system, as shown in fig. 5, which can comprise: a camera 501 and a digital signal processing module 502;
the camera 501 is configured to acquire, for a target area, an infrared intensity image, an infrared polarization image, and a visible light image, and send the acquired images to the digital signal processing module;
the digital signal processing module 502 is configured to fuse the infrared intensity image and the infrared polarized image to obtain a first fused image; detect a potential oil spill area of the target area based on the first fused image; if it is detected that a potential oil spill area exists in the target area, control the pan-tilt head to stop rotating, fuse the first fused image and the visible light image to obtain a second fused image, and perform oil spill detection on the target area based on the second fused image; if an oil spill area exists in the target area, upload an alarm picture; and if it is detected that no oil spill area exists in the target area, control the pan-tilt head to resume cruising.
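The control flow of the digital signal processing module can be summarized as below. The patent defines no concrete software interfaces, so the callable parameters standing in for the fusion, detection, pan-tilt, and upload steps are purely illustrative:

```python
def monitoring_step(ir_intensity, ir_polar, visible,
                    fuse_ir, detect_potential, fuse_visible, detect_oil,
                    stop_pan_tilt, resume_cruise, upload_alarm):
    """One cruise iteration: coarse detection, then secondary discrimination.

    All callables are hypothetical stand-ins for the module's internal steps.
    Returns True if a confirmed oil spill was reported.
    """
    first = fuse_ir(ir_intensity, ir_polar)   # first fused image (IR + polarization)
    if not detect_potential(first):
        return False                           # no suspicious region: keep cruising
    stop_pan_tilt()                            # hold the view for a closer look
    second = fuse_visible(first, visible)      # second fused image (+ visible detail)
    if detect_oil(second):
        upload_alarm()                         # confirmed spill: upload alarm picture
        return True
    resume_cruise()                            # false alarm: resume cruising
    return False
```

This makes the two-stage logic explicit: the cheaper infrared-polarization fusion gates the more detailed visible-light discrimination.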
The embodiment of the application also provides an electronic device, as shown in fig. 6, including:
a memory 601 for storing a computer program;
a processor 602, configured to execute a program stored in the memory 601, and implement the following steps:
respectively acquiring an infrared intensity image, an infrared polarization image and a visible light image which are acquired for a target area;
fusing the infrared intensity image and the infrared polarized image to obtain a first fused image;
fusing the first fused image and the visible light image to obtain a second fused image;
and detecting oil spill of the target area based on the second fusion image.
The electronic device may further comprise a communication bus and/or a communication interface; the processor 602, the communication interface, and the memory 601 communicate with each other through the communication bus.
The communication bus mentioned above for the electronic device may be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus, etc. The communication bus may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one bold line is shown in the figure, but this does not mean that there is only one bus or only one type of bus.
The communication interface is used for communication between the electronic device and other devices.
The memory may include Random Access Memory (RAM) or Non-Volatile Memory (NVM), such as at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), etc.; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In yet another embodiment of the present application, a computer readable storage medium is provided, in which a computer program is stored, which when executed by a processor, implements the steps of any of the above-mentioned water surface oil spill detection methods.
In yet another embodiment of the present application, a computer program product containing instructions is also provided; when run on a computer, the instructions cause the computer to perform any of the water surface oil spill detection methods of the above embodiments.
In the above embodiments, the implementation may be realized in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example by wired means (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless means (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), a solid state disk (SSD), etc.
It should be noted that relational terms such as "first" and "second" are used herein solely to distinguish one entity or action from another, and do not necessarily require or imply any actual relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
In this specification, the embodiments are described in a related manner; for identical or similar parts, the embodiments may refer to one another, and each embodiment focuses on its differences from the others. In particular, the apparatus, electronic device, computer-readable storage medium, and program product embodiments are described relatively briefly, since they are substantially similar to the method embodiments; for relevant details, refer to the corresponding parts of the method embodiment description.
The foregoing description is only of the preferred embodiments of the present application and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application are included in the protection scope of the present application.
Claims (11)
1. A method for detecting water surface oil spill, the method comprising:
respectively acquiring an infrared intensity image, an infrared polarization image and a visible light image acquired for a target area, wherein the infrared polarization image is acquired by an infrared polarization camera;
fusing the infrared intensity image and the infrared polarized image to obtain a first fused image;
fusing the first fused image and the visible light image to obtain a second fused image;
detecting oil spill of the target area based on the second fusion image;
the fusing of the infrared intensity image and the infrared polarized image to obtain a first fused image comprises the following steps:
obtaining a polarization degree characteristic image and a polarization angle characteristic image of the infrared polarization image based on the infrared polarization image; wherein, the pixel value of each pixel in the polarization degree characteristic image is inversely related to the total intensity of the reflected light of the pixel and is positively related to the square sum of the first linear polarization intensity and the second linear polarization intensity of the pixel; the pixel value of each pixel in the polarization angle characteristic image is determined based on the ratio of the first linear polarized light intensity to the second linear polarized light intensity of the pixel;
fusing the polarization degree characteristic image and the polarization angle characteristic image to obtain a polarization characteristic fused image;
fusing the infrared intensity image and the polarization characteristic fusion image to obtain a first fusion image;
the step of fusing the polarization degree characteristic image and the polarization angle characteristic image to obtain a polarization characteristic fused image comprises the following steps:
determining fused pixel values of all target pixels in the infrared polarized image based on pixel values of all pixels in the polarization degree characteristic image and pixel values of all pixels in the polarization angle characteristic image to obtain a polarization characteristic fused image; the fused pixel value of the target pixel is inversely related to the sum of square values of all pixels in the polarization degree characteristic image and the sum of square values of all pixels in the polarization angle characteristic image, and is positively related to the sum of square values of all pixels in the polarization degree characteristic image, the sum of square values of all pixels in the polarization angle characteristic image, the pixel value of the target pixel in the polarization degree characteristic image and the pixel value of the target pixel in the polarization angle characteristic image;
The fusing the first fused image and the visible light image to obtain a second fused image comprises the following steps:
performing wavelet decomposition on the first fusion image and the visible light image to obtain a first low-frequency coefficient and a first high-frequency coefficient of the first fusion image, and obtaining a second low-frequency coefficient and a second high-frequency coefficient of the visible light image;
fusing the first low-frequency coefficient and the second low-frequency coefficient to obtain a low-frequency fusion coefficient;
fusing the first high-frequency coefficient and the second high-frequency coefficient to obtain a high-frequency fusion coefficient;
and carrying out wavelet reconstruction on the visible light image and the first fusion image based on the low-frequency fusion coefficient and the high-frequency fusion coefficient to obtain a second fusion image.
2. The method of claim 1, wherein the infrared polarized image is an infrared polarized image acquired with a split focal plane angle of a first preset angle, a second preset angle, a third preset angle, and a fourth preset angle;
the obtaining the polarization degree characteristic image and the polarization angle characteristic image of the infrared polarization image based on the infrared polarization image comprises the following steps:
obtaining a Stokes vector of the infrared polarized image based on infrared intensity information on the first preset angle, the second preset angle, the third preset angle and the fourth preset angle in the infrared polarized image; the Stokes vector comprises total reflected light intensity, first linear polarized light intensity and second linear polarized light intensity;
and obtaining a polarization degree characteristic image and a polarization angle characteristic image of the infrared polarization image based on the Stokes vector.
3. The method of claim 1, wherein the fusing the first high frequency coefficient and the second high frequency coefficient to obtain a high frequency fused coefficient comprises:
determining a correlation between the first high frequency coefficient and the second high frequency coefficient;
if the correlation is smaller than a preset correlation threshold, carrying out weighted fusion on the first high-frequency coefficient and the second high-frequency coefficient according to a preset weight to obtain a high-frequency fusion coefficient;
and if the correlation is not smaller than the preset correlation threshold, taking the larger of the first high-frequency coefficient and the second high-frequency coefficient as the high-frequency fusion coefficient.
4. The method according to claim 1, wherein the method further comprises:
detecting a potential oil spill area of the target area based on the first fusion image;
the detecting the oil spill of the target area based on the second fusion image includes:
and if the potential oil spilling region exists in the target region, detecting the oil spilling of the target region based on the second fusion image.
5. A water surface oil spill detection device, the device comprising:
an image acquisition module, configured to respectively acquire an infrared intensity image, an infrared polarization image and a visible light image for a target area, wherein the infrared polarization image is acquired by an infrared polarization camera;
the first fusion image acquisition module is used for fusing the infrared intensity image and the infrared polarized image to obtain a first fusion image;
the second fusion image acquisition module is used for fusing the first fusion image and the visible light image to obtain a second fusion image;
the oil spill detection module is used for detecting oil spill of the target area based on the second fusion image;
The fusing of the infrared intensity image and the infrared polarized image to obtain a first fused image comprises the following steps:
obtaining a polarization degree characteristic image and a polarization angle characteristic image of the infrared polarization image based on the infrared polarization image; wherein, the pixel value of each pixel in the polarization degree characteristic image is inversely related to the total intensity of the reflected light of the pixel and is positively related to the square sum of the first linear polarization intensity and the second linear polarization intensity of the pixel; the pixel value of each pixel in the polarization angle characteristic image is determined based on the ratio of the first linear polarized light intensity to the second linear polarized light intensity of the pixel;
fusing the polarization degree characteristic image and the polarization angle characteristic image to obtain a polarization characteristic fused image;
fusing the infrared intensity image and the polarization characteristic fusion image to obtain a first fusion image;
the step of fusing the polarization degree characteristic image and the polarization angle characteristic image to obtain a polarization characteristic fused image comprises the following steps:
determining fused pixel values of all target pixels in the infrared polarized image based on pixel values of all pixels in the polarization degree characteristic image and pixel values of all pixels in the polarization angle characteristic image to obtain a polarization characteristic fused image; the fused pixel value of the target pixel is inversely related to the sum of square values of all pixels in the polarization degree characteristic image and the sum of square values of all pixels in the polarization angle characteristic image, and is positively related to the sum of square values of all pixels in the polarization degree characteristic image, the sum of square values of all pixels in the polarization angle characteristic image, the pixel value of the target pixel in the polarization degree characteristic image and the pixel value of the target pixel in the polarization angle characteristic image;
The fusing the first fused image and the visible light image to obtain a second fused image comprises the following steps:
performing wavelet decomposition on the first fusion image and the visible light image to obtain a first low-frequency coefficient and a first high-frequency coefficient of the first fusion image, and obtaining a second low-frequency coefficient and a second high-frequency coefficient of the visible light image;
fusing the first low-frequency coefficient and the second low-frequency coefficient to obtain a low-frequency fusion coefficient;
fusing the first high-frequency coefficient and the second high-frequency coefficient to obtain a high-frequency fusion coefficient;
and carrying out wavelet reconstruction on the visible light image and the first fusion image based on the low-frequency fusion coefficient and the high-frequency fusion coefficient to obtain a second fusion image.
6. The apparatus of claim 5, wherein the infrared polarized image is an infrared polarized image acquired with a split focal plane angle of a first preset angle, a second preset angle, a third preset angle, and a fourth preset angle;
the obtaining the polarization degree characteristic image and the polarization angle characteristic image of the infrared polarization image based on the infrared polarization image comprises the following steps:
obtaining a Stokes vector of the infrared polarized image based on infrared intensity information on the first preset angle, the second preset angle, the third preset angle and the fourth preset angle in the infrared polarized image; the Stokes vector comprises total reflected light intensity, first linear polarized light intensity and second linear polarized light intensity;
and obtaining a polarization degree characteristic image and a polarization angle characteristic image of the infrared polarization image based on the Stokes vector.
7. The apparatus of claim 5, wherein the means for fusing the first high frequency coefficient and the second high frequency coefficient to obtain a high frequency fused coefficient comprises:
determining a correlation between the first high frequency coefficient and the second high frequency coefficient;
if the correlation is smaller than a preset correlation threshold, carrying out weighted fusion on the first high-frequency coefficient and the second high-frequency coefficient according to a preset weight to obtain a high-frequency fusion coefficient;
and if the correlation is not smaller than the preset correlation threshold, taking the larger of the first high-frequency coefficient and the second high-frequency coefficient as the high-frequency fusion coefficient.
8. The apparatus of claim 5, wherein the apparatus further comprises:
the potential oil spill region detection module is used for detecting the potential oil spill region of the target region based on the first fusion image;
the oil spill detection module performs oil spill detection on the target area based on the second fusion image, and includes:
and if the potential oil spilling region exists in the target region, detecting the oil spilling of the target region based on the second fusion image.
9. A water surface oil spill detection system, comprising: a camera and a digital signal processing module;
the camera is configured to acquire, for a target area, an infrared intensity image, an infrared polarization image and a visible light image, and send the acquired images to the digital signal processing module;
the digital signal processing module is configured to obtain a polarization degree characteristic image and a polarization angle characteristic image of the infrared polarization image based on the infrared polarization image; wherein the pixel value of each pixel in the polarization degree characteristic image is inversely related to the total reflected light intensity of the pixel and positively related to the sum of the squares of the first linear polarized light intensity and the second linear polarized light intensity of the pixel; the pixel value of each pixel in the polarization angle characteristic image is determined based on the ratio of the first linear polarized light intensity to the second linear polarized light intensity of the pixel; fuse the polarization degree characteristic image and the polarization angle characteristic image to obtain a polarization characteristic fused image; fuse the infrared intensity image and the infrared polarized image to obtain a first fused image; determine fused pixel values of all target pixels in the infrared polarized image based on the pixel values of all pixels in the polarization degree characteristic image and the pixel values of all pixels in the polarization angle characteristic image to obtain a polarization characteristic fused image; wherein the fused pixel value of the target pixel is inversely related to the sum of the squared pixel values of the polarization degree characteristic image and the sum of the squared pixel values of the polarization angle characteristic image, and positively related to the sum of the squared pixel values of the polarization degree characteristic image, the sum of the squared pixel values of the polarization angle characteristic image, the pixel value of the target pixel in the polarization degree characteristic image and the pixel value of the target pixel in the polarization angle characteristic image; detect a potential oil spill area of the target area based on the first fused image; if it is detected that a potential oil spill area exists in the target area, control the pan-tilt head to stop rotating, fuse the first fused image and the visible light image to obtain a second fused image, and perform oil spill detection on the target area based on the second fused image; if an oil spill area exists in the target area, upload an alarm picture; and if it is detected that no oil spill area exists in the target area, control the pan-tilt head to resume cruising.
10. An electronic device, comprising:
a memory for storing a computer program;
a processor for implementing the method of any of claims 1-4 when executing a program stored on a memory.
11. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program which, when executed by a processor, implements the method of any of claims 1-4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310804037.9A CN116542893B (en) | 2023-07-03 | 2023-07-03 | Water surface oil spill detection method and device, electronic equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310804037.9A CN116542893B (en) | 2023-07-03 | 2023-07-03 | Water surface oil spill detection method and device, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116542893A CN116542893A (en) | 2023-08-04 |
CN116542893B true CN116542893B (en) | 2023-10-10 |
Family
ID=87458094
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310804037.9A Active CN116542893B (en) | 2023-07-03 | 2023-07-03 | Water surface oil spill detection method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116542893B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108492274A (en) * | 2018-04-03 | 2018-09-04 | 中国人民解放军国防科技大学 | Long-wave infrared polarization feature extraction and fusion image enhancement method |
WO2021093283A1 (en) * | 2019-11-14 | 2021-05-20 | 青岛理工大学 | Sea surface small-area oil spill region detection system and detection method based on multi-sensing fusion |
CN114119436A (en) * | 2021-10-08 | 2022-03-01 | 中国安全生产科学研究院 | Infrared image and visible light image fusion method and device, electronic equipment and medium |
CN114863261A (en) * | 2022-05-05 | 2022-08-05 | 南通智能感知研究院 | Water surface oil spill detection algorithm based on visible light infrared image |
- 2023-07-03: CN application CN202310804037.9A granted as patent CN116542893B/en (Active)
Non-Patent Citations (1)
Title |
---|
Target detection in marine background based on polarization information fusion; Liang Yuan'an; Yi Weining; Huang Honglian; Journal of Atmospheric and Environmental Optics (01); full text *
Also Published As
Publication number | Publication date |
---|---|
CN116542893A (en) | 2023-08-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102901489B (en) | Surface gathered water accumulated ice detection method and device | |
CN110825900A (en) | Training method of feature reconstruction layer, reconstruction method of image features and related device | |
US8705890B2 (en) | Image alignment | |
CN114120176B (en) | Behavior analysis method for fusing far infrared and visible light video images | |
WO2023284656A1 (en) | Unmanned aerial vehicle detection method and system based on infrared polarization | |
CN110415555A (en) | A kind of recognition methods of effective scribing line parking stall and system based on deep learning | |
CN115293992B (en) | Polarization image defogging method and device based on unsupervised weight depth model | |
CN113408454A (en) | Traffic target detection method and device, electronic equipment and detection system | |
Babu et al. | An efficient image dahazing using Googlenet based convolution neural networks | |
CN116542893B (en) | Water surface oil spill detection method and device, electronic equipment and storage medium | |
CN114549642B (en) | Low-contrast infrared dim target detection method | |
CN109325912B (en) | Reflection separation method based on polarized light field and calibration splicing system | |
CN106778822B (en) | Image straight line detection method based on funnel transformation | |
Kusetogullari et al. | Unsupervised change detection in landsat images with atmospheric artifacts: a fuzzy multiobjective approach | |
Zhang | Investigations of image fusion | |
Al Mansoori et al. | An investigation of various dehazing algorithms used on thermal infrared imagery for maritime surveillance systems | |
CN115035168B (en) | Multi-constraint-based photovoltaic panel multi-source image registration method, device and system | |
Bailey et al. | Determining large scale sandbar behaviour | |
CN116468760A (en) | Multi-source remote sensing image registration method based on anisotropic diffusion description | |
Fu et al. | Small bounding-box filter for small target detection | |
Zhang et al. | An Interference Suppression Method For Spaceborne Sar Image Via Space-Channel Attention Network | |
Gao et al. | CP-Net: Channel attention and pixel attention network for single image dehazing | |
CN117911282B (en) | Construction method and application of image defogging model | |
Zhang et al. | Estimation of atmospheric light based on gaussian distribution | |
Zhu et al. | Recaptured image detection through multi-resolution residual-based correlation coefficients |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||