CN118298629B - Video-based visibility monitoring system and method - Google Patents
- Publication number
- CN118298629B (application CN202410385349.5A)
- Authority
- CN
- China
- Prior art keywords
- monitoring
- image frame
- visibility
- vehicle image
- vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Abstract
The application provides a video-based visibility monitoring system and method. Single-object framing is first performed on a highway monitoring video to obtain a single vehicle image frame set. Visible color difference comparison is then carried out on the single vehicle image frame set to determine a visible color contrast enthalpy sequence. An object speed scatter diagram and an object light rise rate scatter diagram are extracted from the single vehicle image frame set, and a vehicle monitoring confidence level is determined from these two scatter diagrams. Finally, the vehicle object visibility is determined from a visible color transfer matrix and the vehicle monitoring confidence level, and a graded visibility alarm is issued according to the vehicle object visibility.
Description
Technical Field
The application relates to the technical field of visibility monitoring, and in particular to a video-based visibility monitoring system and method.
Background
In foggy conditions, low visibility affects expressway traffic safety: weather such as haze and precipitation impairs the driver's view of the road and thus increases the risk of traffic accidents. With the expansion of expressway networks, fog patches in particular pose a serious threat to expressway traffic safety. Fog patches are characterized by sudden onset, low visibility, small spatial extent and high mobility, and meteorological and traffic authorities currently find it difficult to monitor and forecast them accurately in real time. On an expressway, a vehicle that drives into a fog patch and cannot decelerate in time is very likely to cause a serious traffic accident, and the fatality rate of such accidents is often higher than that of ordinary traffic accidents. Visibility monitoring must therefore be carried out on fog-prone sections of expressways.
In the prior art, when expressway visibility is monitored from video, distance markers on both sides of the road, such as fog lamps, are usually used as reference objects, and the visibility is solved by comparing the reference objects with marker images; a typical approach estimates the visibility by comparison with a sample library of images, and the road is then pre-warned on the basis of the calculated visibility to reduce accidents. However, this approach places strict requirements on the fog lamp light source used as the reference object; in particular, when the fog lamp light source weakens because of fog lamp aging, road visibility warning errors are easily produced.
Disclosure of Invention
The application provides a video-based visibility monitoring system and method to solve the problem that existing visibility monitoring methods readily produce road visibility alarm errors when the fog lamp light source weakens due to fog lamp aging.
In a first aspect, the present application provides a method for monitoring visibility based on video, which may be performed by a network device or may be performed by a chip configured in the network device, which is not limited in this aspect of the present application.
Specifically, the method comprises the following steps:
Starting highway visibility monitoring, and obtaining highway monitoring video;
carrying out single object framing on the highway monitoring video to obtain a single vehicle image frame set;
performing visible color difference comparison on the single vehicle image frame set to obtain the visible color contrast enthalpy corresponding to each single vehicle image frame, and then determining a visible color contrast enthalpy sequence according to time order;
extracting object speeds from the single vehicle image frame set to obtain an object speed scatter diagram, extracting the light rise rate of the vehicle object from the single vehicle image frame set to obtain an object light rise rate scatter diagram, and performing vehicle monitoring confidence detection from the object speed scatter diagram and the object light rise rate scatter diagram to obtain a vehicle monitoring confidence energy level;
When the vehicle monitoring confidence energy level is higher than an energy level threshold, determining a visible color transfer matrix according to the visible color contrast enthalpy sequence, performing visibility observation with the visible color transfer matrix and the vehicle monitoring confidence energy level to obtain the vehicle object visibility, and issuing a graded visibility alarm according to the vehicle object visibility.
With reference to the first aspect, in some implementation manners of the first aspect, performing single-object framing on the highway monitoring video to obtain a single-vehicle image frame set specifically includes:
framing the highway monitoring video to obtain a highway monitoring video frame set;
performing edge detection on each highway monitoring video frame in the highway monitoring video frame set, and determining the number of connected areas corresponding to each highway monitoring video frame respectively;
And according to the number of connected areas corresponding to each road monitoring video frame, taking the road monitoring video frames whose number of connected areas is one and which are adjacent and uninterrupted in time sequence as a single vehicle image frame set.
With reference to the first aspect, in some implementation manners of the first aspect, performing visible color difference contrast according to the set of single-vehicle image frames, to obtain visible color contrast enthalpies corresponding to each single-vehicle image frame respectively specifically includes:
acquiring a first single vehicle image frame in the single vehicle image frame set;
performing edge detection on the first single vehicle image frame to obtain an object connected region;
determining the visible color contrast enthalpy corresponding to the first single vehicle image frame according to the visible color difference of the object connected region and the visible color difference of the first single vehicle image frame;
And determining visible color contrast enthalpy corresponding to the rest single vehicle image frames in the single vehicle image frame set in the same manner as the first single vehicle image frame.
With reference to the first aspect, in some implementation manners of the first aspect, extracting an object speed according to the single vehicle image frame set, and obtaining an object speed scatter diagram specifically includes:
acquiring the object mass center position of each single vehicle image frame in the single vehicle image frame set;
and determining the object speed corresponding to each single vehicle image frame according to the object centroid position of each single vehicle image frame, and further drawing an object speed scatter diagram according to the time sequence.
With reference to the first aspect, in some implementation manners of the first aspect, extracting a light rise rate of a vehicle object according to the single vehicle image frame set, and obtaining an object light rise rate scatter diagram specifically includes:
Determining an object gray average value of each single vehicle image frame in the single vehicle image frame set;
And determining the corresponding light rise rate of each single vehicle image frame according to the object gray level average value of each single vehicle image frame, and drawing an object light rise rate scatter diagram according to the time sequence.
With reference to the first aspect, in some implementations of the first aspect, the detecting a vehicle monitoring confidence level by using the object speed scatter diagram and the object light rise rate scatter diagram specifically includes:
performing a correlation test on each object speed value in the object speed scatter diagram and each light rise rate value in the object light rise rate scatter diagram to obtain a light rise correlation;
and mapping the light rise correlation to a corresponding vehicle monitoring confidence energy level.
With reference to the first aspect, in some implementations of the first aspect, the capturing is performed by a high-definition camera, and the highway monitoring video is obtained.
In a second aspect, the present application provides a video-based visibility monitoring system, including a visibility monitoring unit, the visibility monitoring unit including:
the video acquisition module is used for starting the monitoring of the visibility of the highway and acquiring a highway monitoring video;
The visibility monitoring module is used for carrying out single-object framing on the highway monitoring video to obtain a single-vehicle image frame set;
The visibility monitoring module is further used for performing visible color difference comparison on the single vehicle image frame set to obtain the visible color contrast enthalpy corresponding to each single vehicle image frame, and then determining a visible color contrast enthalpy sequence according to time order;
The visibility monitoring module is further used for extracting object speeds from the single vehicle image frame set to obtain an object speed scatter diagram, extracting the light rise rate of the vehicle object from the single vehicle image frame set to obtain an object light rise rate scatter diagram, and performing vehicle monitoring confidence detection from the object speed scatter diagram and the object light rise rate scatter diagram to obtain a vehicle monitoring confidence energy level;
and the alarm module is used for determining a visible color transfer matrix according to the visible color contrast enthalpy sequence when the vehicle monitoring confidence energy level is higher than an energy level threshold, performing visibility observation with the visible color transfer matrix and the vehicle monitoring confidence energy level to obtain the vehicle object visibility, and issuing a graded visibility alarm according to the vehicle object visibility.
In a third aspect, the present application provides a computer terminal device comprising a memory storing code and a processor configured to obtain the code and to perform a video-based visibility monitoring method as described above.
In a fourth aspect, the present application provides a computer readable storage medium storing at least one computer program loaded and executed by a processor to perform the operations performed by a video-based visibility monitoring method as described above.
The technical scheme provided by the embodiment of the application has the following beneficial effects:
According to the video-based visibility monitoring system and method, a highway monitoring video is first obtained and single-object framing is performed on it to obtain a single vehicle image frame set. Visible color difference comparison is performed on the single vehicle image frame set to obtain the visible color contrast enthalpy corresponding to each single vehicle image frame, and a visible color contrast enthalpy sequence is determined according to time order. Object speeds are extracted from the single vehicle image frame set to obtain an object speed scatter diagram, the light rise rate is extracted from the single vehicle image frame set to obtain an object light rise rate scatter diagram, and vehicle monitoring confidence detection is performed from the two scatter diagrams to obtain a vehicle monitoring confidence level. When the vehicle monitoring confidence level is higher than a preset threshold, a visible color transfer matrix is determined according to the visible color contrast enthalpy sequence, the vehicle object visibility is determined from the visible color transfer matrix and the vehicle monitoring confidence level, and a graded visibility alarm is issued according to the vehicle object visibility. Because the visible color contrast enthalpy reflects how visible the color of the moving vehicle object is in the video, the vehicle object visibility is derived from the change in the vehicle image rather than from a fixed reference light source. This avoids the road visibility alarm errors that arise when aging fog lamps weaken the fog lamp light source, and improves the accuracy of road visibility monitoring.
Drawings
FIG. 1 is an exemplary flow chart of a video-based visibility monitoring method in accordance with some embodiments of the present application;
FIG. 2 is an exemplary flow chart for single object framing of the highway monitoring video in some embodiments of the present application;
FIG. 3 is a schematic diagram of exemplary hardware and/or software of a visibility-monitoring unit, shown in accordance with some embodiments of the present application;
fig. 4 is a schematic structural diagram of a computer terminal device implementing a video-based visibility monitoring method according to some embodiments of the present application.
Detailed Description
The application performs single-object framing on a highway monitoring video to obtain a single vehicle image frame set, performs visible color difference comparison on the single vehicle image frame set to obtain the visible color contrast enthalpy corresponding to each single vehicle image frame, and determines a visible color contrast enthalpy sequence according to time order. Object speeds are extracted from the single vehicle image frame set to obtain an object speed scatter diagram, and the light rise rate is extracted from the single vehicle image frame set to obtain an object light rise rate scatter diagram; vehicle monitoring confidence detection is performed from the two scatter diagrams to obtain a vehicle monitoring confidence level. When the vehicle monitoring confidence level is higher than a preset threshold, a visible color transfer matrix is determined according to the visible color contrast enthalpy sequence, the vehicle object visibility is determined from the visible color transfer matrix and the vehicle monitoring confidence level, and a graded visibility alarm is issued according to the vehicle object visibility. The visible color contrast enthalpy reflects the color visibility of the moving vehicle object in the video, so the vehicle object visibility is confirmed from the change in the image color visibility of the moving vehicle and the alarm is raised accordingly. This avoids the problem that road visibility warnings easily go wrong when aging fog lamps weaken the fog lamp light source, and improves the accuracy of road visibility monitoring.
In order to better understand the above technical solutions, the following detailed description will refer to the accompanying drawings and specific embodiments. Referring to fig. 1, which is an exemplary flowchart of a video-based visibility monitoring method 100 according to some embodiments of the present application, the video-based visibility monitoring method 100 mainly includes the steps of:
in step S101, road visibility monitoring is started, and a road monitoring video is acquired.
It should be noted that, the highway monitoring video is a video for monitoring highway conditions, which is shot by a camera or other monitoring devices, and plays an important role in traffic management, road maintenance, safety monitoring, and the like.
Optionally, in some embodiments, the highway monitoring video may be obtained through a high-definition camera, and in some other embodiments, the highway monitoring video may also be obtained through other devices or apparatuses capable of implementing video capturing, which is not limited herein.
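As an illustrative aid (not part of the claimed method), frame capture from such a camera could be sketched as below with OpenCV; the video source identifier and frame limit are assumptions.

```python
import cv2  # OpenCV is assumed available for this sketch

def grab_monitoring_frames(source="rtsp://example-roadside-camera/stream", max_frames=500):
    """Read frames from the highway camera; `source` may be a device index,
    a video file path, or an RTSP URL (the URL here is a placeholder)."""
    cap = cv2.VideoCapture(source)
    frames = []
    while cap.isOpened() and len(frames) < max_frames:
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(frame)
    cap.release()
    return frames
```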
In step S102, single-object framing is performed on the highway monitoring video, so as to obtain a single-vehicle image frame set.
It should be noted that, the single vehicle image frame set is a video image frame set containing only one vehicle moving object, so that the road visibility on the highway can be monitored by monitoring the change of the image frame visibility of the vehicle moving object in the video.
Optionally, in some embodiments, referring to fig. 2, the fig. is an exemplary flowchart of single-object framing the highway monitoring video according to some embodiments of the present application, where the single-object framing the highway monitoring video to obtain the single-vehicle image frame set specifically includes:
in step S1021, framing the highway monitoring video to obtain a highway monitoring video frame set, where framing is a process of decomposing a continuous video stream into separate image frames, and when the process is specifically implemented, the highway monitoring video may be framed by using a corresponding decoder;
In step S1022, edge detection is performed on each highway monitoring video frame in the highway monitoring video frame set, and the number of connected areas corresponding to each highway monitoring video frame is determined;
in step S1023, according to the number of connected regions corresponding to each road monitoring video frame, the road monitoring video frames whose number of connected regions is one and which are adjacent in time sequence are taken as a single vehicle image frame set.
It should be noted that performing edge detection on a highway monitoring video frame means finding places in the frame where the gray level changes significantly, which generally represent object boundaries or important features in the image. In some embodiments, edge detection is performed on the highway monitoring video frame with the Sobel operator, which is commonly used for image edge detection. The Sobel operator uses the first derivative of the pixel gray values: Sobel filtering is applied to the image in the horizontal and vertical directions to obtain gradient images in the two directions, and whether a pixel belongs to an edge is then decided from the gradient magnitude. The edge regions are then connected to obtain one or more connected regions, and the number of connected regions is counted. The road monitoring video frames whose number of connected regions is 1 and which are adjacent and uninterrupted in time sequence are taken as a single vehicle image frame set, and the single vehicle image frame set containing the largest number of video frames is taken as the set finally used for visibility monitoring.
Optionally, in some embodiments, when the brightness of the video scene is too low, a gray threshold may be preset and edge detection performed on the highway monitoring video frame according to that threshold.
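A minimal sketch of this framing-and-selection step, assuming OpenCV and NumPy; the Sobel kernel size, gradient threshold and run-grouping logic are illustrative assumptions rather than values fixed by the application.

```python
import cv2
import numpy as np

def count_connected_regions(frame, grad_thresh=80):
    """Sobel gradients -> binary edge mask -> number of connected regions."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    magnitude = cv2.magnitude(gx, gy)
    edges = (magnitude > grad_thresh).astype(np.uint8)
    num_labels, _ = cv2.connectedComponents(edges)  # label 0 is the background
    return num_labels - 1

def select_single_vehicle_frame_set(frames, grad_thresh=80):
    """Group temporally uninterrupted frames whose connected-region count is 1
    and return the longest run as the single vehicle image frame set."""
    runs, current = [], []
    for frame in frames:
        if count_connected_regions(frame, grad_thresh) == 1:
            current.append(frame)
        else:
            if current:
                runs.append(current)
            current = []
    if current:
        runs.append(current)
    return max(runs, key=len) if runs else []
```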
In step S103, the visible color difference contrast is performed according to the set of single vehicle image frames, so as to obtain the visible color contrast enthalpy corresponding to each single vehicle image frame, and further determine the visible color contrast enthalpy sequence according to the time sequence.
It should be noted that the visible color contrast enthalpy is a quantized value of the color visibility of the vehicle object in a single vehicle image frame. It is determined from the brightness values of the object connected region in the image frame and their uniformity, and therefore reflects how easily the human eye can detect the object in the image frame: when there is a strong brightness difference within the object connected region, and the brightness difference between the object connected region and the surrounding environment is large, the motion of the vehicle object is more easily captured by the human eye.
Preferably, in some embodiments, the obtaining visible color contrast enthalpy corresponding to each single vehicle image frame includes:
acquiring a first single vehicle image frame in the single vehicle image frame set;
performing edge detection on the first single vehicle image frame to obtain an object connected region;
determining the visible color contrast enthalpy corresponding to the first single vehicle image frame according to the visible color difference of the object connected region and the visible color difference of the first single vehicle image frame;
And determining visible color contrast enthalpy corresponding to the rest single vehicle image frames in the single vehicle image frame set in the same manner as the first single vehicle image frame.
The object connected region may be obtained by detecting the edge feature points of the vehicle object in the image with the Sobel edge detection operator and then connecting these edge feature points; the resulting connected region is taken as the object connected region.
Optionally, in some embodiments, the visible color difference of the object connected region = the brightness standard deviation of the edge region of the object connected region − the brightness standard deviation of its center region; the visible color difference of the single vehicle image frame = the brightness of the edge region of the object connected region − the brightness of the single vehicle image frame; and the visible color contrast enthalpy = the visible color difference of the single vehicle image frame × the visible color difference of the object connected region.
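Following these formulas, the per-frame computation could be sketched as follows; how the edge region and center region of the connected region are delimited (here by eroding the region mask) is an assumption made only for illustration.

```python
import cv2
import numpy as np

def visible_color_contrast_enthalpy(gray_frame, region_mask):
    """gray_frame: 2-D grayscale image; region_mask: boolean mask of the
    object connected region in that frame."""
    mask = region_mask.astype(np.uint8)
    center = cv2.erode(mask, np.ones((5, 5), np.uint8)) > 0   # assumed center region
    edge = region_mask & ~center                              # assumed edge region (ring)
    edge_pixels = gray_frame[edge]
    center_pixels = gray_frame[center]
    # visible color difference of the object connected region
    region_diff = edge_pixels.std() - center_pixels.std()
    # visible color difference of the single vehicle image frame
    frame_diff = edge_pixels.mean() - gray_frame.mean()
    # visible color contrast enthalpy
    return frame_diff * region_diff
```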
In step S104, the object speed is extracted according to the single vehicle image frame set, an object speed scatter diagram is obtained, the light rise rate of the vehicle object is extracted according to the single vehicle image frame set, an object light rise rate scatter diagram is obtained, and vehicle monitoring confidence detection is performed by the object speed scatter diagram and the object light rise rate scatter diagram, so as to obtain a vehicle monitoring confidence level.
The vehicle monitoring confidence level reflects how reliable it is to monitor visibility from the vehicle object in the video. It is determined from the correlation between the vehicle speed in the video image frames and the rate of brightness change of the vehicle light source, and the visibility monitoring process is then adjusted according to the vehicle monitoring confidence level, so that interference from other light sources, such as lightning in thunderstorm weather, does not affect the road visibility monitoring.
It should be noted that, the object speed is a running speed of a vehicle object, and in some embodiments, extracting the object speed according to the single vehicle image frame set, to obtain an object speed scatter diagram specifically includes:
acquiring the object mass center position of each single vehicle image frame in the single vehicle image frame set;
and determining the object speed corresponding to each single vehicle image frame according to the object centroid position of each single vehicle image frame, and further drawing an object speed scatter diagram according to the time sequence.
In some embodiments, the object centroid position may be obtained by taking the object connected region and using the mean of the coordinates of all pixels in the connected region as the object centroid position. The object speeds corresponding to different image frames are then determined from the change of the object centroid position in the image, and these object speeds are used as data points to draw the object speed scatter diagram. In a specific implementation, the object speed = (difference between the object centroid positions of adjacent single vehicle image frames × video scale) / video frame time interval.
In some embodiments, the object velocity may also be determined by the relative displacement rate between the object centroid position and the video marker point, without limitation.
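A sketch of this centroid-based speed extraction, assuming the connected-region masks for the single vehicle image frames are already available; `scale_m_per_px` (video scale) and `frame_dt` (frame time interval) are placeholder values.

```python
import numpy as np

def object_speed_points(region_masks, scale_m_per_px=0.05, frame_dt=1 / 25):
    """Centroid = mean pixel coordinate of the connected region; speed =
    centroid displacement between adjacent frames x video scale / frame interval."""
    centroids = []
    for mask in region_masks:
        ys, xs = np.nonzero(mask)
        centroids.append(np.array([xs.mean(), ys.mean()]))
    speeds = []
    for prev, curr in zip(centroids, centroids[1:]):
        displacement_px = np.linalg.norm(curr - prev)
        speeds.append(displacement_px * scale_m_per_px / frame_dt)
    return speeds  # one scatter point per pair of adjacent frames
```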
It should be noted that the light rise rate is the rate of change of the gray level of the vehicle object from frame to frame of the video; it reflects how the light brightness of the vehicle object rises at the times corresponding to the different image frames. In some embodiments, extracting the light rise rate of the vehicle object from the single vehicle image frame set to obtain the object light rise rate scatter diagram specifically includes:
Determining an object gray average value of each single vehicle image frame in the single vehicle image frame set;
determining the light rise rate corresponding to each single vehicle image frame according to the object gray mean of each single vehicle image frame, and then drawing the object light rise rate scatter diagram according to time order; in a specific implementation, the object gray mean is the gray mean within the connected region, and the light rise rate of the vehicle object = difference between the object gray means of adjacent single vehicle image frames / video frame time interval.
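A corresponding sketch of the light rise rate extraction, under the same assumptions about the available region masks and frame interval.

```python
import numpy as np

def light_rise_rate_points(gray_frames, region_masks, frame_dt=1 / 25):
    """Light rise rate = difference of the object gray means of adjacent
    single vehicle image frames / video frame time interval."""
    gray_means = [frame[mask].mean() for frame, mask in zip(gray_frames, region_masks)]
    return [(later - earlier) / frame_dt
            for earlier, later in zip(gray_means, gray_means[1:])]
```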
In some embodiments, the vehicle monitoring confidence detection is performed by the object speed scatter diagram and the object light rise rate scatter diagram, and the obtaining the vehicle monitoring confidence level specifically includes:
performing a correlation test on each object speed value in the object speed scatter diagram and each light rise rate value in the object light rise rate scatter diagram to obtain a light rise correlation;
and mapping the light rise correlation to a corresponding vehicle monitoring confidence energy level.
When a vehicle travels from a distance toward the camera, the overall brightness of the object in the image frames varies in a regular way with the traveling speed. By testing the correlation between the actual speed change of the vehicle object and its gray level change, the degree to which random light sources interfere with the video can be detected and used as the confidence level of the video, so that interference from external random light sources does not disturb the visibility monitoring. For example, when random events occur, such as repeated lightning or a driver changing the state of the vehicle lamps, the vehicle monitoring confidence level becomes too low to be used for the visibility decision. In a specific implementation, the Pearson correlation coefficient between the object speed values in the object speed scatter diagram and the light rise rate values in the object light rise rate scatter diagram may be used as the light rise correlation, and the light rise correlation may be mapped to a corresponding vehicle monitoring confidence energy level through a preset mapping table; for example, the vehicle monitoring confidence energy levels may be divided into 1, 2, 3, 4 and 5, with a higher light rise correlation corresponding to a higher confidence energy level.
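A sketch of this confidence detection; the five-level split and its thresholds are illustrative assumptions standing in for the preset mapping table.

```python
import numpy as np

def vehicle_monitoring_confidence(speeds, rise_rates):
    """Pearson correlation between the two scatter series, mapped to an
    assumed five-level confidence energy level (1 = lowest, 5 = highest)."""
    n = min(len(speeds), len(rise_rates))
    correlation = np.corrcoef(speeds[:n], rise_rates[:n])[0, 1]
    thresholds = [0.2, 0.4, 0.6, 0.8]              # assumed mapping table
    level = 1 + sum(correlation > t for t in thresholds)
    return correlation, level
```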
In step S105, when the vehicle monitoring confidence energy level is higher than an energy level threshold, a visible color transfer matrix is determined according to the visible color contrast enthalpy sequence, visibility observation is performed with the visible color transfer matrix and the vehicle monitoring confidence energy level to obtain the vehicle object visibility, and a graded visibility alarm is issued according to the vehicle object visibility.
Optionally, in some embodiments, the energy level threshold is preset to a fixed value according to a safety coefficient set for the visibility monitoring, which is not described further herein.
Optionally, in some embodiments, the single vehicle image frame set is re-determined for visibility monitoring when the vehicle monitoring confidence energy level is below the energy level threshold.
It should be noted that the vehicle object visibility is determined from the rate of color change of the vehicle object in the road monitoring video. Specifically, the lower the visibility in foggy weather, the smaller the color change the vehicle object produces in the video as it approaches the camera. The visibility of the vehicle in foggy weather can therefore be determined from the degree of color change of the moving object across the video image frames, which makes it possible to monitor the overall road visibility in foggy weather, avoids the road visibility alarm errors that easily occur when the fog lamp light source weakens due to fog lamp aging, and improves the accuracy of road visibility monitoring.
It should be noted that the visible color transfer matrix contains the color feature change information of the vehicle object over the video monitoring period and is used to identify the vehicle object visibility. Preferably, in some embodiments, determining the visible color transfer matrix according to the visible color contrast enthalpy sequence specifically includes:
acquiring the object connected regions respectively corresponding to each single vehicle image frame in the single vehicle image frame set;
taking the color feature values corresponding to each object connected region as a matrix column to obtain an initial visible color transfer matrix;
And correcting the initial visible color transfer matrix according to each visible color contrast enthalpy in the visible color contrast enthalpy sequence to obtain a visible color transfer matrix.
The color feature values in the initial visible color transfer matrix are the color feature values of each object connected region. In some embodiments, each visible color contrast enthalpy in the visible color contrast enthalpy sequence may be used as a correction coefficient for the corresponding column of matrix elements in the initial visible color transfer matrix; for example, the visible color contrast enthalpy of each single vehicle image frame is multiplied by each color feature value of the object connected region in that frame, thereby correcting the initial visible color transfer matrix to obtain the visible color transfer matrix.
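A sketch of building and correcting the visible color transfer matrix; the choice of the mean B, G, R values of the connected region as the color feature values is an assumption, since the application does not fix the feature set.

```python
import numpy as np

def visible_color_transfer_matrix(color_frames, region_masks, enthalpies):
    """One column per single vehicle image frame: the color feature values of
    its object connected region (assumed: mean B, G, R), each column multiplied
    by that frame's visible color contrast enthalpy as a correction coefficient."""
    columns = []
    for frame, mask, enthalpy in zip(color_frames, region_masks, enthalpies):
        features = frame[mask].mean(axis=0)   # mean B, G, R over the region
        columns.append(enthalpy * features)
    return np.column_stack(columns)           # shape: (n_features, n_frames)
```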
Because the visible color contrast enthalpy reflects how clearly the human eye actually observes the object, correcting the visible color transfer matrix with the visible color contrast enthalpy brings the captured result closer to what the human eye observes, improves the accuracy of determining the vehicle object visibility, and reduces the accident rate caused by excessively low road visibility in fog patch weather.
It should be noted that the vehicle object visibility is the degree of visibility of the vehicle object over the video monitoring period and is dimensionless. Preferably, in some embodiments, performing visibility observation with the visible color transfer matrix and the vehicle monitoring confidence energy level to obtain the vehicle object visibility specifically includes:
performing inter-column differential processing on the visible color transfer matrix to obtain a visible color differential matrix;
Determining a color change sequence according to the visible color differential matrix;
Selecting color change values from the color change sequences according to the vehicle monitoring confidence energy level to form a visibility decision sequence;
According to the visibility decision sequence, the vehicle object visibility is determined; in a specific implementation, the vehicle object visibility may be obtained by multiplying the mean of the visibility decision sequence by a preset proportionality coefficient, so that the visibility alarm is triggered when the color of the vehicle changes slowly, reducing the traffic accident rate caused by excessively low road visibility.
In some embodiments, performing inter-column differential processing on the visible color transfer matrix to obtain the visible color differential matrix specifically includes: replacing the original matrix columns with the differences between adjacent matrix columns to obtain the visible color differential matrix. Determining the color change sequence according to the visible color differential matrix specifically includes: taking the mean of each matrix column as a color change value, and sorting the color change values from small to large to obtain the color change sequence.
Optionally, in some embodiments, color change values are selected from the color change sequence in different proportions according to the vehicle monitoring confidence energy level to form the visibility decision sequence; specifically, the lower the vehicle monitoring confidence energy level, the more color change values are selected into the visibility decision sequence, the specific proportion values not being limited here. In this way, when the vehicle monitoring confidence energy level is lower, more color change values are available for the visibility decision, which improves the reliability and accuracy of the visibility monitoring.
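A sketch of the visibility decision in step S105 under these descriptions; the confidence-to-proportion mapping, the use of absolute differences and the choice of which end of the sorted sequence is kept are assumptions made only for illustration.

```python
import numpy as np

def vehicle_object_visibility(transfer_matrix, confidence_level, scale_coeff=1.0):
    """Adjacent-column differences -> per-column color change values (sorted
    ascending) -> subset sized by the confidence energy level -> mean x
    preset proportionality coefficient."""
    diffs = np.diff(transfer_matrix, axis=1)
    change_values = np.sort(np.abs(diffs).mean(axis=0))
    # assumed rule: the lower the confidence level, the larger the selected share
    share = {1: 1.0, 2: 0.8, 3: 0.6, 4: 0.4, 5: 0.2}.get(confidence_level, 0.6)
    k = max(1, int(round(share * len(change_values))))
    decision_sequence = change_values[:k]
    return decision_sequence.mean() * scale_coeff
```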
Optionally, in some embodiments, issuing the graded visibility alarm according to the vehicle object visibility specifically includes: mapping the vehicle object visibility to a corresponding alarm grade according to a preset alarm mapping table, and sending the alarm grade to an alarm center. This improves the stability of highway visibility monitoring and reduces the expressway accident rate in fog patch weather.
Additionally, in another aspect of the present application, in some embodiments, the present application provides a video-based visibility monitoring system including a visibility monitoring unit, referring to fig. 3, which is a schematic diagram of exemplary hardware and/or software of the visibility monitoring unit according to some embodiments of the present application, the visibility monitoring unit 200 including: the video acquisition module 201, the visibility monitoring module 202 and the alarm module 203 are respectively described as follows:
The video acquisition module 201, in some embodiments of the present application, the video acquisition module 201 is mainly used for starting highway visibility monitoring to obtain highway monitoring video;
The visibility monitoring module 202, in some specific embodiments of the present application, the visibility monitoring module 202 is mainly configured to perform single-object framing on the highway monitoring video to obtain a single-vehicle image frame set;
In some embodiments, the visibility monitoring module 202 is further configured to perform visible color difference comparison according to the set of single-vehicle image frames, obtain visible color contrast enthalpy corresponding to each single-vehicle image frame, and further determine a visible color contrast enthalpy sequence according to a time sequence;
In some embodiments, the visibility monitoring module 202 is further configured to extract an object speed according to the single vehicle image frame set, obtain an object speed scatter diagram, extract a light rise rate of a vehicle object according to the single vehicle image frame set, obtain an object light rise rate scatter diagram, and perform vehicle monitoring confidence detection according to the object speed scatter diagram and the object light rise rate scatter diagram, so as to obtain a vehicle monitoring confidence level;
The alarm module 203, in some specific embodiments of the present application, the alarm module 203 is mainly configured to determine a visible color transfer matrix according to the visible color contrast enthalpy sequence when the vehicle monitoring confidence energy level is higher than an energy level threshold, perform visibility observation with the visible color transfer matrix and the vehicle monitoring confidence energy level to obtain the vehicle object visibility, and issue a graded visibility alarm according to the vehicle object visibility.
While the foregoing details have been provided with examples of a video-based visibility monitoring system and method according to embodiments of the present application, it will be understood that, in order to implement the foregoing functions, the corresponding apparatus includes corresponding hardware structures and/or software modules that perform the respective functions.
Those skilled in the art will readily appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in hardware or in a combination of hardware and computer software. Whether a particular function is performed in hardware or in computer software depends on the specific application and the design constraints of the solution. A person skilled in the art may use different methods to implement the described functions for each specific application, but such implementations should not be considered to go beyond the scope of the present application.
In addition, the application also provides a computer terminal device, which comprises a memory and a processor, wherein the memory stores codes, and the processor is configured to acquire the codes and execute the video-based visibility monitoring method.
In some embodiments, reference is made to fig. 4, which is a schematic structural diagram of a computer terminal device to which a video-based visibility monitoring method is applied, according to some embodiments of the present application. A video-based visibility monitoring method in the above embodiment may be implemented by a computer terminal device shown in fig. 4, where the computer terminal device 300 includes at least one communication bus 301, a communication interface 302, a processor 303, and a memory 304.
The processor 303 may be a general-purpose central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of a video-based visibility monitoring method in accordance with the present application.
Communication bus 301 may include a pathway to transfer information between the aforementioned components.
Memory 304 may be, but is not limited to, read-only memory (ROM) or another type of static storage device that can store static information and instructions, random access memory (RAM) or another type of dynamic storage device that can store information and instructions, electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, digital versatile discs, Blu-ray discs, etc.), magnetic disks or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory 304 may be standalone and coupled to the processor 303 via the communication bus 301. The memory 304 may also be integrated with the processor 303.
The memory 304 is used for storing the program code for executing the scheme of the present application, and the processor 303 controls its execution. The processor 303 is arranged to execute the program code stored in the memory 304, and one or more software modules may be included in the program code. The determination of the visible color contrast enthalpy in the above embodiments may be implemented by one or more software modules in the processor 303 and the program code in the memory 304.
Communication interface 302 uses any transceiver-type device for communicating with other devices or communication networks, such as Ethernet, a radio access network (RAN), or a wireless local area network (WLAN).
Optionally, the computer terminal device 300 may further include a power supply 305 for providing power to the various devices or circuits in the computer terminal device.
In a specific implementation, as an embodiment, the computer terminal device may include a plurality of processors, where each of the processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
The computer terminal device may be a general-purpose computer terminal device or a special-purpose computer terminal device. In a specific implementation, the computer terminal device may be a desktop computer, a portable computer, a network server, a personal digital assistant (PDA), a mobile phone, a tablet computer, a wireless terminal device, a communication device, or an embedded device. The embodiment of the application does not limit the type of the computer terminal device.
In addition, in other aspects of the present application, there is provided a computer readable storage medium storing at least one computer program loaded and executed by a processor to implement the operations performed by the video-based visibility monitoring method.
In summary, in the video-based visibility monitoring system and method disclosed by the embodiments of the application, a highway monitoring video is first obtained and single-object framing is performed on it to obtain a single vehicle image frame set. Visible color difference comparison is performed on the single vehicle image frame set to obtain the visible color contrast enthalpy corresponding to each single vehicle image frame, and a visible color contrast enthalpy sequence is determined according to time order. Object speeds are extracted from the single vehicle image frame set to obtain an object speed scatter diagram, the light rise rate is extracted from the single vehicle image frame set to obtain an object light rise rate scatter diagram, and vehicle monitoring confidence detection is performed from the two scatter diagrams to obtain a vehicle monitoring confidence level. When the vehicle monitoring confidence level is higher than a preset threshold, a visible color transfer matrix is determined according to the visible color contrast enthalpy sequence, the vehicle object visibility is determined from the visible color transfer matrix and the vehicle monitoring confidence level, and a graded visibility alarm is issued according to the vehicle object visibility. Because the visible color contrast enthalpy reflects the color visibility of the moving vehicle in the video, the vehicle object visibility is derived from the change in the vehicle image rather than from a fixed reference light source, which avoids the road visibility alarm errors caused by the weakening of aging fog lamp light sources and improves the accuracy of road visibility monitoring.
The foregoing is merely exemplary embodiments of the present application, and detailed technical solutions or features that are well known in the art have not been described in detail herein. It should be noted that, for those skilled in the art, several variations and modifications can be made without departing from the technical solution of the present application, and these should also be regarded as the protection scope of the present application, which does not affect the effect of the implementation of the present application and the practical applicability of the patent.
The scope of the application as claimed should be determined by the appended claims, and the description of the embodiments and the like in the specification should be construed as illustrative only, since various changes and modifications can be made therein by one skilled in the art without departing from the spirit and scope of the application. Thus, it is intended that the present application also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.
Claims (8)
1. A video-based visibility monitoring method, comprising:
Starting highway visibility monitoring, and obtaining highway monitoring video;
carrying out single object framing on the highway monitoring video to obtain a single vehicle image frame set;
performing visible color difference comparison on the single vehicle image frame set to obtain the visible color contrast enthalpy corresponding to each single vehicle image frame, and then determining a visible color contrast enthalpy sequence according to time order;
extracting object speeds from the single vehicle image frame set to obtain an object speed scatter diagram, extracting the light rise rate of the vehicle object from the single vehicle image frame set to obtain an object light rise rate scatter diagram, and performing vehicle monitoring confidence detection from the object speed scatter diagram and the object light rise rate scatter diagram to obtain a vehicle monitoring confidence energy level;
when the vehicle monitoring confidence energy level is higher than an energy level threshold, determining a visible color transfer matrix according to the visible color contrast enthalpy sequence, performing visibility observation with the visible color transfer matrix and the vehicle monitoring confidence energy level to obtain the vehicle object visibility, and issuing a graded visibility alarm according to the vehicle object visibility;
wherein obtaining the visible color contrast enthalpy corresponding to each single vehicle image frame specifically comprises:
acquiring a first single vehicle image frame in the single vehicle image frame set;
performing edge detection on the first single vehicle image frame to obtain an object connected region;
determining the visible color contrast enthalpy corresponding to the first single vehicle image frame according to the visible color difference of the object connected region and the visible color difference of the first single vehicle image frame;
Determining visible color contrast enthalpy corresponding to the rest single vehicle image frames in the single vehicle image frame set in the same manner as the first single vehicle image frame;
extracting the light rising rate of the vehicle object according to the single vehicle image frame set, and obtaining the object light rising rate scatter diagram specifically comprises the following steps:
Determining an object gray average value of each single vehicle image frame in the single vehicle image frame set;
according to the object gray level average value of each single vehicle image frame, determining the corresponding light rise rate of each single vehicle image frame, and further drawing an object light rise rate scatter diagram according to the time sequence;
wherein the light rise rate of the vehicle object = difference in object gray scale mean of adjacent single vehicle image frames/video frame time interval.
2. The method of claim 1, wherein the single object framing of the highway monitoring video to obtain a single vehicle image frame set specifically comprises:
framing the highway monitoring video to obtain a highway monitoring video frame set;
performing edge detection on each highway monitoring video frame in the highway monitoring video frame set, and determining the number of connected areas corresponding to each highway monitoring video frame respectively;
And according to the number of connected areas corresponding to each road monitoring video frame, taking the road monitoring video frames whose number of connected areas is one and which are adjacent and uninterrupted in time sequence as a single vehicle image frame set.
3. The method of claim 1, wherein extracting object velocities from the set of single vehicle image frames, the obtaining an object velocity scatter plot comprising:
acquiring the object mass center position of each single vehicle image frame in the single vehicle image frame set;
and determining the object speed corresponding to each single vehicle image frame according to the object centroid position of each single vehicle image frame, and further drawing an object speed scatter diagram according to the time sequence.
4. The method of claim 1, wherein performing vehicle monitoring confidence detection from the object speed scatter diagram and the object light rise rate scatter diagram to obtain the vehicle monitoring confidence energy level specifically comprises:
performing a correlation test on each object speed value in the object speed scatter diagram and each light rise rate value in the object light rise rate scatter diagram to obtain a light rise correlation;
and mapping the light rise correlation to a corresponding vehicle monitoring confidence energy level.
5. The method of claim 1, wherein the video acquisition is performed by a high-definition camera to obtain the highway monitoring video.
6. A video-based visibility monitoring system for performing visibility monitoring using the method of any one of claims 1 to 5, the video-based visibility monitoring system comprising a visibility monitoring unit, wherein the visibility monitoring unit comprises:
the video acquisition module is used for starting the monitoring of the visibility of the highway and acquiring a highway monitoring video;
The visibility monitoring module is used for carrying out single-object framing on the highway monitoring video to obtain a single-vehicle image frame set;
The visibility monitoring module is further used for performing visible color difference comparison on the single vehicle image frame set to obtain the visible color contrast enthalpy corresponding to each single vehicle image frame, and then determining a visible color contrast enthalpy sequence according to time order;
The visibility monitoring module is further used for extracting object speeds from the single vehicle image frame set to obtain an object speed scatter diagram, extracting the light rise rate of the vehicle object from the single vehicle image frame set to obtain an object light rise rate scatter diagram, and performing vehicle monitoring confidence detection from the object speed scatter diagram and the object light rise rate scatter diagram to obtain a vehicle monitoring confidence energy level;
and the alarm module is used for determining a visible color transfer matrix according to the visible color contrast enthalpy sequence when the vehicle monitoring confidence energy level is higher than an energy level threshold, performing visibility observation with the visible color transfer matrix and the vehicle monitoring confidence energy level to obtain the vehicle object visibility, and issuing a graded visibility alarm according to the vehicle object visibility.
7. A computer terminal device comprising a memory storing code and a processor configured to obtain the code and to perform a video-based visibility monitoring method according to any one of claims 1-5.
8. A computer readable storage medium storing at least one computer program, wherein the computer program is loaded and executed by a processor to implement the operations performed by a video-based visibility monitoring method according to any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title
--- | --- | --- | ---
CN202410385349.5A CN118298629B (en) | 2024-04-01 | | Video-based visibility monitoring system and method
Publications (2)
Publication Number | Publication Date
--- | ---
CN118298629A (en) | 2024-07-05
CN118298629B (en) | 2024-11-15
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title
--- | --- | --- | --- | ---
CN108735000A (en) * | 2018-06-22 | 2018-11-02 | 南京慧尔视智能科技有限公司 | Method and system for preventing traffic accidents caused by fog patches
CN116798222A (en) * | 2023-03-28 | 2023-09-22 | 哈尔滨工业大学 | Expressway visibility detection and intelligent guidance method, system and device
Legal Events
Code | Title
--- | ---
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant