CN111383223A - Charging pile part line sequence color error-proofing detection method - Google Patents
- Publication number: CN111383223A
- Application number: CN202010196160.3A
- Authority
- CN
- China
- Prior art keywords
- charging pile
- color
- pile part
- frame
- line sequence
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T7/0008—Industrial image inspection checking presence/absence
- G06F18/23213—Non-hierarchical clustering using statistics or function optimisation with fixed number of clusters, e.g. K-means clustering
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06N3/045—Neural network architectures; combinations of networks
- G06N3/08—Neural network learning methods
- G06T7/90—Determination of colour characteristics
- G06V10/56—Extraction of image or video features relating to colour
- G06T2207/10016—Video; image sequence
- G06T2207/20081—Training; learning
- G06T2207/20084—Artificial neural networks [ANN]
- G06T2207/30108—Industrial image inspection
- G06T2207/30164—Workpiece; machine component
Abstract
The invention relates to a charging pile part line sequence color error-proofing detection method comprising the following steps: step 1, collecting video samples of the line sequence and color of charging pile parts; step 2, extracting single-frame images from the video samples collected in step 1 to form a basic data set; step 3, training a convolutional neural network detection model of the line sequence and color of the charging pile parts; and step 4, optimizing the trained model to obtain an improved convolutional neural network detection model of the line sequence and color of the charging pile parts, then inputting the test set into the improved model for testing to obtain a detection result. The method effectively improves the precision and speed of detection and classification of charging pile parts and meets the takt-time requirement of a production system.
Description
Technical Field
The invention belongs to the technical field of part detection in intelligent assembly, relates to a detection method for charging pile parts, and particularly relates to a line sequence color error-proofing detection method for charging pile parts.
Background
With the continuous development of China's manufacturing industry, intelligent assembly systems are gradually being applied. At the same time, China is strongly promoting the new energy industry: new energy technology products are continuously being developed, and charging piles are widely used. Detection and identification of the line sequence and color of charging pile parts is an important part of an intelligent charging pile assembly system, drawing on research in computer vision, deep learning, image recognition, and related fields. Traditional assembly systems still rely on repetitive manual labor, which is error-prone because human operators fatigue and have limited visual resolution; intelligent systems are not yet widely applied, and few application-level methods exist for detecting specific targets to prevent mis-installation of charging pile parts. Therefore, to avoid wasting manpower and time and to adapt to unstable factory conditions such as lighting, intelligent error-proofing detection for charging pile production that guarantees both precision and speed is imperative.
No detection method has been directly applied to prevent mis-installation of charging pile part line sequences and colors. Existing target detection methods applied to this task suffer from low accuracy and low efficiency, cannot balance detection accuracy against model complexity, and perform poorly in dynamic real-time environments.
Disclosure of Invention
The invention aims to overcome the defects of the prior art by providing a charging pile part line sequence color error-proofing detection method that solves the technical problems of low detection efficiency and high false detection rate in dynamic real-time environments.
The invention solves the practical problem by adopting the following technical scheme:
a charging pile part line sequence color error-proofing detection method comprises the following steps:
step 1, collecting video samples of the line sequence and color of charging pile parts;
step 2, extracting single-frame images from the video samples collected in step 1 to form a basic data set; performing data enhancement on the images of the basic data set to form a charging pile part data set; annotating the position and category information of the charging pile parts in the data set, and dividing the annotated data set into a training set, a verification set, and a test set;
step 3, training a convolutional neural network on the sample training set to generate a trained convolutional neural network detection model of the line sequence and color of the charging pile parts;
and step 4, verifying the trained convolutional neural network detection model of step 3 with the sample verification set and optimizing the network to obtain an improved convolutional neural network detection model of the line sequence and color of the charging pile parts, then inputting the test set into the improved model for testing to obtain a detection result.
Further, the specific steps of step 2 include:
(1) acquiring single-frame images of the video samples of the line sequence and the color of the charging pile parts collected in the step 1 by adopting an image processing algorithm based on an open source computer vision library, and selecting the single-frame images of the charging pile parts with different illumination changes and different positions to form a basic data set;
(2) carrying out data enhancement processing on the single-frame sample image of the charging pile part in the basic data set, and then renaming the single-frame sample image according to the serial number to form a charging pile part data set;
(3) annotating each single-frame image of charging pile parts in the charging pile part data set, wherein the annotated information comprises the target frame's position information (x, y, w, h) and its category information, and dividing all single-frame images in the basic data set proportionally into a sample training set, a verification set, and a test set;
moreover, the specific method of the data enhancement processing in step 2(2) is as follows:
adjusting the brightness and contrast of the single-frame sample images of charging pile parts in the basic data set.
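As a minimal sketch of the brightness and contrast adjustment described above (not the patent's exact implementation), a linear pixel transform suffices; the gain/bias values below are illustrative assumptions:

```python
import numpy as np

def adjust_brightness_contrast(image, alpha=1.0, beta=0.0):
    """Linear pixel transform: out = alpha * image + beta.

    alpha > 1 raises contrast, beta > 0 raises brightness; results are
    clipped to the valid 8-bit range (mimicking OpenCV's convertScaleAbs).
    """
    out = alpha * image.astype(np.float32) + beta
    return np.clip(out, 0, 255).astype(np.uint8)

def augment(image):
    """Produce illustrative brightness/contrast variants of one frame."""
    return [
        adjust_brightness_contrast(image, alpha=1.2, beta=0),    # higher contrast
        adjust_brightness_contrast(image, alpha=0.8, beta=0),    # lower contrast
        adjust_brightness_contrast(image, alpha=1.0, beta=40),   # brighter
        adjust_brightness_contrast(image, alpha=1.0, beta=-40),  # darker
    ]
```

In use, each collected single frame would pass through `augment` to simulate the varying factory lighting the patent mentions.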
Further, the specific steps of step 3 include:
(1) taking a single-frame sample image of a charging pile part as input, the network divides the image into an S × S grid;
(2) selecting bounding boxes using dimension-clustered fixed prior boxes, with the neural network predicting four position values x, y, w, h and a confidence for each bounding box, the confidence being expressed by the following formula:
Confidence = Pr(object) × IOU(truth, pred)
wherein Pr(object) indicates the likelihood that an object is contained, taking the value 1 or 0, and IOU(truth, pred) is the intersection-over-union between the annotated ground-truth box and the predicted coordinates, equal to 1 when the annotation box and the prediction box coincide; if the current prediction is not the best, yet its IOU is above a threshold, the prediction is ignored;
the confidence score is calculated using the following formula:
Pr(class_i) × IOU(truth, pred) = Pr(class_i | object) × Pr(object) × IOU(truth, pred)
wherein Pr(class_i) indicates the probability that a grid cell belongs to class i, and Pr(class_i | object) represents the C class probabilities predicted for a grid cell containing a target; the various targets are identified by applying a non-maximum suppression algorithm to the confidence scores over the C class probabilities;
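The confidence and confidence-score quantities used in step (2) can be sketched numerically as follows. The corner-format boxes (x1, y1, x2, y2) and the probabilities are illustrative assumptions, not the patent's code:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def confidence(pr_object, truth_box, pred_box):
    """Confidence = Pr(object) * IOU(truth, pred), with Pr(object) in {0, 1}."""
    return pr_object * iou(truth_box, pred_box)

def class_confidence_scores(class_probs, pr_object, truth_box, pred_box):
    """Per-class scores: Pr(class_i | object) * Pr(object) * IOU(truth, pred)."""
    conf = confidence(pr_object, truth_box, pred_box)
    return [p * conf for p in class_probs]
```

Non-maximum suppression would then keep, per class, only the highest-scoring box among heavily overlapping candidates.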
(3) extracting features at two different scales: feature maps are taken from the earlier layers, up-sampled, and concatenated with feature maps of other resolutions, so that up-sampled features are combined with the fine-grained features of the early feature maps; the combined feature maps are then processed by convolutional layers;
(4) clustering the charging pile part data set again with the k-means clustering algorithm to obtain six groups of preselected-box parameters, then extracting features with a residual network to generate the trained convolutional neural network detection model of the line sequence and color of the charging pile parts;
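Step (4)'s re-clustering of box dimensions can be sketched as follows. This is a plain k-means (Lloyd's algorithm) over (w, h) pairs with Euclidean distance; YOLO-style pipelines often use a 1 − IOU distance instead, and the sample boxes are invented for illustration:

```python
import random

def kmeans_anchors(boxes, k=6, iters=50, seed=0):
    """Cluster (w, h) box dimensions into k anchor sizes via Lloyd's algorithm."""
    rng = random.Random(seed)
    centers = rng.sample(boxes, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for w, h in boxes:
            # assign each box to the nearest current center
            j = min(range(k),
                    key=lambda i: (w - centers[i][0]) ** 2 + (h - centers[i][1]) ** 2)
            clusters[j].append((w, h))
        # recompute each center as the mean of its cluster (keep it if empty)
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = (sum(w for w, _ in cl) / len(cl),
                              sum(h for _, h in cl) / len(cl))
    return sorted(centers)
```

The six resulting (w, h) pairs would serve as the six groups of preselected-box parameters mentioned above.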
and moreover, the activation function of the convolutional neural network detection model of the line sequence and color of the charging pile parts in step 3 is the Mish activation function:
f(x) = x · tanh(ln(1 + e^x)).
moreover, the convolutional neural network detection model of the line order and color of the charging pile part in the step 3 adopts a multi-scale idea and applies a priori frame anchors idea, and the convolutional neural network detection model comprises the following steps:
①, performing a series of convolution and up-sampling operations on the result of the last layer of 32× max-pooling down-sampling and fusing it with the result of the last layer of 16× max-pooling down-sampling to obtain the features of a first scale, 26 × 26; the features of a second scale, 52 × 52, are obtained in the same way. Each scale carries three groups of preselected boxes. The second-scale feature map (52 × 52) is the larger and the first-scale map the smaller; the larger the feature map, the smaller the receptive field and the more sensitive it is to small targets, so small anchor boxes are chosen for it, while large anchor boxes are chosen for large targets;
②, the prior boxes obtained by k-means cluster analysis yield a higher average IOU, so the model is easier to train and learn.
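The up-sample-and-fuse operation in ① can be sketched with array shapes alone; the channel counts here are illustrative assumptions:

```python
import numpy as np

def upsample2x(fmap):
    """Nearest-neighbour 2x up-sampling of an (H, W, C) feature map."""
    return np.repeat(np.repeat(fmap, 2, axis=0), 2, axis=1)

def fuse(deep, shallow):
    """Up-sample the deep (coarse) map and concatenate along channels."""
    up = upsample2x(deep)
    assert up.shape[:2] == shallow.shape[:2], "spatial sizes must match after up-sampling"
    return np.concatenate([up, shallow], axis=-1)

# 32x-down-sampled map (13 x 13) fused with the 16x-down-sampled map (26 x 26)
deep = np.zeros((13, 13, 256), dtype=np.float32)
shallow = np.zeros((26, 26, 128), dtype=np.float32)
scale1 = fuse(deep, shallow)  # first-scale feature: 26 x 26 with 384 channels
```

Repeating the same pattern one level shallower would yield the 52 × 52 second-scale feature.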
The invention has the advantages and beneficial effects that:
1. the invention provides a charging pile part line sequence color error-proofing detection method that addresses slow charging pile production amid the current promotion of new energy technology. In particular, it solves the low detection efficiency and poor performance of existing industrial detection methods for the line sequence and color of charging pile parts in dynamic real-time environments, reduces manual labor in the production process, avoids errors caused by inspection fatigue, effectively improves the precision and speed of detection and classification of charging pile parts, and meets the takt-time requirement of the production system.
2. the method adopts a convolutional neural network architecture, uses the Mish activation function to improve detection accuracy, and applies a multi-scale approach to realize automatic line sequence and color error-proofing detection of charging pile parts; it is suitable for intelligent assembly systems and can quickly and efficiently perform detection and classification during charging pile production in dynamic environments.
3. the data set can be updated and extended as needed to cover additional charging pile line sequences and color types, effectively supporting error-proofing detection of more kinds of charging pile parts and colors.
Drawings
Fig. 1 is a schematic flow chart of a charging pile part line sequence color error-proofing detection method according to the present invention;
FIG. 2 is a schematic diagram of an image using the Mish activation function of the present invention;
fig. 3 is a schematic flow chart of a charging pile part line sequence color error-proofing detection method according to an embodiment of the invention;
FIG. 4 is a schematic diagram of a receptacle base frame detection implemented according to an embodiment of the invention;
FIG. 5 is a schematic diagram of the detection of lines implemented according to an embodiment of the invention.
Detailed Description
The embodiments of the invention will be described in further detail below with reference to the accompanying drawings:
a charging pile part line sequence color error-proofing detection method is shown in figure 1 and comprises the following steps:
in this embodiment, a high-precision camera is used to collect video samples of the line sequence and color of the charging pile parts; the collected objects can be extended as needed to video samples of other specific targets among the charging pile parts.
the specific steps of the step 2 comprise:
(1) acquiring single-frame images of the video samples of the line sequence and the color of the charging pile parts acquired in the step 1 by adopting an image processing algorithm based on an OpenCV (open source computer vision library), and selecting the single-frame images of the charging pile parts with different illumination changes and different positions to form a basic data set;
(2) carrying out data enhancement processing on the single-frame sample image of the charging pile part in the basic data set, and then renaming the single-frame sample image according to the serial number to form a charging pile part data set;
the data enhancement processing of step 2(2) specifically comprises:
adjusting the brightness and contrast of the single-frame sample images of charging pile parts in the sample set, mainly to vary the brightness of the pictures and simulate the complex and changeable factory environment.
(3) Carrying out information annotation on each single-frame image of the charging pile parts in the charging pile part data set, wherein the annotated information comprises four pieces of position information of x, y, w and h and category information of a target frame, and dividing all single-frame images of the charging pile parts in the basic data set into a sample training set, a verification set and a test set according to a proportion as sample images;
in this embodiment, the single-frame images of the parts in the charging pile are not limited to a certain category of parts.
the specific steps of the step 3 comprise:
(1) taking a single-frame sample image of a charging pile part as input, the network divides the image into an S × S grid;
(2) bounding boxes are selected using dimension clusters to fix prior boxes (anchor boxes), and the neural network predicts four position values (x, y, w, h) and a confidence for each bounding box, where the confidence is expressed by the following formula:
Confidence = Pr(object) × IOU(truth, pred)
wherein Pr(object) indicates the likelihood that an object is contained, taking the value 1 or 0, and IOU(truth, pred) is the intersection-over-union between the annotated ground-truth box and the predicted coordinates, equal to 1 when the annotation box and the prediction box coincide; if the current prediction is not the best, yet its IOU reaches a certain threshold (0.5 is adopted), the prediction is ignored;
the confidence score is the class probability of the grid cell to which the bounding box belongs multiplied by the confidence of the bounding box, representing the probability that the object in the bounding box belongs to a certain class, as shown in the following formula:
Pr(class_i) × IOU(truth, pred) = Pr(class_i | object) × Pr(object) × IOU(truth, pred)
wherein Pr(class_i) indicates the probability that a grid cell belongs to class i, and Pr(class_i | object) represents the C class probabilities predicted for a grid cell containing a target. The various targets are identified by applying a non-maximum suppression algorithm (NMS) to the confidence scores over the C class probabilities;
(3) features are extracted at two different scales (16× and 8× max-pooling down-sampling); feature maps are taken from the earlier layers, up-sampled, and concatenated with feature maps of other resolutions, combining up-sampled features with fine-grained features from the early feature maps. The combined feature maps are processed by convolutional layers, and a Mish activation function after each convolutional layer improves the nonlinear learning capability of the network.
(4) Clustering the charging pile part data set again with the k-means clustering algorithm to obtain six groups of preselected-box parameters, then extracting features with a residual network to generate the trained convolutional neural network detection model of the line sequence and color of the charging pile parts;
in this embodiment, the activation function of the convolutional neural network detection model for the line order and color of the charging pile components in step 3 is a Mish activation function, and a function formula is as follows.
f(x) = x · tanh(ln(1 + e^x))
Fig. 2 is a schematic diagram of the Mish activation function used by the invention. Its positive branch extends to infinity, avoiding the saturation caused by capping, while small negative values retain gradient flow. Such a smooth activation function expresses and propagates information into deeper layers, thereby improving accuracy.
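The Mish formula transcribes directly into code; this is a sketch (with a numerically stable softplus for large inputs), not the patent's implementation:

```python
import math

def mish(x):
    """Mish activation: f(x) = x * tanh(ln(1 + e^x)) = x * tanh(softplus(x))."""
    # For large x, ln(1 + e^x) = x + ln(1 + e^-x) avoids overflow in exp.
    sp = x + math.log1p(math.exp(-x)) if x > 20 else math.log1p(math.exp(x))
    return x * math.tanh(sp)
```

The properties discussed above are visible numerically: mish(0) = 0, mish(x) approaches x for large positive x, and negative inputs are squashed toward zero without being cut off.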
In this embodiment, the convolutional neural network detection model for the line order and color of the charging pile components in step 3 adopts a multi-scale thought, and applies anchor boxes, and includes the following steps:
1) a series of convolution (convolutional) and up-sampling operations is performed on the result of the last layer of 32× max-pooling down-sampling, and the result is fused with the last layer of 16× max-pooling down-sampling to obtain the features of the first scale (26 × 26); the features of the second scale (52 × 52) are obtained by a similar method. Each scale carries three groups of preselected boxes. The larger the feature map, the smaller the receptive field and the more sensitive it is to small targets, so small anchor boxes are chosen for small targets and, conversely, large anchor boxes for large targets; because relatively few target types are detected, only these two multi-scale charging pile features are used;
2) the prior boxes obtained by k-means cluster analysis yield a higher average IOU, so the model is easier to train and learn.
In this embodiment, the training parameters of the network in step 3 are as follows: batch size 32, learning rate 0.00001, iterations 5000, and match threshold 0.5.
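The embodiment's hyperparameters can be collected in one place; the dict layout itself is an illustrative assumption, not the patent's configuration format:

```python
# Training hyperparameters quoted from the embodiment above.
TRAIN_CONFIG = {
    "batch_size": 32,
    "learning_rate": 1e-5,
    "iterations": 5000,
    "match_threshold": 0.5,  # IOU threshold for matching predictions to labels
}
```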
the specific method of step 4 is: using the sample verification set, the parameters of the model trained in step 3 are iteratively optimized; when the parameters stabilize, the test set is used for testing, finally yielding a convolutional-neural-network-based detection model of the line sequence and color of charging pile parts and realizing error-proofing detection of line sequence and color in industrial charging pile production.
As shown in fig. 3, for the line sequence color error-proofing detection of the charging pile part in the embodiment of the present invention, after the detection model is obtained, the detection process includes the following steps:
a high precision industrial camera device is used to provide real time images.
A single frame image is extracted by an OpenCV-based image processing algorithm.
And identifying the image in real time with the detection model. The detection targets include the frame and sockets of the wire-end socket base and wires of different colors. Fig. 4 is a schematic diagram of detecting the frame of the socket base according to an embodiment of the invention, in which the frame of the socket base and the three plug-wire holes 1#, 2#, and 3# in the base are detected.
And outputting the line sequence of the part to judge whether it is correct. Fig. 5 shows detection of wires of different colors according to an embodiment of the invention. Whether the line sequence is correct is judged from the detected positions of the differently colored wires relative to the 1#, 2#, and 3# plug-wire holes. If correct, the industrial robot proceeds with assembly; if incorrect, an alarm device is triggered.
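The final correctness check can be sketched as matching each detected wire to the nearest plug-wire hole by position and comparing the resulting order with the expected one; the color names and coordinates below are invented for illustration:

```python
def check_line_sequence(holes, wires, expected):
    """Map each detected wire to the nearest hole centre and compare colours.

    holes:    {hole_id: (x, y)} detected plug-wire hole centres (1#, 2#, 3#)
    wires:    {colour: (x, y)}  detected wire-end centres
    expected: {hole_id: colour} the required line sequence
    Returns True only if every hole receives its expected colour.
    (A sketch: ties or two wires claiming one hole would need extra handling.)
    """
    assignment = {}
    for colour, (wx, wy) in wires.items():
        nearest = min(holes,
                      key=lambda h: (holes[h][0] - wx) ** 2 + (holes[h][1] - wy) ** 2)
        assignment[nearest] = colour
    return assignment == expected

# Illustrative layout: three holes in a row, wire ends detected just above them.
holes = {"1#": (10, 50), "2#": (30, 50), "3#": (50, 50)}
expected = {"1#": "red", "2#": "blue", "3#": "yellow"}
ok = check_line_sequence(holes, {"red": (11, 40), "blue": (29, 41), "yellow": (52, 39)}, expected)
bad = check_line_sequence(holes, {"red": (29, 41), "blue": (11, 40), "yellow": (52, 39)}, expected)
```

In the embodiment, a True result would release the part to the industrial robot for assembly and a False result would trigger the alarm device.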
The convolutional neural network model used by the invention is based on darknet-53, comprising convolutional layers, activation layers, and max-pooling layers; darknet-53 greatly extends classification capability on multi-class data sets while retaining a high detection speed.
The invention discloses a convolutional-neural-network-based charging pile part line sequence color error-proofing detection method that solves problems such as slow manual identification of charging pile parts and high false detection rates in intelligent assembly. Using a multi-scale approach and the Mish activation function, the method realizes real-time error-proofing detection and identification of the line sequence and color of charging pile parts, stably detecting the wire-end socket base and wires of different colors, and also covers incoming-material inspection of the charging pile part as a whole. It makes line sequence and color error-proofing detection feasible for intelligent assembly in charging pile production, reduces the false detection rate and unnecessary investment of manpower and materials, suits dynamic environments and small-target detection, and effectively improves detection precision and speed.
It should be emphasized that the examples described herein are illustrative and not restrictive, and thus the present invention includes, but is not limited to, those examples described in this detailed description, as well as other embodiments that can be derived from the teachings of the present invention by those skilled in the art and that are within the scope of the present invention.
Claims (6)
1. A charging pile part line sequence color error-proofing detection method, characterized in that it comprises the following steps:
step 1, collecting line sequence and color video samples of charging pile parts;
step 2, acquiring single-frame images of the video samples of the line sequence and the color of the charging pile parts acquired in the step 1 to form a basic data set; performing data enhancement on the image of the basic data set to form a charging pile part data set; marking the position and the category information of the charging pile part parts in the charging pile part data set, and dividing the marked charging pile part data set into a training set, a verification set and a test set;
step 3, training a convolutional neural network on the sample training set to generate a trained convolutional neural network detection model of the line sequence and color of the charging pile parts;
and 4, verifying the trained convolutional neural network detection model of the line sequence and the color of the charging pile part in the step 3 by using a sample verification set, optimizing the network to obtain an improved convolutional neural network detection model of the line sequence and the color of the charging pile part, and inputting the test set into the improved convolutional neural network detection model of the line sequence and the color of the charging pile part to test to obtain a detection result.
2. The charging pile part line order color error-proofing detection method according to claim 1, characterized in that: the specific steps of the step 2 comprise:
(1) acquiring single-frame images of the video samples of the line sequence and the color of the charging pile parts collected in the step 1 by adopting an image processing algorithm based on an open source computer vision library, and selecting the single-frame images of the charging pile parts with different illumination changes and different positions to form a basic data set;
(2) carrying out data enhancement processing on the single-frame sample image of the charging pile part in the basic data set, and then renaming the single-frame sample image according to the serial number to form a charging pile part data set;
(3) annotating each single-frame image of the charging pile parts in the charging pile part data set, the annotated information comprising the x, y, w and h position information of the target frame and the category information, and dividing all annotated single-frame sample images proportionally into a sample training set, a verification set and a test set.
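The proportional split in step (3) can be sketched as follows; the 8:1:1 ratio is an assumption for illustration, since the claim only requires division in some fixed proportion:

```python
import random

def split_dataset(image_paths, train=0.8, val=0.1, seed=0):
    """Shuffle the annotated sample images and split them into
    training / verification / test sets by proportion."""
    paths = list(image_paths)
    random.Random(seed).shuffle(paths)  # fixed seed for reproducibility
    n_train = int(len(paths) * train)
    n_val = int(len(paths) * val)
    return (paths[:n_train],
            paths[n_train:n_train + n_val],
            paths[n_train + n_val:])
```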
3. The charging pile part line sequence color error-proofing detection method according to claim 2, characterized in that the data enhancement in step (2) of step 2 specifically comprises:
adjusting the brightness and contrast of the single-frame sample images of the charging pile parts in the basic data set.
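A minimal sketch of this brightness and contrast adjustment, assuming the common linear form out = alpha · image + beta; the specific alpha and beta values are illustrative and not taken from the claim:

```python
import numpy as np

def adjust_brightness_contrast(image, alpha=1.2, beta=20):
    """Linear brightness/contrast enhancement of a uint8 image array
    (e.g. a frame loaded with an OpenCV-style reader):
    alpha scales contrast, beta shifts brightness."""
    out = image.astype(np.float32) * alpha + beta
    return np.clip(out, 0, 255).astype(np.uint8)  # keep valid pixel range
```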
4. The charging pile part line sequence color error-proofing detection method according to claim 1 or 2, characterized in that step 3 specifically comprises:
(1) taking a single-frame sample image of a charging pile part as input, the network dividing the image into an S × S grid;
(2) selecting bounding boxes using dimension-clustered fixed prior boxes, and predicting four position values x, y, w, h and a confidence for each bounding box with the neural network, wherein the confidence is expressed by the following formula:
Confidence = Pr(object) × IOU(truth, pred)
wherein Pr(object) indicates the likelihood that the box contains an object, taking the value 1 or 0; IOU(truth, pred) represents the intersection-over-union between the ground-truth annotation box and the predicted coordinates, and equals 1 when the annotation box and the prediction box coincide; if the current prediction is not the best but its IOU is above a threshold, the prediction is ignored;
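The IOU(truth, pred) term in the confidence formula can be sketched as follows, assuming corner-format boxes (x1, y1, x2, y2):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two corner-format boxes."""
    # overlap rectangle
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```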
the confidence score is calculated using the following formula:
Pr(class_i | object) × Pr(object) × IOU(truth, pred) = Pr(class_i) × IOU(truth, pred)
wherein Pr(class_i) indicates the probability that a grid cell belongs to class i, and Pr(class_i | object) represents the C class probabilities predicted by a grid cell that contains a target; according to the C class probabilities and the confidence scores, a non-maximum suppression algorithm is applied to identify the various targets;
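A minimal sketch of the non-maximum suppression step, assuming the standard greedy form that suppresses by IOU overlap; the 0.5 threshold is illustrative:

```python
def _iou(a, b):
    # intersection-over-union of corner-format boxes (x1, y1, x2, y2)
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union > 0 else 0.0

def non_max_suppression(boxes, scores, iou_threshold=0.5):
    """Greedy NMS: repeatedly keep the highest-scoring box and drop
    any remaining box that overlaps it too much. Returns kept indices."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order
                 if _iou(boxes[best], boxes[i]) < iou_threshold]
    return keep
```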
(3) extracting features for bounding boxes of two different sizes: extracting feature maps from the previous two layers, up-sampling them, and concatenating feature maps of different resolutions so as to combine the up-sampled features with the fine-grained features of the earlier feature maps; the combined feature maps are then processed by convolutional layer operations;
(4) re-clustering the charging pile part data set with a k-means clustering algorithm to obtain six groups of pre-selection box parameters, and then extracting features with a residual network to generate the trained convolutional neural network detection model of the line sequence and color of the charging pile parts.
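The k-means re-clustering of annotation-box sizes can be sketched as follows, using the 1 − IOU distance common in YOLO-style anchor selection; the claim fixes six groups of pre-selection boxes (k = 6), everything else here is an assumption:

```python
import random

def kmeans_anchors(wh_pairs, k=6, iters=50, seed=0):
    """Cluster annotation-box (w, h) pairs into k prior boxes,
    assigning by highest IOU (i.e. smallest 1 - IOU distance)."""
    def iou_wh(a, b):
        # IOU of two boxes assumed to share a corner, so only w/h matter
        inter = min(a[0], b[0]) * min(a[1], b[1])
        return inter / (a[0] * a[1] + b[0] * b[1] - inter)

    rng = random.Random(seed)
    centers = rng.sample(list(wh_pairs), k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for wh in wh_pairs:
            best = max(range(k), key=lambda c: iou_wh(wh, centers[c]))
            clusters[best].append(wh)
        for j, cl in enumerate(clusters):
            if cl:  # empty clusters keep their previous center
                centers[j] = (sum(w for w, _ in cl) / len(cl),
                              sum(h for _, h in cl) / len(cl))
    return centers
```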
5. The charging pile part line sequence color error-proofing detection method according to claim 1 or 2, characterized in that the activation function of the convolutional neural network detection model of the line sequence and color of the charging pile parts in step 3 is the Mish activation function, given by:
f(x) = x · tanh(ln(1 + e^x)).
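A direct implementation of the Mish formula above; note that ln(1 + e^x) is the softplus function, and a production implementation would also guard against overflow of e^x for very large inputs:

```python
import math

def mish(x):
    """Mish activation: f(x) = x * tanh(ln(1 + e^x))."""
    # math.log1p(math.exp(x)) computes softplus accurately for moderate x
    return x * math.tanh(math.log1p(math.exp(x)))
```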
6. The charging pile part line sequence color error-proofing detection method according to claim 1 or 2, characterized in that the convolutional neural network detection model of the line sequence and color of the charging pile parts in step 3 adopts the multi-scale idea and applies the prior-box (anchor) idea, comprising the following steps:
① performing a series of convolution and up-sampling operations on the result of the last layer after 32× max-pooling down-sampling and fusing it with the result of the last layer after 16× max-pooling down-sampling to obtain features at a first scale of 26 × 26; features at a second scale of 52 × 52 are obtained similarly; each scale's features comprise three groups of pre-selection boxes; the feature map corresponding to the first scale is the largest and that corresponding to the second scale is the smallest; that is, the larger the feature map, the smaller the receptive field and the more sensitive it is to small targets, so small anchor boxes are selected for small targets and large anchor boxes for large targets;
② the prior boxes obtained by k-means cluster analysis yield a higher average IOU, making the model easier to train and learn.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010196160.3A CN111383223A (en) | 2020-03-19 | 2020-03-19 | Charging pile part line sequence color error-proofing detection method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111383223A true CN111383223A (en) | 2020-07-07 |
Family
ID=71221649
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010196160.3A Pending CN111383223A (en) | 2020-03-19 | 2020-03-19 | Charging pile part line sequence color error-proofing detection method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111383223A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107895362A (en) * | 2017-10-30 | 2018-04-10 | 华中师范大学 | A machine vision method for miniature binding post quality inspection |
CN108961235A (en) * | 2018-06-29 | 2018-12-07 | 山东大学 | A defective insulator identification method based on the YOLOv3 network and a particle filter algorithm |
CN109345519A (en) * | 2018-09-21 | 2019-02-15 | 江苏拙术智能制造有限公司 | Wiring harness connector machining model detection method based on the deep learning YOLO algorithm |
CN109638959A (en) * | 2018-12-06 | 2019-04-16 | 杭州意能电力技术有限公司 | Power equipment remote signaling function debugging method and system based on AR and deep learning |
CN110796107A (en) * | 2019-11-04 | 2020-02-14 | 南京北旨智能科技有限公司 | Power inspection image defect identification method and system and power inspection unmanned aerial vehicle |
Non-Patent Citations (4)
Title |
---|
DIGANTA MISRA: "Mish: A Self Regularized Non-Monotonic Neural Activation Function", arXiv *
Su Dong et al.: "Automatic identification algorithm for OBD port occupancy status based on deep learning", Journal of Beijing University of Posts and Telecommunications *
Dong Hongyi: "PyTorch Object Detection in Practice with Deep Learning" (深度学习之PyTorch物体检测实战), 31 January 2020 *
Cai Chengtao: "Ocean Buoy Target Detection Technology" (海洋浮标目标探测技术), 30 November 2019 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111429418A (en) | Industrial part detection method based on YOLOv3 neural network | |
CN110059694B (en) | Intelligent identification method for character data in complex scene of power industry | |
CN108960245B (en) | Tire mold character detection and recognition method, device, equipment and storage medium | |
CN108961235B (en) | Defective insulator identification method based on YOLOv3 network and particle filter algorithm | |
CN110992317A (en) | PCB defect detection method based on semantic segmentation | |
CN106875373B (en) | Mobile phone screen MURA defect detection method based on convolutional neural network pruning algorithm | |
CN107123111B (en) | Deep residual error network construction method for mobile phone screen defect detection | |
CN109816725A (en) | A kind of monocular camera object pose estimation method and device based on deep learning | |
CN108648233A (en) | A kind of target identification based on deep learning and crawl localization method | |
CN111950453A (en) | Optional-shape text recognition method based on selective attention mechanism | |
CN109902761B (en) | Fishing situation prediction method based on marine environment factor fusion and deep learning | |
CN111612051B (en) | Weak supervision target detection method based on graph convolution neural network | |
CN111008576B (en) | Pedestrian detection and model training method, device and readable storage medium | |
CN111553949A (en) | Positioning and grabbing method for irregular workpiece based on single-frame RGB-D image deep learning | |
CN108802041B (en) | Method for rapidly changing small sample set of screen detection | |
CN113516656B (en) | Defect image data processing simulation method based on ACGAN and Cameralink cameras | |
CN107247952B (en) | Deep supervision-based visual saliency detection method for cyclic convolution neural network | |
CN108875819B (en) | Object and component joint detection method based on long-term and short-term memory network | |
CN112750129A (en) | Image semantic segmentation model based on feature enhancement position attention mechanism | |
CN112149535A (en) | Lane line detection method and device combining SegNet and U-Net | |
CN111967313A (en) | Unmanned aerial vehicle image annotation method assisted by deep learning target detection algorithm | |
CN111539931A (en) | Appearance abnormity detection method based on convolutional neural network and boundary limit optimization | |
CN114627553A (en) | Method for detecting classroom scene student behaviors based on convolutional neural network | |
CN113361496A (en) | City built-up area statistical method based on U-Net | |
CN111383223A (en) | Charging pile part line sequence color error-proofing detection method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | | Application publication date: 20200707 |