CN113159183B - Tiny pest image identification method based on local dense area density feature detection - Google Patents
- Publication number
- CN113159183B CN113159183B CN202110440782.0A CN202110440782A CN113159183B CN 113159183 B CN113159183 B CN 113159183B CN 202110440782 A CN202110440782 A CN 202110440782A CN 113159183 B CN113159183 B CN 113159183B
- Authority
- CN
- China
- Prior art keywords
- pest
- dense
- network
- region
- area
- Prior art date
- Legal status: Active (assumption, not a legal conclusion)
Classifications
- G06F18/214 — Pattern recognition; Generating training patterns; Bootstrap methods, e.g. bagging or boosting
- G06F18/25 — Pattern recognition; Fusion techniques
- G06F18/253 — Pattern recognition; Fusion techniques of extracted features
- G06N3/045 — Neural networks; Architecture; Combinations of networks
- G06N3/048 — Neural networks; Architecture; Activation functions
- G06N3/084 — Neural networks; Learning methods; Backpropagation, e.g. using gradient descent
- G06V2201/07 — Image or video recognition or understanding; Target detection
Abstract
The invention relates to a tiny pest image identification method based on local dense region density feature detection, which overcomes the low identification rate for tiny pests in the prior art. The invention comprises the following steps: acquiring a training image; constructing a pest dense area detection network; training the pest dense area detection network; standardizing pest dense areas; constructing and training a local area pest target detection network group; constructing and training a global pest target detection network; fusing pest detection results; acquiring an image of the pest to be detected; and obtaining a pest image detection result. By using the density feature information of tiny pest aggregation areas, the invention accurately delimits dense areas and performs individual pest target detection in them, alleviating the missed detections and low detection precision of global pest target detection in such areas and improving the overall precision of tiny pest image detection.
Description
Technical Field
The invention relates to the technical field of image processing, in particular to a tiny pest image identification method based on local dense region density feature detection.
Background
Most traditional agricultural pest forecasting methods rely on manual field investigation for identification and quantity estimation; identification accuracy depends on the investigators' professional knowledge, and quantity estimates are influenced by their subjective judgment, so forecasting results vary widely. In recent years, pest identification and detection algorithms based on machine vision and image processing technologies have been widely applied to agricultural pest identification and detection work, greatly reducing the labor cost of field investigation and improving the accuracy of identification and counting.
In practical application, although existing target detection algorithms perform well on pests of large size and high distinctiveness, they suffer from many missed detections and poor detection precision on pests of small size and high aggregation density, such as wheat aphids. This is because a global target detection algorithm running on the entire image has low detection resolution and has difficulty resolving tiny targets. Directly raising the detection resolution of the global algorithm would greatly increase its computational burden and occupy large amounts of computing resources, and so cannot meet practical application requirements.
Therefore, how to improve the detection of tiny pests while preserving operational efficiency has become a technical problem that the tiny pest detection task urgently needs to solve.
Disclosure of Invention
The invention aims to overcome the low identification rate for tiny pests in the prior art, and provides a tiny pest image identification method based on local dense area density feature detection to solve this problem.
In order to achieve the purpose, the technical scheme of the invention is as follows:
a tiny pest image identification method based on local dense region density feature detection comprises the following steps:
11) acquisition of training images: acquiring a pest image data set with an artificial mark;
12) constructing a pest dense area detection network: constructing a pest dense area detection network, wherein the pest dense area detection network comprises an overall feature extraction network and a dense area suggestion network;
13) training of a pest dense area detection network: training a pest dense area detection network by using a pest image data set;
14) standardization of pest dense areas: performing standardized operation on a local pest region output by the pest dense region detection network; the standardized operation of the pest dense region comprises pest dense region merging operation and pest dense region segmentation operation, pest local regions output by a pest dense region detection network are input, and standardized image local regions which are grouped according to density scores and are similar in size are output;
15) constructing and training a local area pest target detection network group: constructing and training a local area pest target detection network group; inputting image local areas which are obtained by standardizing pest dense areas and grouped according to density scores, and outputting pest identification and positioning results in the image local areas which are grouped according to the density scores;
16) constructing and training a global pest target detection network;
17) and (3) fusing pest detection results: fusing the pest identification and positioning results output by the global pest target detection network and the local region pest target detection network group to obtain a global pest identification and positioning result;
18) obtaining an image of the pest to be detected: acquiring a tiny pest image to be detected;
19) and obtaining a pest image detection result.
The construction of the pest dense area detection network comprises the following steps:
21) setting an overall feature extraction network: the overall feature extraction network comprises a backbone network and a feature fusion network; the backbone network consists of a plurality of layers of convolutional neural network layers, a pooling layer and an activation function layer which are superposed and is used for extracting basic features in the picture and outputting a plurality of layers of feature maps; the feature fusion network fuses the feature maps of all layers by laterally connecting the multi-layer feature maps output by the backbone network, and outputs an overall feature map considering different levels of semantic features; wherein the backbone network is a ResNet50 network, and the feature fusion network is a FPN feature pyramid network;
22) setting a dense area proposal network: setting the input of the dense area suggestion network as an overall characteristic graph output by an overall characteristic extraction network, and outputting the overall characteristic graph as a density score of a selected area which takes each anchor point as the center;
the dense area proposal network first uses a convolution layer with kernel size 3 × 3 and 512 channels, followed by the linear rectification function ReLU as the convolution layer activation function, and then a convolution layer with kernel size 1 × 1, whose channel number is S × R, the product of the number of region shapes S and the number of region magnification ratios R.
The training of the pest dense area detection network comprises the following steps:
31) inputting a pest image data set with artificial labels into an overall feature extraction network, extracting an image basic feature map through a backbone network, and outputting an overall feature map after multilayer semantics are mutually fused through a feature fusion network by the basic feature map;
32) the overall feature map is input into the dense area proposal network. An anchor point A slides over the feature map with a single sliding step k; centred on the anchor point there are S shapes of region marquees, each with R magnification ratios. When the anchor point slides to the i-th position, the number of artificial marks contained in the marquee of shape s at magnification ratio r is the target number n_i^{s,r}, and the area of the current region marquee is a_i^{s,r}. The target density score in the current marquee is represented using the following formula:
d_i^{s,r} = ln( n_i^{s,r} / a_i^{s,r} ) + O
wherein O is a deviation compensation coefficient ensuring that the target density score is a positive number; O = 10 is taken in this application, and the maximum value of the target density score is set to d_max = 4 and the minimum value to d_min = 1;
The target density score d_i^{s,r} of the current marquee is set as the real density score, and the score that the network outputs through the convolutional layers from the overall feature map, d̂_i^{s,r}, is set as the predicted target density score of the current marquee. The loss function generated by the current image for back-propagation training of the dense area detection network is represented using the following formula:
L = (1/I) Σ_i Σ_{s,r} L_i^{s,r}
wherein I is the number of anchor point positions in the image, and the loss function for each marquee is calculated with the smooth L1 norm: L_i^{s,r} = SmoothL1( d̂_i^{s,r} − d_i^{s,r} ).
Finally, for each image the trained pest dense area detection network outputs a series of candidate regions with corresponding predicted density scores; the candidate regions with high density scores are the dense regions.
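The density-score computation described above can be sketched in Python. The exact formula is not reproduced in the text, so this sketch assumes the score is the natural logarithm of the target count over the marquee area plus the compensation coefficient O (which makes the otherwise-negative log density positive), clipped to [d_min, d_max]:

```python
import math

def density_score(n_targets, area, offset=10.0, d_min=1.0, d_max=4.0):
    """Density score of one anchor marquee.

    Assumed form: ln(targets / area) + O, clipped to [d_min, d_max];
    an empty marquee gets the lowest ("not dense") score.
    """
    if n_targets == 0:
        return d_min
    raw = math.log(n_targets / area) + offset
    return max(d_min, min(d_max, raw))
```

With O = 10 a marquee holding 50 marks in an area of 100 saturates at d_max = 4, while a single mark in a very large marquee stays near d_min, which matches the 1–4 score range the patent defines.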
The pest dense area standardization comprises the following steps:
41) merging the candidate regions with similar density scores and highly overlapped regions, inputting a dense region set and a corresponding density score output by a dense region detection network, and outputting the merged dense region set and the corresponding density score; the dense area merging step is as follows:
411) The density scores are divided into 5 groups by score: a score of 1–2 is not dense, 2–2.5 is sparse, 2.5–3 is moderately dense, 3–3.5 is generally dense, and 3.5–4 is extremely dense;
the artificially marked agricultural tiny pest image is input into the trained pest dense area detection network to obtain the local pest regions, wherein each dense region is represented by its top-left and bottom-right coordinates;
412) An overlap calculation formula is set to calculate the overlap OL(a, b) of two dense areas a and b, and a synthesis threshold N_t is set to judge from the overlap whether the two dense areas a and b need to be synthesized: if OL(a, b) is greater than the threshold N_t, the dense areas a and b are merged;
413) A merge operation is set, whose input is the set of regions to be merged together with their corresponding density scores, and whose output {b′, d′} is the new merged region and its density score: the smallest top-left coordinates and the largest bottom-right coordinates among the input regions are taken as the top-left and bottom-right coordinates of the synthesized region b′, and the density score of the synthesized region is the minimum of the input density scores, recorded as d′;
414) The region b_k with the largest corresponding density score d_k is taken from the set of dense regions, and its overlap OL(b_k, b_i) with every other remaining region b_i is calculated. If the overlap is greater than the synthesis threshold N_t and the corresponding density scores d_k and d_i belong to the same density group, b_i and d_i are moved into the merge candidate set. If the candidate set is not empty after the traversal finishes, the currently selected b_k and d_k are also put into the candidate set, the merge operation is performed, and its output b′, d′ is put back into the sets of dense regions and density scores; otherwise b_k and d_k are put into the output set. These operations are repeated until all regions have been taken out;
415) The output sets contain the merged dense regions and their corresponding density scores;
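The merging procedure of steps 412)–415) can be sketched as follows. The overlap measure OL is not reproduced in the text, so intersection area over the smaller region's area is assumed here, and `group_of` is a caller-supplied stand-in for the density-score grouping of step 411):

```python
def overlap(a, b):
    # Boxes are (x1, y1, x2, y2). OL is taken here as intersection area
    # over the smaller box's area -- an assumption, since the patent's
    # formula is not reproduced in the text.
    ix = max(0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0, min(a[3], b[3]) - max(a[1], b[1]))
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return (ix * iy) / min(area(a), area(b))

def merge(boxes, scores):
    # Envelope box from the min top-left / max bottom-right corners;
    # the merged score is the minimum of the member scores (step 413).
    x1 = min(b[0] for b in boxes); y1 = min(b[1] for b in boxes)
    x2 = max(b[2] for b in boxes); y2 = max(b[3] for b in boxes)
    return (x1, y1, x2, y2), min(scores)

def merge_dense_regions(boxes, scores, group_of, n_t=0.5):
    """Greedy merging (step 414): repeatedly take the highest-scoring
    region, absorb every same-group region overlapping it above N_t,
    and put the merged region back into the pool."""
    pool = sorted(zip(scores, boxes), reverse=True)
    out_boxes, out_scores = [], []
    while pool:
        d_k, b_k = pool.pop(0)
        cand = [(d, b) for d, b in pool
                if overlap(b_k, b) > n_t and group_of(d) == group_of(d_k)]
        if cand:
            pool = [p for p in pool if p not in cand]
            b_new, d_new = merge([b_k] + [b for _, b in cand],
                                 [d_k] + [d for d, _ in cand])
            pool.append((d_new, b_new))
            pool.sort(reverse=True)
        else:
            out_boxes.append(b_k)
            out_scores.append(d_k)
    return out_boxes, out_scores
```

Two extremely dense regions (0,0,10,10) and (2,2,12,12) with scores 3.6 and 3.7 overlap at 0.64 under this measure, so with N_t = 0.5 they merge into the envelope (0,0,12,12) with score 3.6.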
42) Segmentation is performed on the oversized regions among the merged dense regions; the input is the merged dense region set and corresponding density scores output by the merging operation, and the output is the segmented dense region set and corresponding density scores. The segmentation steps are as follows:
421) A segmentation threshold L_t is set for judging whether the current dense region needs to be segmented;
422) A region b_i is taken from the merged set; if the region does not need to be segmented, it is put into the output set together with its corresponding density score. Otherwise a bisection operation is performed, an overlapping band of L_t/4 is kept at the bisection boundary, the segmented sub-regions keep the original density score unchanged, and they are put into the output set. These operations are repeated until all regions have been taken out;
423) The output set contains the segmented dense regions and their corresponding density scores.
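Steps 421)–423) can be sketched as below. The text does not state the segmentation criterion, so this sketch assumes a region needs segmentation when its longer side exceeds L_t; the bisection keeps the L_t/4 overlap band and the halves inherit the parent's density score:

```python
def split_oversized(boxes, scores, l_t=800):
    """Repeatedly bisect any region whose longer side exceeds l_t
    (assumed criterion) along that side, keeping an l_t/4 overlap band
    at the cut so targets on the boundary are not lost; split halves
    keep the parent's density score."""
    out_boxes, out_scores = [], []
    work = list(zip(boxes, scores))
    while work:
        (x1, y1, x2, y2), d = work.pop()
        w, h = x2 - x1, y2 - y1
        if max(w, h) <= l_t:              # small enough: emit as-is
            out_boxes.append((x1, y1, x2, y2))
            out_scores.append(d)
        elif w >= h:                      # bisect along x with overlap
            mid = (x1 + x2) / 2
            work.append(((x1, y1, mid + l_t / 4, y2), d))
            work.append(((mid - l_t / 4, y1, x2, y2), d))
        else:                             # bisect along y with overlap
            mid = (y1 + y2) / 2
            work.append(((x1, y1, x2, mid + l_t / 4), d))
            work.append(((x1, mid - l_t / 4, x2, y2), d))
    return out_boxes, out_scores
```

A 1000 × 400 region with L_t = 800 is cut once at x = 500 into two 700-wide halves overlapping by 400 (200 either side of the cut), both of which now fit under the threshold.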
The construction and training of the local area pest target detection network group comprises the following steps:
51) constructing a plurality of groups of parallel local area pest target detection networks to form a local area pest target detection network group; the input of each group of local area pest target detection network is a local area which is obtained by standardizing a pest dense area and is grouped in a corresponding density mode, and the output is an identification and positioning result of the pest target in the grouped local area; the construction of each group of local area pest target detection network comprises the following steps:
511) setting a pest feature extraction network; the pest feature extraction network is used for extracting an input pest feature map of a local region, inputting a dense region obtained by standardizing a pest region, and outputting the pest feature map based on the dense region;
512) setting a pest identification and positioning network; the pest identification and positioning network is used for automatically learning pest characteristic diagrams and identifying and positioning pests, inputting the obtained pest characteristic diagrams of the dense area, and outputting pest type identification and positioning results of the dense area;
52) training a plurality of groups of parallel local area pest target detection networks by using the obtained standardized pest dense areas; the training comprises the following steps:
521) According to the density score of each input region, the normalized dense regions are grouped by their corresponding density scores into the sparse, moderately dense, generally dense and extremely dense groups;
522) and inputting each grouped local dense region into the corresponding grouped local region pest target detection network for training to obtain the trained local region pest target detection network.
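The grouping that routes each normalized region to its detection network follows the five score ranges defined in the standardization step. A sketch (the group names are translations of the patent's labels, and regions scored as not dense are assumed to be left to the global detector rather than sent to the network group):

```python
def density_group(score):
    """Map a density score to the five groups defined in step 411)."""
    if score < 2:
        return "not dense"
    if score < 2.5:
        return "sparse"
    if score < 3:
        return "moderately dense"
    if score < 3.5:
        return "generally dense"
    return "extremely dense"

def group_regions(boxes, scores):
    """Bucket normalized dense regions by density group so each group's
    detection network receives only regions of similar density."""
    groups = {}
    for b, d in zip(boxes, scores):
        g = density_group(d)
        if g != "not dense":  # assumed: handled by the global detector
            groups.setdefault(g, []).append((b, d))
    return groups
```

Each bucket is then fed to the correspondingly grouped local area pest target detection network for training or inference.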
The construction and training of the global pest target detection network comprises the following steps:
61) constructing a global pest target detection network comprising an overall feature extraction network and a pest target identification and positioning network;
62) setting an overall characteristic extraction network for extracting a characteristic graph in the whole input picture, wherein the input picture is a tiny pest picture, and the output picture is an overall characteristic graph obtained based on the whole pest picture;
63) setting a pest target identification and positioning network for automatically learning an integral characteristic diagram and detecting pest targets, inputting the integral characteristic diagram, and outputting a pest identification result and a positioning result;
64) and training the global pest target detection network.
The pest image detection result obtaining method comprises the following steps:
71) inputting an agricultural tiny pest image to be detected into a trained pest dense area detection network to obtain a pest local area;
72) standardizing pest dense areas in pest local areas to generate standardized pest dense areas;
73) the pest dense areas are grouped according to the corresponding density scores, and corresponding target detection networks in the trained local area pest target detection network groups are respectively input to generate local area pest target detection results of the corresponding density groups;
74) inputting agricultural tiny pest images to be detected into the trained global pest target detection network to obtain a global pest target detection result;
75) Fusion of pest target detection results: the global pest target detection result and the local area pest target detection results are fused to obtain the final target detection result.
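The fusion step can be sketched as follows. The patent only states that the global and local results are fused, so this sketch assumes local boxes are first shifted back into whole-image coordinates and duplicates between the two result sets are then removed with per-class greedy non-maximum suppression:

```python
def to_global(box, region):
    # Shift a box detected inside a dense region back to image coordinates;
    # region is the dense region's (x1, y1, x2, y2) in the full image.
    rx, ry = region[0], region[1]
    return (box[0] + rx, box[1] + ry, box[2] + rx, box[3] + ry)

def iou(a, b):
    ix = max(0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0

def fuse(global_dets, local_dets, iou_thr=0.5):
    """Concatenate global and coordinate-shifted local detections
    (dicts with "cls", "score", "box") and keep the highest-scoring box
    among same-class overlaps -- an assumed dedup rule."""
    dets = sorted(global_dets + local_dets, key=lambda d: -d["score"])
    kept = []
    for d in dets:
        if all(d["cls"] != k["cls"] or iou(d["box"], k["box"]) < iou_thr
               for k in kept):
            kept.append(d)
    return kept
```

A tiny pest found by both detectors thus survives once, with the (usually higher) local-region confidence, while pests seen by only one detector are kept unchanged.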
Advantageous effects
Compared with the prior art, the tiny pest image identification method based on local dense area density feature detection uses the density feature information of tiny pest aggregation areas to accurately delimit dense areas and perform individual pest target detection in them, alleviating the missed detections and low detection precision of global pest target detection in such areas and improving the overall precision of tiny pest image detection.
Drawings
FIG. 1 is a sequence diagram of the method of the present invention;
fig. 2-5 are graphs showing the image recognition result of the pest according to the method of the present invention.
Detailed Description
So that the above recited features of the present invention can be readily understood, a more particular description of the invention, briefly summarized above, is given below with reference to embodiments, some of which are illustrated in the appended drawings:
as shown in fig. 1, the method for identifying a tiny pest image based on the density feature detection of a local dense region according to the present invention includes the following steps:
firstly, acquiring a training image: a pest image dataset with artificial markers is acquired.
Secondly, constructing a pest dense area detection network: and constructing a pest dense area detection network, wherein the pest dense area detection network comprises an overall feature extraction network and a dense area suggestion network. The overall characteristic extraction network is used for extracting a characteristic diagram of pests in the whole image, the network inputs an agricultural tiny pest image and outputs an overall characteristic diagram extracted based on the pest image; and the dense region suggestion network predicts the pest dense region and the density degree according to the overall depth characteristic map, the network inputs the overall characteristic map, and the network outputs the dense region and the density score corresponding to each region.
The pest dense region detection network locates pest dense distribution regions in the pest pictures and outputs the dense regions to a subsequent local pest target identification and location network for individual detection. In the process, the resolution ratio of the tiny pest targets in the local area is increased, the difficulty in identifying and positioning the tiny pests by a pest target identification and positioning network is reduced, and the identification, positioning and detection performance of the tiny pest targets by the identification and positioning network is finally improved. The difficulty lies in the accurate resolution of dense region targets in the overall feature map and the correct prediction of region density scores. When training is insufficient, the network has the problems that dense region selection is not accurate, and the region density score is not in accordance with the reality seriously. The method comprises the following specific steps:
(1) setting an overall feature extraction network: the overall feature extraction network comprises a backbone network and a feature fusion network; the backbone network consists of a plurality of superposed convolutional neural network layers, a pooling layer and an activation function layer and is used for extracting basic features in the picture and outputting a plurality of layers of feature maps; the feature fusion network fuses the feature maps of all layers by laterally connecting the multi-layer feature maps output by the backbone network, and outputs an overall feature map considering different levels of semantic features; the backbone network is a ResNet50 network, and the feature fusion network is an FPN feature pyramid network.
(2) Setting a dense area suggestion network: setting the input of the dense area suggestion network as an overall characteristic graph output by an overall characteristic extraction network, and outputting the overall characteristic graph as a density score of a selected area which takes each anchor point as the center;
the dense area proposal network first uses a convolution layer with kernel size 3 × 3 and 512 channels, followed by the linear rectification function ReLU as the convolution layer activation function, and then a convolution layer with kernel size 1 × 1, whose channel number is S × R, the product of the number of region shapes S and the number of region magnification ratios R.
Thirdly, training a pest dense area detection network: and training the pest dense region detection network by using the pest image data set. The dense region suggestion network in the pest dense region detection network is used as a basis for network training according to the target density score in the marquee. In other prior art, the initial target detection result of the whole image is obtained mainly by initial detection, and then the dense area in the image is selected by methods such as clustering or thermodynamic diagram according to the result. Compared with other prior art, the method has the advantages that the judgment on the density degree of the area is more direct and accurate, the density score is considered to select the target quantity and the area size in the area, and the calculation burden is less. The technical difficulty is that the target density score contains complex information, and a large amount of density region information is needed to be used as a training sample in order to obtain an accurate density score prediction result. The method comprises the following specific steps:
(1) inputting a pest image data set with artificial labels into an overall feature extraction network, extracting an image basic feature map through a backbone network, and outputting an overall feature map after multi-layer semantics are mutually fused through a feature fusion network by the basic feature map;
(2) the overall feature map is input into the dense area proposal network. An anchor point A slides over the feature map with a single sliding step k; centred on the anchor point there are S shapes of region marquees, each with R magnification ratios. When the anchor point slides to the i-th position, the number of artificial marks contained in the marquee of shape s at magnification ratio r is the target number n_i^{s,r}, and the area of the current region marquee is a_i^{s,r}. The target density score in the current marquee is represented using the following formula:
d_i^{s,r} = ln( n_i^{s,r} / a_i^{s,r} ) + O
wherein O is a deviation compensation coefficient ensuring that the target density score is a positive number; O = 10 is taken in this application, and the maximum value of the target density score is set to d_max = 4 and the minimum value to d_min = 1;
The target density score d_i^{s,r} of the current marquee is set as the real density score, and the score that the network outputs through the convolutional layers from the overall feature map, d̂_i^{s,r}, is set as the predicted target density score of the current marquee. The loss function generated by the current image for back-propagation training of the dense area detection network is represented using the following formula:
L = (1/I) Σ_i Σ_{s,r} L_i^{s,r}
wherein I is the number of anchor point positions in the image, and the loss function for each marquee is calculated with the smooth L1 norm: L_i^{s,r} = SmoothL1( d̂_i^{s,r} − d_i^{s,r} ).
Finally, for each image the trained pest dense area detection network outputs a series of candidate regions with corresponding predicted density scores; the candidate regions with high density scores are the dense regions.
Fourthly, standardizing pest dense areas: performing standardized operation on a local pest region output by the pest dense region detection network; the standardized operation of the pest dense region comprises pest dense region merging operation and pest dense region segmentation operation, wherein pest local regions output by a pest dense region detection network are input, and standardized image local regions which are grouped according to density scores and are similar in size are output.
The pest dense areas are standardized, and the highly overlapped areas are combined, so that the subsequent target detection calculation burden is reduced; meanwhile, the overlarge regions are grouped according to the density and segmented, the standardized regions with similar density and size are finally obtained, and the problems of difficult training and insufficient precision of a subsequent detection network caused by uneven density and large size span are solved;
The design difficulty of this step lies in effectively combining the density scores predicted by the dense area network, using a reasonable score grouping basis, and setting appropriately sized merging and segmentation thresholds; these parameters must be confirmed through a large number of experiments to achieve the best effect. The specific steps are as follows:
(1) merging the candidate regions with similar density scores and highly overlapped regions, inputting a dense region set and a corresponding density score output by a dense region detection network, and outputting the merged dense region set and the corresponding density score; the dense area merging step is as follows:
A1) The density scores are divided into 5 groups by score: a score of 1–2 is not dense, 2–2.5 is sparse, 2.5–3 is moderately dense, 3–3.5 is generally dense, and 3.5–4 is extremely dense;
the artificially marked agricultural tiny pest image is input into the trained pest dense area detection network to obtain the local pest regions, wherein each dense region is represented by its top-left and bottom-right coordinates;
A2) An overlap calculation formula is set to calculate the overlap OL(a, b) of two dense areas a and b, and a synthesis threshold N_t is set to judge from the overlap whether the two dense areas a and b need to be synthesized: if OL(a, b) is greater than the threshold N_t, the dense areas a and b are merged;
A3) A merge operation is set, whose input is the set of regions to be merged together with their corresponding density scores, and whose output {b′, d′} is the new merged region and its density score: the smallest top-left coordinates and the largest bottom-right coordinates among the input regions are taken as the top-left and bottom-right coordinates of the synthesized region b′, and the density score of the synthesized region is the minimum of the input density scores, recorded as d′;
A4) The region b_k with the largest corresponding density score d_k is taken from the set of dense regions, and its overlap OL(b_k, b_i) with every other remaining region b_i is calculated. If the overlap is greater than the synthesis threshold N_t and the corresponding density scores d_k and d_i belong to the same density group, b_i and d_i are moved into the merge candidate set. If the candidate set is not empty after the traversal finishes, the currently selected b_k and d_k are also put into the candidate set, the merge operation is performed, and its output b′, d′ is put back into the sets of dense regions and density scores; otherwise b_k and d_k are put into the output set. These operations are repeated until all regions have been taken out;
(2) Segmentation is performed on the oversized regions among the merged dense regions; the input is the merged dense region set and corresponding density scores output by the merging operation, and the output is the segmented dense region set and corresponding density scores. The segmentation steps are as follows:
B1) A segmentation threshold L_t is set for judging whether the current dense region needs to be segmented;
B2) A region b_i is taken from the merged set; if the region does not need to be segmented, it is put into the output set together with its corresponding density score. Otherwise a bisection operation is performed, an overlapping band of L_t/4 is kept at the bisection boundary, the segmented sub-regions keep the original density score unchanged, and they are put into the output set. These operations are repeated until all regions have been taken out;
Fifthly, constructing and training a local area pest target detection network group: the input is the image local areas, grouped according to density score, obtained through pest dense area standardization, and the output is the pest identification and positioning results within the image local areas of each density group.
In the prior art, a single group of local area detection networks is generally used for target identification and positioning in all local areas, so that one set of detection networks must handle a wide range of density characteristics and the detection precision suffers; the local area pest target detection network group instead groups local areas of various densities according to their target densities, so that each group of detection networks identifies and positions targets only in areas of similar density, reducing the precision loss caused by the density span; the technical difficulty lies in combining this grouping effectively with the standardization operation, since the optimal number of groups and the density-grouping boundaries must be set manually and summarized through a large number of experiments.
The construction and training of the local area pest target detection network group comprises the following steps:
(1) constructing a plurality of groups of parallel local area pest target detection networks to form a local area pest target detection network group; the input of each group of local area pest target detection network is a local area which is obtained by standardizing a pest dense area and is grouped by corresponding density, and the output is an identification and positioning result of the pest target in the grouped local area; the construction of each group of local area pest target detection network comprises the following steps:
C1) setting a pest feature extraction network; the pest feature extraction network is used for extracting an input pest feature map of a local region, inputting a dense region obtained by standardizing a pest region, and outputting the pest feature map based on the dense region;
C2) setting a pest identification and positioning network; the pest identification and positioning network is used for automatically learning pest characteristic diagrams and identifying and positioning pests, inputting the obtained pest characteristic diagrams of the dense area, and outputting pest type identification and positioning results of the dense area;
(2) training a plurality of groups of parallel local area pest target detection networks by using the obtained standardized pest dense areas; the training comprises the following steps:
D1) dividing the input regions into four groups, sparse, moderately dense, generally dense, and extremely dense, according to their density scores, and grouping the normalized dense regions according to the corresponding density scores;
D2) and inputting each grouped local dense region into the corresponding grouped local region pest target detection network for training to obtain the trained local region pest target detection network.
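The grouping of step D1) can be sketched as follows; the cut points are taken from the five-way split given for the merge operation (with the non-dense band omitted, on the assumption that non-dense regions produce no crops), and the dictionary keys are illustrative names:

```python
def group_by_density(regions, scores):
    """Route each normalized dense region to one of four detector groups
    (sparse / moderately dense / generally dense / extremely dense) by its
    density score, using the score boundaries from the merging step."""
    bounds = [(2.0, 2.5, "sparse"), (2.5, 3.0, "moderately_dense"),
              (3.0, 3.5, "generally_dense"), (3.5, 4.01, "extremely_dense")]
    groups = {name: [] for _, _, name in bounds}
    for region, d in zip(regions, scores):
        for lo, hi, name in bounds:
            if lo <= d < hi:
                groups[name].append(region)   # fed to that group's detector
                break
    return groups
```

Each bucket is then passed to its own detection network for training, so every network only ever sees regions with a narrow density range.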
And sixthly, constructing and training a global pest target detection network. The global pest target detection network is used for detecting sparse pest targets in the whole picture, serves as a supplement to the detection results of the local dense areas, and finally yields complete recognition and positioning results for all pests in the picture; it is constructed and trained according to the prior art.
Firstly, constructing a global pest target detection network, including an overall feature extraction network and a pest target identification and positioning network;
secondly, setting an overall feature extraction network for extracting a feature map from the whole input picture; its input is a tiny pest picture, and its output is the overall feature map obtained based on the whole pest picture;
thirdly, setting a pest target identification and positioning network for automatically learning the overall characteristic diagram and detecting pest targets, inputting the overall characteristic diagram, and outputting a pest identification result and a positioning result;
and finally, training the global pest target detection network.
Seventhly, fusing pest detection results: and fusing the pest identification and positioning results output by the global pest target detection network and the local region pest target detection network group to obtain a global pest identification and positioning result.
Eighthly, acquiring an image of the pest to be detected: and acquiring a micro pest image to be detected.
And ninthly, obtaining a pest image detection result.
(1) Inputting an agricultural tiny pest image to be detected into a trained pest dense area detection network to obtain a pest local area;
(2) standardizing pest dense areas in pest local areas to generate standardized pest dense areas;
(3) dividing pest dense areas into groups according to corresponding density scores, and respectively inputting corresponding target detection networks in the trained local area pest target detection network groups to generate local area pest target detection results of the corresponding density groups;
(4) inputting an agricultural tiny pest image to be detected into the trained global pest target detection network to obtain a global pest target detection result;
(5) fusing the pest target detection results: the global pest target detection result and the local area pest target detection results are fused to obtain the final target detection result.
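One common realization of the fusion step, sketched below under the assumption that duplicates are removed with standard non-maximum suppression (the patent does not fix the exact suppression rule), maps each local detection back to full-image coordinates using its region's upper-left corner and then filters overlapping boxes:

```python
def fuse_detections(global_dets, local_dets, regions, iou_thr=0.5):
    """Fuse global and per-region detections into one full-image result.

    global_dets: [(x1, y1, x2, y2, score)] in full-image coordinates.
    local_dets:  one list per region of (x1, y1, x2, y2, score) tuples in
                 region-local coordinates; regions: [(rx1, ry1, rx2, ry2)].
    Hypothetical sketch; names and iou_thr are assumptions."""
    def iou(a, b):
        ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
        ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
        ua = (a[2]-a[0])*(a[3]-a[1]) + (b[2]-b[0])*(b[3]-b[1]) - inter
        return inter / ua if ua else 0.0

    # Shift local boxes into full-image coordinates and pool everything.
    pooled = list(global_dets)
    for (rx1, ry1, _, _), dets in zip(regions, local_dets):
        pooled += [(x1+rx1, y1+ry1, x2+rx1, y2+ry1, s) for x1, y1, x2, y2, s in dets]

    # Greedy non-maximum suppression: keep the highest-confidence box first.
    pooled.sort(key=lambda d: d[4], reverse=True)
    kept = []
    for det in pooled:
        if all(iou(det, k) <= iou_thr for k in kept):
            kept.append(det)
    return kept
```

This removes the double detections introduced by the overlap bands of the segmentation step while keeping sparse targets found only by the global network.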
As shown in figs. 2 to 5, which illustrate the recognition results of the method of the present invention on tiny pest pictures, the method can handle both the detection of tiny pests in dense regions and the detection of sparse pests; compared with the prior art, it achieves fewer missed detections and higher precision in dense regions.
TABLE 1 comparison table of the detection results on the micro-pest data set
Method | AP | AP50 | AP75 |
---|---|---|---|
FCOS | 22.0 | 61.9 | 8.7 |
RetinaNet | 17.5 | 51.3 | 6.5 |
FasterRCNN | 23.6 | 63.2 | 10.8 |
DMNet | 24.5 | 64.6 | 12.0 |
The method of the invention | 30.5 | 71.8 | 16.3 |
As shown in Table 1, evaluated with the detection precision metrics AP, AP50 and AP75 that are well known in the industry, the detection precision of the method of the present invention on the tiny-pest data set is superior to that of the prior-art methods.
The foregoing shows and describes the general principles, essential features, and advantages of the invention. It will be understood by those skilled in the art that the present invention is not limited to the embodiments described above, which are merely illustrative of its principles, and that various changes and modifications may be made without departing from the spirit and scope of the invention; the scope of the invention is defined by the appended claims and equivalents thereof.
Claims (5)
1. A tiny pest image recognition method based on local dense region density feature detection is characterized by comprising the following steps:
11) acquisition of training images: acquiring a pest image data set with an artificial mark;
12) constructing a pest dense area detection network: constructing a pest dense area detection network, wherein the pest dense area detection network comprises an overall feature extraction network and a dense area suggestion network;
the construction of the pest dense area detection network comprises the following steps:
121) setting an overall feature extraction network: the overall feature extraction network comprises a backbone network and a feature fusion network; the backbone network consists of a plurality of layers of convolutional neural network layers, a pooling layer and an activation function layer which are superposed and is used for extracting basic features in the picture and outputting a plurality of layers of feature maps; the feature fusion network fuses the feature maps of all layers by laterally connecting the multi-layer feature maps output by the backbone network, and outputs an overall feature map considering different levels of semantic features; wherein the backbone network is a ResNet50 network, and the feature fusion network is an FPN feature pyramid network;
122) setting a dense area suggestion network: setting the input of the dense area suggestion network as an overall characteristic graph output by an overall characteristic extraction network, and outputting the overall characteristic graph as a density score of a selected area which takes each anchor point as the center;
the dense area proposal network firstly uses a convolution layer with convolution kernel size of 3 multiplied by 3 and provided with 512 channels, then uses a linear rectification function ReLu as a convolution layer activation function, and then uses a convolution layer with convolution kernel size of 1 multiplied by 1, and the channel number S multiplied by R of the convolution layer is determined by the product of the area shape number S and the area amplification ratio number R;
13) training of a pest dense area detection network: training a pest dense area detection network by using a pest image data set;
14) pest dense area standardization: performing standardized operation on a local pest region output by the pest dense region detection network; the standardized operation of the pest dense region comprises pest dense region merging operation and pest dense region segmentation operation, pest local regions output by a pest dense region detection network are input, and standardized image local regions which are grouped according to density scores and are similar in size are output;
15) constructing and training a local area pest target detection network group: constructing and training a local area pest target detection network group; inputting image local areas which are obtained by standardizing pest dense areas and grouped according to density scores, and outputting pest identification and positioning results in the image local areas which are grouped according to the density scores;
16) constructing and training a global pest target detection network;
the construction and training of the global pest target detection network comprises the following steps:
161) constructing a global pest target detection network comprising an overall feature extraction network and a pest target identification and positioning network;
162) setting an overall feature extraction network for extracting a feature map from the whole input picture; its input is a tiny pest picture, and its output is the overall feature map obtained based on the whole pest picture;
163) setting a pest target identification and positioning network for automatically learning an integral characteristic diagram and detecting pest targets, inputting the integral characteristic diagram, and outputting a pest identification result and a positioning result;
164) training a global pest target detection network;
17) and (3) fusing pest detection results: fusing the pest identification and positioning results output by the global pest target detection network and the local region pest target detection network group to obtain a global pest identification and positioning result;
18) obtaining an image of the pest to be detected: acquiring a tiny pest image to be detected;
19) and obtaining a pest image detection result.
2. The method for identifying the tiny pests based on the local dense region density feature detection as claimed in claim 1, wherein the training of the pest dense region detection network comprises the following steps:
21) inputting a pest image data set with artificial labels into an overall feature extraction network, extracting an image basic feature map through a backbone network, and outputting an overall feature map after multilayer semantics are mutually fused through a feature fusion network by the basic feature map;
22) the global feature map is input into the dense area proposal network,
setting an anchor point A that slides on the feature map with a single sliding step length k; taking the anchor point as the center, there are S shapes of region selection boxes, and each selection box has R magnification ratios; when the anchor point slides to the i-th position, the number of artificial marks contained in the current s-th selection box under the r-th magnification ratio is the target number n_i^{s,r}, and the area of the current region selection box is a_i^{s,r}; the target density score in the current selection box is represented using the following formula:

d_i^{s,r} = log10(n_i^{s,r} / a_i^{s,r}) + O
wherein O is a deviation compensation coefficient ensuring that the target density score is a positive number; in application O = 10, and the target density score is limited to a maximum value d_max = 4 and a minimum value d_min = 1;
setting the target density score d_i^{s,r} of the current selection box as the true density score, and the score d̂_i^{s,r} output by the network through the convolutional layers from the overall feature map as the predicted target density score of the current selection box; the loss function generated by the current image for the back-propagation training of the dense region detection network is expressed using the following formula:

L = (1/I) Σ_i Σ_s Σ_r SmoothL1(d_i^{s,r} − d̂_i^{s,r})

wherein I is the number of anchor point positions in the image, and the loss for each selection box is calculated from the smooth L1 norm SmoothL1:

SmoothL1(x) = 0.5·x², if |x| < 1; |x| − 0.5, otherwise;
3. The method for identifying tiny pest images based on local dense region density feature detection according to claim 1, wherein the pest dense region standardization comprises the following steps:
31) merging the candidate regions with similar density scores and highly overlapped regions, inputting a dense region set and a corresponding density score output by a dense region detection network, and outputting the merged dense region set and the corresponding density score; the dense region merging step is as follows:
311) the density scores are divided into 5 groups according to the score, as follows:
a score of 1-2 is non-dense, 2-2.5 is sparse, 2.5-3 is moderately dense, 3-3.5 is generally dense, and 3.5-4 is extremely dense;
inputting the agricultural tiny pest image with artificial marks into the trained pest dense region detection network to obtain the local pest region set B and the corresponding density score set D, wherein each dense region b_i in B is represented using upper-left and lower-right coordinates (x_i1, y_i1, x_i2, y_i2);
312) setting an overlap degree calculation formula to calculate the overlap degree OL(a, b) of the two dense regions a and b, with the calculation formula as follows:

OL(a, b) = area(a ∩ b) / area(a ∪ b)
and setting a synthesis threshold N t Judging whether the overlapping degree of the two dense areas a and b needs to be synthesized; if OL (a, b) is greater than threshold N t Then merging the dense areas a and b;
313) setting a merge operation M: its input is the set of regions to be merged B_m and the corresponding set of density scores D_m; its output {b', d'} is the newly merged region and the corresponding density score; the smallest x1, y1 and the largest x2, y2 among the regions in B_m are taken as the upper-left and lower-right coordinates of the synthesized region b'; the density score corresponding to the synthesized region is the minimum value in D_m, denoted d';
314) from the dense region set B, take out the region b_k whose corresponding density score d_k is the maximum, and traverse all other regions b_i in B, performing the overlap calculation OL(b_k, b_i); if the overlap is greater than the composition threshold N_t and the corresponding density scores d_k and d_i belong to the same density group, then b_i and d_i are taken out and put into the merge candidate sets B_m and D_m; if the candidate sets are not empty after the traversal is finished, the currently selected region b_k and score d_k are also put into the candidate sets for the merging operation, and the outputs b', d' are put back into B and D; otherwise, b_k and d_k are put into the output sets B_out and D_out; the above operations are repeated until all regions in B have been taken out;
315) the output sets B_out and D_out then contain the merged dense regions and the corresponding density scores;
32) performing a segmentation operation on oversized regions among the merged dense regions; the input is the merged dense region set and corresponding density scores output by the merging operation, and the output is the segmented dense region set and corresponding density scores; the segmentation operation steps are as follows:
321) setting a slicing threshold L_t for judging whether the current dense region a needs to be segmented;
322) take a region b_i out of the merged set B_out; if the region does not need to be segmented, put it into the output set together with its corresponding density score; otherwise, perform a halving operation, reserving an overlapping band of L_t/4 at the halving boundary, keep the density score of each segmented region equal to the original density score, and put the segmented regions into the output set; the above operations are repeated until all regions in B_out have been taken out;
4. The method for identifying the tiny pests based on the local dense region density feature detection as claimed in claim 1, wherein the construction and training of the local region pest target detection network group comprises the following steps:
41) constructing a plurality of groups of parallel local area pest target detection networks to form a local area pest target detection network group; the input of each group of local area pest target detection network is a local area which is obtained by standardizing a pest dense area and is grouped in a corresponding density mode, and the output is an identification and positioning result of the pest target in the grouped local area; the construction of each group of local area pest target detection network comprises the following steps:
411) setting a pest feature extraction network; the pest feature extraction network is used for extracting an input pest feature map of a local region, inputting a dense region obtained by standardizing a pest region, and outputting the pest feature map based on the dense region;
412) setting a pest identification and positioning network; the pest identification and positioning network is used for automatically learning pest characteristic diagrams and identifying and positioning pests, inputting the obtained pest characteristic diagrams of the dense area, and outputting pest type identification and positioning results of the dense area;
42) training a plurality of groups of parallel local area pest target detection networks by using the obtained standardized pest dense areas; the training comprises the following steps:
421) selecting four groups of sparse, medium dense, general dense and extreme dense according to the density score of the input region, and grouping the normalized dense regions according to the corresponding dense scores;
422) and inputting each grouped local dense region into the corresponding grouped local region pest target detection network for training to obtain the trained local region pest target detection network.
5. The method for identifying the tiny pests based on the local dense region density feature detection as claimed in claim 1, wherein the obtaining of the pest image detection result comprises the following steps:
51) inputting an agricultural tiny pest image to be detected into a trained pest dense area detection network to obtain a pest local area;
52) standardizing pest dense areas in pest local areas to generate standardized pest dense areas;
53) dividing pest dense areas into groups according to corresponding density scores, and respectively inputting corresponding target detection networks in the trained local area pest target detection network groups to generate local area pest target detection results of the corresponding density groups;
54) inputting an agricultural tiny pest image to be detected into the trained global pest target detection network to obtain a global pest target detection result;
55) fusing the pest target detection results: the global pest target detection result and the local area pest target detection results are fused to obtain the final target detection result.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110440782.0A CN113159183B (en) | 2021-04-23 | 2021-04-23 | Tiny pest image identification method based on local dense area density feature detection |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113159183A CN113159183A (en) | 2021-07-23 |
CN113159183B true CN113159183B (en) | 2022-08-30 |
Family
ID=76870091
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113159183B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111178121A (en) * | 2018-12-25 | 2020-05-19 | 中国科学院合肥物质科学研究院 | Pest image positioning and identifying method based on spatial feature and depth feature enhancement technology |
WO2020102988A1 (en) * | 2018-11-20 | 2020-05-28 | 西安电子科技大学 | Feature fusion and dense connection based infrared plane target detection method |
JP2020091543A (en) * | 2018-12-03 | 2020-06-11 | キヤノン株式会社 | Learning device, processing device, neural network, learning method, and program |
CN111460315A (en) * | 2020-03-10 | 2020-07-28 | 平安科技(深圳)有限公司 | Social portrait construction method, device and equipment and storage medium |
CN111476238A (en) * | 2020-04-29 | 2020-07-31 | 中国科学院合肥物质科学研究院 | Pest image detection method based on regional scale perception technology |
CN111476317A (en) * | 2020-04-29 | 2020-07-31 | 中国科学院合肥物质科学研究院 | Plant protection image non-dense pest detection method based on reinforcement learning technology |
CN112488244A (en) * | 2020-12-22 | 2021-03-12 | 中国科学院合肥物质科学研究院 | Method for automatically counting densely distributed small target pests in point labeling mode by utilizing thermodynamic diagram |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110096933B (en) * | 2018-01-30 | 2023-07-18 | 华为技术有限公司 | Target detection method, device and system |
Non-Patent Citations (6)
Title |
---|
A coarse-to-fine network for aphid recognition and detection in the field;LI R 等;《BIOSYSTEMS ENGINEERING》;20191130;全文 * |
Deep Learning based Automatic Approach using Hybrid Global and Local Activated Features towards Large-scale Multi-class Pest Monitoring;Liu, Liu 等;《2019 IEEE 17th International Conference on Industrial Informatics (INDIN)》;20191231;全文 * |
Density map guided object detection in aerial images;C. Li 等;《Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (2020)》;20201231;全文 * |
基于深度学习的农作物害虫检测方法研究与应用;刘浏;《知网博士电子期刊》;20210115;全文 * |
基于语义信息跨层特征融合的细粒度鸟类识别;李国瑞等;《计算机应用与软件》;20200412(第04期);全文 * |
多尺度非局部注意力网络的小目标检测算法;梁延禹等;《计算机科学与探索》;20191225(第10期);全文 * |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||