CN116797824A - Automatic egg activity detection method and system based on vision system - Google Patents
Automatic egg activity detection method and system based on vision system
- Publication number
- CN116797824A CN116797824A CN202310665385.2A CN202310665385A CN116797824A CN 116797824 A CN116797824 A CN 116797824A CN 202310665385 A CN202310665385 A CN 202310665385A CN 116797824 A CN116797824 A CN 116797824A
- Authority
- CN
- China
- Prior art keywords
- image
- egg
- texture
- textures
- activity
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000001514 detection method Methods 0.000 title claims abstract description 43
- 230000000694 effects Effects 0.000 title claims abstract description 43
- 235000013601 eggs Nutrition 0.000 claims abstract description 136
- 238000000034 method Methods 0.000 claims abstract description 27
- 238000007781 pre-processing Methods 0.000 claims abstract description 16
- 238000001914 filtration Methods 0.000 claims description 19
- 238000012216 screening Methods 0.000 claims description 9
- 230000000007 visual effect Effects 0.000 claims description 9
- 230000005540 biological transmission Effects 0.000 claims description 7
- 238000003860 storage Methods 0.000 claims description 7
- 238000000605 extraction Methods 0.000 claims description 6
- 238000007619 statistical method Methods 0.000 claims description 6
- 238000004590 computer program Methods 0.000 claims description 5
- 238000009499 grossing Methods 0.000 claims description 3
- 238000010606 normalization Methods 0.000 claims description 3
- 238000012545 processing Methods 0.000 claims description 3
- 239000008280 blood Substances 0.000 abstract description 10
- 210000004369 blood Anatomy 0.000 abstract description 10
- 238000005516 engineering process Methods 0.000 abstract description 5
- 238000004519 manufacturing process Methods 0.000 abstract description 5
- 241000287828 Gallus gallus Species 0.000 description 9
- 238000002360 preparation method Methods 0.000 description 5
- 229960005486 vaccine Drugs 0.000 description 4
- 238000010586 diagram Methods 0.000 description 3
- 241000700605 Viruses Species 0.000 description 2
- 235000013305 food Nutrition 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 230000011218 segmentation Effects 0.000 description 2
- 208000019914 Mental Fatigue Diseases 0.000 description 1
- 230000009286 beneficial effect Effects 0.000 description 1
- 229940028617 conventional vaccine Drugs 0.000 description 1
- 238000012937 correction Methods 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 230000009977 dual effect Effects 0.000 description 1
- 238000003708 edge detection Methods 0.000 description 1
- 238000002474 experimental method Methods 0.000 description 1
- 230000012447 hatching Effects 0.000 description 1
- 239000000463 material Substances 0.000 description 1
- 239000011159 matrix material Substances 0.000 description 1
- 230000006855 networking Effects 0.000 description 1
- 235000015097 nutrients Nutrition 0.000 description 1
- 239000013307 optical fiber Substances 0.000 description 1
- 239000000126 substance Substances 0.000 description 1
- 239000000758 substrate Substances 0.000 description 1
- 230000001360 synchronised effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/54—Extraction of image or video features relating to texture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
- G06V10/765—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects using rules for classification or partitioning the feature space
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Biochemistry (AREA)
- Databases & Information Systems (AREA)
- Pathology (AREA)
- Analytical Chemistry (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computing Systems (AREA)
- Immunology (AREA)
- Evolutionary Computation (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- Chemical & Material Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Image Analysis (AREA)
Abstract
The application discloses an automatic egg activity detection method and system based on a vision system, wherein the method comprises the following steps: S1, acquiring an egg image; S2, preprocessing the egg image to obtain a preprocessed image; S3, extracting the internal characteristic information of the egg from the preprocessed image to obtain an input image; S4, searching for textures in the input image through a line segment recognition algorithm to finally obtain a binarized image layer containing texture position information; S5, counting the number of textures in the binarized image layer and judging the activity quality of the egg according to the number of textures. The method identifies the blood vessel textures in the egg image through image recognition technology and judges the activity quality of the detected egg by counting the number of blood vessel textures, thereby solving the problem of automatic identification and detection of egg activity quality, reducing repetitive mechanical labor, and improving detection accuracy and production efficiency.
Description
Technical Field
The application relates to the field of egg activity detection, in particular to an automatic egg activity detection method and system based on a visual system.
Background
Eggs are an indispensable food source in people's daily lives, and their activity is directly related to food safety. Eggs are also a common vaccine preparation material: in conventional vaccine preparation methods, fertilized live eggs are often used as the production substrate for virus cultivation, so the activity of eggs plays a vital role in vaccine preparation. However, egg quality is affected by many factors. Differences in egg type and in the content of nutrient substances inside the egg influence its activity; the age of the egg is directly related to its activity; and storage conditions affect the cell state inside the egg and thus its activity. In practical applications, many complex processes are involved from the cultivation of eggs to the preparation of virus vaccines, so detecting and classifying egg activity in advance is of great significance for improving vaccine preparation efficiency or egg hatching efficiency.
Most current egg activity detection is manual: an inspector shines strong light through the egg (candling) to check for blood vessel textures inside it. Manual candling is affected by subjective judgment, its detection efficiency is low, and the work is highly repetitive; inspectors performing repetitive mechanical operations for long periods easily develop physical and mental fatigue, which leads to missed detections and false detections. An automatic egg activity detection method based on a vision system can therefore reduce repetitive mechanical labor and improve detection accuracy and production efficiency.
Chinese patent application No. 202111436545.3, an intelligent identification and detection method based on individual hen egg quality, adopts the following technical scheme: obtaining photos of chicken cages and of the eggs in them, where each cage carries a cage number; determining the position of the cage number in the photo; separating out the region where the cage number is located; segmenting and recognizing the cage-number characters in that region; and identifying the eggs in each cage by edge detection, so that the color and quality of the eggs under the corresponding cage can be obtained. That scheme judges egg quality by identifying whether the surface of the egg under test is cracked or damaged. The technical scheme provided by the present application, by contrast, identifies the blood vessel textures in the egg image through image recognition technology and judges the activity quality of the detected egg by counting the number of blood vessel textures.
Disclosure of Invention
Aiming at the problems of low manual detection efficiency, low accuracy and high work repeatability in the prior art, the application provides an automatic egg activity detection method and system based on a vision system, and the technical scheme adopted by the application is as follows:
the application provides an automatic egg activity detection method based on a visual system, which comprises the following steps of:
s1, acquiring an egg image;
s2, preprocessing the egg image to obtain a preprocessed image;
s3, extracting the internal characteristic information of the egg of the preprocessed image to obtain an input image;
s4, searching textures in the input image through a line segment recognition algorithm, and finally obtaining a binarization image layer containing texture position information;
s5, counting the number of textures in the binarization image layer, and judging the activity quality of the eggs according to the number of textures.
Compared with the prior art, the application provides an automatic egg activity detection method based on a visual system, which identifies the blood vessel textures in the egg image through image recognition technology and judges the activity quality of the detected egg by counting the number of blood vessel textures, thereby solving the problems of low manual detection efficiency, low accuracy and high work repetitiveness in the prior art.
As a preferable mode, in the step S2, the preprocessing specifically includes the following steps:
s21, extracting the ROI area of the egg image through a preset pre-selection frame, and carrying out normalization processing on the egg image to obtain a first egg image;
s22, smoothing the first egg image by adopting a median filtering algorithm to obtain a second egg image;
s23, filtering out the interference pixels of the background part of the second egg image by using a Hough circle detection algorithm.
As a preferable scheme, the specific process of step S23 is as follows: first, the edge of the second egg image is detected with the Hough circle detection algorithm using a gradient method; then, the local gradient at each non-zero point of the image is considered, the image gradient being calculated with a 2×2 mask from the pixels around the i-th pixel point: g_x = [i(x+1,y) + i(x+1,y+1) − i(x,y) − i(x,y+1)]/2 and g_y = [i(x,y+1) + i(x+1,y+1) − i(x,y) − i(x+1,y)]/2; the level-line angle of the pixel is θ = arctan(g_x/(−g_y)) and the gradient magnitude is G = √(g_x² + g_y²); next, the circumference line is determined according to the analytical equation of the circle (x−a)² + (y−b)² = r²; the radius is determined according to the degree of support of the non-zero pixels on the edges of all candidate centers; finally, the maximum connected domain of the Hough circle is found and combined with the image, thereby filtering out the light interference around the egg caused by the illumination.
In the step S3, the characteristic information inside the egg is extracted specifically by gray threshold segmentation and area screening of the region of interest.
As a preferable solution, in the step S4, the method specifically further includes the following steps:
s41, searching textures in the input image through a line segment recognition algorithm to obtain an image containing suspected textures;
s42, screening the suspected textures in the image with a dual-threshold method, and splicing adjacent textures with high similarity into the same texture by comparing the similarity in sharpness and inclination of the screened suspected textures;
s43, if a texture spliced in step S42 is larger than a set threshold, outputting it as an identified texture, and finally obtaining a binarized image layer containing texture position information.
As a preferable solution, in the step S41, the line segment recognition algorithm specifically includes the following steps:
the input image from step S3 is taken as the algorithm input; Gaussian filtering is applied to the pixels of the input image of a single egg, and a region-growing operation is performed on the Gaussian-filtered image: starting from seed pixels, the region-growing algorithm merges pixels with similar directions into the region, and the region angle is updated each time a new region pixel is added, so that a directional line segment region is obtained; the pixels of the line segment region are then computed with a rectangular approximation to obtain a regular linear region; and the linear region is taken as the suspected texture.
As a preferable solution, in the step S42, the implementation procedure of the dual threshold method is as follows:
setting a width threshold and a contrast threshold, filtering out blurred linear regions according to the contrast threshold, and screening out the clear linear regions that satisfy the width condition using the width threshold.
As a preferred embodiment, the method further comprises a step S0 performed before the step S1; the step S0 includes:
s01, numbering and camera calibration are carried out on the image acquisition module;
s02, marking the batch of eggs and different eggs in the same batch;
s03, constructing a data transmission link.
The application also provides an egg activity automatic detection system based on a vision system, which comprises an image acquisition unit, an image preprocessing unit, an image feature extraction unit, a texture recognition unit and a texture number statistical analysis unit which are connected in sequence;
the image acquisition unit is used for acquiring an egg image;
the image preprocessing unit is used for preprocessing the egg image to obtain a preprocessed image;
the image feature extraction unit is used for extracting the internal feature information of the egg of the preprocessed image to obtain an input image;
the texture recognition unit is used for retrieving textures in the input image through a line segment recognition algorithm to finally obtain a binarization image layer containing texture position information;
the texture number statistical analysis unit is used for counting the texture number in the binarization image layer and judging the activity quality of the egg according to the texture number.
The second aspect of the application also provides a computer device comprising a storage medium, a processor and a computer program stored in the storage medium and executable by the processor, which computer program, when being executed by the processor, implements the steps of a vision system-based egg activity automatic detection method as described above.
The beneficial effects of the application are as follows:
the application collects the egg image in real time through the image collecting unit, recognizes the blood silk texture in the egg image through the image recognition technology, and judges the activity quality of the detected egg through counting the blood silk texture number.
Drawings
FIG. 1 is a flow chart of an automatic egg activity detection method based on a vision system provided by an embodiment of the application;
fig. 2 is a flowchart of Hough circle detection provided in an embodiment of the present application;
FIG. 3 is a flowchart of a method for identifying suspected textures according to an embodiment of the present application;
FIG. 4 is a flow chart of a texture calibration method according to an embodiment of the present application;
FIG. 5 is a schematic diagram showing the results of detecting the activity quality of eggs according to an embodiment of the present application;
fig. 6 is a block diagram of unit connection of an automatic egg activity detection system based on a vision system according to an embodiment of the present application.
Detailed Description
The drawings are for illustrative purposes only and are not to be construed as limiting the present patent;
it should be understood that the described embodiments are merely some, but not all embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the application, are intended to be within the scope of the embodiments of the present application.
The terminology used in the embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of embodiments of the application. As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the application as detailed in the accompanying claims. In the description of the present application, it should be understood that the terms "first," "second," "third," and the like are used merely to distinguish between similar objects and are not necessarily used to describe a particular order or sequence, nor should they be construed to indicate or imply relative importance. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art according to the specific circumstances.
Furthermore, in the description of the present application, unless otherwise indicated, "a plurality" means two or more. "and/or", describes an association relationship of an association object, and indicates that there may be three relationships, for example, a and/or B, and may indicate: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship. The application is further illustrated in the following figures and examples.
Example 1
Referring to fig. 1 to 5, an automatic egg activity detection method based on a vision system includes the following steps:
s1, acquiring an egg image;
in a specific embodiment, step S0 is further included, which is performed before step S1; the step S0 includes:
s01, numbering and camera calibration are carried out on the image acquisition module;
specifically, calibrating the camera, printing a checkerboard calibration plate, calibrating the experiment to obtain nine groups, wherein the camera shooting distance is between 0.2m and 0.8m, each group shoots about 20 photos at different positions and angles, black and white characteristic points in the image are detected, internal parameters and external parameters of the camera under ideal distortion-free conditions are solved through the characteristic points, and the actual radial distortion is obtained by applying least square:and tangential distortion: />Wherein r is 2 =x 2 +y 2 ,k 1 ,k 2 ,k 3 For camera radial distortion parameters, p 1 ,p 2 Is a camera tangential distortion parameter. According to the actual detection environment, the basic parameters of the camera are set to be 20.16fps for acquisition frame rate, 30 ms for exposure time, 0.40 for gamma correction, 1726 for balance ratio and 10 for image gain.
S02, marking the batch of eggs and different eggs in the same batch;
specifically, batches of eggs are labeled a, a=1, 2,3 … …, z, z being a positive integer, and different eggs of the same batch are labeled egg, egg=1, 2, … …, x, x being a positive integer. Aiming at the quality of eggs in the same batch collected by a plurality of groups of cameras, two rows of eggs are photographed by each group of cameras, pixel noise caused by uneven light is avoided as much as possible while the definition of the texture of the blood threads of the eggs is guaranteed, the texture characteristics of the two sides of the same egg need to be detected as the basis for judgment, and the problems of missed judgment, misjudgment or misjudgment are avoided.
S03, constructing a data transmission link;
specifically, the image acquisition module inputs images and current parameter information of the camera through multiple threads, and is connected with the PC end through the switch to perform data transmission. The image synchronous transmission of a plurality of cameras requires the packet switching technology of a switch, the switch is physically connected with each networking node through an optical fiber network cable, a routing protocol is used by the switch, a data transmission path is determined according to an IP address and a subnet mask, and the data transmission efficiency of the multi-camera is improved.
Specifically, in step S1, images of the same egg are acquired from two different angles. Eggs collected by different cameras are numbered according to a fixed rule, and the pictures of the same egg collected by different cameras are given the same number.
S2, preprocessing the egg image to obtain a preprocessed image;
in a specific embodiment, in the step S2, the preprocessing specifically includes the following steps:
s21, extracting the ROI area of the egg image through a preset pre-selection frame, and carrying out normalization processing on the egg image to obtain a first egg image;
s22, smoothing the first egg image by adopting a median filtering algorithm to obtain a second egg image;
s23, filtering out the interference pixels of the background part of the second egg image by using a Hough circle detection algorithm.
In a specific embodiment, the specific process of step S23 is: first, the edge of the second egg image is detected with the Hough circle detection algorithm using a gradient method; then, the local gradient at each non-zero point of the image is considered, the image gradient being calculated with a 2×2 mask from the pixels around the i-th pixel point: g_x = [i(x+1,y) + i(x+1,y+1) − i(x,y) − i(x,y+1)]/2 and g_y = [i(x,y+1) + i(x+1,y+1) − i(x,y) − i(x+1,y)]/2; the level-line angle of the pixel is θ = arctan(g_x/(−g_y)) and the gradient magnitude is G = √(g_x² + g_y²); next, the circumference line is determined according to the analytical equation of the circle (x−a)² + (y−b)² = r²; the radius is determined according to the degree of support of the non-zero pixels on the edges of all candidate centers; finally, the maximum connected domain of the Hough circle is found and combined with the image, thereby filtering out the light interference around the egg caused by the illumination.
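For illustration only, a minimal sketch of this background-filtering step using OpenCV's Hough circle transform is given below; the Hough parameters and the masking strategy are assumptions:

```python
# Hypothetical sketch: keep only the egg disc found by the Hough circle transform.
import cv2
import numpy as np

def filter_background(second_img: np.ndarray) -> np.ndarray:
    circles = cv2.HoughCircles(
        second_img, cv2.HOUGH_GRADIENT, dp=1.5,
        minDist=second_img.shape[0],          # expect a single egg per crop
        param1=100, param2=40,                # edge / accumulator thresholds (assumed)
        minRadius=60, maxRadius=0)
    if circles is None:
        return second_img                     # no circle found: leave the image unchanged
    x, y, r = np.round(circles[0, 0]).astype(int)
    mask = np.zeros(second_img.shape[:2], np.uint8)
    cv2.circle(mask, (x, y), r, 255, thickness=-1)
    # Everything outside the largest circular region (stray illumination) is zeroed out.
    return cv2.bitwise_and(second_img, second_img, mask=mask)
```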
S3, extracting the internal characteristic information of the egg of the preprocessed image to obtain an input image;
in a specific embodiment, in the step S3, specifically, the characteristic information inside the egg is extracted through gray threshold segmentation and area screening of the region of interest.
S4, searching textures in the input image through a line segment recognition algorithm, and finally obtaining a binarization image layer containing texture position information;
in a specific embodiment, in the step S4, the method specifically further includes the following steps:
s41, searching textures in the input image through a line segment recognition algorithm to obtain an image containing suspected textures;
s42, screening the suspected textures in the image with a dual-threshold method, and splicing adjacent textures with high similarity into the same texture by comparing the similarity in sharpness and inclination of the screened suspected textures;
s43, if a texture spliced in step S42 is larger than a set threshold, outputting it as an identified texture, and finally obtaining a binarized image layer containing texture position information.
In step S41, the line segment recognition algorithm is specifically implemented as follows:
the input image from step S3 is taken as the algorithm input; Gaussian filtering is applied to the pixels of the input image of a single egg, and a region-growing operation is performed on the Gaussian-filtered image: starting from seed pixels, the region-growing algorithm merges pixels whose directions are similar, using the gradient magnitudes and angles from step S23, and the region angle is updated each time a new region pixel is added, so that a directional line segment region is obtained; the pixels of the line segment region are then computed with a rectangular approximation to obtain a regular linear region; and the linear region is taken as the suspected texture.
It should be noted that the straight line region is an approximately rectangular region composed of a plurality of pixels, and the region can be regarded as a straight line at a conventional image size.
Specifically, in step S41, when the pixels of the line segment region are computed using the rectangular approximation, the angle of the rectangle is set to the angle of the eigenvector associated with the smallest eigenvalue of the matrix M = [m_xx m_xy; m_xy m_yy], where m_xx = Σᵢ Gᵢ(xᵢ − c_x)² / Σᵢ Gᵢ, m_yy = Σᵢ Gᵢ(yᵢ − c_y)² / Σᵢ Gᵢ and m_xy = Σᵢ Gᵢ(xᵢ − c_x)(yᵢ − c_y) / Σᵢ Gᵢ; here Gᵢ is the gradient value of pixel i, i ranges over the pixel points in the region, and (c_x, c_y) is the center of the rectangle.
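The region-growing and rectangular-approximation procedure described above corresponds to the classical LSD line segment detector; for illustration only, a sketch using OpenCV's implementation is given below (createLineSegmentDetector is missing from some OpenCV builds, so this stands in for, and is not, the patented implementation):

```python
# Hypothetical sketch: suspected blood vessel textures via OpenCV's LSD detector.
import cv2
import numpy as np

def find_suspected_textures(input_img: np.ndarray):
    lsd = cv2.createLineSegmentDetector(cv2.LSD_REFINE_ADV)
    lines, widths, precisions, nfa = lsd.detect(input_img)
    layer = np.zeros_like(input_img)
    if lines is not None:
        for x1, y1, x2, y2 in lines[:, 0]:
            cv2.line(layer, (int(x1), int(y1)), (int(x2), int(y2)), 255, 1)
    return lines, widths, layer     # candidate segments plus a provisional texture layer
```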
In the step S42, the specific implementation process of the dual-threshold method is as follows:
setting a width threshold and a contrast threshold, filtering out blurred linear regions according to the contrast threshold, and screening out the clear linear regions that satisfy the width condition using the width threshold, these serving as textures.
Specifically, the width threshold and the contrast threshold divide the gradient magnitudes into parts: a candidate larger than the high threshold P₁ is considered a strong texture, one smaller than the low threshold P₂ is considered non-texture (noise), and one between the two thresholds, P₂ < P ≤ P₁, is considered a weak texture. For each weak texture it is checked whether it has pixel information connected to a strong texture: if so, it is classified into the same texture information; if not, it is filtered out. Finally, the suspected textures are marked.
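For illustration only, a sketch of this dual-threshold screening is given below; the candidate representation (contrast, width, endpoints) and the numeric thresholds are assumptions:

```python
# Hypothetical sketch: hysteresis-style screening of candidate textures.
def screen_textures(candidates, p1=60.0, p2=20.0, max_width=6.0, tol=3.0):
    # candidates: list of dicts with "contrast", "width" and "endpoints" keys (assumed).
    ok_width = [c for c in candidates if c["width"] <= max_width]
    strong = [c for c in ok_width if c["contrast"] > p1]
    weak = [c for c in ok_width if p2 < c["contrast"] <= p1]

    def adjacent(a, b):
        # A weak segment is kept only if one of its endpoints lies near a strong segment.
        return any(abs(ax - bx) <= tol and abs(ay - by) <= tol
                   for ax, ay in a["endpoints"] for bx, by in b["endpoints"])

    return strong + [w for w in weak if any(adjacent(w, s) for s in strong)]
```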
S5, counting the number of textures in the binarization image layer, and judging the activity quality of the eggs according to the number of textures.
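For illustration only, a sketch of the counting and judgment step is given below; the text does not specify a numeric decision rule, so the threshold of five textures is purely an assumption:

```python
# Hypothetical sketch: count textures in the binarized layer and grade activity.
import cv2
import numpy as np

def judge_activity(texture_layer: np.ndarray, min_textures: int = 5) -> str:
    n_labels, _ = cv2.connectedComponents(texture_layer)
    n_textures = n_labels - 1                     # label 0 is the background
    return "active" if n_textures >= min_textures else "low activity"
```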
Example 2
Referring to fig. 6, an automatic egg activity detection system based on a vision system includes an image acquisition unit 1, an image preprocessing unit 2, an image feature extraction unit 3, a texture recognition unit 4 and a texture number statistical analysis unit 5, which are sequentially connected;
the image acquisition unit 1 is used for acquiring an egg image;
the image preprocessing unit 2 is used for preprocessing the egg image to obtain a preprocessed image;
the image feature extraction unit 3 is used for extracting the internal feature information of the egg of the preprocessed image to obtain an input image;
the texture recognition unit 4 is used for retrieving textures in the input image through a line segment recognition algorithm to finally obtain a binarization image layer containing texture position information;
the texture number statistical analysis unit 5 is used for counting the texture number in the binarization image layer, and judging the activity quality of the egg according to the texture number.
Example 3
A computer device comprising a storage medium, a processor and a computer program stored in the storage medium and executable by the processor, which when executed by the processor performs the steps of a vision system-based egg activity automatic detection method of embodiment 1.
It is to be understood that the above examples of the present application are provided by way of illustration only and not as a limitation of the embodiments of the present application. Other variations or modifications of the above teachings will be apparent to those of ordinary skill in the art. It is neither necessary nor possible to exhaustively list all embodiments here. Any modification, equivalent replacement or improvement made within the spirit and principles of the application is intended to be covered by the following claims.
Claims (10)
1. An automatic egg activity detection method based on a vision system is characterized by comprising the following steps of:
s1, acquiring an egg image;
s2, preprocessing the egg image to obtain a preprocessed image;
s3, extracting the internal characteristic information of the egg of the preprocessed image to obtain an input image;
s4, searching textures in the input image through a line segment recognition algorithm, and finally obtaining a binarization image layer containing texture position information;
s5, counting the number of textures in the binarization image layer, and judging the activity quality of the eggs according to the number of textures.
2. An automatic detection method of egg activity based on a visual system according to claim 1, wherein in said step S2, said preprocessing comprises in particular the steps of:
s21, extracting the ROI area of the egg image through a preset pre-selection frame, and carrying out normalization processing on the egg image to obtain a first egg image;
s22, smoothing the first egg image by adopting a median filtering algorithm to obtain a second egg image;
s23, filtering out the interference pixels of the background part of the second egg image by using a Hough circle detection algorithm.
3. The method for automatically detecting the activity of eggs based on the visual system according to claim 2, wherein the specific process of step S23 is as follows: first, the edge of the second egg image is detected with the Hough circle detection algorithm using a gradient method; then, the local gradient at each non-zero point of the image is considered, the image gradient being calculated with a 2×2 mask from the pixels around the i-th pixel point: g_x = [i(x+1,y) + i(x+1,y+1) − i(x,y) − i(x,y+1)]/2 and g_y = [i(x,y+1) + i(x+1,y+1) − i(x,y) − i(x+1,y)]/2; the level-line angle of the pixel is θ = arctan(g_x/(−g_y)) and the gradient magnitude is G = √(g_x² + g_y²); next, the circumference line is determined according to the analytical equation of the circle (x−a)² + (y−b)² = r²; the radius is determined according to the degree of support of the non-zero pixels on the edges of all candidate centers; and finally, the maximum connected domain of the Hough circle is found and combined with the image, thereby filtering out the light interference around the egg caused by the illumination.
4. The method according to claim 1, wherein in the step S3, the characteristic information inside the egg is extracted through gray threshold segmentation and area screening of the region of interest.
5. An automatic detection method of egg activity based on a visual system according to claim 1, characterized in that in said step S4, the following steps are specifically further included:
s41, searching textures in the input image through a line segment recognition algorithm to obtain an image containing suspected textures;
s42, screening the suspected textures in the image with a dual-threshold method, and splicing adjacent textures with high similarity into the same texture by comparing the similarity in sharpness and inclination of the screened suspected textures;
s43, if a texture spliced in step S42 is larger than a set threshold, outputting it as an identified texture, and finally obtaining a binarized image layer containing texture position information.
6. The automatic egg activity detection method based on a vision system according to claim 5, wherein in the step S41, the line segment recognition algorithm is specifically implemented as follows:
taking the input image from step S3 as the algorithm input; applying Gaussian filtering to the pixels of the input image of a single egg, and performing a region-growing operation on the Gaussian-filtered image: starting from seed pixels, the region-growing algorithm merges pixels with similar directions into the region, and the region angle is updated each time a new region pixel is added, so that a directional line segment region is obtained; the pixels of the line segment region are then computed with a rectangular approximation to obtain a regular linear region; and the linear region is taken as the suspected texture.
7. The method for automatically detecting egg activity based on a visual system according to claim 5, wherein in the step S42, the dual-threshold method is implemented as follows:
setting a width threshold and a contrast threshold, filtering out blurred linear regions according to the contrast threshold, and screening out the clear linear regions that satisfy the width condition using the width threshold.
8. An automatic detection method of egg activity based on a visual system according to claim 1, further comprising a step S0 performed before step S1; the step S0 includes:
s01, numbering and camera calibration are carried out on the image acquisition module;
s02, marking the batch of eggs and different eggs in the same batch;
s03, constructing a data transmission link.
9. An automatic egg activity detection system based on a visual system is characterized by comprising an image acquisition unit (1), an image preprocessing unit (2), an image feature extraction unit (3), a texture recognition unit (4) and a texture number statistical analysis unit (5) which are connected in sequence;
the image acquisition unit (1) is used for acquiring an egg image;
the image preprocessing unit (2) is used for preprocessing the egg image to obtain a preprocessed image;
the image feature extraction unit (3) is used for extracting the internal feature information of the egg of the preprocessed image to obtain an input image;
the texture recognition unit (4) is used for retrieving textures in the input image through a line segment recognition algorithm to finally obtain a binarization image layer containing texture position information;
the texture number statistical analysis unit (5) is used for counting the texture number in the binarization image layer, and judging the activity quality of the egg according to the texture number.
10. A computer device, characterized by: comprising a storage medium, a processor and a computer program stored in the storage medium and executable by the processor, which computer program, when being executed by the processor, carries out the steps of a vision system-based automatic detection method of egg activity as claimed in any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310665385.2A CN116797824A (en) | 2023-06-06 | 2023-06-06 | Automatic egg activity detection method and system based on vision system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310665385.2A CN116797824A (en) | 2023-06-06 | 2023-06-06 | Automatic egg activity detection method and system based on vision system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116797824A true CN116797824A (en) | 2023-09-22 |
Family
ID=88033790
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310665385.2A Pending CN116797824A (en) | 2023-06-06 | 2023-06-06 | Automatic egg activity detection method and system based on vision system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116797824A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118468919A (en) * | 2024-07-11 | 2024-08-09 | 青岛海兴智能装备有限公司 | Egg counter based on intelligent vision and counting method |
-
2023
- 2023-06-06 CN CN202310665385.2A patent/CN116797824A/en active Pending
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118468919A (en) * | 2024-07-11 | 2024-08-09 | 青岛海兴智能装备有限公司 | Egg counter based on intelligent vision and counting method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11842556B2 (en) | Image analysis method, apparatus, program, and learned deep learning algorithm | |
CN108764257B (en) | Multi-view pointer instrument identification method | |
US9489562B2 (en) | Image processing method and apparatus | |
JP6517788B2 (en) | System and method for adaptive histopathology image decomposition | |
US11226280B2 (en) | Automated slide assessments and tracking in digital microscopy | |
CN106780537B (en) | A kind of paper cocooning frame silk cocoon screening plant and method based on image procossing | |
CN112215790A (en) | KI67 index analysis method based on deep learning | |
CN113962976B (en) | Quality evaluation method for pathological slide digital image | |
CN116228764B (en) | Neonate disease screening blood sheet acquisition quality detection method and system | |
JP2011103786A (en) | Device for harvesting bacterial colony and method therefor | |
US12051253B2 (en) | Method and apparatus for training a neural network classifier to classify an image depicting one or more objects of a biological sample | |
CN116797824A (en) | Automatic egg activity detection method and system based on vision system | |
CN115861721B (en) | Livestock and poultry breeding spraying equipment state identification method based on image data | |
CN117333489A (en) | Film damage detection device and detection system | |
CN113724235B (en) | Semi-automatic Ki67/ER/PR negative and positive cell counting system and method under condition of changing environment under mirror | |
CN115131346A (en) | Fermentation tank processing procedure detection method and system based on artificial intelligence | |
CN118115496B (en) | Wafer defect detection method and device | |
CN114399764A (en) | Pathological section scanning method and system | |
CN109409402A (en) | A kind of image contamination detection method and system based on dark channel prior histogram | |
CN116563298B (en) | Cross line center sub-pixel detection method based on Gaussian fitting | |
CN111797706A (en) | Image-based parasite egg shape recognition system and method | |
CN114550069B (en) | Piglet nipple counting method based on deep learning | |
Blasco et al. | Automatic sex detection of individuals of Ceratitis capitata by means of computer vision in a biofactory | |
CN112651305A (en) | Method for identifying microorganism species | |
CN117496274B (en) | Classification counting method, system and storage medium based on liquid drop images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination |