CN117173490A - Marine product detection classification method and system based on separated and extracted image data - Google Patents
- Publication number
- CN117173490A (application CN202311293437.4A)
- Authority
- CN
- China
- Prior art keywords
- boundary
- target
- derivative
- real-time
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Abstract
The invention provides a marine product detection and classification method and system based on separated and extracted image data, and relates to the technical field of marine product detection and classification. The method comprises: obtaining classified image data of a target marine product, extracting target boundary information and meat quality glossiness information respectively, and establishing a target boundary comparison database and a target meat quality comparison database; acquiring a plurality of real-time images of the target marine product, comparing their boundary information against the target boundary comparison database, and classifying and screening the target marine product according to the comparison results to form different screening classes; and determining target screening classes among those classes, acquiring their real-time meat quality glossiness information, and performing meat quality classification analysis against the target meat quality comparison database to form meat quality classification results. The method achieves efficient, automated screening of marine product meat quality.
Description
Technical Field
The invention relates to the technical field of marine product detection and classification, and in particular to a marine product detection and classification method and system based on separated and extracted image data.
Background
Seafood products are numerous, and different types require preliminary sorting followed by fine sorting to meet different market demands. As the seafood market grows, demand for seafood keeps increasing. At present, classification and screening of seafood, especially of its meat quality, is performed largely by hand, which is time-consuming and labor-intensive.
Some automated screening and sorting methods do exist, for example sorting combined with weight measurement, but such methods use a single feature and serve only as auxiliary screening; classification of marine product meat quality still relies on manual operation. Most marine products, especially fish and shellfish, have a relatively fixed shape, so meat quality classification and identification by the same means is feasible, and achieving efficient, automated meat quality classification of marine products is worth intensive research.
Therefore, designing a marine product detection and classification method and system based on separated and extracted image data that achieves efficient, automated screening of marine product meat quality is a problem to be solved urgently.
Disclosure of Invention
The object of the present invention is to provide a marine product detection and classification method based on separated and extracted image data. Historical classification image data of the meat of the target marine product to be detected and screened is acquired; boundary features are extracted from these images to build a boundary comparison database, and a meat quality comparison database is built from the light gray-scale features that the meat exhibits in image data under specific lighting. Real-time image data of the meat is then obtained and boundary-feature comparison is performed, automatically detecting whether the meat of the target marine product is damaged or shows hyperplasia; marine products whose meat is normal according to the detection results are further compared for meat quality state, i.e., their gray data are compared and analyzed using the meat quality comparison database, completing the classification of different meat qualities. The whole detection-classification process achieves automated screening of the target marine product's meat quality on the basis of image processing, offers high screening accuracy and efficiency, and greatly improves the processing efficiency of marine products.
The invention also aims to provide a marine product detection and classification system based on separated and extracted image data. An image data acquisition unit fully acquires image data of the target marine product and provides reasonable, sufficient basic data for the subsequent detection and classification analysis. An analysis processing unit performs reasonable, efficient analysis of the meat quality state based on the image data and accurate analysis of the meat quality, forms accurate detection, classification and screening results, and thereby provides real-time, accurate analysis result data for completing the whole detection-classification-screening process of marine product meat quality effectively and accurately.
In a first aspect, the present invention provides a marine product detection and classification method based on separated and extracted image data, comprising: obtaining classified image data of a target marine product, extracting target boundary information and meat quality glossiness information respectively, and establishing a target boundary comparison database and a target meat quality comparison database; acquiring a plurality of real-time images of the target marine product, comparing boundary information against the target boundary comparison database, and classifying and screening the target marine product according to the comparison results to form different screening classes; and determining target screening classes among those classes, acquiring their real-time meat quality glossiness information, and performing meat quality classification analysis against the target meat quality comparison database to form meat quality classification results.
In the present invention, the method acquires historical classification image data of the target seafood to be classified, detected and screened, extracts boundary features from these images to build a boundary comparison database, and builds a meat quality comparison database from the light gray-scale features that the meat exhibits in image data under specific lighting. Real-time image data of the meat is then obtained and boundary-feature comparison is performed, automatically detecting whether the meat of the target marine product is damaged or shows hyperplasia; marine products whose meat is normal according to the detection results are further compared for meat quality state, i.e., their gray data are compared and analyzed using the meat quality comparison database, completing the classification processing of different meat qualities. The whole detection-classification screening process achieves automated classification, detection and screening of the target marine product's meat quality on the basis of image processing, offers high screening accuracy and efficiency, and greatly improves the processing efficiency of marine products.
As a possible implementation, obtaining classified image data of the target marine product, extracting target boundary information and meat quality glossiness information respectively, and establishing a target boundary comparison database and a target meat quality comparison database comprises: acquiring the classified image data of the target marine product and dividing it by image acquisition direction to form classified image data sets for different directions; performing extraction and analysis based on target boundary information on the classified image data sets of different directions to form the target boundary comparison database; and performing extraction and analysis based on meat quality glossiness information on the classified image data sets of different directions to form the target meat quality comparison database.
In the present invention, image data acquired from different directions differ, so the boundary characteristic information and meat luster information they yield also differ. For the database to support targeted comparison, the basic data must be classified by acquisition direction when the database is built, so that image data acquired in the same direction are generalizable and compatible during feature extraction. The acquisition directions can be considered when the image big data is first established, and determined from the shape, luster and other meat characteristics of different types of marine products. For shellfish such as scallops, whose meat follows the shape of the shell, the direction facing a single shell surface and the direction facing the two shell seams can be considered, so that the boundary feature data extracted later contains as much of the product's shape information as possible. For each image acquisition, a stable state of the seafood can also be ensured, for example stable placement on a fixed station of a working procedure, or a fixed position in a stable conveying process, so that the established boundary coordinate system basically corresponds to the associated boundary feature function; this helps improve the accuracy of the scaling data in the subsequent comparative analysis.
As a possible implementation, performing extraction and analysis based on target boundary information on the classified image data sets of different directions to form the target boundary comparison database comprises: for the classified image data set of each direction, extracting the boundary information in each image to form a target boundary function set F, wherein each target boundary function in F carries a boundary coordinate system whose origin is the image center; performing an overlap comparison of the target boundary functions in F under their boundary coordinate systems, and determining the target boundary function with the highest degree of overlap as the specified boundary function f0; setting an angle step α, with the boundary coordinate system corresponding to the specified boundary function f0 as the reference coordinate system for point collection; taking the coordinate origin as reference and the angle step α as the angular interval for collecting designated reference points, forming rays from the origin in sequence and determining each intersection of a ray with the specified boundary function f0 as a designated reference point; obtaining the function derivative at each designated reference point to form the specified derivative set D0 = [d1, d2, …, dk], where k is the number of designated reference points collected; scaling each target boundary function in F other than the specified boundary function f0 against f0 to form processing boundary functions; for each processing boundary function, determining processing reference points under its boundary coordinate system with the same point-collection parameters as for f0, and determining the processing derivative at each processing reference point to form the corresponding processing derivative sets D1, …, Dn, where n is the number of target boundary functions other than the specified boundary function; matching each designated reference point one-to-one, by position on the boundary coordinate system, with the processing derivative of each processing boundary function; taking each specified derivative in the specified derivative set D0 as basis and the corresponding processing derivatives in the processing derivative sets D1, …, Dn as range adjustment amounts, determining a derivative adjustment range for each designated reference point; and combining the derivative adjustment ranges of all designated reference points corresponding to the classified image data sets of different directions to establish the target boundary comparison database.
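The reference-point collection described above can be sketched as follows. The patent does not give the boundary functions in closed form, so this sketch assumes the boundary is available as a polar function r(θ) of distance from the image-center origin, and estimates the derivative at each designated reference point numerically; the helper names are illustrative, not from the source.

```python
import math

def reference_derivatives(r_of_theta, alpha_deg=10.0, eps_deg=0.5):
    """Sample designated reference points of a polar boundary r(theta) at
    angle step alpha and estimate the derivative dr/dtheta at each point.

    `r_of_theta` maps an angle in degrees to the boundary's distance from
    the coordinate origin; this polar form is an assumption of the sketch.
    """
    derivs = []
    k = int(360.0 / alpha_deg)  # number of designated reference points
    for i in range(k):
        theta = i * alpha_deg
        # Central-difference estimate of dr/dtheta at the reference point.
        d = (r_of_theta(theta + eps_deg) - r_of_theta(theta - eps_deg)) / (2 * eps_deg)
        derivs.append(d)
    return derivs

# A circular boundary has constant radius, so every derivative is ~0.
circle = lambda theta: 5.0
```

The same routine, run with the same angle step on each scaled (processing) boundary function, yields the processing derivative sets matched position-by-position to the specified derivative set.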
In the invention, the specified boundary function is determined by degree of overlap, so that the features it contains cover most features of the marine product's meat, which helps reasonably retain and maintain feature information when scaling is performed against it. It should be noted that the range adjustment amount obtained here is essentially the derivative of a boundary function at a reference point; by the properties of the derivative, it can determine whether the curve change of a boundary function lies within the reasonable range defined by big data, and thereby identify an abnormal boundary function, that is, a case where the meat may have hyperplasia raising the boundary or damage denting it. Scaling the target boundary functions accounts for individual body-size differences among meat of the same type of seafood; if derivatives were extracted directly, the range adjustment amounts could be enlarged or reduced, making the subsequent real-time judgment against the comparison database inaccurate and the detection classification unreliable. In addition, the specified derivative set is built by reference-point extraction at a fixed step, and the angle step can be chosen according to the boundary characteristics of different types of seafood meat; on the one hand this adaptively improves the accuracy of the data analysis, and on the other hand it keeps the amount of extracted data reasonable and moderate while preserving that accuracy.
It is worth mentioning that, given the shape characteristics of different seafood meat, a boundary function may be a piecewise function; the angle step can then also be adjusted in a targeted manner for the different segment curves within the same boundary function, to improve the accuracy of the reference point data.
As a possible implementation, scaling each target boundary function other than the specified boundary function f0 against f0 to form processing boundary functions comprises: determining the average specified distance L̄ of the specified boundary function f0 from the coordinate origin; selecting in turn each single target boundary function other than f0; making the boundary coordinate system of f0 coincide with the coordinate system of the selected target boundary function, and determining, under the coincident coordinate systems, the distance difference function ΔL = L0 − Ln, where L0 is the distance of a coordinate point on the specified boundary function from the origin and Ln is the distance of the corresponding coordinate point on the selected single target boundary function from the origin; taking the average distance difference under the distance difference function as the effective scaling distance difference ΔL̄; determining the scaling rate of the target boundary function from the average specified distance L̄ and the effective scaling distance difference ΔL̄, wherein a sign extraction function extracts the positive or negative sign of the effective scaling distance difference ΔL̄; and scaling each target boundary function by its corresponding scaling rate to form the processing boundary functions.
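The scaling step can be sketched as below. The patent's exact scaling-rate formula is not reproduced in the text, so this sketch assumes the natural reading: scale the target boundary so that its average origin distance matches the average specified distance, with the sign of the effective scaling distance difference deciding enlargement versus reduction. Function names are illustrative.

```python
def scaling_rate(spec_dists, target_dists):
    """Estimate a scaling rate for a target boundary from paired
    distance-to-origin samples of the specified and target boundaries.

    Assumed reading: the average distance difference (specified minus
    target) tells how much, and in which direction, the target must be
    rescaled so its average distance matches the specified boundary's.
    """
    l_spec = sum(spec_dists) / len(spec_dists)   # average specified distance
    diffs = [s - t for s, t in zip(spec_dists, target_dists)]
    dl = sum(diffs) / len(diffs)                 # effective scaling distance difference
    # dl > 0 means the target is smaller and must be enlarged; dl < 0 shrunk.
    return l_spec / (l_spec - dl)

def scale_boundary(target_dists, rate):
    """Apply the scaling rate to every distance sample of the target."""
    return [rate * t for t in target_dists]
```

Derivatives are then extracted from the scaled (processing) boundaries rather than the raw ones, which is what keeps the range adjustment amounts comparable across individuals of different size.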
In the invention, using the distance difference as the basic data for determining the scaling rate makes the scaling accurate. Since individuals of the same type of seafood differ in meat shape, the distance difference is not constant; taking the average distance as the reference quantity yields a reasonable scaling rate, satisfies the averaging requirement of big-data processing, and still retains enough feature data. The boundary function here is preferably a function in an angular (polar) coordinate system, though it may also be a function in a planar coordinate system. Under an angular coordinate system, the acquired distance differences are the actual distances between corresponding reference points; under a planar coordinate system, what is acquired is an angular difference relative to the origin rather than the actual distance difference, but equivalent processing can be performed, since the discrepancy from the actual distance difference is small, boundary feature information is fully retained, and the scaling rate is not changed greatly. This slight influence can also be reduced by adjusting the angle step, to improve accuracy.
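Under the angular reading above, a boundary captured as planar (x, y) points can be converted into a distance-from-origin profile per angle bin; a minimal sketch, assuming the points are already expressed in a coordinate system centered on the image (the binning scheme is an illustrative choice, not from the source):

```python
import math

def polar_profile(boundary_points, step_deg=5.0):
    """Resample a closed boundary as average distance-from-origin per
    angle bin of width `step_deg`.

    `boundary_points` is assumed to be an iterable of (x, y) pairs with
    the image center as origin; the result approximates the distance
    function used when forming distance differences.
    """
    bins = {}
    for x, y in boundary_points:
        theta = math.degrees(math.atan2(y, x)) % 360.0
        key = int(theta // step_deg)                 # angle bin index
        bins.setdefault(key, []).append(math.hypot(x, y))
    # Average the distances that fall into each bin.
    return {k: sum(v) / len(v) for k, v in sorted(bins.items())}
```

Two such profiles with the same angle step give directly comparable distance samples for the scaling-rate computation.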
As a possible implementation, taking each specified derivative in the specified derivative set D0 as basis and the corresponding processing derivatives in the processing derivative sets D1, …, Dn as range adjustment amounts, determining the derivative adjustment range of each designated reference point comprises: acquiring, from each processing derivative set, the processing derivative corresponding to the same position, to form an initial reference-point processing derivative set; setting a derivative deviation threshold β and applying the following screening judgment to each processing derivative in the initial set: if the deviation of the processing derivative from the corresponding specified derivative does not exceed β, the processing derivative is retained, otherwise it is removed; collecting the retained processing derivatives to form an effective reference-point processing derivative set; taking the minimum processing derivative and the maximum processing derivative in the effective set to form an initial adjustment range; and forming the derivative adjustment range from the initial adjustment range and the corresponding specified derivative as follows: if the specified derivative belongs to the corresponding initial adjustment range, the initial adjustment range is determined as the derivative adjustment range; if the specified derivative does not belong to the corresponding initial adjustment range, the initial adjustment range is expanded with the specified derivative as a boundary value, forming the derivative adjustment range.
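The range formation for one reference point can be sketched as follows. The screening condition in the source is not given explicitly, so the sketch assumes it compares each processing derivative's deviation from the specified derivative against the threshold β; the empty-set fallback is likewise an assumption of this sketch.

```python
def derivative_adjust_range(spec_d, proc_ds, beta):
    """Form the derivative adjustment range for one designated reference point.

    `spec_d` is the specified derivative, `proc_ds` the processing
    derivatives at the same position, `beta` the derivative deviation
    threshold.
    """
    # Screening: keep processing derivatives whose deviation is within beta.
    valid = [d for d in proc_ds if abs(d - spec_d) <= beta]
    if not valid:
        # No processing derivative survives screening; fall back to the
        # specified derivative alone (an assumption of this sketch).
        return (spec_d, spec_d)
    lo, hi = min(valid), max(valid)          # initial adjustment range
    # Expand the range if it does not already contain the specified derivative.
    return (min(lo, spec_d), max(hi, spec_d))
```

Repeating this per reference point, per acquisition direction, yields the entries of the target boundary comparison database.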
In the invention, the derivative adjustment range is bounded by the minimum and maximum of the derivatives formed across the boundary functions; but since the boundary feature function with the greatest degree of overlap contains the most boundary feature information, the inclusion of the specified derivative must also be checked. This further reduces cases where a boundary function yields a deviated range, and improves the accuracy and rationality of the resulting derivative adjustment range.
As a possible implementation, performing extraction and analysis based on meat quality glossiness information on the classified image data sets of different directions to form the target meat quality comparison database comprises: for the classified image data set of each direction, extracting the gray data of each image within the corresponding target boundary, where the gray data of each image contains the position of each gray value under the corresponding boundary coordinate system; setting a diameter step r; for the gray information within the specified boundary function f0, taking the origin of the corresponding boundary coordinate system as the circle center and enlarging the acquisition circle by the diameter step r in sequence, acquiring the gray information in each newly added range and determining its specified average gray value, to form a specified-range gray information set G0 = [g1, g2, …, gi], where i is the number of regions acquired within the specified boundary function f0 and gi is the specified average gray value of the region numbered i; for each other target boundary function, taking the origin of its boundary coordinate system as the circle center and enlarging the acquisition circle in sequence by the scaled step, i.e., the diameter step adjusted by the corresponding scaling rate, acquiring the gray information in each newly added range and determining its scaled average gray value, to form scaled-range gray information sets Gn, where the element of Gn numbered i is the scaled average gray value of the region numbered i within the target boundary function numbered n; taking the specified average gray value of each region of G0 as basis and the scaled average gray values of each region in the sets Gn as adjustment amounts, determining the region gray range as follows: if the specified average gray value belongs to the range delimited by all scaled average gray values of the regions with the same number, the range delimited by all scaled average gray values is determined as the gray adjustment range of the corresponding region; if the specified average gray value does not belong to that range, the range delimited by all scaled average gray values is expanded with the specified average gray value as a boundary value, forming the gray adjustment range; and collecting the gray adjustment ranges of all regions to form the target meat quality comparison database.
In the invention, under the specific light setting, the illumination in each image follows the same regularity, especially in glossiness. To improve analysis efficiency and simplify the analysis data, the image is gray-processed, which simplifies the glossiness information while fully retaining the features of glossiness variation; the glossiness of the seafood meat can therefore be determined by numerical analysis of the gray data, forming a comparison database able to distinguish meat quality. Since the boundary comparison database already uses the coordinate system and scaling, two points are considered when establishing the meat quality comparison database. First, meat of different sizes also differs in glossiness at different positions, so the extent of each acquired region must be adaptively adjusted by the scaling rate, giving the acquired gray data more reasonable comparability and accuracy. Second, scaling adjusts and compresses the gray data of different-sized meat, further ensuring the rationality and comparability of the data; taking the origin of the coordinate system referenced by the scaling as the starting point of region division ensures that scaling does not affect the division of regions. Of course, the light must be set so that it corresponds as closely as possible to the position of the coordinate origin; dividing the regions as circles then keeps the glossiness change within each region from being too strong, effectively ensuring the rationality and accuracy of the data.
As a possible implementation manner, acquiring a plurality of real-time image data of the target marine product, performing boundary information comparison in combination with the target boundary comparison database, and classifying and screening the target marine product according to the comparison result data to form different screening classes comprises: acquiring a plurality of real-time image data of the target marine product in a first motion direction and extracting real-time boundary functions to form a plurality of first-direction real-time boundary functions, wherein each first-direction real-time boundary function comprises a first real-time coordinate system taking the center of the real-time image as the coordinate origin; determining, in the target boundary comparison database, the derivative adjustment ranges whose image acquisition direction is the same as that of the first-direction real-time boundary functions; for each first-direction real-time boundary function, acquiring the real-time derivative at each real-time reference point that coincides with the reference point corresponding to a derivative adjustment range; and carrying out the following classification screening on the real-time derivatives corresponding to the derivative adjustment ranges on the plurality of first-direction real-time boundary functions: if the real-time derivatives on all the first-direction real-time boundary functions belong to the corresponding derivative adjustment ranges, marking the analyzed target marine product as the first-direction normal screening class; if any real-time derivative on a first-direction real-time boundary function does not belong to the corresponding derivative adjustment range and is smaller than the lower boundary value of that range, marking the analyzed target marine product as the first-direction damage screening class; if any real-time derivative on a first-direction real-time boundary function does not belong to the corresponding derivative adjustment range and is larger than the upper boundary value of that range, marking the analyzed target marine product as the first-direction proliferation screening class; acquiring a plurality of real-time image data in a second motion direction for the target marine products determined to be the first-direction normal screening class, and extracting real-time boundary functions to form a plurality of second-direction real-time boundary functions, wherein each second-direction real-time boundary function comprises a second real-time coordinate system taking the center of the real-time image as the coordinate origin; determining, in the target boundary comparison database, the derivative adjustment ranges whose image acquisition direction is the same as that of the second-direction real-time boundary functions; for each second-direction real-time boundary function, acquiring the real-time derivative at each real-time reference point that coincides with the reference point corresponding to a derivative adjustment range; and carrying out the following classification screening on the real-time derivatives corresponding to the derivative adjustment ranges on the plurality of second-direction real-time boundary functions: if the real-time derivatives on all the second-direction real-time boundary functions belong to the corresponding derivative adjustment ranges, marking the analyzed target marine product as the second-direction normal screening class; if any real-time derivative on a second-direction real-time boundary function does not belong to the corresponding derivative adjustment range and is smaller than the lower boundary value of that range, marking the analyzed target marine product as the second-direction damage screening class; and if any real-time derivative on a second-direction real-time boundary function does not belong to the corresponding derivative adjustment range and is larger than the upper boundary value of that range, marking the analyzed target marine product as the second-direction proliferation screening class.
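As a minimal sketch, the two-stage derivative screening above can be expressed in a few lines of Python; the function names, the (low, high) tuple encoding of a derivative adjustment range, and the returned class labels are illustrative assumptions rather than the patent's actual implementation:

```python
def classify_direction(realtime_derivatives, adjustment_ranges):
    """Classify one seafood item in one acquisition direction.

    realtime_derivatives: per-reference-point derivatives taken from the
        real-time boundary functions of this direction.
    adjustment_ranges: list of (low, high) tuples from the comparison
        database, one per reference point, in the same order.
    Returns "normal", "damage" (derivative below the lower boundary
    value) or "proliferation" (derivative above the upper boundary value).
    """
    for d, (low, high) in zip(realtime_derivatives, adjustment_ranges):
        if d < low:           # boundary drops faster than allowed: depression
            return "damage"
        if d > high:          # boundary rises faster than allowed: bulge
            return "proliferation"
    return "normal"

def screen_item(deriv_dir1, ranges_dir1, deriv_dir2, ranges_dir2):
    """Two-stage screening: the second direction is only checked for
    items that are normal in the first direction."""
    first = classify_direction(deriv_dir1, ranges_dir1)
    if first != "normal":
        return "first-direction " + first + " screening class"
    second = classify_direction(deriv_dir2, ranges_dir2)
    return "second-direction " + second + " screening class"
```

A single out-of-range derivative is enough to reclassify the item, matching the single-reference-point criterion discussed below.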
In the invention, after the comparison database is established, the direction of the real-time image data must be determined once it is acquired, and the corresponding derivative adjustment range is then extracted from the comparison database, so that invalid data comparison is avoided. It will be appreciated that the interval defined by the derivative adjustment range is essentially the reasonable variation range of the rate of change of the target seafood boundary; for any real-time derivative not within this range, it can be determined that a non-negligible variation of the boundary feature has occurred at the corresponding location. Such a variation is typically caused by adhering sediment or by a sunken damaged area, and the same variation generally appears at several adjacent reference points, so a further, more accurate judgment can be made by checking adjacent reference points consecutively, to avoid deviations in the data judgment. It is worth noting that this judgment also depends on the selected angle step: if the angle step is chosen relatively large, the changed position may be skipped over, which is why the invention also treats the abnormality of a single reference point as a criterion for analysis and judgment. The damage screening class can be routed to other subsequent processes, and the contamination screening class can be returned to the cleaning process of the same procedure, or judged manually. The invention performs detection and classification screening by analyzing and judging image data in two directions; of course, the number of directions to be detected can be set according to the shape characteristics of the meat quality of different marine products, and image data can then be acquired in those directions for screening and judgment.
Essentially, the analysis and screening judgment in each direction determines whether the meat is incompletely cleaned or broken. It can be understood that the data analysis and judgment combined with the boundary features also provides a data basis for subsequently classifying and screening marine products by size, realizing efficient automatic classification and screening from raw products to finished products.
As a possible implementation manner, determining a target screening class according to the different screening classes, acquiring real-time meat quality glossiness information of the target screening class, and performing meat quality classification analysis in combination with the target meat quality comparison database to form meat quality classification result data comprises: determining the second-direction normal screening class as the target screening class; acquiring gray information of the target screening class within the boundary ranges in different motion directions, and determining the real-time average gray value of each corresponding region range based on the gray information set of designated ranges; and judging meat quality classification in combination with the real-time average gray values and the target meat quality comparison database in the following manner: acquiring the gray adjustment range corresponding to the motion direction from the target meat quality comparison database; if the real-time average gray value of the corresponding region belongs to the corresponding gray adjustment range, marking the corresponding target marine product as a high-quality screening class; and if the real-time average gray value of the corresponding region does not belong to the corresponding gray adjustment range, marking the corresponding target marine product as a general-quality screening class.
In the invention, the real-time average gray value is obtained by processing the gray data of the real-time image data in the same manner as the gray information set of designated ranges; of course, the step length of the region range is adjusted by scaling according to the scaling rate to achieve a reasonable division, so that the obtained real-time average gray values can be compared and analyzed in one-to-one correspondence with the gray adjustment ranges of the database. Further, since the evaluation of meat quality is considered as a whole, the seafood meat can be judged to be of high quality only if the gray values of all the regions fall within the gray adjustment ranges.
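The region-wise gray comparison described above can be sketched as follows; the annular grouping of gray samples by radius and the function names are assumptions made for illustration, since the patent does not fix a data layout:

```python
def regional_means(samples, diameter_step, scale=1.0):
    """samples: list of (radius, gray) pairs inside the meat boundary.
    Groups the samples into annular regions of width scale*diameter_step
    around the coordinate origin and returns the mean gray per region,
    ordered outward from the centre."""
    step = scale * diameter_step
    regions = {}
    for r, g in samples:
        regions.setdefault(int(r // step), []).append(g)
    return [sum(v) / len(v) for _, v in sorted(regions.items())]

def classify_meat_quality(region_gray_means, gray_ranges):
    """region_gray_means: real-time average gray value per region;
    gray_ranges: (low, high) gray adjustment range per region from the
    target meat quality comparison database. All regions must pass for
    the item to be marked high quality."""
    ok = all(low <= g <= high
             for g, (low, high) in zip(region_gray_means, gray_ranges))
    return "high-quality" if ok else "general-quality"
```

The `scale` parameter reflects the text's point that the region step is adjusted by the scaling rate so real-time regions line up one-to-one with the database regions.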
In a second aspect, the present invention provides a marine product detection and classification system based on separated and extracted image data, which is applied to the marine product detection and classification method based on separated and extracted image data in the first aspect, and includes an image data acquisition unit for acquiring image data of a target marine product in different directions; the analysis processing unit is used for acquiring the image data of the image data acquisition unit, establishing a target boundary comparison database and a target meat quality comparison database, carrying out comparison analysis on the real-time image data to form different screening classes, and carrying out comparison analysis on the image data after the screening class analysis based on the target meat quality comparison database to form meat quality classification result data.
In the invention, the system fully acquires the image data of the target marine products through the image data acquisition unit, and provides reasonable and sufficient basic data for the subsequent analysis of detection classification. The analysis processing unit is used for realizing reasonable and efficient analysis of the meat quality state based on the image data and accurate analysis of the meat quality, forming accurate marine product meat quality detection classification screening results, and further providing real-time and accurate analysis result data for effectively and accurately completing the whole detection classification screening process of marine product meat quality.
The marine product detection and classification method and system based on separated and extracted image data provided by the invention have the following beneficial effects:
the method establishes a boundary feature database for comparison by acquiring historical classified image data of target marine products to be classified, detected and screened, and extracting boundary features based on the classified image data, and establishes a meat quality comparison database for comparison based on light gray scale features exhibited by meat quality in the image data under specific light. Meanwhile, real-time image data of meat quality is obtained, comparison analysis of boundary characteristics is carried out, automatic classification detection on whether damage or hyperplasia exists in the meat quality of the target marine product is achieved, and marine products with normal meat quality are compared in meat quality state according to detection results, namely, gray data are compared and analyzed by utilizing a meat quality comparison database, and classification processing analysis of different meat quality is further completed. The whole screening process of detection classification realizes automatic classification detection screening of the meat quality of the target marine products on the basis of using an image processing method, and has the advantages of high screening accuracy and high efficiency, and greatly improves the processing efficiency of the marine products.
The system fully acquires the image data of the target marine products through the image data acquisition unit, and provides reasonable and sufficient basic data for the subsequent analysis of detection classification. The analysis processing unit is used for realizing reasonable and efficient analysis of the meat quality state based on the image data and accurate analysis of the meat quality, forming accurate marine product meat quality detection classification screening results, and further providing real-time and accurate analysis result data for effectively and accurately completing the whole detection classification screening process of marine product meat quality.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the embodiments of the present invention will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention and should not be considered as limiting the scope, and other related drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a step diagram of a marine product detection and classification method based on separated and extracted image data according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described below with reference to the accompanying drawings in the embodiments of the present invention.
Seafood products are numerous, and different types require preliminary sorting and subsequent fine sorting to meet different market demands. As the seafood market opens up, the demand for seafood keeps increasing. At present, the classification and screening of seafood, especially of its meat quality, is basically performed manually, which is time-consuming and labor-intensive.
Of course, some screening and sorting methods do incorporate automation, for example classification and sorting combined with weight data acquisition, but such methods are single-faceted and only auxiliary, and the classification and sorting of marine product meat quality still largely remains a manual operation. Most marine products, especially fish and shellfish, have a relatively fixed shape, so classification and identification of the same kind of meat can be realized, and achieving efficient, automatic meat classification of marine products is a subject worthy of in-depth research.
Referring to fig. 1, an embodiment of the present invention provides a marine product detection classification method based on separately extracted image data. The method acquires historical classified image data of the target marine products to be classified, detected and screened, extracts boundary features from these classified image data to establish a boundary feature database for comparison, and establishes a meat quality comparison database based on the gray-scale features exhibited by the meat quality in the image data under specific light. Meanwhile, real-time image data of the meat quality is acquired and comparative analysis of the boundary features is carried out, realizing automatic classification detection of whether the meat quality of the target marine product is damaged or hyperplastic; marine products with normal meat quality are then compared in meat quality state according to the detection results, that is, the gray data are compared and analyzed by means of the meat quality comparison database, further completing the classification processing and analysis of different meat qualities. The whole detection classification screening process realizes automatic classification, detection and screening of the meat quality of the target marine products on the basis of an image processing method, has high screening accuracy and high efficiency, and greatly improves the processing efficiency of marine products.
The marine product detection and classification method based on the separated and extracted image data specifically comprises the following steps:
s1: and obtaining classified image data of the target marine products, respectively extracting target boundary information and meat quality glossiness information, and establishing a target boundary comparison database and a target meat quality comparison database.
The method for acquiring the classified image data of the target marine products, respectively extracting target boundary information and meat quality glossiness information, and establishing a target boundary comparison database and a target meat quality comparison database comprises the following steps: acquiring classified image data of the target marine products, and dividing the classified image data in the image acquisition directions to form classified image data sets in different directions; extracting and analyzing the classified image data sets in different directions based on the target boundary information to form a target boundary comparison database; and carrying out extraction analysis based on the meat quality glossiness information on the different direction classification image data sets to form a target meat quality comparison database.
The image data acquired in different directions are different, and thus the boundary feature information and meat gloss information formed from them are also different. In order for the database to have a targeted comparison effect, the basic data need to be classified by acquisition direction when the database is built, so that image data acquired in the same direction have generality and compatibility during feature extraction. The acquisition directions can be considered at the start of establishing the big data of image data and determined according to the shape, gloss and other characteristics of the meat of different types of marine products; for example, for shellfish such as scallop, since the meat is similar in shape to the shell, the direction facing a single shell surface and the direction facing the two shell seams can be considered, which helps the boundary feature data extracted later contain as much shape feature information of the marine product as possible. Of course, for each image data extraction, the seafood can be captured in a stable state, for example stably placed on a fixed station of a certain working procedure, or at a fixed position of a stable conveying process, so that the established boundary coordinate system basically corresponds to the corresponding feature boundary function, which is beneficial to improving the accuracy of the scaling data in the subsequent comparative analysis.
The method for extracting and analyzing the image data sets classified in different directions based on the target boundary information to form a target boundary comparison database comprises the following steps: for each direction-classified image data set, extracting the boundary information in each image to form a target boundary function set F, wherein each target boundary function in the set F comprises a boundary coordinate system taking the center of the image as the origin of coordinates; performing overlap comparison on the target boundary functions in the set F based on the boundary coordinate systems, and determining the target boundary function with the highest degree of coincidence as the designated boundary function f0; setting an angle step α, and taking the boundary coordinate system corresponding to the designated boundary function f0 as the reference coordinate system for point collection; sequentially forming rays from the origin of coordinates, with the origin as the reference and the angle step α as the angular interval for acquiring designated reference points, and determining the points where the rays intersect the designated boundary function f0 as the designated reference points; obtaining the function derivative corresponding to each designated reference point to form a designated derivative set D0 = [d0,1, d0,2, …, d0,k], wherein k represents the number of acquired designated reference points; scaling each target boundary function other than the designated boundary function f0 toward the designated boundary function f0 to form processing boundary functions; determining, for each processing boundary function under the corresponding boundary coordinate system, the processing reference points with the same point-collection parameters as the designated boundary function f0, and determining the processing derivative corresponding to each processing reference point to form corresponding processing derivative sets D1, D2, …, Dn, wherein n represents the number of target boundary functions other than the designated boundary function; corresponding each designated reference point one-to-one with the processing derivatives in each processing boundary function based on position on the boundary coordinate system, and determining the derivative adjustment range of each designated reference point by taking each designated derivative in the designated derivative set D0 as the basis and the corresponding processing derivatives in each processing derivative set Dn as the range adjustment amounts; and establishing the target boundary comparison database by combining the derivative adjustment ranges of all designated reference points corresponding to the classified image data sets in the different directions.
The designated boundary function is determined through the degree of coincidence, so that the features it contains cover most features of the seafood meat, and this feature information can be reasonably preserved in the subsequent scaling processing based on the designated boundary function. It should be noted that the range adjustment amount obtained here is substantially the derivative of the boundary function at the reference point; based on the properties of the derivative, it can be judged whether the curve change of a boundary function lies within the reasonable range defined by the big data, so as to identify abnormal boundary functions, that is, cases where the seafood meat may have hyperplasia causing the boundary to bulge, or damage causing the boundary to be recessed. The scaling of the target boundary functions takes into account the individual size differences of seafood meat of the same type; if the derivatives were extracted directly, the range adjustment amount could be enlarged or reduced, making subsequent real-time judgment against the comparison database inaccurate and the detection classification imprecise. In addition, the designated derivative set is established by extracting reference points at a fixed step, and the angle step can be determined according to the boundary characteristics of different types of seafood meat, which on the one hand adaptively improves the accuracy of data analysis, and on the other hand achieves rational and moderate data extraction while ensuring that accuracy.
It should also be noted that, considering the shape characteristics of different seafood meat, the boundary function may be a piecewise function, so the angle step can be adjusted in a targeted manner for the different segment curves under the same boundary function, so as to improve the accuracy of the reference point data.
Scaling each target boundary function other than the designated boundary function f0 toward the designated boundary function f0 to form processing boundary functions comprises: determining the average specified distance La between the coordinate points on the designated boundary function f0 and the origin of the coordinate system; selecting in turn each single target boundary function other than the designated boundary function f0; making the boundary coordinate system corresponding to the designated boundary function f0 coincide with the coordinate system corresponding to the selected target boundary function, and determining, when the coordinate systems coincide, the distance difference function ΔL(θ) = L0(θ) − Ln(θ) of the designated boundary function and the target boundary function, wherein L0(θ) represents the distance of the coordinate point on the designated boundary function from the origin of the coordinate system, and Ln(θ) represents the distance of the coordinate point on the selected single target boundary function from the origin of the coordinate system; taking the average distance difference under the distance difference function as the effective scaling distance difference ΔLe; determining the scaling rate η of the target boundary function according to the average specified distance La and the effective scaling distance difference ΔLe, wherein η = La/(La − sgn(ΔLe)·|ΔLe|), and sgn(ΔLe) represents a sign extraction function that extracts the positive or negative sign of the effective scaling distance difference ΔLe; and scaling the target boundary function according to the corresponding scaling rate to form the processing boundary function.
Using the distance difference as the basic data for determining the scaling rate ensures scaling accuracy. Of course, considering the individual shape differences of the meat quality of marine products of the same type, the distance difference is not constant; taking the average distance as the reference quantity yields reasonable scaling rate data, meets the data-averaging requirement of big data processing, and retains sufficient characteristic data. Here, the boundary function is preferably a function in an angular (polar) coordinate system, but may also be a function in a planar coordinate system. A function in an angular coordinate system ensures that the acquired distance difference data is the actual distance difference between corresponding reference points; a function in a planar coordinate system can be processed equivalently, because although the distance difference acquired there is measured along rays through the origin rather than as the true point-to-point distance, the deviation between the two is small, the boundary feature information is fully retained, and no large change of the scaling rate is caused. Of course, this slight influence can also be reduced by adjusting the angle step to improve accuracy.
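Under one plausible reading of the scaling formula above (the exact symbolic form is not recoverable from the source, so the expression η = La/(La − ΔLe) used here is an assumption; note that sgn(x)·|x| = x, so the sign-extraction form reduces to it), the scaling rate can be computed as:

```python
def scaling_rate(specified_radii, target_radii):
    """specified_radii / target_radii: distances from the shared coordinate
    origin to matching reference points on the designated boundary function
    and on one other target boundary function. Returns the rate eta used
    to scale the target boundary toward the designated one."""
    La = sum(specified_radii) / len(specified_radii)   # average specified distance
    diffs = [a - b for a, b in zip(specified_radii, target_radii)]
    dLe = sum(diffs) / len(diffs)                      # effective scaling distance difference
    # La - dLe is the average radius of the target boundary, so eta maps
    # the target boundary onto the size of the designated boundary.
    return La / (La - dLe)
```

For example, a target contour with half the average radius of the designated one yields η = 2, i.e. the target is enlarged to match before derivatives are compared.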
Taking each specified derivative in the designated derivative set D0 as the basis and the corresponding processing derivatives in each processing derivative set Dn as range adjustment amounts, determining the derivative adjustment range of each designated reference point comprises: acquiring the processing derivative dn,i corresponding in position in each processing derivative set Dn to form an initial reference point processing derivative set; setting a derivative deviation threshold β, and carrying out the following screening judgment on each processing derivative in the initial reference point processing derivative set: if |dn,i − d0,i| ≤ β, the processing derivative is retained, otherwise it is removed; collecting the retained processing derivatives to form an effective reference point processing derivative set; acquiring the minimum processing derivative and the maximum processing derivative in the effective reference point processing derivative set to form an initial adjustment range; and, according to the initial adjustment range and the corresponding specified derivative, carrying out the following judgment to form the derivative adjustment range: if the specified derivative belongs to the corresponding initial adjustment range, determining the initial adjustment range as the derivative adjustment range; if the specified derivative does not belong to the corresponding initial adjustment range, expanding the initial adjustment range with the specified derivative as a boundary value, thereby forming the derivative adjustment range.
The derivative adjustment range is determined by taking the minimum and maximum values of the derivatives formed by the boundary functions as the range boundaries; however, since the designated boundary function is the boundary function with the highest degree of coincidence and thus carries the most representative boundary feature information, the inclusion relation of the specified derivative also needs to be judged, which further reduces the possibility that the boundary functions determine a deviated range and improves the accuracy and rationality of the obtained derivative adjustment range.
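The threshold-screen-then-expand procedure above maps directly to a short function; the (low, high) tuple output and the fallback when every processing derivative is rejected (a degenerate case the text does not cover) are assumptions:

```python
def derivative_adjust_range(specified_d, processing_ds, beta):
    """specified_d: derivative of the designated boundary function at one
    reference point; processing_ds: derivatives of the scaled (processing)
    boundary functions at the same point; beta: derivative deviation
    threshold. Returns the derivative adjustment range (low, high)."""
    # Screening: keep only processing derivatives within beta of the
    # specified derivative; outliers are removed.
    valid = [d for d in processing_ds if abs(d - specified_d) <= beta]
    if not valid:
        # Degenerate case (assumption): fall back to the specified derivative.
        return (specified_d, specified_d)
    low, high = min(valid), max(valid)
    # Expansion: the range must always contain the specified derivative.
    return (min(low, specified_d), max(high, specified_d))
```

When the specified derivative already lies inside the min/max span of the retained derivatives, the initial range is returned unchanged; otherwise it becomes a new boundary value.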
Performing extraction analysis based on meat quality glossiness information on the different-direction classified image data sets to form a target meat quality comparison database comprises: for each direction-classified image data set, extracting the gray data of each image within the corresponding target boundary, wherein the gray data corresponding to each image contains the position information of each gray value under the corresponding boundary coordinate system; setting a diameter step r, and for the gray information within the designated boundary function f0, taking the origin of the corresponding boundary coordinate system as the circle center, sequentially acquiring, at the diameter step r, the gray information within each newly acquired range, and determining the designated average gray value of the gray information within each range to form a gray information set of designated ranges G0 = [g0,1, g0,2, …, g0,i], wherein i is the number of regions acquired within the designated boundary function f0, and g0,i represents the designated average gray value corresponding to the region numbered i; for the other target boundary functions, taking the origin of the corresponding boundary coordinate system as the circle center, sequentially acquiring, at the scaled step η·r, the gray information within each newly acquired range, and determining the scaled average gray value of the gray information within each range to form gray information sets of scaled ranges Gn = [gn,1, gn,2, …, gn,i], wherein gn,i represents the scaled average gray value corresponding to the region numbered i determined within the target boundary function numbered n; taking the designated average gray value of each region in the gray information set of designated ranges G0 as the basis and the scaled average gray values of each region in each gray information set of scaled ranges Gn as adjustment amounts, determining the region gray range in the following manner: if the designated average gray value belongs to the range defined by all the scaled average gray values in the regions with the same number, determining the range defined by all the scaled average gray values as the gray adjustment range of the corresponding region; if the designated average gray value does not belong to the range defined by all the scaled average gray values in the region with the same number, expanding the range defined by all the scaled average gray values with the designated average gray value as a boundary value, thereby forming the gray adjustment range; and collecting the gray adjustment ranges corresponding to all regions to form the target meat quality comparison database.
Under the specific light setting, the illumination in each image exhibits the same regularity, especially in terms of glossiness. To improve analysis efficiency and simplify the analysis data, the invention reduces the glossiness information by gray-scale processing of the images while fully retaining the feature information of glossiness variation, so the glossiness condition of the seafood meat can be determined by numerical analysis of the gray data, forming a comparison database capable of distinguishing seafood meat quality. Here, since the coordinate system and the scaling are utilized when establishing the comparison database of boundary functions, two points are considered when establishing the meat quality comparison database. On the one hand, meat of different sizes also exhibits glossiness differences at different positions, so the range of each acquired region needs to be adaptively adjusted based on the scaling rate, giving the acquired gray data more reasonable comparability and accuracy. On the other hand, for meat of different sizes the gray data is adjusted and compressed through scaling to further ensure the rationality and comparability of the data; therefore, taking the origin of the coordinate system referenced by the scaling as the starting position of the divided regions ensures that the scaling does not affect the region division. Of course, the light needs to be set so that it corresponds to the position of the coordinate origin as far as possible; dividing the regions by circles then ensures that the variation of glossiness within each region is not too strong, effectively guaranteeing the rationality and accuracy of the data.
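The per-region gray range construction follows the same contain-or-expand pattern as the derivative ranges; this sketch assumes G0 and the Gn sets are plain Python lists with matching region order:

```python
def gray_adjust_range(specified_mean, scaled_means):
    """specified_mean: designated average gray value g0,i of one region
    within the designated boundary function; scaled_means: scaled average
    gray values gn,i of the same-numbered region from the other boundary
    functions. Returns the gray adjustment range (low, high)."""
    low, high = min(scaled_means), max(scaled_means)
    if low <= specified_mean <= high:
        return (low, high)
    # Expand the range with the designated value as a new boundary value.
    return (min(low, specified_mean), max(high, specified_mean))

def build_gray_database(specified_set, scaled_sets):
    """specified_set: G0, the list of designated region means;
    scaled_sets: the Gn lists, one per other boundary function, with the
    same region order. Returns one gray adjustment range per region."""
    return [gray_adjust_range(g0, [gn[i] for gn in scaled_sets])
            for i, g0 in enumerate(specified_set)]
```

The resulting list of per-region (low, high) ranges is exactly what the real-time average gray values are later checked against during quality classification.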
S2: and acquiring a plurality of real-time image data of the target marine products, carrying out boundary information comparison by combining the target boundary comparison database, and classifying and screening the target marine products according to the comparison result data to form different screening classes.
Acquiring a plurality of real-time image data of the target marine product, performing boundary information comparison in combination with the target boundary comparison database, and classifying and screening the target marine product according to the comparison result data to form different screening classes comprises the following steps: acquiring a plurality of real-time image data of the target marine product in a first motion direction and extracting real-time boundary functions to form a plurality of first-direction real-time boundary functions, wherein each first-direction real-time boundary function comprises a first real-time coordinate system taking the center of the real-time image as the coordinate origin; determining, in the target boundary comparison database, the derivative adjustment ranges whose image acquisition direction is the same as that of the first-direction real-time boundary functions; for each first-direction real-time boundary function, acquiring the real-time derivative at each real-time reference point that coincides with the reference point corresponding to a derivative adjustment range; and carrying out the following classification screening on the real-time derivatives corresponding to the derivative adjustment ranges on the plurality of first-direction real-time boundary functions: if the real-time derivatives on all the first-direction real-time boundary functions belong to the corresponding derivative adjustment ranges, marking the analyzed target marine product as the first-direction normal screening class; if any real-time derivative on a first-direction real-time boundary function does not belong to the corresponding derivative adjustment range and is smaller than the lower boundary value of that range, marking the analyzed target marine product as the first-direction damage screening class; if any real-time derivative on a first-direction real-time boundary function does not belong to the corresponding derivative adjustment range and is larger than the upper boundary value of that range, marking the analyzed target marine product as the first-direction proliferation screening class. Acquiring a plurality of real-time image data in a second motion direction for the target marine products determined to be the first-direction normal screening class, and extracting real-time boundary functions to form a plurality of second-direction real-time boundary functions, wherein each second-direction real-time boundary function comprises a second real-time coordinate system taking the center of the real-time image as the coordinate origin; determining, in the target boundary comparison database, the derivative adjustment ranges whose image acquisition direction is the same as that of the second-direction real-time boundary functions; for each second-direction real-time boundary function, acquiring the real-time derivative at each real-time reference point that coincides with the reference point corresponding to a derivative adjustment range; and carrying out the following classification screening on the real-time derivatives corresponding to the derivative adjustment ranges on the plurality of second-direction real-time boundary functions: if the real-time derivatives on all the second-direction real-time boundary functions belong to the corresponding derivative adjustment ranges, marking the analyzed target marine product as the second-direction normal screening class; if any real-time derivative on a second-direction real-time boundary function does not belong to the corresponding derivative adjustment range and is smaller than the lower boundary value of that range, marking the analyzed target marine product as the second-direction damage screening class; and if any real-time derivative on a second-direction real-time boundary function does not belong to the corresponding derivative adjustment range and is larger than the upper boundary value of that range, marking the analyzed target marine product as the second-direction proliferation screening class.
does not belong to the corresponding derivative adjustment range and the real-time derivative is larger than a large boundary value of the corresponding derivative adjustment range, marking the analyzed target marine products as second direction proliferation screening types.
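The three-way screening for one acquisition direction can be sketched in a few lines of code. The following is a minimal, hypothetical Python illustration (the function and variable names are illustrative, not from the patent): each real-time derivative is checked against the derivative adjustment range of its reference point, a value below the lower boundary marks damage, and a value above the upper boundary marks proliferation.

```python
def screen_direction(real_time_derivs, adjust_ranges):
    """Classify one acquisition direction of a target marine product.

    real_time_derivs -- real-time derivatives, one per reference point
    adjust_ranges    -- (lower, upper) derivative adjustment range per point
    Returns "normal", "damage", or "proliferation".
    """
    for d, (lower, upper) in zip(real_time_derivs, adjust_ranges):
        if d < lower:
            return "damage"         # boundary recessed below the allowed range
        if d > upper:
            return "proliferation"  # boundary bulging above the allowed range
    return "normal"                 # every derivative inside its range
```

Products marked "normal" in the first direction would then be screened again with the second-direction ranges, exactly as the text above describes.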
After the comparison database is established and the real-time image data are acquired, the acquisition direction of the real-time image data must first be determined so that the corresponding derivative adjustment ranges can be extracted from the comparison database; otherwise the data comparison would be invalid. It will be appreciated that the interval defined by a derivative adjustment range is essentially the range of reasonable variation of the rate of change of the target seafood boundary, so for any real-time derivative not within this range it can be concluded that a non-negligible change of the boundary feature has occurred at the corresponding location. In practice such a judgment usually indicates either adhered dirt or a sunken damaged area, and the same change generally appears at several adjacent reference points, so a further, more precise judgment can be made by checking the continuity of adjacent reference points to avoid deviations in the data judgment. It is worth noting that this judgment also depends on the chosen angle step: if the angle step is relatively large, the changed position may be skipped over, which is why the invention also treats an abnormal change at even a single reference point as the criterion for analysis and judgment. Products in the damage screening class can be routed to other subsequent processes, while products in the proliferation (unclean) screening class can be returned to the same cleaning process or judged manually. The invention performs detection and classification screening through the analysis and judgment of image data in two directions; of course, the number of directions to be detected can be set for different marine products according to their shape characteristics, and image data can then be acquired in those directions for screening and judgment.
Essentially, the analysis and screening judgment in each direction determines whether the product is insufficiently cleaned or broken. It can be understood that this data analysis and judgment based on boundary features also provides a data basis for subsequently classifying and screening marine products by size, realizing efficient automatic classification and screening from raw material to finished product.
S3: determining a target screening class according to the different screening classes, acquiring real-time meat quality glossiness information of the target screening class, and carrying out meat quality classification analysis in combination with a target meat quality comparison database to form meat quality classification result data.
Determining a target screening class according to the different screening classes, acquiring real-time meat quality glossiness information of the target screening class, and carrying out meat quality classification analysis in combination with the target meat quality comparison database to form meat quality classification result data comprises the following steps: determining the second-direction normal screening class as the target screening class; acquiring gray information of the target screening class within the boundary ranges in the different motion directions, and determining the real-time average gray value of each corresponding region range on the basis of the specified-range gray information set; and combining the real-time average gray values with the target meat quality comparison database to judge the meat quality classification as follows: acquiring the gray adjustment ranges corresponding to the motion direction from the target meat quality comparison database; if the real-time average gray value of every corresponding region lies within its corresponding gray adjustment range, marking the corresponding target marine product as the high-quality screening class; and if the real-time average gray value of any corresponding region lies outside its corresponding gray adjustment range, marking the corresponding target marine product as the general-quality screening class.
The real-time average gray values are obtained from the gray data of the real-time image data in the same way as the specified-range gray information set; of course, where scaling exists, the step length of the region range is adjusted according to the scaling rate to achieve a reasonable division, so that each real-time average gray value can be compared and analyzed in one-to-one correspondence with the gray adjustment ranges of the database. Further, since the evaluation of the meat quality is considered as a whole, the seafood meat quality is determined to be high quality only if the gray values of all regions lie within their gray adjustment ranges.
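As a minimal sketch of the whole-region gray judgment just described (names are illustrative, not from the patent): the product is marked high quality only if every region's real-time average gray value falls inside that region's gray adjustment range.

```python
def classify_meat_quality(real_time_means, gray_ranges):
    """real_time_means -- real-time average gray value per region
    gray_ranges      -- (lower, upper) gray adjustment range per region
    Returns "high quality" only if every region lies inside its range."""
    all_inside = all(lower <= g <= upper
                     for g, (lower, upper) in zip(real_time_means, gray_ranges))
    return "high quality" if all_inside else "general quality"
```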
The invention also provides a marine product detection and classification system based on separated and extracted image data, which adopts the above marine product detection and classification method based on separated and extracted image data. The system comprises an image data acquisition unit for acquiring image data of a target marine product in different directions, and an analysis processing unit for acquiring the image data of the image data acquisition unit, establishing the target boundary comparison database and the target meat quality comparison database, carrying out comparison analysis on the real-time image data to form the different screening classes, and carrying out comparison analysis, based on the target meat quality comparison database, on the image data after the screening class analysis to form the meat quality classification result data.
The system fully acquires the image data of the target marine products through the image data acquisition unit, and provides reasonable and sufficient basic data for the subsequent analysis of detection classification. The analysis processing unit is used for realizing reasonable and efficient analysis of the meat quality state based on the image data and accurate analysis of the meat quality, forming accurate marine product meat quality detection classification screening results, and further providing real-time and accurate analysis result data for effectively and accurately completing the whole detection classification screening process of marine product meat quality.
In summary, the marine product detection and classification device and method based on the separated and extracted image data provided by the embodiment of the invention have the beneficial effects that:
the method establishes a boundary feature database for comparison by acquiring historical classified image data of target marine products to be classified, detected and screened, and extracting boundary features based on the classified image data, and establishes a meat quality comparison database for comparison based on light gray scale features exhibited by meat quality in the image data under specific light. Meanwhile, real-time image data of meat quality is obtained, comparison analysis of boundary characteristics is carried out, automatic classification detection on whether damage or hyperplasia exists in the meat quality of the target marine product is achieved, and marine products with normal meat quality are compared in meat quality state according to detection results, namely, gray data are compared and analyzed by utilizing a meat quality comparison database, and classification processing analysis of different meat quality is further completed. The whole screening process of detection classification realizes automatic classification detection screening of the meat quality of the target marine products on the basis of using an image processing method, and has the advantages of high screening accuracy and high efficiency, and greatly improves the processing efficiency of the marine products.
The system fully acquires the image data of the target marine products through the image data acquisition unit, and provides reasonable and sufficient basic data for the subsequent analysis of detection classification. The analysis processing unit is used for realizing reasonable and efficient analysis of the meat quality state based on the image data and accurate analysis of the meat quality, forming accurate marine product meat quality detection classification screening results, and further providing real-time and accurate analysis result data for effectively and accurately completing the whole detection classification screening process of marine product meat quality.
In the present invention, "at least one" means one or more, and "a plurality" means two or more. "at least one of" or the like means any combination of these items, including any combination of single item(s) or plural items(s). For example, at least one (one) of a, b, or c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or plural.
It should be understood that, in various embodiments of the present invention, the sequence numbers of the foregoing processes do not mean the order of execution, and the order of execution of the processes should be determined by the functions and internal logic thereof, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
Those of ordinary skill in the art will appreciate that the elements and method steps of the examples described in connection with the embodiments disclosed herein can be implemented as electronic hardware, or as a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided by the present invention, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the elements is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The foregoing examples illustrate only a few embodiments of the invention, which are described in detail and are not to be construed as limiting the scope of the invention. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the invention, which are all within the scope of the invention. Accordingly, the scope of protection of the present invention is to be determined by the appended claims.
Claims (10)
1. The marine product detection and classification method based on the separated and extracted image data is characterized by comprising the following steps:
acquiring classified image data of a target marine product, respectively extracting target boundary information and meat quality glossiness information, and establishing a target boundary comparison database and a target meat quality comparison database;
acquiring a plurality of real-time image data of the target marine products, carrying out boundary information comparison by combining the target boundary comparison database, and classifying and screening the target marine products according to comparison result data to form different screening classes;
And determining target screening classes according to different screening classes, acquiring real-time meat quality glossiness information of the target screening classes, and performing meat quality classification analysis by combining the target meat quality comparison database to form meat quality classification result data.
2. The marine product detection and classification method based on the separated and extracted image data according to claim 1, wherein the obtaining of the classified image data of the target marine product, the extracting of the target boundary information and the meat quality glossiness information, respectively, creates a target boundary comparison database and a target meat quality comparison database, comprises:
acquiring classified image data of the target marine products, and dividing the classified image data in the image acquisition direction to form classified image data sets in different directions;
extracting and analyzing different direction classification image data sets based on target boundary information to form a target boundary comparison database;
and carrying out extraction analysis based on meat quality glossiness information on different direction classification image data sets to form the target meat quality comparison database.
3. The marine product detection classification method based on the separated and extracted image data according to claim 2, wherein the performing the extraction analysis based on the target boundary information on the different direction classification image data sets to form the target boundary comparison database includes:
classifying the image data sets for each direction, and extracting the boundary information in each image to form a target boundary function set F, wherein each target boundary function in the set F comprises a boundary coordinate system taking the center of the image as the origin of coordinates;
carrying out overlap comparison of the target boundary functions in the set F on the basis of the boundary coordinate system, and determining the target boundary function with the highest degree of coincidence as the specified boundary function f*;
setting an angle step α, and taking the boundary coordinate system corresponding to the specified boundary function f* as the reference coordinate system for point collection;
sequentially forming rays from the origin of coordinates, with the origin as the reference and the angle step α as the angular interval for acquiring the designated reference points, and determining the points at which the rays intersect the specified boundary function f* as the designated reference points;
obtaining the function derivative corresponding to each designated reference point to form a specified derivative set D* = [d*_1, d*_2, …, d*_k], wherein k represents the number of designated reference points acquired;
for each target boundary function in the set F other than the specified boundary function f*, forming a processing boundary function on the basis of the specified boundary function f*;
determining, under the corresponding boundary coordinate system, the processing reference points of each processing boundary function with the same point-collection parameters as the specified boundary function f*, and determining the processing derivative corresponding to each processing reference point to form a corresponding processing derivative set D_n = [d_n,1, d_n,2, …, d_n,k], wherein n represents the numbering of the target boundary functions other than the specified boundary function;
placing each designated reference point in one-to-one correspondence with the processing derivatives in each processing boundary function according to their positions on the boundary coordinate system, and determining the derivative adjustment range of each designated reference point by taking each specified derivative in the specified derivative set D* as the basis and the corresponding processing derivative in each processing derivative set D_n as the range adjustment quantity;
and establishing the target boundary comparison database by combining the derivative adjustment ranges of all the designated reference points corresponding to the classified image data sets in different directions.
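Assuming the boundary is represented in polar form r(θ) about the image center, the ray-sampling of designated reference points and the specified derivative set of claim 3 might be sketched as follows (a hypothetical illustration; the patent does not give an implementation, and the finite-difference approximation is an assumption):

```python
import math

def specified_derivative_set(boundary_r, alpha_deg):
    """Sample the boundary at rays spaced by the angle step alpha (degrees)
    and return the derivative dr/dtheta at each designated reference point,
    approximated by a central finite difference."""
    k = int(round(360 / alpha_deg))   # number of designated reference points
    h = 1e-5                          # finite-difference half-width (radians)
    derivs = []
    for j in range(k):
        theta = math.radians(j * alpha_deg)
        derivs.append((boundary_r(theta + h) - boundary_r(theta - h)) / (2 * h))
    return derivs
```

For a perfectly circular boundary the derivative is zero at every reference point, so any consistent nonzero pattern of derivatives already encodes the shape of the product.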
4. The marine product detection and classification method based on separated and extracted image data as claimed in claim 3, wherein, for each target boundary function in the set F other than the specified boundary function f*, forming a processing boundary function on the basis of the specified boundary function f* comprises:
determining the average specified distance r*_avg of the specified boundary function f* relative to the origin of the boundary coordinate system;
selecting, one at a time, each single target boundary function other than the specified boundary function f*;
making the boundary coordinate system corresponding to the specified boundary function f* coincide with the coordinate system corresponding to the selected target boundary function, and determining the distance difference function Δ_n(θ) = r*(θ) − r_n(θ) of the two functions when the coordinate systems coincide, wherein r*(θ) represents the distance of the coordinate point on the specified boundary function from the origin of the coordinate system and r_n(θ) represents the distance of the coordinate point on the selected single target boundary function from the origin of the coordinate system;
taking the average distance difference under the distance difference function as the effective scaling distance difference δ_n;
determining the scaling rate s_n of the target boundary function according to the average specified distance r*_avg and the effective scaling distance difference δ_n, wherein s_n = r*_avg / (r*_avg − sgn(δ_n)·|δ_n|) and sgn(·) represents a sign extraction function for extracting the positive or negative sign of δ_n;
and scaling the target boundary function according to the corresponding scaling rate to form the processing boundary function.
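Under the reading that the effective scaling distance difference is the mean of r*(θ) − r_n(θ) and that the scaling rate restores the target boundary to the specified boundary's mean radius, claim 4 might reduce to the sketch below. The names are hypothetical and the formula is a reconstruction, since the original expression is a formula image.

```python
def scaling_rate(spec_radii, target_radii):
    """spec_radii / target_radii -- radii sampled at the same angles on the
    specified and the selected target boundary function, with their
    boundary coordinate systems made coincident."""
    n = len(spec_radii)
    r_avg = sum(spec_radii) / n                        # average specified distance
    delta = sum(s - t for s, t in zip(spec_radii, target_radii)) / n
    return r_avg / (r_avg - delta)                     # scaling rate s_n
```

Scaling every radius of the selected target boundary by s_n then brings its mean radius up (or down) to the specified boundary's mean radius, which is what makes the later point-by-point derivative comparison meaningful.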
5. The marine product detection and classification method based on separated and extracted image data according to claim 4, wherein determining the derivative adjustment range of each designated reference point by taking each specified derivative in the specified derivative set D* as the basis and the corresponding processing derivative in each processing derivative set D_n as the range adjustment quantity comprises:
obtaining, from each processing derivative set D_n, the processing derivative corresponding to the same reference-point position to form an initial reference point processing derivative set;
setting a derivative deviation threshold β, and carrying out the following screening judgment on each processing derivative in the initial reference point processing derivative set:
if the absolute difference between the processing derivative and the corresponding specified derivative does not exceed β, the processing derivative is retained; otherwise, it is removed;
acquiring the retained processing derivatives to form an effective reference point processing derivative set;
acquiring the minimum processing derivative in the effective reference point processing derivative set and the maximum processing derivative in the effective reference point processing derivative set to form an initial adjustment range;
and according to the initial adjustment range and the corresponding specified derivative, performing the following judgment to form the derivative adjustment range:
if the specified derivative belongs to the corresponding initial adjustment range, determining the initial adjustment range as the derivative adjustment range;
And if the specified derivative does not belong to the corresponding initial adjustment range, expanding the initial adjustment range by taking the specified derivative as a boundary value to form the derivative adjustment range.
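Claim 5's construction of a derivative adjustment range for one reference point might be sketched as follows (illustrative names only): processing derivatives farther than the deviation threshold β from the specified derivative are discarded, the minimum and maximum of the survivors form the initial range, and the range is widened to include the specified derivative if it falls outside.

```python
def derivative_adjust_range(spec_d, proc_ds, beta):
    """spec_d  -- specified derivative at this reference point
    proc_ds -- processing derivatives at the same point, one per
               processing boundary function
    beta    -- derivative deviation threshold"""
    valid = [d for d in proc_ds if abs(d - spec_d) <= beta]
    if not valid:                            # degenerate case: keep spec_d alone
        valid = [spec_d]
    lower, upper = min(valid), max(valid)    # initial adjustment range
    # expand with the specified derivative as a boundary value if it is outside
    return min(lower, spec_d), max(upper, spec_d)
```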
6. The marine product detection and classification method based on separated and extracted image data according to claim 5, wherein carrying out the extraction analysis based on meat quality glossiness information on the different direction classification image data sets to form the target meat quality comparison database comprises:
classifying an image data set for each direction, and extracting gray data of each image in the corresponding target boundary, wherein the gray data corresponding to each image comprises position information of each gray value under the corresponding boundary coordinate system;
setting a diameter step ρ, and, for the gray information within the specified boundary function f*, sequentially acquiring the gray information within each newly acquired range according to the diameter step ρ with the origin of the corresponding boundary coordinate system as the center, and determining the specified average gray value of the gray information within each range to form a specified-range gray information set G* = [g*_1, g*_2, …, g*_i], wherein i is the number of regions acquired within the specified boundary function f* and g*_i represents the specified average gray value corresponding to the region numbered i;
for each of the other target boundary functions, sequentially acquiring the gray information within each newly acquired range according to the scaled step length s_n·ρ with the origin of the corresponding boundary coordinate system as the center, and determining the scaled average gray value of the gray information within each range to form a scaling-range gray information set G_n = [g_n,1, g_n,2, …, g_n,i], wherein g_n,i represents the scaled average gray value corresponding to the region numbered i determined within the target boundary function numbered n;
taking the specified average gray value of each region in the specified-range gray information set G* as the basis and the scaled average gray value of each region in each scaling-range gray information set G_n as the adjustment quantity, carrying out the following region gray range determination:
if the appointed average gray value belongs to the range limited by all the scaled average gray values in the areas with the same numbers, determining the range limited by all the scaled average gray values as the gray adjustment range of the corresponding area;
if the appointed average gray value does not belong to the range limited by all the scaled average gray values in the area with the same number, expanding the range limited by all the scaled average gray values by taking the appointed average gray value as a boundary value to form the gray adjustment range;
And collecting the gray scale adjustment ranges corresponding to all the areas to form the target meat quality comparison database.
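A minimal sketch of the concentric-region averaging in claim 6, under the hypothetical representation that pixels are (x, y, gray) triples in the boundary coordinate system and that the "diameter step" is the width of each ring:

```python
import math

def annular_gray_means(pixels, step):
    """Average the gray values over concentric rings of width `step`
    centred on the boundary coordinate origin; returns one average
    per ring, numbered outward from the origin."""
    sums, counts = {}, {}
    for x, y, gray in pixels:
        ring = int(math.hypot(x, y) // step)    # region number i
        sums[ring] = sums.get(ring, 0.0) + gray
        counts[ring] = counts.get(ring, 0) + 1
    return [sums[i] / counts[i] for i in sorted(sums)]
```

For the other target boundary functions the same routine would be called with the scaled step so that regions correspond one-to-one, as the claim describes.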
7. The marine product detection and classification method based on separated and extracted image data according to claim 6, wherein the acquiring of the plurality of real-time image data of the target marine product, the comparing of the boundary information in combination with the target boundary comparison database, and the classifying and screening of the target marine product according to the comparison result data, forms different screening classes, comprises:
acquiring a plurality of real-time image data of the target marine product in a first motion direction, extracting real-time boundary functions, and forming a plurality of first-direction real-time boundary functions, wherein each first-direction real-time boundary function comprises a first real-time coordinate system taking the center of a real-time image as a coordinate origin;
determining the derivative adjustment range which is the same as the image acquisition direction of the first-direction real-time boundary function in the target boundary comparison database;
for each first-direction real-time boundary function, acquiring a real-time derivative of a real-time reference point which is the same as the reference point corresponding to the derivative adjustment range;
And carrying out the following classification screening on the real-time derivatives corresponding to the derivative adjustment range on the plurality of the first-direction real-time boundary functions:
if all the real-time derivatives on the first-direction real-time boundary function belong to the corresponding derivative adjustment range, marking the analyzed target marine products as a first-direction normal screening class;
if any real-time derivative on the first-direction real-time boundary function does not belong to the corresponding derivative adjustment range and the real-time derivative is smaller than a small boundary value of the corresponding derivative adjustment range, marking the analyzed target marine product as a first-direction damage screening class;
and if any real-time derivative on the first-direction real-time boundary function does not belong to the corresponding derivative adjustment range and the real-time derivative is larger than a large boundary value of the corresponding derivative adjustment range, marking the analyzed target marine product as a first-direction proliferation screening class.
8. The marine product detection and classification method based on separated and extracted image data according to claim 7, wherein the acquiring of the plurality of real-time image data of the target marine product, the comparing of the boundary information in combination with the target boundary comparison database, and the classifying and screening of the target marine product according to the comparison result data, forms different screening classes, comprises:
acquiring a plurality of real-time image data in a second motion direction for the target marine products determined to be the first-direction normal screening class, and extracting real-time boundary functions to form a plurality of second-direction real-time boundary functions, wherein each second-direction real-time boundary function comprises a second real-time coordinate system taking the center of the real-time image as the coordinate origin;
determining the derivative adjustment range which is the same as the image acquisition direction of the second-direction real-time boundary function in the target boundary comparison database;
acquiring the real-time derivative of the real-time reference point which is the same as the reference point corresponding to the derivative adjustment range for each second direction real-time boundary function;
and carrying out the following classification screening on the real-time derivatives corresponding to the derivative adjustment range on the plurality of second-direction real-time boundary functions:
if all the real-time derivatives on the second-direction real-time boundary function belong to the corresponding derivative adjustment range, marking the analyzed target marine products as second-direction normal screening classes;
if any real-time derivative on the second-direction real-time boundary function does not belong to the corresponding derivative adjustment range and the real-time derivative is smaller than a small boundary value of the corresponding derivative adjustment range, marking the analyzed target marine product as a second-direction damage screening class;
And if any real-time derivative on the second-direction real-time boundary function does not belong to the corresponding derivative adjustment range and the real-time derivative is larger than a large boundary value of the corresponding derivative adjustment range, marking the analyzed target marine product as a second-direction proliferation screening class.
9. The marine product detection and classification method based on separated and extracted image data according to claim 8, wherein the determining a target screening class according to different screening classes, and acquiring real-time meat quality glossiness information of the target screening class, and performing meat quality classification analysis in combination with the target meat quality comparison database, to form the meat quality classification result data, comprises:
determining the second direction normal screening class as the target screening class;
acquiring gray information of the target screening class within the boundary ranges in different motion directions, and determining a real-time average gray value of the corresponding region range based on the gray information set of the designated range;
combining the real-time average gray value with the target meat quality comparison database, and judging the meat quality classification as follows:
acquiring the gray scale adjustment range of the corresponding motion direction from the target meat quality comparison database;
if the real-time average gray value of the corresponding region belongs to the corresponding gray adjustment range, marking the corresponding target marine product as a high-quality screening class;
and if the real-time average gray value of the corresponding region does not belong to the corresponding gray adjustment range, marking the corresponding target marine product as a general quality screening class.
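The gray-value judgment of claim 9 can be sketched the same way; averaging the region's gray values and expressing the adjustment range as a `(lower, upper)` pair are assumptions drawn from the claim wording, not the patent's exact computation:

```python
def classify_meat_quality(region_gray_values, gray_adjustment_range):
    """Judge meat quality from the real-time average gray value of a region.

    region_gray_values: gray values sampled within the boundary range of
    the target screening class in one motion direction.
    gray_adjustment_range: (lower, upper) gray values taken from the
    target meat quality comparison database for that direction.
    """
    real_time_average = sum(region_gray_values) / len(region_gray_values)
    lower, upper = gray_adjustment_range
    if lower <= real_time_average <= upper:
        return "high-quality screening class"
    return "general quality screening class"
```

For example, with a hypothetical adjustment range of (110, 140), gray values averaging 125 would be marked high-quality, while an average of 92.5 would fall to the general quality class.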
10. A marine product detection and classification system based on separated and extracted image data, applying the marine product detection and classification method based on separated and extracted image data according to any one of claims 1 to 9, characterized by comprising:
the image data acquisition unit is used for acquiring image data of the target marine products in different directions;
and the analysis processing unit is used for acquiring the image data from the image data acquisition unit, establishing a target boundary comparison database and a target meat quality comparison database, performing comparison analysis on the real-time image data to form different screening classes, and performing comparison analysis on the image data after screening class analysis based on the target meat quality comparison database to form the meat quality classification result data.
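The two units of claim 10 can be sketched as a minimal pipeline. The class names, the dict-based image store, and the stub comparison below are all hypothetical scaffolding, not the patent's implementation:

```python
class ImageDataAcquisitionUnit:
    """Acquires image data of the target marine product in different directions."""

    def __init__(self):
        self.frames = {}  # acquisition direction -> image data

    def capture(self, direction, image_data):
        self.frames[direction] = image_data


class AnalysisProcessingUnit:
    """Holds the comparison databases and screens the acquired image data."""

    def __init__(self, boundary_db, meat_quality_db):
        self.boundary_db = boundary_db          # target boundary comparison database
        self.meat_quality_db = meat_quality_db  # target meat quality comparison database

    def screen(self, acquisition_unit):
        # Stub comparison: a direction can be screened only if the boundary
        # database holds reference data for it (stand-in for the real
        # boundary-function and derivative analysis).
        return {direction: ("screened" if direction in self.boundary_db
                            else "unscreened")
                for direction in acquisition_unit.frames}
```

In use, the acquisition unit captures per-direction frames and the processing unit reports which directions have reference data to screen against; the real analysis steps of claims 8 and 9 would replace the stub in `screen`.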
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311293437.4A CN117173490B (en) | 2023-10-09 | 2023-10-09 | Marine product detection classification method and system based on separated and extracted image data |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311293437.4A CN117173490B (en) | 2023-10-09 | 2023-10-09 | Marine product detection classification method and system based on separated and extracted image data |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117173490A true CN117173490A (en) | 2023-12-05 |
CN117173490B CN117173490B (en) | 2024-07-23 |
Family
ID=88937437
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311293437.4A Active CN117173490B (en) | 2023-10-09 | 2023-10-09 | Marine product detection classification method and system based on separated and extracted image data |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117173490B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118521530A (en) * | 2024-04-19 | 2024-08-20 | 苏州科技大学 | Terahertz safety detection method and system based on image processing |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111259926A (en) * | 2020-01-08 | 2020-06-09 | 珠海格力电器股份有限公司 | Meat freshness detection method and device, computing equipment and storage medium |
CN113158824A (en) * | 2021-03-30 | 2021-07-23 | 自然资源部第三海洋研究所 | Underwater video fish identification method, system and storage medium |
JP2022021262A (en) * | 2020-07-21 | 2022-02-02 | Assest株式会社 | Program and system for fish quality discrimination |
CN114863198A (en) * | 2022-03-02 | 2022-08-05 | 湖北工业大学 | Crayfish quality grading method based on neural network |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111259926A (en) * | 2020-01-08 | 2020-06-09 | 珠海格力电器股份有限公司 | Meat freshness detection method and device, computing equipment and storage medium |
JP2022021262A (en) * | 2020-07-21 | 2022-02-02 | Assest株式会社 | Program and system for fish quality discrimination |
CN113158824A (en) * | 2021-03-30 | 2021-07-23 | 自然资源部第三海洋研究所 | Underwater video fish identification method, system and storage medium |
CN114863198A (en) * | 2022-03-02 | 2022-08-05 | 湖北工业大学 | Crayfish quality grading method based on neural network |
Non-Patent Citations (1)
Title |
---|
CHRISTELL FAITH D. LUMOGDANG ET AL.: "Supervised Machine Learning Approach for Pork Meat Freshness Identification", ICBRA '19, pages 1-6 *
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118521530A (en) * | 2024-04-19 | 2024-08-20 | 苏州科技大学 | Terahertz safety detection method and system based on image processing |
Also Published As
Publication number | Publication date |
---|---|
CN117173490B (en) | 2024-07-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Reddy et al. | Analysis of classification algorithms for plant leaf disease detection | |
CN106778902B (en) | Dairy cow individual identification method based on deep convolutional neural network | |
JP4921858B2 (en) | Image processing apparatus and image processing program | |
CN107437068B (en) | Pig individual identification method based on Gabor direction histogram and pig body hair mode | |
CN109190567A (en) | Abnormal cervical cells automatic testing method based on depth convolutional neural networks | |
Buayai et al. | End-to-end automatic berry counting for table grape thinning | |
CN110555875A (en) | Pupil radius detection method and device, computer equipment and storage medium | |
CN113313692B (en) | Automatic banana young plant identification and counting method based on aerial visible light image | |
CN109871900A (en) | The recognition positioning method of apple under a kind of complex background based on image procossing | |
CN117173490B (en) | Marine product detection classification method and system based on separated and extracted image data | |
CN108596176B (en) | Method and device for identifying diatom types of extracted diatom areas | |
CN114627116B (en) | Fabric defect identification method and system based on artificial intelligence | |
CN111178405A (en) | Similar object identification method fusing multiple neural networks | |
CN118115497B (en) | Quartz sand crushing and grinding detection method and device | |
CN109948577B (en) | Cloth identification method and device and storage medium | |
GB2399629A (en) | Automatic thresholding algorithm method and apparatus | |
Dannemiller et al. | A new method for the segmentation of algae images using retinex and support vector machine | |
CN108491888B (en) | Environmental monitoring hyperspectral data spectrum section selection method based on morphological analysis | |
CN116563276A (en) | Chemical fiber filament online defect detection method and detection system | |
CN110633720A (en) | Corn disease identification method | |
CN115705748A (en) | Facial feature recognition system | |
CN118095971B (en) | AD calcium milk beverage processing technology assessment method, system and medium | |
Nandy et al. | Supervised learning framework for screening nuclei in tissue sections | |
CN110619273A (en) | Efficient iris recognition method and device | |
CN116267226B (en) | Mulberry picking method and device based on intelligent machine vision recognition of maturity |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||