CN118097305A - Method and system for detecting quality of semiconductor light-emitting element - Google Patents
Method and system for detecting quality of semiconductor light-emitting element
- Publication number
- CN118097305A (application number CN202410451398.4A)
- Authority
- CN
- China
- Prior art keywords
- emitting diode
- image
- light emitting
- region
- led
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/194—Segmentation; Edge detection involving foreground-background segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/26—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Abstract
The invention provides a method and a system for detecting the quality of a semiconductor light-emitting element, and relates to the technical field of data processing. The method comprises the following steps: calculating a dynamic correlation factor from the identified light-emitting diode (LED) region and background region, the dynamic correlation factor being used to quantify the feature difference between the LED and the background; performing image segmentation on the preprocessed image according to the calculated dynamic correlation factor to extract the LED region and obtain independent element images; traversing the independent element images and extracting key features of the LED; comparing the key features against a preset quality standard, together with the dynamic correlation factor, to obtain a quality judgment result; and classifying LEDs with quality problems by quality-problem type according to the judgment result. The invention can accurately identify the LED region.
Description
Technical Field
The invention relates to the technical field of data processing, in particular to a method and a system for detecting the quality of a semiconductor light-emitting element.
Background
With the rapid development of semiconductor technology, Light-Emitting Diodes (LEDs), as an important semiconductor element, have been widely used in fields such as illumination, display, and signal indication. Because of these wide-ranging application scenarios, quality detection of LEDs is becoming increasingly important. Traditional quality detection methods rely mainly on manual visual inspection or simple photoelectric detection equipment; they are not only inefficient but also easily affected by human factors, so the accuracy and consistency of detection results cannot be guaranteed.
To address these problems, automated detection methods based on image processing and machine vision have attracted attention in recent years. Such methods acquire an image of the LED and analyze it with image processing algorithms, so that LED quality is detected automatically. However, when processing LED images with complex backgrounds and variable forms, existing automated detection methods often suffer from low recognition precision and poor anti-interference capability.
Disclosure of Invention
The invention aims to provide a method and a system for detecting the quality of a semiconductor light-emitting element, which can quantify the characteristic difference between a light-emitting diode and a background by calculating a dynamic correlation factor so as to more accurately identify the light-emitting diode area.
In order to solve the technical problems, the technical scheme of the invention is as follows:
in a first aspect, a method for detecting quality of a semiconductor light emitting element, the method comprising:
Acquiring an image of the light emitting diode, and preprocessing the image to obtain a preprocessed image;
in the preprocessed image, utilizing color and brightness information to preliminarily identify a light emitting diode region and a background region;
calculating a dynamic correlation factor according to the identified light emitting diode region and the background region, wherein the dynamic correlation factor is used for quantifying the characteristic difference between the light emitting diode and the background;
image segmentation is carried out on the preprocessed image according to the calculated dynamic correlation factor so as to extract the light-emitting diode region and obtain an independent element image;
Traversing the independent element images, and extracting key characteristics of the light emitting diode;
According to a preset quality standard and a dynamic correlation factor, the key features are respectively compared with the quality standard to obtain a quality judgment result;
And classifying the LEDs with quality problems according to different quality problem types according to the quality judgment result.
Further, calculating a dynamic correlation factor according to the identified LED area and the background area, including:
respectively extracting color features and brightness features corresponding to the light-emitting diode region and the background region according to the identified light-emitting diode region and the background region;
Calculating the characteristic difference between the color characteristic and the brightness characteristic of the light-emitting diode region and the background region;
And calculating a specific numerical value of the dynamic correlation factor according to the characteristic difference.
Further, according to the feature difference, a dynamic correlation factor is calculated, including:
calculating the dynamic correlation factor D as a weighted combination of the color-histogram difference, the gray-histogram difference and the degree of overlap between the light emitting diode region and the background region, wherein A_L and A_B represent the areas of the light emitting diode region and the background region, respectively, and the degree of overlap of the two regions is calculated from them; w_c, w_l and w_e are weight factors of the color feature difference, the brightness feature difference and the shape feature difference, respectively; H_L and H_B represent the color histograms of the light emitting diode region and the background region, n being the number of bins of the histogram; G_L and G_B represent the gray-level histograms of the light emitting diode region and the background region, m being the number of bins of the histogram; H_L(i) represents the number of pixels of the light emitting diode region in the i-th bin of the color histogram, and H_B(i) the number of pixels of the background region in the i-th bin of the color histogram; G_L(i) represents the number of pixels of the light emitting diode region in the i-th bin of the gray histogram, and G_B(i) the number of pixels of the background region in the i-th bin of the gray histogram; ∩ represents the intersection operation for computing elements common to both sets, A_L ∩ A_B representing the overlapping area of the light emitting diode region and the background region; ∪ represents the union operation for merging all elements of two sets while removing duplicate parts, A_L ∪ A_B representing the combined total area of the light emitting diode region and the background region; and i represents the bin index in the histogram.
Further, according to the calculated dynamic correlation factor, image segmentation is performed on the preprocessed image to extract the LED region and obtain an independent element image, including:
Identifying areas of the light emitting diode and the background in the image according to the dynamic correlation factor D;
Dividing the light-emitting diode region from the preprocessed image according to the light-emitting diode and the background region in the image;
Converting the light emitting diode region into a binary image;
traversing each pixel of the binary image, and identifying pixel groups connected together, wherein the pixel groups represent independent objects in the binary image, the independent objects being independent light emitting diode elements; in the identification process, all pixels belonging to the same pixel group are assigned the same label;
For each identified pixel group, a pixel group bounding box is calculated by acquiring the minimum and maximum lateral and longitudinal coordinates of all pixels within the pixel group to acquire an independent component image.
Further, key features include size, shape, brightness distribution, and color uniformity.
Further, traversing the independent component images to extract key features of the light emitting diode, including:
Calculating the size characteristics of the light emitting diode elements and calculating the shape factors according to the coordinates of the pixel group bounding boxes;
calculating a gray level histogram of the LED element image to obtain brightness distribution characteristics;
The uniformity of the gray level histogram of the LED element image is analyzed to extract the color uniformity characteristics.
Further, calculating the shape factor includes:
acquiring the total number of pixels in the outline of the light-emitting diode element and the pixel length of the outline of the light-emitting diode element according to the size characteristics of the light-emitting diode element, and calculating the circularity according to the total number of pixels and the pixel length;
Acquiring an external rectangle with the smallest area by rotating the outline of the light emitting diode element, and calculating the eccentricity according to the external rectangle with the smallest area;
the inherent moment is determined by calculating the spatial distribution of the light emitting diode elements.
In a second aspect, a system for detecting quality of a semiconductor light emitting element includes:
The acquisition module is used for acquiring the image of the light emitting diode and preprocessing the image to obtain a preprocessed image; in the preprocessed image, utilizing color and brightness information to preliminarily identify a light emitting diode region and a background region; and calculating a dynamic correlation factor according to the identified light emitting diode region and the background region, wherein the dynamic correlation factor is used for quantifying the characteristic difference between the light emitting diode and the background;
The processing module is used for carrying out image segmentation on the preprocessed image according to the calculated dynamic correlation factor so as to extract the light-emitting diode region and obtain an independent element image; traversing the independent element images, and extracting key characteristics of the light emitting diode; according to a preset quality standard and a dynamic correlation factor, the key features are respectively compared with the quality standard to obtain a quality judgment result; and classifying the LEDs with quality problems according to different quality problem types according to the quality judgment result.
In a third aspect, a computing device includes:
one or more processors;
And a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the above-described methods.
In a fourth aspect, a computer readable storage medium stores a program that when executed by a processor implements the above method.
The scheme of the invention at least comprises the following beneficial effects:
The invention can quantify the characteristic difference between the LED and the background by calculating the dynamic correlation factor, thereby more accurately identifying the LED area. When the LED image with the complex background is processed, the color and brightness information is utilized to preliminarily identify the LED area and the background area, so that the influence of background noise on a detection result is effectively reduced, and meanwhile, the adaptability to the LED image with the changeable forms is enhanced through the calculation of the dynamic correlation factor.
According to the invention, the automatic processing and analysis of the LED images are realized, manual intervention is not needed, the detection efficiency is greatly improved, in addition, the simultaneous detection of a plurality of LED elements is realized by traversing independent element images and extracting key features, and the detection speed is further improved.
Drawings
Fig. 1 is a flow chart of a method for detecting quality of a semiconductor light emitting device according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a system for detecting quality of a semiconductor light emitting device according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
As shown in fig. 1, an embodiment of the present invention provides a method for detecting quality of a semiconductor light emitting element, the method including:
step 11, acquiring an image of the light emitting diode, and preprocessing the image to obtain a preprocessed image;
Step 12, in the preprocessed image, preliminarily identifying the LED area and the background area by utilizing color and brightness information;
Step 13, calculating a dynamic correlation factor according to the identified LED area and the background area, wherein the dynamic correlation factor is used for quantifying the characteristic difference between the LED and the background;
step 14, according to the calculated dynamic correlation factor, image segmentation is carried out on the preprocessed image so as to extract the light-emitting diode region and obtain an independent element image;
Step 15, traversing the independent element images, and extracting key characteristics of the light emitting diode;
step 16, comparing the key features with the quality standards respectively, according to the preset quality standards and the dynamic correlation factor, to obtain quality judgment results;
And step 17, classifying the LEDs with quality problems according to different quality problem types according to the quality judgment result.
In the embodiment of the invention, the characteristic difference between the light emitting diode and the background can be quantified by calculating the dynamic correlation factor, so that the light emitting diode area can be identified more accurately. Compared with the traditional fixed threshold segmentation method, the method can adaptively adjust the segmentation threshold, and effectively avoid the problem of false segmentation caused by the change of image brightness, contrast and the like. When the LED image with the complex background is processed, the color and brightness information is utilized to preliminarily identify the LED area and the background area, so that the influence of background noise on a detection result is effectively reduced. Meanwhile, the adaptability of the algorithm to the LED images with various forms is further enhanced through the calculation of the dynamic correlation factors. The invention realizes the automatic processing and analysis of the LED image, does not need manual intervention, and greatly improves the detection efficiency. In addition, by traversing the independent element images and extracting key features, the simultaneous detection of a plurality of LED elements is realized, and the detection speed is further improved. The invention not only can detect whether the LED element has quality problems, but also can quantitatively evaluate key characteristics according to the preset quality standard and dynamic association factors.
In another preferred embodiment of the present invention, the step 11 may include:
in step 111, a high-resolution camera is used to capture an image of the Light Emitting Diode (LED) under controlled lighting conditions, ensuring that the focal length and exposure settings of the camera can clearly capture the details of the LED and its surroundings. Converting the color image into a gray-scale image simplifies the image data, reduces the computational complexity of subsequent processing, and retains the luminance information. This step also improves image quality by removing random noise from the image with a Gaussian filter, which is especially useful for images taken under low-light conditions; the specific calculation when removing noise is:
G(x, y) = (1 / (2πσ²)) · Σ_{i=−k..k} Σ_{j=−k..k} I(x+i, y+j) · exp(−(i² + j²) / (2σ²))
where G(x, y) is the pixel value at position (x, y) after Gaussian filtering, I(x+i, y+j) is the pixel value at position (x+i, y+j) of the original image, σ is the standard deviation of the Gaussian function, i and j are the positional offsets relative to the pixel currently being processed, and k is the radius of the Gaussian kernel. The contrast of the image is then enhanced through histogram equalization, improving the distinction between the light emitting diode and the background, and the sharpness of the LED edges is enhanced by an edge detection algorithm, such as Canny edge detection. For example, assume a batch of LEDs is passing along an automated LED quality inspection line. A camera mounted above the line automatically captures an image of each passing LED. Each captured image is first converted into gray scale, and a Gaussian filter is then used to remove noise that may be caused by fine dust or vibration in the production environment. Next, histogram equalization improves the contrast of the image, ensuring that the degree of differentiation between the LED and the background is maximized. Finally, edge detection is performed with the Canny algorithm to accurately capture the contour of the LED.
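The following is a minimal sketch of such a preprocessing pipeline using OpenCV. The function name preprocess_led_image, the kernel size, the Gaussian sigma and the Canny thresholds are illustrative assumptions, not parameters specified by the embodiment.

```python
import cv2

def preprocess_led_image(path, ksize=5, sigma=1.0, canny_lo=50, canny_hi=150):
    """Gray-scale conversion, Gaussian denoising, histogram equalization, Canny edges."""
    bgr = cv2.imread(path)                              # captured LED image
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)        # keep only luminance information
    denoised = cv2.GaussianBlur(gray, (ksize, ksize), sigma)  # remove random noise
    equalized = cv2.equalizeHist(denoised)              # enhance LED/background contrast
    edges = cv2.Canny(equalized, canny_lo, canny_hi)    # sharpen the LED contour
    return equalized, edges
```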
In another preferred embodiment of the present invention, in the above step 12, since the light emitting diode may emit light of a specific color under a specific condition, this can be used to distinguish the LED from the background, which requires analyzing the color distribution of the image, particularly in RGB or HSV color space; light emitting diodes are typically brighter than the surrounding background in an operating state, and can help identify LED areas by analyzing the brightness distribution of the image. For example, assume an image with a red LED that emits light in front of a relatively dim background. The goal is to identify and distinguish between LED areas and backgrounds. The image is converted from the RGB color space to the HSV color space. The HSV color space is more intuitive for the description of colors, where H (Hue) represents the color type, S (Saturation) represents the Saturation of the color, and V (Value) represents the brightness of the color. For red LEDs, a threshold range of the H component may be set, specifically for the red range. For example, if the H value of red is approximately between 0 and 10 degrees and 340 and 360 degrees, the threshold may be set accordingly. The image is converted to a gray scale and its histogram is calculated to identify areas of higher brightness, a brightness threshold is determined for distinguishing the LED from the background, and since the LED is brighter than the background, a threshold for the brighter part of the histogram can be selected. By combining the analysis results of the color and the brightness, the LED areas in the image can be preliminarily marked. In particular, a mask image may be created in which pixels meeting color and brightness threshold conditions are labeled as foreground (LED area) with the remainder being background. In this case, the red LED and the dark background can be effectively distinguished by converting the image into HSV color space and setting appropriate color and brightness thresholds.
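A rough sketch of this preliminary color-and-brightness marking, assuming OpenCV and the red-LED example above; the hue and value thresholds and the helper name mark_led_region are illustrative only, and OpenCV stores hue in the range 0 to 179 rather than 0 to 360.

```python
import cv2

def mark_led_region(bgr, v_min=200):
    """Rough LED/background mask from color (hue) and brightness (value) thresholds."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    # Red hue wraps around 0 degrees; with OpenCV's H in [0, 179] this needs two ranges.
    red_lo = cv2.inRange(hsv, (0, 80, v_min), (5, 255, 255))
    red_hi = cv2.inRange(hsv, (170, 80, v_min), (179, 255, 255))
    mask = cv2.bitwise_or(red_lo, red_hi)   # 255 = candidate LED pixels, 0 = background
    return mask
```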
In a preferred embodiment of the present invention, the step 13 may include:
Step 131, respectively extracting color features and brightness features corresponding to the light-emitting diode region and the background region according to the identified light-emitting diode region and the background region;
step 132, calculating the characteristic difference between the color characteristic and the brightness characteristic of the light emitting diode region and the background region;
And step 133, calculating a specific numerical value of the dynamic correlation factor according to the characteristic difference.
In the embodiment of the present invention, step 131 uses the binary mask image created in step 12, in which the light emitting diode area is marked with 1 (or 255, representing white) and the background area is marked with 0 (black). For the LED areas, the average color values of the pixels within these areas (for example the channel means in RGB color space, or the H and S values in HSV color space) are calculated, and the same calculation is performed for the background areas.
In the HSV color space, the V component is used to evaluate brightness, and the average luminance value (V value) of the light emitting diode region and of the background region is calculated. For example, suppose an image containing yellow LEDs against a relatively dark background is being processed, and a binary mask image has already been obtained through color and brightness threshold analysis, clearly distinguishing the LEDs from the background. Assume that in the HSV color space the color features of the yellow LEDs have an average H value of 30 degrees (the typical hue value for yellow) and an average S value of 80% (representing higher saturation), whereas the average H value of the background may be widely distributed and its average S value is lower, because the color saturation of a dark background is typically not high. Assume further that the average brightness value (V value) of the LEDs is 85%, meaning these areas are relatively bright, while the average brightness value of the background is low, say 20%, reflecting the characteristics of a dark background. In this way, the invention can accurately extract the color and brightness characteristics of the light emitting diode and the background from the image.
Step 132, calculating the difference between the LED area and the background area based on the color features extracted in step 131, such as the average value of H (hue) and S (saturation) in the HSV color space. The color difference can be quantified by a simple mathematical operation, such as a difference value. Similarly, the average of the V (luminance) components in the HSV color space is used to calculate the difference in luminance of the LED region from the background region. For example, there is an image with green LEDs. In step 131, the following features are obtained:
Color characteristics of the LED area: H average = 120 (green hue), S average = 80% (higher saturation); brightness (V) average = 90% (brighter).
Color characteristics of the background area: H average = 60 (possibly a yellow background), S average = 30% (lower saturation); brightness (V) average = 20% (darker).
Color feature difference calculation — hue (H) difference: 120 − 60 = 60;
saturation (S) difference: 80% − 30% = 50%;
Brightness feature difference calculation — brightness (V) difference: 90% − 20% = 70%.
The image segmentation algorithm is adjusted and optimized by calculating specific differences of the LED area and the background area in color and brightness, so that the LED area and the background can be distinguished more accurately, and the quantitative analysis of the differences of the color and the brightness is helpful for identifying and eliminating errors caused by image quality problems (such as uneven illumination) and improving the detection accuracy.
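A short sketch of how the average color and brightness features of the two regions and their differences might be computed, assuming an HSV representation and a binary mask as described above; the helper name region_feature_difference is hypothetical.

```python
import cv2
import numpy as np

def region_feature_difference(bgr, mask):
    """Mean H, S, V inside and outside the LED mask, and their absolute differences."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
    led = mask > 0
    led_mean = hsv[led].mean(axis=0)    # [H, S, V] averages of the LED region
    bg_mean = hsv[~led].mean(axis=0)    # [H, S, V] averages of the background region
    diff = np.abs(led_mean - bg_mean)   # color (H, S) and brightness (V) differences
    return led_mean, bg_mean, diff
```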
In a preferred embodiment of the present invention, the step 133 may include:
Step 1331: the dynamic correlation factor D is calculated as a weighted combination of the color-histogram difference, the gray-histogram difference and the degree of overlap between the light emitting diode region and the background region, wherein A_L and A_B represent the areas of the light emitting diode region and the background region, respectively, and the degree of overlap of the two regions is calculated from them; w_c, w_l and w_e are weight factors of the color feature difference, the brightness feature difference and the shape feature difference, respectively; H_L and H_B represent the color histograms of the light emitting diode region and the background region, n being the number of bins of the histogram; G_L and G_B represent the gray-level histograms of the light emitting diode region and the background region, m being the number of bins of the histogram; H_L(i) represents the number of pixels of the light emitting diode region in the i-th bin of the color histogram, and H_B(i) the number of pixels of the background region in the i-th bin of the color histogram; G_L(i) represents the number of pixels of the light emitting diode region in the i-th bin of the gray histogram, and G_B(i) the number of pixels of the background region in the i-th bin of the gray histogram; ∩ represents the intersection operation for computing elements common to both sets, A_L ∩ A_B representing the overlapping area of the light emitting diode region and the background region; ∪ represents the union operation for merging all elements of two sets while removing duplicate parts, A_L ∪ A_B representing the combined total area of the light emitting diode region and the background region; and i represents the bin index in the histogram.
In the embodiment of the present invention, suppose for example that an image containing a green LED is being processed and the background color is relatively uniform but differs from that of the LED. In the color histogram, the green LED region is mainly concentrated in the green bins; in a given bin i of the histogram, H_L(i) is therefore significantly higher than the background value H_B(i) for the same bin. For the 10 bins of the color histogram (i.e., n = 10), the measured numbers of pixels of the LED region and the background region in the green bins differ significantly. For the gray-histogram difference, assuming the gray histogram has 256 bins (i.e., m = 256), the average luminance of the LED region is higher than that of the background, so in the higher-luminance bins the G_L(i) values of the LED region are higher than the G_B(i) values of the background.
By analysis, it is found that the light emitting diode region A_L has little overlap with the background region A_B, i.e., A_L ∩ A_B is close to 0, indicating that the LED is well isolated. Given these data, the weights of the color feature difference, the luminance feature difference and the shape feature difference are assumed to be w_c = 0.5, w_l = 0.3 and w_e = 0.2, respectively. The specific numerical value of the dynamic correlation factor D is then obtained from its calculation formula.
By comprehensively considering the difference of color and brightness, the dynamic correlation factor D provides a quantized measurement for each pixel, helps to more accurately divide the LED area and the background, and reduces the false division. The difference in characteristics between the LEDs and the background in different image scenes may vary greatly. The calculation method of the dynamic correlation factor D can be adaptively adjusted according to actual characteristic differences, and the applicability and the robustness of the algorithm under various different conditions are improved. For the production and quality control of the LEDs, the deviation between the LEDs and the expected characteristics can be rapidly estimated based on the dynamic correlation factor D, so that the production parameters can be adjusted in time or unqualified products can be identified.
The feature difference between the led region and the background region can be reflected more precisely by the dynamic correlation factor calculated in step 133. Compared with the traditional fixed threshold method, the dynamic threshold setting method based on the actual characteristic difference can more accurately segment the image, so that a more accurate LED area is extracted. The calculation of the dynamic correlation factor takes into account the actual differences in color and brightness of the light emitting diode and the background, and thus has strong adaptability. The method can automatically adjust the segmentation threshold value under different illumination conditions or in the face of the LEDs with various colors and brightness, and can keep higher segmentation precision.
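Because the exact combination formula is not reproduced here, the sketch below shows only one plausible reading of the dynamic correlation factor D: a weighted sum of the normalized color-histogram difference, the normalized gray-histogram difference and a region-overlap term, using the example weights w_c = 0.5, w_l = 0.3 and w_e = 0.2 from the text. The function name and the precise arithmetic are assumptions.

```python
import numpy as np

def dynamic_correlation_factor(hist_led, hist_bg, gray_led, gray_bg,
                               area_led, area_bg, area_overlap,
                               w_c=0.5, w_l=0.3, w_e=0.2):
    """Weighted combination of color-histogram difference, gray-histogram difference
    and region overlap (one plausible reading of the factor D)."""
    h_l = hist_led / max(hist_led.sum(), 1)      # normalize color histograms
    h_b = hist_bg / max(hist_bg.sum(), 1)
    g_l = gray_led / max(gray_led.sum(), 1)      # normalize gray-level histograms
    g_b = gray_bg / max(gray_bg.sum(), 1)
    color_diff = np.abs(h_l - h_b).sum()         # sum_i |H_L(i) - H_B(i)|
    lum_diff = np.abs(g_l - g_b).sum()           # sum_i |G_L(i) - G_B(i)|
    union = area_led + area_bg - area_overlap    # |A_L ∪ A_B|
    overlap_term = 1.0 - area_overlap / max(union, 1)  # 1 when the regions are disjoint
    return w_c * color_diff + w_l * lum_diff + w_e * overlap_term
```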
In a preferred embodiment of the present invention, the step 14 may include:
step 141, identifying the areas of the light emitting diode and the background in the image according to the dynamic correlation factor D;
Step 142, dividing the LED area from the preprocessed image according to the LED and background areas in the image;
step 143, converting the led region into a binary image;
Step 144, traversing each pixel of the binary image, and identifying pixel groups connected together, wherein the pixel groups represent independent objects in the binary image, the independent objects are independent light emitting diode elements, and in the identification process, all pixels belonging to the same pixel group are allocated with the same label;
Step 145, for each identified pixel group, a pixel group bounding box is calculated by acquiring the minimum and maximum lateral and longitudinal coordinates of all pixels within the pixel group to acquire an independent component image.
In the embodiment of the present invention, step 141 uses the dynamic correlation factor to identify the boundary between the LED and the background more accurately, avoiding misidentification of the background area as the LED area or vice versa. The dynamic correlation factor provides an adaptive threshold that allows the LED area to be identified effectively in images with different lighting conditions or different contrast. Step 142, based on the recognition result of step 141, accurately separates the light emitting diode region from the complex background and provides an exact target region for subsequent quality detection; through segmentation, background information irrelevant to the quality detection of the light emitting diode is removed, reducing the interference of noise with the subsequent detection steps. Step 143 converts the LED area into a binary image (typically a black-and-white image); because a binary image is usually faster to process than a color or multi-gray-level image, this greatly simplifies the subsequent data processing and analysis steps and helps improve the computational efficiency of the whole inspection process. By identifying the pixel groups connected together, the individual light emitting diode elements in the image can be accurately identified; by calculating the bounding box of each pixel group, the position of each light emitting diode element in the image can be precisely located; and once each individual light emitting diode element has been identified and located, it can conveniently be classified, stored, or further processed and analyzed.
In another preferred embodiment of the present invention, step 141 described above involves using the dynamic correlation factor D to identify the Light Emitting Diode (LED) and background areas in the image. The dynamic correlation factor is a numerical value calculated from the color feature difference, the brightness feature difference and the shape feature difference of the image, and reflects the likelihood that each pixel or region belongs to an LED. Specifically, the method comprises the following steps:
And calculating a D value for each pixel or region in the image according to the color histogram difference, the gray level histogram difference and the region overlapping degree obtained in the previous step.
Setting a threshold value according to the distribution of the dynamic correlation factor D, the threshold being used to distinguish the LED area from the background area: when the D value of a pixel or region is higher than the threshold, that pixel or region is considered to belong to the LED area; when it is below the threshold, it is considered background. The LED areas and background areas in the image are marked by comparing the D value of each pixel or region with the threshold. For example, assume a photograph containing a plurality of blue LEDs is processed, with a relatively dark, single-colored background. In the previous step, the color histograms and gray-level histograms of the LED region and the background region, as well as the degree of overlap of these regions, were calculated; for example, the color-histogram difference in the blue channel and the luminance-histogram difference of the LED region were found to be significantly higher than those of the background region. The dynamic correlation factor D calculated from the color and brightness feature differences is applied to all pixels in the image (assuming the D values have been normalized to the range 0 to 1), and a threshold such as 0.5 is set based on an observation of the overall distribution of the dynamic correlation factor in the image. Using this threshold, every pixel in the image is traversed; pixels with D values greater than 0.5 are marked as LED areas and the rest are marked as background areas. As a result, all blue LEDs are successfully and accurately identified against the darker background.
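A minimal sketch of this thresholding step, assuming a per-pixel map of D values; the normalization and the example threshold of 0.5 follow the description above, and the helper name classify_regions is hypothetical.

```python
import numpy as np

def classify_regions(d_map, threshold=0.5):
    """Mark pixels whose normalized D value exceeds the threshold as LED (255)."""
    spread = d_map.max() - d_map.min()
    d = (d_map - d_map.min()) / (spread if spread > 0 else 1.0)  # normalize D to [0, 1]
    return np.where(d > threshold, 255, 0).astype(np.uint8)
```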
In another preferred embodiment of the present invention, step 142 described above involves precisely segmenting the Light Emitting Diode (LED) regions from the pre-processed image based on the identified LED and background regions. This step is performed after successful identification of the LEDs and background areas, with the aim of extracting the LED areas from the image for further analysis and processing. The following is specific content for performing this step, specifically including:
The recognition result obtained in step 141, i.e., the binarized mask image calculated based on the dynamic correlation factor D, is used, wherein the LED area and the background area have been discriminated.
LED areas are extracted from the original or pre-processed image using the binarized mask image as a guide. Specific operations typically include applying a masking operation that retains only pixels marked as LED areas in the mask, while ignoring background areas.
To ensure that the boundaries of the LED area are clear, further image processing is applied, such as morphological operations (dilation, erosion) that optimize the edges of the LED area and eliminate noise that may occur during segmentation. Processing the image with the dilation operation specifically comprises:
Selecting a structuring element of an appropriate size, such as a 3×3 or 5×5 square, and center-aligning the structuring element on each pixel of the binarized image; if the structuring element overlaps any white pixel of the image (i.e., if at least one pixel of the area covered by the structuring element is white), the pixel at the center of the structuring element is set to white. Through the dilation operation, the edges of the LED area are thus widened and small voids are filled, making the LED area more complete and easier to identify.
The specific steps of processing the image with the erosion operation include:
Using the same structuring element as for dilation, center-align the structuring element on each pixel of the binarized image; if the structuring element lies completely within the foreground region of the image (i.e., all pixels covered by the structuring element are white), the pixel at the center of the structuring element is kept white; otherwise, it is set to black. The erosion operation thus removes the over-expanded edge portions that the dilation step may have introduced, as well as small noise points in the original binarized image, so that the boundary of the LED area becomes clearer.
For example, assume a night scene photo is being processed that includes a red LED billboard. The LED lights of the billboard appear particularly bright at night, while the surrounding environment is relatively dark. The contrast between the LED billboard area in the image and the background is obvious due to illumination and color, but small cracks appear in the image due to vibration and uneven illumination during photographing, and small noise points caused by other light sources also exist in the background. Based on the analysis of the dynamic correlation factor D, a binary mask image is generated in which the LED billboard area is clearly marked white and the background area is black, as a result of step 141. The morphological dilation operation is used for connecting broken LED lamp areas in the image, and the continuity of the LED advertising board is enhanced, and the morphological dilation operation specifically comprises the following steps:
A 5×5 square structuring element is selected for the dilation operation and aligned to each pixel of the binarized image; for the area covered by the structuring element, if at least one pixel is white, the pixel at the center of the structuring element is also set to white. After dilation, the LED billboard area becomes more continuous, and the previously existing small breaks and holes are effectively filled.
The morphological erosion operation is used to remove the unwanted edge expansion and small noise points that the dilation operation may have introduced, and to refine the boundary of the LED billboard; it specifically comprises the following steps:
The erosion operation is performed with the same 5×5 square structuring element as the dilation operation: the structuring element is aligned on each pixel of the binarized image, and the pixel at the center of the structuring element is kept white only when the structuring element lies fully within the foreground region (i.e., all covered pixels are white); otherwise, it is set to black. The erosion removes the edge portions that were excessively expanded by dilation as well as small noise points, so that the boundary of the LED billboard becomes clearer and more accurate. After the dilation and erosion operations, the LED billboard area is identified clearly and accurately in the image, with significant improvement both visually and for subsequent analytical processing.
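A compact sketch of the dilation-then-erosion cleanup described above (equivalent to a morphological closing), assuming OpenCV and a square structuring element; the kernel size of 5 follows the example, while the function name clean_led_mask is hypothetical.

```python
import cv2
import numpy as np

def clean_led_mask(binary_mask, ksize=5):
    """Dilate to reconnect broken LED regions, then erode to restore tight edges."""
    kernel = np.ones((ksize, ksize), np.uint8)   # e.g. a 5x5 square structuring element
    dilated = cv2.dilate(binary_mask, kernel)    # widen edges, fill small breaks and holes
    eroded = cv2.erode(dilated, kernel)          # remove over-expansion and small noise
    return eroded                                # combined effect is a morphological closing
```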
In another preferred embodiment of the present invention, step 143 described above involves converting the segmented Light Emitting Diode (LED) regions into a binary image in order to further simplify the image content, making subsequent image processing, feature extraction and analysis more efficient and accurate. In this step, the LED areas in the image are marked as white (a value of 1 or 255) while the background areas are marked as black (a value of 0). Such a binarization process not only reduces the complexity of the data, but also helps to clearly distinguish the target object from the background. The detailed procedure for this step is as follows:
Using the segmentation mask image generated in step 142, in which the LED region has already been distinguished from the background region (the LED area is white, with a value of 255, and the background is black, with a value of 0), the segmentation mask is applied to the original or preprocessed image. Specifically, only the pixels corresponding to white (the LED area) in the mask are retained, and the pixels of the background area are set to black. In the resulting image, all pixels belonging to the LED region are set to the maximum luminance value (white, value 255), while pixels outside the LED region (the background) are set to the minimum luminance value (black, value 0). For example:
Assume that an image showing a plurality of green LED lamps is processed. The LED lamps are arranged in a straight line and the background is a dark table surface. After color and brightness analysis, a binary mask image has been successfully generated in which the LED lamp areas are clearly marked. Because this mask has already segmented the LED areas precisely, it is used directly: a new image is created by applying the mask, in which the LED lamp areas remain white and the background is converted entirely to black, yielding a clean binary image. The binary image simplifies the complexity of the image, making further analysis and processing of the LED lamps (e.g., computing area and perimeter, or detecting shape) more direct and efficient.
In another preferred embodiment of the present invention, the purpose of step 144 is to identify individual Light Emitting Diode (LED) elements in the binarized image. This is accomplished by detecting groups of pixels in the binary image that are connected together, each group representing a separate LED element. During the identification process, each individual pixel group is assigned a unique label, thereby distinguishing between different LED elements. The following is a process for performing this step, and specifically includes:
Starting from the upper left corner of the binary image, each pixel is scanned line by line, the value of each pixel is checked, where white represents the LED area (value 1 or 255) and black represents the background (value 0). When a white pixel is encountered, if there are other white pixels in any of the neighbors, the current pixel and the white pixels in those neighbors are considered part of the same pixel group.
A stack (recursion) or a queue is used to track and expand the entire pixel group. This means that, starting from an initial white pixel, all white pixels connected to it by 4-connectivity are gradually added to the same pixel group; for each newly found white pixel, its 4-neighborhood is examined in turn until the entire pixel group has been completely identified and expanded.
Each identified pixel group is assigned a unique label by creating a new label image of the same size as the original binary image, in which the value of each pixel is initialized to 0 (representing the background). Each time a new pixel group is identified while traversing the original binary image, all pixels in that group are marked in the label image with a new label value (e.g., 1, 2, 3, …). After the whole image has been scanned and processed, every white pixel in the binary image has been assigned to a pixel group with a unique label, and each unique label represents an independent LED element.
For example, suppose a binary image is processed that contains several groups of closely arranged LED lamps. Because of differing lighting conditions, these LED lamps show tiny breaks in some areas of the image. After the above steps are applied, each group of LED lamps connected together by 4-connectivity is identified as an independent pixel group and assigned a unique label. In this way, even with small gaps inside an LED lamp group, the whole group of LED lamps is correctly identified and marked as a single LED element. The invention thereby improves the recognition precision of the LED elements by precisely identifying and marking each individual LED element; by assigning unique labels, each LED element in the image can be individually tracked and analyzed, increasing the amount of information available in subsequent processing.
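A sketch of the connected-component labeling step using OpenCV's built-in routine with 4-connectivity, as a stand-in for the pixel-by-pixel flood-fill described above; the helper name label_led_elements is hypothetical.

```python
import cv2

def label_led_elements(binary_mask):
    """Assign one label per connected group of white pixels (one per LED element)."""
    # connectivity=4 matches the 4-neighborhood grouping described above
    num_labels, labels = cv2.connectedComponents(binary_mask, connectivity=4)
    return num_labels - 1, labels        # label 0 is the background
```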
In another preferred embodiment of the present invention, the step 145 may include:
Step 1451, for each identified pixel group (i.e., each individual LED element), initializes four variables: minX, maxX, minY, maxY, which are used to store the minimum lateral coordinate, the maximum lateral coordinate, the minimum longitudinal coordinate and the maximum longitudinal coordinate of the pixel group, respectively; traversing all pixels belonging to the same pixel group (i.e. assigned the same label), comparing the X coordinate and the Y coordinate of each pixel, and updating minX, maxX, minY, maxY values as required; from the updated minX, maxX, minY, maxY values, a bounding box for each LED element is determined, which can be expressed as: coordinates of the upper left corner (minX, minY) and coordinates of the lower right corner (maxX, maxY); using the bounding box coordinates of each pixel group, an individual LED element image can be cropped from the original or preprocessed image.
For example, assume that an image containing a plurality of green LED beads is being processed. After step 144, each LED bead in the image has been successfully identified as an independent pixel group and assigned a unique label. For a particular LED bead (i.e., one pixel group), all of its pixels are traversed. Suppose that during traversal the X-coordinate range of this pixel group is found to run from 50 to 100 and the Y-coordinate range from 200 to 250. The bounding box of this LED bead can therefore be determined as the upper-left corner (50, 200) and the lower-right corner (100, 250). An individual image of the LED bead can then be cropped from the whole image based on this bounding box for further analysis. Accurately calculating the bounding box of each LED element makes it possible to measure its position and size precisely; with a clearly defined bounding box, features of each LED element such as shape, color and brightness can be extracted more easily, and the independent LED element images can be identified and cropped automatically, improving processing efficiency.
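A sketch of the bounding-box computation and cropping, assuming a label image as produced in the previous step; the helper name crop_led_elements and the dictionary return type are assumptions.

```python
import numpy as np

def crop_led_elements(labels, source_image):
    """Bounding box (minX, minY, maxX, maxY) and cropped image for every labeled LED."""
    crops = {}
    for lbl in range(1, labels.max() + 1):       # label 0 is the background
        ys, xs = np.nonzero(labels == lbl)       # coordinates of pixels with this label
        if xs.size == 0:
            continue
        min_x, max_x = xs.min(), xs.max()
        min_y, max_y = ys.min(), ys.max()
        crops[lbl] = source_image[min_y:max_y + 1, min_x:max_x + 1]
    return crops
```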
In a preferred embodiment of the present invention, the key features include size, shape, brightness distribution, and color uniformity; the step 15 may include:
Step 151, calculating the size characteristics of the LED element and calculating the shape factors according to the coordinates of the pixel group bounding box. Step 151 may include:
Step 1511, obtaining the total number of pixels within the outline of the light emitting diode element and the pixel length of the outline according to the size characteristics of the element, and calculating the circularity from the total number of pixels and the pixel length;
Step 1512, obtaining the circumscribed rectangle with the smallest area by rotating the outline of the light emitting diode element, and calculating the eccentricity from this minimum circumscribed rectangle;
Step 1513, determining the inherent moments by calculating the spatial distribution of the light emitting diode element.
Step 152, obtaining brightness distribution characteristics by calculating a gray level histogram of the light emitting diode element image;
Step 153, extracting color uniformity features by analyzing the consistency of gray level histograms of the LED element images.
In the embodiment of the present invention, in step 151, the dimension characteristics of the element are calculated by the coordinates of the bounding box, so that the dimension of the led element can be accurately measured, which is important for quality control and consistency of product specifications. Calculating the form factor helps to understand the shape characteristics of the led element, such as circularity, rectangularity, etc., which may reflect process stability and product quality during production. In step 152, the gray histogram provides a visual representation of the brightness distribution of the led element, which is helpful for evaluating the brightness performance and uniformity of the element, and by analyzing the brightness distribution characteristics, potential faults or defects, such as uneven brightness, dark spots, bright spots, etc., can be detected, so as to perform quality control and product screening in time. In step 153, the uniformity of the gray level histogram is analyzed to evaluate the uniformity of the color of the led element, i.e. whether the color and brightness of each area on the surface of the element are uniform, and the quality of the led element can be graded according to the analysis result of the uniformity of the color, so as to provide suitable product selection for different application scenarios.
In another preferred embodiment of the present invention, in step 151, the circularity is a quantitative index describing how close the shape of an object is to a circle, and can be used to evaluate the regularity of the shape of the LED element. The total number of pixels within the outline of the LED element (i.e., the area) and the pixel length of the outline (i.e., the perimeter) are first obtained from the bounding box coordinates, and the circularity is then calculated as C = 4π · Area / Perimeter², where Area is the total number of pixels inside the contour and Perimeter is the pixel length of the contour; a circularity value close to 1 indicates a shape close to a perfect circle, and a value far from 1 indicates a shape deviating from a circle. The eccentricity measures the elongation of the shape of the LED element and is another important index for analyzing its geometric characteristics: the minimum-area circumscribed rectangle is found by rotating the outline of the LED element, and the eccentricity is calculated from the aspect ratio of this minimum circumscribed rectangle, providing information on how elongated the shape of the LED element is. The inherent moments are a set of statistics describing the shape characteristics of an object, including information such as its position, size and orientation; their calculation involves analyzing the spatial distribution of the LED element image, and by computing the first-order and second-order moments (describing the orientation and spread of the object's shape), comprehensive information about the spatial distribution of the LED element can be obtained.
For example, assume that there is a rectangular LED element in the binarized image, which element is located in the central region of the image, the specific parameters are as follows: the pixel coordinates of the LED elements range from 50 to 100 (width 50 pixels) on the x-axis and from 200 to 250 (height 50 pixels) on the y-axis. Thus, the LED element occupies a 50×50 pixel area for 2500 pixels in total.
First moment (centroid): first, the centroid of the LED element, i.e. its geometric center, is calculated. The X-coordinate of the centroid equals the sum of the X-coordinates of all pixels divided by the total number of pixels; for a rectangle, the centroid X-coordinate is simply the average of the rectangle's edge X-coordinates, i.e., (50+100)/2 = 75.
Similarly, the Y coordinate of the centroid is equal to the sum of the Y coordinates of all pixels divided by the total number of pixels, calculated as (200+250)/2=225, and therefore, the centroid of the LED element is located at the (75, 225) position of the image.
Second moment (shape description): next, the second moments are used to describe the shape properties of the LED element, focusing mainly on its orientation and spread.
Computation of M20 and M02: since this is a regular rectangle, the computation can be simplified. M20 involves the variance of the x-coordinates, and for a centrally symmetric rectangle the pixels are uniformly distributed around the centroid. Thus, for a rectangle of width 50, M20 reduces to the square of the width divided by 12 (the variance of a uniform distribution over the rectangle), i.e. 50²/12 ≈ 208.3; similarly, M02 equals the square of the height divided by 12, which is also about 208.3. For a perfect rectangle the eccentricity is determined by its aspect ratio, in this case 1 (because it is square). Since the LED element is a regular rectangle, the inherent moments mainly describe its principal axis directions and spread, which in this case are parallel to the x-axis and the y-axis with no particular tilt. Through the above calculation, the position of the centroid of the LED element, the second-moment description of its spatial distribution, and the shape characteristics of the element (such as eccentricity and inherent moments) are obtained.
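The worked example above can be checked numerically; the sketch below is illustrative only (array sizes and variable names are assumed), and note that for discrete pixels the centroid of a 50-pixel-wide span is 74.5 rather than exactly 75, while the second moments come out close to 50²/12 ≈ 208.3 as described.

```python
import numpy as np

# Illustrative check of the rectangle example: x in [50, 100), y in [200, 250)
img = np.zeros((300, 300), dtype=np.uint8)
img[200:250, 50:100] = 1

ys, xs = np.nonzero(img)         # coordinates of all element pixels
cx, cy = xs.mean(), ys.mean()    # first moments (centroid): ~ (74.5, 224.5) for discrete pixels
m20 = ((xs - cx) ** 2).mean()    # second moment of x: ~ 50**2 / 12 ≈ 208.3
m02 = ((ys - cy) ** 2).mean()    # second moment of y: ~ 208.3
print(cx, cy, m20, m02)
```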
In another preferred embodiment of the present invention, in step 152, if the original LED element image is in color, it must first be converted to a grayscale image. The converted grayscale image is then traversed, counting the number of occurrences of each possible gray level (0-255). This may be implemented with an array of 256 elements, where the index of the array corresponds to the gray level and the value at each index corresponds to the number of pixels at that gray level. By examining the histogram, the darkest and brightest regions of the image can be determined, i.e. the range of the non-zero portion of the histogram. The average of all pixel luminance values is calculated, providing an overall perception of image brightness. By analyzing the shape of the histogram, the brightness uniformity of the LED element can be evaluated: a uniformly distributed histogram indicates an even luminance distribution, while a histogram with sharp peaks or significant skew may indicate luminance non-uniformity.
For example, suppose there is an image of a rectangular LED element whose luminance distribution characteristics are to be analyzed. First, the LED element image is converted into a grayscale image. Assume that the converted image mainly shows an LED of medium brightness, with gray levels concentrated in the middle range, e.g. 120-180. The grayscale image is traversed and the gray level histogram array is computed and filled. Suppose the result shows that the brightness of most pixels is concentrated around 140, but that a certain number of pixels have lower brightness, concentrated around 100. From the histogram it can be seen that the brightness of the LED element is mainly concentrated in a narrow range, but that there are also some darker areas. The average luminance may be near 140, indicating that the LED element is generally bright, while the incomplete uniformity of the luminance may indicate that there are some shadows or lower-brightness areas on the element.
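A minimal sketch of the histogram-based brightness analysis described in step 152 might look as follows; the function name brightness_profile and the assumption of an 8-bit grayscale input are illustrative.

```python
import numpy as np

def brightness_profile(gray_img: np.ndarray) -> dict:
    """Illustrative: 256-bin gray level histogram, brightness range and mean brightness."""
    hist = np.bincount(gray_img.ravel(), minlength=256)     # index = gray level, value = pixel count
    nonzero = np.nonzero(hist)[0]
    darkest, brightest = int(nonzero[0]), int(nonzero[-1])  # range of the non-zero part of the histogram
    mean_brightness = float(gray_img.mean())                # overall perception of image brightness
    return {"hist": hist, "range": (darkest, brightest), "mean": mean_brightness}
```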
In another preferred embodiment of the present invention, step 153 may include:
Observing the smoothness of the gray level histogram: a uniform luminance distribution means that the number of pixels at each gray level in the histogram is relatively even, without large fluctuations. This can be quantified by calculating the standard deviation of the histogram; a lower standard deviation generally indicates a more uniform luminance distribution.
It is then checked whether there are significant peaks in the histogram; such peaks may indicate that too many pixels fall at a particular brightness level, pointing to brightness non-uniformity. The width of the histogram, i.e. the luminance range between the darkest and brightest pixels, is also analyzed: a narrower histogram width may mean that the brightness of the image varies little and is more uniform. For example, assume there is an LED element image that shows uniform blue light over most of its area, but one corner of the image is slightly darker due to a shadow. After the gray level histogram of the image is calculated, it is found that the histogram is mainly concentrated at medium brightness levels, but that there is a small peak at the lower brightness end, corresponding to the pixels of the dark corner. By calculating the standard deviation of the histogram, the uniformity of the luminance distribution can be quantified: even if the overall standard deviation is relatively low, the small peak at the low-brightness end reveals the non-uniformity caused by the shadowed corner.
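The uniformity checks described above could be sketched as follows; interpreting "standard deviation of the histogram" as the standard deviation of the pixel gray levels, and using a simple bin-share rule for peak detection, are illustrative assumptions.

```python
import numpy as np

def uniformity_metrics(gray_img: np.ndarray) -> dict:
    """Illustrative: quantify brightness uniformity from the gray level distribution."""
    levels = gray_img.ravel().astype(np.float64)
    std_dev = float(levels.std())                    # lower value -> more uniform brightness
    hist = np.bincount(gray_img.ravel(), minlength=256)
    occupied = np.nonzero(hist)[0]
    width = int(occupied[-1] - occupied[0])          # luminance range between darkest and brightest pixels
    peaks = np.where(hist > 0.05 * levels.size)[0]   # bins holding an unusually large share of pixels
    return {"std": std_dev, "width": width, "peak_levels": peaks.tolist()}
```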
In another preferred embodiment of the present invention, step 16 may include:
First, quality criteria are set for each key feature of the LED element (such as size, shape factor, brightness distribution, and color uniformity), including: dimensional tolerance, minimum circularity threshold, and standard deviation range of the luminance distribution.
For each LED element, the extracted key features are compared with corresponding quality criteria. For example, it is checked whether the size of the element is within an allowable tolerance, whether the shape factor satisfies a minimum circularity requirement, whether uniformity of the luminance distribution reaches a predetermined standard, or the like. Based on the comparison of the key features to the quality criteria, a quality judgment is made for each LED element. For example, assume a batch of blue LED elements is being evaluated, with the key features of one particular element being as follows:
Size: within a predetermined dimensional tolerance.
Degree of circularity: below the minimum threshold, indicating that the shape may be irregular.
Brightness distribution: the standard deviation exceeding the predetermined range indicates that the luminance distribution is uneven.
Color uniformity: by gray histogram consistency analysis, it is determined that the color distribution is relatively uniform.
In combination with the dynamic correlation factor, appropriate adjustments can be made to the evaluation criteria for the brightness distribution, for example if the element image was acquired under low-light conditions.
Based on the above analysis, if the quality criteria require that all key features meet the preset standards, the LED element may be judged as unacceptable because its circularity and brightness distribution do not meet the requirements. However, if certain features allow some degree of deviation, or remedial measures exist, the final quality assessment may be more flexible. In this way, a detailed quality assessment can be made for each LED element, ensuring that only elements meeting all quality criteria are considered acceptable.
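A hedged sketch of the comparison against preset quality criteria in step 16 is given below; all threshold values and feature names are assumed for illustration and would in practice come from the actual quality standard.

```python
# Illustrative comparison of extracted key features against preset criteria;
# every threshold below is an assumed example value, not a value fixed by the method.
CRITERIA = {
    "size_tolerance": (45, 55),      # allowed width/height in pixels
    "min_circularity": 0.85,
    "max_brightness_std": 20.0,
}

def judge(features: dict) -> tuple[bool, list[str]]:
    """Return (passed, failed checks) for one LED element."""
    failures = []
    lo, hi = CRITERIA["size_tolerance"]
    if not (lo <= features["width"] <= hi and lo <= features["height"] <= hi):
        failures.append("size out of tolerance")
    if features["circularity"] < CRITERIA["min_circularity"]:
        failures.append("circularity below threshold")
    if features["brightness_std"] > CRITERIA["max_brightness_std"]:
        failures.append("brightness distribution uneven")
    return (not failures, failures)
```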
In another preferred embodiment of the present invention, step 17 may include:
Determining the main quality problem type of each unqualified element according to the result of comparing the key features with the quality standards, where the problem types include size inconsistency, irregular shape, uneven brightness, inconsistent color distribution, and the like.
For each LED element with quality problems, it is classified into the corresponding quality problem type category. This classification may be based on one or more key features that do not meet the criteria. For example, if an element is out of tolerance in size and its brightness distribution is also non-uniform, it may be classified as both "size non-conforming" and "brightness non-uniform".
A record is created for each quality problem type, summarizing the information of all LED elements belonging to that type. The record should include identification information of the element, the specific features that do not meet the quality criteria, and suggested follow-up measures. For example, suppose that, in a batch of mass-produced LED elements, the evaluation of step 16 finds that 10% of the elements have inconsistent sizes, 5% have irregular shapes, and 8% are judged defective due to uneven brightness. These failed elements are classified according to their primary quality problems: all out-of-tolerance elements are classified as "size non-conforming", elements with shape problems as "irregularly shaped", and elements with brightness problems as "brightness non-uniform". A detailed record is created for each quality problem, including the element number, the detected problem, and recommended follow-up measures (e.g., reworking or scrapping).
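The classification of failed elements into problem-type records in step 17 could be sketched as follows; the mapping of failed checks to problem categories and the record layout are illustrative assumptions.

```python
from collections import defaultdict

# Illustrative grouping of failed elements into quality-problem records.
# `results` maps an element id to the list of failed checks returned by judge().
PROBLEM_TYPES = {
    "size out of tolerance": "size non-conforming",
    "circularity below threshold": "irregularly shaped",
    "brightness distribution uneven": "brightness non-uniform",
}

def classify(results: dict) -> dict:
    by_problem = defaultdict(list)
    for element_id, failures in results.items():
        for failure in failures:          # one element may fall into several categories
            by_problem[PROBLEM_TYPES.get(failure, "other")].append(
                {"element": element_id, "detected_problem": failure, "follow_up": "rework or scrap"}
            )
    return dict(by_problem)
```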
In another preferred embodiment of the present invention, after the step 17, the method may further include:
Step 18, for each quality problem type, calculating the corresponding occurrence frequency; for each quality problem, a weighted score is assigned according to the severity of its impact and its frequency of occurrence, wherein the weighted score is calculated by the formula:

S_i = Σ_{j=1}^{M} w_j · f(I_ij), with f(I_ij) = e^((I_ij − T_j) / D_j);

wherein S_i represents the comprehensive severity score of problem type i, w_j represents the weight of the j-th influencing factor, and I_ij represents the value of problem type i on the j-th evaluation index; f(·) is a nonlinear function, where e is the base of natural logarithms, T_j represents the threshold for the j-th evaluation index (when I_ij exceeds T_j, its contribution to the total score increases exponentially), D_j represents an adjustment parameter controlling the growth rate of the nonlinear function for the j-th evaluation index, and M is the number of influencing factors; the evaluation indexes include dimensional deviation, color uniformity, brightness uniformity and the like. The relation between each problem type i and each possible production factor k is calculated by the Pearson correlation coefficient

r_ik = Σ_t (X_i,t − X̄_i)(Y_k,t − Ȳ_k) / √( Σ_t (X_i,t − X̄_i)² · Σ_t (Y_k,t − Ȳ_k)² )

wherein r_ik represents the correlation coefficient between quality problem type i and production factor k, the value of this coefficient ranging from −1 to 1, where 1 represents a complete positive correlation, −1 represents a complete negative correlation, and 0 represents no linear correlation; X_i represents the observation sequence of a specific quality problem type i, X̄_i represents the average of all X_i observations, Y_k represents the observation sequence of a specific production factor k that may be related to the quality problem, and Ȳ_k represents the average of all Y_k observations.
For example, suppose the LED elements on a production line exhibit three main quality problem types: dimensional deviation (type 1), color uniformity problems (type 2), and brightness non-uniformity (type 3); possible production factors include machine calibration (factor A), ambient temperature (factor B), and raw material lot (factor C).
When the problem frequencies are calculated, dimensional deviation occurred in 10% of the elements, color uniformity problems in 15%, and brightness non-uniformity in 20%; for each problem, a weighted score is assigned according to the severity of its impact and its frequency of occurrence.
Assume that the severity score for the dimensional deviation is calculated from e^((I − T_1)/D_1), where the occurrence frequency of the dimensional deviation is I = 0.1, the threshold T_1 is 0.05, and the adjustment parameter D_1 is 0.01; similarly, corresponding scores are assigned to the color uniformity problem and the brightness non-uniformity.
If the correlation coefficient between the dimensional deviation problem (type 1) and the machine calibration (factor A), calculated with r_ik, is 0.8, this shows a strong positive correlation; the same analysis finds that a correlation coefficient of −0.6 between the color consistency problem (type 2) and the ambient temperature (factor B) indicates a moderate negative correlation, and a correlation coefficient of 0.75 between the brightness non-uniformity (type 3) and the raw material lot (factor C) indicates a strong positive correlation.
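Assuming the reconstructed severity-score and correlation formulas above, a minimal computational sketch might look as follows; the function names and the single-index example values are illustrative.

```python
import numpy as np

def severity_score(indices, weights, thresholds, d_params):
    """Illustrative S_i = sum_j w_j * exp((I_ij - T_j) / D_j), per the reconstructed formula."""
    indices, weights = np.asarray(indices, float), np.asarray(weights, float)
    thresholds, d_params = np.asarray(thresholds, float), np.asarray(d_params, float)
    return float(np.sum(weights * np.exp((indices - thresholds) / d_params)))

def correlation(x, y):
    """Pearson correlation r_ik between a quality-problem series and a production-factor series."""
    return float(np.corrcoef(np.asarray(x, float), np.asarray(y, float))[0, 1])

# Single-index example: occurrence frequency 0.1, T = 0.05, D = 0.01, weight 1.0 -> e**5 ≈ 148.4
print(severity_score([0.1], [1.0], [0.05], [0.01]))
```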
By identifying production factors that are significantly correlated with quality problems, targeted corrective measures can be taken. For example, by strengthening the machine calibration process to reduce dimensional deviations, adjusting the production environment temperature to improve color consistency, and optimizing the raw material selection process to improve brightness uniformity, not only can the reject rate be significantly reduced, but the overall production efficiency and the quality of the final product can also be improved by addressing these root causes.
As shown in fig. 2, an embodiment of the present invention further provides a system 20 for detecting quality of a semiconductor light emitting element, including:
An acquisition module 21, configured to acquire an image of the light emitting diode, and perform preprocessing on the image to obtain a preprocessed image; in the preprocessed image, preliminarily identify a light emitting diode region and a background region by using color and brightness information; and calculate a dynamic correlation factor according to the identified light emitting diode region and the background region, wherein the dynamic correlation factor is used for quantifying the feature difference between the light emitting diode and the background;
The processing module 22 is configured to perform image segmentation on the preprocessed image according to the calculated dynamic correlation factor, so as to extract the light emitting diode region and obtain independent element images; traverse the independent element images and extract the key features of the light emitting diode; compare the key features with the quality standard according to a preset quality standard and the dynamic correlation factor to obtain a quality judgment result; and classify the light emitting diodes with quality problems into different quality problem types according to the quality judgment result.
Optionally, calculating the dynamic association factor according to the identified led area and the background area includes:
respectively extracting color features and brightness features corresponding to the light-emitting diode region and the background region according to the identified light-emitting diode region and the background region;
Calculating the characteristic difference between the color characteristic and the brightness characteristic of the light-emitting diode region and the background region;
And calculating a specific numerical value of the dynamic correlation factor according to the characteristic difference.
Optionally, calculating the dynamic correlation factor according to the feature difference includes:
The dynamic correlation factor D is calculated by

D = ω_1 · Σ_{i=1}^{n} |H_led^c(i) − H_bg^c(i)| + ω_2 · Σ_{i=1}^{m} |H_led^g(i) − H_bg^g(i)| + ω_3 · ( |A_led ∩ A_bg| / |A_led ∪ A_bg| ),

wherein A_led and A_bg represent the areas of the light emitting diode region and the background region respectively, and |A_led ∩ A_bg| / |A_led ∪ A_bg| calculates the degree of overlap between the light emitting diode region and the background region; ω_1, ω_2 and ω_3 are the weight factors of the color feature difference, the brightness feature difference and the shape feature difference respectively; H_led^c and H_bg^c represent the color histograms of the light emitting diode region and the background region respectively, n being the number of bins of the histogram; H_led^g and H_bg^g represent the gray level histograms of the light emitting diode region and the background region respectively, m being the number of bins of the histogram; H_led^c(i) represents the number of pixels of the light emitting diode region in the i-th bin of the color histogram, H_bg^c(i) represents the number of pixels of the background region in the i-th bin of the color histogram, H_led^g(i) represents the number of pixels of the light emitting diode region in the i-th bin of the gray histogram, and H_bg^g(i) represents the number of pixels of the background region in the i-th bin of the gray histogram; ∩ represents the intersection operation used to compute the elements common to both sets, |A_led ∩ A_bg| representing the overlapping area of the light emitting diode region and the background region; ∪ represents the union operation used to merge all elements of the two sets while removing duplicate parts, |A_led ∪ A_bg| representing the combined total area of the light emitting diode region and the background region; i represents the bin index in the histogram.
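Assuming the histogram-difference-plus-overlap form of D reconstructed above, a minimal sketch of its computation is given below; the use of normalized histograms, the default weights, and the bin counts are illustrative choices rather than values fixed by the method.

```python
import numpy as np

def dynamic_factor(led_mask, bg_mask, hue_img, gray_img, w=(0.4, 0.4, 0.2), n=32, m=32):
    """Illustrative sketch of D: weighted color/gray histogram differences
    plus the overlap degree (intersection over union) of the two regions."""
    # Normalized histograms so that region sizes do not dominate (an illustrative choice)
    h_led_c, _ = np.histogram(hue_img[led_mask], bins=n, range=(0, 256), density=True)
    h_bg_c, _ = np.histogram(hue_img[bg_mask], bins=n, range=(0, 256), density=True)
    h_led_g, _ = np.histogram(gray_img[led_mask], bins=m, range=(0, 256), density=True)
    h_bg_g, _ = np.histogram(gray_img[bg_mask], bins=m, range=(0, 256), density=True)

    color_diff = np.abs(h_led_c - h_bg_c).sum()      # color feature difference term
    gray_diff = np.abs(h_led_g - h_bg_g).sum()       # brightness feature difference term
    union = np.logical_or(led_mask, bg_mask).sum()
    overlap = np.logical_and(led_mask, bg_mask).sum() / union if union else 0.0

    return w[0] * color_diff + w[1] * gray_diff + w[2] * overlap
```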
Optionally, image segmentation is performed on the preprocessed image according to the calculated dynamic correlation factor to extract the led area to obtain an independent component image, including:
Identifying areas of the light emitting diode and the background in the image according to the dynamic correlation factor D;
Dividing the light-emitting diode region from the preprocessed image according to the light-emitting diode and the background region in the image;
Converting the light emitting diode region into a binary image;
traversing each pixel of the binary image, and identifying pixel groups connected together, wherein the pixel groups represent independent objects in the binary image, the independent objects are independent light-emitting diode elements, and in the identification process, all pixels belonging to the same pixel group are assigned the same label;
For each identified pixel group, a pixel group bounding box is calculated by acquiring the minimum and maximum lateral and longitudinal coordinates of all pixels within the pixel group to acquire an independent component image.
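A minimal sketch of the connected-component labeling and bounding-box extraction described above, using OpenCV; the noise-area threshold and the function name are illustrative assumptions.

```python
import cv2
import numpy as np

def extract_elements(binary_img: np.ndarray):
    """Illustrative: label connected pixel groups and return per-element bounding boxes."""
    num, labels, stats, _ = cv2.connectedComponentsWithStats(binary_img, connectivity=8)
    boxes = []
    for label in range(1, num):          # label 0 is the background
        x, y, w, h, area = stats[label]
        if area < 10:                    # assumed noise threshold
            continue
        # (min x, min y, max x, max y) of all pixels sharing this label
        boxes.append((int(x), int(y), int(x + w), int(y + h)))
    return boxes
```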
Optionally, the key features include size, shape, brightness distribution, and color uniformity.
Optionally, traversing the independent component images to extract key features of the light emitting diode includes:
Calculating the size characteristics of the light emitting diode elements and calculating the shape factors according to the coordinates of the pixel group boundary boxes;
calculating a gray level histogram of the LED element image to obtain brightness distribution characteristics;
The uniformity of the gray level histogram of the LED element image is analyzed to extract the color uniformity characteristics.
Optionally, calculating the shape factor includes:
acquiring the total number of pixels in the outline of the light-emitting diode element and the pixel length of the outline of the light-emitting diode element according to the size characteristics of the light-emitting diode element, and calculating the circularity according to the total number of pixels and the pixel length;
Acquiring the circumscribed rectangle with the smallest area by rotating the outline of the light emitting diode element, and calculating the eccentricity according to this minimum-area circumscribed rectangle;
the inherent moment is determined by calculating the spatial distribution of the light emitting diode elements.
It should be noted that this system corresponds to the method described above, and all implementations in the above method embodiment are applicable to this embodiment, so that the same technical effects can be achieved.
Embodiments of the present invention also provide a computing device comprising: a processor, a memory storing a computer program which, when executed by the processor, performs the method as described above. All the implementation manners in the method embodiment are applicable to the embodiment, and the same technical effect can be achieved.
Embodiments of the present invention also provide a computer-readable storage medium storing instructions that, when executed on a computer, cause the computer to perform a method as described above. All the implementation manners in the method embodiment are applicable to the embodiment, and the same technical effect can be achieved.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a usb disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk, etc.
Furthermore, it should be noted that in the apparatus and method of the present invention, it is apparent that the components or steps may be disassembled and/or assembled. Such decomposition and/or recombination should be considered as equivalent aspects of the present invention. Also, the steps of performing the series of processes described above may naturally be performed in chronological order in the order of description, but are not necessarily performed in chronological order, and some steps may be performed in parallel or independently of each other. It will be appreciated by those of ordinary skill in the art that all or any of the steps or components of the methods and apparatus of the present invention may be implemented in hardware, firmware, software, or a combination thereof in any computing device (including processors, storage media, etc.) or network of computing devices, as would be apparent to one of ordinary skill in the art after reading this description of the invention.
The object of the invention can thus also be achieved by running a program or a set of programs on any computing device. The computing device may be a well-known general purpose device. The object of the invention can thus also be achieved by merely providing a program product containing program code for implementing said method or apparatus. That is, such a program product also constitutes the present invention, and a storage medium storing such a program product also constitutes the present invention. It is apparent that the storage medium may be any known storage medium or any storage medium developed in the future.
While the foregoing is directed to the preferred embodiments of the present invention, it will be appreciated by those skilled in the art that various modifications and adaptations can be made without departing from the principles of the present invention, and such modifications and adaptations are intended to be comprehended within the scope of the present invention.
Claims (10)
1. A method for detecting quality of a semiconductor light emitting element, the method comprising:
Acquiring an image of the light emitting diode, and preprocessing the image to obtain a preprocessed image;
in the preprocessed image, preliminarily identifying a light emitting diode region and a background region by using color and brightness information;
calculating a dynamic correlation factor according to the identified light emitting diode region and the background region, wherein the dynamic correlation factor is used for quantifying the characteristic difference between the light emitting diode and the background;
image segmentation is carried out on the preprocessed image according to the calculated dynamic correlation factor so as to extract the light-emitting diode region and obtain an independent element image;
Traversing the independent element images, and extracting key characteristics of the light emitting diode;
According to a preset quality standard and a dynamic correlation factor, the key features are respectively compared with the quality standard to obtain a quality judgment result;
And classifying the light emitting diodes with quality problems into different quality problem types according to the quality judgment result.
2. The method of claim 1, wherein calculating a dynamic correlation factor based on the identified led region and the background region, comprises:
respectively extracting color features and brightness features corresponding to the light-emitting diode region and the background region according to the identified light-emitting diode region and the background region;
Calculating the characteristic difference between the color characteristic and the brightness characteristic of the light-emitting diode region and the background region;
And calculating a specific numerical value of the dynamic correlation factor according to the characteristic difference.
3. The method for detecting the quality of a semiconductor light emitting element according to claim 2, wherein calculating the dynamic correlation factor based on the characteristic difference comprises:
The dynamic correlation factor D is calculated by

D = ω_1 · Σ_{i=1}^{n} |H_led^c(i) − H_bg^c(i)| + ω_2 · Σ_{i=1}^{m} |H_led^g(i) − H_bg^g(i)| + ω_3 · ( |A_led ∩ A_bg| / |A_led ∪ A_bg| ),

wherein A_led and A_bg represent the areas of the light emitting diode region and the background region respectively, and |A_led ∩ A_bg| / |A_led ∪ A_bg| calculates the degree of overlap between the light emitting diode region and the background region; ω_1, ω_2 and ω_3 are the weight factors of the color feature difference, the brightness feature difference and the shape feature difference respectively; H_led^c and H_bg^c represent the color histograms of the light emitting diode region and the background region respectively, n being the number of bins of the histogram; H_led^g and H_bg^g represent the gray level histograms of the light emitting diode region and the background region respectively, m being the number of bins of the histogram; H_led^c(i) represents the number of pixels of the light emitting diode region in the i-th bin of the color histogram, H_bg^c(i) represents the number of pixels of the background region in the i-th bin of the color histogram, H_led^g(i) represents the number of pixels of the light emitting diode region in the i-th bin of the gray histogram, and H_bg^g(i) represents the number of pixels of the background region in the i-th bin of the gray histogram; ∩ represents the intersection operation used to compute the elements common to both sets, |A_led ∩ A_bg| representing the overlapping area of the light emitting diode region and the background region; ∪ represents the union operation used to merge all elements of the two sets while removing duplicate parts, |A_led ∪ A_bg| representing the combined total area of the light emitting diode region and the background region; i represents the bin index in the histogram.
4. A method of testing the quality of a semiconductor light emitting device according to claim 3, wherein image segmentation of the preprocessed image to extract light emitting diode regions based on the calculated dynamic correlation factor, comprises:
Identifying areas of the light emitting diode and the background in the image according to the dynamic correlation factor D;
Dividing the light-emitting diode region from the preprocessed image according to the light-emitting diode and the background region in the image;
Converting the light emitting diode region into a binary image;
traversing each pixel of the binary image, and identifying pixel groups connected together, wherein the pixel groups represent independent objects in the binary image, the independent objects are independent light-emitting diode elements, and in the identification process, all pixels belonging to the same pixel group are assigned the same label;
For each identified pixel group, a pixel group bounding box is calculated by acquiring the minimum and maximum lateral and longitudinal coordinates of all pixels within the pixel group to acquire an independent component image.
5. The method of claim 4, wherein the key features include size, shape, brightness distribution, and color uniformity.
6. The method of claim 5, wherein traversing the individual component images to extract key features of the light emitting diode comprises:
Calculating the size characteristics of the light emitting diode elements and calculating the shape factors according to the coordinates of the pixel group boundary boxes;
calculating a gray level histogram of the LED element image to obtain brightness distribution characteristics;
The uniformity of the gray level histogram of the LED element image is analyzed to extract the color uniformity characteristics.
7. The method for detecting the quality of a semiconductor light emitting element according to claim 6, wherein calculating the shape factor comprises:
acquiring the total number of pixels in the outline of the light-emitting diode element and the pixel length of the outline of the light-emitting diode element according to the size characteristics of the light-emitting diode element, and calculating the circularity according to the total number of pixels and the pixel length;
Acquiring the circumscribed rectangle with the smallest area by rotating the outline of the light emitting diode element, and calculating the eccentricity according to this minimum-area circumscribed rectangle;
the inherent moment is determined by calculating the spatial distribution of the light emitting diode elements.
8. A system for detecting quality of a semiconductor light emitting element, comprising:
The acquisition module is used for acquiring the image of the light emitting diode and preprocessing the image to obtain a preprocessed image; in the preprocessed image, preliminarily identifying a light emitting diode region and a background region by using color and brightness information; and calculating a dynamic correlation factor according to the identified light emitting diode region and the background region, wherein the dynamic correlation factor is used for quantifying the feature difference between the light emitting diode and the background;
The processing module is used for carrying out image segmentation on the preprocessed image according to the calculated dynamic correlation factor so as to extract the light emitting diode region and obtain independent element images; traversing the independent element images and extracting the key features of the light emitting diode; comparing the key features with the quality standard respectively according to a preset quality standard and the dynamic correlation factor to obtain a quality judgment result; and classifying the light emitting diodes with quality problems into different quality problem types according to the quality judgment result.
9. A computing device, comprising:
one or more processors;
Storage means for storing one or more programs which when executed by the one or more processors cause the one or more processors to implement the method of any of claims 1-7.
10. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a program which, when executed by a processor, implements the method according to any of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410451398.4A CN118097305B (en) | 2024-04-16 | 2024-04-16 | Method and system for detecting quality of semiconductor light-emitting element |
Publications (2)
Publication Number | Publication Date |
---|---|
CN118097305A (en) | 2024-05-28
CN118097305B (en) | 2024-06-28
Family
ID=91154957
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410451398.4A Active CN118097305B (en) | 2024-04-16 | 2024-04-16 | Method and system for detecting quality of semiconductor light-emitting element |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN118097305B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118365697A (en) * | 2024-06-19 | 2024-07-19 | 深圳明锐理想科技股份有限公司 | Wire diameter detection method, electronic equipment and storage medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010055414A1 (en) * | 2000-04-14 | 2001-12-27 | Ico Thieme | System and method for digitally editing a composite image, e.g. a card with the face of a user inserted therein and for surveillance purposes |
CN110490847A (en) * | 2019-07-31 | 2019-11-22 | 浙江大学山东工业技术研究院 | The LED chip quality determining method of view-based access control model |
CN114399502A (en) * | 2022-03-24 | 2022-04-26 | 视睿(杭州)信息科技有限公司 | Appearance defect detection method and system suitable for LED chip and storage medium |
CN115205194A (en) * | 2022-04-20 | 2022-10-18 | 浙江托普云农科技股份有限公司 | Method, system and device for detecting coverage rate of sticky trap based on image processing |
CN116448768A (en) * | 2023-04-25 | 2023-07-18 | 南京玛诺泰克自动化设备有限公司 | Embedded online machine vision detection method |
WO2023134792A2 (en) * | 2022-12-15 | 2023-07-20 | 苏州迈创信息技术有限公司 | Led lamp wick defect detection method |
CN116758045A (en) * | 2023-07-05 | 2023-09-15 | 日照鲁光电子科技有限公司 | Surface defect detection method and system for semiconductor light-emitting diode |
Non-Patent Citations (2)
Title |
---|
Hu Zhengdong; Chen Xiaozhu; Ding Ning: "Target search algorithm based on regional edge histograms", Journal of China Jiliang University, no. 02, 15 June 2016 (2016-06-15), pages 91 - 96 *
Zheng Xifeng; Song Xinli; Liu Guihua; Zhang Xin; Hao Yaru: "A method for extracting luminance feature data of LED display screens based on regional maximum values", Chinese Journal of Liquid Crystals and Displays, no. 04, 15 August 2008 (2008-08-15), pages 59 - 63 *
Also Published As
Publication number | Publication date |
---|---|
CN118097305B (en) | 2024-06-28 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| GR01 | Patent grant |