
CN112560538B - Method for quickly positioning damaged QR (quick response) code according to image redundant information - Google Patents

Method for quickly positioning damaged QR (quick response) code according to image redundant information

Info

Publication number
CN112560538B
CN112560538B CN202110213441.XA CN202110213441A
Authority
CN
China
Prior art keywords
edge
code
pixel
point
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110213441.XA
Other languages
Chinese (zh)
Other versions
CN112560538A (en)
Inventor
吴俊斌
顾善中
田晓明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Seuic Technologies Co Ltd
Original Assignee
Jiangsu Seuic Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu Seuic Technology Co ltd filed Critical Jiangsu Seuic Technology Co ltd
Priority to CN202110213441.XA priority Critical patent/CN112560538B/en
Publication of CN112560538A publication Critical patent/CN112560538A/en
Application granted granted Critical
Publication of CN112560538B publication Critical patent/CN112560538B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 Methods or arrangements for sensing record carriers by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14 Methods or arrangements for sensing record carriers by electromagnetic radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404 Methods for optical code recognition
    • G06K7/1408 Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K7/1417 2D bar codes
    • G06K7/1439 Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K7/1443 Methods for optical code recognition including a method step for retrieval of the optical code locating of the code in an image
    • G06K7/1452 Methods for optical code recognition including a method step for retrieval of the optical code detecting bar code edges
    • G06K7/146 Methods for optical code recognition the method including quality enhancement steps
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/02 Affine transformations
    • G06T5/00 Image enhancement or restoration
    • G06T5/20 Image enhancement or restoration using local operators
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10008 Still image; Photographic image from scanner, fax or copier
    • G06T2207/20 Special algorithmic details
    • G06T2207/20024 Filtering details

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Electromagnetism (AREA)
  • General Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Artificial Intelligence (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a method for quickly positioning a damaged QR code according to redundant image information, which comprises the following steps: performing grayscale processing on a picture containing the QR code; evaluating the sharpness of the grayscale picture; performing filtering and edge detection on the grayscale picture; determining the three positioning patterns of the QR code from the width ratios of the stripes; performing adaptive binarization on the QR code region in combination with the size of the positioning patterns; finding the four edges of the QR code through affine transformation, and finally performing positioning correction with a homography transformation. This method for rapidly positioning a damaged QR code makes full use of redundant information in the image, reduces the number of image traversals as far as possible, and improves the algorithm speed; adaptive-threshold edge detection and adaptive-threshold binarization improve the edge-detection and binarization results; and unclear images are filtered out, which avoids excessive waste of computing resources.

Description

Method for quickly positioning damaged QR (quick response) code according to image redundant information
Technical Field
The invention relates to a method for quickly positioning a damaged QR code, in particular to a method for quickly positioning a damaged QR code according to image redundant information.
Background
The QR code (Quick Response code) is a two-dimensional code with advantages such as high information-storage density and strong fault tolerance, and its prominent positioning patterns allow fast localization, so it is widely used.
With growing informatization, every industry aims to reduce labor costs and raise its level of automation, and the simple QR code label, cheap to produce and fast to read, has become a first choice across industries. By attaching QR code labels and scanning them on a factory production line, the entire production and assembly history of each part can be recorded automatically, enabling rapid product tracing and yield analysis; in logistics, QR codes enable fully automatic parcel sorting and reduce the cost of manual labor; in retail, they can replace bar codes for identifying goods; and in mobile payment, scan-to-pay with QR codes is ubiquitous. The QR code has thus become an important gateway for quickly obtaining information about items or people.
As QR codes spread into these industries, higher demands are placed on the reading capability of code-scanning equipment. The reading process is easily disturbed by many factors: contamination of the QR code, low contrast, uneven illumination, motion blur of the device, defocus blur and various kinds of noise all degrade the positioning performance of the scanning algorithm, and improving positioning quality usually comes at the cost of speed, a contradictory choice for industrial sites that have poor lighting yet want faster production. How to improve the positioning of damaged QR codes while preserving positioning speed is therefore an urgent problem.
Traditional image processing algorithms depend heavily on the positioning patterns of the QR code and can only be tuned for specific scenes, so they fail to position correctly in complex environments. Typical cases are: (1) QR codes captured under insufficient illumination are dark and low in contrast; (2) QR codes on non-planar or severely bent surfaces cannot be positioned effectively; (3) dot-matrix QR codes printed on metal usually have foreground and background of the same color; (4) QR codes captured from a moving platform tend to suffer motion blur; (5) when a positioning pattern is badly stained, effective positioning is impossible; (6) a defocused two-dimensional code exhibits defocus blur. It is extremely difficult for traditional algorithms to enumerate all scenes, so an algorithm tuned for one scene fails to position in others. Existing deep-learning algorithms can locate the QR code region effectively after extensive training and do not rely on the positioning patterns, but their positioning quality depends on model size: larger models localize better, yet run slowly because of the large number of convolution and floating-point operations they must perform. Deep learning also relies on hardware with parallel computing such as GPUs, which further slows positioning on code-scanning devices with dated hardware, where the algorithm may not even be portable.
Disclosure of Invention
The purpose of the invention is as follows: to provide a method for quickly positioning a damaged QR code according to redundant image information, which makes full use of the redundant image information, continuously adjusts thresholds for edge detection and binarization, performs neighborhood anomaly filtering on that basis, and finally achieves fast positioning of the QR code.
The technical scheme is as follows: the invention discloses a method for quickly positioning a damaged QR code according to redundant image information, comprising the following steps:
step 1, performing grayscale processing on a picture containing a QR code to obtain a grayscale picture;
step 2, evaluating the sharpness of the grayscale picture through random sampling;
step 3, performing filtering and edge detection on the grayscale picture with an adaptive threshold;
step 4, determining stripe widths from the detected edges, and determining the three positioning patterns of the QR code from the width ratios of the stripes;
step 5, determining the QR code region from the three positioning patterns, performing edge detection on the QR code region again in combination with the module width s of the positioning patterns, and binarizing the edges;
step 6, calculating an affine transformation matrix from the three positioning patterns, performing affine transformation on the grayscale image, finding the four edge lines of the QR code from the binarization result, intersecting the edge lines to obtain the four corner points of the QR code and thereby a homography transformation matrix, and finally performing positioning correction through the homography transformation.
Further, in step 2, the specific steps of evaluating the sharpness of the grayscale picture by random sampling are as follows:
step 2.1, use the ISAAC pseudo-random number generation algorithm to generate a sequence of random numbers satisfying a normal distribution, the amount being determined by the width pixel W and height pixel H of the grayscale image;
step 2.2, each pair of adjacent random numbers determines the coordinate (x, y) of one pixel point, so half as many pixel points as random numbers are determined, and their coordinates are taken as the coordinates of the sampling points;
step 2.3, sample the grayscale picture at the coordinates of each sampling point, taking the average of the pixel values of the sampling point and its nine neighboring points as the pixel value of the sampling point;
step 2.4, collect the pixel values of all sampling points and compute their mean, variance and histogram distribution;
step 2.5, compare the variance with a sharpness threshold: if the variance is greater than or equal to the sharpness threshold, the sharpness meets the requirement and the method proceeds directly to step 3; if the variance is less than the sharpness threshold, the sharpness is low, the histogram distribution result is returned, and the method proceeds to step 2.6;
step 2.6, calculate a foreground pixel value and a background pixel value from the histogram distribution, adjust the flash brightness of the code-scanning device according to the foreground and background pixel values, start the code-scanning device to re-acquire a picture containing the QR code, and return to step 1; if the code-scanning device cannot be started, proceed directly to step 3.
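The random-sampling sharpness check of step 2 can be sketched in Python as below; this is an illustrative approximation, not the patented implementation: `random.Random` stands in for the ISAAC generator, a 3x3 window stands in for the nine-point neighborhood, and the sample count, seed and image sizes are arbitrary choices.

```python
import random
import statistics

def evaluate_sharpness(img, w, h, n_samples=500, seed=42):
    """Sketch of step 2: sample random pixels, average each with its
    neighborhood, then use the variance of the samples as a proxy for
    contrast/sharpness."""
    rng = random.Random(seed)  # stand-in for the ISAAC PRNG
    values = []
    for _ in range(n_samples):
        # normally distributed coordinates, clamped inside the image
        x = min(max(int(rng.gauss(w / 2, w / 4)), 1), w - 2)
        y = min(max(int(rng.gauss(h / 2, h / 4)), 1), h - 2)
        # average the sample point with its eight neighbours (3x3 window)
        neighborhood = [img[y + dy][x + dx] for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
        values.append(sum(neighborhood) / 9.0)
    mean = statistics.fmean(values)
    var = statistics.pvariance(values)
    hist = [0] * 256
    for v in values:
        hist[int(v)] += 1
    return mean, var, hist

# a flat grey image has zero variance; a checkerboard has high variance
flat = [[128] * 64 for _ in range(64)]
board = [[0 if ((x // 8 + y // 8) % 2) else 255 for x in range(64)] for y in range(64)]
_, var_flat, _ = evaluate_sharpness(flat, 64, 64)
_, var_board, _ = evaluate_sharpness(board, 64, 64)
```

On the flat image the variance collapses to zero, so the picture would be rejected as unclear, while the checkerboard easily clears any reasonable sharpness threshold.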
Further, in step 2.6, the specific steps of calculating the foreground pixel value and the background pixel value from the histogram distribution are as follows:
step 2.6.1, calculate the count gradient at the current pixel value:

grad(g) = num(g) − num(g − 1)

where grad(g) denotes the count gradient at pixel value g, num(g) denotes the number of sampling points with pixel value g, and the gradient expresses the relationship between the sampling-point counts num(g) and num(g − 1);
step 2.6.2, perform extreme-point judgment: if grad(g + 1) and grad(g) have opposite signs, or grad(g) is 0, pixel value g is determined to be a suspected extreme point of the histogram distribution, and the total number of sampling points at pixel value g and its two adjacent pixel values is counted:

sum(g) = num(g − 1) + num(g) + num(g + 1)

where sum(g) is the total number of sampling points around pixel value g and num(g) is the number of sampling points at the current pixel value g;
step 2.6.3, compare sum(g) over all suspected extreme points: the pixel value of the suspected extreme point with the largest sum(g) is taken as the background pixel value, and the pixel value of the suspected extreme point with the second-largest sum(g) is taken as the foreground pixel value.
Further, in step 2.6, when the flash brightness of the code-scanning device is adjusted according to the foreground pixel value and the background pixel value: if both values are less than 127, the flash brightness is raised by 10% relative to the previous brightness; if both values are greater than or equal to 127, the flash brightness is lowered by 10% relative to the previous brightness.
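The peak search of step 2.6 can be sketched as below. The gradient and weighting formulas here are reconstructed from the prose (the original equations appear only as images in the source), so the exact expressions are assumptions; the 10% flash rule follows the text directly.

```python
def foreground_background(hist):
    """Sketch of step 2.6: mark pixel value g as a suspected extreme
    point when the count gradient changes sign (or is zero), weight each
    extreme point by the sample count at g and its two neighbours, and
    return the two heaviest peaks as (foreground, background)."""
    grad = [hist[g] - hist[g - 1] for g in range(1, 256)]  # grad[g-1] = gradient at g
    peaks = []
    for g in range(1, 255):
        if grad[g - 1] == 0 or grad[g - 1] * grad[g] < 0:
            weight = hist[g - 1] + hist[g] + hist[g + 1]
            peaks.append((weight, g))
    peaks.sort(reverse=True)
    background = peaks[0][1]   # heaviest peak: background
    foreground = peaks[1][1]   # second heaviest: foreground
    return foreground, background

def adjust_flash(brightness, fg, bg):
    """Step 2.6 adjustment rule: both values dark -> raise flash 10%,
    both bright -> lower it 10%, otherwise leave it unchanged."""
    if fg < 127 and bg < 127:
        return brightness * 1.10
    if fg >= 127 and bg >= 127:
        return brightness * 0.90
    return brightness

# synthetic bimodal histogram: dark modules around 40, bright paper around 200
hist = [0] * 256
hist[39], hist[40], hist[41] = 100, 300, 100
hist[199], hist[200], hist[201] = 200, 800, 200
fg, bg = foreground_background(hist)
```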
Further, in step 3, the specific steps of filtering the grayscale image and detecting edges with an adaptive threshold are as follows:
step 3.1, scan the grayscale picture in a reciprocating manner row by row or column by column, obtain the histogram distribution of the current scan row or scan column from the pixel values of all its pixel points, and from it calculate the foreground pixel value F and background pixel value B of the current scan row or scan column, thus constructing an initial edge threshold:

T_cur = MAX(ABS(F − B), T_min)

where ABS is the absolute-value function, T_min is the minimum edge threshold and T_cur is the current edge threshold;
step 3.2, filter each pixel point in the scan row or scan column one by one with an exponential moving average, the filtering satisfying:

f_cur = a·g + (1 − a)·f_prev

where f_cur is the filtered value of the current pixel point, f_prev is the filtered value of the previous pixel point, g is the pixel value of the current pixel point, and a is a set constant in the range 0 to 1;
step 3.3, judge whether the current pixel point is an edge using the Laplace operator; if its edge response does not exceed the current edge threshold T_cur it is screened out as a weak edge, otherwise the coordinate of the current pixel point is taken as an edge value e, stored in the edge set E_cur of the current scan row or scan column, and the current edge threshold T_cur is updated;
step 3.4, narrow the current edge threshold T_cur and return to step 3.2 to scan the next pixel point, until all pixel points of the scan row or scan column have been scanned, thus obtaining the edge set of the current scan row or scan column:

E_cur = {e_1, e_2, …, e_n}

where E_cur is the edge set of the current scan row or scan column and e_n is the n-th edge value in the c-th scan row or scan column;
step 3.5, repeat steps 3.1 to 3.4 until the edge set of every row or every column of the whole grayscale image has been obtained.
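A single-row sketch of the adaptive filtering and edge detection of step 3. Several details are assumptions where the source shows formulas only as images: the initial threshold MAX(ABS(F − B), T_min), the use of a simple second difference in place of the Laplace operator, the 0.95 shrink factor, and the approximation of the foreground/background values by the row's extremes.

```python
def scan_line_edges(row, t_min=8, alpha=0.6):
    """Sketch of step 3 on one scan row: exponential-moving-average
    filtering combined with a shrinking adaptive edge threshold."""
    fg, bg = min(row), max(row)                 # crude foreground/background
    threshold = max(abs(bg - fg), t_min)        # initial edge threshold
    edges = []
    filtered = float(row[0])
    for x in range(1, len(row)):
        # EMA filter: f_cur = alpha * g + (1 - alpha) * f_prev
        cur = alpha * row[x] + (1 - alpha) * filtered
        # 1-D stand-in for the Laplacian: jump between filtered values
        if abs(cur - filtered) * 2 >= threshold:
            edges.append(x)
            # keep the adaptive threshold tracking the latest edge strength
            threshold = max(abs(cur - filtered) * 2, t_min)
        else:
            # step 3.4: narrow the threshold so weaker edges can still qualify
            threshold = max(threshold * 0.95, t_min)
        filtered = cur
    return edges
```

On a synthetic row with one black-to-white step, exactly one edge is reported at the transition, and a uniform row yields no edges at all.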
Further, in step 4, the specific steps of determining the three positioning patterns of the QR code from the width ratios of the stripes are as follows:
step 4.1, unify the edge values obtained by the reciprocating scanning into edge values in the same direction: with c the number of the current row or column, if c is odd the edge value e is kept, and if c is even the edge value is mirrored to L − e, where L is the length of the current row or column; for horizontal scanning L equals the grayscale image width W, and for vertical scanning L equals the grayscale image height H;
step 4.2, each pair of adjacent edges bounds a stripe; calculate the width of each stripe:

w_i = e_{i+1} − e_i

where w_i is the width of the i-th stripe;
step 4.3, search for five adjacent stripes whose widths conform to the 1:1:3:1:1 ratio of a positioning pattern, and record them in the matching stripe set of the current row or column:

P_cur = {p_0, p_1, …}

where each matching stripe p comprises six edge values and is a subset of the edge set E_cur of the current scan row or column;
step 4.4, filter the edge set E_cur of the current row or column with each matching stripe p of the matching stripe set P_last of the previous row or column, and likewise filter the edge set E_next of the next row or column with each matching stripe p of the matching stripe set P_cur of the current row or column, searching out matching stripes that were missed due to image noise and recording them in the matching stripe set of the corresponding row or column;
step 4.5, cluster the found stripes: when the row spacing and column spacing of adjacent edges are both less than or equal to a threshold t, the stripes are considered the same stripe, thus forming wider stripes;
step 4.6, cross-match each transverse stripe and each longitudinal stripe obtained after clustering to finally determine the positioning patterns, and filter out stripes that do not cross;
step 4.7, calculate the center point o, the edge point set V = {v_0, v_1, v_2, …, v_n} and the module width s of each positioning pattern, s being determined from the distances d(o, v_i) (a positioning pattern spans seven modules, so each center-to-edge distance covers 3.5 module widths), where d(o, v_i) denotes the transverse distance between an edge point v_i of a transverse stripe and the center point o, or the longitudinal distance between an edge point v_i of a longitudinal stripe and the center point o; if the center point o has coordinates (x_o, y_o) and the edge point v_i has coordinates (x_i, y_i), the transverse distance is ABS(x_o − x_i) and the longitudinal distance is ABS(y_o − y_i).
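Steps 4.2 and 4.3 can be sketched as below. The 1:1:3:1:1 width ratio is the standard proportion produced by a scan line crossing a QR finder (positioning) pattern; the 0.5-module tolerance is an assumed value.

```python
def stripe_widths(edges):
    """Step 4.2: adjacent edges bound a stripe; w_i = e_{i+1} - e_i."""
    return [b - a for a, b in zip(edges, edges[1:])]

def is_finder_candidate(widths, tolerance=0.5):
    """Sketch of step 4.3: five adjacent stripe widths should sit in the
    ratio 1:1:3:1:1. Each width is normalised by the estimated module
    size and compared against the ideal ratio within a tolerance."""
    if len(widths) != 5:
        return False
    module = sum(widths) / 7.0  # the pattern spans 7 modules in total
    if module <= 0:
        return False
    ideal = (1, 1, 3, 1, 1)
    return all(abs(w / module - r) <= tolerance for w, r in zip(widths, ideal))

edges = [10, 14, 18, 30, 34, 38]   # six edge values -> five stripes
widths = stripe_widths(edges)      # [4, 4, 12, 4, 4], i.e. 1:1:3:1:1
```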
Further, in step 4.4, the specific steps of the filtering are as follows:
step 4.4.1, search the edge set E_next of the next row or column with each matching stripe p of P_cur, initializing the index into the six edge values of p and the index into E_next to their starting positions;
step 4.4.2, threshold judgment: if the current edge value e of E_next and the current edge value e' of the matching stripe p satisfy ABS(e − e') ≤ t for a threshold t, then e and e' have a corresponding relationship and e is recorded;
step 4.4.3, if the threshold judgment of step 4.4.2 is satisfied, advance to the next edge value of the matching stripe p; otherwise advance to the next edge value of E_next; return to step 4.4.2 until all edge values of p or all elements of E_next have been traversed;
step 4.4.4, if every edge value e' of p has found a corresponding edge value e in E_next, a matching stripe that was missed due to image noise has been found in the edge set of that row or column, and it is recorded in the matching stripe set of that row or column; return to step 4.4.1 and search with the next matching stripe until all matching stripes of P_cur have been traversed.
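The correspondence search of step 4.4 can be sketched as a two-index walk over the adjacent row's edge set; the exact index handling is reconstructed from the prose, so treat it as an assumption.

```python
def find_corresponding_stripe(stripe_edges, next_row_edges, t=2):
    """Sketch of step 4.4: a matching stripe from the current row is a
    tuple of six edge values; walk the (sorted) edge set of the adjacent
    row and accept the stripe there only if every edge value finds a
    partner within threshold t."""
    found = []
    j = 0
    for e in stripe_edges:
        while j < len(next_row_edges) and next_row_edges[j] < e - t:
            j += 1                 # skip edges that are too far left
        if j < len(next_row_edges) and abs(next_row_edges[j] - e) <= t:
            found.append(next_row_edges[j])
            j += 1                 # each edge may be matched once
        else:
            return None            # some edge has no counterpart
    return tuple(found)

current = (10, 14, 18, 30, 34, 38)
# the adjacent row is noisy: edges shifted by 1 pixel plus one spurious edge
next_row = [9, 13, 19, 25, 29, 33, 39]
```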
Further, the specific steps of step 5 are:
step 5.1, pair the positioning patterns according to the pairing rule: the difference in module width s is smaller than the threshold t, the colors are the same, and the patterns are not collinear;
step 5.2, determine the QR code region from the three paired positioning patterns;
step 5.3, perform edge detection on the QR code region to identify the black and white intervals, forming a preliminary binarization;
step 5.4, calculate a binarization threshold for each pixel point of the QR code region by averaging over its neighborhood, the neighborhood being a square region whose side length is the module width s, and then binarize the QR code region according to this threshold;
step 5.5, compare the binarization of step 5.3 with the binarization of step 5.4: if a pixel is binarized to 0 by both, it is updated to 0; if it is binarized to 255 by both, it is updated to 255; otherwise one of the following three cases applies:
case 1, if the pixel points to the left of and above the current pixel point are both binarized to 0, the current pixel point is binarized to 0;
case 2, if the pixel points to the left of and above the current pixel point are both binarized to 255, the current pixel point is binarized to 255;
case 3, if the binarization results of the left and upper pixel points differ, the average value of the two pixel points is used as the pixel threshold: if the pixel value of the current pixel point is greater than the pixel threshold it is binarized to 255, otherwise to 0.
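The fusion rules of step 5.5 can be sketched as below. Two details are assumptions: border pixels (which have no left or upper neighbour) simply take the threshold result, and the "average value of the two pixel points" in case 3 is read as the average of the neighbours' grey values.

```python
def fuse_binarization(edge_bin, thresh_bin, gray):
    """Sketch of step 5.5: combine the edge-based binarization (step 5.3)
    with the neighbourhood-threshold binarization (step 5.4). Pixels the
    two methods agree on are kept; disagreements are resolved from the
    already-fused left and upper neighbours, falling back to comparing
    the grey value against the mean of those neighbours' grey values."""
    h, w = len(gray), len(gray[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            a, b = edge_bin[y][x], thresh_bin[y][x]
            if a == b:
                out[y][x] = a                       # both methods agree
            elif y == 0 or x == 0:
                out[y][x] = b                       # border: assumed fallback
            else:
                left, up = out[y][x - 1], out[y - 1][x]
                if left == up:
                    out[y][x] = left                # cases 1 and 2
                else:
                    # case 3: threshold on the mean of the neighbours' greys
                    pix_t = (gray[y][x - 1] + gray[y - 1][x]) / 2.0
                    out[y][x] = 255 if gray[y][x] > pix_t else 0
    return out
```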
Further, the specific steps of step 6 are:
step 6.1, let the coordinates of the center points of the three positioning patterns of the standard QR code be (x'_0, y'_0), (x'_1, y'_1) and (x'_2, y'_2), and let the coordinates of the center points of the three positioning patterns of the current QR code be (x_0, y_0), (x_1, y_1) and (x_2, y_2); the three positioning patterns of the current QR code then satisfy the affine transformation

x' = a·x + b·y + e, y' = c·x + d·y + f

where a, b, c, d, e and f are the affine transformation coefficients and (x', y') is the coordinate to which (x, y) is projected in the affine domain; substituting the three pairs of center-point coordinates into the formula yields the affine transformation coefficients;
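Because an affine transform has six coefficients and each of the three centre correspondences contributes two equations, the coefficients of step 6.1 are fixed exactly. A minimal sketch using Cramer's rule on the two independent 3x3 systems:

```python
def solve_affine(src, dst):
    """Sketch of step 6.1: three point correspondences (x, y) -> (x', y')
    determine the six affine coefficients a..f in
    x' = a*x + b*y + e, y' = c*x + d*y + f."""
    (x0, y0), (x1, y1), (x2, y2) = src
    det = x0 * (y1 - y2) - y0 * (x1 - x2) + (x1 * y2 - x2 * y1)
    if det == 0:
        raise ValueError("source points are collinear")

    def solve_row(v0, v1, v2):
        # Cramer's rule for (p, q, r) with v = p*x + q*y + r
        p = (v0 * (y1 - y2) - y0 * (v1 - v2) + (v1 * y2 - v2 * y1)) / det
        q = (x0 * (v1 - v2) - v0 * (x1 - x2) + (x1 * v2 - x2 * v1)) / det
        r = (x0 * (y1 * v2 - y2 * v1) - y0 * (x1 * v2 - x2 * v1)
             + v0 * (x1 * y2 - x2 * y1)) / det
        return p, q, r

    a, b, e = solve_row(dst[0][0], dst[1][0], dst[2][0])
    c, d, f = solve_row(dst[0][1], dst[1][1], dst[2][1])
    return a, b, c, d, e, f

def apply_affine(coeffs, pt):
    a, b, c, d, e, f = coeffs
    x, y = pt
    return a * x + b * y + e, c * x + d * y + f

# the three positioning-pattern centres fix the transform exactly
coeffs = solve_affine([(0, 0), (1, 0), (0, 1)], [(10, 20), (12, 21), (11, 23)])
```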
step 6.2, map each edge point coordinate of the edge point set V into the affine domain through this affine transformation, determine the position of each edge point on the standard QR code by comparing coordinates with the standard QR code, find the edge points that belong to the upper edge line of the standard QR code, and perform a least-squares line fit to form the upper edge line;
step 6.3, obtain the lower, left and right edge lines in the same way as step 6.2;
step 6.4, sample black edge points of the QR code in the binary image of step 5 along the upper, lower, left and right edge lines, and perform the least-squares line fit again with the newly added sampling points, making the line fit more accurate;
step 6.5, the upper, lower, left and right edge lines intersect at the four corner points q_0, q_1, q_2 and q_3 of the QR code; let the correspondences of these corner points in the homography transform domain be q'_0, q'_1, q'_2 and q'_3, satisfying the homography transformation

[x', y', w]ᵀ = H · [x, y, 1]ᵀ

where H is the 3×3 matrix of homography transformation coefficients and (x'/w, y'/w) is the coordinate to which (x, y) is projected in the homography domain; the homography transformation matrix can be obtained from these four points and their correspondences, completing the positioning correction of the QR code.
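The homography of step 6.5 has eight unknowns once h33 is fixed to 1, and each of the four corner correspondences contributes two linear equations. A minimal sketch of this standard direct solution (plain Gaussian elimination, no external libraries):

```python
def solve_homography(src, dst):
    """Sketch of step 6.5: four corner correspondences determine the
    eight unknowns of a homography H (h33 fixed to 1). Each pair
    (x, y) -> (u, v) contributes two linear equations; the 8x8 system
    is solved by Gaussian elimination with partial pivoting."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    n = 8
    for col in range(n):                      # forward elimination
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        b[col], b[pivot] = b[pivot], b[col]
        for r in range(col + 1, n):
            factor = A[r][col] / A[col][col]
            for k in range(col, n):
                A[r][k] -= factor * A[col][k]
            b[r] -= factor * b[col]
    h = [0.0] * n
    for r in range(n - 1, -1, -1):            # back substitution
        h[r] = (b[r] - sum(A[r][k] * h[k] for k in range(r + 1, n))) / A[r][r]
    h.append(1.0)                             # h33 = 1
    return [h[0:3], h[3:6], h[6:9]]

def apply_homography(H, pt):
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```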
Compared with the prior art, the invention has the following beneficial effects: by making full use of redundant information in the image, the number of image traversals is reduced as far as possible and the algorithm speed is improved; adaptive-threshold edge detection and adaptive-threshold binarization improve the edge-detection and binarization results; and when an image is passed in, a simple image analysis is performed first and unclear images are filtered out, avoiding excessive waste of computing resources.
Drawings
FIG. 1 is a flow chart of a method of the present invention;
FIG. 2 is a schematic diagram of a horizontal positioning pattern stripe of a QR code according to the present invention;
FIG. 3 is a schematic diagram illustrating center and edge point set identification of a QR code positioning pattern according to the present invention;
FIG. 4 is a schematic diagram of the QR code affine transformation of the present invention;
FIG. 5 is a schematic diagram of the edge line fitting of the QR code of the present invention;
FIG. 6 is a schematic diagram of QR code edge point sampling according to the present invention;
FIG. 7 is a schematic diagram of the homography transformation of the QR code of the present invention;
FIG. 8 is a schematic diagram illustrating positioning and correction of a QR code according to the present invention.
Detailed Description
The technical solution of the present invention is described in detail below with reference to the accompanying drawings, but the scope of the present invention is not limited to the embodiments.
As shown in fig. 1, the method for quickly locating a damaged QR code according to image redundant information disclosed by the present invention includes the following steps:
step 1, carrying out gray processing on a picture containing a QR code to obtain a gray picture;
step 2, evaluating the definition of the gray level picture through random sampling;
step 3, filtering and edge detection are carried out on the gray level picture through a self-adaptive threshold value;
step 4, determining the width of the stripe through the detected edge, and determining three positioning patterns of the QR code according to the width proportion of the stripe;
step 5, determining a QR code area through the three positioning patterns, combining the module width s of the positioning patterns to perform edge detection on the QR code region again, and binarizing the edges;
and step 6, calculating an affine transformation matrix through the three positioning patterns and carrying out affine transformation on the gray level image, as shown in fig. 4; searching out the four edge lines of the QR code according to the binarization result; the edge lines intersect at the four corner points of the QR code, from which a homography transformation matrix is obtained; finally, positioning correction is carried out through the homography transformation.
By fully using the redundant information in the image, the number of image traversals is reduced as much as possible and the algorithm speed is improved; adaptive-threshold edge detection and adaptive-threshold binarization improve the edge detection and binarization effects; and when an image is received, a simple image analysis is performed first and unclear images are filtered out, so that excessive waste of computing resources is avoided.
Further, in step 2, the specific steps of evaluating the definition of the gray-scale picture by random sampling are as follows:
step 2.1, using the ISAAC pseudo-random number generation algorithm, generating a sequence of random numbers satisfying a normal distribution, where W and H are respectively the width and height, in pixels, of the gray level image; when an image is received it is analyzed through random sampling, and the sampling follows a normal distribution so that more sampling points fall in the middle region: when most code scanning equipment collects QR code data, the probability of the QR code lying at the edge of the image is usually small, so the gray picture is sampled with normally distributed random numbers, giving more sampling points in the middle of the picture and fewer at its edges;
step 2.2, determining the coordinates (x, y) of one pixel point from every two adjacent random numbers, so that half as many pixel points as random numbers can be determined, and taking the coordinates of these pixel points as the coordinates of the sampling points;
step 2.3, sampling the gray level picture according to the coordinates of each sampling point, and taking the average of the pixel values of each sampling point and its nine adjacent points as the pixel value of the sampling point, which effectively avoids the influence of abnormal points;
step 2.4, counting the pixel values of all sampling points, and solving the mean value, the variance and the histogram distribution of the pixel values of all the sampling points;
step 2.5, comparing the variance with a definition threshold: if the variance is greater than or equal to the definition threshold, the definition meets the requirement and step 3 is entered directly; if the variance is less than the definition threshold, the definition is low, the histogram distribution result is returned, and step 2.6 is entered; performing definition analysis on part of the sampled pixel points means that, when definition is insufficient, feedback can be given to the code scanning equipment to collect the next frame, so that a clearer image is collected for analysis and the recognition rate is improved;
step 2.6, calculating a foreground pixel value and a background pixel value according to the histogram distribution, adjusting the brightness of a flash lamp of the code scanning device according to the foreground pixel value and the background pixel value, starting the code scanning device to re-acquire a picture containing the QR code, and returning to the step 1; if the code scanning device cannot be started, directly entering step 3.
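As a rough illustration of steps 2.1-2.5, the sketch below samples a grayscale image (a list of pixel rows) at centre-weighted, normally distributed coordinates and scores sharpness by the sample variance. The sample count, the standard deviations W/6 and H/6, the definition threshold of 300 and the use of Python's standard-library Gaussian generator in place of ISAAC are all illustrative assumptions, and the nine-neighbour averaging of step 2.3 is omitted for brevity:

```python
import random

def evaluate_sharpness(gray, w, h, n_samples=500, clarity_threshold=300.0, seed=7):
    """Sample the image with normally distributed coordinates (more points
    near the centre, as in step 2.1) and use the variance of the sampled
    pixel values as a definition score (steps 2.4-2.5)."""
    rng = random.Random(seed)
    values = []
    for _ in range(n_samples):
        # Clamp normally distributed coordinates to the image bounds.
        x = min(w - 1, max(0, int(rng.gauss(w / 2, w / 6))))
        y = min(h - 1, max(0, int(rng.gauss(h / 2, h / 6))))
        values.append(gray[y][x])
    mean = sum(values) / len(values)
    variance = sum((v - mean) ** 2 for v in values) / len(values)
    return variance >= clarity_threshold, variance
```

A flat image yields zero variance and fails the check; a high-contrast image passes it.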
Further, in step 2.6, the specific steps of calculating the foreground pixel value and the background pixel value according to the histogram distribution are as follows:
step 2.6.1, calculating the count gradient of the current pixel value:

G(g) = N(g+1) − N(g−1)

where G(g) denotes the count gradient at the pixel value g, N(g−1) denotes the number of sampling points with the pixel value g−1, and N(g+1) denotes the number of sampling points with the pixel value g+1;
step 2.6.2, carrying out extreme point judgment: if G(g) and G(g+1) are of opposite sign, or G(g+1) is 0, the pixel value g is judged to be a suspected extreme point of the histogram distribution, and the total number of sampling points of the pixel value g and its two adjacent pixel values is counted:

S(g) = N(g−1) + N(g) + N(g+1)

where S(g) is the total number of sampling points around the pixel value g and N(g) is the number of sampling points of the current pixel value g;
step 2.6.3, comparing the totals S(g) of all suspected extreme points: the pixel value corresponding to the suspected extreme point with the largest S(g) is taken as the background pixel value, and the pixel value corresponding to the second-largest S(g) is taken as the foreground pixel value.
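Steps 2.6.1-2.6.3 can be sketched as follows; since the formula images did not survive extraction, the count gradient from adjacent histogram bins and the maxima-only peak rule used here are assumptions, and at least two peaks are assumed to exist:

```python
def foreground_background(hist):
    """From a 256-bin histogram, find suspected extreme points via sign
    changes of the count gradient, then take the pixel value of the
    most-populated peak as background and the second-most as foreground."""
    peaks = []
    for g in range(1, 255):
        grad_prev = hist[g] - hist[g - 1]
        grad_next = hist[g + 1] - hist[g]
        # A rise followed by a fall (or zero gradient) marks a suspected peak.
        if grad_prev > 0 and grad_next <= 0:
            total = hist[g - 1] + hist[g] + hist[g + 1]  # step 2.6.2 total
            peaks.append((total, g))
    peaks.sort(reverse=True)
    background = peaks[0][1]   # largest surrounding total -> background
    foreground = peaks[1][1]   # second largest -> foreground
    return foreground, background
```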
Further, in step 2.6, when the flash brightness of the code scanning device is adjusted according to the foreground pixel value and the background pixel value, if both the foreground pixel value and the background pixel value are less than 127, the flash brightness of the code scanning device is adjusted to be 10% higher than the previous brightness, and if both the foreground pixel value and the background pixel value are greater than or equal to 127, the flash brightness of the code scanning device is adjusted to be 10% lower than the previous brightness.
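The brightness rule of the preceding paragraph is simple enough to state directly; the multiplicative 10% step follows the text, while the function name and signature are illustrative:

```python
def adjust_flash(brightness, foreground, background):
    """Raise the flash 10% when both pixel values are dark (< 127),
    lower it 10% when both are bright (>= 127), else leave it alone."""
    if foreground < 127 and background < 127:
        return brightness * 1.10
    if foreground >= 127 and background >= 127:
        return brightness * 0.90
    return brightness
```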
Further, in step 3, the specific steps of filtering the grayscale image and detecting the edge by the adaptive threshold are as follows:
step 3.1, the gray picture is scanned in a reciprocating manner row by row or column by column; the histogram distribution of the scan row or scan column is obtained from the pixel values of all pixel points in the current scan row or column, and the foreground pixel value f and the background pixel value b of the current scan row or column are calculated from the histogram distribution, thereby constructing an initial edge threshold:

T_cur = MAX(ABS(f − b), T_min)

where ABS is the absolute-value function, T_min is the minimum edge threshold, and T_cur is the current edge threshold;
step 3.2, filtering each pixel point in the scan row or scan column one by one with an exponential moving average, which satisfies the formula:

y_cur = α·g + (1 − α)·y_prev

where y_cur is the filtered value of the current pixel point, y_prev is the filtered value of the previous pixel point, g is the pixel value of the current pixel point, and α is a set fixed value in the range 0 to 1; the exponential moving average filter has the advantages of a simple algorithm and a high filtering speed;
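Assuming the standard exponential-moving-average update y_cur = α·g + (1 − α)·y_prev (the patent's formula image did not survive extraction), step 3.2 can be sketched as:

```python
def ema_filter(pixels, alpha=0.3):
    """Exponential moving average along one scan row or column.
    alpha is the fixed value in (0, 1) mentioned in step 3.2."""
    filtered = []
    y_prev = pixels[0]          # seed the filter with the first pixel
    for g in pixels:
        y_prev = alpha * g + (1 - alpha) * y_prev
        filtered.append(y_prev)
    return filtered
```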
step 3.3, judging whether the current pixel point is an edge through the Laplace operator; if so, weak edges are screened out according to the threshold rule, and edges that pass the screening have their coordinates stored as edge values e_n into the edge set E_cur of the current scan row or column, with the current edge threshold T_cur updated; because the Laplace operator is easily influenced by abnormal values, weak edges can be filtered through the threshold, and to avoid filtering out effective edges the threshold can be adjusted through the histogram distribution of the surrounding image, which effectively reduces abnormal edge judgments on the positioning pattern and greatly improves the recognition rate of the positioning pattern stripes;
step 3.4, narrowing the current edge threshold T_cur and returning to step 3.2 to scan the next pixel point until all pixel points of the scan row or scan column have been scanned, thereby obtaining the edge set of the current scan row or scan column:

E_cur = {e_0, e_1, …, e_n}

where E_cur is the edge set of the current scan row or scan column and e_n is the n-th edge value in the c-th scan row or scan column;
and 3.5, repeating the steps 3.1-3.4 until the edge set of each row or each column of the whole gray level image is obtained.
A large amount of redundant information generally exists in an image; for example, adjacent rows of pixel points are similar, so when one row is abnormal it can be filtered with the help of the next row, avoiding excessive iteration and judgment on the same row and effectively improving the algorithm speed.
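Steps 3.1-3.4 can be condensed into a per-row sketch. The one-dimensional second difference stands in for the Laplace operator, and the initial-threshold expression (contrast divided by 4, floored at a minimum) is a placeholder for the patent's adaptive rules, not its exact formula:

```python
def scan_row_edges(row, t_min=8):
    """Flag edge candidates in one scan row with a 1-D Laplacian and
    screen out candidates weaker than the row's edge threshold."""
    f = max(row)                            # stand-in foreground pixel value
    b = min(row)                            # stand-in background pixel value
    t_cur = max(abs(f - b) // 4, t_min)     # assumed initial edge threshold
    edges = []
    for i in range(1, len(row) - 1):
        laplacian = row[i - 1] - 2 * row[i] + row[i + 1]
        if abs(laplacian) >= t_cur:
            edges.append(i)                 # store the edge coordinate
    return edges
```

On a step profile the two pixels bracketing the transition are reported as edges.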
Further, in step 4, the specific steps of determining three positioning patterns of the QR code according to the width ratio of the stripes are as follows:
step 4.1, unifying the edge values obtained by the reciprocating scanning into edge values in the same direction: c is the number of the current row or column; if c is odd an edge value e is kept unchanged, and if c is even it is replaced by L − e, where L is the length of the current row or column; if the scanning is horizontal, L equals the grayscale image width W, and if the scanning is vertical, L equals the grayscale image height H;
step 4.2, each pair of adjacent edges forms a stripe, and the width of each stripe is calculated:

w_i = e_{i+1} − e_i

where w_i is the width of the i-th stripe;
step 4.3, searching for five adjacent stripes whose widths conform to the 1:1:3:1:1 stripe proportion of the positioning pattern and recording them in the matching stripe set P_cur of the current row or column; each matching stripe p comprises six edge values and is a subset of the edge set E_cur of the current scan row or column;
step 4.4, filtering the edge set E_cur of the current row or column through each matching stripe p_last of the previous row's or column's matching stripe set P_last, and equally filtering the edge set of the next row or column through each matching stripe p_cur of the current row's or column's matching stripe set P_cur, searching out matching stripes which were omitted due to image noise and recording them in the matching stripe set of the corresponding row or column, wherein the stripes of the QR code transverse positioning pattern are shown in FIG. 2;
step 4.5, clustering the searched stripes: when the row spacing and the column spacing of adjacent edges are both less than or equal to a threshold t, the stripes are considered the same stripe, thereby forming wider stripes;
step 4.6, correspondingly crossing each horizontal stripe and each longitudinal stripe obtained after clustering, finally determining a positioning pattern, and filtering out the non-crossed stripes;
step 4.7, calculating the center point o of the positioning pattern, the edge point set V = {v0, v1, v2, …, vn} and the module width s, where d(o, vi) denotes the transverse distance between an edge point vi of a transverse stripe and the center point o, or the longitudinal distance between an edge point vi of a longitudinal stripe and the center point o; if the center point o has the coordinates (xo, yo) and the edge point vi has the coordinates (xi, yi), the transverse distance is ABS(xo − xi) and the longitudinal distance is ABS(yo − yi).
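Steps 4.2-4.3 can be sketched as below. The 1:1:3:1:1 proportion is the standard QR finder-pattern module ratio (the patent's exact ratio formula did not survive extraction), and the tolerance parameter is an assumption:

```python
def finder_ratio_match(edges, tolerance=0.5):
    """Adjacent edge values bound stripes (step 4.2); five consecutive
    stripes whose widths fit the 1:1:3:1:1 finder-pattern ratio are
    recorded as a matching stripe of six edge values (step 4.3)."""
    widths = [edges[i + 1] - edges[i] for i in range(len(edges) - 1)]
    matches = []
    for i in range(len(widths) - 4):
        w = widths[i:i + 5]
        module = sum(w) / 7.0      # the five stripes span 7 modules in total
        expected = [1, 1, 3, 1, 1]
        if all(abs(wk - e * module) <= tolerance * module
               for wk, e in zip(w, expected)):
            matches.append(edges[i:i + 6])
    return matches
```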
Further, in step 4.4, the specific steps of filtering are as follows:
step 4.4.1, searching the edge set of the next row or column with each matching stripe p_cur of P_cur, the traversal indices of the matching stripe and of the searched edge set both being initialized to their first elements;
step 4.4.2, threshold judgment: if the current edge value of the matching stripe and the current edge value of the searched edge set differ by no more than the threshold t, the two edge values have a corresponding relationship and the corresponding edge value is recorded;
step 4.4.3, if the threshold judgment of step 4.4.2 is met, both traversal indices are advanced; otherwise only the index of the searched edge set is advanced; the process returns to step 4.4.2 until the matching stripe or the searched edge set has been fully traversed;
step 4.4.4, if every edge value of the matching stripe finds a corresponding edge value in the searched edge set, a matching stripe missed due to image noise has been found in that edge set, and it is recorded in the matching stripe set of the corresponding row or column; the process returns to step 4.4.1 and searches with the next matching stripe until P_cur has been fully traversed.
When the QR code positioning pattern is contaminated or affected by noise, comparing the edges of two adjacent rows or columns effectively avoids matching stripes being filtered out due to contamination, improving the recognition of the positioning pattern.
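A sketch of the cross-row filtering of steps 4.4.1-4.4.4, under the reading that each of a stripe's six edge values must find an edge within the threshold t in the adjacent row's edge set; the index bookkeeping of steps 4.4.2-4.4.3 is collapsed into a nearest-candidate search, which is an assumption:

```python
def recover_missed_stripe(stripe, edge_set, t=2):
    """For each edge value of a known matching stripe, look for a
    corresponding edge within threshold t in the edge set being filtered.
    If all six correspond, return the recovered stripe; else None."""
    recovered = []
    for e in stripe:
        candidates = [x for x in edge_set if abs(x - e) <= t]
        if not candidates:
            return None     # some edge value has no correspondence
        recovered.append(min(candidates, key=lambda x: abs(x - e)))
    return recovered
```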
Further, the specific steps of step 5 are:
step 5.1, pairing the positioning patterns according to the pairing rule: the difference in module width s is smaller than the threshold t, the colors are the same, and the patterns are not collinear;
step 5.2, determining a QR code area according to the three matched positioning patterns;
step 5.3, carrying out edge detection on the QR code area to identify a black and white space, thereby forming preliminary binarization processing;
step 5.4, calculating a binarization threshold value of each pixel point in a QR code region in a neighborhood by using an averaging method, wherein the neighborhood size is a square region with the side length being the module width s, and then carrying out binarization processing on the QR code region according to the binarization threshold value;
step 5.5, comparing the binarization in the step 5.3 with the binarization in the step 5.4, if the pixel values are all binarized into 0, updating to 0, and if the pixel values are all binarized into 255, updating to 255, otherwise, the following three conditions exist:
in case 1, if the pixel point on the left side of the current pixel point and the pixel point on the upper side are both binarized into 0, the current pixel point is binarized into 0;
in case 2, if the pixel point on the left side of the current pixel point and the pixel point on the upper side are both binarized to be 255, the current pixel point is binarized to be 255;
in case 3, if the binarization results of the left pixel point and the upper pixel point of the current pixel point are different, the average value of the two pixel points is used as the pixel threshold, if the pixel value of the current pixel point is greater than the pixel threshold, the binarization is 255, otherwise the binarization is 0.
For binarization, the QR code region is first locked through the found positioning patterns, which avoids a global threshold influencing the local image; then the edge of each module is found through edge detection and the position of each black or white module is preliminarily located; finally, the module size is used as the pixel neighborhood size of the adaptive threshold, and the threshold is adaptively calculated by the averaging method to realize binarization, which effectively improves the binarization effect.
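The merging rules of step 5.5, including the three disagreement cases, can be sketched as below; the zero fallback for left/upper neighbours on the image border is an assumption, since the patent does not say how the first row and column are seeded:

```python
def combine_binarizations(edge_bin, thresh_bin, gray):
    """Merge the edge-based binarization (step 5.3) with the
    adaptive-threshold binarization (step 5.4) per step 5.5."""
    h, w = len(gray), len(gray[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            a, b = edge_bin[y][x], thresh_bin[y][x]
            if a == b:
                out[y][x] = a                  # both 0 or both 255: keep
                continue
            left = out[y][x - 1] if x > 0 else 0   # border fallback: assumption
            up = out[y - 1][x] if y > 0 else 0
            if left == up:                     # cases 1 and 2
                out[y][x] = left
            else:                              # case 3: mean of the two
                pixel_threshold = (gray[y][x - 1] + gray[y - 1][x]) / 2
                out[y][x] = 255 if gray[y][x] > pixel_threshold else 0
    return out
```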
Further, the specific steps of step 6 are:
step 6.1, setting the coordinates of the center points of the three positioning patterns of the standard QR code as (x0′, y0′), (x1′, y1′) and (x2′, y2′), and the coordinates of the center points of the three positioning patterns of the current QR code as (x0, y0), (x1, y1) and (x2, y2), the three positioning patterns of the current QR code satisfy the following affine transformation:

x′ = a11·x + a12·y + a13
y′ = a21·x + a22·y + a23

where the aij are the affine transformation coefficients and (x′, y′) is the coordinate (x, y) projected into the affine domain by the affine transformation; substituting the three pairs of center-point coordinates into the above formula yields the affine transformation coefficients;
step 6.2, mapping each edge point coordinate of the edge point set V into an affine domain through the affine transformation, determining the position of the edge point in the standard QR code through the coordinate comparison with the standard QR code, finding the edge point belonging to the edge line on the standard QR code, and performing least square straight line fitting to form an upper edge line, as shown in FIG. 5;
step 6.3, obtaining a lower edge line, a left edge line and a right edge line according to the same principle of the step 6.2;
step 6.4, sampling edge black points of the QR code in the binarized image of step 5 along the upper edge line, the lower edge line, the left edge line and the right edge line, as shown in FIG. 6, and performing least-squares straight-line fitting again according to the newly added sampling points, so that the straight-line fitting is more accurate;
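The least-squares fitting used in steps 6.2-6.4 reduces, for a near-horizontal edge line, to the classic closed form below; vertical lines would be fitted as x = k·y + c instead, a detail this sketch omits:

```python
def fit_line(points):
    """Least-squares fit of y = k*x + c through a set of (x, y) edge
    points, as used to build the edge lines of the QR code."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    k = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope
    c = (sy - k * sx) / n                           # intercept
    return k, c
```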
step 6.5, the upper edge line, the lower edge line, the left edge line and the right edge line intersect at the four corner points Pi (i = 0, 1, 2, 3) of the QR code; setting the correspondence of Pi in the homography transform domain as Pi′, the points satisfy the following homography transformation:

[x′, y′, 1]T ∝ H·[x, y, 1]T

where the entries hij of H are the homography transformation coefficients and (x′, y′) is the coordinate (x, y) projected into the homography domain by the homography transformation; the homography transformation matrix can be obtained from the four points and their correspondences, as shown in fig. 7, the positioning correction of the QR code is completed, and the correction result is shown in fig. 8.
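Step 6.5's homography can be solved from the four corner correspondences with the standard direct linear system, fixing h33 = 1; the plain Gaussian elimination here is an illustrative sketch, not the patent's solver:

```python
def homography_from_points(src, dst):
    """Solve the 3x3 homography H mapping src[i] -> dst[i] for four
    point pairs via the standard 8x8 direct linear system (h33 = 1)."""
    a, rhs = [], []
    for (x, y), (u, v) in zip(src, dst):
        a.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        rhs.append(u)
        a.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        rhs.append(v)
    n = 8
    for col in range(n):                  # forward elimination with pivoting
        pivot = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        rhs[col], rhs[pivot] = rhs[pivot], rhs[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= f * a[col][c]
            rhs[r] -= f * rhs[col]
    h = [0.0] * n
    for r in range(n - 1, -1, -1):        # back substitution
        h[r] = (rhs[r] - sum(a[r][c] * h[c] for c in range(r + 1, n))) / a[r][r]
    return [h[0:3], h[3:6], [h[6], h[7], 1.0]]
```

Mapping a point through the returned matrix uses u = (h11·x + h12·y + h13) / (h31·x + h32·y + h33), and similarly for v.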
As noted above, while the present invention has been shown and described with reference to certain preferred embodiments, it is not to be construed as limited thereto. Various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (4)

1. A method for rapidly positioning a damaged QR code according to image redundant information is characterized by comprising the following steps:
step 1, carrying out gray processing on a picture containing a QR code to obtain a gray picture;
step 2, evaluating the definition of the gray level picture through random sampling;
step 3, filtering and edge detection are carried out on the gray level picture through a self-adaptive threshold value;
step 4, determining the width of the stripe through the detected edge, and determining three positioning patterns of the QR code according to the width proportion of the stripe;
step 5, determining a QR code area through the three positioning patterns, combining the module width s of the positioning patterns to perform edge detection on the QR code region again, and binarizing the edges;
step 6, calculating an affine transformation matrix through the three positioning patterns, carrying out affine transformation on the gray level image, searching out four edge lines of the QR code according to the binarization result, the edge lines intersecting at the four corner points of the QR code, obtaining a homography transformation matrix, and finally carrying out positioning correction through the homography transformation;
in step 2, the specific steps of evaluating the definition of the gray level picture through random sampling are as follows:
step 2.1, using the ISAAC pseudo-random number generation algorithm, generating a sequence of random numbers satisfying a normal distribution, wherein W and H are respectively the width pixels and height pixels of the gray scale image;
step 2.2, determining the coordinates (x, y) of one pixel point from every two adjacent random numbers, so that half as many pixel points as random numbers can be determined, and taking the coordinates of these pixel points as the coordinates of the sampling points;
step 2.3, sampling is carried out on the gray picture according to the coordinates of each sampling point, and the average value of the pixel values of each sampling point and nine adjacent sampling points is taken as the pixel value of the sampling point;
step 2.4, counting the pixel values of all sampling points, and solving the mean value, the variance and the histogram distribution of the pixel values of all the sampling points;
step 2.5, comparing the variance with a definition threshold, if the variance is greater than or equal to the definition threshold, indicating that the definition meets the requirement, directly entering step 3, if the variance is less than the definition threshold, indicating that the definition is low, returning a histogram distribution result, and entering step 2.6;
step 2.6, calculating a foreground pixel value and a background pixel value according to the histogram distribution, adjusting the brightness of a flash lamp of the code scanning device according to the foreground pixel value and the background pixel value, starting the code scanning device to re-acquire a picture containing the QR code, and returning to the step 1; if the code scanning equipment cannot be started, directly entering the step 3;
in step 3, the specific steps of filtering the gray level picture and detecting the edge through the adaptive threshold value are as follows:
step 3.1, scanning the gray picture in a reciprocating manner row by row or column by column, obtaining the histogram distribution of the scan row or scan column from the pixel values of all pixel points in the current scan row or column, and calculating from the histogram distribution the foreground pixel value f and the background pixel value b of the current scan row or column, thereby constructing an initial edge threshold:

T_cur = MAX(ABS(f − b), T_min)

where ABS is the absolute-value function, T_min is the minimum edge threshold, and T_cur is the current edge threshold;
step 3.2, filtering each pixel point in the scan row or scan column one by one with an exponential moving average, which satisfies the formula:

y_cur = α·g + (1 − α)·y_prev

where y_cur is the filtered value of the current pixel point, y_prev is the filtered value of the previous pixel point, g is the pixel value of the current pixel point, and α is a set fixed value in the range 0 to 1;
step 3.3, judging whether the current pixel point is an edge through the Laplace operator; if so, screening out weak edges according to the threshold rule, edges that pass the screening having their coordinates stored as edge values e_n into the edge set E_cur of the current scan row or column, and updating the current edge threshold T_cur;
step 3.4, narrowing the current edge threshold T_cur and returning to step 3.2 to scan the next pixel point until all pixel points of the scan row or scan column have been scanned, thereby obtaining the edge set of the current scan row or scan column:

E_cur = {e_0, e_1, …, e_n}

where E_cur is the edge set of the current scan row or scan column and e_n is the n-th edge value in the c-th scan row or scan column;
step 3.5, repeating the steps 3.1-3.4 until the edge set of each row or each column of the whole gray level image is obtained;
in step 4, the specific steps of determining three positioning patterns of the QR code according to the width ratio of the stripes are as follows:
step 4.1, unifying the edge values obtained by the reciprocating scanning into edge values in the same direction: c is the number of the current row or column; if c is odd an edge value e is kept unchanged, and if c is even it is replaced by L − e, where L is the length of the current row or column; if the scanning is horizontal, L equals the grayscale image width W, and if the scanning is vertical, L equals the grayscale image height H;
step 4.2, each pair of adjacent edges forming a stripe, and calculating the width of each stripe:

w_i = e_{i+1} − e_i

where w_i is the width of the i-th stripe;
step 4.3, searching for five adjacent stripes whose widths conform to the 1:1:3:1:1 stripe proportion of the positioning pattern and recording them in the matching stripe set P_cur of the current row or column; each matching stripe p comprises six edge values and is a subset of the edge set E_cur of the current scan row or column;
step 4.4, filtering the edge set E_cur of the current row or column through each matching stripe p_last of the previous row's or column's matching stripe set P_last, and equally filtering the edge set of the next row or column through each matching stripe p_cur of the current row's or column's matching stripe set P_cur, searching out matching stripes which were missed due to image noise and recording them in the matching stripe set of the corresponding row or column;
step 4.5, clustering the searched stripes: when the row spacing and the column spacing of adjacent edges are both less than or equal to a threshold t, the stripes are considered the same stripe, thereby forming wider stripes;
step 4.6, correspondingly crossing each horizontal stripe and each longitudinal stripe obtained after clustering, finally determining a positioning pattern, and filtering out the non-crossed stripes;
step 4.7, calculating the center point o of the positioning pattern, the edge point set V = {v0, v1, v2, …, vn} and the module width s, where d(o, vi) denotes the transverse distance between an edge point vi of a transverse stripe and the center point o, or the longitudinal distance between an edge point vi of a longitudinal stripe and the center point o; if the center point o has the coordinates (xo, yo) and the edge point vi has the coordinates (xi, yi), the transverse distance is ABS(xo − xi) and the longitudinal distance is ABS(yo − yi);
The specific steps of the step 5 are as follows:
step 5.1, pairing the positioning patterns according to the pairing rule: the difference in module width s is smaller than the threshold t, the colors are the same, and the patterns are not collinear;
step 5.2, determining a QR code area according to the three matched positioning patterns;
step 5.3, carrying out edge detection on the QR code area to identify a black and white space, thereby forming preliminary binarization processing;
step 5.4, calculating a binarization threshold value of each pixel point in a QR code region in a neighborhood by using an averaging method, wherein the neighborhood size is a square region with the side length being the module width s, and then carrying out binarization processing on the QR code region according to the binarization threshold value;
step 5.5, comparing the binarization in the step 5.3 with the binarization in the step 5.4, if the pixel values are all binarized into 0, updating to 0, and if the pixel values are all binarized into 255, updating to 255, otherwise, the following three conditions exist:
in case 1, if the pixel point on the left side of the current pixel point and the pixel point on the upper side are both binarized into 0, the current pixel point is binarized into 0;
in case 2, if the pixel point on the left side of the current pixel point and the pixel point on the upper side are both binarized to be 255, the current pixel point is binarized to be 255;
in case 3, if binarization results of a pixel point on the left side of the current pixel point and a pixel point on the upper side of the current pixel point are different, taking an average value of the two pixel points as a pixel threshold, if a pixel value of the current pixel point is greater than the pixel threshold, binarizing to 255, otherwise, binarizing to 0;
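The fusion of the two binarizations in steps 5.3 to 5.5 can be sketched as follows. This is a minimal illustration, not the patent's implementation: `gray` stands for the grayscale QR code region, `edge_bin` and `mean_bin` for the edge-based (step 5.3) and local-mean (step 5.4) binarizations, all as NumPy arrays of the same shape; pixels on the first row or column that disagree are assumed here to fall back to the local-mean result.

```python
import numpy as np

def fuse_binarizations(gray, edge_bin, mean_bin):
    """Fuse two 0/255 binarizations; disagreements are resolved from the
    left and upper neighbours (cases 1-3 of step 5.5)."""
    h, w = gray.shape
    out = mean_bin.copy()               # border fallback (assumption)
    agree = edge_bin == mean_bin
    out[agree] = edge_bin[agree]        # both agree: keep the common value
    for y in range(1, h):
        for x in range(1, w):
            if agree[y, x]:
                continue
            left, up = out[y, x - 1], out[y - 1, x]
            if left == 0 and up == 0:        # case 1
                out[y, x] = 0
            elif left == 255 and up == 255:  # case 2
                out[y, x] = 255
            else:                            # case 3: average of the two
                thr = (int(gray[y, x - 1]) + int(gray[y - 1, x])) // 2
                out[y, x] = 255 if gray[y, x] > thr else 0
    return out
```

Cases 1 and 2 propagate an unambiguous neighbourhood, while case 3 re-thresholds the pixel against the average gray value of its left and upper neighbours.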
The specific steps of step 6 are as follows:
step 6.1, setting the coordinates of the center points of the three positioning patterns of the standard QR code as (x1', y1'), (x2', y2') and (x3', y3'), and the coordinates of the center points of the three positioning patterns of the current QR code as (x1, y1), (x2, y2) and (x3, y3), the three positioning patterns of the current QR code satisfy the following affine transformation:
x' = a·x + b·y + e
y' = c·x + d·y + f
in the formula, a, b, c, d, e and f are the affine transformation coefficients, and (x', y') are the coordinates of the point (x, y) projected into the affine domain by the affine transformation; substituting the three pairs of center-point coordinates into the above formula yields the affine transformation coefficients;
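Substituting the three center-point pairs gives six linear equations in the six affine coefficients. A minimal sketch of that solve, assuming NumPy; the function names and matrix layout are illustrative, not the patent's notation:

```python
import numpy as np

def solve_affine(src, dst):
    """Solve x' = a*x + b*y + e, y' = c*x + d*y + f from three
    point correspondences src[i] -> dst[i]."""
    A, rhs = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0]); rhs.append(xp)
        A.append([0, 0, 0, x, y, 1]); rhs.append(yp)
    a, b, e, c, d, f = np.linalg.solve(np.array(A, float), np.array(rhs, float))
    return np.array([[a, b, e], [c, d, f]])

def apply_affine(M, p):
    # project a point (x, y) into the affine domain
    return M @ np.array([p[0], p[1], 1.0])
```

With three non-collinear positioning-pattern centers the 6x6 system is non-singular, so the coefficients are determined uniquely.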
step 6.2, mapping the coordinates of each edge point of the edge point set V into the affine domain through the affine transformation, determining the position of each edge point in the standard QR code by comparing coordinates with the standard QR code, finding the edge points belonging to the upper edge line of the standard QR code, and performing least-squares straight-line fitting to form the upper edge line;
step 6.3, obtaining the lower edge line, the left edge line and the right edge line in the same way as step 6.2;
step 6.4, sampling black edge points of the QR code in the binarized image of step 5 along the upper edge line, the lower edge line, the left edge line and the right edge line, and performing least-squares straight-line fitting again with the newly added sampling points, so that the fitted straight lines are more accurate;
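The least-squares fitting used in steps 6.2 to 6.4 is the standard closed-form line fit. A sketch for a near-horizontal edge line follows; for the near-vertical left and right edge lines the roles of x and y would be swapped. This is an illustration, not the patent's code.

```python
def fit_line(points):
    """Least-squares fit of y = k*x + b to edge points (x_i, y_i)."""
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    denom = n * sxx - sx * sx      # zero only for perfectly vertical data
    k = (n * sxy - sx * sy) / denom
    b = (sy - k * sx) / n
    return k, b
```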
step 6.5, the upper edge line, the lower edge line, the left edge line and the right edge line intersect at the four corner points (xi, yi), i = 1, 2, 3, 4, of the QR code; let the correspondence of each corner point (xi, yi) in the homography transform domain be (xi', yi'), satisfying the following homography transformation:
x' = (h11·x + h12·y + h13) / (h31·x + h32·y + h33)
y' = (h21·x + h22·y + h23) / (h31·x + h32·y + h33)
in the formula, h11, h12, …, h33 are the homography transformation coefficients, and (x', y') are the coordinates of the point (x, y) projected into the homography domain by the homography transformation; the homography transformation matrix can be obtained from the four corner points and their correspondences, and the positioning correction of the QR code is completed.
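With one coefficient normalized (here h33 = 1), the four corner correspondences give eight linear equations in the remaining eight homography coefficients. A minimal sketch of this standard solve, assuming NumPy; names and normalization are illustrative, not the patent's notation:

```python
import numpy as np

def solve_homography(src, dst):
    """Solve the 8 homography coefficients (h33 fixed to 1) from four
    point correspondences src[i] -> dst[i]."""
    A, rhs = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -x * xp, -y * xp]); rhs.append(xp)
        A.append([0, 0, 0, x, y, 1, -x * yp, -y * yp]); rhs.append(yp)
    h = np.linalg.solve(np.array(A, float), np.array(rhs, float))
    return np.append(h, 1.0).reshape(3, 3)

def project(H, p):
    # project a point through the homography (divide by the third row)
    v = H @ np.array([p[0], p[1], 1.0])
    return v[:2] / v[2]
```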
2. The method for rapidly positioning the damaged QR code according to the redundant information of the image as claimed in claim 1, wherein the step 2.6 of calculating the foreground pixel value and the background pixel value according to the histogram comprises the following specific steps:
step 2.6.1, calculating the number gradient of the current pixel value:
g(i) = h(i+1) - h(i)
in the formula, g(i) represents the number gradient at the pixel value i, h(i+1) represents the number of sampling points with pixel value i+1, and h(i) represents the number of sampling points with pixel value i, so that the gradient describes the relationship between the numbers of sampling points h(i+1) and h(i);
step 2.6.2, carrying out extreme point judgment: if g(i-1) and g(i) have opposite signs, or g(i) is 0, the pixel value i is determined to be a suspected extreme point of the histogram distribution, and the total number of sampling points of the pixel value i and its two adjacent pixel values is counted:
N(i) = h(i-1) + h(i) + h(i+1)
in the formula, N(i) is the total number of sampling points around the pixel value i, and h(i) is the number of sampling points of the current pixel value i;
step 2.6.3, comparing the totals N(i) of all suspected extreme points: the pixel value corresponding to the suspected extreme point with the largest N(i) is taken as the background pixel value, and the pixel value corresponding to the suspected extreme point with the second largest N(i) is taken as the foreground pixel value.
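The extreme-point scan of steps 2.6.1 to 2.6.3 can be sketched as follows, assuming `hist` is the 256-bin grayscale histogram (h(i) in the claim); the forward-difference gradient is one reading of the formula images, and the names are illustrative.

```python
def foreground_background(hist):
    """Pick the background / foreground pixel values as the pixel values
    of the largest and second-largest histogram extreme points, ranked
    by N(i) = h(i-1) + h(i) + h(i+1)."""
    grad = [hist[i + 1] - hist[i] for i in range(len(hist) - 1)]  # g(i)
    peaks = []
    for i in range(1, len(hist) - 1):
        g_prev, g = grad[i - 1], grad[i]
        if g == 0 or g_prev * g < 0:                 # suspected extreme point
            n = hist[i - 1] + hist[i] + hist[i + 1]  # N(i)
            peaks.append((n, i))
    peaks.sort(reverse=True)
    background = peaks[0][1]    # largest N(i)
    foreground = peaks[1][1]    # second largest N(i)
    return foreground, background
```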
3. The method according to claim 1, wherein in step 2.6, when adjusting the flash brightness of the code scanning device according to the foreground pixel value and the background pixel value, if the foreground pixel value and the background pixel value are both less than 127, the flash brightness of the code scanning device is adjusted to be 10% higher than the previous brightness, and if the foreground pixel value and the background pixel value are both greater than or equal to 127, the flash brightness of the code scanning device is adjusted to be 10% lower than the previous brightness.
4. The method for rapidly locating a damaged QR code according to redundant image information of claim 1, wherein the filtering in step 4.4 comprises the following specific steps:
step 4.4.1, for each matching stripe m in P_cur, searching the edge set E = {e0, e1, e2, …} of the previous row or column; letting the search index k = 0 and letting u be the first edge value of the stripe m;
step 4.4.2, threshold judgment: if |e_k - u| ≤ t is satisfied, where t is the matching threshold, then e_k and u have a corresponding relationship, and e_k is recorded;
step 4.4.3, if the threshold judgment of step 4.4.2 is satisfied, updating u to the next edge value of the stripe m; otherwise, updating k = k + 1; returning to step 4.4.2;
step 4.4.4, if every edge value u of the stripe m finds a corresponding edge value e_k in the edge set E, it means that a matching stripe missed due to image noise has been found from the edge set of the previous row or column, and it is recorded into the matching stripe set P_pre of the previous row or column; returning to step 4.4.1 and searching with the next matching stripe until all matching stripes in P_cur have been traversed.
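The recovery search of step 4.4 can be sketched as follows, assuming the edge values of a stripe and of the previous row/column are stored in ascending order; `t` stands for the matching threshold and all names are illustrative, not the patent's notation.

```python
def find_missed_stripe(stripe_edges, prev_edges, t):
    """Try to match every edge value of a stripe from P_cur against the
    edge set of the previous row/column (steps 4.4.1-4.4.4).
    Returns the matched edges, or None if some edge value has no match."""
    matched = []
    k = 0                                   # step 4.4.1: search index
    for u in stripe_edges:                  # step 4.4.3: next edge value
        while k < len(prev_edges) and abs(prev_edges[k] - u) > t:
            k += 1                          # no correspondence: advance
        if k == len(prev_edges):
            return None                     # an edge value went unmatched
        matched.append(prev_edges[k])       # step 4.4.2: |e_k - u| <= t
        k += 1
    return matched                          # step 4.4.4: stripe recovered
```

A caller would run this for each matching stripe in P_cur and, on success, record the recovered stripe into P_pre.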
CN202110213441.XA 2021-02-26 2021-02-26 Method for quickly positioning damaged QR (quick response) code according to image redundant information Active CN112560538B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110213441.XA CN112560538B (en) 2021-02-26 2021-02-26 Method for quickly positioning damaged QR (quick response) code according to image redundant information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110213441.XA CN112560538B (en) 2021-02-26 2021-02-26 Method for quickly positioning damaged QR (quick response) code according to image redundant information

Publications (2)

Publication Number Publication Date
CN112560538A CN112560538A (en) 2021-03-26
CN112560538B true CN112560538B (en) 2021-05-11

Family

ID=75034711

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110213441.XA Active CN112560538B (en) 2021-02-26 2021-02-26 Method for quickly positioning damaged QR (quick response) code according to image redundant information

Country Status (1)

Country Link
CN (1) CN112560538B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113177959B (en) * 2021-05-21 2022-05-03 广州普华灵动机器人技术有限公司 QR code real-time extraction method in rapid movement process
CN117057377B (en) * 2023-10-11 2024-01-12 青岛冠成软件有限公司 Code identification matching method
CN117291208B (en) * 2023-11-24 2024-01-23 四川数盾科技有限公司 Two-dimensional code extraction method and system
CN118171672B (en) * 2024-04-09 2024-09-17 荣耀终端有限公司 Bar code identification method, electronic device, computer storage medium, and program product

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102693409A (en) * 2012-05-18 2012-09-26 四川大学 Method for quickly identifying two-dimension code system type in images
CN107577979A (en) * 2017-07-26 2018-01-12 中科创达软件股份有限公司 DataMatrix type Quick Response Codes method for quickly identifying, device and electronic equipment
CN109409163A (en) * 2018-11-12 2019-03-01 凌云光技术集团有限责任公司 A kind of QR code method for rapidly positioning based on texture features
CN109993019A (en) * 2019-04-15 2019-07-09 苏州国芯科技股份有限公司 Two-dimensional code identification method, system and equipment and medium based on connected domain analysis


Also Published As

Publication number Publication date
CN112560538A (en) 2021-03-26

Similar Documents

Publication Publication Date Title
CN112560538B (en) Method for quickly positioning damaged QR (quick response) code according to image redundant information
CN110866924B (en) Line structured light center line extraction method and storage medium
CN106960208B (en) Method and system for automatically segmenting and identifying instrument liquid crystal number
CN111179243A (en) Small-size chip crack detection method and system based on computer vision
CN113591967B (en) Image processing method, device, equipment and computer storage medium
KR101403876B1 (en) Method and Apparatus for Vehicle License Plate Recognition
US20130070095A1 (en) Fast obstacle detection
CN115170669B (en) Identification and positioning method and system based on edge feature point set registration and storage medium
CN105894521A (en) Sub-pixel edge detection method based on Gaussian fitting
CN110288612B (en) Nameplate positioning and correcting method and device
CN112233116A (en) Concave-convex mark visual detection method based on neighborhood decision and gray level co-occurrence matrix description
CN116704516B (en) Visual inspection method for water-soluble fertilizer package
CN110490924A (en) A kind of light field image feature point detecting method based on multiple dimensioned Harris
CN113538603A (en) Optical detection method and system based on array product and readable storage medium
CN110276759A (en) A kind of bad line defect diagnostic method of Mobile phone screen based on machine vision
CN115078365A (en) Soft package printing quality defect detection method
CN115546139A (en) Defect detection method and device based on machine vision and electronic equipment
CN109784328B (en) Method for positioning bar code, terminal and computer readable storage medium
CN114529555A (en) Image recognition-based efficient cigarette box in-and-out detection method
CN112085723B (en) Automatic detection method for spring jumping fault of truck bolster
CN111402185A (en) Image detection method and device
CN110348363B (en) Vehicle tracking method for eliminating similar vehicle interference based on multi-frame angle information fusion
CN110969612A (en) Two-dimensional code printing defect detection method
JP4492258B2 (en) Character and figure recognition and inspection methods
CN115546145A (en) Defect detection method and device based on machine vision and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: No.15 Xinghuo Road, Jiangbei new district, Nanjing, Jiangsu Province, 210031

Patentee after: Dongji Technology Co.,Ltd.

Address before: No.15 Xinghuo Road, Jiangbei new district, Nanjing, Jiangsu Province, 210031

Patentee before: JIANGSU SEUIC TECHNOLOGY Co.,Ltd.
