CN112800797A - Method and system for positioning DM code region - Google Patents

Method and system for positioning DM code region

Info

Publication number
CN112800797A
Authority
CN
China
Prior art keywords
code
edge
point
dotted line
neighborhood
Prior art date
Legal status
Granted
Application number
CN202011612973.2A
Other languages
Chinese (zh)
Other versions
CN112800797B (en)
Inventor
梁宗林
戚涛
张见
赵严
姚毅
杨艺
Current Assignee
Luster LightTech Co Ltd
Original Assignee
Luster LightTech Co Ltd
Priority date
Filing date
Publication date
Application filed by Luster LightTech Co Ltd filed Critical Luster LightTech Co Ltd
Priority to CN202011612973.2A priority Critical patent/CN112800797B/en
Publication of CN112800797A publication Critical patent/CN112800797A/en
Application granted granted Critical
Publication of CN112800797B publication Critical patent/CN112800797B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1408Methods for optical code recognition the method being specifically adapted for the type of code
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1439Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K7/1443Methods for optical code recognition including a method step for retrieval of the optical code locating of the code in an image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1439Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K7/1452Methods for optical code recognition including a method step for retrieval of the optical code detecting bar code edges
    • G06T3/02
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/10Image enhancement or restoration by non-spatial domain filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20048Transform domain processing
    • G06T2207/20061Hough transform
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]

Abstract

The application provides a method for positioning a DM code area, which comprises the following steps: acquiring a picture to be detected; positioning an ROI (region of interest) containing a DM (Data Matrix) code in the picture to be detected with a YoloV3-Tiny target detection model; and performing cross traversal of the ROI containing the DM code along preset tracks to obtain the effective boundary of the DM code in the ROI. The application also provides a region positioning system for DM codes. The YoloV3-Tiny target detection model quickly locates the ROI areas containing DM codes in the picture to be detected; the ROI areas, which have not yet been precisely positioned, are then traversed cyclically, and the DM codes inside them are precisely positioned by a DM code positioning algorithm. A large number of invalid detection operations are thereby avoided and the accuracy of DM code positioning is improved; moreover, the time taken by the YoloV3-Tiny target detection model to quickly locate the ROI areas of several DM codes is essentially the same as for a single DM code, so the overall efficiency is higher.

Description

Method and system for positioning DM code region
Technical Field
The present application relates to the field of DM code positioning technologies, and in particular, to a method and a system for positioning a DM code area.
Background
A DM (Data Matrix) code is a pattern composed of alternating black and white modules. The DM code shown in fig. 11 is a 24 × 24 DM code and consists of a data area, a positioning pattern and a quiet zone. The data area is formed by regularly arranged square modules; the positioning pattern is the boundary of the data area and is one module wide. Two adjacent edges of the DM code are solid black lines; as shown in fig. 11, these two adjacent edges join to form an "L"-shaped edge, which is mainly used to define the physical size of the DM code, to position the DM code and to correct its distortion. The other two adjacent edges of the DM code are composed of alternating black and white modules and form the dotted line edges, which are mainly used to define the barcode specification of the DM code and the physical size of the unit module in the DM code. When a target DM code has to be found precisely in a picture containing several DM codes, or several bar codes of other types in addition to the DM code, the target DM code area must be located accurately before the information contained in the target DM code can be obtained.
In the prior art, the whole picture containing the DM code to be positioned is generally subjected to dilation and erosion to screen out candidate regions containing the DM code, and each candidate region is then traversed directly, line by line and point by point. This is time-consuming, and when a picture contains several DM codes the time needed to locate the target DM code becomes even longer. Moreover, because the picture may contain bar codes other than DM codes, such as QR codes and one-dimensional bar codes, these other codes can seriously interfere with the positioning of the DM code, so that the efficiency of locating the DM code area is low.
Disclosure of Invention
The application provides a method and a system for locating a DM code area, which are used for solving the problems that the time consumption for locating the DM code area is long and the locating efficiency is low in the prior art.
In one aspect, the present application provides a method for locating an area of a DM code, which specifically includes the following steps:
acquiring a picture to be detected, wherein the picture to be detected is a preprocessed picture containing a DM code;
positioning an ROI (region of interest) containing a DM (Data Matrix) code in the picture to be detected with a YoloV3-Tiny target detection model, wherein the YoloV3-Tiny target detection model is obtained by training a standard YoloV3-Tiny model and configuring its parameters;
and performing cross traversal on the ROI containing the DM codes according to a preset track to obtain the effective boundary of the DM codes in the ROI.
In a preferred embodiment of the present application, the ROI area containing the DM code is traversed in a crossing manner according to a preset trajectory to obtain an effective boundary of the DM code in the ROI area, which includes the following specific processes:
acquiring track point coordinates on a preset track;
if the track point coordinates are effective track points, detecting a DM code boundary;
and if the DM code boundary is an effective boundary, outputting the effective boundary of the DM code.
In the preferred embodiment of the present application, if the trace point coordinates are valid trace points, then the DM code boundary is detected, and the specific process includes:
judging whether the effective track points are strong edge points or not;
if the effective track points are strong edge points, detecting an L-shaped edge of the DM code;
if the L-shaped edge is an effective L-shaped edge, detecting a dotted line edge of the DM code;
and if the dotted line edge is an effective dotted line edge, outputting an effective L-shaped edge and an effective dotted line edge of the DM code.
In the above technical solution, if the traversal strictly follows the derived direction of the dotted line edge, or uses only a single traversal direction, not all of the black and white modules of the dotted line edge may be covered; to traverse all of them accurately, the starting point and the traversal direction must be chosen reasonably.
In the preferred embodiment of the present application, it is determined whether the effective trace point is a strong edge point, and the specific determination process is as follows:
calculating the gradient of the current effective track point with the Sobel operator;
if the gradient of the current effective track point meets the requirement of a first preset threshold value, calculating the gradient of a forward direction neighborhood point and the gradient of a reverse direction neighborhood point of the vertical gradient of the current effective track point;
if the gradient of the forward direction neighborhood point of the vertical gradient of the current effective track point and the gradient of the reverse direction neighborhood point both meet the requirement of a second preset threshold value, judging whether the current effective track point is a common neighborhood point of the forward direction neighborhood point of the vertical gradient and the reverse direction neighborhood point of the vertical gradient;
and if the current effective track point is a common neighborhood point of the vertical gradient forward direction neighborhood point and the vertical gradient reverse direction neighborhood point, judging that the current effective track point is a strong edge point.
In the preferred embodiment of the present application, if the effective trace point is a strong edge point, then detect the L-shaped edge of the DM code, and the specific detection process is as follows:
acquiring a first neighborhood point of the positive direction of the vertical gradient of the current effective track point and a second neighborhood point of the positive direction of the first neighborhood point in the vertical gradient;
when the gradient of an acquired neighborhood point is smaller than a third preset threshold value, or the acquired neighborhood point has already been traversed, no further neighborhood points are acquired in the positive direction of the vertical gradient;
acquiring a first neighborhood point of the current effective track point in the opposite direction of the vertical gradient and a second neighborhood point of the first neighborhood point in the opposite direction of the vertical gradient;
when the gradient of an acquired neighborhood point is smaller than a fourth preset threshold value, or the acquired neighborhood point has already been traversed, no further neighborhood points are acquired in the opposite direction of the vertical gradient;
and connecting the neighborhood points to obtain an L-shaped edge passing through the current effective track point.
In a preferred embodiment of the present application, connecting the neighborhood points to obtain an L-shaped edge passing through the current effective trace point specifically further includes:
obtaining a first right-angle side and a second right-angle side of the L-shaped side through Hough line detection;
calculating the included angle between the first right-angle edge and the second right-angle edge;
judging whether the included angle meets a preset included angle of the DM code;
if the included angle meets a preset included angle, calculating the vector product of the first right-angle edge and the second right-angle edge;
and determining the left side and the bottom side of the DM code according to the vector product, and outputting an L-shaped side detection result of the DM code.
In a preferred embodiment of the present application, if the L-shaped edge is an effective L-shaped edge, the detection process of the dashed edge of the DM code is as follows:
judging the straight line and the direction of the dotted line edge according to the effective L-shaped edge;
detecting an upper dotted line side and a right dotted line side of the dotted line sides according to the straight line and the direction of the dotted line sides;
carrying out Hough transform on the upper dotted line edge and the right dotted line edge respectively to obtain a first straight line and a second straight line;
obtaining an affine transformation matrix between the DM code and the unit square DM code according to the effective L-shaped edge, the first straight line and the second straight line;
enumerating the DM code specifications by brute force, and sampling a DM code image in combination with the affine transformation matrix;
obtaining the best-matching DM code specification according to the average color difference between the black and white modules in the sampled DM code image;
taking the first straight line and the second straight line respectively as traversal tracks, and counting the total number of black and white modules on the first straight line or the second straight line as well as the total number of black and white modules on the dotted line edge of the best-matching DM code specification;
and if the total number of black and white modules on the first straight line or the second straight line is consistent with the total number of black and white modules on the dotted line edge of the best-matching DM code specification, or the error is within a preset error range, judging that the detection of the dotted line edge is finished.
In a preferred embodiment of the present application, the dotted line edge is obtained by dividing it into multiple sub-edges and traversing, for each sub-edge, rays of different angles generated with the Bresenham algorithm, wherein the dotted line edges comprise the upper dotted line edge and the right dotted line edge.
On the other hand, the application also provides a region positioning system of the DM code by adopting the region positioning method of the DM code.
Compared with the prior art, the area positioning method and system of the DM code have the following beneficial effects:
(1) The method combines a deep learning model with a DM code positioning algorithm, so that false detections caused by irrelevant bar codes or scene interference in the picture to be detected are avoided. A finite number of ROI areas containing DM codes are quickly located in the picture by the YoloV3-Tiny target detection model; all ROI areas that have not yet been precisely positioned are traversed cyclically, and the DM codes inside them are precisely positioned by the DM code positioning algorithm. A large number of invalid detection operations on areas without DM codes are thereby avoided, and the accuracy and efficiency of DM code positioning are effectively improved. In addition, when the picture to be detected contains other bar codes besides the DM codes, the time taken by the YoloV3-Tiny target detection model to quickly locate the ROI areas of several DM codes is essentially the same as when the picture contains only a single DM code, which ensures the overall positioning efficiency.
(2) The traversal adopted in the method is selective and sequential cross traversal, so that the actual traversal times are greatly reduced, and the positioning efficiency of the DM codes in the ROI area is effectively improved.
(3) The application changes the detection of the dotted line edge in the DM boundary detection from single-angle detection to multi-angle detection, so that the adaptability of a DM code positioning algorithm is stronger, and the detection rate of DM codes in the ROI area is further improved.
Drawings
In order to more clearly explain the technical solution of the present application, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious to those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a flowchart of a method for locating areas of DM codes according to the present application;
fig. 2 is a schematic diagram of a to-be-detected picture containing a DM code in this embodiment 1;
FIG. 3 is a schematic diagram of the ROI containing DM code in this embodiment 1;
FIG. 4 is a schematic diagram of the effective boundary of the DM code in the ROI area in this embodiment 1;
FIG. 5a is a schematic diagram of the first cross-traversal trajectory of the ROI area in this embodiment 1;
FIG. 5b is a schematic diagram of the second cross-traversal trajectory of the ROI area in this embodiment 1;
FIG. 5c is a schematic diagram of a third traversal trajectory of the ROI area in this embodiment 1;
FIG. 5d is a schematic diagram of the mth traversal trajectory of the ROI area in this embodiment 1;
fig. 6 is a schematic diagram illustrating a principle of determining a strong edge point in this embodiment 1;
FIG. 7 is a schematic diagram illustrating the principle of determining the L-shaped edge in this embodiment 1;
FIG. 8 is a schematic diagram illustrating the detection principle of the L-shaped edge in the embodiment 1;
fig. 9 is a schematic diagram illustrating a principle of detecting a dashed line in embodiment 1;
FIG. 10 is a schematic diagram illustrating a derivation process of the optimal traversal direction of the dashed edge in this embodiment 1;
fig. 11 is a schematic diagram of a DM code with 24 × 24 specification.
Detailed Description
To make the objects, embodiments and advantages of the present application clearer, the following description of exemplary embodiments of the present application will clearly and completely describe the exemplary embodiments of the present application with reference to the accompanying drawings in the exemplary embodiments of the present application, and it is to be understood that the described exemplary embodiments are only a part of the embodiments of the present application, and not all of the embodiments.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the description of the embodiments of the present application, it should be noted that the terms "upper", "lower", "bottom", "left", "right", and "forward direction", "reverse direction", etc. indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings or orientations or positional relationships that the products of the present invention are usually placed in when used, and are only used for convenience of description and simplicity of description, but do not indicate or imply that the devices or elements that are referred to must have a specific orientation, be constructed and operated in a specific orientation, and thus, should not be construed as limiting the present application.
In the description of the embodiments of the present application, descriptions of "first", "second", "third", etc. are only for distinguishing similar concepts and for convenience of describing technical solutions and simplifying the description, but do not indicate or imply that the order must be specific and have any practical meaning, and therefore, the present application is not to be considered as limited.
All other embodiments, which can be derived by a person skilled in the art from the exemplary embodiments described herein without inventive step, are intended to be within the scope of the claims appended hereto. In addition, while the disclosure herein has been presented in terms of one or more exemplary examples, it should be appreciated that each aspect of the disclosure may also be implemented separately as a complete embodiment.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
Example 1
As shown in fig. 1, the present application provides a method for locating a DM code area, which specifically includes the following steps:
s101, acquiring a picture to be detected, wherein the picture to be detected is a preprocessed picture containing DM codes;
s102, positioning an ROI (region of interest) containing a DM code in the picture to be detected through a YoloV3-Tiny target detection model, wherein the YoloV3-Tiny target detection model is obtained by training a standard YoloV3-Tiny model and configuring its parameters;
s103, performing cross traversal on the ROI containing the DM codes according to a preset track to obtain the effective boundary of the DM codes in the ROI.
Yolo is another framework proposed for the speed problem of deep-learning target detection, following Ross Girshick's R-CNN, Fast R-CNN and Faster R-CNN. The YoloV3-Tiny network is a lightweight target detection network based on the YoloV3 algorithm; it has few network layers and few parameters, so real-time operation can be guaranteed on an embedded platform.
Due to the advantages of small calculation amount and high speed of the Yolov3-Tiny model, in the embodiment 1, a standard Yolov3-Tiny model is selected to detect the DM code, and then the Yolov3-Tiny model is trained and parameter fine-tuned on the basis to form a detection model for the DM code, namely, a Yolov3-Tiny target detection model.
In this example 1, the specific construction process of YoloV3-Tiny target detection model is as follows:
firstly, labeling a to-be-detected picture sample containing a DM code by using Labelimage software to obtain a training set test set of the DM code for subsequent operation.
Then the standard YoloV3-Tiny model is configured, i.e. the cfg/voc.data, data/voc.names and cfg/YoloV3-tiny.cfg files are modified. Modifying the cfg/YoloV3-tiny.cfg file specifically includes adjusting the network resolution from 416 × 416 to 704 × 704, which improves accuracy, particularly for small targets.
After the parameter files are modified, the standard YoloV3-Tiny model is adapted to DM codes. The adapted model, namely the YoloV3-Tiny target detection model, is trained on the DM code training set and evaluated on the test set; the evaluation shows that the recognition rate of the YoloV3-Tiny target detection model for DM codes reaches 99 percent.
It should be particularly noted that the configuration of the parameter files of the standard YoloV3-Tiny model in this embodiment 1 is only one technical solution shown in the present application; those skilled in the art can modify the parameter files to be configured according to the actual positioning requirements, and this should not be construed as limiting the scope of the present application.
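For illustration only, the following sketch shows how such a trained detector could be invoked with OpenCV's DNN module to return candidate DM-code ROIs. The file names yolov3-tiny-dm.cfg and yolov3-tiny-dm.weights, the helper name detect_dm_rois and the confidence/NMS thresholds are assumptions made for the sketch, not values given in this application; only the 704 × 704 input resolution comes from the embodiment.

```python
import cv2
import numpy as np

# Hypothetical file names: the application only states that cfg/voc.data,
# data/voc.names and cfg/YoloV3-tiny.cfg are modified (resolution 704 x 704)
# and the model is retrained on the labelled DM code samples.
net = cv2.dnn.readNetFromDarknet("yolov3-tiny-dm.cfg", "yolov3-tiny-dm.weights")

def detect_dm_rois(image, conf_threshold=0.5, nms_threshold=0.4):
    """Return candidate DM-code ROIs as (x, y, w, h) boxes."""
    h, w = image.shape[:2]
    blob = cv2.dnn.blobFromImage(image, 1 / 255.0, (704, 704), swapRB=True, crop=False)
    net.setInput(blob)
    outputs = net.forward(net.getUnconnectedOutLayersNames())

    boxes, scores = [], []
    for out in outputs:
        for det in out:                       # det = [cx, cy, bw, bh, obj, class scores...]
            score = float(det[4] * det[5:].max())
            if score < conf_threshold:
                continue
            cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
            boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
            scores.append(score)
    if not boxes:
        return []
    keep = cv2.dnn.NMSBoxes(boxes, scores, conf_threshold, nms_threshold)
    return [boxes[i] for i in np.array(keep).flatten()]
```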
The following describes a specific technical solution of this embodiment 1 by using a picture to be detected containing a DM code.
Fig. 2 shows a preprocessed picture to be detected, obtained in step S101, that contains one DM code together with a QR code and a Code 128 bar code. Fig. 3 shows the ROI area containing the DM code that is located in this picture by the YoloV3-Tiny target detection model in step S102; the rectangular frame around the DM code in fig. 3 is that ROI area. As fig. 3 shows, the located ROI area contains only the DM code, with the DM code roughly centred in it, and neither of the two interfering codes in fig. 2, the QR code and the Code 128 bar code, is identified or output by the YoloV3-Tiny target detection model. Fig. 4 shows the effective boundary of the DM code obtained by traversing the ROI area of fig. 3 along the preset tracks in step S103; the right part of fig. 4 is an enlarged view of the obtained effective boundary, and the irregular frame next to the DM code is the effective boundary of the DM code. The overall flow of this example is sketched below.
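A minimal sketch of that overall flow, assuming the detect_dm_rois helper from the previous sketch and a hypothetical locate_dm_boundary routine standing in for the cross-traversal boundary search of step S103, whose individual sub-steps are sketched in the sections below:

```python
def locate_dm_codes(image):
    """S101-S103: preprocessed picture -> effective DM code boundaries."""
    results = []
    # S102: coarse localization of DM-code ROIs with the YoloV3-Tiny detector.
    for (x, y, w, h) in detect_dm_rois(image):
        roi = image[y:y + h, x:x + w]
        # S103: cross traversal of the ROI to find the effective DM-code boundary.
        boundary = locate_dm_boundary(roi)   # hypothetical helper, see the sketches below
        if boundary is not None:
            results.append(((x, y, w, h), boundary))
    return results
```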
In this embodiment 1, the specific process of step S103 is as follows:
acquiring track point coordinates on a preset track;
if the track point coordinates are effective track points, detecting a DM code boundary;
and if the DM code boundary is an effective boundary, outputting the effective boundary of the DM code.
It should be particularly noted that the specific cross-traversal tracks of step S103 in this embodiment 1 are shown in figs. 5a, 5b, 5c and 5d. Because the ROI area obtained after positioning by the YoloV3-Tiny target detection model contains only one DM code, and the DM code occupies a relatively large proportion of the ROI area, it is not necessary to traverse every pixel of the ROI area: region a, which accounts for one quarter of the ROI area, is not traversed. In the first pass, the boundary of the ROI area (excluding region a) is bisected as shown in fig. 5a, and pixels are traversed from the ROI boundary towards the interior along tracks 1, 2, 3 and 4 in sequence, until the boundary of the DM code is found or the traversal track reaches the diagonal between the boundary of region a and the ROI boundary. The second pass uses the bisectors shown in fig. 5b; the bisectors that coincide with those of fig. 5a are excluded, and the remaining bisectors are traversed along tracks 1, 2, …, 7, 8 of fig. 5b following the same rule as the first pass. The third pass uses the bisectors shown in fig. 5c; the bisectors that coincide with those of figs. 5a and 5b are excluded, and the remaining bisectors are traversed along tracks 1, 2, …, 15, 16 of fig. 5c following the same rule. The m-th pass uses the N-section bisectors shown in fig. 5d, where N = 2^m. Taking the track numbers in the dashed box on the left of fig. 5d as an example, they are 1, 9, 17, …, 1 + N, which shows that after the third pass the track numbers of this series follow the rule 1 + 8×0, 1 + 8×1, 1 + 8×2, …, 1 + 8×K, where K = N/8; the numbering rules of the other series are the same. (A simplified sketch of how the start positions of successive passes can be generated is given below.)
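The ordering of the passes can be illustrated with a small sketch that generates, pass by pass, the fractional positions along the ROI boundary at which new inward tracks start; this is a simplified reading of figs. 5a-5d that ignores region a and the per-side track numbering.

```python
def traversal_start_fractions(max_pass):
    """Fractional positions (0..1) along each ROI side at which new inward
    traversal tracks start.  Pass m subdivides each side into 2**m equal parts
    and keeps only the subdivision points not used by earlier passes, i.e. the
    odd multiples of 1 / 2**m."""
    for m in range(1, max_pass + 1):
        n = 2 ** m
        yield [k / n for k in range(1, n) if k % 2 == 1]

# pass 1: [0.5]   pass 2: [0.25, 0.75]   pass 3: [0.125, 0.375, 0.625, 0.875]
for m, fractions in enumerate(traversal_start_fractions(3), start=1):
    print(f"pass {m}: {fractions}")
```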
In this embodiment 1, in step S103, if the track point coordinates are valid track points, detecting a DM code boundary, where the specific process includes:
judging whether the effective track points are strong edge points or not;
if the effective track points are strong edge points, detecting an L-shaped edge of the DM code;
if the L-shaped edge is an effective L-shaped edge, detecting a dotted line edge of the DM code;
and if the dotted line edge is an effective dotted line edge, outputting an effective L-shaped edge and an effective dotted line edge of the DM code.
In the above technical solution, if the traversal strictly follows the derived direction of the dotted line edge, or uses only a single traversal direction, not all of the black and white modules of the dotted line edge may be covered; to traverse all of them accurately, the starting point and the traversal direction must be chosen reasonably. In this embodiment 1, the derivation of the optimal traversal direction of the dotted line edge is illustrated in fig. 10.
As shown in fig. 6, the specific determination process of the strong edge point in this embodiment 1 is as follows:
calculating the gradient of the current effective track point A with the Sobel operator;
if the gradient of the current effective track point A meets the requirement of the first preset threshold, calculating the gradients of its neighborhood point B in the positive direction of the vertical gradient and of its neighborhood point C in the reverse direction of the vertical gradient;
if the gradients of the neighborhood points B and C both meet the requirement of the second preset threshold, judging whether the current effective track point A is a common neighborhood point of B and C in the vertical-gradient direction; if so, the current track point A is judged to be a strong edge point. The positive and reverse directions of the vertical gradient are indicated by the arrows in the left diagram of fig. 6, and the right diagram of fig. 6 shows the position of the current track point A in the DM code. (A minimal sketch of this check follows.)
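A minimal sketch of this strong-edge check, assuming the neighborhood points B and C are obtained by rounding a one-pixel step along the direction perpendicular to the local gradient (the "vertical gradient" direction of the text); the thresholds t1 and t2 are placeholders for the first and second preset thresholds.

```python
import cv2
import numpy as np

def sobel_gradients(gray):
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    return gx, gy

def perp_neighbor(gx, gy, p, sign):
    """Neighbour of pixel p = (row, col) one step along the direction
    perpendicular to the local gradient (sign = +1 positive, -1 reverse)."""
    y, x = p
    g = np.array([gy[y, x], gx[y, x]], dtype=np.float32)     # (dy, dx)
    n = float(np.linalg.norm(g))
    if n == 0.0:
        return None
    d = sign * np.array([-g[1], g[0]]) / n                   # rotate gradient by 90 degrees
    return (int(round(y + d[0])), int(round(x + d[1])))

def is_strong_edge_point(gray, p, t1=80.0, t2=60.0):
    """Check whether point p (the track point A) is a strong edge point."""
    gx, gy = sobel_gradients(gray)
    mag = cv2.magnitude(gx, gy)
    h, w = gray.shape
    inside = lambda q: q is not None and 0 <= q[0] < h and 0 <= q[1] < w
    if mag[p] < t1:                                          # gradient of point A
        return False
    b = perp_neighbor(gx, gy, p, +1)                         # neighborhood point B
    c = perp_neighbor(gx, gy, p, -1)                         # neighborhood point C
    if not inside(b) or not inside(c) or mag[b] < t2 or mag[c] < t2:
        return False
    # A must be the common perpendicular-direction neighbour of both B and C.
    return perp_neighbor(gx, gy, b, -1) == p and perp_neighbor(gx, gy, c, +1) == p
```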
As shown in fig. 7, the arrows in the left diagram of fig. 7 indicate the positive and negative vertical-gradient directions of the current effective track point A; the right diagram of fig. 7 shows the position of the current track point A in the DM code and the edge L, an irregular connected line passing through the current track point A. The L-shaped edge of the DM code in this embodiment 1 is detected as follows:
acquiring a first neighborhood point B of the positive vertical gradient direction of a current effective track point A and a second neighborhood point D of the positive vertical gradient direction of the first neighborhood point B;
when the gradient of an acquired neighborhood point is smaller than the third preset threshold, or the acquired neighborhood point has already been traversed, no further neighborhood points are acquired in the positive direction of the vertical gradient;
acquiring a first neighborhood point C of the current effective track point A in the opposite direction of the vertical gradient and a second neighborhood point E of the first neighborhood point C in the opposite direction of the vertical gradient;
when the gradient of an acquired neighborhood point is smaller than the fourth preset threshold, or the acquired neighborhood point has already been traversed, no further neighborhood points are acquired in the reverse direction of the vertical gradient;
and the acquired neighborhood points are connected to obtain an L-shaped edge passing through the current effective track point A, as shown in the right diagram of fig. 7 (a minimal sketch of this point-following step is given below).
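A minimal sketch of this point-following step, reusing the sobel_gradients and perp_neighbor helpers from the strong-edge sketch above; the threshold t3 stands in for the third and fourth preset thresholds.

```python
def trace_edge_through(gray, p, t3=60.0, max_len=500):
    """Collect the edge points passing through p by walking both ways along the
    direction perpendicular to the gradient, stopping when the gradient falls
    below the placeholder threshold t3, the image border is reached, or a point
    repeats.  Returns the points of the edge L in order."""
    gx, gy = sobel_gradients(gray)
    mag = cv2.magnitude(gx, gy)
    h, w = gray.shape

    def walk(sign):
        chain, cur, seen = [], p, {p}
        for _ in range(max_len):
            nxt = perp_neighbor(gx, gy, cur, sign)
            if (nxt is None or nxt in seen
                    or not (0 <= nxt[0] < h and 0 <= nxt[1] < w)
                    or mag[nxt] < t3):
                break
            chain.append(nxt)
            seen.add(nxt)
            cur = nxt
        return chain

    forward = walk(+1)     # B, D, ... in the positive vertical-gradient direction
    reverse = walk(-1)     # C, E, ... in the reverse vertical-gradient direction
    return list(reversed(reverse)) + [p] + forward
```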
In this embodiment 1, as shown in fig. 8, after the L-shaped edge passing through the current effective track point has been obtained, right-angle edge detection is further performed as follows:
Hough transform is applied to the edge L with the current effective track point A as reference to obtain the best-fitting straight line L1 near point A; starting from point A, the other points of edge L are traversed in both the positive and the negative direction until the angle between L1 and the line formed with point A exceeds the tolerance range, giving a sub-edge K with the two points B and C as end points. Starting from point B, the remaining points of edge L are traversed in the positive direction until the traversal of edge L is finished, giving a sub-edge M; starting from point C, the remaining points are traversed in the negative direction until the traversal of edge L is finished, giving a sub-edge N. Hough transform is applied to sub-edge M to obtain the best straight line L2, and to sub-edge N to obtain the best straight line L3; L1, together with whichever of L2 and L3 contains more edge points, is output as the two right-angle sides of the L-shaped edge;
calculating the included angle of the first right-angle side L1 and the second right-angle side L2;
judging whether the included angle meets a preset included angle of the DM code;
if the included angle meets a preset included angle, calculating a vector product of the first right-angle side L1 and the second right-angle side L2;
and the left side and the bottom side of the DM code are determined from the vector product and the L-shaped edge detection result of the DM code is output. In this embodiment 1 the two right-angle sides of the L-shaped edge are the first right-angle side L1 and the second right-angle side L2 shown in fig. 8, with L1 being the left side and L2 the bottom side. (A minimal sketch of this right-angle check is given below.)
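A minimal sketch of the angle and vector-product test, assuming the two candidate sides are represented by direction vectors pointing away from their common corner in image coordinates (x to the right, y downwards); the angle tolerance and the mapping of the cross-product sign to "left"/"bottom" are assumptions of the sketch.

```python
import numpy as np

def classify_right_angle(v1, v2, angle_tol_deg=15.0):
    """v1, v2: direction vectors of the two candidate right-angle sides, both
    pointing away from their common corner, in image coordinates (x to the
    right, y downwards).  Returns a {"left": ..., "bottom": ...} assignment, or
    None if the included angle is not close enough to 90 degrees."""
    v1 = np.asarray(v1, dtype=float)
    v2 = np.asarray(v2, dtype=float)
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    if abs(angle - 90.0) > angle_tol_deg:       # preset included angle not met
        return None
    # z component of the cross product: with y pointing down, cross(left, bottom) > 0
    # when the corner is the lower-left corner of the code (sign convention assumed).
    z = v1[0] * v2[1] - v1[1] * v2[0]
    return {"left": v1, "bottom": v2} if z > 0 else {"left": v2, "bottom": v1}

# Example: one side going up and one side going right from the corner.
print(classify_right_angle((0, -1), (1, 0)))    # -> (0, -1) classified as "left"
```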
As shown in fig. 9, the dotted line edge of the DM code in this embodiment 1 is obtained by dividing it into multiple sub-edges. The sub-edge AB is obtained by the L-shaped edge detection principle, with point A lying on the L-shaped edge; the sub-edge BC is obtained by traversing rays of different angles generated with the Bresenham algorithm, since within a certain distortion range a ray of some angle will always pass through the next black module. The angle coefficient of the derived direction of the dotted line edge is set to 1.0, and three correction coefficients, 0.6, 0.9 and 1.2, represent the degree to which the actual traversal direction deviates from the derived direction: a coefficient greater than 1.0 means deviation to the outside (the larger the coefficient, the larger the deviation angle), and a coefficient smaller than 1.0 means deviation to the inside (the smaller the coefficient, the larger the deviation angle). Because three correction coefficients can be set for the upper dotted line edge and three for the right dotted line edge, their combination yields nine traversal-direction combinations. (A sketch of the ray generation and of these combinations is given below.)
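The ray generation and the nine coefficient combinations can be sketched as follows; the Bresenham line itself is the standard algorithm, while the way the correction coefficient is applied to the traversal angle is not spelled out in the text and is therefore left to the caller here.

```python
import itertools
import math

def bresenham_line(p0, p1):
    """Integer pixels of the standard Bresenham line between p0 and p1 (row, col)."""
    y0, x0 = p0
    y1, x1 = p1
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    pts = []
    while True:
        pts.append((y0, x0))
        if (y0, x0) == (y1, x1):
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy
    return pts

def ray_at_angle(start, angle_rad, length):
    """Ray of the given pixel length from start at the given angle (image coords)."""
    y0, x0 = start
    end = (int(round(y0 + math.sin(angle_rad) * length)),
           int(round(x0 + math.cos(angle_rad) * length)))
    return bresenham_line(start, end)

# Nine traversal-direction combinations tried by brute force:
# (upper dotted-line-edge coefficient, right dotted-line-edge coefficient).
COEFFICIENTS = [0.6, 0.9, 1.2]
COMBINATIONS = list(itertools.product(COEFFICIENTS, COEFFICIENTS))
```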
Written as (upper dotted line edge coefficient, right dotted line edge coefficient), the correction coefficient combinations are (0.6, 0.6), (0.6, 0.9), (0.6, 1.2), (0.9, 0.6), (0.9, 0.9), (0.9, 1.2), (1.2, 0.6), (1.2, 0.9) and (1.2, 1.2). The algorithm module adopts a brute-force enumeration strategy and tries all traversal-direction combinations in turn; the specific detection process is as follows:
judging the straight line and the direction of the dotted line edge according to the effective L-shaped edge;
detecting an upper dotted line side and a right dotted line side of the dotted line sides according to the straight line and the direction of the dotted line sides;
the upper dotted line edge is detected as follows: starting from point A on the L-shaped edge, neighborhood points are traversed one by one in the positive direction of the vertical gradient of point A; during this traversal, when the positive and negative vertical-gradient directions of consecutive neighborhood points become opposite to the positive vertical-gradient direction of point A, the number of such opposite-direction neighborhood points is counted, and when this number exceeds a set threshold the traversal of edge points is stopped, which gives the end point B of this sub-edge. A ray starting at point B is then generated according to the traversal direction of the upper dotted line edge in the current traversal-direction combination, and points are traversed one by one along this ray until a strong edge point is found or the accumulated traversal length exceeds a length threshold. Assuming a strong edge point C is found on this ray, the line segment BC is obtained and taken as part of the dotted line edge, giving the complete sub-edge curve AC. Similarly, the traversal then continues with point C as starting point to obtain a sub-edge CD, as shown in the left diagram of fig. 9, and the iteration is repeated until the accumulated traversal length exceeds the length threshold, which yields the complete upper dotted line edge. The right dotted line edge is obtained in the same way; the resulting upper and right dotted line edges are shown in the right diagram of fig. 9;
performing hough transform on the upper dotted line edge and the right dotted line edge respectively to obtain a first straight line L3 and a second straight line L4 as in the right diagram of fig. 9;
obtaining an affine transformation matrix between the DM code and the unit square DM code according to the effective L-shaped edges, namely a first right-angle edge L1 and a second right-angle edge L2 as well as the first straight line L3 and the second straight line L4;
enumerating the DM code specifications by brute force, and sampling the DM code image in combination with the affine transformation matrix;
obtaining the best-matching DM code specification according to the average color difference between the black and white modules in the sampled DM code image, the best-matching specification being the one with the largest average black-and-white color difference;
taking the first straight line L3 and the second straight line L4 respectively as traversal tracks, and counting the total number of black and white modules on the first straight line L3 or the second straight line L4 as well as the total number of black and white modules on the corresponding dotted line edge of the best-matching DM code specification;
and if the total number of black and white modules on the first straight line L3 or the second straight line L4 is consistent with the total number of black and white modules on the dotted line edge of the best-matching DM code specification, or the error is within a preset error range, it is determined that the detection of the dotted line edge is finished.
If the total number of black and white modules on the first straight line L3 or the second straight line L4 is not consistent with the total number of black and white modules on the dotted line edge of the best-matching DM code specification, and the error is not within the preset error range, the detection of the dotted line edge is attempted with the next traversal-direction combination, until either all combinations have been tried or the dotted line edge has been detected successfully under some combination. (A rough sketch of the specification matching is given below.)
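A rough sketch of the brute-force specification matching, under several assumptions: the candidate square module counts are illustrative, the affine matrix is parameterised as a 2 × 3 mapping from the unit square back into the image, and dark and bright modules are separated by the mean grey level of the sampled points.

```python
import numpy as np

# Illustrative square ECC200 module counts; the application only says the DM code
# specifications are enumerated by brute force.
CANDIDATE_SIZES = [10, 12, 14, 16, 18, 20, 22, 24, 26, 32, 36, 40, 44, 48, 52]

def best_matching_size(gray, inv_affine):
    """Pick the module count whose sampled grid maximises the average grey-level
    difference between dark and bright modules.  inv_affine is assumed to be a
    2 x 3 matrix mapping unit-square coordinates (u, v) in [0, 1] back into
    image coordinates, derived from the L-shaped edge and the two Hough lines."""
    best, best_score = None, -1.0
    for n in CANDIDATE_SIZES:
        # Sample the centre of every module of an n x n grid.
        u = (np.arange(n) + 0.5) / n
        uu, vv = np.meshgrid(u, u)
        pts = np.stack([uu.ravel(), vv.ravel(), np.ones(n * n)])   # homogeneous coords
        xy = (inv_affine @ pts).T                                  # image coordinates
        xs = np.clip(np.round(xy[:, 0]).astype(int), 0, gray.shape[1] - 1)
        ys = np.clip(np.round(xy[:, 1]).astype(int), 0, gray.shape[0] - 1)
        samples = gray[ys, xs].astype(float)
        dark = samples[samples < samples.mean()]
        bright = samples[samples >= samples.mean()]
        if dark.size == 0 or bright.size == 0:
            continue
        score = bright.mean() - dark.mean()    # average black/white colour difference
        if score > best_score:
            best, best_score = n, score
    return best
```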
It should be particularly noted that, in this embodiment 1, all the preset values or threshold values may be determined according to specific situations, which is not limited in this application, and the correction coefficient may also be set to different values according to actual detection, and the correction coefficient in this embodiment 1 is only a coefficient set for better illustrating the technical solution of this application, and is not to be construed as a limitation to the protection scope of this application.
Example 2
This embodiment 2 provides an area location system of DM codes using the area location method of DM codes in this embodiment 1.
The embodiments provided in the present application are only a few examples of the general concept of the present application, and do not limit the scope of the present application. Any other embodiments extended according to the scheme of the present application without inventive efforts will be within the scope of protection of the present application for a person skilled in the art.

Claims (9)

1. A method for locating a DM code area is characterized by comprising the following steps:
acquiring a picture to be detected, wherein the picture to be detected is a preprocessed picture containing a DM code;
positioning an ROI (region of interest) containing a DM (Data Matrix) code in the picture to be detected with a YoloV3-Tiny target detection model, wherein the YoloV3-Tiny target detection model is obtained by training a standard YoloV3-Tiny model and configuring its parameters;
and performing cross traversal on the ROI containing the DM codes according to a preset track to obtain the effective boundary of the DM codes in the ROI.
2. The method according to claim 1, wherein the ROI area containing the DM code is traversed in a crossing manner according to a preset trajectory to obtain an effective boundary of the DM code in the ROI area, and the specific process is as follows:
acquiring track point coordinates on a preset track;
if the track point coordinates are effective track points, detecting a DM code boundary;
and if the DM code boundary is an effective boundary, outputting the effective boundary of the DM code.
3. The method according to claim 2, wherein if the coordinates of the trace points are valid trace points, detecting the DM code boundary, and the specific process includes:
judging whether the effective track points are strong edge points or not;
if the effective track points are strong edge points, detecting an L-shaped edge of the DM code;
if the L-shaped edge is an effective L-shaped edge, detecting a dotted line edge of the DM code;
and if the dotted line edge is an effective dotted line edge, outputting an effective L-shaped edge and an effective dotted line edge of the DM code.
4. The method according to claim 3, wherein the method for locating the area of the DM code is to determine whether the effective track point is a strong edge point, and the specific determination process is as follows:
calculating the gradient of the current effective track point according to the Sobel operator;
if the gradient of the current effective track point meets the requirement of a first preset threshold value, calculating the gradient of a forward direction neighborhood point and the gradient of a reverse direction neighborhood point of the vertical gradient of the current effective track point;
if the gradient of the forward direction neighborhood point of the vertical gradient of the current effective track point and the gradient of the reverse direction neighborhood point both meet the requirement of a second preset threshold value, judging whether the current effective track point is a common neighborhood point of the forward direction neighborhood point of the vertical gradient and the reverse direction neighborhood point of the vertical gradient;
and if the current effective track point is a common neighborhood point of the vertical gradient forward direction neighborhood point and the vertical gradient reverse direction neighborhood point, judging that the current effective track point is a strong edge point.
5. The method according to claim 3, wherein if the valid track point is a strong edge point, detecting an L-shaped edge of the DM code, specifically detecting as follows:
acquiring a first neighborhood point of the positive direction of the vertical gradient of the current effective track point and a second neighborhood point of the positive direction of the first neighborhood point in the vertical gradient;
when the gradient of the acquired neighborhood point is smaller than a third preset threshold value or the acquired neighborhood point is a traversed neighborhood point, no neighborhood point in the positive direction of the vertical gradient is acquired;
acquiring a first neighborhood point of the current effective track point in the opposite direction of the vertical gradient and a second neighborhood point of the first neighborhood point in the opposite direction of the vertical gradient;
when the gradient of the acquired neighborhood point is smaller than a fourth preset threshold value or the acquired neighborhood point is a traversed neighborhood point, no neighborhood point in the opposite direction of the vertical gradient is acquired;
and connecting the neighborhood points to obtain an L-shaped edge passing through the current effective track point.
6. The method according to claim 5, wherein the neighborhood points are connected to obtain an L-shaped edge passing through the current valid trace point, and the method further comprises:
obtaining a first right-angle side and a second right-angle side of the L-shaped side through Hough line detection;
calculating the included angle between the first right-angle edge and the second right-angle edge;
judging whether the included angle meets a preset included angle of the DM code;
if the included angle meets a preset included angle, calculating the vector product of the first right-angle edge and the second right-angle edge;
and determining the left side and the bottom side of the DM code according to the vector product, and outputting an L-shaped side detection result of the DM code.
7. The method according to claim 3, wherein if the L-shaped edge is an effective L-shaped edge, the method detects a dotted edge of the DM code, and the specific detection process is as follows:
judging the straight line and the direction of the dotted line edge according to the effective L-shaped edge;
detecting an upper dotted line side and a right dotted line side of the dotted line sides according to the straight line and the direction of the dotted line sides;
carrying out Hough transform on the upper dotted line edge and the right dotted line edge respectively to obtain a first straight line and a second straight line;
obtaining an affine transformation matrix between the DM code and the unit square DM code according to the effective L-shaped edge, the first straight line and the second straight line;
enumerating the DM code specifications by brute force, and sampling a DM code image in combination with the affine transformation matrix;
obtaining the DM code specification which is matched most according to the average color difference value of the black and white modules in the DM code image obtained by sampling;
respectively taking the first straight line and the second straight line as traversal tracks, and counting the total number of black and white modules on the first straight line or the second straight line and the total number of black and white modules on the edge of the dotted line in the DM code specification which is most matched with the black and white modules;
and if the total number of the black and white modules on the first straight line or the second straight line is consistent with the total number of the black and white modules on the dotted line side in the DM code specification which is matched with the black and white modules on the dotted line side or the error is within a preset error range, judging that the detection of the dotted line side is finished.
8. The method of claim 7, wherein the DM code area location method,
the dotted line edge is obtained by dividing the dotted line edge into a plurality of sections of sub-edges and traversing straight lines of the plurality of sections of sub-edges with different angles generated by Bresenham algorithm, wherein the dotted line edge comprises the upper dotted line edge and the right dotted line edge.
9. A system for area location of DM codes, characterized in that a method for area location of DM codes according to any one of claims 1 to 8 is used.
CN202011612973.2A 2020-12-30 2020-12-30 Region positioning method and system for DM code Active CN112800797B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011612973.2A CN112800797B (en) 2020-12-30 2020-12-30 Region positioning method and system for DM code

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011612973.2A CN112800797B (en) 2020-12-30 2020-12-30 Region positioning method and system for DM code

Publications (2)

Publication Number Publication Date
CN112800797A true CN112800797A (en) 2021-05-14
CN112800797B CN112800797B (en) 2023-12-19

Family

ID=75805871

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011612973.2A Active CN112800797B (en) 2020-12-30 2020-12-30 Region positioning method and system for DM code

Country Status (1)

Country Link
CN (1) CN112800797B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI827423B (en) * 2022-12-28 2023-12-21 大陸商信揚科技(佛山)有限公司 Scanning method and related devices

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101398895A (en) * 2007-09-26 2009-04-01 杨高波 Image preprocess method based on data matrix two-dimension bar code identification
US20160343143A1 (en) * 2014-03-05 2016-11-24 Mitsubishi Electric Corporation Edge detection apparatus, edge detection method, and computer readable medium
CN106485183A (en) * 2016-07-14 2017-03-08 深圳市华汉伟业科技有限公司 A kind of Quick Response Code localization method and system
US20180165492A1 (en) * 2015-09-02 2018-06-14 Fujian Landi Commercial Equipment Co., Ltd. Decoding Method and System for QR Code with One Damaged Position Detection Pattern
CN108648205A (en) * 2018-05-07 2018-10-12 广州大学 A kind of sub-pixel edge detection method
WO2019227615A1 (en) * 2018-06-01 2019-12-05 平安科技(深圳)有限公司 Method for correcting invoice image, apparatus, computer device, and storage medium
CN110941970A (en) * 2019-12-05 2020-03-31 深圳牛图科技有限公司 High-speed dimension code positioning and identifying system based on full convolution neural network
CN111080661A (en) * 2019-12-09 2020-04-28 Oppo广东移动通信有限公司 Image-based line detection method and device and electronic equipment

Also Published As

Publication number Publication date
CN112800797B (en) 2023-12-19

Similar Documents

Publication Publication Date Title
CN105069394B (en) Quick Response Code weighted average gray level method coding/decoding method and system
US10438038B2 (en) Decoding method and system for QR code with one damaged position detection pattern
CN107633192B (en) Bar code segmentation and reading method based on machine vision under complex background
CN105989317B (en) Two-dimensional code identification method and device
EP1678659B1 (en) Method and image processing device for analyzing an object contour image, method and image processing device for detecting an object, industrial vision apparatus, smart camera, image display, security system, and computer program product
CN108182383B (en) Vehicle window detection method and device
US10528781B2 (en) Detection method and system for characteristic patterns of Han Xin codes
CN104517089A (en) Two-dimensional code decoding system and method
CN106407924A (en) Binocular road identifying and detecting method based on pavement characteristics
US6941026B1 (en) Method and apparatus using intensity gradients for visual identification of 2D matrix symbols
CN113012157B (en) Visual detection method and system for equipment defects
CN101727580A (en) Image processing apparatus, electronic medium, and image processing method
CA3045391C (en) Method for detection and recognition of long-range high-density visual markers
CN104318559A (en) Quick feature point detecting method for video image matching
CN116704209B (en) Quick flange contour extraction method and system
CN111914845A (en) Character layering method and device in license plate and electronic equipment
CN110533028B (en) Method and device for detecting commodity display state, electronic device and storage medium
CN112800797A (en) Method and system for positioning DM code region
CN114417904A (en) Bar code identification method based on deep learning and book retrieval system
CN104240269A (en) Video target tracking method based on spatial constraint coding
Wang LFTag: A scalable visual fiducial system with low spatial frequency
CN114612490B (en) Scenedesmus cell statistical method based on microscope image
US11941863B2 (en) Imaging system and method using a multi-layer model approach to provide robust object detection
CN111191759A (en) Two-dimensional code generation method and positioning and decoding method based on GPU
CN114926817A (en) Method and device for identifying parking space, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant