CN117197422B - Identification code positioning method, electronic equipment and storage medium - Google Patents


Info

Publication number
CN117197422B
CN117197422B (application CN202311466383.7A)
Authority
CN
China
Prior art keywords
line segment
identification code
fitted
distance
code image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311466383.7A
Other languages
Chinese (zh)
Other versions
CN117197422A (en)
Inventor
陈文钊
边旭
冉东来
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Youibot Robotics Technology Co ltd
Original Assignee
Shenzhen Youibot Robotics Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Youibot Robotics Technology Co ltd filed Critical Shenzhen Youibot Robotics Technology Co ltd
Priority to CN202311466383.7A
Publication of CN117197422A
Application granted
Publication of CN117197422B
Legal status: Active

Landscapes

  • Image Analysis (AREA)

Abstract

The present application provides an identification code positioning method, an electronic device, and a computer-readable storage medium. The positioning method comprises the following steps: extracting candidate areas from an identification code image to be positioned to obtain reference code candidate areas of the identification code image; extracting line segments based on the identification code image to obtain a line segment set of the identification code image; performing included-angle fitting based on the line segment set to obtain each target included angle of the identification code image; integrating the reference code candidate area and the target included angles to obtain a parallelogram of the reference code candidate area; and outputting a target reference code region of the identification code image based on the parallelogram. With the method and device, the detection pattern of a reference code can still be positioned when the reference code is missing, damaged, or occluded, which improves the occlusion resistance of the quadrilateral detection pattern of the reference code and, in turn, the recognition rate of the reference code.

Description

Identification code positioning method, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of computer vision, and in particular, to a positioning method for an identification code, an electronic device, and a computer readable storage medium.
Background
With the continuous development of computer vision technology, two-dimensional identification codes (such as QR codes, DataMatrix codes, AprilTag codes, and ArUco codes) are being applied in more and more fields. Among these, AprilTag codes and ArUco codes are reference codes whose detection patterns are rectangular; if the rectangular detection pattern is occluded, damaged, or contaminated, the reference code cannot be positioned normally and identification of the reference code fails. It can therefore be seen that improving the occlusion resistance of the rectangular detection pattern of the reference code, so as to improve the recognition rate of the reference code, is a problem to be solved.
Disclosure of Invention
The present application provides an identification code positioning method, an electronic device, and a computer-readable storage medium, which can improve the occlusion resistance of the quadrilateral detection pattern of a reference code and thereby improve the recognition rate of the reference code.
In a first aspect, the present application provides a positioning method for an identification code, the method including:
extracting candidate areas of the identification code images to be positioned to obtain reference code candidate areas of the identification code images;
extracting line segments based on the identification code image to obtain a line segment set of the identification code image;
performing included angle fitting based on the line segment set to obtain each target included angle of the identification code image;
integrating the reference code candidate region and the target included angle to obtain a parallelogram of the reference code candidate region;
and outputting a target reference code region of the identification code image based on the parallelogram.
In some embodiments, the extracting the line segments based on the identification code image to obtain a line segment set of the identification code image includes:
performing edge detection on the identification code image to obtain a preliminary line segment of the identification code image;
performing inflection point detection based on a first line segment between a starting point of the preliminary line segment and a midpoint of the preliminary line segment to obtain a first inflection point result of the preliminary line segment;
performing inflection point detection based on a second line segment between the end point of the preliminary line segment and the midpoint of the preliminary line segment to obtain a second inflection point result of the preliminary line segment;
updating the end points of the preliminary line segment based on the first inflection point result and the second inflection point result to obtain an updated line segment serving as a target line segment;
and obtaining a line segment set of the identification code image based on the target line segment.
In some embodiments, the performing an angle fitting based on the line segment set to obtain each target angle of the identification code image includes:
carrying out line segment combination based on the line segment set to obtain a first line segment to be fitted and a second line segment to be fitted, wherein the first line segment to be fitted points from a first starting endpoint to a first ending endpoint, and the second line segment to be fitted points from a second starting endpoint to a second ending endpoint;
acquiring a first distance and a second distance based on the first line segment to be fitted and the second line segment to be fitted, wherein the first distance is the distance between the first ending endpoint and the second starting endpoint, and the second distance is the distance between the second ending endpoint and the first starting endpoint;
and if the first distance is smaller than the second distance and the first distance is smaller than a first preset length, taking an intersection point between the first line segment to be fitted and the second line segment to be fitted and an included angle formed by the first line segment to be fitted and the second line segment to be fitted as a target included angle of the identification code image.
In some embodiments, the method further comprises:
and if the second distance is smaller than the first distance and the second distance is smaller than a first preset length, taking an intersection point between the first line segment to be fitted and the second line segment to be fitted and an included angle formed by the first line segment to be fitted and the second line segment to be fitted as a target included angle of the identification code image.
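The two branches above can be sketched as follows: compute the first and second distances, and when the smaller one is under the first preset length, return the intersection of the two fitted lines together with the included angle they form. This is a minimal Python sketch; the function name, coordinate convention, and the 8-pixel threshold are illustrative assumptions (the patent does not fix the preset length):

```python
import math

def fit_corner(p1s, p1e, p2s, p2e, max_gap=8.0):
    """Angle fitting for two segments to be fitted.

    Segment 1 points from p1s to p1e, segment 2 from p2s to p2e.
    `max_gap` plays the role of the first preset length (assumed value).
    Returns ((ix, iy), angle_in_degrees) or None when no corner is fitted.
    """
    d1 = math.dist(p1e, p2s)   # first distance: first ending -> second starting
    d2 = math.dist(p2e, p1s)   # second distance: second ending -> first starting
    if min(d1, d2) >= max_gap:
        return None            # neither branch's condition holds
    # intersection of the two infinite lines through the segments
    x1, y1 = p1s; x2, y2 = p1e; x3, y3 = p2s; x4, y4 = p2e
    den = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if den == 0:
        return None            # parallel segments: no intersection point
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / den
    ix, iy = x1 + t * (x2 - x1), y1 + t * (y2 - y1)
    # included angle between the two direction vectors
    v1 = (x2 - x1, y2 - y1); v2 = (x4 - x3, y4 - y3)
    cosang = (v1[0] * v2[0] + v1[1] * v2[1]) / (math.hypot(*v1) * math.hypot(*v2))
    ang = math.degrees(math.acos(max(-1.0, min(1.0, cosang))))
    return (ix, iy), ang

corner = fit_corner((0, 0), (9, 0), (10, 1), (10, 10))
```

In practice the pre-checks of the later embodiments (segment pixel length above a second preset length, angle inside a preset range) would be applied before this computation.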
In some embodiments, the obtaining the first distance and the second distance based on the first line segment to be fitted and the second line segment to be fitted includes:
detecting the length of the pixel point of the first line segment to be fitted and the length of the pixel point of the second line segment to be fitted;
and if the length of the pixel points of the first line segment to be fitted is larger than a second preset length and the length of the pixel points of the second line segment to be fitted is larger than a second preset length, acquiring a first distance and a second distance based on the first line segment to be fitted and the second line segment to be fitted.
In some embodiments, the obtaining the first distance and the second distance based on the first line segment to be fitted and the second line segment to be fitted includes:
detecting an angle between the first line segment to be fitted and the second line segment to be fitted;
and if the angle between the first line segment to be fitted and the second line segment to be fitted is in a preset angle range, acquiring a first distance and a second distance based on the first line segment to be fitted and the second line segment to be fitted.
In some embodiments, the extracting the candidate region of the identification code image to be located to obtain the candidate region of the reference code of the identification code image includes:
performing binarization processing on an identification code image to be positioned to obtain a binarization map of the identification code image;
performing expansion operation on the binarization map to obtain an expansion map of the binarization map;
and obtaining a reference code candidate region of the identification code image based on the region of the continuous target pixel point in the expansion map, wherein the target pixel point is a pixel point with a gray value meeting a preset condition.
In some embodiments, the expanding the binarized map to obtain an expanded map of the binarized map includes:
acquiring a preset structural element;
acquiring an occupied pixel point of the structural element when the origin of the structural element is in a target pixel point in the binarization graph;
and obtaining an expansion map of the binarization map based on the occupied pixel points.
In a second aspect, the present application further provides an electronic device, where the electronic device includes a processor and a memory, where the memory stores a computer program, and the processor executes any one of the positioning methods of the identification codes provided in the present application when calling the computer program in the memory.
In a third aspect, the present application also provides a computer readable storage medium having stored thereon a computer program, the computer program being loaded by a processor to perform the method of locating an identification code.
In the present application, on the one hand, line segments are extracted based on the identification code image to obtain a line segment set of the identification code image, so that the edges of potential detection patterns in the image are extracted; included-angle fitting is performed based on the line segment set to obtain each target included angle of the identification code image, so that the included angles between the edges of potential detection patterns are restored by fitting the extracted line segments; and the reference code candidate region and the target included angles are integrated to obtain the parallelogram of the reference code candidate region, so that the detection pattern of the reference code in the identification code image can be recovered from the target included angles. On the other hand, because the parallelogram obtained by integrating the reference code candidate region and the target included angles is used as the target reference code region of the identification code image, failure to detect a rectangular detection pattern that has been deformed in the image is avoided, and the detection pattern of the reference code can be positioned even when the reference code is missing, damaged, or occluded. This improves the occlusion resistance of the quadrilateral detection pattern of the reference code and, in turn, the recognition rate of the reference code.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic block diagram of an electronic device according to an embodiment of the present application;
fig. 2 is a flow chart of a positioning method of an identification code according to an embodiment of the present application;
FIG. 3 is a schematic illustration of determining candidate regions of reference codes according to an embodiment of the present application;
FIG. 4 is a schematic illustration of determining a target line segment according to an embodiment of the present application;
fig. 5 is a schematic illustration of target angle fitting provided in an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The flow diagrams depicted in the figures are merely illustrative and not necessarily all of the elements and operations/steps are included or performed in the order described. For example, some operations/steps may be further divided, combined, or partially combined, so that the order of actual execution may be changed according to actual situations.
In the description of the embodiments of the present application, it should be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or an implicit indication of the number of features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more of the described features. In the description of the embodiments of the present application, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
The following description is presented to enable any person skilled in the art to make and use the application. In the following description, details are set forth for purposes of explanation. It will be apparent to one of ordinary skill in the art that the present application may be practiced without these specific details. In other instances, well-known processes have not been described in detail in order to avoid unnecessarily obscuring descriptions of the embodiments of the present application. Thus, the present application is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed in the embodiments of the present application.
The embodiment of the application provides a positioning method of an identification code, electronic equipment and a computer readable storage medium. The electronic device may be a mobile robot, a cell phone, a computer, etc.
Some embodiments of the present application are described in detail below with reference to the accompanying drawings. The following embodiments and features of the embodiments may be combined with each other without conflict.
Fig. 1 is a schematic block diagram of an electronic device according to an embodiment of the present application.
As shown in fig. 1, the electronic device 100 includes a processor 101 and a memory 102, the processor 101 and the memory 102 being connected by a bus 103, such as an I2C (Inter-integrated Circuit) bus.
In particular, the processor 101 is configured to provide computing and control capabilities to support the operation of the overall electronic device 100. The processor 101 may be a central processing unit (Central Processing Unit, CPU), the processor 101 may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field-programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. Wherein the general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
Specifically, the Memory 102 may be a Flash chip, a Read-Only Memory (ROM) disk, an optical disk, a U-disk, a removable hard disk, or the like.
It will be appreciated by those skilled in the art that the structure shown in fig. 1 is merely a block diagram of a portion of the structure related to the embodiments of the present application and is not limiting of the electronic device to which the embodiments of the present application apply, and that a particular electronic device may include more or fewer components than shown, or may combine some of the components, or have a different arrangement of components.
The processor 101 is configured to execute a computer program stored in the memory 102, and implement any one of the positioning methods of the identification code provided in the embodiments of the present application when the computer program is executed. For example, the processor 101 is configured to run a computer program stored in the memory 102, and when the computer program is executed, the following steps may be implemented:
extracting candidate areas of the identification code images to be positioned to obtain reference code candidate areas of the identification code images; extracting line segments based on the identification code image to obtain a line segment set of the identification code image; performing included angle fitting based on the line segment set to obtain each target included angle of the identification code image; integrating the reference code candidate region and the target included angle to obtain a parallelogram of the reference code candidate region; and outputting a target reference code region of the identification code image based on the parallelogram.
It should be noted that, for convenience and brevity of description, specific working processes of the above-described electronic device may refer to corresponding processes in the following embodiment of the positioning method of the identification code, which is not described herein again.
Hereinafter, the method for locating an identification code provided in the embodiment of the present application will be described in detail by taking the electronic device shown in fig. 1 as an execution body of the method for locating an identification code as an example, and in order to simplify and facilitate description, the execution body will be omitted in the following method embodiments. It should be noted that, the scenario in fig. 1 is only used to explain the positioning method of the identification code provided in the embodiment of the present application, but does not constitute limitation of the application scenario of the positioning method of the identification code provided in the embodiment of the present application.
Referring to fig. 2, fig. 2 is a flowchart of a positioning method of an identification code according to an embodiment of the present application. The positioning method of the identification code comprises the following steps 201 to 205, wherein:
201. and extracting candidate areas of the identification code images to be positioned to obtain reference code candidate areas of the identification code images.
For a better understanding of the present embodiment, the following first describes the part names related to the present embodiment:
1. Target pixel point: a pixel point whose gray value equals a preset gray value. For example, the pixels with the largest gray value in the binarization map (such as pixels with a gray value of 255), or the pixels with the smallest gray value in the binarization map (such as pixels with a gray value of 0).
2. Binarization map: the binary image of the identification code image. The identification code image can be converted into the binary image using a fixed-threshold or an adaptive-threshold segmentation method.
3. Expansion map: the image obtained by performing an expansion (morphological dilation) operation on the binarization map.
The reference code candidate region refers to a region where the reference code is preliminarily determined.
Wherein the identification code image is an image in which the position of the reference code needs to be identified. In some embodiments, only one reference code may be included in the identification code image; in some embodiments, the identification code image may also include a plurality of reference codes, and a plurality of target reference code regions will ultimately be located for the identification code image containing the plurality of reference codes.
There are various ways of determining the reference code candidate region in step 201, and exemplary ways include:
(1) Taking the area where continuous target pixel points in the expansion map are located as the reference code candidate area of the identification code image. In this case, step 201 may specifically include the following steps 2011A to 2013A:
2011A, performing binarization processing on the identification code image to be positioned to obtain a binarization graph of the identification code image.
2012A, performing expansion operation on the binarization map to obtain an expansion map of the binarization map.
For example, "performing an expansion operation on the binarized map to obtain an expansion map of the binarized map" may specifically include: acquiring a preset structural element; acquiring an occupied pixel point of the structural element when the origin of the structural element is in a target pixel point in the binarization graph; and obtaining an expansion map of the binarization map based on the occupied pixel points.
For example, as shown in fig. 3, each square in fig. 3 represents a pixel point. Fig. 3 (a) shows a binarization map, in which squares 32, 42, 43, 44, 53, 54, and 63 represent different target pixel points and the dashed box represents a preset structural element; fig. 3 (b) shows the expansion map of the binarization map. For each target pixel point in the binarization map, the pixel points occupied by the structural element when its origin is placed at that target pixel point are acquired (as shown in fig. 3, when the origin of the structural element is at target pixel point 42 of the binarization map, the occupied pixel points include pixel points 41, 43, 51, and 52; in the same way, the occupied pixel points can be determined when the origin is at target pixel points 32, 43, 44, 53, 54, and 63). An image identical to the binarization map is taken as a reference map (for example, the binarization map can be copied as the reference map); in the reference map, the gray values of the occupied pixel points are updated to the gray value of the target pixel points while the gray values of the remaining pixel points are kept unchanged, yielding an updated reference map; the updated reference map is then taken as the expansion map of the binarization map.
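The expansion procedure just illustrated with fig. 3 can be sketched in Python. This is a minimal sketch under the assumption that the map is a list of pixel rows and the structural element is given as (row, column) offsets from its origin; the names are illustrative, and a production system would typically use a library's morphological dilation instead:

```python
def dilate(binary, se_offsets, target_val=255):
    """Expansion operation: stamp the structural element onto a copy of
    the binarization map (the reference map) at every target pixel.

    `binary`     -- list of pixel rows (gray values 0 or 255)
    `se_offsets` -- (dy, dx) offsets of the structural element's cells
                    relative to its origin
    """
    h, w = len(binary), len(binary[0])
    out = [row[:] for row in binary]            # copy used as the reference map
    for y in range(h):
        for x in range(w):
            if binary[y][x] != target_val:      # not a target pixel
                continue
            for dy, dx in se_offsets:           # occupied pixels of the element
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    out[ny][nx] = target_val    # update to the target gray value
    return out

se = [(0, -1), (0, 0), (0, 1)]                  # origin plus its left/right cells
img = [[0, 0, 0], [0, 255, 0], [0, 0, 0]]
dilated = dilate(img, se)
```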
2013A, obtaining a reference code candidate region of the identification code image based on the region where the continuous target pixel points in the expansion map are located.
The target pixel points are pixel points with gray values meeting preset conditions.
In some embodiments, if the number of continuous target pixel points in the expansion map is greater than a preset number threshold, the area where they are located may be used directly as a reference code candidate area of the identification code image.
In other embodiments, an area of continuous target pixel points in the expansion map whose number exceeds the preset number threshold may be taken as a target area, and the area enclosed by the minimum circumscribed rectangle of the target area used as a reference code candidate area of the identification code image. For example, in fig. 3 (b), squares 31, 32, 41, 42, 43, 44, 51, 52, 53, 54, 62, 63, 64, 72, and 73 represent different target pixel points; since the number of these continuous target pixel points (i.e., 15) is greater than a preset number threshold (e.g., 10), the area enclosed by the minimum circumscribed rectangle of the area where they are located may be used as a reference code candidate area of the identification code image (as indicated by the rectangular dotted box in fig. 3 (b)). Similarly, one or more reference code candidate areas of the identification code image may be obtained.
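A minimal sketch of step 2013A as just described: flood-fill runs of connected target pixels in the expansion map and keep the bounding rectangle of every sufficiently large run. The names are illustrative, and using an axis-aligned rectangle in place of a general minimum circumscribed rectangle (which may be rotated) is a simplifying assumption:

```python
from collections import deque

def candidate_regions(expansion, target_val=255, min_count=10):
    """Return (row0, col0, row1, col1) bounding rectangles of every run
    of connected target pixels containing more than `min_count` pixels
    (`min_count` plays the role of the preset number threshold)."""
    h, w = len(expansion), len(expansion[0])
    seen = [[False] * w for _ in range(h)]
    boxes = []
    for sy in range(h):
        for sx in range(w):
            if expansion[sy][sx] != target_val or seen[sy][sx]:
                continue
            comp, queue = [], deque([(sy, sx)])
            seen[sy][sx] = True
            while queue:                        # flood fill, 8-connectivity
                y, x = queue.popleft()
                comp.append((y, x))
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and not seen[ny][nx]
                                and expansion[ny][nx] == target_val):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
            if len(comp) > min_count:
                ys = [p[0] for p in comp]
                xs = [p[1] for p in comp]
                boxes.append((min(ys), min(xs), max(ys), max(xs)))
    return boxes

# a 4x4 run of 16 target pixels inside a 6x6 expansion map
grid = [[255 if r < 4 and c < 4 else 0 for c in range(6)] for r in range(6)]
regions = candidate_regions(grid, min_count=10)
```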
(2) Taking the area enclosed by the smallest circumscribed rectangle of a plurality of continuous target pixel points in the binarization map as the reference code candidate area of the identification code image. In this case, step 201 may specifically include the following steps 2011B to 2012B:
2011B, performing binarization processing on the identification code image to be positioned to obtain a binarization graph of the identification code image.
2012B, the surrounding area of the smallest circumscribed rectangle of the continuous plurality of target pixel points in the binarization map is used as the reference code candidate area of the identification code image.
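The binarization in steps 2011A/2011B can be sketched with fixed-threshold segmentation; the threshold value of 128 and the function name are illustrative assumptions, and an adaptive threshold could equally be substituted, as noted earlier:

```python
def binarize(gray, thresh=128):
    """Fixed-threshold segmentation: pixels with gray value at or above
    `thresh` map to 255, all others to 0. `gray` is a list of rows."""
    return [[255 if v >= thresh else 0 for v in row] for row in gray]

binary = binarize([[10, 200], [130, 90]])
```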
202. And extracting line segments based on the identification code image to obtain a line segment set of the identification code image.
Wherein the target line segment is a preliminarily determined line segment that is a potential edge of the detection pattern.
Wherein the set of line segments is a set of target line segments.
The manner in which the segment set is determined in step 202 is various, and illustratively includes:
(1) Taking the preliminary line segments obtained by detection on the identification code image as target line segments, thereby obtaining the line segment set of the identification code image. In this case, step 202 may specifically include the following steps 2021A to 2022A:
2021A, performing edge detection on the identification code image to obtain a preliminary line segment of the identification code image, which is used as a target line segment.
For example, the edge detection principle may be utilized to perform edge detection on the binarized graph of the identification code image, so as to obtain each line segment in the identification code image, where each detected line segment is a preliminary line segment of the identification code image, and each detected preliminary line segment is used as a target line segment.
2022A, obtaining a set of line segments of the identification code image based on the target line segments.
Specifically, the set of all the target line segments detected in step 2021A is the line segment set of the identification code image. For example, performing edge detection on the identification code image to obtain 5 preliminary line segments, which are respectively: the line segment a, the line segment b, the line segment c, the line segment d and the line segment e are the line segment set { line segment a, line segment b, line segment c, line segment d and line segment e } of the identification code image.
(2) In order to increase the probability that the line segments in the line segment set are edges of the detection pattern, after the preliminary line segments are detected, inflection point detection is performed on them, and the detected inflection points are used to update the endpoints of the preliminary line segments to obtain the target line segments and, from them, the line segment set of the identification code image. In this case, step 202 may specifically include the following steps 2021B to 2025B:
2021B, performing edge detection on the identification code image to obtain a preliminary line segment of the identification code image.
The implementation of step 2021B is similar to that of step 2021A, and reference may be made to the above description, and details are not repeated here.
2022B, performing inflection point detection based on a first line segment between the starting point of the preliminary line segment and the midpoint of the preliminary line segment, to obtain a first inflection point result of the preliminary line segment.
The first line segment refers to the line segment formed between the starting point of the preliminary line segment and the midpoint of the preliminary line segment. For example, as shown in fig. 4, assume that the start point, end point, and midpoint of the preliminary line segment are P, P_next, and O respectively; the line segment PO formed between the starting point P of the preliminary line segment and the midpoint O of the preliminary line segment is the first line segment.
The first inflection point result indicates whether an inflection point exists among the pixel points whose distance to the first line segment is within a preset distance range (for example, within 10 pixels).
Referring to fig. 4, for example, an image inflection point detection algorithm is used to examine every pixel point whose distance to the first line segment is within the preset distance range (for example, within 10 pixels). If an inflection point exists among these pixel points (as shown in fig. 4, assume that point P_start is an inflection point whose distance to the first line segment is within the preset distance range), that inflection point (i.e., point P_start) is taken as the first inflection point result of the preliminary line segment; if no inflection point exists among them, "no inflection point exists between the starting point of the preliminary line segment and the midpoint of the preliminary line segment" is taken as the first inflection point result.
Further, if a plurality of inflection points exist between the starting point of the preliminary line segment and the midpoint of the preliminary line segment, the inflection point closest to the starting point of the preliminary line segment is taken as a first inflection point result.
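The distance test and the closest-to-start rule described above can be sketched as follows. The candidate corner points are assumed to be produced by some image inflection (corner) detection algorithm, which is not reproduced here; all names and the 10-pixel threshold are illustrative:

```python
import math

def point_segment_dist(p, a, b):
    """Distance from point p to the segment from a to b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.dist(p, a)
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))                   # clamp to stay on the segment
    return math.dist(p, (ax + t * dx, ay + t * dy))

def first_inflection_result(start, mid, corners, max_dist=10.0):
    """First inflection point result: among candidate corner points, keep
    those within `max_dist` pixels of the first line segment (start -> mid);
    if several qualify, the one closest to the start point wins.
    None represents 'no inflection point exists'."""
    near = [c for c in corners if point_segment_dist(c, start, mid) <= max_dist]
    return min(near, key=lambda c: math.dist(c, start)) if near else None

result = first_inflection_result((0, 0), (20, 0), [(5, 2), (15, 3), (40, 40)])
```

The second inflection point result of step 2023B is symmetric: the same routine applied to the segment from the end point to the midpoint.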
2023B, performing inflection point detection based on a second line segment between the end point of the preliminary line segment and the midpoint of the preliminary line segment, to obtain a second inflection point result of the preliminary line segment.
The second line segment refers to the line segment formed between the end point of the preliminary line segment and the midpoint of the preliminary line segment. For example, as shown in FIG. 4, assume that the starting point, end point, and midpoint of the preliminary line segment are P, Pnext, and O respectively; the line segment PnextO formed between the end point Pnext of the preliminary line segment and the midpoint O of the preliminary line segment is the second line segment.
The second inflection point result is used for indicating whether an inflection point exists among the pixel points whose distance from the second line segment is within a preset distance range (for example, within 10 pixels of the second line segment).
The starting point of the preliminary line segment and the end point of the preliminary line segment are the two end points of the preliminary line segment, and the designations "starting point" and "end point" are only relative; in this embodiment they are used for illustration only and do not limit the actual direction of the preliminary line segment. For example, if the preliminary line segment includes end point 1 and end point 2, then when end point 1 is taken as the starting point of the preliminary line segment, end point 2 is taken as its end point; when end point 2 is taken as the starting point, end point 1 is taken as the end point.
Referring to fig. 4, for example, each pixel point whose distance from the second line segment is within the preset distance range (for example, within 10 pixels of the second line segment) is detected by the image inflection point detection algorithm. If an inflection point exists among the pixel points whose distance from the second line segment is within the preset distance range (as shown in fig. 4, assume that point Pend is such an inflection point), that inflection point (i.e., point Pend) is taken as the second inflection point result of the preliminary line segment; if no inflection point exists between the end point of the preliminary line segment and the midpoint of the preliminary line segment, the absence of an inflection point between the end point and the midpoint of the preliminary line segment is taken as the second inflection point result.
Further, if a plurality of inflection points exist between the end point of the preliminary line segment and the midpoint of the preliminary line segment, the inflection point closest to the end point of the preliminary line segment is taken as the second inflection point result.
2024B, updating the end points of the preliminary line segment based on the first inflection point result and the second inflection point result to obtain an updated line segment, which serves as a target line segment.
Illustratively, on one hand, the starting point of the preliminary line segment is updated with the first inflection point result: if the first inflection point result indicates that no inflection point exists between the starting point of the preliminary line segment and the midpoint of the preliminary line segment, the starting point of the preliminary line segment is kept unchanged, so the starting point of the updated line segment remains the starting point of the preliminary line segment; if the first inflection point result indicates that an inflection point exists between the starting point of the preliminary line segment and the midpoint of the preliminary line segment, the starting point of the preliminary line segment is updated with the first inflection point result (namely, the inflection point between the starting point of the preliminary line segment and the midpoint of the preliminary line segment), so that the starting point of the updated line segment is that inflection point.
On the other hand, the end point of the preliminary line segment is updated with the second inflection point result: if the second inflection point result indicates that no inflection point exists between the end point of the preliminary line segment and the midpoint of the preliminary line segment, the end point of the preliminary line segment is kept unchanged, so the end point of the updated line segment remains the end point of the preliminary line segment; if the second inflection point result indicates that an inflection point exists between the end point of the preliminary line segment and the midpoint of the preliminary line segment, the end point of the preliminary line segment is updated with the second inflection point result (namely, the inflection point between the end point of the preliminary line segment and the midpoint of the preliminary line segment), so that the end point of the updated line segment is that inflection point. Finally, the updated line segment obtained by updating the end points of the preliminary line segment is taken as the target line segment.
For example, referring to fig. 4, assume that the two end points of the detected preliminary line segment are P and Pnext respectively. By searching, it is found that an inflection point (point Pstart) exists among the pixel points whose distance from the first line segment is within the preset distance range, and an inflection point (point Pend) exists among the pixel points whose distance from the second line segment is within the preset distance range. The starting point P and the end point Pnext of the preliminary line segment are then replaced with the inflection point Pstart and the inflection point Pend respectively, and the resulting updated line segment PstartPend is taken as the target line segment.
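The endpoint-update rule above (keep an end point when no inflection was found on its half, otherwise replace it with the nearest inflection) can be sketched as follows; the function names and data layout are illustrative:

```python
import math

def nearest_inflection(inflections, endpoint):
    """Among several detected inflection points, keep the one closest to the
    given end point (the rule stated above); None when there is none."""
    if not inflections:
        return None
    return min(inflections, key=lambda q: math.dist(q, endpoint))

def trim_segment(start, end, inflections_first_half, inflections_second_half):
    """Update both end points of a preliminary segment with the first and
    second inflection point results, keeping an end point unchanged when
    no inflection was found on its half of the segment."""
    p_start = nearest_inflection(inflections_first_half, start)
    p_end = nearest_inflection(inflections_second_half, end)
    return (p_start if p_start is not None else start,
            p_end if p_end is not None else end)
```

For a preliminary segment from (0, 0) to (10, 0) with inflections (1, 1) and (3, 1) found near the first line segment and none near the second, the result is the updated segment ((1, 1), (10, 0)).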
2025B, obtaining a set of line segments of the identification code image based on the target line segments.
Similarly, if edge detection is performed on the identification code image to obtain a plurality of preliminary line segments, each preliminary line segment is processed as described above to obtain a plurality of target line segments. The collection of the plurality of target line segments is the line segment set of the identification code image. For example, suppose edge detection on the identification code image yields 5 preliminary line segments (line segments a, b, c, d, and e); applying steps 2022b to 2024b to line segment a yields target line segment a', applying them to line segment b yields target line segment b', and likewise for line segments c, d, and e, yielding target line segments c', d', and e'. The set {line segment a', line segment b', line segment c', line segment d', line segment e'} is then the line segment set of the identification code image.
Therefore, on one hand, since the detection pattern of the identification code is a parallelogram, the real inflection points can be found as the end points of the line segments through inflection point detection, which avoids the problem that a preliminary line segment detected by the edge detection algorithm is not a real edge of the detection pattern, improves the probability that a determined target line segment is an edge of the detection pattern, and further improves the positioning accuracy of the reference code region of the identification code. On the other hand, the preliminary line segment is first found through edge detection, and the inflection points are then detected on the basis of the preliminary line segment; since inflection points are only detected near the preliminary line segment, the data processing for detecting inflection points can be reduced, and the positioning speed of the identification code improved.
203. And performing included angle fitting based on the line segment set to obtain each target included angle of the identification code image.
The target included angle is an included angle obtained by performing included angle fitting on each line segment in the line segment set and is used for indicating the included angle between edges of the potential detection graph in the identification code image.
Illustratively, step 203 may specifically include the following steps 2031-2034:
2031. and carrying out line segment combination based on the line segment set to obtain a first line segment to be fitted and a second line segment to be fitted.
Wherein the first line segment to be fitted (denoted as l) points from a first starting endpoint to a first ending endpoint, and the second line segment to be fitted (denoted as lnext) points from a second starting endpoint to a second ending endpoint. The first starting endpoint (denoted as Ps1) refers to the starting point of the first line segment to be fitted, and the first ending endpoint (denoted as Pe1) refers to its end point; the second starting endpoint (denoted as Ps2) refers to the starting point of the second line segment to be fitted, and the second ending endpoint (denoted as Pe2) refers to its end point.
Specifically, the line segments in the line segment set can be combined two by two to obtain a plurality of line segment combinations; for each line segment combination, the two line segments in the combination are respectively taken as the first line segment to be fitted and the second line segment to be fitted. For example, combining the line segments in the line segment set {line segment 1, line segment 2, line segment 3, line segment 4} two by two yields 6 line segment combinations: combination 1 [line segment 1, line segment 2], combination 2 [line segment 1, line segment 3], combination 3 [line segment 1, line segment 4], combination 4 [line segment 2, line segment 3], combination 5 [line segment 2, line segment 4], and combination 6 [line segment 3, line segment 4]. For combination 1, line segment 1 can be taken as the first line segment to be fitted and line segment 2 as the second line segment to be fitted; similarly, for combination 2, line segment 1 is taken as the first line segment to be fitted and line segment 3 as the second; and so on.
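The two-by-two combination described above is an ordinary 2-combination, which the Python standard library provides directly (the segment names follow the example in the text):

```python
from itertools import combinations

segments = ["line segment 1", "line segment 2",
            "line segment 3", "line segment 4"]

# 4 segments give C(4, 2) = 6 unordered pairs, matching the 6 combinations
pairs = list(combinations(segments, 2))

# In each pair, the first element can serve as the first line segment to be
# fitted and the second element as the second line segment to be fitted.
for first_to_fit, second_to_fit in pairs:
    pass  # steps 2032-2034 would run here for each combination
```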
2032. And acquiring a first distance and a second distance based on the first line segment to be fitted and the second line segment to be fitted.
Specifically, on one hand, the distance between the first ending endpoint and the second starting endpoint is obtained as the first distance; for example, as shown in FIG. 5, the distance between the first ending endpoint Pe1 of the first line segment to be fitted and the second starting endpoint Ps2 of the second line segment to be fitted is the first distance (denoted as d1). On the other hand, the distance between the second ending endpoint and the first starting endpoint is obtained as the second distance; for example, as shown in FIG. 5, the distance between the second ending endpoint Pe2 of the second line segment to be fitted and the first starting endpoint Ps1 of the first line segment to be fitted is the second distance (denoted as d2).
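The two distances can be computed directly from the directed end points. A small sketch, with each segment assumed to be a ((start), (end)) pair of pixel coordinates:

```python
import math

def endpoint_gaps(first_seg, second_seg):
    """Each segment is ((start), (end)), directed from start to end.
    Returns d1 = |Pe1 Ps2| and d2 = |Pe2 Ps1| as defined in the text."""
    (ps1, pe1) = first_seg
    (ps2, pe2) = second_seg
    d1 = math.dist(pe1, ps2)   # first ending endpoint to second starting endpoint
    d2 = math.dist(pe2, ps1)   # second ending endpoint to first starting endpoint
    return d1, d2
```

For two segments meeting head-to-tail at a corner, such as ((0, 0), (10, 0)) and ((10, 0), (10, 10)), d1 is 0 while d2 spans the diagonal.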
Further, in order to improve the probability that a determined target included angle of the identification code image is an included angle between edges of the detection pattern, in some embodiments the pixel length of the first line segment to be fitted and the pixel length of the second line segment to be fitted are also detected, and the first distance and the second distance are acquired based on the first line segment to be fitted and the second line segment to be fitted only when both pixel lengths are greater than a second preset length. Because the edges of the detection pattern are relatively long, performing the included angle fitting only when the pixel length of each line segment to be fitted is greater than the second preset length filters out line segments that are not edges of the detection pattern, improves the probability that a determined target included angle of the identification code image is an included angle between edges of the detection pattern, and further improves the positioning accuracy of the identification code.
Further, in order to increase the probability that the determined target included angle of the identification code image is an inter-edge included angle of the detection pattern, in some embodiments, an angle between the first line segment to be fitted and the second line segment to be fitted is also detected; and if the angle between the first line segment to be fitted and the second line segment to be fitted is in a preset angle range, acquiring a first distance and a second distance based on the first line segment to be fitted and the second line segment to be fitted. Because the detection patterns are rectangular, the included angle between edges of the detection patterns in the identification code image is in a certain range (such as 45-135 degrees), when the angle between the first line segment to be fitted and the second line segment to be fitted is in a preset angle range, the first distance and the second distance are acquired to perform included angle fitting, some included angles which are not included angles between edges of the detection patterns can be filtered, the probability that the determined target included angle of the identification code image is the included angle between edges of the detection patterns is improved, and the positioning accuracy of the identification code is improved.
The preset angle range may be [45°, 135°]; the specific values of the preset angle range here are only an example and are not limiting.
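The angle filter above can be sketched as follows; the segment representation and the [45°, 135°] defaults follow the example values in the text, and the function names are illustrative:

```python
import math

def angle_between(first_seg, second_seg):
    """Unsigned angle in degrees between two directed segments
    given as ((start), (end)) coordinate pairs."""
    (ax0, ay0), (ax1, ay1) = first_seg
    (bx0, by0), (bx1, by1) = second_seg
    diff = math.degrees(math.atan2(ay1 - ay0, ax1 - ax0)
                        - math.atan2(by1 - by0, bx1 - bx0))
    diff = abs(diff) % 360.0
    return 360.0 - diff if diff > 180.0 else diff

def in_preset_angle_range(first_seg, second_seg, lo=45.0, hi=135.0):
    """Keep a pair only when its angle lies in the preset range."""
    return lo <= angle_between(first_seg, second_seg) <= hi
```

Perpendicular segments pass the filter (90° is inside [45°, 135°]); parallel segments are rejected (0°), matching the idea that near-collinear pairs cannot form a corner of the detection pattern.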
2034. And if the first distance and the second distance meet a first preset condition or meet a second preset condition, taking an included angle formed by the first line segment to be fitted and the second line segment to be fitted as a target included angle of the identification code image.
The first preset condition is that the first distance d1 is smaller than the second distance d2, and the first distance d1 is smaller than a first preset length.
The second preset condition is that the second distance d2 is smaller than the first distance d1, and the second distance d2 is smaller than a first preset length.
The specific value of the first preset length may be set according to the actual service scene requirement, where the specific value of the first preset length is not limited. For example, the first preset length may take on a value of 7 pixels.
If the first distance is smaller than the second distance and the first distance is smaller than the first preset length, the first distance and the second distance meet the first preset condition; in that case, the intersection point between the first line segment to be fitted and the second line segment to be fitted, and the included angle formed by the first line segment to be fitted and the second line segment to be fitted, are taken as a target included angle of the identification code image.
If the second distance is smaller than the first distance and the second distance is smaller than the first preset length, the first distance and the second distance meet the second preset condition; in that case, the intersection point between the first line segment to be fitted and the second line segment to be fitted, and the included angle formed by the first line segment to be fitted and the second line segment to be fitted, are taken as a target included angle of the identification code image.
If the first distance and the second distance meet neither the first preset condition nor the second preset condition, the line segment combination formed by the first line segment to be fitted and the second line segment to be fitted is discarded for fitting the target included angle of the identification code image, and processing continues with the next line segment combination.
By analogy, steps 2032 to 2034 are performed for each of the plurality of line segment combinations obtained by combining the line segments in the line segment set, continuing with the next line segment combination until all line segment combinations have been processed, thereby obtaining each target included angle of the identification code image.
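Putting steps 2031 to 2034 together, a minimal sketch of the angle-fitting loop in Python; the 7-pixel first preset length follows the example value given earlier, and the segment representation is assumed:

```python
from itertools import combinations
import math

FIRST_PRESET_LENGTH = 7.0  # example value from the text: 7 pixels

def endpoint_gaps(first_seg, second_seg):
    (ps1, pe1), (ps2, pe2) = first_seg, second_seg
    return math.dist(pe1, ps2), math.dist(pe2, ps1)

def fit_target_angles(segments):
    """Keep every pair meeting the first preset condition (d1 < d2 and
    d1 < first preset length) or the second preset condition (d2 < d1
    and d2 < first preset length); discard the rest."""
    kept = []
    for first_seg, second_seg in combinations(segments, 2):
        d1, d2 = endpoint_gaps(first_seg, second_seg)
        if (d1 < d2 and d1 < FIRST_PRESET_LENGTH) or \
           (d2 < d1 and d2 < FIRST_PRESET_LENGTH):
            kept.append((first_seg, second_seg))
    return kept
```

With two segments whose near end points are about 1.4 pixels apart and a third far-away segment, only the close pair survives as a candidate corner.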
204. And integrating the reference code candidate region and the target included angle to obtain a parallelogram of the reference code candidate region.
Specifically, for each reference code candidate region, the target included angles inside the reference code candidate region are searched and integrated to obtain the parallelogram of the reference code candidate region, and the obtained parallelogram can be output as a detection pattern of the identification code. By analogy, the corresponding parallelogram can be determined for each reference code candidate region, yielding all the detection patterns in the identification code image.
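One simple reading of "search the target included angles inside a candidate region and integrate them" is to keep the corner (intersection) points that fall inside the region and order them into a quadrilateral. A sketch under that assumption (the bounding-box region representation and centroid ordering are illustrative choices, not mandated by the patent):

```python
import math

def integrate_quad(corner_points, region_bbox):
    """corner_points: intersection points of fitted angle pairs;
    region_bbox: (x_min, y_min, x_max, y_max) of a reference code
    candidate region. Returns 4 points of a candidate quadrilateral
    ordered counter-clockwise, or None when too few corners fall inside."""
    x0, y0, x1, y1 = region_bbox
    inside = [p for p in corner_points
              if x0 <= p[0] <= x1 and y0 <= p[1] <= y1]
    if len(inside) < 4:
        return None
    # order the corner points around their centroid so they form a polygon
    cx = sum(p[0] for p in inside) / len(inside)
    cy = sum(p[1] for p in inside) / len(inside)
    inside.sort(key=lambda p: math.atan2(p[1] - cy, p[0] - cx))
    return inside[:4]
```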
Therefore, in this embodiment, the identification code can still be positioned when its detection pattern is missing, damaged, or similarly degraded, by utilizing the identification idea of extracting line segments, forming included angles between line segments, and recovering the parallelogram, which improves the identification rate of the identification code. Step 202 realizes the line-segment-extraction part of this identification idea, step 203 realizes the part forming included angles between line segments, and step 204 realizes the part recovering the parallelogram from the included angles.
205. And outputting a target reference code region of the identification code image based on the parallelogram.
The target reference code region is a region where the reference code in the identification code image is recognized.
For example, a surrounding area corresponding to a parallelogram of the reference code candidate area in the identification code image may be taken as a target reference code area; by analogy, for parallelogram in a plurality of reference code candidate regions of an identification code image, a plurality of target reference code regions can be identified and output. At this time, the target reference code region is decoded to obtain the content information corresponding to the target reference code region.
For convenience of understanding, a specific example is described below to illustrate the positioning process of the identification code in this embodiment. For example, in order to improve the positioning accuracy of a robot and thereby its control accuracy, the robot may perform positioning using, for example, AprilTag codes or ArUco codes. Specifically, first, a camera of the robot collects an image in the current state as the identification code image, and the identification code image is positioned in the manner of steps 201 to 205 to obtain the target reference code area of the identification code image. Then, the position of the target reference code area in the identification code image and the camera pose of the robot are used to determine the relative position between the robot and the reference code in the environment corresponding to the target reference code area, so that the position of the robot in the environment is determined; this position can then be used to control the robot to execute corresponding tasks (such as robot navigation and object grabbing).
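The "relative position between the robot and the reference code" step can be viewed as composing homogeneous transforms: given the tag pose in the camera frame (e.g. from a PnP-style solver applied to the located target reference code area) and the known tag pose in the world frame (from the map), the camera/robot pose in the world follows. A pure-NumPy sketch with made-up poses (the frame names and numbers are illustrative assumptions, not from the patent):

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Assumed inputs: tag pose in the world (from the map) and in the camera frame.
T_world_tag = pose(np.eye(3), [2.0, 1.0, 0.0])
T_cam_tag = pose(np.eye(3), [0.0, 0.0, 0.5])   # tag 0.5 m in front of camera

# Camera (robot) pose in the world: T_world_cam = T_world_tag * inv(T_cam_tag)
T_world_cam = T_world_tag @ np.linalg.inv(T_cam_tag)
robot_position = T_world_cam[:3, 3]
```

With these toy values the robot lies at (2.0, 1.0, -0.5) in the world frame, i.e. half a metre behind the tag along the camera axis.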
Further, for better understanding, a section of pseudo code is provided below for understanding the positioning process of the identification code in this embodiment, specifically as follows:
Input: original image
Output: several possible reference code areas, called quads
1: convert the input original image into a binarized map using an automatic threshold segmentation method;
2: apply a dilation operation to the binarized map, and extract 1.5 times each target area (where a target area is an area in which the gray values of all pixel points are greater than a preset gray value) as a candidate area; the set of candidate areas is called regions;
3: extract all edge segments from the binarized map using any edge extraction method; the set of edge segments is called edges;
4: foreach edge ∈ edges do
5: contour = PolygonFitting(edge)
6: if contour contains more than 3 points then
7: foreach p in contour do
8: if Distance(p, pnext) > 14 pixels then
9: Ps, Pe = LineSegmentTrimming(p, pnext)
10: lines ← Line(Ps, Pe)
11: end
12: end
13: end
14: end
15: foreach l ∈ lines do
16: if length of l > 15 pixels then
17: if Angle(l, lnext) > 30° then
18: pos = Intersection(l, lnext)
19: d1 = Distance(l.Pe, lnext.Ps)
20: d2 = Distance(l.Ps, lnext.Pe)
21: if d1 < d2 and d1 < 7 pixels then
22: if CrossProduct(l, lnext) < 0 then
23: corners ← Corner(pos, l, lnext)
24: end
25: else if d2 < d1 and d2 < 7 pixels then
26: if CrossProduct(lnext, l) < 0 then
27: corners ← Corner(pos, lnext, l)
28: end
29: end
30: end
31: end
32: foreach region ∈ regions do
33: quads ← search for the corners inside region and integrate them into a quadrilateral;
34: end
35: return quads;
The line segment trimming algorithm on line 9 addresses the fact that, if a line segment is extracted directly by the edge detection algorithm, its starting point and end point may be inaccurate. To improve the accuracy of the extracted line segment, the trimming algorithm searches from each original end point toward the midpoint of the detected line segment for an inflection point to use as the new end point, and a line segment is represented by its starting point and end point (for the specific implementation, refer to steps 2021b to 2025b). Referring to fig. 5, lines 19 to 29 serve to form the corners with their angles in a consistent direction, such as counterclockwise or clockwise.
It will be appreciated that the pseudo code is primarily for understanding the present embodiment and is not to be construed as limiting the present embodiment.
From the above, on one hand, by extracting line segments based on the identification code image to obtain the line segment set of the identification code image, the edges of potential detection patterns in the identification code image can be extracted; by performing included angle fitting based on the line segment set to obtain each target included angle of the identification code image, the included angles between edges of potential detection patterns in the identification code image are restored by fitting the extracted line segments; therefore, by integrating the reference code candidate region and the target included angles to obtain the parallelogram of the reference code candidate region, the detection pattern of the reference code in the identification code image can be recovered from the target included angles of the identification code image. On the other hand, since the parallelogram of the reference code candidate region obtained by integrating the reference code candidate region and the target included angles is used as the target reference code region of the identification code image, the problem that a rectangular detection pattern cannot be detected due to deformation in the image can be avoided, the detection pattern of the reference code can be positioned even when the reference code is missing, damaged, or blocked, the shielding resistance of the quadrilateral detection pattern of the reference code is improved, and the identification rate of the reference code is further improved.
It will be appreciated by those of ordinary skill in the art that all or part of the steps in the method for locating an identification code described above may be performed by instructions or by controlling associated hardware by instructions, which may be stored on a computer readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer readable storage medium having stored therein a plurality of computer programs that can be loaded by a processor to perform any of the methods for locating an identification code provided by embodiments of the present application. For example, the computer program can be loaded by a processor to perform the steps of:
extracting candidate areas of the identification code images to be positioned to obtain reference code candidate areas of the identification code images; extracting line segments based on the identification code image to obtain a line segment set of the identification code image; performing included angle fitting based on the line segment set to obtain each target included angle of the identification code image; integrating the reference code candidate region and the target included angle to obtain a parallelogram of the reference code candidate region; and outputting a target reference code region of the identification code image based on the parallelogram.
In some embodiments, the computer program is capable of being loaded by a processor to perform the steps of:
performing edge detection on the identification code image to obtain a preliminary line segment of the identification code image; performing inflection point detection based on a first line segment between a starting point of the preliminary line segment and a midpoint of the preliminary line segment to obtain a first inflection point result of the preliminary line segment; performing inflection point detection based on a second line segment between the end point of the preliminary line segment and the midpoint of the preliminary line segment to obtain a second inflection point result of the preliminary line segment;
updating the end points of the preliminary line segment based on the first inflection point result and the second inflection point result to obtain an updated line segment serving as a target line segment; and obtaining a line segment set of the identification code image based on the target line segment.
In some embodiments, the computer program is capable of being loaded by a processor to perform the steps of:
carrying out line segment combination based on the line segment set to obtain a first line segment to be fitted and a second line segment to be fitted, wherein the first line segment to be fitted points to a first end point from a first start end point, and the second line segment to be fitted points to a second end point from a second start end point; acquiring a first distance and a second distance based on the first line segment to be fitted and the second line segment to be fitted, wherein the first distance is the distance between the first ending endpoint and the second starting endpoint, and the second distance is the distance between the second ending endpoint and the first starting endpoint; and if the first distance is smaller than the second distance and the first distance is smaller than a first preset length, taking an intersection point between the first line segment to be fitted and the second line segment to be fitted and an included angle formed by the first line segment to be fitted and the second line segment to be fitted as a target included angle of the identification code image.
In some embodiments, the computer program is capable of being loaded by a processor to perform the steps of:
and if the second distance is smaller than the first distance and the second distance is smaller than a first preset length, taking an intersection point between the first line segment to be fitted and the second line segment to be fitted and an included angle formed by the first line segment to be fitted and the second line segment to be fitted as a target included angle of the identification code image.
In some embodiments, the computer program is capable of being loaded by a processor to perform the steps of:
detecting the length of the pixel point of the first line segment to be fitted and the length of the pixel point of the second line segment to be fitted; and if the length of the pixel points of the first line segment to be fitted is larger than a second preset length and the length of the pixel points of the second line segment to be fitted is larger than a second preset length, acquiring a first distance and a second distance based on the first line segment to be fitted and the second line segment to be fitted.
In some embodiments, the computer program is capable of being loaded by a processor to perform the steps of:
detecting an angle between the first line segment to be fitted and the second line segment to be fitted; and if the angle between the first line segment to be fitted and the second line segment to be fitted is in a preset angle range, acquiring a first distance and a second distance based on the first line segment to be fitted and the second line segment to be fitted.
In some embodiments, the computer program is capable of being loaded by a processor to perform the steps of:
performing binarization processing on an identification code image to be positioned to obtain a binarization image of the identification code image; performing expansion operation on the binarization map to obtain an expansion map of the binarization map; and obtaining a reference code candidate region of the identification code image based on the region of the continuous target pixel point in the expansion map, wherein the target pixel point is a pixel point with a gray value meeting a preset condition.
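The candidate-region step (binarize, then collect regions of continuous target pixels) can be sketched in pure Python as fixed-threshold binarization followed by bounding boxes of 4-connected bright regions; the real method also dilates the map and enlarges each box, and the threshold value here is illustrative:

```python
def candidate_regions(gray, preset_gray=128):
    """Binarize a 2-D grayscale list, then return the bounding boxes
    (x_min, y_min, x_max, y_max) of 4-connected white regions."""
    rows, cols = len(gray), len(gray[0])
    binary = [[1 if v > preset_gray else 0 for v in row] for row in gray]
    seen = [[False] * cols for _ in range(rows)]
    boxes = []
    for r in range(rows):
        for c in range(cols):
            if not binary[r][c] or seen[r][c]:
                continue
            seen[r][c] = True
            stack = [(r, c)]
            x0 = x1 = c
            y0 = y1 = r
            while stack:  # flood fill one connected region
                rr, cc = stack.pop()
                x0, x1 = min(x0, cc), max(x1, cc)
                y0, y1 = min(y0, rr), max(y1, rr)
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = rr + dr, cc + dc
                    if 0 <= nr < rows and 0 <= nc < cols \
                            and binary[nr][nc] and not seen[nr][nc]:
                        seen[nr][nc] = True
                        stack.append((nr, nc))
            boxes.append((x0, y0, x1, y1))
    return boxes
```

On an image with two separate bright blobs, the function returns one bounding box per blob, each box being a reference code candidate region.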
In some embodiments, the computer program is capable of being loaded by a processor to perform the steps of:
acquiring a preset structural element; acquiring an occupied pixel point of the structural element when the origin of the structural element is in a target pixel point in the binarization graph; and obtaining an expansion map of the binarization map based on the occupied pixel points.
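The dilation described above can be sketched directly from that definition: for every target pixel in the binarized map, mark every pixel the structural element occupies when its origin sits on that pixel. The offsets and the cross-shaped element below are illustrative assumptions:

```python
def dilate(binary, element, origin=(0, 0)):
    """binary: 2-D list of 0/1; element: (row, col) offsets of the
    structural element's occupied cells relative to its own grid;
    origin: the cell of the element placed on each target pixel."""
    rows, cols = len(binary), len(binary[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if binary[r][c]:  # target pixel: stamp the element here
                for er, ec in element:
                    rr, cc = r + er - origin[0], c + ec - origin[1]
                    if 0 <= rr < rows and 0 <= cc < cols:
                        out[rr][cc] = 1
    return out

# 3x3 cross-shaped structural element with its origin at the centre cell
CROSS = [(0, 1), (1, 0), (1, 1), (1, 2), (2, 1)]
```

Stamping the cross on a single centre pixel of a 3x3 map produces a plus-shaped region, which is exactly the set of pixel points occupied by the element when its origin lies in the target pixel.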
Wherein the computer-readable storage medium may comprise: read-only memory (ROM), random access memory (RAM), magnetic disk, optical disk, and the like.
In the above-mentioned positioning method of the identification code, the computer readable storage medium and the electronic device embodiment, the descriptions of the embodiments are focused on, and the details of a certain embodiment may be referred to in the related descriptions of other embodiments. It will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process and the beneficial effects of the computer readable storage medium, the electronic device and the corresponding units described above may refer to the description of the positioning method of the identification code in the above embodiment, which is not repeated herein.
The foregoing describes in detail the positioning method for an identification code, the electronic device, and the computer-readable storage medium provided in the embodiments of the present application. Specific examples are used herein to illustrate the principles and implementations of the present application, and the above description of the embodiments is only intended to help understand the method and its core ideas. Meanwhile, those skilled in the art may make changes to the specific implementations and application scope based on the ideas of the present application. In view of the foregoing, the contents of this specification should not be construed as limiting the present application.

Claims (7)

1. A method of locating an identification code, the method comprising:
extracting a candidate region from an identification code image to be positioned to obtain a reference code candidate region of the identification code image;
extracting line segments based on the identification code image to obtain a line segment set of the identification code image;
performing included angle fitting based on the line segment set to obtain each target included angle of the identification code image;
integrating the reference code candidate region and the target included angle to obtain a parallelogram of the reference code candidate region;
outputting a target reference code region of the identification code image based on the parallelogram;
The step of extracting line segments based on the identification code image to obtain a line segment set of the identification code image comprises the following steps:
performing edge detection on the identification code image to obtain a preliminary line segment of the identification code image;
performing inflection point detection based on a first line segment between a starting point of the preliminary line segment and a midpoint of the preliminary line segment to obtain a first inflection point result of the preliminary line segment;
performing inflection point detection based on a second line segment between the end point of the preliminary line segment and the midpoint of the preliminary line segment to obtain a second inflection point result of the preliminary line segment;
updating the end points of the preliminary line segment based on the first inflection point result and the second inflection point result to obtain an updated line segment serving as a target line segment;
obtaining a line segment set of the identification code image based on the target line segment;
and performing included angle fitting based on the line segment set to obtain each target included angle of the identification code image, wherein the included angle fitting comprises the following steps:
performing line segment combination based on the line segment set to obtain a first line segment to be fitted and a second line segment to be fitted, wherein the first line segment to be fitted points from a first starting endpoint to a first ending endpoint, and the second line segment to be fitted points from a second starting endpoint to a second ending endpoint;
acquiring a first distance and a second distance based on the first line segment to be fitted and the second line segment to be fitted, wherein the first distance is the distance between the first ending endpoint and the second starting endpoint, and the second distance is the distance between the second ending endpoint and the first starting endpoint;
if the first distance is smaller than the second distance and the first distance is smaller than a first preset length, taking an intersection point between the first line segment to be fitted and the second line segment to be fitted and an included angle formed by the first line segment to be fitted and the second line segment to be fitted as a target included angle of the identification code image;
the method further comprises the steps of:
and if the second distance is smaller than the first distance and the second distance is smaller than the first preset length, taking an intersection point between the first line segment to be fitted and the second line segment to be fitted and an included angle formed by the first line segment to be fitted and the second line segment to be fitted as a target included angle of the identification code image.
2. The method for locating an identification code according to claim 1, wherein the obtaining a first distance and a second distance based on the first line segment to be fitted and the second line segment to be fitted includes:
detecting the pixel length of the first line segment to be fitted and the pixel length of the second line segment to be fitted;
and if the pixel length of the first line segment to be fitted is greater than a second preset length and the pixel length of the second line segment to be fitted is greater than the second preset length, acquiring the first distance and the second distance based on the first line segment to be fitted and the second line segment to be fitted.
3. The method for locating an identification code according to claim 1, wherein the obtaining a first distance and a second distance based on the first line segment to be fitted and the second line segment to be fitted includes:
detecting an angle between the first line segment to be fitted and the second line segment to be fitted;
and if the angle between the first line segment to be fitted and the second line segment to be fitted is within a preset angle range, acquiring the first distance and the second distance based on the first line segment to be fitted and the second line segment to be fitted.
4. The method for locating an identification code according to claim 1, wherein the extracting a candidate region from the identification code image to be positioned to obtain the reference code candidate region of the identification code image comprises:
performing binarization processing on the identification code image to be positioned to obtain a binarization map of the identification code image;
performing an expansion operation on the binarization map to obtain an expansion map of the binarization map;
and obtaining the reference code candidate region of the identification code image based on regions of continuous target pixel points in the expansion map, wherein a target pixel point is a pixel point whose gray value meets a preset condition.
5. The method for locating an identification code according to claim 4, wherein said performing an expansion operation on said binary map to obtain an expansion map of said binary map comprises:
acquiring a preset structural element;
acquiring pixel points occupied by the structural element when the origin of the structural element is located at a target pixel point in the binarization map;
and obtaining an expansion map of the binarization map based on the occupied pixel points.
6. An electronic device comprising a processor and a memory, the memory having a computer program stored therein, wherein the processor, when invoking the computer program in the memory, executes the method of locating an identification code according to any one of claims 1 to 5.
7. A computer-readable storage medium, on which a computer program is stored, the computer program being loaded by a processor to perform the method of locating an identification code according to any one of claims 1 to 5.
CN202311466383.7A 2023-11-07 2023-11-07 Identification code positioning method, electronic equipment and storage medium Active CN117197422B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311466383.7A CN117197422B (en) 2023-11-07 2023-11-07 Identification code positioning method, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN117197422A CN117197422A (en) 2023-12-08
CN117197422B true CN117197422B (en) 2024-03-26

Family

ID=88985389

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311466383.7A Active CN117197422B (en) 2023-11-07 2023-11-07 Identification code positioning method, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117197422B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106485183A (en) * 2016-07-14 2017-03-08 深圳市华汉伟业科技有限公司 A kind of Quick Response Code localization method and system
CN106980851A (en) * 2017-03-21 2017-07-25 浙江华睿科技有限公司 A kind of localization method and device of data matrix DM codes
CN109086644A (en) * 2018-07-27 2018-12-25 广东奥普特科技股份有限公司 The localization method of the DataMatrix two dimensional code of sub-pixel precision
CN113591507A (en) * 2021-08-03 2021-11-02 深圳市杰恩世智能科技有限公司 Robust two-dimensional code DataMatrix positioning method and system
JP2021184141A (en) * 2020-05-21 2021-12-02 国立大学法人 鹿児島大学 Code decoding device, code decoding method, and program
CN116976372A (en) * 2023-06-05 2023-10-31 深圳优艾智合机器人科技有限公司 Picture identification method, device, equipment and medium based on square reference code

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6337811B2 (en) * 2015-03-17 2018-06-06 トヨタ自動車株式会社 Image processing apparatus and image processing method




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant