CN116843748B - Remote two-dimensional code and object space pose acquisition method and system thereof - Google Patents


Info

Publication number
CN116843748B
CN116843748B (application CN202311121761.8A)
Authority
CN
China
Prior art keywords
remote
code
dimensional code
positioning
dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311121761.8A
Other languages
Chinese (zh)
Other versions
CN116843748A (en)
Inventor
刘洋洋
赵越
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Xiangong Intelligent Technology Co ltd
Original Assignee
Shanghai Xiangong Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Xiangong Intelligent Technology Co ltd
Priority to CN202311121761.8A
Publication of CN116843748A
Application granted
Publication of CN116843748B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K19/00 Record carriers for use with machines and with at least a part designed to carry digital markings
    • G06K19/06 Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
    • G06K19/06009 Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code with optically detectable marking
    • G06K19/06037 Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code with optically detectable marking multi-dimensional coding
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Abstract

The application provides a remote two-dimensional code and a method and system for acquiring the spatial pose of an object from it, wherein the method comprises the following steps: step S100, binarize an image containing remote two-dimensional code information and perform edge extraction; step S200, perform ellipse fitting detection on the extracted edge information to obtain the original codes and their circle-center coordinates; step S300, locate, among the original codes, the distribution positions of the positioning codes corresponding to each remote two-dimensional code according to the geometric relationship of the positioning code; step S400, calculate the spatial pose of the corresponding remote two-dimensional code in the camera coordinate system from the circle-center coordinates of the positioning code in the image. This ensures recognition accuracy of the two-dimensional code under long-distance, multi-angle detection, and on that basis the spatial pose relationship of the camera relative to the remote two-dimensional code can be solved accurately.

Description

Remote two-dimensional code and object space pose acquisition method and system thereof
Technical Field
The application relates to an indoor space positioning technology, in particular to a remote two-dimensional code and an object space pose acquisition method and system thereof.
Background
With the growth of application scenarios for industrial robots, the navigation modes deployed in different scenarios differ. Common traditional positioning modes include laser mapping positioning, magnetic stripe positioning, radio-frequency positioning, reflector-column positioning and the like. However, owing to requirements such as large scene variation in industrial environments and high positioning accuracy, these traditional positioning modes have great limitations.
In recent years, with the popularization of two-dimensional codes in the industrial field, practitioners have begun attempting to detect and estimate the spatial pose of objects based on them. For example, the prior art proposes a method for acquiring the spatial pose of an object based on a two-dimensional code (Chinese patent application publication No. CN112766008A), which captures a real-time image of a target through a monocular camera and inputs it to a processor; the real-time image is binarized, all two-dimensional codes in the image are found, and the position coordinate information of each two-dimensional code is acquired. A homography is computed from the position coordinates of the two-dimensional code to obtain its homography transformation matrix, from which the Euler angles and spatial position of the two-dimensional code are derived. Finally, the relative positional relationship of the two-dimensional codes is recorded, the Euler angles of each are converted into axis angles, the coordinate systems of the two-dimensional codes are fused, and the fused coordinate system is mapped to a spatial coordinate system; the pose in the fused coordinate system is then obtained from the recognized current pose of a two-dimensional code, yielding the pose of the object in the spatial coordinate system.
However, such two-dimensional code spatial pose estimation schemes are not ideal in practical applications. The inventor finds that if the two-dimensional code is mounted high up, for example on an indoor ceiling, the long detection distance and the resulting changes in detection viewing angle make recognition and detection of a traditional two-dimensional code very difficult.
The inventor attributes this to the fact that such schemes mostly adopt a traditional pixel-square two-dimensional code as the information carrier, designed originally with only short-distance detection in mind and unsuited to long-distance, multi-angle detection: once the two-dimensional code is far from the detection camera and the viewing angle changes, it easily degrades into a blurred clump of blocks in the detection image captured by the camera and can no longer be recognized accurately.
Disclosure of Invention
Therefore, the main aim of the application is to provide a remote two-dimensional code and a method and system for acquiring the spatial pose of an object from it, so as to ensure recognition accuracy of the two-dimensional code under long-distance, multi-angle detection and to allow the spatial pose relationship of the camera relative to the remote two-dimensional code to be solved accurately on that basis.
In order to achieve the above object, according to one aspect of the present application, there is provided a remote two-dimensional code comprising: a positioning code formed by arranging a plurality of direction points around a central point in a plurality of non-intersecting directions, so that a plurality of quadrant areas are delimited by the included angles between adjacent direction points; and an information code arranged in the corresponding quadrant areas according to a preset code table, so as to form a two-dimensional lattice together with the positioning code.
In a possible preferred embodiment, at least one main direction point among the plurality of direction points is arranged at a distance from the center point different from that of the other, secondary direction points, and the secondary direction points are all arranged equidistant from the center point.
In a possible preferred embodiment, the direction points, the center point and the information points are each any one of a circle, an ellipse, a circular ring and an elliptical ring, and each is provided with a reflective layer on its surface.
In a possible preferred embodiment, the positioning code is formed by arranging a plurality of direction points around a central point in the four directions of southeast, southwest, northeast and northwest, so that four quadrant areas are delimited by the included angles between adjacent direction points, and the information code is arranged in the corresponding quadrant areas according to a preset code table so as to form a two-dimensional lattice together with the positioning code, wherein the distance between at least one main direction point and the central point is half the distance between the other, auxiliary direction points and the central point, and the auxiliary direction points are all equidistant from the central point.
In order to achieve the above object, according to a second aspect of the present application, there is further provided a method for acquiring a spatial pose of an object based on a remote two-dimensional code, including the steps of:
step S100, performing binarization processing on an image containing remote two-dimensional code information, and performing edge extraction;
step S200, carrying out ellipse fitting detection according to the extracted edge information to obtain an original code and a circle center coordinate thereof;
step S300, locating, among the original codes, the distribution positions of the positioning codes corresponding to each remote two-dimensional code according to the geometric relationship of the positioning code;
step S400, calculating the spatial pose of the corresponding remote two-dimensional code in the camera coordinate system from the circle-center coordinates of the positioning code in the image.
In a possibly preferred embodiment, the step of calculating the spatial pose of the remote two-dimensional code in the camera coordinate system in step S400 includes:
step S410, taking the circle-center coordinates of the positioning code as (u, v), establishing the homogeneous relation

s · [u, v, 1]ᵀ = H · [x_w, y_w, 1]ᵀ

and solving the homography matrix H by the SVD method, wherein u, v represent the pixel coordinates of the circle center of the positioning code, (x_w, y_w) represent the coordinates of the point in the remote two-dimensional code coordinate system, and s is an equivalent distance scale factor;
step S420, converting via the relationship H = P · E between the homography matrix and the pose of the remote two-dimensional code in the camera coordinate system, wherein

P = [[f_x, 0, c_x, 0], [0, f_y, c_y, 0], [0, 0, 1, 0]],  E = [[r₁, r₂, t], [0, 0, 1]]

to obtain the rotation and translation matrices, where P is the camera projection matrix, E is the truncated extrinsic matrix, f_x and f_y are respectively the focal lengths of the camera in the x direction and the y direction, (c_x, c_y) are the coordinates of the camera center point, r₁ and r₂ are the first two columns of the rotation matrix, t = (t_x, t_y, t_z)ᵀ represents the position of the center of the remote two-dimensional code in the camera coordinate system, and H is the 3 × 3 homography projection matrix.
In a possibly preferred embodiment, the method further comprises:
step S430, solving by an iterative optimization method the minimized error function

min over R, t of Σᵢ ‖ mᵢ − π(R · Pᵢ + t) ‖²

to optimize the rotation and translation matrices, where R represents the rotation, t represents the translation, tᵢ represents the i-th element of the translation vector, and rᵢ represents the i-th column of the rotation matrix; mᵢ represents the coordinates, projected into the camera normalized coordinate system, of the circle centers of the i-th remote two-dimensional code in the image, i.e. the circle-center coordinates of all the positioning codes; pᵢ represents the circle-center spatial coordinates of the positioning codes in the i-th remote two-dimensional code converted, according to the rotation-translation matrix, into coordinates of points in the camera normalized plane; and Pᵢ represents the spatial coordinates of the circle center of each positioning code in the remote two-dimensional code coordinate system;
wherein in the solving process

RᵀR = I

is used as a constraint.
In a possible preferred embodiment, the method for acquiring the spatial pose of the object based on the remote two-dimensional code further includes:
step S500, constraining the spatial pose obtained in step S400 by temporal filtering and a prior pose constraint, wherein the temporal filtering step comprises: filtering out a spatial pose result when it is judged to exhibit an abrupt change; and the prior pose constraint step comprises: filtering out a spatial pose result when the remote two-dimensional code coordinate system and the camera coordinate system therein are judged to be non-parallel.
In a possibly preferred embodiment, step S200 further includes:
judging whether the major axis and minor axis ratio of each original code accords with a threshold value according to the major axis and minor axis formulas of the elliptic equation, and filtering when the major axis and minor axis ratio of each original code do not accord with the threshold value.
In order to achieve the above object, corresponding to the above method, according to a third aspect of the present application, there is further provided an object space pose acquisition system based on a remote two-dimensional code, including:
the storage unit is used for storing a program comprising the steps of the above object space pose acquisition method based on a remote two-dimensional code, for the control unit and the processing unit to retrieve and execute in due course;
the control unit is used for controlling the infrared camera to shoot an image to be processed containing the remote two-dimensional code;
the processing unit is used for binarizing the image to be processed and then performing edge extraction; performing ellipse fitting detection on the extracted edge information to obtain the original codes and their circle-center coordinates; locating, among the original codes, the distribution positions of the positioning code of each remote two-dimensional code according to the preset geometric relationship of the positioning code, so as to screen out the information codes among the original codes and obtain the ID information of each remote two-dimensional code from the code table; and, when the remote two-dimensional code with a given ID is judged to appear for the first time, calculating the spatial pose of that remote two-dimensional code in the camera coordinate system from the circle-center coordinates of its positioning code in the image and storing it with the corresponding ID information in the storage unit.
The remote two-dimensional code and the object space pose acquisition method and system thereof provided by the application exploit the property that, even though a circle in space undergoes an affine transformation as the camera viewing angle changes, it merely becomes an ellipse whose center still exists, and this circular/elliptical shape remains robust even when long-distance detection blurs the image. On this basis a two-dimensional code suited to long-distance detection is designed; at the same time, given the characteristics of the remote two-dimensional code, the center of each ellipse can be obtained very accurately through ellipse detection, so that the pose transformation matrix of the camera can be established. This solves the problem that a traditional two-dimensional code, once the distance and angle to the detection camera change substantially, easily appears in the captured detection image as a blurred clump of blocks and cannot be accurately recognized.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application. In the drawings:
fig. 1 is a schematic diagram of a remote two-dimensional code of the present application adhered to a ceiling in an indoor environment for indoor positioning scenes;
FIG. 2 is a schematic diagram of the remote two-dimensional code lattice structure, wherein the ring portions are the positions where the information code may be arranged;
FIG. 3 is a schematic diagram of a code table of the remote two-dimensional code according to the present application;
FIG. 4 is a schematic diagram of the geometric position relationship of the remote two-dimensional code according to the present application;
fig. 5 is a schematic diagram of a remote two-dimensional code of the present application photographed by an infrared camera;
fig. 6 is a schematic diagram of steps of a method for acquiring an object space pose based on a remote two-dimensional code;
fig. 7 is a schematic diagram of a binarized image in the object space pose acquisition method based on the remote two-dimensional code according to the application;
FIG. 8 is a schematic diagram of the pose singularity phenomenon;
fig. 9 is a schematic structural diagram of an object space pose acquisition system based on a remote two-dimensional code.
Detailed Description
In order that those skilled in the art can better understand the technical solutions of the present application, the following description will clearly and completely describe the specific technical solutions of the present application in conjunction with the embodiments to help those skilled in the art to further understand the present application. It will be apparent that the embodiments described herein are merely some, but not all embodiments of the application. It should be noted that embodiments of the present application and features of embodiments may be combined with each other by those of ordinary skill in the art without departing from the spirit of the present application and without conflicting with each other. All other embodiments, which are derived from the embodiments herein without creative effort for a person skilled in the art, shall fall within the disclosure and the protection scope of the present application.
Furthermore, the terms "first," "second," "S100," "S200," and the like in the description and in the claims and drawings are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented in sequences other than those described herein. Also, the terms "comprising" and "having" and any variations thereof herein are intended to cover a non-exclusive inclusion. Unless specifically stated or limited otherwise, the terms "disposed," "configured," "mounted," "connected," "coupled" and "connected" are to be construed broadly, e.g., as being either permanently connected, removably connected, or integrally connected; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communication between two elements. The specific meaning of the terms in this case will be understood by those skilled in the art in view of the specific circumstances and in combination with the prior art.
Referring to fig. 1, when an indoor two-dimensional code positioning scheme is set up, if the two-dimensional code is attached high up on an indoor ceiling, far from the ground, then when the camera of a mobile robot shoots the two-dimensional code, distance and viewing-angle transformation factors cause the technical defect that the two-dimensional code cannot be accurately identified.
For this reason, referring to fig. 2 to 5, in order to ensure the recognition accuracy of the two-dimensional code under the conditions of long-distance and multi-angle detection, the present application provides a long-distance two-dimensional code, which includes: the positioning code is formed by arranging a plurality of direction points around a central point in a plurality of non-intersecting directions, a plurality of quadrant areas are divided at included angles of each adjacent direction point, and the information code is arranged in the corresponding quadrant areas according to a preset code table so as to form a two-dimensional lattice with the positioning code.
In particular, the inventor found that since a circle (including an ellipse, an elliptical ring or a circular ring) undergoes an affine transformation when the viewing angle changes and is exhibited in the camera plane in the form of an ellipse, its shape as such withstands viewing-angle transformations comparatively poorly; a traditional square-pixel two-dimensional code, on the other hand, has corner points that are fairly robust under different viewing angles, but its weakness is that those corner points deviate easily at long-distance viewing angles. Based on this finding, the inventor designed the two-dimensional code in lattice form.
As shown in figs. 2 to 4, in this example the positioning code is formed by arranging a plurality of direction points around a central point in the four directions of southeast, southwest, northeast and northwest, so as to delimit, at the included angles between adjacent direction points, the four quadrant areas shown in fig. 4, while the information code is arranged in the corresponding quadrant areas according to a preset code table as shown in fig. 2, so as to form, together with the positioning code, a two-dimensional lattice as shown in fig. 3; in this example the direction points, the central point and the information points are preferably each any one of a circle, an ellipse, an elliptical ring and a circular ring.
Through the arrangement, the remote two-dimensional code has only one basic pattern (round dot), so that the difficulty in processing and manufacturing the two-dimensional code is reduced. Meanwhile, in the design of the dot matrix, the arrangement modes of any two-dimensional codes have great difference, so that the probability of false detection can be reduced, and different height distances can be supported.
In addition, in order to strengthen the arrangement features of the positioning code, as shown in fig. 4, at least one main direction point among the plurality of direction points (such as point A in fig. 4) is disposed at a distance from the center point B different from that of the other, auxiliary direction points; for example, the distance from A to B is half the distance from each of C, D, F to B, and the auxiliary direction points are all equidistant from the center point. Used as one of the constraints in recognizing the remote two-dimensional code, this can further improve identification accuracy.
Further, ambient light, the base color of the indoor ceiling, suspended interfering objects and other such factors can together interfere with the image captured by the camera and so lower the recognition rate of the remote two-dimensional code. In this example a reflective layer, such as bright silver reflective cloth, is therefore preferably arranged on the positioning-code and information-code surfaces of the remote two-dimensional code, and the corresponding camera is preferably an infrared camera: an infrared filter is fitted between the camera lens and the CMOS sensor so that infrared light in a certain band passes while visible and ultraviolet light is absorbed or reflected, and an infrared lamp can optionally be added to the camera. With this arrangement, as shown in fig. 5, the infrared camera can easily capture images clearly containing remote two-dimensional code information.
On the other hand, as shown in fig. 6, the application further provides a method for acquiring the space pose of the object based on the remote two-dimensional code, which comprises the following steps:
step S100 is to obtain an image containing remote two-dimension code information, binarize the image, and then extract edges.
Specifically, following the above example of the remote two-dimensional code, when the infrared camera shoots an image containing the remote two-dimensional code, the obtained image is as shown in fig. 5, with every part other than the dots of the positioning code and the information code appearing white; the pixel values of the image can then be inverted, and the inverted image binarized to convert the gray-scale image into a black-and-white image, as shown in fig. 7.
And then, extracting edges from the black-and-white image by using a Canny edge detection algorithm. Due to the particularity of the infrared camera, the edges in the black-and-white image are fewer, and the edge extraction process can be greatly accelerated.
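To illustrate this preprocessing concretely, the following sketch inverts and binarizes a grayscale frame and marks boundary pixels using plain NumPy; it stands in for, rather than reproduces, the Canny-based pipeline described above (a production version would typically call cv2.threshold and cv2.Canny), and the fixed threshold of 128 is an assumption.

```python
import numpy as np

def preprocess(gray: np.ndarray, thresh: int = 128) -> np.ndarray:
    """Invert the image (dots are dark on a white background) and binarize it.

    In the pipeline described above, a Canny detector would then be run on
    the result; the edge_mask() below is a simple stand-in.
    """
    inverted = 255 - gray                                  # dots become bright
    return np.where(inverted >= thresh, 255, 0).astype(np.uint8)

def edge_mask(binary: np.ndarray) -> np.ndarray:
    """Mark pixels whose 4-neighbourhood crosses the black/white boundary."""
    pad = np.pad(binary, 1, mode="edge")
    return (
        (pad[1:-1, 1:-1] != pad[:-2, 1:-1])
        | (pad[1:-1, 1:-1] != pad[2:, 1:-1])
        | (pad[1:-1, 1:-1] != pad[1:-1, :-2])
        | (pad[1:-1, 1:-1] != pad[1:-1, 2:])
    )
```

Because almost the entire binarized frame is uniform, the boundary mask is sparse, which is the property the text relies on to accelerate edge extraction.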
And step S200, carrying out ellipse fitting detection according to the extracted edge information to obtain an original code and a circle center coordinate thereof.
Specifically, a circle is a special case of an ellipse, and the dots of a remote two-dimensional code generally appear approximately elliptical in the imaging plane owing to differing viewing angles. Ellipse fitting is commonly used in problems such as feature extraction, scene modeling and camera calibration: given a set of image measurement data X = {x_i}, i = 1, …, N, taken from some ellipse, where each x_i is the pixel-position coordinate of an edge point and N is the number of pixel points making up the edge, the corresponding ellipse is to be recovered. The edge-point information of the image is used here.
Since the measured data inevitably contain errors, the problem becomes recovering the corresponding ellipse information from data with errors. The general equation of the ellipse can be written as:

A u² + B u v + C v² + D u + E v + F = 0

where a = (A, B, C, D, E, F) is the set of parameters of the ellipse, 6 parameters in total.
Since the edge points of each ellipse lie on the ellipse, all edge points satisfy the ellipse equation, and ellipse fitting translates into a linear least-squares problem:

min over a of Σᵢ (A uᵢ² + B uᵢ vᵢ + C vᵢ² + D uᵢ + E vᵢ + F)²

To ensure that the least-squares solution is indeed an ellipse, a constraint must be added:

4 A C − B² > 0

which, since a is defined only up to scale, can be imposed as the equality

4 A C − B² = 1

thereby constructing a nonlinear optimization problem with constraints:

min over a of Σᵢ (A uᵢ² + B uᵢ vᵢ + C vᵢ² + D uᵢ + E vᵢ + F)²  subject to  4 A C − B² = 1

For this nonlinear optimization problem, the corresponding ellipse parameter information can be solved using the Levenberg–Marquardt algorithm.
Based on the above ellipse detection scheme, after all the edge information is extracted, each edge is separately fitted with an ellipse by least squares, thereby obtaining the ellipse fitting information of each black dot, i.e. each original code, where the general equation of the ellipse is:

A x² + B x y + C y² + D x + E y + F = 0

where A, B, C, D, E and F are the parameters of the ellipse equation, obtained by solving in the ellipse fitting process.
From the ellipse parameters, the coordinates of the geometric center of the ellipse are obtained as:

x₀ = (B E − 2 C D) / (4 A C − B²),  y₀ = (B D − 2 A E) / (4 A C − B²)

and the lengths of the semi-major axis a and the semi-minor axis b of the ellipse are respectively:

a, b = −sqrt( 2 (A E² + C D² + F B² − B D E + (B² − 4 A C) F) · (A + C ± sqrt((A − C)² + B²)) ) / (B² − 4 A C)

with the plus sign giving the semi-major axis a and the minus sign the semi-minor axis b.
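The centre and axis formulas can equivalently be evaluated numerically. The sketch below, an illustration rather than the patent's implementation, recovers the centre by solving the linear system for the conic's stationary point and reads the semi-axes off the eigenvalues of the quadratic-form matrix:

```python
import numpy as np

def ellipse_center_axes(A, B, C, D, E, F):
    """Recover the centre and semi-axes of A x^2 + B x y + C y^2 + D x + E y + F = 0.

    Assumes the conic really is an ellipse (4AC - B^2 > 0).  The centre
    solves the gradient system; translating the conic there leaves
    A x'^2 + B x' y' + C y'^2 + F_c = 0, whose semi-axes are
    sqrt(-F_c / lambda_i) for the eigenvalues of the quadratic form.
    """
    center = np.linalg.solve([[2 * A, B], [B, 2 * C]], [-D, -E])
    x0, y0 = center
    F_c = F + (D * x0 + E * y0) / 2.0        # constant term at the centre
    Q = np.array([[A, B / 2.0], [B / 2.0, C]])
    lam = np.linalg.eigvalsh(Q)
    semi_axes = np.sqrt(-F_c / lam)          # larger eigenvalue -> shorter axis
    return center, np.sort(semi_axes)[::-1]  # (centre, [semi-major, semi-minor])
```

For a circle such as x² + y² − 2x − 4y + 1 = 0 this yields centre (1, 2) and equal semi-axes of length 2, matching the closed-form expressions above.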
further, due to noise in the image or due to the influence of other reflective materials, each elliptical information needs to be filtered to find the most circular elliptical equation.
For this purpose, step S200 further comprises: judging, according to the major-axis and minor-axis formulas of the ellipse equation, whether the ratio of the major axis to the minor axis of each original code meets a threshold value, and filtering out those that do not. For example, when the ratio between the major axis and the minor axis exceeds 1.3, the ellipse (original code) is considered too far from circular and is filtered out. Meanwhile, to eliminate the influence of other noise points, the radii of all dots belonging to the same remote two-dimensional code should lie in the same range, so all original codes can be further filtered on this condition to quickly remove unqualified ones.
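A minimal sketch of this two-stage filter, using the 1.3 axis-ratio threshold mentioned in the text and a hypothetical factor-of-two radius band around the median (both values illustrative, not prescribed by the patent):

```python
def filter_candidates(ellipses, max_ratio=1.3, radius_spread=2.0):
    """Drop fitted ellipses that are too elongated to be marker dots,
    then drop size outliers whose mean radius is far from the median
    (dots of one tag should all fall in the same radius range).

    `ellipses` is a list of (center, (a, b)) tuples with a >= b.
    """
    round_enough = [(c, (a, b)) for c, (a, b) in ellipses
                    if b > 0 and a / b <= max_ratio]
    if not round_enough:
        return []
    radii = sorted((a + b) / 2 for _, (a, b) in round_enough)
    median = radii[len(radii) // 2]
    return [(c, (a, b)) for c, (a, b) in round_enough
            if median / radius_spread <= (a + b) / 2 <= median * radius_spread]
```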
Step S300 is to locate the distribution positions of the positioning codes corresponding to the remote two-dimensional codes in the original codes according to the geometric relationship of the positioning codes.
Specifically, after all the original codes that could form remote two-dimensional codes are obtained, recall that a remote two-dimensional code consists of the positioning code and the information code as in the example above. The positioning code consists of five points and is thus a fixed template, as shown in fig. 4, in which the distance between the central circle B and the nearest circle A is exactly half the distance between each of the other three circles C, D, F and circle B. With this geometric relationship, the position information of each remote two-dimensional code in the image can be located rapidly.
The grouping of each remote two-dimensional code is thereby obtained. This classification method greatly accelerates the process of locating remote two-dimensional codes in the image; moreover, since it resembles template matching, if several remote two-dimensional codes appear in the image they can be distinguished without an additional clustering step.
In addition, after all remote two-dimensional codes in an image are grouped, the position of the positioning code of each remote two-dimensional code in the image is known. For further verification, the circularity of all positioning codes is checked, ensuring that the major and minor axes of each circle are of approximately equal length. From the spatial positions of the positioning codes, the distribution of the information codes in each remote two-dimensional code can then be obtained rapidly.
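The template search this describes can be sketched as a brute-force scan over candidate dot centres: look for a centre B whose nearest neighbour A lies at some distance d while three further points lie at roughly 2d. The tolerance and return format here are assumptions, not the patent's matcher:

```python
import numpy as np

def find_locator(points, tol=0.15):
    """Search candidate dot centres for the locator template described
    above: a centre B with one main direction point A at distance d and
    three secondary points at roughly 2d (the half-distance relation).

    `tol` is a hypothetical relative tolerance.  Returns
    (B, A, [secondary points]) or None if no group matches.
    """
    pts = np.asarray(points, dtype=float)
    for b in pts:
        dists = np.linalg.norm(pts - b, axis=1)
        order = np.argsort(dists)[1:]          # skip b itself (distance 0)
        if len(order) < 4:
            continue
        a = order[0]                           # nearest neighbour candidate A
        d = dists[a]
        sec = [j for j in order[1:]
               if abs(dists[j] - 2 * d) < tol * 2 * d]
        if d > 0 and len(sec) >= 3:
            return tuple(b), tuple(pts[a]), [tuple(pts[j]) for j in sec[:3]]
    return None
```

Because the half-distance relation holds only when B is the true centre, other dots of the same tag fail the test, which is what lets multiple tags be separated without clustering.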
The information code carries the ID information of the remote two-dimensional code, so every remote two-dimensional code has a different information code. After the position of the positioning code is determined, all information codes of a remote two-dimensional code can be read out by evaluating the black-and-white state of the pixel blocks in each quadrant. Since the prior code table fixes the spatial position information of every dot of each remote two-dimensional code, the ID information of each remote two-dimensional code can be determined by look-up against the code table.
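Reading out the ID then reduces to a table look-up on the detected dot pattern. In this sketch both the bit layout (one presence flag per candidate dot position) and the table contents are hypothetical; the real mapping is the patent's preset code table:

```python
def decode_id(quadrant_bits, code_table):
    """Map the presence/absence pattern of information dots, read out of
    the quadrants, to a tag ID.

    `quadrant_bits` is a tuple of 0/1 flags and `code_table` a dict from
    such tuples to IDs -- both illustrative stand-ins for the patent's
    preset code table.  Returns None for patterns not in the table.
    """
    return code_table.get(tuple(quadrant_bits))
```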
The method then judges whether the remote two-dimensional code with this ID appears for the first time. If not, a previously calculated space pose matrix of this remote two-dimensional code under the camera coordinate system already exists and can be reused directly; if it is the first appearance, the next step is executed to calculate the space pose of the remote two-dimensional code under the camera coordinate system. This saves computation; moreover, if the position of each remote two-dimensional code in the environment has been recorded, the ID identification can also serve to localize the mobile robot in the map.
Step S400 calculates the space pose of the corresponding remote two-dimensional code under the camera coordinate system according to the center coordinates of the positioning code in the image.
Specifically, in the first step, the correspondence between the input tag (remote two-dimensional code) coordinates and the pixel (image) coordinates is determined.
Such as: calculating a 3 x 3 homography matrix that maps 2D points in homogeneous coordinates from the tag coordinate system (where the origin is located at the center of the remote two-dimensional code and the x and y axes each extend one unit) to the 2D image coordinate system. The homography is calculated using a Direct Linear Transformation (DLT) algorithm. Note that since the homography projects points in homogeneous coordinates, it is defined only up to scale.
And secondly, establishing the relationship between the homography matrix and the conversion matrix.
Calculating the position and orientation of the remote two-dimensional code, i.e. the extrinsic matrix, requires additional information: the focal length of the camera and the physical size of the tag. Here the 3 x 3 homography matrix (calculated by DLT) can be written as the product of the 3 x 4 camera projection matrix P (assumed known) and the 4 x 3 truncated extrinsic matrix E.
The extrinsic matrix is normally 4 x 4, but every location on the tag is at z = 0 in the tag coordinate system. Each tag coordinate can therefore be rewritten as a two-dimensional homogeneous point with z implicitly zero, and the third column of the extrinsic matrix deleted, forming the truncated extrinsic matrix.
The relationship between the homography matrix and the conversion matrix, once established, is:
s·H = P·E, with P = [[f_x, 0, c_x, 0], [0, f_y, c_y, 0], [0, 0, 1, 0]] and E = [[r_1, r_2, t], [0, 0, 1]]
wherein P is the camera projection matrix built from the intrinsic parameters f_x, f_y, c_x, c_y; E is the truncated extrinsic matrix, whose first two columns r_1, r_2 are the rotational component and whose last column t is the translational component; and s is an equivalent distance scaling factor.
And thirdly, solving a homography matrix.
The homography of a plane is defined as the projective mapping from one plane to another: described mathematically, a point on one plane is multiplied by a projection matrix, yielding the corresponding point on the other plane. The point (u, v) is the center point of the detected ellipse, where u, v respectively denote the pixel coordinates of the ellipse center, and (x, y) represents the point coordinates in the tag coordinate system. Expressed with a homogeneous matrix:
s·[u, v, 1]^T = H·[x, y, 1]^T (10)
wherein the projection matrix H is a 3×3 square matrix:
H = [[h_1, h_2, h_3], [h_4, h_5, h_6], [h_7, h_8, h_9]]
Unfolding (10) gives:
u = (h_1·x + h_2·y + h_3) / (h_7·x + h_8·y + h_9)
v = (h_4·x + h_5·y + h_6) / (h_7·x + h_8·y + h_9) (13)
Comparing the coefficients of the rows of the matrix H in these equations shows that the numerator and denominator on the right of each equation can be multiplied by the same scaling factor without affecting the accuracy of the result. The projection matrix is therefore not unique: its elements can be scaled in equal proportion. Consequently, when solving for the matrix H, there are only 8 unknowns rather than 9; once any non-zero element is given a determined value, the other elements can be determined by proportion.
Typically, the matrix is normalized by setting h_9 = 1, and the result is denoted H'. This is of particular interest in some applications because, when x and y both take the value 0, the elements h_3, h_6 become u, v respectively. From the foregoing, each pair of corresponding points on the two planes provides two simultaneous equations, one from the u coordinate and one from the v coordinate; 8 constraints are required to solve for the projection matrix H, so 4 pairs of points need to be provided to solve for H.
In this example, one more point is added to ensure the accuracy of the pose calculation. As known from the ellipse detection example in step S200, each remote two-dimensional code provides at least five groups of anchor points. Five groups of positioning points supply more constraints than four pairs of points, which secures the precision of the pose calculation.
Fourth, solving the homography matrix based on SVD.
Writing formula (13) as follows:
(h_1·x + h_2·y + h_3) − u·(h_7·x + h_8·y + h_9) = 0
(h_4·x + h_5·y + h_6) − v·(h_7·x + h_8·y + h_9) = 0
That is to say, A·h = 0, wherein h = (h_1, h_2, …, h_9)^T and each point pair contributes the two rows
[x, y, 1, 0, 0, 0, −u·x, −u·y, −u]
[0, 0, 0, x, y, 1, −v·x, −v·y, −v]
For five pairs of points (x_i, y_i) ↔ (u_i, v_i) on the two planes, stacking these rows yields the homogeneous system A·h = 0, wherein A is a 10×9 matrix. For more intuition, it is developed as follows:
[x_1, y_1, 1, 0, 0, 0, −u_1·x_1, −u_1·y_1, −u_1]
[0, 0, 0, x_1, y_1, 1, −v_1·x_1, −v_1·y_1, −v_1]
⋮
[x_5, y_5, 1, 0, 0, 0, −u_5·x_5, −u_5·y_5, −u_5]
[0, 0, 0, x_5, y_5, 1, −v_5·x_5, −v_5·y_5, −v_5]
This equation is then solved; the solution of such a homogeneous system is typically obtained by singular value decomposition, taking h as the right singular vector associated with the smallest singular value of A.
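The fourth step (stacking two rows per point pair and taking the right singular vector of the smallest singular value) can be sketched with NumPy as follows; this is a generic DLT implementation, not the patent's exact code.

```python
import numpy as np

def homography_dlt(tag_pts, img_pts):
    """Direct Linear Transformation: stack the two rows contributed by
    each tag-point / image-point pair into A and take the right singular
    vector of the smallest singular value of A as the flattened 3x3 H."""
    A = []
    for (x, y), (u, v) in zip(tag_pts, img_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)      # null-space direction of A
    return H / H[2, 2]            # fix the arbitrary overall scale
```

With five point pairs A is 10×9, matching the over-determined system developed above.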
And fifthly, solving a conversion matrix through the homography matrix.
After the H matrix is obtained through SVD decomposition in the fourth step, and since the camera intrinsic parameters can be obtained through camera calibration, the vectors r_1, r_2 and t can be calculated from the obtained homography matrix and the camera intrinsics, i.e. formula (21). Owing to the orthogonality of the rotation matrix, the third column follows as r_3 = r_1 × r_2. The conversion matrix of each remote two-dimensional code relative to the camera coordinate system is thereby obtained, and further the rotation and translation matrices.
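The fifth step can be sketched as follows, assuming a 3×3 pinhole intrinsic matrix K (equivalent to the s·H = P·E relation with the zero column of P dropped): the columns of K⁻¹H give r_1, r_2 and t up to the scale s, the scale is fixed by the unit norm of the rotation columns, and r_3 = r_1 × r_2 completes the rotation. Function names are illustrative, not the patent's numerical recipe.

```python
import numpy as np

def pose_from_homography(H, K):
    """Recover (R, t) from s*H = K [r1 r2 t] for a planar tag.
    The scale s is fixed by the unit norm of the rotation columns,
    r3 = r1 x r2 completes the rotation, and an SVD projection
    re-orthonormalizes R against noise. Illustrative sketch."""
    M = np.linalg.inv(K) @ H
    if M[2, 2] < 0:               # choose the sign placing the tag in front
        M = -M
    s = (np.linalg.norm(M[:, 0]) + np.linalg.norm(M[:, 1])) / 2.0
    r1, r2, t = M[:, 0] / s, M[:, 1] / s, M[:, 2] / s
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)   # project to the nearest proper rotation
    R = U @ Vt
    if np.linalg.det(R) < 0:
        R = -R
    return R, t
```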
In addition, because of noise, the rotation and translation matrices calculated by this direct method are not the globally optimal solution, so the result needs to be optimized by constructing a nonlinear optimization problem.
And sixthly, solving a minimized error function through an iterative optimization method to optimize the rotation and translation matrix.
Because the result obtained by the direct method is strongly affected by image noise, it needs to be optimized by an iterative method according to the distribution of the pixel points and the space points, so as to obtain more accurate remote two-dimensional code attitude information. Given a set of spatial points P_i and the corresponding set of image points q_i, i = 1, …, n, where n denotes the number of circle centers of all positioning codes in the remote two-dimensional code, the camera attitude information is determined.
The rotation of the camera relative to the two-dimensional code coordinate system is expressed as R, and the translation as t.
Under the ideal condition, free of noise and other disturbances, and with the camera calibrated, the spatial points and the points projected into the camera normalized plane coordinate system satisfy the following equation:
(x_i′, y_i′, 1)^T ∝ R·P_i + t
wherein (x_i′, y_i′) respectively represent the coordinates, converted into the camera normalized plane according to the rotation-translation matrix, of the circle center of the i-th positioning code in the remote two-dimensional code, and P_i = (X_i, Y_i, Z_i)^T represents the spatial coordinates of the circle center of each positioning code in the remote two-dimensional code coordinate system.
Owing to the effect of errors, the spatial points and pixel points do not actually satisfy the above relation, so the problem is converted into finding the optimal parameters (R, t) that minimize an error function, where R represents rotation and t represents translation, t_i represents the i-th element in the translation vector, and r_i represents the i-th row in the rotation matrix; (u_i, v_i) respectively represent the coordinates of the circle center of the i-th remote two-dimensional code positioning point projected into the camera normalized coordinate system, and P_i represents the circle-center coordinates of all positioning codes. The minimization error function is:
e(R, t) = Σ_{i=1}^{n} ‖ (u_i, v_i) − (x_i′, y_i′) ‖², with (x_i′, y_i′, 1)^T ∝ R·P_i + t
In the process of minimization, to ensure that R is a rotation matrix, the constraint conditions are:
R^T·R = I, det(R) = 1
The optimization function then becomes:
min_{R,t} e(R, t) = Σ_{i=1}^{n} ‖ (u_i, v_i) − (x_i′, y_i′) ‖² subject to R^T·R = I, det(R) = 1
For this nonlinear optimization problem, the Levenberg-Marquardt algorithm is used to solve for the corresponding accurate rotation and translation matrices.
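The iterative refinement can be sketched with a plain numeric Gauss-Newton loop over an axis-angle parameterization; this is a simple stand-in for the Levenberg-Marquardt solver named above (no damping term is included, so it is illustrative rather than the patent's solver).

```python
import numpy as np

def rodrigues(w):
    """Axis-angle vector -> rotation matrix (Rodrigues' formula)."""
    th = np.linalg.norm(w)
    if th < 1e-12:
        return np.eye(3)
    k = w / th
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(th) * K + (1 - np.cos(th)) * K @ K

def refine_pose(P, q, w0, t0, iters=20):
    """Minimize sum_i ||q_i - project(R P_i + t)||^2 over the axis-angle
    vector w and translation t with an (undamped) Gauss-Newton loop and
    a forward-difference Jacobian. P: Nx3 tag points, q: Nx2 points in
    the camera normalized plane."""
    x = np.concatenate([w0, t0]).astype(float)

    def residuals(x):
        R, t = rodrigues(x[:3]), x[3:]
        C = P @ R.T + t                  # tag points in the camera frame
        return (C[:, :2] / C[:, 2:3] - q).ravel()

    for _ in range(iters):
        r = residuals(x)
        J = np.empty((r.size, 6))
        eps = 1e-6
        for j in range(6):               # numeric Jacobian, column by column
            dx = np.zeros(6)
            dx[j] = eps
            J[:, j] = (residuals(x + dx) - r) / eps
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + step
    return rodrigues(x[:3]), x[3:]
```

Parameterizing the rotation by an axis-angle vector keeps R a valid rotation at every iterate, so the orthogonality constraint is satisfied by construction.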
Further, as shown in fig. 8, in actual use, if the remote two-dimensional code is small or is viewed from a large distance, singular poses may occur. This pose ambiguity belongs to the fundamental nature of the problem. Geometrically, the two poses correspond approximately to a flip of the object about a plane whose normal lies along the line of sight from the camera center to the object center. In such cases the above scheme typically yields two pose solutions whose re-projection errors are similar, so the correct pose cannot be selected using the re-projection error.
There is currently no reliable algorithm that solves this pose ambiguity, because it is a natural attribute of the problem of recovering a 3D pose from a planar target. Solving it requires more information to constrain the pose of the two-dimensional code.
In the method of this example, the steps therefore further include: step S500, constraining the space pose obtained in step S400 according to time filtering and a prior pose constraint. Time filtering means that when the mobile robot uses the pose information of the remote two-dimensional code to calculate its own current pose, it judges whether that pose changes suddenly; an incorrect localization shows up as a jump in the pose, and once such a jump is found, the position information of the current remote two-dimensional code can be filtered out. The prior pose constraint exploits the fact that the remote two-dimensional code is stuck on the ceiling, so its coordinate system lies in a plane parallel to the ground, i.e. to the vehicle body; once a remote two-dimensional code yields an erroneous pose violating this constraint, it can be filtered out.
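The two filters of step S500 reduce to cheap checks on consecutive poses. The thresholds below (maximum position jump between frames, maximum tilt from the ceiling-parallel plane) are illustrative assumptions, not values from the patent.

```python
import math

def accept_pose(t_now, t_prev, roll_pitch, max_jump=0.5, max_tilt_deg=5.0):
    """Step-S500-style sanity checks. Reject a tag pose when the implied
    position jumps more than max_jump metres between frames (time
    filtering), or when the tag plane tilts more than max_tilt_deg from
    parallel to the ground (prior pose constraint for ceiling-mounted
    tags). Both thresholds are illustrative assumptions."""
    if t_prev is not None and math.dist(t_now, t_prev) > max_jump:
        return False
    if max(abs(roll_pitch[0]), abs(roll_pitch[1])) > math.radians(max_tilt_deg):
        return False
    return True
```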
On the other hand, as shown in fig. 9, corresponding to the above method, the present application further provides an object space pose acquisition system based on a remote two-dimensional code, which includes:
the storage unit is used for storing a program comprising the steps of the object space pose acquisition method based on the remote two-dimensional code, so that the control unit and the processing unit can timely retrieve and execute the program;
the control unit is used for controlling the infrared camera to shoot an image to be processed containing the remote two-dimensional code;
the processing unit is used for performing binarization processing on the image to be processed and then performing edge extraction; then performing ellipse fitting detection according to the extracted edge information to obtain the original codes and their circle-center coordinates; locating the distribution positions of the positioning codes of each remote two-dimensional code among the original codes according to the preset geometric relation of the positioning codes, so as to screen out the information codes in the original codes and obtain the ID information of each remote two-dimensional code according to the code table; and, when the remote two-dimensional code under an ID is judged to appear for the first time, calculating the space pose of the remote two-dimensional code under the camera coordinate system according to the circle-center coordinates of its positioning code in the image, and storing it with the corresponding ID information in the storage unit.
In summary, the remote two-dimensional code and the object space pose acquisition method and system provided by the application exploit the robustness of circular geometry: even though a circle changes with the viewing angle of the camera in space, the affine transformation only turns it into an ellipse, its center still exists, and the circular/elliptical character survives even the blurring that remote detection easily introduces. On this basis a two-dimensional code suitable for remote detection is designed, and the ellipse centers can be accurately obtained through ellipse detection to establish a pose transformation matrix with the camera according to the characteristics of the remote two-dimensional code. This solves the problem that a traditional two-dimensional code in a captured detection image easily degrades into a blurred block when the distance and angle between the code and the detection camera change greatly, and therefore cannot be accurately identified.
The preferred embodiments of the application disclosed above are intended only to assist in the explanation of the application. The preferred embodiments are not exhaustive or to limit the application to the precise form disclosed. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the application and the practical application, to thereby enable others skilled in the art to best understand and utilize the application. The application is to be limited only by the following claims and their full scope and equivalents, and any modifications, equivalents, improvements, etc., which fall within the spirit and principles of the application are intended to be included within the scope of the application.
It will be understood by those skilled in the art that the system, apparatus, units and their respective modules provided by the present application can be implemented entirely by logic programming of method steps, in the form of logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers, etc., except for implementing the system, apparatus, units and their respective modules provided by the present application in a purely computer readable program code. Therefore, the system, the apparatus, and the respective modules thereof provided by the present application may be regarded as one hardware component, and the modules included therein for implementing various programs may also be regarded as structures within the hardware component; modules for implementing various functions may also be regarded as being either software programs for implementing the methods or structures within hardware components.
Furthermore, all or part of the steps in implementing the methods of the embodiments described above may be implemented by a program, where the program is stored in a storage medium and includes several instructions for causing a single-chip microcomputer, chip or processor (processor) to execute all or part of the steps in the methods of the embodiments of the application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
In addition, any combination of various embodiments of the present application may be performed, so long as the concept of the embodiments of the present application is not violated, and the disclosure of the embodiments of the present application should also be considered.

Claims (7)

1. A method for acquiring the space pose of an object based on a remote two-dimensional code comprises the following steps:
step S100, performing binarization processing on an image containing remote two-dimensional code information, and performing edge extraction; wherein the remote two-dimensional code includes: a positioning code formed by arranging a plurality of direction points around a central point in a plurality of non-intersecting directions, a plurality of quadrant areas being divided at the included angles of each pair of adjacent direction points, and an information code arranged in the corresponding quadrant areas according to a preset code table so as to form a two-dimensional lattice with the positioning code;
step S200, carrying out ellipse fitting detection according to the extracted edge information to obtain an original code containing a positioning code and an information code and a circle center coordinate thereof;
step S300, positioning code distribution positions corresponding to each remote two-dimensional code in the original code are positioned according to the geometric relation of the positioning codes;
step S400, calculating the space pose of the corresponding remote two-dimensional code under a camera coordinate system according to the center coordinates of the positioning code in the image, wherein the steps comprise:
step S410, setting the circle-center coordinates of the positioning code as (u, v) and establishing the homogeneous matrix equation
s·[u, v, 1]^T = H·[x, y, 1]^T
solving the homography matrix H based on the SVD method, wherein u, v represent the pixel coordinates of the circle center of the positioning code, (x, y) represents the coordinates of the point under the remote two-dimensional code coordinate system, and s is an equivalent distance scale factor;
step S420, through the relationship between the homography matrix and the conversion of the remote two-dimensional code in the camera coordinate system,
s·H = P·E, with P = [[f_x, 0, c_x, 0], [0, f_y, c_y, 0], [0, 0, 1, 0]] and E = [[r_1, r_2, t], [0, 0, 1]]
obtaining the rotation and translation matrices, wherein P is the camera projection matrix, E is the truncated extrinsic matrix, f_x, f_y are respectively the camera focal lengths in the x direction and y direction, (c_x, c_y) are the camera center point coordinates, r_1, r_2 are the first two columns of the rotation matrix, t = (t_x, t_y, t_z)^T respectively represents the position of the center of the remote two-dimensional code under the camera coordinate system, and H is the 3×3 homography projection matrix;
step S430, solving the minimized error function by an iterative optimization method:
min_{R,t} e(R, t) = Σ_{i=1}^{n} ‖ (u_i, v_i) − (x_i′, y_i′) ‖²
to optimize the rotation and translation matrices, where R represents rotation and t represents translation, t_i represents the i-th element in the translation vector, and r_i represents the i-th row in the rotation matrix; wherein (u_i, v_i) respectively represent the coordinates of the circle center of the i-th remote two-dimensional code positioning point projected into the camera normalized coordinate system; P_i represents the circle-center coordinates of all positioning codes; (x_i′, y_i′) respectively represent the coordinates, converted into the camera normalized plane according to the rotation-translation matrix, of the circle-center spatial coordinates of the i-th positioning code, with (x_i′, y_i′, 1)^T ∝ R·P_i + t; and P_i = (X_i, Y_i, Z_i)^T represents the circle-center space coordinates of each positioning code in the remote two-dimensional code coordinate system;
wherein in the solving process,
R^T·R = I, det(R) = 1
is used as a constraint.
2. The object space pose acquisition method based on the remote two-dimensional code according to claim 1, wherein the steps further comprise:
step S500 is to constrain the spatial pose obtained in step S400 according to time filtering and prior pose constraint, wherein the time filtering step comprises the following steps: when the space pose result is judged to have mutation, filtering; the prior pose constraint step comprises the following steps: and when the remote two-dimensional code coordinate system and the camera coordinate system in the space pose result are judged to be non-parallel, filtering.
3. The method for acquiring the object space pose based on the remote two-dimensional code according to claim 1, wherein the step S200 further comprises:
judging whether the major axis and minor axis ratio of each original code accords with a threshold value according to the major axis and minor axis formulas of the elliptic equation, and filtering when the major axis and minor axis ratio of each original code do not accord with the threshold value.
4. The method for obtaining the object space pose based on the remote two-dimensional code according to claim 1, wherein, among the plurality of direction points, the setting distance between at least one main direction point and the center point is different from the setting distance between the other auxiliary direction points and the center point, and the setting distances between each auxiliary direction point and the center point are equal.
5. The method for obtaining the object space pose based on the remote two-dimensional code according to claim 1, wherein the direction point, the center point and the information point are any one of a circle, an ellipse, a circular ring and an elliptical ring, and a reflective layer is arranged on the surface of each of the direction point, the center point and the information point.
6. The method for acquiring the object space pose based on the remote two-dimensional code according to claim 1, wherein the positioning code is formed by arranging a plurality of direction points around a center point in the four directions of east, south, west and north, so as to divide four quadrant areas at the included angles of each pair of adjacent direction points, and the information code is arranged in the corresponding quadrant areas according to a preset code table so as to form a two-dimensional lattice with the positioning code; wherein, among the direction points, the arrangement distance between at least one main direction point and the center point is half of the arrangement distance between the other auxiliary direction points and the center point, and the arrangement distances between each auxiliary direction point and the center point are equal.
7. An object space pose acquisition system based on a remote two-dimensional code, which comprises:
a storage unit, configured to store a program including the steps of the remote two-dimensional code-based object space pose acquisition method according to any one of claims 1 to 6, for the control unit, and for the processing unit to timely retrieve and execute the program;
the control unit is used for controlling the infrared camera to shoot an image to be processed containing the remote two-dimensional code;
the processing unit is used for carrying out binarization processing on the image to be processed and then carrying out edge extraction; then carrying out ellipse fitting detection according to the extracted edge information to obtain an original code and a circle center coordinate thereof; positioning code distribution positions of the remote two-dimensional codes in the original codes are positioned according to the geometric relation of the positioning codes preset by the remote two-dimensional codes, so that information codes in the original codes are screened out, and ID information of the remote two-dimensional codes is obtained according to a code table; when the remote two-dimensional code under the ID is judged to be first appeared, the space pose of the remote two-dimensional code under the camera coordinate system is calculated according to the center coordinates of the positioning code in the image, and the corresponding ID information is stored in the storage unit.
CN202311121761.8A 2023-09-01 2023-09-01 Remote two-dimensional code and object space pose acquisition method and system thereof Active CN116843748B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311121761.8A CN116843748B (en) 2023-09-01 2023-09-01 Remote two-dimensional code and object space pose acquisition method and system thereof

Publications (2)

Publication Number Publication Date
CN116843748A CN116843748A (en) 2023-10-03
CN116843748B true CN116843748B (en) 2023-11-24

Family

ID=88172922

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311121761.8A Active CN116843748B (en) 2023-09-01 2023-09-01 Remote two-dimensional code and object space pose acquisition method and system thereof

Country Status (1)

Country Link
CN (1) CN116843748B (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002090118A (en) * 2000-09-19 2002-03-27 Olympus Optical Co Ltd Three-dimensional position and attitude sensing device
CN102735235A (en) * 2012-06-07 2012-10-17 无锡普智联科高新技术有限公司 Indoor mobile robot positioning system and method based on two-dimensional code
KR20140114741A (en) * 2013-03-19 2014-09-29 삼성전자주식회사 Apparatus and method for human pose estimation
CN104808685A (en) * 2015-04-27 2015-07-29 中国科学院长春光学精密机械与物理研究所 Vision auxiliary device and method for automatic landing of unmanned aerial vehicle
CN104833370A (en) * 2014-02-08 2015-08-12 本田技研工业株式会社 System and method for mapping, localization and pose correction
WO2018083510A1 (en) * 2016-11-02 2018-05-11 Precilabs Sa Detector device, positioning code and position detecting method
CN109241807A (en) * 2018-08-17 2019-01-18 湖南大学 A kind of remote two dimensional code localization method
BR102019016252A2 (en) * 2018-08-14 2020-02-18 Canon Kabushiki Kaisha IMAGE PROCESSING DEVICE AND IMAGE PROCESSING METHOD
CN113712665A (en) * 2021-11-01 2021-11-30 北京柏惠维康科技有限公司 Positioning method and device based on positioning marker and computer storage medium
AU2021107375A4 (en) * 2021-08-25 2021-12-16 Total Drain Group Pty Ltd Methods and systems for identifying objects in images
CN115793690A (en) * 2022-12-07 2023-03-14 南方电网电力科技股份有限公司 Indoor inspection method, system and equipment for unmanned aerial vehicle
CN116309829A (en) * 2023-02-28 2023-06-23 无锡赛锐斯医疗器械有限公司 Cuboid scanning body group decoding and pose measuring method based on multi-view vision


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Construction and application of unattended self-service toll lanes on airport expressways; Liu Wei et al.; China Highway (No. 20); full text *


Similar Documents

Publication Publication Date Title
CN111882612B (en) Vehicle multi-scale positioning method based on three-dimensional laser detection lane line
Geiger et al. Automatic camera and range sensor calibration using a single shot
Wöhler 3D computer vision: efficient methods and applications
EP2678824B1 (en) Determining model parameters based on transforming a model of an object
CN109211207B (en) Screw identification and positioning device based on machine vision
Goshtasby Theory and applications of image registration
US8106968B1 (en) System and method for pattern detection and camera calibration
CN115609591B (en) Visual positioning method and system based on 2D Marker and compound robot
CN112396656B (en) Outdoor mobile robot pose estimation method based on fusion of vision and laser radar
Carr et al. Point-less calibration: Camera parameters from gradient-based alignment to edge images
CN110415304B (en) Vision calibration method and system
Su et al. A novel camera calibration method based on multilevel-edge-fitting ellipse-shaped analytical model
CN113643380A (en) Mechanical arm guiding method based on monocular camera vision target positioning
CN116619358A (en) Self-adaptive positioning optimization and mapping method for autonomous mining robot
Xie et al. A4lidartag: Depth-based fiducial marker for extrinsic calibration of solid-state lidar and camera
CN116843748B (en) Remote two-dimensional code and object space pose acquisition method and system thereof
CN115393196B (en) Infrared multi-sequence image seamless splicing method for unmanned aerial vehicle area array swinging
CN116524041A (en) Camera calibration method, device, equipment and medium
Nguyen et al. Calibbd: Extrinsic calibration of the lidar and camera using a bidirectional neural network
CN115112098A (en) Monocular vision one-dimensional two-dimensional measurement method
CN117589145A (en) Map creation method and system based on remote two-dimensional code
Jende et al. Low-level tie feature extraction of mobile mapping data (mls/images) and aerial imagery
CN117109561A (en) Remote two-dimensional code map creation and positioning method and system integrating laser positioning
CN111179347B (en) Positioning method, positioning equipment and storage medium based on regional characteristics
CN114494316A (en) Corner marking method, parameter calibration method, medium, and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant