CN112651261B - Calculation method for conversion relation between high-precision 2D camera coordinate system and mechanical coordinate system - Google Patents


Info

Publication number
CN112651261B
CN112651261B (application number CN202011614733.6A)
Authority
CN
China
Prior art keywords
coordinate system
target
dimensional code
camera coordinate
processed
Prior art date
Legal status (assumption, not a legal conclusion)
Active
Application number
CN202011614733.6A
Other languages
Chinese (zh)
Other versions
CN112651261A (en)
Inventor
郭志红
戴志强
金刚
赵严
姚毅
杨艺
Current Assignee (listed assignees may be inaccurate)
Luster LightTech Co Ltd
Suzhou Luster Vision Intelligent Device Co Ltd
Suzhou Lingyunguang Industrial Intelligent Technology Co Ltd
Original Assignee
Luster LightTech Co Ltd
Suzhou Luster Vision Intelligent Device Co Ltd
Suzhou Lingyunguang Industrial Intelligent Technology Co Ltd
Priority date (assumption, not a legal conclusion)
Filing date
Publication date
Application filed by Luster LightTech Co Ltd, Suzhou Luster Vision Intelligent Device Co Ltd, Suzhou Lingyunguang Industrial Intelligent Technology Co Ltd filed Critical Luster LightTech Co Ltd
Priority to CN202011614733.6A priority Critical patent/CN112651261B/en
Publication of CN112651261A publication Critical patent/CN112651261A/en
Application granted granted Critical
Publication of CN112651261B publication Critical patent/CN112651261B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1439Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K7/1452Methods for optical code recognition including a method step for retrieval of the optical code detecting bar code edges
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1439Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K7/1443Methods for optical code recognition including a method step for retrieval of the optical code locating of the code in an image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Electromagnetism (AREA)
  • General Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Artificial Intelligence (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application provides a method for calculating the conversion relation between a high-precision 2D camera coordinate system and a mechanical coordinate system. Target images to be processed are acquired at different preset positions under the camera coordinate system; the two-dimensional codes in each target image are traversed; when the number of two-dimensional codes is greater than or equal to one, the two-dimensional code closest to the center point of the camera coordinate system is selected as the target two-dimensional code; the information of the target two-dimensional code is extracted; the global target coordinates of all corner points in the target image are determined; the common corner set among the target images at the different preset positions is determined according to the global target coordinate system; the common corner set is screened to determine the final common corner set; and the conversion relation between the camera coordinate system and the mechanical coordinate system is established from the camera coordinates and mechanical coordinates corresponding to the final common corner set. By extracting the final common corners from the target images, a large number of samples is obtained, the accuracy of the conversion relation between the camera coordinate system and the mechanical coordinate system is improved, and the conversion time is shortened.

Description

Calculation method for conversion relation between high-precision 2D camera coordinate system and mechanical coordinate system
Technical Field
The application relates to the field of industrial vision, in particular to a calculation method for a conversion relation between a high-precision 2D camera coordinate system and a mechanical coordinate system.
Background
In the field of industrial vision, camera calibration is an important precondition for detection, measurement, assembly and other processes. Through camera calibration, the internal and external parameters of a camera can be calculated and the relation between the camera coordinate system and the mechanical coordinate system established. When the two coordinate systems are coplanar and both are rectangular coordinate systems, only rotation, scaling and translation relate them. The size, defects, position and the like of a product can then be measured and detected, realizing automated production.
In a typical method, marking points on an actual product are selected as fixed features for camera calibration: a mechanical mechanism grips the actual product containing the fixed features and moves it a certain number of steps (N ≥ 3) within the camera's field of view, and a vision algorithm identifies and locates the fixed features. This yields N groups of image coordinates and the N groups of corresponding mechanical coordinates, from which the relation between the camera coordinate system and the mechanical coordinate system is obtained.
However, the positioning of the fixed features on the actual product, the imaging quality of the product, the capture accuracy of the fixed features and the small number of samples all affect camera calibration; in some cases the product may even move out of the camera's field of view, so that calibration fails. Calibration accuracy and efficiency are therefore reduced, and the resulting relation between the camera coordinate system and the mechanical coordinate system is inaccurate.
Disclosure of Invention
The application provides a calculation method of a conversion relation between a high-precision 2D camera coordinate system and a mechanical coordinate system, which aims to solve the technical problems of camera calibration failure or low precision caused by inaccurate acquisition of an actual product as a target in the camera calibration process.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical scheme:
in a first aspect, a method for calculating a conversion relationship between a high-precision 2D camera coordinate system and a mechanical coordinate system is provided, the method comprising:
acquiring target images to be processed at different preset positions in a camera coordinate system, wherein the preset positions comprise a set of at least three points in the camera coordinate system;
traversing the two-dimensional code in the target image to be processed;
when the number of the two-dimensional codes is greater than or equal to one, selecting the two-dimensional code closest to the center point of the camera coordinate system as a target two-dimensional code;
extracting information of the target two-dimensional code, wherein the information of the target two-dimensional code comprises position information and text information;
according to the information of the target two-dimensional code, determining global target coordinates of all corner points in the target image to be processed;
determining a common corner set in the target images to be processed at different preset positions according to the global target coordinate system;
screening the common corner sets to determine final common corner sets;
and establishing a conversion relation between the camera coordinate system and the mechanical coordinate system according to the camera coordinate and the mechanical coordinate corresponding to the final common corner set.
In one possible manner, the information of the target two-dimensional code includes position information and text information;
the position information comprises coordinates of the center of the two-dimensional code under the camera coordinate system, whether the two-dimensional code is mirrored or not and the angle of the two-dimensional code;
the text information is the content of the two-dimensional code and comprises the size of a black and white grid in the target image to be processed and global target coordinates corresponding to an origin point at the origin point of the two-dimensional code.
In one possible manner, the at least three points of the preset position under the camera coordinate system include a center point located in the camera coordinate system;
the target image to be processed acquired at the center point of the camera coordinate system is the center target image to be processed.
in one possible manner, the screening the common corner set to determine a final common corner set includes:
dividing the target image to be processed in the center according to a preset area;
determining the contribution number M of the common corner points in the corresponding preset areas according to the preset contribution coefficient of each preset area and the number of the common corner points in each preset area;
acquiring extraction errors of all the common angular points in each preset area;
determining a common corner sequence in each preset area according to the sequence from small extraction errors to large extraction errors;
for each preset region, determining the first M common corner points from the common corner point sequence as final common corner points according to the preset contribution number M;
and determining a final common corner set, wherein the final common corner set comprises the final common corner of each preset area.
In one possible manner, the extracting the error includes:
for the target images to be processed at the different preset positions, establishing the conversion relation between the global target coordinate system and the camera coordinate system according to the global target coordinates of all the corner points;
determining actual target coordinates according to that conversion relation and the camera coordinates;
and determining the extraction error as the sum of the squares of the differences between the global target coordinates and the actual target coordinates.
In one possible manner, the contribution number M is determined by multiplying the preset contribution coefficient of each region by the number of common corner points of that region and rounding the result up to the nearest integer.
In a second aspect, a checkerboard target is provided, which is applied to the calculation method of the conversion relation between the high-precision 2D camera coordinate system and the mechanical coordinate system in the first aspect;
the checkerboard targets are provided with black and white grids with the same size, the black and white grids are arranged in a mutual mode, each preset number of black and white grids are provided with a two-dimensional code at each interval, and the content of the two-dimensional code comprises the overall size of the checkerboard targets, the size of the black and white grids and global target coordinates of origin points corresponding to origin points of the two-dimensional codes.
The application provides a method for calculating the conversion relation between a high-precision 2D camera coordinate system and a mechanical coordinate system, comprising: acquiring target images to be processed at different preset positions under the camera coordinate system, the preset positions comprising a set of at least three points under the camera coordinate system; traversing the two-dimensional codes in the target images; when the number of two-dimensional codes is greater than or equal to one, selecting the two-dimensional code closest to the center point of the camera coordinate system as the target two-dimensional code; extracting the information of the target two-dimensional code, which comprises position information and text information; determining, according to the information of the target two-dimensional code, the global target coordinates of all corner points in the target image; determining the common corner set among the target images at the different preset positions according to the global target coordinate system; screening the common corner set to determine the final common corner set; and establishing the conversion relation between the camera coordinate system and the mechanical coordinate system from the camera coordinates and mechanical coordinates corresponding to the final common corner set. By extracting the final common corners from the target images, a large number of samples is obtained, the accuracy of the conversion relation between the camera coordinate system and the mechanical coordinate system is improved, and the conversion time is shortened.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of the conversion relationship between the co-planar camera coordinate system and the mechanical coordinate system of the present application;
FIG. 2 is a schematic diagram of a checkerboard target according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an origin point and an origin point of a two-dimensional code according to an embodiment of the present application;
FIG. 4 is a flowchart of a method for calculating a conversion relationship between a high-precision 2D camera coordinate system and a mechanical coordinate system according to an embodiment of the present application;
FIG. 5 is a flowchart of a method for calculating a conversion relationship between a high-precision 2D camera coordinate system and a mechanical coordinate system according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a camera coordinate system and a global target coordinate system according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a conventional 9-point walk at a preset position and a path thereof in an embodiment of the present application;
FIG. 8 is a schematic diagram of dividing preset areas and contribution coefficients of a target image to be processed in the center according to the embodiment of the application;
wherein: 1-a two-dimensional code origin; 2-origin corner points.
Detailed Description
In order that those skilled in the art will better understand the present application, a technical solution in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in which it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, shall fall within the scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Mechanical mechanisms are widely used for automated production in the field of industrial vision, for example in the assembly of electronic products, component pick-up and precision welding. To achieve automated production, precise guidance by a camera is usually required: the camera coordinates of fixed features are acquired by the camera and then converted into values in the mechanical coordinate system, enabling high-precision automated production. Converting camera coordinates into mechanical coordinates requires establishing the relation between the camera coordinate system and the mechanical coordinate system, a process called camera calibration. The camera coordinate system has its origin at the upper-left corner of the image acquired by the camera, with the X axis horizontal to the right and the Y axis vertical downward; the mechanical coordinate system has its origin at the rotation center of the mechanical end-effector, with perpendicular X and Y axes whose directions are determined by the manufacturer of the mechanical mechanism.
In the camera calibration process, when the camera coordinate system and the mechanical coordinate system are coplanar and both are rectangular coordinate systems, as shown in FIG. 1, only rotation, scaling and translation relate the two coordinate systems. The conversion relation between the camera coordinate system and the mechanical coordinate system is as follows:
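The formula referenced here was rendered as an image in the original publication and did not survive extraction. A reconstruction consistent with the rotation, scaling and translation model and with the symbol definitions given below is offered as an assumption, not the verbatim filed formula:

```latex
\begin{aligned}
P_x &= S_x\,(I_x\cos\theta - E_y\,I_y\sin\theta) + T_x \\
P_y &= S_y\,(I_x\sin\theta + E_y\,I_y\cos\theta) + T_y
\end{aligned}
\tag{1}
```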
in the formula (P) x ,R y ) Is a mechanical coordinate representing the position of the fixed feature in the mechanical coordinate system; (I) x ,I y ) Is camera coordinates representing the position of the fixed feature under the camera coordinate system; (T) x ,T y ) Is the origin of the camera coordinate system and the origin of the mechanical coordinate systemOffset of the point; θ is the angle between the camera coordinate system X-axis and the mechanical coordinate system X-axis; (S) x ,S y ) The actual distance of a pixel in the camera coordinate system in the X and Y directions in the mechanical coordinate system is expressed in pixel/mm; e (E) y Representing the type of the mechanical coordinate system, E when the manipulator is a left-hand coordinate system y A value of 1, E when the manipulator is a right-hand coordinate system y The value is-1.
Rearranging formula (1) of the conversion relation gives the following formula:
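The rearranged formula was likewise an image in the original. Assuming the conversion is the rotation, scaling and translation model described in the surrounding text, one plausible rearrangement into a form linear in six unknowns is the following sketch (the symbols a, b, c, d are introduced here for illustration and are not from the patent):

```latex
\begin{aligned}
P_x &= a\,I_x + b\,I_y + T_x, \qquad P_y = c\,I_x + d\,I_y + T_y,\\
a &= S_x\cos\theta,\quad b = -E_y S_x\sin\theta,\quad
c = S_y\sin\theta,\quad d = E_y S_y\cos\theta .
\end{aligned}
\tag{2}
```

From fitted a, b, c, d one could recover θ = atan2(c, a), S_x = sqrt(a² + b²) and S_y = sqrt(c² + d²), with the sign of E_y given by the handedness of the fitted matrix; this is a hedged reconstruction, not necessarily the patent's exact formula (2).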
by 3 sets of corresponding mechanical coordinates (P x ,P y ) And camera coordinates (I) x ,I y ) Obtaining E in the above formula (2) y 、θ、S x 、S y 、T x 、T y I.e. at least 3 sets of camera coordinates and corresponding mechanical coordinates are acquired.
The application is described in further detail below with reference to the attached drawing figures:
the embodiment of the application provides a checkerboard target, which is applied to a calculation method of a conversion relation between a high-precision 2D camera coordinate system and a mechanical coordinate system in the first aspect, and is shown by referring to FIG. 2, wherein the checkerboard target is provided with black and white grids with the same size, the black and white grids are arranged in a mutual mode, a two-dimensional code is arranged in each black and white grid with a preset number at each interval, and the content of the two-dimensional code comprises the overall size of the checkerboard target, the size of the black and white grid and the global target coordinate of an origin point corresponding to the two-dimensional code; as shown in fig. 3, the two-dimensional code origin 1 refers to the intersection point of two sides of the L sides of the two-dimensional code; . The origin point 2 refers to the nearest target point from the origin point of the two-dimension code.
The embodiment of the application also provides a method for calculating the conversion relation between the high-precision 2D camera coordinate system and the mechanical coordinate system, as shown in fig. 4 and 5, the method comprises the following steps:
s100, acquiring target images to be processed at different preset positions in a camera coordinate system, wherein the preset positions comprise a set of at least three points in the camera coordinate system; the at least three points of the preset position under the camera coordinate system comprise a center point positioned in the camera coordinate system.
And taking the target image to be processed acquired at the center point of the camera coordinate system as a center target image to be processed.
And S200, traversing the two-dimensional code in the target image to be processed.
And S300, when the number of the two-dimensional codes is greater than or equal to one, selecting the two-dimensional code closest to the center point of the camera coordinate system as a target two-dimensional code.
S400, extracting information of the target two-dimensional code, wherein the information includes position information and text information. The position information includes the coordinates of the center of the two-dimensional code in the camera coordinate system, whether the two-dimensional code is mirrored, and the angle of the two-dimensional code. The text information is the content of the two-dimensional code, including the size of the black and white grids in the target image to be processed and the global target coordinates of the origin corner point 2 corresponding to the two-dimensional code origin.
S500, determining global target coordinates of all corner points in the target image to be processed according to the information of the target two-dimensional code; and according to the global target coordinates of all the corner points, establishing a conversion relation between a global target coordinate system and the camera coordinate system.
As shown in fig. 6, suppose the global target coordinates of the origin corner point P0 are (100, 100) and the actual physical size of a black or white grid is 1 × 1 mm. The axis directions of the global target coordinate system are determined from the information of the target two-dimensional code: the X-axis direction of the global target coordinate system coincides with the horizontal side of the L side of the two-dimensional code, and the Y-axis direction coincides with its vertical side. The target coordinates of corner point P1 are then calculated as (98, 100), and those of corner point P2 as (100, 98). In the same way, the global target coordinates of all checkerboard target corner points in the image can be obtained.
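The arithmetic of this worked example can be sketched as follows. The function name and the grid-index parameterization are assumptions for illustration (FIG. 6 is not available here); the sketch assumes corners lie on a regular grid, with P1 two grid steps from P0 along -X and P2 two steps along -Y, consistent with the numbers in the text:

```python
def corner_global_coords(origin_xy, grid_mm, i, j):
    """Global target coordinate of the corner i grid steps along X and
    j grid steps along Y from the origin corner point.

    Axis directions follow the L side of the two-dimensional code, as
    described in the text; i and j may be negative.
    """
    x0, y0 = origin_xy
    return (x0 + i * grid_mm, y0 + j * grid_mm)
```

With origin (100, 100) and a 1 mm grid, two steps along -X gives (98, 100) and two steps along -Y gives (100, 98), matching P1 and P2 above.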
After the global target coordinates of all checkerboard corner points are calculated, the mechanical mechanism drives the checkerboard target to the next preset position, and the above steps are repeated until the global target coordinates of all corner points in the target images at all preset positions are determined. According to formula (2), the number of preset positions must be at least 3, and the preset positions must not all lie on one straight line. In practice, a common choice is a set of 9 points; the walking path is shown in fig. 7.
And S600, determining the common corner set among the target images to be processed at the different preset positions according to the global target coordinate system. Taking the preset positions shown in fig. 7 as an example: after the walk through the 9 points numbered 0 to 8 is completed, a certain amount of checkerboard corner data has been collected at each point, and the common corner points are extracted according to the global target coordinates in the 9 groups of data.
And S700, screening the common corner point set to determine a final common corner point set. The method specifically comprises the following steps:
dividing the center target image to be processed into preset areas; determining the contribution number M of common corner points for each preset area by multiplying the preset contribution coefficient of the area by the number of common corner points in the area and rounding up to the nearest integer; acquiring the extraction errors of all common corner points in each preset area; ordering the common corner points in each preset area from small to large extraction error; for each preset area, taking the first M common corner points of the sequence as final common corner points; and forming the final common corner set from the final common corner points of all preset areas.
As shown in fig. 8, the center target image to be processed is divided into 9 preset rectangular areas, each with the preset contribution coefficient shown in fig. 8. Taking the upper-left preset area as an example: its contribution coefficient is 0.1 and it contains 6 common corner points, so the contribution number is M = ceil(6 × 0.1) = 1, and the final common corner point of this area is the one of the 6 with the smallest extraction error.
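The screening step can be sketched in Python. The function name and the input layout (a list of regions, each with its contribution coefficient and its corners' extraction errors) are assumptions for illustration:

```python
import math

def screen_corners(regions):
    """Select the final common corner set.

    regions: list of (coeff, corners) where corners is a list of
    (corner_id, extraction_error) pairs for one preset area.
    For each area, M = ceil(coeff * number_of_corners) corners with
    the smallest extraction error are kept.
    """
    final = []
    for coeff, corners in regions:
        m = math.ceil(coeff * len(corners))          # contribution number M
        ranked = sorted(corners, key=lambda c: c[1])  # smallest error first
        final.extend(cid for cid, _ in ranked[:m])
    return final
```

For the upper-left area of the worked example (coefficient 0.1, 6 corners), M = ceil(0.6) = 1 and only the lowest-error corner survives, as stated in the text.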
The extraction error is obtained as follows: for the target images to be processed at the different preset positions, the conversion relation between the global target coordinate system and the camera coordinate system is established from the global target coordinates of all corner points; actual target coordinates are determined from this conversion relation and the camera coordinates; and the extraction error is the sum of the squares of the differences between the global target coordinates and the actual target coordinates.
S800, according to the camera coordinates and the mechanical coordinates corresponding to the final common corner set, establishing a conversion relation between the camera coordinate system and the mechanical coordinate system.
Let the number of corner points in the final common corner set be N. The camera coordinate values of the final common corner set are substituted into formula (2); the mechanical coordinates of each position are known, since the mechanical coordinate value of the current position is recorded at each point of the walk shown in fig. 7. This yields 2N equations, and solving them gives the final conversion relation from the camera coordinate system to the mechanical coordinate system, completing the camera calibration.
The above merely illustrates the technical idea of the present application and does not limit its protection scope; any modification made on the basis of the technical scheme in accordance with the technical idea of the present application falls within the protection scope of the claims of the present application.
Furthermore, the order in which elements and steps are presented, and the use of numbers, letters, or other designations in this application, are not intended to limit the sequence of the processes and methods unless specifically recited in the claims. While certain embodiments presently considered useful have been discussed in the foregoing disclosure by way of various examples, it is to be understood that such details are for the purpose of illustration only, and that the appended claims are not limited to the disclosed embodiments but are intended to cover all modifications and equivalent combinations that fall within the spirit and scope of the embodiments of the present application. For example, while the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be appreciated that, in order to simplify the present disclosure and thereby facilitate an understanding of one or more embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure, however, is not to be interpreted as implying that the claimed subject matter requires more features than are recited in each claim. Indeed, claimed subject matter may lie in less than all features of a single disclosed embodiment.
Each patent, patent application publication, and other material, such as articles, books, specifications, publications, and documents, cited herein is hereby incorporated by reference in its entirety, excepting any application history file that is inconsistent or conflicting with this disclosure, and any file (now or later attached to this disclosure) that would limit the broadest scope of the claims of this disclosure. It is noted that, if there is any discrepancy or conflict between the description, definition, and/or use of a term in the materials accompanying this application and that set forth herein, the description, definition, and/or use of the term in this application controls.

Claims (4)

1. A method for calculating a conversion relationship between a high-precision 2D camera coordinate system and a mechanical coordinate system, the method comprising:
acquiring target images to be processed at different preset positions in a camera coordinate system, wherein the preset positions comprise a set of at least three points in the camera coordinate system, the at least three points including the center point of the camera coordinate system; the target image to be processed acquired at the center point of the camera coordinate system is the center target image to be processed;
traversing the two-dimensional code in the target image to be processed;
when the number of the two-dimensional codes is greater than or equal to one, selecting the two-dimensional code closest to the center point of the camera coordinate system as a target two-dimensional code;
extracting information of the target two-dimensional code, wherein the information of the target two-dimensional code comprises position information and text information;
according to the information of the target two-dimensional code, determining global target coordinates of all corner points in the target image to be processed;
determining a common corner set in the target images to be processed at different preset positions according to a global target coordinate system;
screening the common corner set to determine a final common corner set, including: dividing the center target image to be processed into preset areas; determining the contribution number M of common corner points for each preset area according to the preset contribution coefficient of that area and the number of common corner points in it; acquiring the extraction errors of all the common corner points in each preset area; ordering the common corner points in each preset area from smallest to largest extraction error; for each preset area, taking, according to the contribution number M, the first M common corner points from the ordered sequence as final common corner points; and determining the final common corner set, which comprises the final common corner points of every preset area; wherein the extraction error is obtained by: establishing, for the target images to be processed at the different preset positions, a conversion relation between the global target coordinate system and the camera coordinate system according to the global target coordinates of all the corner points; determining actual target coordinates according to that conversion relation; and determining the extraction error as the sum of squares of the differences between the global target coordinates and the actual target coordinates;
and establishing a conversion relation between the camera coordinate system and the mechanical coordinate system according to the camera coordinate and the mechanical coordinate corresponding to the final common corner set.
2. The method for calculating the conversion relation between the high-precision 2D camera coordinate system and the mechanical coordinate system according to claim 1, wherein the information of the target two-dimensional code comprises position information and text information;
the position information comprises coordinates of the center of the two-dimensional code under the camera coordinate system, whether the two-dimensional code is mirrored or not and the angle of the two-dimensional code;
the text information is the content of the two-dimensional code, and comprises the size of the black and white grids in the target image to be processed and the global target coordinates corresponding to the origin of the two-dimensional code.
3. The method for calculating the conversion relation between the high-precision 2D camera coordinate system and the mechanical coordinate system according to claim 1, wherein the contribution number M is determined by multiplying the preset contribution coefficient of each preset area by the number of common corner points in that area and rounding the result up to an integer.
4. A checkerboard target, applied to the calculation method of the conversion relation between the high-precision 2D camera coordinate system and the mechanical coordinate system according to any one of claims 1 to 3, characterized in that:
the checkerboard targets are provided with black and white grids with the same size, the black and white grids are arranged in a mutual mode, each preset number of black and white grids are provided with a two-dimensional code at each interval, and the content of the two-dimensional code comprises the overall size of the checkerboard targets, the size of the black and white grids and global target coordinates of origin points corresponding to origin points of the two-dimensional codes.
CN202011614733.6A 2020-12-30 2020-12-30 Calculation method for conversion relation between high-precision 2D camera coordinate system and mechanical coordinate system Active CN112651261B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011614733.6A CN112651261B (en) 2020-12-30 2020-12-30 Calculation method for conversion relation between high-precision 2D camera coordinate system and mechanical coordinate system

Publications (2)

Publication Number Publication Date
CN112651261A CN112651261A (en) 2021-04-13
CN112651261B true CN112651261B (en) 2023-11-10

Family

ID=75364378

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011614733.6A Active CN112651261B (en) 2020-12-30 2020-12-30 Calculation method for conversion relation between high-precision 2D camera coordinate system and mechanical coordinate system

Country Status (1)

Country Link
CN (1) CN112651261B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113592803B (en) * 2021-07-27 2024-05-24 凌云光技术股份有限公司 Screw thread turn number measuring method and device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017161608A1 (en) * 2016-03-21 2017-09-28 完美幻境(北京)科技有限公司 Geometric calibration processing method and device for camera
CN108765494A (en) * 2018-05-21 2018-11-06 南昌航空大学 A kind of polyphaser scaling method for demarcating object based on cylinder
KR20190051463A (en) * 2017-11-07 2019-05-15 현대모비스 주식회사 Apparatus and method for detecting checkerboard corner point for camera calibration
CN110148187A (en) * 2019-06-04 2019-08-20 郑州大学 A kind of the high-precision hand and eye calibrating method and system of SCARA manipulator Eye-in-Hand
CN110163912A (en) * 2019-04-29 2019-08-23 达泊(东莞)智能科技有限公司 Two dimensional code pose scaling method, apparatus and system
CN110415304A (en) * 2019-07-31 2019-11-05 北京博视智动技术有限公司 A kind of vision calibration method and system
CN111699513A (en) * 2018-07-16 2020-09-22 深圳配天智能技术研究院有限公司 Calibration plate, internal parameter calibration method, machine vision system and storage device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9940717B2 (en) * 2014-12-23 2018-04-10 Intel Corporation Method and system of geometric camera self-calibration quality assessment
JP6541590B2 (en) * 2016-02-03 2019-07-10 クラリオン株式会社 Camera calibration device


Similar Documents

Publication Publication Date Title
CN110689579B (en) Rapid monocular vision pose measurement method and measurement system based on cooperative target
CN107431788B (en) Method and system for image-based tray alignment and tube slot positioning in a vision system
CN111263142B (en) Method, device, equipment and medium for testing optical anti-shake of camera module
CN112276936B (en) Three-dimensional data generating device and robot control system
JP5432835B2 (en) How to calibrate the camera
CN110611770B (en) Method and system for judging whether line frequency of linear array camera is matched with object motion speed
KR20080012764A (en) Screen printer and image sensor position alignment method
WO2014045508A1 (en) Inspection device, inspection method, and inspection program
TWI628415B (en) Positioning and measuring system based on image scale
CN108472706B (en) Deformation processing support system and deformation processing support method
CN111829439B (en) High-precision translation measuring method and device
CN111707187A (en) Measuring method and system for large part
CN112651261B (en) Calculation method for conversion relation between high-precision 2D camera coordinate system and mechanical coordinate system
CN114502913A (en) Correction parameter calculation method and device, and displacement calculation method and device
JP3696336B2 (en) How to calibrate the camera
KR102023087B1 (en) Method for camera calibration
CN109631757B (en) Grating scale calibration method and device and visual detection device
CN115375681B (en) Large-size target measuring method based on image splicing
KR101626374B1 (en) Precision position alignment technique using edge based corner estimation
US20240085448A1 (en) Speed measurement method and apparatus based on multiple cameras
CN114485479B (en) Structured light scanning and measuring method and system based on binocular camera and inertial navigation
CN115760811A (en) Monocular vision and feature marker based method for measuring 6D pose of workpiece
US11418771B1 (en) Method for calibrating 3D camera by employing calibrated 2D camera
CN112991372B (en) 2D-3D camera external parameter calibration method based on polygon matching
CN114945450B (en) Robot system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant