CN116205988A - Coordinate processing method, coordinate processing device, computer equipment and computer readable storage medium - Google Patents


Info

Publication number
CN116205988A
Authority
CN
China
Prior art keywords
calibration
image
coordinates
coordinate
reference coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211732126.9A
Other languages
Chinese (zh)
Inventor
李弟
安登奎
戴志强
Current Assignee
Luster LightTech Co Ltd
Original Assignee
Luster LightTech Co Ltd
Priority date
Filing date
Publication date
Application filed by Luster LightTech Co Ltd
Priority to CN202211732126.9A
Publication of CN116205988A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/10: Segmentation; Edge detection
    • G06T 7/11: Region-based segmentation
    • G06T 7/70: Determining position or orientation of objects or cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a coordinate processing method, a coordinate processing apparatus, a computer device, and a computer-readable storage medium. The coordinate processing method comprises the following steps: controlling a camera to acquire a calibration image, wherein the calibration image comprises marks formed by a target device according to device coordinates and arranged in an array; dividing the calibration image into a plurality of calibration areas; processing each calibration area to obtain the feature points of the marks in each calibration area; determining the image coordinates of the marks according to the feature points of the marks in each calibration area; and determining the conversion relation between the device coordinates and the image coordinates of the marks in each calibration area, wherein each calibration area includes reference coordinates. In the technical scheme of the invention, the calibration image is calibrated over multiple regions, so that a plurality of reference coordinates can be determined and the error of subsequent coordinate conversion is reduced.

Description

Coordinate processing method, coordinate processing device, computer equipment and computer readable storage medium
Technical Field
The present invention relates to the field of camera calibration technology, and more particularly, to a coordinate processing method, a coordinate processing apparatus, a computer device, and a computer-readable storage medium.
Background
In the field of industrial vision, camera calibration is one of the important preconditions for detection, measurement, assembly, and other processes. Through camera calibration, the internal and external parameters of the camera can be calculated and the relation between the camera coordinate system and the world coordinate system established, so that the size, defects, position, and other properties of a product can be measured and detected, enabling automated production. In the related art, the camera and the target device perform coordinate conversion according to reference coordinates, and the limitation of those reference coordinates causes a large error in the conversion result.
Disclosure of Invention
The embodiment of the invention provides a coordinate processing method, a coordinate processing device, a computer device and a computer readable storage medium.
The invention provides a coordinate processing method, which comprises the following steps: controlling a camera to acquire a calibration image, wherein the calibration image comprises marks formed by a target device according to device coordinates and arranged in an array; dividing the calibration image into a plurality of calibration areas; processing each calibration area to obtain the feature points of the marks in each calibration area; determining the image coordinates of the marks according to the feature points of the marks in each calibration area; and determining the conversion relation between the device coordinates and the image coordinates of the marks in each calibration area, wherein each of the calibration areas includes reference coordinates.
In some embodiments, the target device comprises a laser, and the coordinate processing method further comprises: controlling the laser to engrave the marks on a calibration sheet fixed in position. The controlling of the camera to acquire the calibration image comprises: controlling the camera to photograph the calibration sheet to obtain the calibration image.
In some embodiments, the dividing of the calibration image into a plurality of calibration areas includes: determining a division area size according to the image size of the calibration image and a preset division number; and dividing the calibration image into the preset number of calibration areas according to the division area size.
In some embodiments, the dividing of the calibration image into a plurality of calibration areas further comprises: determining the preset division number according to input information.
In some embodiments, the coordinate processing method further comprises: controlling the camera to acquire a current image, wherein the current image comprises a point position to be converted; determining the image coordinates of the point to be converted; determining reference coordinates corresponding to the image coordinates of the point to be converted from the reference coordinates of the calibration areas to serve as target reference coordinates; and converting the image coordinates of the point to be converted into the equipment coordinates of the point to be converted according to the conversion relation between the target reference coordinates and the corresponding calibration area.
In some embodiments, the reference coordinates of each calibration area include image reference coordinates and corresponding device reference coordinates, and the determining, from the reference coordinates of a plurality of calibration areas, the reference coordinates corresponding to the image coordinates of the point to be converted as target reference coordinates includes:
determining, as the target reference coordinates, the reference coordinates of the calibration area whose image reference coordinates are closest to the image coordinates of the point to be converted.
In some embodiments, the reference coordinates of each calibration area include an image reference coordinate and a corresponding device reference coordinate, the conversion relationship of each calibration area includes a calibration result parameter, and the converting the image coordinates of the point to be converted into the device coordinates of the point to be converted according to the conversion relationship of the target reference coordinate and the corresponding calibration area includes: determining an image coordinate difference according to the image reference coordinates of the target reference coordinates and the image coordinates of the point to be converted; determining equipment coordinate differences according to the image coordinate differences and the calibration result parameters of the corresponding calibration areas; and determining the equipment coordinates of the point to be converted according to the equipment coordinate difference and the equipment reference coordinates of the target reference coordinates.
The present invention provides a coordinate processing apparatus comprising a first acquisition module, an image division module, a processing module, a first determination module, and a second determination module. The first acquisition module is used for controlling a camera to acquire a calibration image, wherein the calibration image comprises marks formed by a target device according to device coordinates and arranged in an array; the image division module is used for dividing the calibration image into a plurality of calibration areas; the processing module is used for processing each calibration area to obtain the feature points of the marks in each calibration area; the first determination module is used for determining the image coordinates of the marks according to the feature points of the marks in each calibration area; and the second determination module is used for determining the conversion relation between the device coordinates and the image coordinates of the marks in each calibration area, wherein each of the calibration areas includes reference coordinates.
An embodiment of the present invention provides a computer device, which includes one or more processors and a memory storing a computer program; when the computer program is executed by the processor, the steps of the coordinate processing method of any of the foregoing embodiments are implemented.
An embodiment of the present invention provides a computer-readable storage medium having stored thereon a computer program that, when executed by a processor, implements the steps of the coordinate processing method of any of the embodiments described above.
In the coordinate processing method, the coordinate processing device, the computer equipment and the computer readable storage medium, the calibration image is subjected to multi-region calibration, so that a plurality of reference coordinates can be determined, and errors of subsequent coordinate conversion are reduced conveniently.
Additional aspects and advantages of embodiments of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a flow chart of a coordinate processing method according to some embodiments of the present invention;
FIG. 2 is a schematic diagram of a coordinate processing device according to some embodiments of the present invention;
FIG. 3 is a schematic diagram of a computer device in accordance with certain embodiments of the invention;
FIG. 4 is a flow chart of a coordinate processing method according to some embodiments of the invention;
FIG. 5 is a schematic diagram of a coordinate processing device according to some embodiments of the present invention;
FIG. 6 is a flow chart of a coordinate processing method according to some embodiments of the invention;
FIG. 7 is a flow chart of a coordinate processing method according to some embodiments of the present invention;
FIG. 8 is a schematic diagram of a calibration image according to some embodiments of the invention;
FIG. 9 is a flow chart of a coordinate processing method according to some embodiments of the invention;
FIG. 10 is a flow chart of a coordinate processing method according to some embodiments of the invention;
FIG. 11 is a schematic diagram of a coordinate processing device according to some embodiments of the present invention;
FIG. 12 is a flow chart of a coordinate processing method according to some embodiments of the invention;
FIG. 13 is a flow chart of a coordinate processing method according to some embodiments of the invention.
Detailed Description
Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are exemplary only for explaining the present invention and are not to be construed as limiting the present invention.
The following disclosure provides many different embodiments, or examples, for implementing different structures of embodiments of the invention. In order to simplify the disclosure of embodiments of the present invention, components and arrangements of specific examples are described below. They are, of course, merely examples and are not intended to limit the invention.
In the field of industrial vision, camera calibration is one of the important preconditions for performing detection, measurement, assembly and other processes. Through camera calibration, the internal and external parameters of the camera can be calculated, and the relation between the camera coordinate system and the world coordinate system is established, so that the size, the defect, the position and the like of the product are measured and detected, and automatic production is realized.
In the related art, camera calibration is the nine-point ("nine-circle") method: 3×3 Mark points are engraved on a calibration plate by a laser, and the conversion relation from image coordinates to device coordinates is calculated from the 9 points. After calibration is completed, the device coordinate P(x, y) can be calculated with the following formulas:

P(x, y) = (Tx, Ty) + A · I(x, y)    (1)
P(sx, sy) = (Tx, Ty) + A · I(sx, sy)    (2)

where A is the 2×2 calibration result parameter matrix, (Tx, Ty) are the device coordinates of the origin of the image coordinate system, representing the translation parameters, P(sx, sy) and I(sx, sy) are the laser coordinates and image coordinates that participate in calibration (the reference coordinates), and I(x, y) is the image coordinate to be converted into laser coordinates. Subtracting formula (2) from formula (1) eliminates the translation parameters and yields the laser coordinate P(x, y):

P(x, y) = P(sx, sy) + A · (I(x, y) - I(sx, sy))    (3)

In summary, the error in calculating the device coordinates depends on the reference coordinates (P(sx, sy), I(sx, sy)) in the formulas above. With single-region calibration there is only one reference coordinate, so the error is large when converting coordinates far outside its area. The camera and the target device perform coordinate conversion according to the reference coordinates, and this limitation of the reference coordinates causes a large error in the conversion result.
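The reference-based conversion of formula (3) can be sketched in Python. This is an illustrative sketch, not code from the patent; the matrix A and all coordinate values are assumed for the example.

```python
import numpy as np

def convert_with_reference(A, P_ref, I_ref, I_point):
    """Formula (3): device coordinates from image coordinates via one reference pair.

    A       : 2x2 calibration result parameter matrix (assumed known)
    P_ref   : device (laser) reference coordinate P(sx, sy)
    I_ref   : image reference coordinate I(sx, sy)
    I_point : image coordinate I(x, y) to convert
    """
    A, P_ref, I_ref, I_point = map(np.asarray, (A, P_ref, I_ref, I_point))
    # Subtracting (2) from (1) cancels the translation (Tx, Ty),
    # so only the reference pair and A are needed.
    return P_ref + A @ (I_point - I_ref)

# Illustrative calibration matrix: pure 2x scaling, no rotation.
A = np.array([[2.0, 0.0], [0.0, 2.0]])
print(convert_with_reference(A, (10.0, 10.0), (5.0, 5.0), (6.0, 5.0)))  # [12. 10.]
```

Because the translation drops out, any error in the result comes only from A and from how far the point lies from the single reference coordinate, which is exactly the limitation the multi-region scheme addresses.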
Referring to fig. 1, an embodiment of the present invention provides a coordinate processing method, where the coordinate processing method includes:
0010: controlling the camera to acquire a calibration image, wherein the calibration image comprises marks formed by the target device according to device coordinates and arranged in an array;
0020: dividing the calibration image into a plurality of calibration areas;
0030: processing each calibration area to obtain the feature points of the marks in each calibration area;
0040: determining the image coordinates of the marks according to the feature points of the marks in each calibration area;
0050: determining the conversion relation between the device coordinates and the image coordinates of the marks in each calibration area; wherein each calibration area includes reference coordinates.
Referring to fig. 2, an embodiment of the present invention provides a coordinate processing apparatus 100, where the coordinate processing apparatus 100 includes a first acquisition module 11, an image dividing module 12, a processing module 13, a first determining module 14, and a second determining module 15.
The coordinate processing method of the embodiment of the present invention may be implemented by the coordinate processing apparatus 100 of the embodiment of the present invention. Step 0010 can be performed by the first acquisition module 11, step 0020 by the image division module 12, step 0030 by the processing module 13, step 0040 by the first determination module 14, and step 0050 by the second determination module 15. That is, the first acquisition module 11 may be configured to control the camera to acquire a calibration image, where the calibration image includes marks formed by the target device according to device coordinates and arranged in an array. The image division module 12 may be used to divide the calibration image into a plurality of calibration areas. The processing module 13 may be configured to process each calibration area to obtain the feature points of the marks in each calibration area. The first determination module 14 may be configured to determine the image coordinates of the marks from their feature points in each calibration area. The second determination module 15 may be configured to determine the conversion relation between the device coordinates and the image coordinates of the marks in each calibration area, wherein each calibration area includes reference coordinates.
In the coordinate processing method and the coordinate processing device 100 according to the embodiments of the present invention, the calibration image is calibrated in multiple areas, so that a plurality of reference coordinates can be determined, and errors in subsequent coordinate conversion can be reduced.
Referring to fig. 3, the coordinate processing apparatus 100 may be applied to a computer device 1000. The computer device 1000 may include a smart phone, a tablet computer, a smart watch, a smart bracelet, etc., without specific limitation. The computer device 1000 of the embodiment of the present invention is illustrated by a smart phone, and should not be construed as limiting the present invention.
Step 0010 controls the camera to acquire a calibration image, wherein the calibration image comprises marks formed by the target device according to device coordinates and arranged in an array. Specifically, the target device may be a laser, a welding device, a gluing device, or the like. Before image division is performed, the camera must first be controlled to acquire a calibration image; the calibration image comprises marks formed by the target device according to device coordinates and arranged in an array, and the marks may be circles, crosses, rectangles, or the like.
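The array arrangement of marks can be illustrated with a short Python sketch that generates the device coordinates at which a target device such as a laser might place them. The function name, origin, and pitch are assumptions for illustration, not values from the patent.

```python
def mark_grid(n=3, origin=(0.0, 0.0), pitch=10.0):
    """Device coordinates for an n x n array of marks.

    origin : device coordinate of the first mark (illustrative)
    pitch  : spacing between adjacent marks in device units (illustrative)
    """
    ox, oy = origin
    # Row-major grid: i indexes rows (Y), j indexes columns (X).
    return [(ox + j * pitch, oy + i * pitch) for i in range(n) for j in range(n)]

grid = mark_grid()
print(len(grid))          # 9 marks for the classic 3x3 layout
print(grid[0], grid[-1])  # (0.0, 0.0) (20.0, 20.0)
```

Each generated pair is a device coordinate of one mark; the camera image of these marks is what the later steps divide into calibration areas.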
Thus, preparation can be made for dividing the calibration area subsequently by acquiring the calibration image.
Step 0020 divides the calibration image into a plurality of calibration regions. Specifically, after the calibration image is acquired, the coordinate processing method divides the calibration image into a plurality of calibration areas.
In this way, the calibration image is divided into a plurality of calibration areas, so that the characteristic points of each area can be conveniently processed later.
Step 0030 processes each calibration area to obtain the feature points of the marks in each calibration area. Specifically, after the calibration image has been divided into a plurality of regions, each region is processed to obtain the feature points of the marks in that calibration area.
In this way, obtaining the feature points of the marks in each calibration area prepares for determining the image coordinates in the subsequent step.
Step 0040 determines the image coordinates of the marks from their feature points in each calibration area. Specifically, after the feature points of the marks in each calibration area have been obtained, the image coordinates of the marks are determined from those feature points.
In this way, the determination of the image coordinates of the marks can be accomplished through their feature points in each calibration area.
Step 0050 determines the conversion relation between the device coordinates and the image coordinates of the marks in each calibration area, wherein each calibration area includes reference coordinates. Specifically, after the image coordinates of the marks have been determined, the conversion relation between the device coordinates and the image coordinates of the marks in each calibration area is determined. The conversion relation comprises a calibration result parameter and a translation parameter; the calibration result parameter comprises a rotation parameter, scaling parameters, and a device coordinate system type parameter (the image coordinate system is a left-hand coordinate system, with X to the right and Y downward). Using the reference coordinates to cancel the translation parameter in the later process removes one parameter and improves accuracy. The conversion relation can be expressed as:

Px = Tx + Sx·cosθ·Ix - Ey·Sy·sinθ·Iy
Py = Ty + Sx·sinθ·Ix + Ey·Sy·cosθ·Iy

where (Px, Py) are the device coordinates, representing the location of the mark point in the device coordinate system; (Ix, Iy) are the image coordinates, representing the position of the mark point in the image coordinate system; (Tx, Ty) are the device coordinates of the origin of the image coordinate system, representing the translation parameters; θ is the angle from the image coordinate system to the device coordinate system, representing the rotation parameter, defined as the angle of the positive X-axis direction of the image coordinate system in the platform coordinate system; (Sx, Sy) are the actual physical distances covered by one pixel of the image coordinate system in the X and Y directions, representing the scaling parameters; and Ey denotes the type of device coordinate system: 1 for a left-hand coordinate system, -1 for a right-hand coordinate system.
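The parameterized conversion can be sketched in Python. The explicit formula below is a reconstruction from the parameter definitions above (the published formula appears only as an image), so treat it as an assumed form rather than the patent's exact expression.

```python
import math

def device_coords(Ix, Iy, Tx, Ty, theta, Sx, Sy, Ey):
    """Forward conversion image -> device under the parameters defined above.

    theta : rotation from image to device coordinate system, in radians
    Ey    : 1 for a left-hand device coordinate system, -1 for right-hand
    """
    Px = Tx + Sx * math.cos(theta) * Ix - Ey * Sy * math.sin(theta) * Iy
    Py = Ty + Sx * math.sin(theta) * Ix + Ey * Sy * math.cos(theta) * Iy
    return Px, Py

# With no rotation, unit scale, and matching handedness, the mapping
# reduces to a pure translation by (Tx, Ty).
print(device_coords(3.0, 4.0, 100.0, 200.0, 0.0, 1.0, 1.0, 1))  # (103.0, 204.0)
```

Folding Sx, Sy, θ, and Ey into a single 2×2 matrix gives the calibration result parameter A used elsewhere in the document, with (Tx, Ty) as the translation that the reference coordinates later cancel.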
In this way, the device coordinates and the conversion relation are determined, so that preparation can be made for the subsequent completion of coordinate conversion.
Referring to fig. 4, in some embodiments, the target device includes a laser, and the coordinate processing method further includes:
0060: controlling the laser to engrave the marks on a calibration sheet fixed in position.
Referring to FIG. 5, in some embodiments, the coordinate processing apparatus 100 further includes a control module 16, and step 0060 can be implemented by the control module 16. That is, the control module 16 may be used to control the laser to engrave the marks on a calibration sheet fixed in position.
Thus, by controlling the laser to engrave the marks on the fixed calibration sheet, preparation can be made for acquiring the calibration image.
Specifically, the laser is controlled to engrave the marks on the fixed calibration sheet before the calibration image is acquired.
Referring to fig. 6, step 0010 (controlling the camera to acquire the calibration image) includes:
0011: controlling the camera to photograph the calibration sheet to obtain the calibration image.
Referring to fig. 2, in some embodiments, step 0011 may be implemented by the first acquisition module 11. That is, the first acquisition module 11 may be used to control the camera to photograph the calibration sheet to obtain the calibration image.
Thus, by controlling the camera to photograph the calibration sheet to obtain the calibration image, preparation can be made for subsequently dividing the calibration areas.
Specifically, the coordinate processing method controls the camera to photograph the calibration sheet to obtain the calibration image before dividing the calibration areas.
Referring to fig. 7 and 8, in certain embodiments, step 0020 (dividing the calibration image into a plurality of calibration regions) comprises:
0021: determining the size of a segmentation area according to the image size of the calibration image and the preset segmentation number;
0022: dividing the calibration image into the preset number of calibration areas according to the division area size.
Referring to fig. 2, in some embodiments, steps 0021 and 0022 may be implemented by the image dividing module 12. That is, the image dividing module 12 may be configured to determine the size of the division area according to the image size of the calibration image and the preset division number; dividing the calibration image into calibration areas with preset dividing quantity according to the size of the dividing areas.
Thus, the division of the calibration area can be completed by determining the size of the division area.
Specifically, after the calibration image has been acquired, the preset division number may be input (for example, 4), the division area size is determined from the image size of the calibration image and the preset division number, and the calibration image is then divided into the preset number of calibration areas according to the division area size.
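The division step can be sketched as follows. This is an illustrative Python sketch assuming the preset number has an integer square root (e.g. 4 gives a 2×2 grid), as in the example above; the function name and box format are not from the patent.

```python
def divide_into_regions(width, height, n_regions):
    """Split an image of size (width, height) into a square grid of
    n_regions calibration areas; returns (x0, y0, x1, y1) boxes.
    """
    side = int(n_regions ** 0.5)          # e.g. 4 regions -> 2 x 2 grid
    rw, rh = width // side, height // side  # division area size
    return [(c * rw, r * rh, (c + 1) * rw, (r + 1) * rh)
            for r in range(side) for c in range(side)]

regions = divide_into_regions(800, 600, 4)
print(regions)  # four 400x300 boxes
```

Each box then receives its own feature-point processing, reference coordinate, and conversion relation in the later steps.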
Referring to fig. 9, in some embodiments, step 0020 (dividing the calibration image into a plurality of calibration regions) further comprises:
0023: and determining a preset segmentation number according to the input information.
Referring to fig. 2, in some embodiments, step 0023 may be implemented by the image dividing module 12. That is, the image dividing module 12 may be configured to determine the preset number of divisions according to the input information.
Thus, by determining the preset number of divisions, provision can be made for a subsequent determination of the division area size.
Specifically, after the acquisition of the calibration image is completed, a preset division number may be determined according to the input information, and then the size of the division area may be determined according to the size of the calibration image and the preset division number.
Referring to fig. 10, in some embodiments, the coordinate processing method further includes:
0070: controlling a camera to acquire a current image, wherein the current image comprises points to be converted;
0080: determining image coordinates of a point to be converted;
0090: determining reference coordinates corresponding to the image coordinates of the point to be converted from the reference coordinates of the plurality of calibration areas as target reference coordinates;
0100: and converting the image coordinates of the point to be converted into the equipment coordinates of the point to be converted according to the conversion relation between the target reference coordinates and the corresponding calibration area.
Referring to fig. 11, in some embodiments, the coordinate processing apparatus 100 further includes a second obtaining module 17, a third determining module 18, a fourth determining module 19, and a converting module 20, where step 0070 may be implemented by the second obtaining module 17, step 0080 may be implemented by the third determining module 18, step 0090 may be implemented by the fourth determining module 19, and step 0100 may be implemented by the converting module 20. That is, the second obtaining module 17 may be configured to control the camera to obtain the current image, where the current image includes the point to be converted. A third determination module 18 may be used to determine the image coordinates of the point to be converted. The fourth determination module 19 may be configured to determine, as the target reference coordinates, reference coordinates corresponding to the image coordinates of the point to be converted from the reference coordinates of the plurality of calibration areas. The conversion module 20 may be configured to convert the image coordinates of the point to be converted into the device coordinates of the point to be converted according to the conversion relationship between the target reference coordinates and the corresponding calibration area.
Therefore, through the conversion relation between the target reference coordinates and the corresponding calibration area, the image coordinates of the point to be converted can be converted into the device coordinates of the point to be converted.
Specifically, after the conversion relation between the device coordinates and the image coordinates of the marks in each calibration area has been determined, the camera is controlled to acquire a current image containing the point to be converted, and the image coordinates of the point to be converted are determined. The reference coordinates corresponding to those image coordinates are then determined from the reference coordinates of the plurality of calibration areas as the target reference coordinates; because multi-area calibration produces a plurality of references, the error when converting to the device coordinates of the point to be converted is smaller. Finally, the image coordinates of the point to be converted are converted into its device coordinates according to the conversion relation between the target reference coordinates and the corresponding calibration area.
Referring to fig. 12, in some embodiments, the reference coordinates of each calibration area include an image reference coordinate and a corresponding device reference coordinate, step 0090 (determining, from the reference coordinates of the plurality of calibration areas, the reference coordinates corresponding to the image coordinates of the point to be converted as target reference coordinates) includes:
0091: and determining the reference coordinates of the calibration area corresponding to the image reference coordinates with the closest image coordinate distance of the point to be converted as target reference coordinates.
Referring to FIG. 11, in some embodiments, step 0091 may be implemented by the fourth determination module 19. That is, the fourth determination module 19 may be configured to determine, as the target reference coordinates, the reference coordinates of the calibration area whose image reference coordinates are closest to the image coordinates of the point to be converted.
Thus, determining the target reference coordinates prepares for the coordinate conversion in the subsequent steps.
Specifically, after the image coordinates of the point to be converted are determined, reference coordinates corresponding to the image coordinates of the point to be converted may be determined as target reference coordinates from the reference coordinates of the plurality of calibration areas.
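The nearest-reference selection of step 0091 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the dictionary field names (`image_ref`, `device_ref`) and the sample coordinate values are assumptions, and Euclidean distance is used as the "closest" metric.

```python
import math

# Hypothetical data: each calibration area carries an image reference
# coordinate and the matching device reference coordinate (names assumed).
areas = [
    {"image_ref": (100.0, 100.0), "device_ref": (10.0, 10.0)},
    {"image_ref": (300.0, 100.0), "device_ref": (30.0, 10.0)},
    {"image_ref": (100.0, 300.0), "device_ref": (10.0, 30.0)},
]

def pick_target_reference(point_image_xy, areas):
    """Return the calibration area whose image reference coordinate lies
    closest (Euclidean distance) to the image coordinates of the point."""
    return min(areas, key=lambda a: math.dist(a["image_ref"], point_image_xy))

# A point at image coordinates (120, 110) is nearest to the first area.
target = pick_target_reference((120.0, 110.0), areas)
```

The selected area's reference coordinates then serve as the target reference coordinates for the conversion in step 0100.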
Referring to fig. 13, in some embodiments, the reference coordinates of each calibration area include an image reference coordinate and a corresponding device reference coordinate, the conversion relationship of each calibration area includes a calibration result parameter, and step 0100 (converting the image coordinate of the point to be converted into the device coordinate of the point to be converted according to the conversion relationship of the target reference coordinate and the corresponding calibration area) includes:
0101: determining an image coordinate difference according to the image reference coordinates of the target reference coordinates and the image coordinates of the point to be converted;
0102: determining equipment coordinate differences according to the image coordinate differences and the calibration result parameters of the corresponding calibration areas;
0103: and determining the equipment coordinates of the point to be converted according to the equipment coordinate difference and the equipment reference coordinates of the target reference coordinates.
Referring to FIG. 11, in some embodiments, step 0101, step 0102, and step 0103 may be implemented by conversion module 20. That is, the conversion module 20 may be configured to determine the image coordinate difference according to the image reference coordinates of the target reference coordinates and the image coordinates of the point to be converted; determining equipment coordinate differences according to the image coordinate differences and the calibration result parameters of the corresponding calibration areas; and determining the equipment coordinates of the point to be converted according to the equipment coordinate difference and the equipment reference coordinates of the target reference coordinates.
Therefore, the equipment coordinates of the point to be converted can be determined through the equipment coordinate difference and the equipment reference coordinates of the target reference coordinates, and the error of the subsequent coordinate conversion is reduced.
Specifically, after the determination of the target reference coordinates is completed, an image coordinate difference is determined according to the image reference coordinates of the target reference coordinates and the image coordinates of the point to be converted, then an equipment coordinate difference is determined according to the image coordinate difference and the calibration result parameters of the corresponding calibration area, and finally the equipment coordinates of the point to be converted are determined according to the equipment coordinate difference and the equipment reference coordinates of the target reference coordinates.
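Steps 0101 through 0103 can be sketched as below. The patent leaves the form of the calibration result parameters open; here they are assumed, for illustration only, to be a 2x2 linear map from image-pixel offsets to device-coordinate offsets, and all names and values are hypothetical.

```python
def convert_point(image_xy, image_ref, device_ref, params):
    """Steps 0101-0103: image coordinate difference -> device coordinate
    difference -> device coordinates. `params` is assumed to be a 2x2
    matrix ((a11, a12), (a21, a22)) mapping image offsets to device
    offsets; the patent does not fix the parameter form."""
    du = image_xy[0] - image_ref[0]   # 0101: image coordinate difference
    dv = image_xy[1] - image_ref[1]
    (a11, a12), (a21, a22) = params
    dx = a11 * du + a12 * dv          # 0102: device coordinate difference
    dy = a21 * du + a22 * dv
    # 0103: add the device reference coordinates of the target reference
    return (device_ref[0] + dx, device_ref[1] + dy)

# Example with a pure scaling of 0.1 device units per pixel:
pt = convert_point((120.0, 110.0), (100.0, 100.0), (10.0, 10.0),
                   ((0.1, 0.0), (0.0, 0.1)))
# pt is approximately (12.0, 11.0)
```

Working in differences relative to the per-area reference, rather than in absolute coordinates, is what lets the multi-area calibration keep the conversion error small.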
Referring to fig. 3, the coordinate processing method according to the embodiment of the present invention may be implemented by the computer device 1000 according to the embodiment of the present invention. In particular, the computer device 1000 includes one or more processors 200 and memory 300. The memory 300 stores a computer program. When the computer program is executed by the processor 200, the steps of the coordinate processing method according to any of the above embodiments are realized.
For example, in the case where the computer program is executed by the processor 200, the steps of the following coordinate processing method are implemented:
0010: the camera is controlled to acquire a calibration image, wherein the calibration image comprises marks which are formed by target equipment according to equipment coordinates and are arranged in an array;
0020: dividing the calibration image into a plurality of calibration areas;
0030: processing each calibration area to obtain characteristic points marked in each calibration area;
0040: determining the identified image coordinates according to the identified feature points in each calibration area;
0050: determining a conversion relation corresponding to the identified equipment coordinates and the identified image coordinates in each calibration area; wherein each calibration area includes reference coordinates.
The embodiment of the present invention provides a computer-readable storage medium having stored thereon a computer program that, when executed by the processor 200, implements the steps of the coordinate processing method of any of the above embodiments.
For example, in the case where the program is executed by the processor 200, the following steps of the coordinate processing method are implemented:
0010: the camera is controlled to acquire a calibration image, wherein the calibration image comprises marks which are formed by target equipment according to equipment coordinates and are arranged in an array;
0020: dividing the calibration image into a plurality of calibration areas;
0030: processing each calibration area to obtain characteristic points marked in each calibration area;
0040: determining the identified image coordinates according to the identified feature points in each calibration area;
0050: determining a conversion relation corresponding to the identified equipment coordinates and the identified image coordinates in each calibration area; wherein each calibration area includes reference coordinates.
In the coordinate processing method, the coordinate processing apparatus 100, the computer device 1000 and the computer readable storage medium according to the embodiments of the present invention, the calibration image is calibrated in multiple areas, so that a plurality of reference coordinates can be determined, and errors in subsequent coordinate conversion can be reduced.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code that include one or more executable instructions for implementing specific logical functions or steps of the process. Further implementations are included within the scope of the preferred embodiments of the present invention, in which functions may be executed out of the order shown or discussed, including substantially concurrently or in reverse order depending on the functionality involved, as would be understood by those reasonably skilled in the art.
Logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a system that includes a processing module, or another system that can fetch the instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include the following: an electrical connection (electronic device) with one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program is printed, as the program may be electronically captured via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It is to be understood that portions of embodiments of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented using any one of, or a combination of, the following techniques well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, programmable gate arrays (PGAs), field programmable gate arrays (FPGAs), and the like.
Those of ordinary skill in the art will appreciate that all or a portion of the steps carried out in the method of the above-described embodiments may be implemented by a program to instruct related hardware, where the program may be stored in a computer readable storage medium, and where the program, when executed, includes one or a combination of the steps of the method embodiments.
Furthermore, functional units in various embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated modules may be implemented in hardware or in software functional modules. The integrated modules may also be stored in a computer readable storage medium if implemented in the form of software functional modules and sold or used as a stand-alone product.
The above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, or the like.
In the description of the present specification, reference to the term "certain embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment or example is included in at least one embodiment of the present invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics described may be combined in any suitable manner in any one or more embodiments.
While embodiments of the present invention have been shown and described above, it will be understood that the above embodiments are illustrative and are not to be construed as limiting the invention, and that changes, modifications, substitutions, and variations may be made to the above embodiments by those of ordinary skill in the art within the scope of the invention.

Claims (10)

1. A coordinate processing method, characterized in that the coordinate processing method comprises:
controlling a camera to acquire a calibration image, wherein the calibration image comprises marks which are formed by target equipment according to equipment coordinates and are arranged in an array;
dividing the calibration image into a plurality of calibration areas;
processing each calibration area to obtain the marked characteristic points in each calibration area;
determining the image coordinates of the marks according to the characteristic points of the marks in each calibration area;
determining a conversion relation corresponding to the identified equipment coordinates and the identified image coordinates in each calibration area; wherein each of the calibration areas includes reference coordinates.
2. The coordinate processing method according to claim 1, wherein the target device comprises a laser, the coordinate processing method further comprising:
controlling the laser to engrave the marks on a calibration sheet at a fixed position;
the controlling the camera to obtain the calibration image comprises the following steps:
and controlling the camera to shoot the calibration sheet so as to obtain the calibration image.
3. The coordinate processing method according to claim 1, wherein the dividing the calibration image into a plurality of calibration areas includes:
determining the size of a segmentation area according to the image size of the calibration image and the preset segmentation number;
dividing the calibration image into the calibration areas with preset dividing quantity according to the dividing area size.
4. The coordinate processing method according to claim 3, wherein the dividing the calibration image into a plurality of calibration areas further comprises:
and determining the preset segmentation number according to the input information.
5. The coordinate processing method according to claim 1, characterized in that the coordinate processing method further comprises:
controlling the camera to acquire a current image, wherein the current image comprises a point position to be converted;
determining the image coordinates of the point to be converted;
determining reference coordinates corresponding to the image coordinates of the point to be converted from the reference coordinates of the calibration areas to serve as target reference coordinates;
and converting the image coordinates of the point to be converted into the equipment coordinates of the point to be converted according to the conversion relation between the target reference coordinates and the corresponding calibration area.
6. The coordinate processing method according to claim 5, wherein the reference coordinates of each of the calibration areas include image reference coordinates and corresponding device reference coordinates, and the determining, as target reference coordinates, reference coordinates corresponding to the image coordinates of the point to be converted from among the reference coordinates of the plurality of calibration areas includes:
and determining, as the target reference coordinates, the reference coordinates of the calibration area whose image reference coordinates are closest to the image coordinates of the point to be converted.
7. The coordinate processing method according to claim 5, wherein the reference coordinates of each calibration area include image reference coordinates and corresponding device reference coordinates, the conversion relation of each calibration area includes calibration result parameters, and the converting the image coordinates of the point to be converted into the device coordinates of the point to be converted according to the conversion relation of the target reference coordinates and the corresponding calibration area includes:
determining an image coordinate difference according to the image reference coordinates of the target reference coordinates and the image coordinates of the point to be converted;
determining equipment coordinate differences according to the image coordinate differences and the calibration result parameters of the corresponding calibration areas;
and determining the equipment coordinates of the point to be converted according to the equipment coordinate difference and the equipment reference coordinates of the target reference coordinates.
8. A coordinate processing apparatus, characterized in that the coordinate processing apparatus comprises:
the first acquisition module is used for controlling the camera to acquire a calibration image, and the calibration image comprises marks which are formed by target equipment according to equipment coordinates and are arranged in an array;
the image dividing module is used for dividing the calibration image into a plurality of calibration areas;
the processing module is used for processing each calibration area to obtain the marked characteristic points in each calibration area;
the first determining module is used for determining the image coordinates of the marks according to the characteristic points of the marks in each calibration area;
the second determining module is used for determining a conversion relation corresponding to the identified equipment coordinates and the identified image coordinates in each calibration area; wherein each of the calibration areas includes reference coordinates.
9. A computer device comprising one or more processors and a memory storing a computer program which, when executed by the processor, implements the steps of the coordinate processing method of any of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the steps of the coordinate processing method of any one of claims 1 to 7.
CN202211732126.9A 2022-12-30 2022-12-30 Coordinate processing method, coordinate processing device, computer equipment and computer readable storage medium Pending CN116205988A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211732126.9A CN116205988A (en) 2022-12-30 2022-12-30 Coordinate processing method, coordinate processing device, computer equipment and computer readable storage medium


Publications (1)

Publication Number Publication Date
CN116205988A true CN116205988A (en) 2023-06-02

Family

ID=86506920

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211732126.9A Pending CN116205988A (en) 2022-12-30 2022-12-30 Coordinate processing method, coordinate processing device, computer equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN116205988A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination