CN114663500A - Vision calibration method, computer device and storage medium

Info

Publication number: CN114663500A
Application number: CN202210342950.7A
Authority: CN (China)
Prior art keywords: camera, reference point, robot, coordinate, pixel
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 褚金龙, 周云龙, 田魁
Current and original assignee: ADTECH (SHENZHEN) TECHNOLOGY CO LTD
Application filed on 2022-04-02 by ADTECH (SHENZHEN) TECHNOLOGY CO LTD; priority to CN202210342950.7A


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/60: Analysis of geometric attributes
    • G06T 7/66: Analysis of geometric attributes of image moments or centre of gravity
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30108: Industrial image inspection


Abstract

The application provides a visual calibration method, a computer device, and a storage medium. The visual calibration method comprises: acquiring a pixel equivalent value of a camera; calculating the actual distance from any reference point to the camera center according to the pixel equivalent value; acquiring the actual relative distance between the camera center and the execution end of the robot; obtaining a conversion relation between the camera pixel coordinates and the robot execution end from the actual distance between the reference point and the camera center and the distance between the camera center and the robot execution end; and obtaining the position of the reference point in the robot coordinate system according to the conversion relation. In this way, the calibration calculation time is shortened and calibration efficiency is improved.

Description

Vision calibration method, computer device and storage medium
Technical Field
The present application relates to the field of camera calibration technologies, and in particular, to a visual calibration method, a computer device, and a storage medium.
Background
Camera calibration is an important research subject in computer vision and the basis of 3D reconstruction and vision measurement. When high precision is required of a vision system, the camera calibration process is often completed with the help of a target of known geometric dimensions. Camera calibration is also a basic requirement for recovering three-dimensional information from two-dimensional images and an essential step in many vision tasks. Its aim is to determine the internal geometric and optical characteristics of the camera (intrinsic parameters) and the pose of the camera in the three-dimensional world (extrinsic parameters); in practice this mainly means solving for the camera's pixel equivalent and for the affine transformation matrix from image coordinates to two-dimensional coordinates. With the continuing development of machine vision and the spread of cameras, a simple and flexible calibration method is highly desirable for work in the field of vision positioning.
At present, nine-point calibration is the two-dimensional hand-eye calibration most widely used in industry: objects are grabbed from a fixed plane for assembly and similar operations, and most industrial application scenarios adopt this method. As with general hand-eye calibration, the result of nine-point calibration is a transformation matrix between the camera coordinate system and the tool coordinate system, and between the camera coordinate system and the workpiece coordinate system.
Nine-point calibration yields the physical conversion relation between camera pixel coordinates and machine coordinates with high accuracy and broad applicability. However, the method needs to photograph at least 9 calibration-plate images at different angles and build a nine-point matrix to complete calibration; the calibration takes a long time and places high demands on the working environment, which causes difficulties in practical engineering applications.
Disclosure of Invention
An object of the embodiments of the present application is to provide a visual calibration method, a computer device, and a storage medium that address the technical problems of the prior art, namely the long calibration time and demanding working environment entailed by obtaining the physical transformation between camera pixel coordinates and machine coordinates through nine-point calibration.
To solve the above technical problem, an embodiment of the present application provides a visual calibration method comprising: acquiring a pixel equivalent value of a camera; calculating the actual distance from any reference point to the camera center according to the pixel equivalent value; acquiring the actual relative distance between the camera center and the execution end of a robot; obtaining a conversion relation between the camera pixel coordinates and the robot execution end from the actual distance between the reference point and the camera center and the distance between the camera center and the robot execution end; and obtaining the position of the reference point in the robot coordinate system according to the conversion relation.
The acquiring a pixel equivalent value of the camera further comprises: bringing the robot coordinate system and the camera coordinate system into correspondence; creating a first reference point under the camera field of view; locating a first pixel coordinate of the first reference point; controlling the robot to move a preset distance in any direction; determining a second pixel coordinate of the first reference point after it has moved the preset distance; and obtaining the pixel equivalent of the camera from the absolute value of the difference between the first and second pixel coordinates and the robot's preset moving distance.
The locating the first pixel coordinate of the first reference point further comprises: establishing an outer contour template of the first reference point; searching for matches in the first image using the parameters of the outer contour template; and taking the position with the highest matching score as the first pixel coordinate of the first reference point.
The determining a second pixel coordinate of the first reference point after it has moved the preset distance further comprises: establishing an outer contour template of the first reference point; searching for matches in the second image using the parameters of the outer contour template; and taking the position with the highest matching score as the second pixel coordinate of the first reference point.
The bringing the robot coordinate system and the camera coordinate system into correspondence further comprises: creating a second reference point under the camera field of view; fixing the second reference point and moving the camera along a first direction and a second direction of the robot, respectively; judging whether the direction change of the pixel coordinates of the second reference point is consistent with the actual moving direction of the second reference point; and, if not, exchanging the robot coordinate system and the camera coordinate system.
The calculating the actual distance from any reference point to the camera center according to the pixel equivalent value further comprises: acquiring the pixel coordinate of the camera center and the pixel coordinate of the arbitrary reference point; calculating the pixel-coordinate difference between the camera center and the arbitrary reference point; and obtaining the actual distance from the arbitrary reference point to the camera center from the camera's pixel equivalent value and the pixel-coordinate difference.
The acquiring the actual relative distance between the camera center and the robot execution end further comprises: mapping the camera center coordinates and the coordinates of the robot execution end to the same actual position; acquiring the coordinate difference between the camera center and the robot execution end; and calculating the actual relative distance between the camera center and the robot execution end from the coordinate difference.
The obtaining the position of the reference point in the robot coordinate system according to the conversion relation further comprises: acquiring the image coordinate position of the reference point by shape matching; and obtaining the position of the reference point in the robot coordinate system from the image coordinate position and the conversion relation.
In order to solve the above technical problem, an embodiment of the present application further provides a computer device, including a memory and a processor, where the memory stores computer readable instructions, and the processor, when executing the computer readable instructions, implements the steps of the vision calibration method according to any one of the above.
To solve the foregoing technical problem, an embodiment of the present application further provides a computer-readable storage medium storing computer-readable instructions which, when executed by a processor, implement the steps of the vision calibration method described above.
Compared with the prior art, the embodiment of the application mainly has the following beneficial effects:
the application provides a visual calibration method, a computer device, and a storage medium. The actual coordinate position of a visual reference point in robot coordinates is solved, the coordinate conversion relation is obtained through calibration calculation, and, by exploiting the correspondence between image pixels and their spatial-coordinate equivalent, the calibration calculation time is reduced and calibration efficiency is improved. In addition, by using the coordinate correspondence of the two-dimensional plane, the number of camera shots is reduced, so the screw-machine calibration procedure is simple to operate, flexible, and quick, and the production requirements of a screw-machine line are easily met.
Drawings
To illustrate the solutions of the present application more clearly, the drawings needed for describing the embodiments are briefly introduced below. The drawings described below illustrate only some embodiments of the present application; those skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of an embodiment of a calibration system of the present application;
FIG. 2 is a schematic flow chart diagram illustrating an embodiment of a visual calibration method according to the present application;
FIG. 3 is a schematic flow chart of one embodiment of step S100 of the present application;
FIG. 4 is a schematic flow chart illustrating an embodiment of step S110 of the present application;
FIG. 5 is a schematic flow chart of one embodiment of step S200 of the present application;
FIG. 6 is a schematic flow chart of one embodiment of step S300 of the present application;
FIG. 7 is a flowchart illustrating an embodiment of step S500 of the present application;
FIG. 8 is a schematic block diagram of an embodiment of a computer device according to the present application.
Detailed Description
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. The terms "including" and "having," and any variations thereof, in the description and claims of this application and the description of the above figures are intended to cover non-exclusive inclusions. The terms "first," "second," and the like in the description and claims of this application or in the above-described drawings are used for distinguishing between different objects and not for describing a particular order.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The visual calibration method of the present application can be applied to driving screws in mobile phones. In a specific application scenario, the screw models commonly used on a mobile phone are built into a special tooling fixture, and an operator only needs to align the screwdriver with the corresponding screw model on the fixture to complete high-precision calibration of the visual screw machine, which greatly simplifies the calibration steps and shortens on-site debugging time. Establishing the tool coordinate system of the moving camera center is divided into two steps. First, the common screw models of a mobile phone are built into the special tooling fixture, the pixel distance across the fixture and the actual size of the corresponding model are measured, and the parameter ratio between pixels and robot coordinates is obtained. Then, a machine teaching procedure is used to bring the center of the calibration circle to the center of the camera's field of view, and finally the camera-center tool coordinate system is established. The improved calibration method avoids moving the robot for nine photographs, so the screw-machine calibration procedure is simple to operate, flexible, and quick, and the production requirements of a screw-machine line are easily met. It should be noted that the visual calibration method of the present application may also be applied to other scenarios and is not specifically limited here. The calibration method of the present application is described in detail below.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings.
Referring to fig. 1, fig. 1 is a schematic diagram of an embodiment of a calibration system of the present application. As shown in fig. 1, the calibration system includes a robot 100, a camera C, and a target workpiece G. The robot 100 includes a robot arm 110 and an execution end H disposed on the robot arm 110 (corresponding to the screwdriver of the present application); the camera C is disposed on the robot arm, and the target workpiece G has a plurality of screw holes. Calibration proceeds on the premise that the camera is to be calibrated: solving the spatial relationship between the mechanical coordinate system and the image pixel coordinate system is precisely the camera calibration process.
Further, the coordinate systems involved in the process are the mechanical coordinate system (x_H, y_H, z_H), the camera coordinate system (x_c, y_c, z_c), and the image pixel coordinate system (x, y). The transformation between the mechanical coordinate system and the camera coordinate system is:

    [x_c, y_c, z_c]^T = R · [x_H, y_H, z_H]^T + T        (1)

where R is the rotation transformation matrix and T is the translation transformation matrix.
Further, the conversion between the camera coordinate system and the image coordinate system is:

    s · [x, y, 1]^T = [[f_x, γ, Δμ], [0, f_y, Δν], [0, 0, 1]] · [x_c, y_c, z_c]^T        (2)

where s is a proportionality coefficient, f_x and f_y are the focal length values in the x and y directions, γ accounts for the deviation caused by the image plane not being perpendicular to the optical axis, and Δμ and Δν are the offsets between the center of the image plane and the point where the optical axis crosses it. Under ideal conditions f_x = f_y = f and γ = Δν = Δμ = 0.
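To make the conversion concrete, the following is a minimal sketch in Python/NumPy of projecting a camera-frame point to pixel coordinates with the matrix of equation (2); all numeric values (focal lengths, offsets, the test point) are illustrative assumptions, not parameters from this application.

    import numpy as np

    # Intrinsic matrix of equation (2); values are assumptions chosen
    # only for demonstration. Under ideal conditions f_x = f_y = f and
    # the gamma and delta terms vanish.
    f_x, f_y, gamma, d_mu, d_nu = 1200.0, 1200.0, 0.0, 2.5, -1.0
    K = np.array([[f_x, gamma, d_mu],
                  [0.0, f_y,   d_nu],
                  [0.0, 0.0,   1.0]])

    p_cam = np.array([0.02, -0.01, 0.5])   # (x_c, y_c, z_c) in the camera frame
    s_xy1 = K @ p_cam                      # s * [x, y, 1]^T with s = z_c
    x, y = s_xy1[:2] / s_xy1[2]            # pixel coordinates (x, y)
    print(x, y)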
The relational expression between the mechanical coordinates and the image coordinates follows from the above relations. Since the screw machine works in a two-dimensional plane, the value of z can be ignored for the time being, giving:

    [equation (3), rendered only as an image in the source: the two-dimensional relation between the image coordinates (x, y) and the mechanical coordinates (x_H, y_H) with z omitted]
where R_1 is the rotation coefficient; the above equation can be converted into:

    [equation (4), rendered only as an image in the source: equation (3) rewritten in terms of the rotation coefficient R_1]
According to the above equation, a photograph taken with the camera gives the ratio K between the pixel size of the screw and its actual physical size, i.e. the pixel equivalent; K yields the ratio of a to b, where a denotes the actual size of the screw and b its pixel size. Substituting into the above formula gives:

    [equation (5), rendered only as an image in the source: equation (4) with the pixel equivalent K = a/b substituted in]
By the trigonometric theorem, the (x_H, y_H) coordinates corresponding to (x, y) can be obtained; substituting them into the above equation, the corresponding rotation and displacement coefficients are calculated.
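The coefficient solving described above can be illustrated with a least-squares fit of a 2D rotation and translation to point correspondences. The sketch below (Python/NumPy) uses a standard Procrustes/Kabsch-style solution; it is an illustrative stand-in rather than necessarily the exact computation of this application, and it assumes scale has already been removed via the pixel equivalent K.

    import numpy as np

    def fit_rotation_translation_2d(img_pts, mech_pts):
        """Least-squares fit of mech = R @ img + T for 2D point sets."""
        img = np.asarray(img_pts, dtype=float)
        mech = np.asarray(mech_pts, dtype=float)
        ci, cm = img.mean(axis=0), mech.mean(axis=0)
        H = (img - ci).T @ (mech - cm)        # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:              # reject reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        T = cm - R @ ci
        return R, T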
Referring to fig. 2, fig. 2 is a schematic flowchart illustrating an embodiment of a visual calibration method according to the present application. As shown in fig. 2, the visual calibration method of the present application includes the following steps:
s100, acquiring a pixel equivalent value of the camera.
Referring to fig. 3, fig. 3 is a schematic flow chart of an embodiment of step S100 of the present application, and step S100 of the present application further includes the following sub-steps:
and S110, enabling the robot coordinate system to correspond to the camera coordinate system.
Referring to fig. 4, fig. 4 is a schematic flow chart of an embodiment of step S110 of the present application, and step S110 of the present application further includes the following sub-steps as shown in fig. 4:
s111, creating a second reference point under the visual field of the camera.
Optionally, determining the X-axis and Y-axis directions of the camera coordinate system requires creating a second reference point in the field of view of the camera.
S112, fixing the second reference point and moving the camera along a first direction and a second direction of the robot, respectively.
Optionally, in this application the first direction may be the positive X-axis direction and the second direction the positive Y-axis direction, the camera moving along the robot's positive X and positive Y directions in turn. The robot is moved and the camera moves with it while the second reference point stays fixed; observing how the pixel coordinate of the second reference point changes in the image is enough to judge whether the robot coordinate system and the camera coordinate system correspond.
S113, judging whether the direction change of the pixel coordinates of the second reference point is consistent with the actual moving direction of the second reference point.
Further, if the direction change of the pixel coordinates of the second reference point is consistent with the actual moving direction, the robot coordinate system and the camera coordinate system are aligned and the process proceeds to step S120. If not, the process proceeds to step S114.
S114, exchanging the robot coordinate system and the camera coordinate system.
It is understood that if the direction change of the pixel coordinates of the second reference point does not coincide with the actual moving direction, the XY axes of the camera coordinate system are exchanged with the XY axes of the robot coordinate system.
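A compact sketch of the check in steps S111 to S114 (Python; the function names and the sign convention are assumptions, and on a real rig the sign of the image motion also depends on how the camera is mounted):

    import numpy as np

    def axes_consistent(pix_before, pix_after, robot_move):
        """True if the reference point's pixel-coordinate change has the
        same sign pattern as the robot's actual move along X and Y."""
        d_pix = np.asarray(pix_after, float) - np.asarray(pix_before, float)
        d_rob = np.asarray(robot_move, float)
        return bool(np.all(np.sign(d_pix) == np.sign(d_rob)))

    def swap_xy(pix):
        """Exchange the camera's X/Y axes to match the robot's (S114)."""
        return (pix[1], pix[0])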
S120, creating a first reference point under the camera view.
Further, a first reference point is first established under the field of view of the camera. It is understood that the first reference point in step S120 may be the same as or different from the second reference point in step S111, and is not limited herein.
S130, positioning the first pixel coordinate of the first reference point.
Specifically, in step S130 the first pixel coordinate of the current first reference point may be located by shape matching. The image matching method mainly comprises the following steps:
Establish an outer contour template of the first reference point, search for matches in the first image using the parameters of the outer contour template, and take the position with the highest matching score as the first pixel coordinate of the first reference point. Alternatively, the first pixel coordinate of the first reference point may also be obtained by fitting the outer contour edge of the first reference point to a circle and taking the circle-center coordinate; this is not specifically limited here.
S140, controlling the robot to move a preset distance in any direction.
Further, the robot is controlled to offset by a preset distance in an arbitrary direction.
S150, determining the second pixel coordinate of the first reference point after it has moved the preset distance.
The second pixel coordinate of the first reference point after the offset is determined by shape matching, as follows:
Establish an outer contour template of the first reference point; search for matches in the second image using the parameters of the outer contour template; and take the position with the highest matching score as the second pixel coordinate of the first reference point. Of course, in other embodiments the second pixel coordinate may also be obtained by fitting the outer contour edge of the first reference point to a circle and taking the circle-center coordinate; this is not specifically limited here.
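The matching used in steps S130 and S150 can be sketched with OpenCV's normalized template matching (Python). This is a simplification: the application describes outer-contour template matching, for which cv2.matchTemplate on a grayscale template image serves here as a stand-in.

    import cv2

    def locate_reference_point(image, template):
        """Return the center pixel coordinate of the best match and its score."""
        scores = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
        _, best_score, _, best_loc = cv2.minMaxLoc(scores)
        h, w = template.shape[:2]
        center = (best_loc[0] + w / 2.0, best_loc[1] + h / 2.0)
        return center, best_score

The same helper would locate the first pixel coordinate A in the first image and, after the robot offset, the second pixel coordinate B in the second image.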
S160, obtaining the pixel equivalent of the camera from the absolute value of the difference between the first and second pixel coordinates and the robot's preset moving distance.
For example, if the first pixel coordinate of the first reference point is A, the preset robot offset distance is Dis, and the second pixel coordinate of the first reference point is B, then the pixel equivalent value m_Equivalent of the camera is the preset offset distance Dis divided by the absolute value of the difference between the first pixel coordinate A and the second pixel coordinate B:
m_Equivalent = Dis / |A - B|        (6)
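Formula (6) transcribes directly (Python; treating A and B as 2D pixel coordinates and taking the Euclidean norm of their difference is an assumption where the text writes a scalar absolute value):

    import math

    def pixel_equivalent(a, b, dis):
        """m_Equivalent = Dis / |A - B| in mm per pixel, where a and b are
        the reference point's pixel coordinates before and after the robot
        moves the preset distance dis (mm)."""
        return dis / math.hypot(a[0] - b[0], a[1] - b[1])

    # Example: the robot moved 10 mm and the point shifted 400 px, so
    # pixel_equivalent((100, 200), (500, 200), 10.0) == 0.025 mm/px.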
and S200, calculating the actual distance from any reference point to the center of the camera according to the pixel equivalent value.
Referring to fig. 5, fig. 5 is a schematic flow chart of an embodiment of step S200 of the present application. As shown in fig. 5, step S200 further includes the following sub-steps:
S210, acquiring the pixel coordinate of the camera center and the pixel coordinate of any reference point.
It is understood that the pixel coordinate of the camera center O is half the image size along the long side W and the short side H, i.e. (W/2, H/2); a reference point is selected arbitrarily on the target workpiece and its pixel coordinate is acquired.
S220, calculating the pixel-coordinate difference between the camera center and the arbitrary reference point.
The pixel coordinate of the camera center is subtracted from the pixel coordinate of the arbitrary reference point to obtain the pixel-coordinate difference between the two.
S230, obtaining the actual distance from the arbitrary reference point to the camera center from the pixel equivalent value and the pixel-coordinate difference of the camera.
Further, the difference between the pixel coordinate of the arbitrary reference point and that of the camera center is multiplied by the pixel equivalent value m_Equivalent to obtain the actual distance S0 from the arbitrary reference point to the camera center.
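Steps S210 to S230 in one small helper (Python; returning the offset as a per-axis pair is an assumption, since the text speaks of a single distance S0):

    def offset_from_camera_center(pix_pt, img_w, img_h, m_equivalent):
        """Actual offset (mm) of a reference point from the camera center O,
        whose pixel coordinate is (W/2, H/2)."""
        dx = (pix_pt[0] - img_w / 2.0) * m_equivalent
        dy = (pix_pt[1] - img_h / 2.0) * m_equivalent
        return dx, dy   # the distance S0, per axis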
S300, acquiring the actual relative distance between the camera center and the execution end of the robot.
Referring to fig. 6, fig. 6 is a schematic flow chart of an embodiment of step S300 of the present application. As shown in fig. 6, step S300 further includes the following sub-steps:
and S310, mapping the coordinates of the center of the camera and the coordinates of the execution end of the robot to the same actual position.
And S320, acquiring a coordinate difference value between the center of the camera and the execution end of the robot.
And S330, calculating the actual relative distance between the center of the camera and the execution end of the robot according to the coordinate difference.
Specifically, a reference point is attached under the camera view. It is understood that the reference point does not have any relation to the first reference point and the second reference point. The distance S1 between the center of the camera and the execution end can be obtained by subtracting the machine coordinate of the center of the calibration needle from the machine coordinate of the center of the camera.
S400, obtaining the conversion relation between the camera pixel coordinates and the robot execution end from the actual distance between the reference point and the camera center and the distance between the camera center and the robot execution end.
Further, with the actual distance S0 from any reference point to the camera center and the actual relative distance S1 between the camera center and the robot execution end, the conversion relation between the camera pixel coordinates and the robot execution end is obtained as S2 = S0 + S1, the camera center being taken as the origin of the coordinate system for the conversion.
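In per-axis form, the composition S2 = S0 + S1 can be sketched as follows, reusing the offset helper above (Python; the parameter names are hypothetical, and the signs of the offsets must be fixed to match the handedness of the actual robot and camera frames):

    def pixel_to_robot(pix_pt, img_w, img_h, m_equivalent,
                       camera_center_xy, center_to_end_xy):
        """Map a pixel coordinate to robot coordinates for the execution end:
        S0 (point to camera center, from pixels) plus S1 (camera center to
        execution end, taught mechanically), camera center as origin."""
        dx, dy = offset_from_camera_center(pix_pt, img_w, img_h, m_equivalent)
        x = camera_center_xy[0] + dx + center_to_end_xy[0]
        y = camera_center_xy[1] + dy + center_to_end_xy[1]
        return x, y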
S500, obtaining the position of the reference point in the robot coordinate system according to the conversion relation.
Referring to fig. 7, fig. 7 is a flowchart illustrating an embodiment of step S500 of the present application. As shown in fig. 7, step S500 further includes the following sub-steps:
S510, acquiring the image coordinate position of the reference point by shape matching.
Further, an outer contour template of the reference point is established, matches are searched in the image using the parameters of the outer contour template, and the position with the highest matching score is taken as the pixel coordinate of the reference point.
S520, obtaining the position of the reference point in the robot coordinate system according to the image coordinate position and the conversion relation.
Further, the position of the reference point in the robot coordinate system is obtained from the image coordinate position and the conversion relation S2 between the camera pixel coordinates and the actual robot execution end.
In a specific application scenario of the present application, the machine coordinate of a screw hole is obtained through the conversion relation S2; the robot execution end is then moved to the corresponding coordinate, aligned with the screw hole, and the screw is driven.
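Putting the hypothetical helpers sketched above together for the screw-hole scenario (all numbers are illustrative assumptions):

    m_eq = pixel_equivalent((100, 200), (500, 200), dis=10.0)  # 0.025 mm/px
    hole_px = (900.0, 700.0)          # screw-hole center from shape matching
    target = pixel_to_robot(hole_px, img_w=1280, img_h=960,
                            m_equivalent=m_eq,
                            camera_center_xy=(250.0, 130.0),
                            center_to_end_xy=(35.0, 0.0))
    # Drive the execution end to `target` and tighten the screw; the motion
    # command itself depends on the robot controller in use.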
In the above embodiment, the actual coordinate position of the visual reference point in robot coordinates is solved, the coordinate conversion relation is obtained through calibration calculation, and, by exploiting the correspondence between image pixels and their spatial-coordinate equivalent, the calibration calculation time is reduced and calibration efficiency is improved. In addition, by using the coordinate correspondence of the two-dimensional plane, the number of camera shots is reduced, so the screw-machine calibration procedure is simple to operate, flexible, and quick, and the production requirements of a screw-machine line are easily met.
Furthermore, experimental comparison shows that the improved calibration method of the present application achieves accuracy comparable to nine-point calibration while markedly improving efficiency. The following table shows averages over 100 repeated experiments:
TABLE 1 Comparison of experimental data

Calibration method                | X-axis deviation (mm) | Y-axis deviation (mm) | Calibration time (ms) | Number of photographs
Nine-point calibration            | 0.00503               | 0.0612                | 54                    | 9
Method of the present application | 0.00543               | 0.0565                | 30                    | 3
This comparison shows that the visual calibration method of the present application improves greatly in time efficiency over the Halcon nine-point method, and the number of photographs is clearly reduced.
To solve the above technical problem, an embodiment of the present application further provides a computer device. Referring to fig. 8, fig. 8 is a block diagram of the basic structure of the computer device according to this embodiment.
The computer device 300 includes a memory 301, a processor 302, and a network interface 303 communicatively coupled to each other via a system bus. It is noted that fig. 8 shows only a computer device 300 having components 301 to 303, but it is understood that not all of the illustrated components are required and that more or fewer components may be implemented instead. As will be understood by those skilled in the art, the computer device is a device capable of automatically performing numerical calculation and/or information processing according to preset or stored instructions, and its hardware includes, but is not limited to, a microprocessor, an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), an embedded device, and the like.
The computer device can be a desktop computer, a notebook, a palm computer, a cloud server and other computing devices. The computer equipment can carry out man-machine interaction with a user through a keyboard, a mouse, a remote controller, a touch panel or voice control equipment and the like.
The memory 301 includes at least one type of readable storage medium, including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. In some embodiments, the memory 301 may be an internal storage unit of the computer device 300, such as a hard disk or memory of the computer device 300. In other embodiments, the memory 301 may also be an external storage device of the computer device 300, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the computer device 300. Of course, the memory 301 may also include both the internal and the external storage devices of the computer device 300. In this embodiment, the memory 301 is generally used for storing the operating system installed on the computer device 300 and various application software, such as the computer readable instructions of the vision calibration method. In addition, the memory 301 may also be used to temporarily store various types of data that have been output or are to be output.
The processor 302 may in some embodiments be a Central Processing Unit (CPU), a controller, a microcontroller, a microprocessor, or another data processing chip. The processor 302 generally controls the overall operation of the computer device 300. In this embodiment, the processor 302 is configured to execute the computer readable instructions stored in the memory 301 or to process data, for example to run the computer readable instructions of the vision calibration method.
The network interface 303 may comprise a wireless network interface or a wired network interface, and the network interface 303 is generally used for establishing a communication connection between the computer device 300 and other electronic devices.
In the above embodiment, the actual coordinate position of the visual reference point in robot coordinates is solved, the coordinate conversion relation is obtained through calibration calculation, and, by exploiting the correspondence between image pixels and their spatial-coordinate equivalent, the calibration calculation time is reduced and calibration efficiency is improved. In addition, by using the coordinate correspondence of the two-dimensional plane, the number of camera shots is reduced, so the screw-machine calibration procedure is simple to operate, flexible, and quick, and the production requirements of a screw-machine line are easily met.
The present application further provides another embodiment, namely a computer-readable storage medium storing computer-readable instructions executable by at least one processor, so as to cause the at least one processor to perform the steps of the vision calibration method described above.
In this embodiment likewise, the actual coordinate position of the visual reference point in robot coordinates is solved, the coordinate conversion relation is obtained through calibration calculation, and the use of the pixel-to-spatial-coordinate correspondence reduces the calibration calculation time and improves calibration efficiency, while the reduced number of camera photographs keeps the screw-machine calibration procedure simple, flexible, and quick for a screw-machine production line.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method of the embodiments of the present application.
It is to be understood that the above-described embodiments are merely some, not all, of the embodiments of the present application, and that the appended drawings illustrate preferred embodiments without limiting the scope of the claims. This application may be embodied in many different forms; these embodiments are provided so that the disclosure of the application will be thorough. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described in the foregoing embodiments or substitute equivalents for some of their features. All equivalent structures made using the contents of the specification and drawings of the present application, applied directly or indirectly in other related technical fields, fall within the protection scope of the present application.

Claims (10)

1. A visual calibration method, characterized in that the calibration method comprises:
acquiring a pixel equivalent value of a camera;
calculating the actual distance from any reference point to the center of the camera according to the pixel equivalent value;
acquiring an actual relative distance between the center of the camera and an execution end of a robot;
obtaining a conversion relation between the pixel coordinates of the camera and the execution end of the robot according to the actual distance between the reference point and the center of the camera and the distance between the center of the camera and the execution end of the robot;
and obtaining the position of the reference point in the robot coordinate system according to the conversion relation.
2. The calibration method according to claim 1, wherein the obtaining of the pixel equivalent value of the camera further comprises:
bringing the robot coordinate system and the camera coordinate system into correspondence;
creating a first reference point under a camera field of view;
locating a first pixel coordinate of the first reference point;
controlling the robot to move a preset distance in any direction;
determining a second pixel coordinate of the first reference point after the first reference point moves a preset distance;
and obtaining the pixel equivalent of the camera according to the absolute value of the difference value of the first pixel coordinate and the second pixel coordinate and the preset moving distance of the robot.
3. The calibration method according to claim 2, wherein said locating the first pixel coordinate of the first reference point further comprises:
establishing an outer contour template of the first reference point;
searching for a match in the first image by using the parameters of the outer contour template;
and taking the position with the highest matching score as the first pixel coordinate of the first reference point.
4. The calibration method according to claim 3, wherein the determining the second pixel coordinate of the first reference point after moving a preset distance further comprises:
establishing an outer contour template of the first reference point;
searching for a match in a second image by using the parameters of the outer contour template;
and taking the position with the highest matching score as the second pixel coordinate of the first reference point.
5. The calibration method according to claim 2, wherein said associating the robot coordinate system with the camera coordinate system further comprises:
creating a second reference point under the camera field of view;
fixing the second reference point and moving the camera in a first direction and a second direction of the robot, respectively;
judging whether the direction change of the pixel coordinate of the second reference point is consistent with the actual moving direction of the second reference point or not;
if not, the robot coordinate system and the camera coordinate system are exchanged.
6. The calibration method according to claim 1, wherein said calculating an actual distance from any reference point to a camera center according to the pixel equivalent value further comprises:
acquiring the pixel coordinate of the center of the camera and the pixel coordinate of the arbitrary reference point;
calculating the pixel coordinate difference value between the camera center and the arbitrary reference point;
and obtaining the actual distance from the arbitrary reference point to the center of the camera according to the pixel equivalent value of the camera and the pixel coordinate difference value.
7. The calibration method according to claim 1, wherein the obtaining of the actual relative distance between the camera center and the robot execution end further comprises:
mapping the camera center coordinates and the coordinates of the robot execution end to the same actual position;
acquiring a coordinate difference value between the camera center and the robot execution end;
and calculating the actual relative distance between the camera center and the robot execution end according to the coordinate difference.
8. The calibration method according to claim 1, wherein the obtaining the position of the reference point in the robot coordinate system according to the transformation relation further comprises:
acquiring the image coordinate position of the reference point by using shape matching;
and obtaining the position of the reference point in a robot coordinate system according to the image coordinate position and the conversion relation.
9. A computer device comprising a memory having computer readable instructions stored therein and a processor which, when executing the computer readable instructions, implements the steps of the vision calibration method as claimed in any one of claims 1 to 8.
10. A computer readable storage medium having computer readable instructions stored thereon which, when executed by a processor, implement the steps of the vision calibration method as claimed in any one of claims 1 to 8.
Application and publication data

Application: CN202210342950.7A ("Vision calibration method, computer device and storage medium"), filed 2022-04-02, priority date 2022-04-02.
Publication: CN114663500A (pending), published 2022-06-24.
Family ID: 82033624. Country: CN (China).


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115228705A (en) * 2022-08-01 2022-10-25 广东利元亨智能装备股份有限公司 Glue brushing height control method, glue brushing control method and glue brushing control equipment
CN115294217A (en) * 2022-10-10 2022-11-04 季华实验室 Visual experiment platform calibration method, positioning method and related equipment
CN115294217B (en) * 2022-10-10 2022-12-09 季华实验室 Visual experiment platform calibration method, positioning method and related equipment
CN116060269A (en) * 2022-12-08 2023-05-05 中晟华越(郑州)智能科技有限公司 Spraying method for loop-shaped product


Legal Events

Code  Description
PB01  Publication
SE01  Entry into force of request for substantive examination