CN115100299B - Calibration method, device, equipment and storage medium


Info

Publication number
CN115100299B
Authority
CN
China
Prior art keywords
camera
image
calibration
laser sensor
calibration block
Prior art date
Legal status
Active
Application number
CN202211036717.2A
Other languages
Chinese (zh)
Other versions
CN115100299A (en)
Inventor
林显峰
Current Assignee
Guangzhou Luchen Intelligent Equipment Technology Co ltd
Original Assignee
Guangzhou Luchen Intelligent Equipment Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Luchen Intelligent Equipment Technology Co ltd
Priority to CN202211036717.2A
Publication of CN115100299A
Application granted
Publication of CN115100299B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/06 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness for measuring thickness, e.g. of sheet material
    • G01B11/0608 Height gauges
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B21/00 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • G01B21/02 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring length, width, or thickness
    • G01B21/04 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring length, width, or thickness by measuring coordinates of points
    • G01B21/042 Calibration or calibration artifacts

Abstract

The invention discloses a calibration method, a calibration device, calibration equipment and a storage medium. The method is applied to a calibration system, and comprises the following steps: driving a camera to move above the convex calibration block through a mechanical arm, and shooting a first image of the convex calibration block through the camera; acquiring a first included angle between a target edge of the convex calibration block and a first image boundary according to the first image; acquiring a target height map corresponding to the first image through a laser sensor; determining a second included angle between the target edge of the convex calibration block and the boundary of the target height map according to the target height map; determining an included angle error between the camera and the laser sensor according to the first included angle and the second included angle; and horizontally calibrating the camera and the laser sensor according to the included angle error. According to the technical scheme, the manual participation in the calibration process can be reduced, the calibration result is more precise and accurate, and the consistency of the equipment when leaving a factory is ensured.

Description

Calibration method, device, equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of coating equipment, in particular to a calibration method, a calibration device, calibration equipment and a storage medium.
Background
With the development of optical measurement technology, AOI (Automated Optical Inspection) equipment is widely used. An AOI device detects, based on optical principles, defects commonly encountered in welding production. When the AOI equipment performs an inspection, the camera must capture an image of the inspected object and the laser sensor must scan the height of the object, thereby completing the inspection of the object. In this process, the installation positions of the camera and the laser sensor have a crucial influence on the detection accuracy of the AOI device.
In the prior art, the installation positions of the camera and the laser sensor are generally calibrated by technicians through visual inspection against a calibration plate; no quantitative measurement or calibration is performed, and the camera and the laser sensor are aligned mainly by eye when they are installed. Because the calibration process of the camera and the laser sensor depends on manual operation, their installation positions deviate, no clear quantitative index can be given for the deviation value, and without such an index the quality of the product cannot be guaranteed.
Disclosure of Invention
Embodiments of the present invention provide a calibration method, apparatus, device, and storage medium, so as to reduce human involvement in the calibration process and introduce quantitative measurement into the calibration process, making the calibration result more precise and accurate.
According to an aspect of the present invention, there is provided a calibration method applied to a calibration system, where the calibration system includes: a calibration plate, a first device, and a mechanical arm that drives the first device to move, the first device including a camera and a laser sensor, and the calibration plate including a convex calibration block. The calibration method comprises the following steps:
driving the camera to move above the convex calibration block through a mechanical arm, and shooting a first image of the convex calibration block through the camera;
acquiring a first included angle between a target edge of the convex calibration block and a first image boundary according to the first image, wherein the first image boundary is any boundary of the first image;
acquiring a target height map corresponding to the first image through a laser sensor;
determining a second included angle between the target edge of the convex calibration block and the boundary of the target height map according to the target height map;
determining an included angle error of the camera and the laser sensor according to the first included angle and the second included angle;
and horizontally calibrating the camera and the laser sensor according to the included angle error.
According to another aspect of the present invention, there is provided a calibration apparatus, including:
the processing module is used for driving the camera to move above the convex calibration block through the mechanical arm and shooting a first image of the convex calibration block through the camera;
the first obtaining module is used for obtaining a first included angle between a target edge of the convex calibration block and a first image boundary according to the first image, wherein the first image boundary is any boundary of the first image;
the second acquisition module is used for acquiring a target height map corresponding to the first image through a laser sensor;
the first determining module is used for determining a second included angle between the target edge of the convex calibration block and the boundary of the target height map according to the target height map;
the second determining module is used for determining the included angle error of the camera and the laser sensor according to the first included angle and the second included angle;
and the horizontal calibration module is used for horizontally calibrating the camera and the laser sensor according to the included angle error.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor, the computer program being executable by the at least one processor to enable the at least one processor to perform the calibration method according to any of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer-readable storage medium storing computer instructions for causing a processor to implement the calibration method according to any one of the embodiments of the present invention when the computer instructions are executed.
According to the embodiment of the invention, the camera is driven by the mechanical arm to move to the position above the convex calibration block, the first image of the convex calibration block is shot by the camera, the first included angle between the target edge of the convex calibration block and the boundary of the first image is obtained according to the first image, the target height map corresponding to the first image is obtained by the laser sensor, the second included angle between the target edge of the convex calibration block and the boundary of the target height map is determined according to the target height map, the included angle error between the camera and the laser sensor is determined according to the first included angle and the second included angle, and the camera and the laser sensor are horizontally calibrated according to the included angle error. According to the technical scheme, the manual participation in the calibration process can be reduced, the quantization operation is introduced in the calibration process, the calibration result can be more precise and accurate, and the consistency of the final equipment when leaving the factory is ensured.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present invention, nor do they necessarily limit the scope of the invention. Other features of the present invention will become apparent from the following description.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
FIG. 1 is a schematic structural diagram of a calibration system in an embodiment of the present invention;
FIG. 2 is a flow chart of a calibration method in an embodiment of the present invention;
FIG. 3 is a schematic diagram of a method of determining a center point of a laser line in an embodiment of the invention;
FIG. 4 is a schematic diagram of a method for determining an offset distance in an embodiment of the present invention;
FIG. 5 is a schematic diagram of a calibration plate according to an embodiment of the present invention;
FIG. 6 is a top view of a calibration plate in an embodiment of the present invention;
FIG. 7 is a diagram illustrating a division of a second image according to an embodiment of the present invention;
FIG. 8 is a schematic structural diagram of a calibration apparatus in an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of an electronic device implementing the calibration method according to the embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Moreover, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example one
Fig. 1 is a schematic structural diagram of a calibration system in an embodiment of the present invention. The calibration method is applied to a calibration system, and the calibration system comprises: a calibration plate 11, a first device 12, and a mechanical arm 13 that drives the first device 12 to move, the first device 12 including a camera 121 and a laser sensor 122, and the calibration plate 11 including a convex calibration block 111.
Among them, the convex calibration block 111 may be a convex calibration block placed on the calibration plate 11. In the implementation process, the convex calibration block 111 may be disposed at any position of the calibration plate 11, and preferably, the convex calibration block 111 may be disposed at a central position of the calibration plate 11. Specifically, the convex calibration block 111 may be, for example, a rectangular calibration block.
In the present embodiment, the calibration system first takes a photograph with the camera 121 included in the first device 12, i.e., an Automated Optical Inspection (AOI) coating apparatus, and then measures the height of relevant elements, colloids, and the like with the laser sensor 122 included in the first device 12. However, since the camera 121 and the laser sensor 122 of the first device 12 are not located at the same center point, the data they acquire must be unified into the same coordinate system during data processing, so the offset between the camera 121 and the laser sensor 122 must be accurately calibrated.
Fig. 2 is a flowchart of a calibration method in an embodiment of the present invention, where this embodiment is applicable to a calibration situation, and the method may be executed by a calibration apparatus in an embodiment of the present invention, where the apparatus may be implemented in a software and/or hardware manner, and as shown in fig. 2, the method specifically includes the following steps:
s201, driving the camera to move to the position above the protruding calibration block through the mechanical arm, and shooting a first image of the protruding calibration block through the camera.
It should be noted that the first image may be an image captured by a camera and containing a convex calibration block. Illustratively, the first image may be rectangular, and may be any shape with other known angles. In the actual operation process, if the first image shot by the camera does not contain the convex calibration block, the first image is considered to be invalid.
Specifically, the mechanical arm drives the camera to move, the camera is moved to the position on the calibration plate that contains the convex calibration block, and the camera then captures the first image containing the convex calibration block.
S202, acquiring a first included angle between the target edge of the convex calibration block and the boundary of the first image according to the first image.
The target edge may be any one of four edges of the convex calibration block in the first image captured by the camera.
It should be noted that the first included angle may be an included angle between a target edge of the convex calibration block acquired according to the first image and a boundary of the first image. The first image boundary is any one of the boundaries of the first image.
Specifically, the target edge of the convex calibration block acquired from the first image must correspond one-to-one with the boundary of the first image. Illustratively, the first image is a rectangle and the convex calibration block captured in it is also a rectangle. If the target edge of the convex calibration block in the first image is the left edge of that rectangle, the first image boundary is likewise the left boundary, and the angle between the block's left edge and the left boundary of the first image is the first included angle. In other words, different target edges of the convex calibration block in the first image correspond to different first included angles.
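Purely as an illustration of this step, the following minimal sketch measures the angle between the block's left edge and the left image boundary, assuming the convex calibration block appears as the largest bright region in an 8-bit grayscale first image and that OpenCV is available; the function name and the choice of edge are assumptions, not taken from the patent.

```python
import cv2
import numpy as np

def first_included_angle(first_image_gray):
    """Angle (degrees) between the left edge of the convex calibration block
    and the left boundary of the first image; illustrative helper only."""
    # Segment the calibration block from the background (Otsu threshold).
    _, binary = cv2.threshold(first_image_gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    block = max(contours, key=cv2.contourArea)

    # Fit a rotated rectangle and take its left edge as the target edge.
    box = cv2.boxPoints(cv2.minAreaRect(block))        # 4 corners, float32
    left_two = box[np.argsort(box[:, 0])][:2]          # two leftmost corners
    p_top, p_bottom = left_two[np.argsort(left_two[:, 1])]
    edge = p_bottom - p_top                             # runs down the left edge

    # The left image boundary is vertical, so the first included angle is the
    # signed angle of the edge vector measured from the vertical direction.
    return float(np.degrees(np.arctan2(edge[0], edge[1])))
```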
And S203, acquiring a target height map corresponding to the first image through the laser sensor.
It is known that the height map refers to an image of the scanned area built from the position and height data collected by the laser sensor while scanning. The target height map may be a height map obtained by scanning the area corresponding to the first image with the laser sensor.
In the actual operation process, the area scanned by the laser sensor and the area shot by the camera are the same area, that is, if the first image shot by the camera is the central position area of the calibration plate, the area scanned by the laser sensor is also the central position area of the calibration plate.
Specifically, the laser sensor is driven to move to the area corresponding to the first image through the mechanical arm, and the target height map is obtained after the area corresponding to the first image is scanned.
And S204, determining a second included angle between the target edge of the convex calibration block and the boundary of the target height map according to the target height map.
It should be noted that the second included angle may be an included angle between a target edge of the convex calibration block determined according to the target height map and a boundary of the target height map. The target edge of the raised calibration block determined according to the target height map and the target edge of the raised calibration block obtained according to the first image are the same edge, and the boundary of the target height map and the boundary of the first image are edges in the same direction.
Specifically, the target edge of the convex calibration block determined from the target height map must correspond one-to-one with the boundary of the target height map, and the second included angle likewise corresponds one-to-one with the first included angle. Illustratively, the first image is a rectangle and the convex calibration block captured in it is also a rectangle; the target height map scanned by the laser sensor is a rectangle and the convex calibration block in it is also a rectangle. If the target edge of the convex calibration block in the first image is the left edge of that rectangle, the first image boundary is likewise the left boundary, and the angle between the block's left edge and the left boundary of the first image is the first included angle; the target edge of the convex calibration block in the target height map is then the left edge of that rectangle, the target height map boundary is likewise the left boundary, and the angle between the block's left edge and the left boundary of the target height map is the second included angle.
S205, determining an included angle error of the camera and the laser sensor according to the first included angle and the second included angle.
It should be noted that the included angle error may be an angle error between the camera and the laser sensor due to a mounting angle deviation.
Specifically, the first included angle and the second included angle are subjected to difference, and the included angle error of the camera and the laser sensor is determined according to the difference result. For example, if the difference between the first included angle and the second included angle is minus 5 degrees, the error of the included angle between the camera and the laser sensor is determined to be minus 5 degrees.
And S206, horizontally calibrating the camera and the laser sensor according to the included angle error.
It should be noted that the horizontal calibration operation may be to record position information of the camera and the laser sensor according to the angle error, and quantify horizontal offset information of the camera and the laser sensor.
Specifically, the difference between the first included angle and the second included angle is used for obtaining an included angle error, and then the camera and the laser sensor can be horizontally calibrated according to the included angle error.
According to the embodiment of the invention, the camera is driven by the mechanical arm to move to the position above the convex calibration block, the first image of the convex calibration block is shot by the camera, the first included angle between the target edge of the convex calibration block and the boundary of the first image is obtained according to the first image, the target height map corresponding to the first image is obtained by the laser sensor, the second included angle between the target edge of the convex calibration block and the boundary of the target height map is determined according to the target height map, the included angle error between the camera and the laser sensor is determined according to the first included angle and the second included angle, and the camera and the laser sensor are horizontally calibrated according to the included angle error. According to the technical scheme, the manual participation in the calibration process can be reduced, the quantization operation is introduced in the calibration process, the calibration result can be more precise and accurate, and the consistency of the final equipment when leaving the factory is ensured.
Optionally, the calibration plate further includes: and (4) sinking the calibration block.
In this embodiment, the recessed calibration block may be a calibration block recessed inside the calibration plate. In the implementation process, the recessed calibration block may be disposed at any position of the calibration plate, and preferably, the recessed calibration block may be disposed at a central position of the calibration plate. In particular, the recessed calibration block may be a funnel-shaped pattern with flat sides and a recessed center, such as a cone-shaped calibration block.
The method further comprises the following steps:
the camera is driven to move through the mechanical arm until the central point of a second image shot by the camera is coincident with the lowest point of the depression calibration block, and the position information of the camera is obtained.
It should be noted that the second image may be an image of the center position of the calibration board captured by the camera. For example, the second image may be rectangular, and the center point of the second image may be an intersection position of two diagonal lines of the rectangle.
The recessed calibration block may be a calibration block recessed inside the calibration plate, and the recessed lowest point of the recessed calibration block may be the lowest point of the recessed calibration block recessed into the calibration plate scanned by the laser sensor.
In this embodiment, the position information of the camera may be coordinate information of the camera relative to the calibration board, and the coordinate information may specifically include x-axis coordinate information and y-axis coordinate information.
Specifically, the camera is driven to move through the mechanical arm, the camera is controlled to move to the position of the approximate center of the calibration plate, then the camera is controlled by software to move in small steps until the central point of a second image shot by the camera is coincident with the lowest point of the depression calibration block, and the position information of the camera at the moment is acquired. For example, the second image captured by the camera may be rectangular, and when the center point of the second image coincides with the lowest point of the depression calibration block, that is, the intersection position of two diagonal lines of the rectangle coincides with the lowest point of the depression calibration block depressed into the calibration plate, the coordinate information of the camera at this time is acquired. In an actual operation process, an error of coincidence of the center point of the second image and the recessed lowest point of the recessed calibration block may be within 2 pixels.
The laser sensor is driven to move through the mechanical arm until the central point of the laser line of the laser sensor is coincident with the lowest point of the depression calibration block, and the position information of the laser sensor is obtained.
It should be explained that the laser sensor emits a laser line having a certain length, and the center point of the laser line is the midpoint of that length. FIG. 3 is a schematic diagram of a method for determining the center point of the laser line in an embodiment of the invention. As shown in fig. 3, the calibration plate is cut vertically along the vertical plane containing the recessed lowest point M of the recessed calibration block; fig. 3 is the resulting cross-sectional view, where point M is the laser data corresponding to the recessed lowest point M, s1 is the length of the laser line of the laser sensor on the left side of the recessed calibration block, and s2 is the length of the laser line on the right side of the recessed calibration block. If the lengths of the laser line on the two sides of the recessed calibration block are equal, i.e. s1 = s2, the center point of the laser line of the laser sensor is considered to coincide with the recessed lowest point M of the recessed calibration block. In the actual operation process, the coincidence error between the center point of the laser line and the recessed lowest point M may be within 2 pixels, i.e. |s1 - s2| < 2 (unit: pixels).
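A hedged sketch of the s1/s2 check described above, assuming the laser line is available as a one-dimensional profile of samples and the index of the sample corresponding to point M is known; all names are illustrative.

```python
def laser_line_centered(profile, lowest_index, tol_px=2):
    """True when the laser-line center coincides with the recessed lowest
    point M, i.e. when |s1 - s2| < tol_px (s1/s2 as in FIG. 3)."""
    s1 = lowest_index                      # samples to the left of point M
    s2 = len(profile) - 1 - lowest_index   # samples to the right of point M
    return abs(s1 - s2) < tol_px
```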
In this embodiment, the position information of the laser sensor may be coordinate information of the laser sensor relative to the calibration board, and the coordinate information may specifically include x-axis coordinate information and y-axis coordinate information.
Specifically, the laser sensor is driven to move through the mechanical arm, the laser sensor is controlled to move to the position of the approximate center of the calibration plate, and then the laser sensor is controlled to move in small steps through software. For example, the x-axis central point of the laser line of the laser sensor is found by moving left and right, the x-axis coordinate is fixed, and then the y-axis central point of the laser line of the laser sensor is found by moving up and down; or moving up and down to find the y-axis central point of the laser line of the laser sensor, fixing the y-axis coordinate, and moving left and right to find the x-axis central point of the laser line of the laser sensor. And acquiring the position information of the laser sensor at the moment when the central point of the laser line of the laser sensor is superposed with the lowest point of the depression calibration block.
And calibrating the offset of the camera and the laser sensor according to the position information of the camera and the position information of the laser sensor.
In this embodiment, the offset calibration may be to record position information of the camera and the laser sensor, and quantize offset information of the camera and the laser sensor.
Specifically, the offset value is obtained by subtracting the coordinate information of the camera and the coordinate information of the laser sensor, so that the camera and the laser sensor can be calibrated in an offset manner according to the position information of the camera and the position information of the laser sensor.
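As a minimal sketch of this step, assuming both positions are recorded as (x, y) coordinates in the same calibration-plate frame (the function name is illustrative):

```python
def offset_calibration(camera_xy, laser_xy):
    """Offset of the laser sensor relative to the camera, obtained by
    subtracting the two recorded positions."""
    dx = camera_xy[0] - laser_xy[0]
    dy = camera_xy[1] - laser_xy[1]
    return dx, dy
```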
Optionally, acquiring a target height map corresponding to the first image by using a laser sensor includes:
a target scan area is determined from the first image.
The first image is an image obtained by shooting a target scanning area through a camera.
It should be noted that the target scanning area may be an area position of the calibration board photographed by the camera. Specifically, the target scanning area may be a central position of the calibration plate.
Specifically, the target scanning area is determined according to a first image captured by a camera. For example, if a convex calibration block located at the center of the calibration plate exists in the first image captured by the camera, the target scanning area may be determined to be the center of the calibration plate.
And acquiring a target height map corresponding to the target scanning area through a laser sensor.
Specifically, after a target scanning area is determined according to the first image, the laser sensor is driven to move through the mechanical arm until the laser sensor can scan the target scanning area, and a target height map of the target scanning area is obtained through the laser sensor.
In the actual operation process, the area scanned by the laser sensor and the area shot by the camera are the same area, that is, if the first image shot by the camera is the central position area of the calibration plate, the area scanned by the laser sensor is also the central position area of the calibration plate.
Optionally, determining a second included angle between the target edge of the convex calibration block and the boundary of the target height map according to the target height map includes:
and carrying out filtering processing on the target height map to obtain a filtered target height map.
In this embodiment, the filtering process may be a filtering process performed on a target height map corresponding to the first image acquired by the laser sensor, and the specific filtering process method is not limited in this embodiment.
Specifically, the target height map is filtered, noise points in the target height map are filtered, and the filtered target height map is obtained, so that the target height map is clearer, and a subsequent identification result of the target height map is more accurate.
And carrying out binarization processing on the filtered target height map to obtain a third image.
In this embodiment, the binarization processing may be to compare the gray scale value of the pixel point in the filtered target height map with a preset threshold, set the gray scale value greater than the preset threshold to 1, and set the gray scale value less than the preset threshold to 0. The preset threshold may be a gray value preset according to an actual situation, and the size of the preset threshold is not limited in this embodiment.
The third image may be an image obtained by performing binarization processing on the filtered target height map.
Specifically, unlike the image captured by the camera, the boundary of the calibration block cannot be seen directly in the target height map acquired by the laser sensor; the boundary only becomes visible in the third image obtained by filtering the target height map and then binarizing the filtered target height map.
And acquiring a second included angle between the target edge of the convex calibration block and the boundary of the third image according to the third image.
And the third image boundary is any one boundary of the third image.
Specifically, the target edge of the convex calibration block obtained from the third image must correspond one-to-one with the third image boundary, the second included angle corresponds one-to-one with the first included angle, the target edge obtained from the third image is the same edge as the target edge obtained from the first image, and the third image boundary and the first image boundary are boundaries in the same direction. Illustratively, the first image is a rectangle and the convex calibration block captured in it is also a rectangle; the third image is likewise a rectangle containing a rectangular convex calibration block. If the target edge of the convex calibration block in the first image is the left edge of that rectangle, the first image boundary is likewise the left boundary, and the angle between the block's left edge and the left boundary of the first image is the first included angle; the target edge of the convex calibration block in the third image is then the left edge of that rectangle, the third image boundary is likewise the left boundary, and the angle between the block's left edge and the left boundary of the third image is the second included angle.
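The sketch below strings the filtering, binarization, and angle-measurement steps together, assuming the target height map is a 2-D float array, a median filter is an acceptable noise filter, and a fixed height threshold separates the raised block from the plate; the filter choice, threshold, and names are assumptions, since the patent does not prescribe them.

```python
import cv2
import numpy as np

def second_included_angle(height_map, height_threshold):
    """Angle (degrees) between the block's left edge in the binarized height
    map (the third image) and the left boundary of that map."""
    # Filtering: suppress isolated noise points in the height map.
    filtered = cv2.medianBlur(height_map.astype(np.float32), 5)

    # Binarization: heights above the threshold belong to the raised block.
    third_image = (filtered > height_threshold).astype(np.uint8) * 255

    contours, _ = cv2.findContours(third_image, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    block = max(contours, key=cv2.contourArea)

    box = cv2.boxPoints(cv2.minAreaRect(block))
    left_two = box[np.argsort(box[:, 0])][:2]            # two leftmost corners
    p_top, p_bottom = left_two[np.argsort(left_two[:, 1])]
    edge = p_bottom - p_top                               # left edge of the block
    return float(np.degrees(np.arctan2(edge[0], edge[1])))
```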
Optionally, after horizontally calibrating the camera and the laser sensor according to the included angle error, the method further includes:
and adjusting the installation angle of the laser sensor according to the included angle error of the camera and the laser sensor.
In the present embodiment, the installation angle of the laser sensor may be an angle at which the laser sensor is installed in the first device.
Specifically, the error of the included angle between the camera and the laser sensor is determined according to the first included angle and the second included angle, and then the installation angle of the laser sensor is adjusted according to the error of the included angle between the camera and the laser sensor. For example, if the difference between the first included angle and the second included angle is negative 5 degrees, the error of the included angle between the camera and the laser sensor is determined to be negative 5 degrees, and then the installation angle of the laser sensor is adjusted to the right by 5 degrees; and if the difference between the first included angle and the second included angle is made to obtain a result of positive 5 degrees, determining that the error of the included angle between the camera and the laser sensor is positive 5 degrees, and then adjusting the installation angle of the laser sensor to the left by 5 degrees. In the actual operation process, an instruction for adjusting the installation angle of the laser sensor can be generated after the included angle error between the camera and the laser sensor is determined, the adjustment instruction can also include how to adjust and the angle to be adjusted specifically, the adjustment instruction can be sent to a user after the adjustment instruction is generated, and the user can manually adjust the installation angle of the laser sensor according to the adjustment instruction.
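A minimal sketch of computing the included angle error and turning it into an adjustment instruction, following the sign convention of the example above (negative error: adjust to the right; positive error: adjust to the left); the function name and message wording are illustrative.

```python
def angle_error_instruction(first_angle_deg, second_angle_deg):
    """Signed camera/laser-sensor angle error plus a human-readable hint."""
    error = first_angle_deg - second_angle_deg
    if error < 0:
        hint = f"rotate the laser sensor {abs(error):.2f} degrees to the right"
    elif error > 0:
        hint = f"rotate the laser sensor {error:.2f} degrees to the left"
    else:
        hint = "no adjustment needed"
    return error, hint
```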
Optionally, after adjusting the installation angle of the laser sensor according to the included angle error between the camera and the laser sensor, the method further includes:
and acquiring the updated included angle error of the adjusted camera and the laser sensor.
It should be explained that the updated included angle error may be an updated included angle error between the camera and the laser sensor after the user manually adjusts the installation angle of the laser sensor according to the adjustment instruction.
Specifically, after a user manually adjusts the installation angle of the laser sensor according to the adjustment instruction, the adjusted error of the updated included angle between the camera and the laser sensor is obtained.
And acquiring the scanning distance corresponding to the laser sensor.
It should be noted that the scanning distance may be the farthest distance that the laser sensor can scan.
Specifically, the farthest scanning distance corresponding to the laser sensor is obtained according to the configuration information of the laser sensor.
And determining the offset distance according to the scanning distance corresponding to the laser sensor and the updated included angle error.
The offset distance may be an offset distance of a boundary of the target height map scanned by the laser sensor with respect to a boundary of the first image captured by the camera due to a deviation of a mounting angle of the laser sensor.
Specifically, fig. 4 is a schematic diagram of a method for determining an offset distance in an embodiment of the present invention. As shown in fig. 4, the side AC of the right triangle is the farthest scanning distance corresponding to the laser sensor, the angle C is the updated included angle error, and the side AB of the right triangle is the offset distance caused by the deviation of the laser sensor installation angle. The offset distance AB can then be obtained by trigonometry; taking AC as the hypotenuse, the relation is AB = AC × sin(∠C).
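A hedged sketch of this calculation, taking AC as the hypotenuse of the right triangle in fig. 4 (an assumption; for the small angle errors involved, using tan instead of sin changes the result only marginally):

```python
import math

def offset_distance(scan_distance_ac, updated_angle_error_deg):
    """Offset AB caused by the residual installation-angle error (angle C)."""
    # AB = AC * sin(angle C), with AC assumed to be the hypotenuse.
    return scan_distance_ac * math.sin(math.radians(updated_angle_error_deg))
```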
And determining the installation angle deviation value of the laser sensor according to the deviation distance.
The mounting angle deviation value may be a deviation value due to a mounting angle when the laser sensor is mounted.
Specifically, after the offset distance AB is determined, the coordinates of the point B can be derived, and the mounting angle deviation value of the laser sensor can be determined according to the offset distance AB.
And adjusting the installation angle of the laser sensor according to the installation angle deviation value of the laser sensor.
Specifically, after the installation angle deviation value of the laser sensor is determined, the installation angle of the laser sensor is adjusted according to the installation angle deviation value of the laser sensor. For example, if the installation angle deviation value of the laser sensor is a negative value, the angle corresponding to the installation angle deviation value is adjusted rightward; and if the mounting angle deviation value of the laser sensor is a positive value, adjusting the angle corresponding to the mounting angle deviation value leftwards.
According to the technical scheme of the embodiment of the invention, the laser sensor is first adjusted, through at least one manual adjustment (that is, the user manually adjusts the installation angle of the laser sensor according to the adjustment instruction), to the best precision a human operator can achieve. Compensation is then performed in software: the updated included angle error between the adjusted camera and the laser sensor is acquired, the scanning distance corresponding to the laser sensor is acquired, the offset distance is determined from the scanning distance and the updated included angle error, the installation angle deviation value of the laser sensor is determined from the offset distance, and the installation angle of the laser sensor is adjusted according to that deviation value. In other words, the installation angle of the laser sensor is adjusted at least twice, so the adjustment is more reasonable and accurate and the calibration accuracy is improved.
Optionally, fig. 5 is a schematic structural diagram of a calibration board in an embodiment of the present invention, and fig. 6 is a top view of the calibration board in the embodiment of the present invention. As shown in fig. 5, the convex calibration block 111 is disposed right above the concave calibration block 112, and as shown in fig. 6, the concave lowest point M of the concave calibration block 112 is coaxial with the center point of the convex calibration block 111, the convex calibration block 111 is a rectangular calibration block, and the concave calibration block 112 is a conical calibration block. Wherein, the point M is the lowest point of the depression calibration block 112, and the points a, b, c and d are the four vertexes of the protrusion calibration block 111.
Correspondingly, driving the camera to move through the mechanical arm, and acquiring the position information of the camera when the center point of the second image captured by the camera coincides with the recessed lowest point of the recessed calibration block, includes:
and in the process that the mechanical arm drives the camera to move, shooting a second image containing the convex calibration block and the concave calibration block by the camera.
Specifically, the camera is driven to move through the mechanical arm, and in the process that the mechanical arm drives the camera to move, a second image containing the convex calibration block and the concave calibration block is shot through the camera.
Four vertex coordinates of the convex calibration block are determined from the second image.
Here, the four vertex coordinates may be coordinates of four vertices of the protrusion calibration block 111 in fig. 6, that is, a point a, a point b, a point c, and a point d.
Specifically, after a second image containing the convex calibration block and the concave calibration block is shot by the camera, the second image is processed by the image processing software to determine four vertex coordinates of the convex calibration block. In actual operation, the error of the coordinates of the four vertices of the convex calibration block may be within 2 pixels.
The first movement distance is determined from the four vertex coordinates of the bump calibration block and the four vertex coordinates of the second image.
It should be explained that the first moving distance may be a distance that moves the camera from the current position to a position where the center of the image captured by the camera coincides with the center of the convex calibration block.
Specifically, the distance between each vertex of the convex calibration block and each vertex of the second image is determined according to the coordinates of the four vertices of the convex calibration block and the coordinates of the four vertices of the second image, and the first moving distance of the camera is determined, so that the camera moves to a position where the center of the image shot by the camera coincides with the center of the convex calibration block, that is, the distances between each vertex of the convex calibration block and each vertex of the second image are the same.
And controlling the mechanical arm to drive the camera to move to the target position according to the first moving distance.
The target position may be a position where the center of the image captured by the camera coincides with the center of the convex calibration block, that is, a position where the distance between each vertex of the convex calibration block and each vertex of the second image is the same.
Specifically, the mechanical arm is controlled to drive the camera to move a first moving distance to a target position, so that the camera moves to a position where the center of an image shot by the camera coincides with the center of the convex calibration block, namely, the distance between each vertex of the convex calibration block and each vertex of the second image is the same.
When the camera is located at the target position, if the central point of the second image shot by the camera is coincided with the central point of the convex calibration block, the position information of the camera is obtained.
Specifically, when the camera is located at the target position, if the center point of the second image captured by the camera coincides with the center point of the convex calibration block, the position information of the camera is acquired. Because the recessed lowest point of the recessed calibration block coincides with the center point of the convex calibration block, the center point of the second image captured by the camera also coincides with the recessed lowest point of the recessed calibration block at this moment.
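As an illustrative sketch of determining the first moving distance from the four vertex coordinates, assuming the move is first computed in image pixels and then converted to the mechanical arm's units with a known pixel pitch; names are assumptions.

```python
import numpy as np

def first_moving_distance(block_vertices, image_size):
    """Camera move (dx, dy), in pixels, that brings the image center onto the
    center of the convex calibration block.

    block_vertices : (4, 2) array with the coordinates of points a, b, c, d
    image_size     : (width, height) of the second image
    """
    block_center = np.mean(np.asarray(block_vertices, dtype=float), axis=0)
    image_center = np.array([image_size[0] / 2.0, image_size[1] / 2.0])
    return block_center - image_center
```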
Optionally, the camera is driven to move by the mechanical arm, and when a central point of a second image shot by the camera coincides with the lowest point of the depression calibration block, position information of the camera is acquired, including:
and in the process that the mechanical arm drives the camera to move, shooting a second image containing the concave calibration block by the camera.
Specifically, the camera is driven to move through the mechanical arm, and in the process that the mechanical arm drives the camera to move, a second image containing the concave calibration block is shot through the camera. In the actual operation process, if the second image shot by the camera does not contain the recess calibration block, the second image is considered invalid.
And dividing the second image to obtain a preset number of regions.
Exemplarily, fig. 7 is a schematic diagram of dividing the second image in the embodiment of the present invention. As shown in fig. 7, in this embodiment, the second image may be a rectangle, and the connection line between the middle points of the opposite sides of the second image may be a connection line between two sets of middle points of the opposite sides of the rectangle, that is, two broken lines in fig. 7, where the preset number may be 4, and specifically, the second image may be divided according to the connection line between the middle points of the opposite sides of the second image, so as to obtain 4 regions.
The first movement distance is determined based on the area of the upper surface of the recessed calibration block in each region.
For example, the upper surface of the recessed calibration block may be circular. In the actual operation process, when the second image is divided along the lines connecting the midpoints of its opposite sides, the upper surface of the recessed calibration block is divided as well.
Specifically, the area of the upper surface of the indentation calibration block in each region is calculated, and the first movement distance is determined according to the area of the upper surface of the indentation calibration block in each region. The first moving distance may be a distance that moves the camera from the current position to a position where the center of the image captured by the camera coincides with the center of the recessed calibration block, that is, a position where the areas of the upper surfaces of the recessed calibration blocks in each region are the same.
And controlling the mechanical arm to drive the camera to move to the target position according to the first moving distance.
Specifically, the mechanical arm is controlled to drive the camera to move a first moving distance to a target position, so that the camera moves to a position where the center of an image shot by the camera coincides with the center of the concave calibration block, namely, a position where the areas of the upper surfaces of the concave calibration blocks in each region are the same.
When the camera is at the target position, if the areas of the upper surfaces of the recessed calibration blocks in the respective regions are the same, the position information of the camera is acquired.
Specifically, when the camera is located at the target position, if the areas of the upper surface of the recessed calibration block in the respective regions are the same, that is, the center point of the second image captured by the camera coincides with the center point of the recessed calibration block (and, since the two blocks are coaxial, with its recessed lowest point), the position information of the camera is acquired.
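A hedged sketch of the four-region area comparison, assuming the upper surface of the recessed calibration block has already been segmented into a binary mask of the second image; the step-direction convention and names are illustrative.

```python
import numpy as np

def quadrant_move_direction(surface_mask):
    """Compare the upper-surface area in the four regions of the second image
    and return a unit step direction (dx, dy); (0, 0) means the areas match."""
    h, w = surface_mask.shape
    q_tl = surface_mask[:h // 2, :w // 2].sum()   # top-left region
    q_tr = surface_mask[:h // 2, w // 2:].sum()   # top-right region
    q_bl = surface_mask[h // 2:, :w // 2].sum()   # bottom-left region
    q_br = surface_mask[h // 2:, w // 2:].sum()   # bottom-right region

    # Step toward the half of the image holding the larger area until all
    # four region areas agree.
    dx = int(np.sign((q_tr + q_br) - (q_tl + q_bl)))
    dy = int(np.sign((q_bl + q_br) - (q_tl + q_tr)))
    return dx, dy
```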
According to the technical scheme of the embodiment of the invention, only a small amount of manual work is needed; fine correction and adjustment are calculated automatically by software, which reduces the manual workload, reduces the offset error caused by manual calibration, and improves working efficiency. The calibration procedure is relatively simple, so even non-technical personnel can easily calibrate the camera and the laser sensor, lowering the technical requirements on operators and saving labor costs. At the same time, introducing quantitative measurement into the calibration process ensures the consistency of equipment leaving the factory and guarantees product quality.
Example two
Fig. 8 is a schematic structural diagram of a calibration device in an embodiment of the present invention. The present embodiment may be applicable to the calibration, and the apparatus may be implemented in a software and/or hardware manner, and the apparatus may be integrated into any device providing the calibration function, as shown in fig. 8, where the calibration apparatus specifically includes: a processing module 301, a first obtaining module 302, a second obtaining module 303, a first determining module 304, a second determining module 305, and a level calibration module 306.
The processing module 301 is configured to drive a camera to move above a raised calibration block through a mechanical arm, and capture a first image of the raised calibration block through the camera;
a first obtaining module 302, configured to obtain, according to the first image, a first included angle between a target edge of the protrusion calibration block and a first image boundary, where the first image boundary is any boundary of the first image;
a second obtaining module 303, configured to obtain, by using a laser sensor, a target height map corresponding to the first image;
a first determining module 304, configured to determine, according to the target height map, a second included angle between a target edge of the protrusion calibration block and a boundary of the target height map;
a second determining module 305, configured to determine an angle error between the camera and the laser sensor according to the first angle and the second angle;
and a horizontal calibration module 306, configured to perform horizontal calibration on the camera and the laser sensor according to the included angle error.
Optionally, the calibration plate further includes: a recessed calibration block;
the device further comprises:
the third acquisition module is used for driving the camera to move through the mechanical arm until the central point of a second image shot by the camera is superposed with the lowest point of the depression calibration block, and acquiring the position information of the camera;
the fourth acquisition module is used for driving the laser sensor to move through the mechanical arm until the central point of a laser line of the laser sensor coincides with the lowest point of the depression calibration block, and acquiring the position information of the laser sensor;
and the offset calibration module is used for calibrating the offset of the camera and the laser sensor according to the position information of the camera and the position information of the laser sensor.
Optionally, the second obtaining module 303 includes:
the first determining unit is used for determining a target scanning area according to the first image, wherein the first image is obtained by shooting the target scanning area through a camera;
and the first acquisition unit is used for acquiring a target height map corresponding to the target scanning area through the laser sensor.
Optionally, the first determining module 304 includes:
the filtering processing unit is used for carrying out filtering processing on the target height map to obtain a filtered target height map;
a binarization processing unit, configured to perform binarization processing on the filtered target height map to obtain a third image;
and the second acquisition unit is used for acquiring a second included angle between the target edge of the convex calibration block and the boundary of the third image according to the third image, wherein the boundary of the third image is any boundary of the third image.
Optionally, the apparatus further comprises:
and the adjusting module is used for adjusting the installation angle of the laser sensor according to the included angle error of the camera and the laser sensor after the camera and the laser sensor are horizontally calibrated according to the included angle error.
Optionally, the adjusting module includes:
the third acquisition unit is used for acquiring the updated included angle error of the adjusted camera and the laser sensor after the installation angle of the laser sensor is adjusted according to the included angle error of the camera and the laser sensor;
the fourth acquisition unit is used for acquiring the scanning distance corresponding to the laser sensor after the installation angle of the laser sensor is adjusted according to the included angle error between the camera and the laser sensor;
the second determining unit is used for determining an offset distance according to the scanning distance corresponding to the laser sensor and the updated included angle error after the installation angle of the laser sensor is adjusted according to the included angle error between the camera and the laser sensor;
the third determining unit is used for determining the mounting angle deviation value of the laser sensor according to the offset distance after the mounting angle of the laser sensor is adjusted according to the included angle error between the camera and the laser sensor;
and the adjusting unit is used for adjusting the installation angle of the laser sensor according to the installation angle deviation value of the laser sensor after adjusting the installation angle of the laser sensor according to the included angle error between the camera and the laser sensor.
Optionally, the convex calibration block is arranged right above the concave calibration block, the convex calibration block is a rectangular calibration block, and the concave calibration block is a conical calibration block;
correspondingly, the third obtaining module includes:
the first shooting unit is used for shooting a second image containing the convex calibration block and the concave calibration block through the camera in the process that the mechanical arm drives the camera to move;
a fourth determination unit configured to determine four vertex coordinates of the convex calibration block from the second image;
a fifth determining unit configured to determine a first moving distance according to the four vertex coordinates of the convex calibration block and the four vertex coordinates of the second image;
the first control unit is used for controlling the mechanical arm to drive the camera to move to a target position according to the first moving distance;
and the fifth acquisition unit is used for acquiring the position information of the camera if the central point of a second image shot by the camera coincides with the central point of the convex calibration block when the camera is at the target position.
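A minimal sketch of the vertex-based centring performed by the units above is given below; the scale factor mm_per_pixel and the (width, height) ordering of image_size_px are assumptions introduced only for illustration.

```python
import numpy as np

def first_moving_distance(block_vertices_px, image_size_px, mm_per_pixel):
    """Sketch of centring the camera over the convex calibration block.

    block_vertices_px: four (x, y) vertex coordinates of the block in the
    second image; image_size_px: (width, height) of that image; mm_per_pixel
    is an assumed, pre-measured scale factor.
    """
    block_centre = np.mean(np.asarray(block_vertices_px, dtype=float), axis=0)
    image_centre = np.array([image_size_px[0] / 2.0, image_size_px[1] / 2.0])
    # Pixel displacement between the block centre and the image centre,
    # converted into a physical move for the mechanical arm.
    return (block_centre - image_centre) * mm_per_pixel

# After moving by this distance, the central point of the second image should
# coincide with the central point of the convex calibration block, and the
# camera position can be recorded.
```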
Optionally, the third obtaining module includes:
the second shooting unit is used for shooting, through the camera, a second image containing the concave calibration block in the process that the mechanical arm drives the camera to move;
the dividing unit is used for dividing the second image to obtain a preset number of areas;
a sixth determining unit configured to determine the first moving distance based on an area of an upper surface of the concave calibration block in each region;
the second control unit is used for controlling the mechanical arm to drive the camera to move to a target position according to the first moving distance;
a sixth acquisition unit, configured to acquire, when the camera is at the target position, the position information of the camera if the areas of the upper surface of the concave calibration block in the respective regions are the same.
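The region-area balancing performed by the units above might look like the following sketch; splitting the image into four quadrants and feeding the signed area imbalances to a proportional controller are assumptions, since this embodiment only requires a preset number of regions with equal upper-surface areas.

```python
import numpy as np

def quadrant_area_imbalance(upper_surface_mask):
    """Sketch of the region-area balancing step (quadrant split is an assumption).

    upper_surface_mask is a binary array in which pixels belonging to the upper
    surface of the concave calibration block are 1 and all other pixels are 0.
    """
    h, w = upper_surface_mask.shape
    q = {
        "top_left": int(upper_surface_mask[: h // 2, : w // 2].sum()),
        "top_right": int(upper_surface_mask[: h // 2, w // 2:].sum()),
        "bottom_left": int(upper_surface_mask[h // 2:, : w // 2].sum()),
        "bottom_right": int(upper_surface_mask[h // 2:, w // 2:].sum()),
    }
    # Signed area imbalances between right/left and bottom/top halves; a
    # proportional gain (tuned on the rig, an assumption) would turn these into
    # the first moving distance for the mechanical arm.
    dx = (q["top_right"] + q["bottom_right"]) - (q["top_left"] + q["bottom_left"])
    dy = (q["bottom_left"] + q["bottom_right"]) - (q["top_left"] + q["top_right"])
    # When dx == dy == 0, the four areas are equal and the camera position can
    # be recorded.
    return dx, dy
```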
The above apparatus can execute the calibration method provided by any embodiment of the present invention, and has the functional modules corresponding to the executed method as well as its beneficial effects.
EXAMPLE III
FIG. 9 shows a schematic block diagram of an electronic device 40 that may be used to implement embodiments of the present invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 9, the electronic device 40 includes at least one processor 41 and a memory communicatively connected to the at least one processor 41, such as a Read Only Memory (ROM) 42 and a Random Access Memory (RAM) 43, wherein the memory stores a computer program executable by the at least one processor. The processor 41 may perform various appropriate actions and processes according to the computer program stored in the ROM 42 or the computer program loaded from the storage unit 48 into the RAM 43. The RAM 43 may also store various programs and data necessary for the operation of the electronic device 40. The processor 41, the ROM 42, and the RAM 43 are connected to each other via a bus 44. An input/output (I/O) interface 45 is also connected to the bus 44.
A number of components in the electronic device 40 are connected to the I/O interface 45, including: an input unit 46 such as a keyboard, a mouse, etc.; an output unit 47 such as various types of displays, speakers, and the like; a storage unit 48 such as a magnetic disk, an optical disk, or the like; and a communication unit 49 such as a network card, modem, wireless communication transceiver, etc. The communication unit 49 allows the electronic device 40 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
Processor 41 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 41 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, or the like. Processor 41 performs the various methods and processes described above, such as the calibration method:
driving the camera to move above the convex calibration block through a mechanical arm, and shooting a first image of the convex calibration block through the camera;
acquiring a first included angle between a target edge of the convex calibration block and a first image boundary according to the first image, wherein the first image boundary is any boundary of the first image;
acquiring a target height map corresponding to the first image through a laser sensor;
determining a second included angle between the target edge of the convex calibration block and the boundary of the target height map according to the target height map;
determining an included angle error of the camera and the laser sensor according to the first included angle and the second included angle;
and horizontally calibrating the camera and the laser sensor according to the included angle error.
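As an illustrative reading of the last two steps, the sketch below takes the first and second included angles and returns their difference as the included angle error; the sign convention and the normalisation into (-90, 90] degrees are assumptions.

```python
def included_angle_error(first_angle_deg, second_angle_deg):
    """Sketch: the included angle error is taken as the difference between the
    first included angle (camera image) and the second included angle (height
    map); the sign convention is an assumption."""
    error = first_angle_deg - second_angle_deg
    # Normalise into (-90, 90] so that rectangle-fit ambiguities of the block
    # edges do not inflate the reported error.
    while error > 90.0:
        error -= 180.0
    while error <= -90.0:
        error += 180.0
    return error

# Horizontal calibration would then rotate the laser sensor (or compensate in
# software) by -error degrees; this reading of the final step is illustrative.
```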
In some embodiments, the calibration method may be implemented as a computer program tangibly embodied in a computer-readable storage medium, such as storage unit 48. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 40 via the ROM 42 and/or the communication unit 49. When the computer program is loaded into the RAM 43 and executed by the processor 41, one or more steps of the calibration method described above may be performed. Alternatively, in other embodiments, processor 41 may be configured to perform the calibration method by any other suitable means (e.g., by way of firmware).
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for implementing the methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be performed. A computer program may execute entirely on a machine, partly on a machine, as a stand-alone software package partly on a machine and partly on a remote machine, or entirely on a remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, which is a host product in the cloud computing service system and overcomes the drawbacks of difficult management and weak service scalability found in traditional physical hosts and VPS (Virtual Private Server) services.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present invention may be executed in parallel, sequentially, or in different orders, and are not limited herein as long as the desired results of the technical solution of the present invention can be achieved.
The above-described embodiments should not be construed as limiting the scope of the invention. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A calibration method, characterized by being applied to a calibration system, wherein the calibration system comprises: a calibration plate, a first device, and a mechanical arm that drives the first device to move; the first device comprises a camera and a laser sensor, and the calibration plate comprises a convex calibration block; the calibration method comprises the following steps:
driving the camera to move above the convex calibration block through a mechanical arm, and shooting a first image of the convex calibration block through the camera;
acquiring a first included angle between a target edge of the convex calibration block and a first image boundary according to the first image, wherein the first image boundary is any boundary of the first image, the target edge is any edge of the convex calibration block in the first image, the first included angle is an included angle between the target edge of the convex calibration block and the first image boundary, and the target edge of the convex calibration block corresponds one-to-one to the first image boundary;
acquiring a target height map corresponding to the first image through a laser sensor;
determining a second included angle between the target edge of the convex calibration block and the boundary of the target height map according to the target height map;
determining an included angle error between the camera and the laser sensor according to the first included angle and the second included angle;
and horizontally calibrating the camera and the laser sensor according to the included angle error.
2. The method of claim 1, wherein the calibration plate further comprises: a concave calibration block;
the method further comprises the following steps:
driving the camera to move through the mechanical arm until the central point of a second image shot by the camera coincides with the lowest point of the concave calibration block, and acquiring the position information of the camera;
driving the laser sensor to move through the mechanical arm until the central point of a laser line of the laser sensor coincides with the lowest point of the concave calibration block, and acquiring the position information of the laser sensor;
and carrying out offset calibration on the camera and the laser sensor according to the position information of the camera and the position information of the laser sensor.
3. The method of claim 1, wherein obtaining the target height map corresponding to the first image with a laser sensor comprises:
determining a target scanning area according to the first image, wherein the first image is an image obtained by shooting the target scanning area through a camera;
and acquiring a target height map corresponding to the target scanning area through the laser sensor.
4. The method of claim 1, further comprising, after calibrating the camera and the laser sensor horizontally according to the angle error:
and adjusting the installation angle of the laser sensor according to the included angle error of the camera and the laser sensor.
5. The method of claim 4, further comprising, after adjusting the installation angle of the laser sensor according to an angle error between the camera and the laser sensor:
acquiring an updated included angle error of the adjusted camera and the adjusted laser sensor;
acquiring a scanning distance corresponding to the laser sensor;
determining an offset distance according to the scanning distance corresponding to the laser sensor and the updated included angle error;
determining a mounting angle deviation value of the laser sensor according to the deviation distance;
and adjusting the installation angle of the laser sensor according to the installation angle deviation value of the laser sensor.
6. The method of claim 2, wherein the convex calibration block is disposed directly above the concave calibration block, the convex calibration block being a rectangular calibration block and the concave calibration block being a conical calibration block;
correspondingly, the driving the camera to move through the mechanical arm until the central point of the second image shot by the camera coincides with the lowest point of the depression of the concave calibration block, and acquiring the position information of the camera, comprises:
in the process that the mechanical arm drives the camera to move, shooting a second image containing the convex calibration block and the concave calibration block by the camera;
determining four vertex coordinates of the convex calibration block according to the second image;
determining a first moving distance according to the four vertex coordinates of the convex calibration block and the four vertex coordinates of the second image;
controlling the mechanical arm to drive the camera to move to a target position according to the first moving distance;
and when the camera is positioned at the target position, if the central point of a second image shot by the camera coincides with the central point of the convex calibration block, acquiring the position information of the camera.
7. The method of claim 2, wherein driving the camera to move through the mechanical arm until the central point of a second image captured by the camera coincides with the lowest point of the depression of the concave calibration block, and acquiring the position information of the camera, comprises:
in the process that the mechanical arm drives the camera to move, shooting a second image containing the concave calibration block through the camera;
dividing the second image to obtain a preset number of regions;
determining a first moving distance according to an area of an upper surface of the concave calibration block in each region;
controlling the mechanical arm to drive the camera to move to a target position according to the first moving distance;
when the camera is at the target position, if the areas of the upper surface of the concave calibration block in the respective regions are the same, acquiring the position information of the camera.
8. A calibration device, comprising:
the processing module is used for driving the camera to move above the convex calibration block through the mechanical arm and shooting a first image of the convex calibration block through the camera;
a first acquisition module, configured to acquire, according to the first image, a first included angle between a target edge of the convex calibration block and a first image boundary, wherein the first image boundary is any boundary of the first image, the target edge is any edge of the convex calibration block in the first image, the first included angle is an included angle between the target edge of the convex calibration block and the first image boundary, and the target edge of the convex calibration block corresponds one-to-one to the first image boundary;
the second acquisition module is used for acquiring a target height map corresponding to the first image through a laser sensor;
the first determining module is used for determining a second included angle between the target edge of the convex calibration block and the boundary of the target height map according to the target height map;
the second determining module is used for determining the included angle error of the camera and the laser sensor according to the first included angle and the second included angle;
and the horizontal calibration module is used for horizontally calibrating the camera and the laser sensor according to the included angle error.
9. A terminal device, comprising:
one or more processors;
a memory for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the calibration method as defined in any one of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the calibration method as set forth in any one of claims 1-7.
CN202211036717.2A 2022-08-29 2022-08-29 Calibration method, device, equipment and storage medium Active CN115100299B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211036717.2A CN115100299B (en) 2022-08-29 2022-08-29 Calibration method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211036717.2A CN115100299B (en) 2022-08-29 2022-08-29 Calibration method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN115100299A CN115100299A (en) 2022-09-23
CN115100299B true CN115100299B (en) 2023-02-10

Family

ID=83301394

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211036717.2A Active CN115100299B (en) 2022-08-29 2022-08-29 Calibration method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115100299B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117119324B (en) * 2023-08-24 2024-03-08 合肥埃科光电科技股份有限公司 Multi-area array sensor camera and installation position adjusting method and device thereof

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2741049A1 (en) * 2012-12-05 2014-06-11 Leica Geosystems AG Test device for the horizontal dispersion of a laser beam and corresponding method
CN111435162B (en) * 2020-03-03 2021-10-08 深圳市镭神智能系统有限公司 Laser radar and camera synchronization method, device, equipment and storage medium
CN112017205B (en) * 2020-07-27 2021-06-25 清华大学 Automatic calibration method and system for space positions of laser radar and camera sensor

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109270534A (en) * 2018-05-07 2019-01-25 西安交通大学 A kind of intelligent vehicle laser sensor and camera online calibration method
CN111376841A (en) * 2018-12-28 2020-07-07 北汽福田汽车股份有限公司 Calibration method and device of lane departure system and storage medium
CN112819896A (en) * 2019-11-18 2021-05-18 商汤集团有限公司 Calibration method and device of sensor, storage medium and calibration system
CN111300481A (en) * 2019-12-11 2020-06-19 苏州大学 Robot grabbing pose correction method based on vision and laser sensor
CN114200430A (en) * 2021-12-10 2022-03-18 上海西井信息科技有限公司 Calibration method, system, equipment and storage medium for laser radar and camera

Also Published As

Publication number Publication date
CN115100299A (en) 2022-09-23

Similar Documents

Publication Publication Date Title
CN108537834B (en) Volume measurement method and system based on depth image and depth camera
CN105835507A (en) Method for attaching mobile phone cover lens to liquid crystal display
CN107345789A (en) A kind of pcb board hole location detecting device and method
CN115100299B (en) Calibration method, device, equipment and storage medium
CN114913121A (en) Screen defect detection system and method, electronic device and readable storage medium
CN110146017A (en) Industrial robot repetitive positioning accuracy measurement method
CN111311671A (en) Workpiece measuring method and device, electronic equipment and storage medium
US8274597B2 (en) System and method for measuring a border of an image of an object
CN113172636A (en) Automatic hand-eye calibration method and device and storage medium
CN116208853A (en) Focusing angle determining method, device, equipment and storage medium
CN111475016A (en) Assembly process geometric parameter self-adaptive measurement system and method based on computer vision
CN112785650A (en) Camera parameter calibration method and device
CN113470103B (en) Method and device for determining camera acting distance in vehicle-road cooperation and road side equipment
CN114926545A (en) Camera calibration precision evaluation method and device, electronic equipment and storage medium
CN114862963A (en) Bonding positioning method, device, equipment and storage medium
CN117115233B (en) Dimension measurement method and device based on machine vision and electronic equipment
CN117173156B (en) Pole piece burr detection method, device, equipment and medium based on machine vision
CN114943769B (en) Positioning method, positioning device, electronic equipment and medium
CN115877401B (en) Posture detection method, device and equipment for hydraulic support and storage medium
CN115631249B (en) Camera correction method, device, equipment and storage medium
CN117078908A (en) Label positioning method, system, device, equipment and storage medium
CN114043531A (en) Table top inclination angle determination method, table top inclination angle use method, table top inclination angle determination device, robot and storage medium
CN115861443A (en) Multi-camera internal reference calibration method and device, electronic equipment and storage medium
CN115526918A (en) Structural deformation detection method, system, device, electronic equipment and medium
CN115824285A (en) Sensor position calibration method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant