CN220170495U - Calibration system of equipment - Google Patents

Calibration system of equipment

Info

Publication number: CN220170495U
Application number: CN202321091245.0U
Authority: CN (China)
Prior art keywords: calibrated, calibration, equipment, calibration plate, camera
Legal status: Active (the legal status is an assumption, not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 韦盛斌, 王进
Current and original assignee: Rainbow Software Co ltd (listed assignees may be inaccurate)
Application filed by Rainbow Software Co ltd
Priority to CN202321091245.0U
Publication of CN220170495U
Classification: Length Measuring Devices By Optical Means (AREA)

Abstract

The utility model discloses a calibration system for a device. The calibration system comprises a control unit, a device to be calibrated, and a calibration plate, wherein the device to be calibrated comprises an image acquisition device and an inertial measurement unit. The image acquisition device collects image data of the calibration plate within a target space range; the inertial measurement unit, deployed in the device to be calibrated, collects spatial data of the device's movement within the target space range; and the control unit, connected with the image acquisition device and the inertial measurement unit, converts the image data and the spatial data into calibration parameters of the device to be calibrated. The utility model improves the accuracy of the calibration parameters of the device to be calibrated and solves the technical problem of inaccurate device parameter calibration in the related art.

Description

Calibration system of equipment
Technical Field
The utility model relates to the technical field of calibration, and in particular to a calibration system for a device.
Background
With the rise of the "metaverse" concept, demand for Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR/XR) devices continues to grow. These devices are typically equipped with multiple cameras and inertial sensors (Inertial Measurement Units, IMUs) for scene perception, device positioning, virtual-real fusion, user interaction, and so on. To avoid distortion in camera imaging, the parameters of the cameras and the IMU fitted to such a device are typically calibrated in advance.
In the related art, joint multi-camera and IMU calibration usually uses a flat plate bearing a pattern array with fixed spacing as the calibration plate. However, such a calibration plate has weak resistance to blurring: while the device moves quickly, the images of the calibration plate acquired by a camera suffer motion blur, and if the camera uses a fisheye lens, corner detection at the image edges easily fails. As a result, the calibration of the camera's intrinsic parameters, particularly the distortion parameters, is inaccurate, which affects the accuracy of the camera's output.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
The utility model mainly aims to provide a calibration system for a device that at least solves the technical problem of inaccurate device parameter calibration in the related art.
To achieve the above object, according to one aspect of the present utility model, there is provided a calibration system for a device. The calibration system comprises a control unit, a device to be calibrated, and a calibration plate, wherein the device to be calibrated comprises an image acquisition device and an inertial measurement unit. The image acquisition device collects image data of the calibration plate within a target space range; the inertial measurement unit, deployed in the device to be calibrated, collects spatial data of the device's movement within the target space range; and the control unit, connected with the image acquisition device and the inertial measurement unit, converts the image data and the spatial data into calibration parameters of the device to be calibrated.
Optionally, the calibration system further comprises a mechanical arm, one end of which is connected with the control unit and the other end of which is fixed to the device to be calibrated, for controlling the movement of the device to be calibrated within the target space range.
Optionally, the calibration plate is disposed within the target space range.
Optionally, the calibration plate includes normal feature points and marker feature points, where the marker information of the normal feature points differs from that of the marker feature points in at least one of: marker structure, marker color, marker gray value, and marker shape.
Optionally, the calibration plate comprises an outward facing camera calibration plate and an eye movement camera calibration plate.
Optionally, the image acquisition device comprises: an outward-facing camera arranged on the outer side of the device to be calibrated, for collecting image data of the outward-facing camera calibration plate; and an eye movement camera arranged on the inner side of the device to be calibrated, for collecting image data of the eye movement camera calibration plate.
Optionally, the target space range includes a first target space range, in which the outward-facing camera and the inertial measurement unit are jointly calibrated, and a second target space range, in which the eye movement camera and the inertial measurement unit are jointly calibrated, wherein at least one outward-facing camera calibration plate is deployed within the first target space range, and at least one outward-facing camera calibration plate and at least one eye movement camera calibration plate are deployed within the second target space range.
Optionally, the outward facing camera calibration plate and the eye movement camera calibration plate within the second target space range are arranged in parallel, and the calibration plate patterns are arranged opposite to each other.
Optionally, within the first target space range, the included angle between the outward-facing camera and the outward-facing camera calibration plate lies within a target threshold range; within the second target space range, the outward-facing camera is arranged opposite the outward-facing camera calibration plate and the eye movement camera is arranged opposite the eye movement camera calibration plate.
Optionally, the control unit is connected with the equipment to be calibrated through a data line or a wireless network.
In the present utility model, the calibration system of a device comprises a control unit, a device to be calibrated, and a calibration plate, wherein the device to be calibrated comprises an image acquisition device and an inertial measurement unit. The image acquisition device collects image data of the calibration plate within a target space range; the inertial measurement unit, deployed in the device to be calibrated, collects spatial data of the device's movement within the target space range; and the control unit, connected with the image acquisition device and the inertial measurement unit, converts the image data and the spatial data into calibration parameters of the device to be calibrated. That is, a calibration plate is disposed within the target space range; the image acquisition device of the device to be calibrated acquires image data of that plate, and the inertial measurement unit acquires spatial data of the device's movement within the range. The control unit receives both the image data and the spatial data and converts them into calibration parameters of the device to be calibrated. This improves both the efficiency of determining the calibration parameters and their accuracy, thereby achieving the technical effect of more accurate calibration parameters and solving the technical problem of inaccurate device parameter calibration in the related art.
Drawings
The accompanying drawings, which are included to provide a further understanding of the utility model and are incorporated in and constitute a part of this specification, illustrate embodiments of the utility model and together with the description serve to explain the utility model. In the drawings:
FIG. 1 is a schematic illustration of a calibration system of an apparatus according to the present utility model;
FIG. 2 is a schematic illustration of an apparatus to be calibrated according to the present utility model;
FIG. 3 is a schematic diagram of a calibration system of an apparatus according to the present utility model;
FIG. 4 is a schematic illustration of an outward facing camera calibration plate and an eye movement camera calibration plate arrangement within a second target spatial range in accordance with the present utility model;
FIG. 5 is a schematic illustration of a marker feature point according to the present utility model;
fig. 6 is a schematic illustration of an outward facing camera and an outward facing camera calibration plate within a first target spatial range according to the present utility model.
Detailed Description
It should be noted that, without conflict, the embodiments of the present utility model and features of the embodiments may be combined with each other. The utility model will be described in detail below with reference to the drawings in connection with embodiments.
In order that those skilled in the art will better understand the present utility model, the technical solution in the embodiments of the present utility model will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, embodiments of the present utility model. All other embodiments obtained by those skilled in the art based on the embodiments of the present utility model without inventive effort shall fall within the scope of the present utility model.
It should be noted that the terms "first," "second," and the like in the description, the claims, and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that data so labeled may be interchanged where appropriate, so that the embodiments of the utility model described herein can be implemented in orders other than those illustrated. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a system, article, or apparatus that comprises a list of elements is not necessarily limited to those elements but may include other elements not expressly listed or inherent to such an article or apparatus.
FIG. 1 is a schematic diagram of a calibration system for an apparatus according to the present utility model. As shown in fig. 1, the calibration system 100 includes a control unit 10, a device to be calibrated 20, and a calibration plate 30, wherein the device to be calibrated 20 includes an image acquisition device 201 and an inertial measurement unit 202.
The image acquisition device 201 is used for acquiring image data of the calibration plate within the target space range.
the inertial measurement unit 202 is disposed in the device to be calibrated, and is configured to collect spatial data of movement of the device to be calibrated within the target spatial range, where the spatial data includes gesture data and speed data of movement of the device to be calibrated within the target spatial range.
The control unit 10 is connected with the image acquisition device 201 and the inertial measurement unit 202 and is configured to convert the image data and the spatial data into calibration parameters of the device to be calibrated. The control unit 10 is connected with the device to be calibrated 20 through a data line or a wireless network, and may be a computer, a tablet, or another device.
The calibration system of the device further comprises a mechanical arm 40, as shown in fig. 1, one end of the mechanical arm 40 is connected with the control unit 10, the other end of the mechanical arm 40 is connected with the device 20 to be calibrated, the device 20 to be calibrated is fixed on a fixture of the mechanical arm 40, and the control unit 10 can control the mechanical arm 40 to drive the device 20 to be calibrated to move in a target space range so as to collect calibration data.
In this embodiment, the device to be calibrated may be any device with both a camera and an IMU, such as AR/VR/MR/XR glasses, a cell phone, a robot, or a drone.
In this embodiment, when calibration data are acquired with the device to be calibrated, the mechanical arm is controlled to move with six degrees of freedom within the target space range, driving the device to be calibrated through the range and collecting the calibration data. The calibration data comprise image data acquired by the plurality of outward-facing cameras and spatial data acquired by the inertial measurement unit, where the spatial data at least comprise pose data and velocity data of the device's movement within the target space range.
Fig. 2 is a schematic diagram of an apparatus to be calibrated according to the present utility model, as shown in fig. 2, an image capturing apparatus 201 of the apparatus to be calibrated 20 includes a plurality of outward facing cameras 2011 and a plurality of eye movement cameras 2012, wherein the plurality of outward facing cameras 2011 are disposed outside the apparatus to be calibrated and are used for capturing image data of an outward facing camera calibration plate; the plurality of eye movement cameras 2012 are disposed at the inner side of the device to be calibrated and are used for collecting image data of the eye movement camera calibration plate.
In this embodiment, the target spatial range may include a first target spatial range within which the at least one calibration plate is disposed and a second target spatial range within which the at least one outward facing camera calibration plate and the at least one eye movement camera calibration plate are disposed.
For example, there is usually little overlap between the fields of view of the multiple outward-facing cameras of the device to be calibrated; that is, the differences in orientation between the outward-facing cameras are large. Consequently, when the device undergoes large rotations and translations, the included angle between a camera's optical axis and the normal of the calibration plate easily becomes large, so that the marker feature points appear at the image edge or even move out of the image. To ensure that all outward-facing cameras can capture the marker feature points, the calibration plate deployed within the first target space range is therefore usually required to be sufficiently large, with the marker feature points arranged in its central area, so that the image data collected by each camera contains the complete marker feature points and false or missed detections are avoided. Alternatively, several calibration plates may be arranged at suitable angles so that all outward-facing cameras can capture the marker feature points on them; the number of calibration plates deployed within the first target space range is not limited.
Fig. 3 is a schematic view of a calibration system of an apparatus according to the present utility model, showing a case in which two calibration plates are deployed within the first target space range: calibration plates 301 and 302, arranged at a certain angle to each other. When the control unit 10 controls the mechanical arm 40 to drive the device to be calibrated 20 through the first target space range, the plurality of outward-facing cameras 2011 and the inertial measurement unit 202 of the device 20 are switched on: the outward-facing cameras 2011 acquire image data of the calibration plates 301 and 302 during the movement, and the inertial measurement unit 202 acquires spatial data of the device 20, comprising pose data and velocity data of its movement within the target space range.
As shown in fig. 3, one outward-facing camera calibration plate 303 and one eye-movement camera calibration plate 304 are deployed within the second target space range. The control unit 10 may control the mechanical arm 40 to move the device to be calibrated 20 into the second target space range; once there, any one of the outward-facing cameras 2011 and any one of the eye-movement cameras 2012 of the device 20 are switched on, with the outward-facing camera 2011 collecting image data of the outward-facing camera calibration plate 303 and the eye-movement camera 2012 collecting image data of the eye-movement camera calibration plate 304. As shown in fig. 3, within the second target space range the outward-facing camera 2011 is arranged opposite the outward-facing camera calibration plate 303 and the eye-movement camera 2012 is arranged opposite the eye-movement camera calibration plate 304.
It should be noted that the data collected while the mechanical arm 40 drives the device to be calibrated 20 within the first target space range are used to jointly calibrate the outward-facing camera and the IMU, and the data collected after the mechanical arm 40 moves the device 20 into the second target space range are used to jointly calibrate the eye movement camera and the outward-facing camera.
In this embodiment, within the second target space range, the feature-point face of the outward-facing camera calibration plate and the feature-point face of the eye movement camera calibration plate are arranged face to face, while the outward-facing camera and the eye movement camera of the device to be calibrated point in opposite directions. The device to be calibrated can therefore be fixed on the fixture of the mechanical arm and moved between the two plates, so that the outward-facing camera collects image data of the outward-facing camera calibration plate while the eye movement camera simultaneously collects image data of the eye movement camera calibration plate. That is, the calibration data can include both sets of image data.
When calibration data are to be acquired, the control unit first sends a data acquisition instruction to the device to be calibrated, which switches on the data acquisition function of the outward-facing camera, the eye movement camera, or the inertial measurement unit. The control unit then controls the mechanical arm to drive the device to be calibrated through the target space range so as to collect the calibration data. The device to be calibrated may transmit the collected calibration data to the control unit after acquisition is complete, or in real time; the control unit receives the calibration data sent by the device and determines the device's calibration parameters based on them.
In this embodiment, the first target space range may further include a plurality of waypoints, and the control unit may control the mechanical arm to drive the device to be calibrated from an initial position through the preset waypoints to a final position. During the movement, the outward-facing camera acquires image data of the calibration plate within the first target space range and the inertial measurement unit acquires spatial data of the movement; the device to be calibrated then transmits both to the control unit. On receiving the image data, the control unit inputs it into a visual detection module, which detects the index numbers of the marker feature points of the calibration plate appearing in the image data. From an index number, the three-dimensional coordinates of the corresponding marker feature point in the real world can be determined, and from the position of the feature point in the image data, its image coordinates in the image coordinate system can be determined. The image coordinates and three-dimensional coordinates of each marker feature point are combined into one feature point pair, yielding a plurality of feature point pairs.
After the plurality of feature point pairs are obtained, they form a point pair set. The point pair set, together with the spatial data collected while the inertial measurement unit moved along the route formed by the waypoints, is input into a parameter optimization module to determine the intrinsic and extrinsic parameters of the outward-facing camera and of the inertial measurement unit.
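The parameter optimization module itself is not detailed in the patent; such modules typically minimize a reprojection error over the point pair set. The sketch below is a minimal stand-in under assumed names: it projects a 3-D world point through a bare pinhole model (no rotation or distortion, which a real calibrator would include) and scores a point pair set against it:

```python
def project(point3d, fx, fy, cx, cy, cam_t):
    """Project a 3-D world point through a simple pinhole model.

    fx, fy are focal lengths in pixels, (cx, cy) the principal point,
    and cam_t the camera position (tx, ty, tz). Illustrative only.
    """
    x = point3d[0] - cam_t[0]
    y = point3d[1] - cam_t[1]
    z = point3d[2] - cam_t[2]
    return (fx * x / z + cx, fy * y / z + cy)

def reprojection_error(point_pairs, fx, fy, cx, cy, cam_t):
    """Mean squared pixel distance between observed and projected points.

    Each pair is ((u, v), (X, Y, Z)): observed image coordinates plus
    the marker feature point's real-world coordinates.
    """
    err = 0.0
    for uv_obs, xyz in point_pairs:
        u, v = project(xyz, fx, fy, cx, cy, cam_t)
        err += (u - uv_obs[0]) ** 2 + (v - uv_obs[1]) ** 2
    return err / len(point_pairs)
```

An optimizer would adjust (fx, fy, cx, cy, cam_t), plus distortion and IMU terms, until this residual is minimized over all collected pairs.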
In this embodiment, after any one of the outward-facing cameras of the device to be calibrated has collected image data of at least one outward-facing camera calibration plate deployed within the second target space range, and any one of the eye movement cameras has collected image data of at least one eye movement camera calibration plate deployed there, the position coordinates of each marker feature point in the three-dimensional space coordinate system and in the image coordinate system can be determined as described above. Each such pair of coordinates forms a feature point pair, and the feature point pairs form a point pair set. The point pair sets corresponding to the feature points in the image data acquired by the outward-facing camera and by the eye movement camera are then separately input into the parameter optimization module for optimization, yielding the calibration parameters of the device to be calibrated, which include the intrinsic and extrinsic parameters of the outward-facing camera and of the eye movement camera.
Fig. 4 is a schematic diagram of an arrangement of an outward-facing camera calibration plate and an eye-movement camera calibration plate in a second target space range according to the present utility model, as shown in fig. 4, the outward-facing camera calibration plate 303 and the eye-movement camera calibration plate 304 in the second target space range are arranged in parallel, and the calibration plate patterns are arranged opposite to each other.
In this embodiment, the calibration plate includes a plurality of feature points divided into normal feature points and marker feature points, where the marker information of the normal feature points differs from that of the marker feature points in at least one of: marker structure, marker color, marker gray value, and marker shape.
For example, the normal feature points on the calibration plate may consist of patterns of identical size at a fixed pitch, while the marker feature points are distinguished from them by their marker information; the marker feature points constitute the marker bits of the calibration plate. For instance, the normal feature points may be black dots of identical size at a fixed pitch, and the marker feature points in the marker bits may be dots of different sizes, dots with rings, dots with squares, or dots with triangles; the color of the marker feature points may also differ from that of the normal feature points. Fig. 5 is a schematic view of marker feature points according to the present utility model; as shown in fig. 5, marker feature points may form a marker bit and may be distinguished from normal feature points by structure, color, shape, or a dot/ring combination.
It should be noted that, provided the image acquisition device can capture the marker feature points on the calibration plate, the number of marker bits on one calibration plate can be chosen according to the camera arrangement and field of view of the actual device to be calibrated. One calibration plate may carry one or more marker bits, but their design must guarantee that the orientation of the calibration plate is unique, i.e. that the placement direction of the plate can be distinguished from the marker bits alone. The utility model does not limit the number of marker-bit groups on the calibration plate.
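The orientation-uniqueness requirement on the marker bits can be checked by verifying that no two of the four 90° rotations of the marker layout coincide. A small sketch follows; representing the layout as a square boolean grid is an assumption, not part of the patent:

```python
def rotations(grid):
    """Yield the four 90-degree rotations of a square grid of 0/1 values."""
    g = [list(row) for row in grid]
    for _ in range(4):
        yield tuple(tuple(row) for row in g)
        g = [list(row) for row in zip(*g[::-1])]  # rotate 90 deg clockwise

def orientation_unique(grid):
    """True if all four rotations of the marker layout are distinct,
    i.e. the marker bits alone determine the plate's placement direction."""
    return len(set(rotations(grid))) == 4
```

A layout with a single off-center marker passes this check; any 4-fold-symmetric layout fails it and could not disambiguate the plate's orientation.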
Fig. 6 is a schematic diagram of an outward-facing camera and an outward-facing camera calibration plate within the first target space range according to the present utility model. As shown in fig. 6, the position of the outward-facing camera calibration plate within the first target space range may be set according to the acquisition range of the outward-facing camera, the placement rule being that the outward-facing camera must be able to capture the marker feature points on the plate.
A certain included angle exists between an outward-facing camera within the first target space range and the normal of the outward-facing camera calibration plate; this angle lies within a target threshold range, and the target threshold can be preset. As described above, at least one calibration plate is deployed within the first target space range for calibrating the intrinsic and extrinsic parameters of the outward-facing camera and of the IMU of the device to be calibrated, and at least one outward-facing camera calibration plate and at least one eye movement camera calibration plate are deployed within the second target space range for calibrating the intrinsic and extrinsic parameters of the outward-facing camera and the eye movement camera. It should be noted that the arrangement may also be swapped: at least one outward-facing camera calibration plate and at least one eye movement camera calibration plate may be deployed within the first target space range, and at least one calibration plate within the second, in which case the former calibrate the outward-facing camera and the eye movement camera and the latter calibrates the outward-facing camera and the IMU. That is, the calibration plates deployed in the two ranges are interchangeable, provided the outward-facing camera, the IMU, and the eye movement camera of the device to be calibrated can all be calibrated.
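The included-angle condition between the outward-facing camera's optical axis and the plate normal reduces to a vector computation. The sketch below shows one way to check it; the function names and the threshold value are hypothetical, since the patent only states that the angle lies within a preset target threshold range:

```python
import math

def axis_plate_angle_deg(optical_axis, plate_normal):
    """Angle in degrees between the camera optical axis and the
    calibration-plate normal, both given as 3-D direction vectors."""
    dot = sum(a * b for a, b in zip(optical_axis, plate_normal))
    na = math.sqrt(sum(a * a for a in optical_axis))
    nb = math.sqrt(sum(b * b for b in plate_normal))
    # clamp guards against rounding error pushing the ratio past +/-1
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nb)))))

def within_target_threshold(angle_deg, threshold_deg):
    """Check the patent's angle condition against a preset threshold."""
    return angle_deg <= threshold_deg
```

A camera looking straight at the plate gives 0 degrees; a 45-degree tilt of the axis relative to the normal gives 45 degrees, which a preset threshold would accept or reject.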
In the present utility model, the calibration system of a device comprises a control unit, a device to be calibrated, and a calibration plate, wherein the device to be calibrated comprises an image acquisition device and an inertial measurement unit. The image acquisition device collects image data of the calibration plate within the target space range; the inertial measurement unit, deployed in the device to be calibrated, collects spatial data of the device's movement within the target space range; and the control unit, connected with the image acquisition device and the inertial measurement unit, converts the image data and the spatial data into calibration parameters of the device to be calibrated. The control unit thus receives both kinds of data and converts them into calibration parameters, improving both the efficiency of determining the calibration parameters and their accuracy, and solving the technical problem of inaccurate device parameter calibration in the related art.
The method for acquiring calibration data by using the present utility model will be described by way of example.
The device to be calibrated includes a first device and a second device, wherein the first device may be an image acquisition device, for example a camera or a video camera, and the second device may be an inertial measurement unit (IMU) or an image acquisition device. On this basis, a target space range can be determined from candidate space ranges according to the kind of the device to be calibrated, wherein at least one calibration plate is deployed in each candidate space range.
In this embodiment, the candidate space ranges include a first target space range, in which at least one first camera calibration plate is deployed, and a second target space range, in which at least one first camera calibration plate and at least one second camera calibration plate are deployed. On this basis, the first target space range may be selected as the target space range when the second device is an inertial measurement unit, and the second target space range may be selected as the target space range when the second device is an image acquisition device.
For example, when the device to be calibrated is a pair of AR/VR/MR/XR glasses, the glasses include a first camera, a second camera and an inertial measurement unit, wherein the first camera may be an outward facing camera and the second camera may be an eye movement camera. On this basis, when the first camera and the inertial measurement unit are calibrated, a candidate space range in which at least one first camera calibration plate is deployed may be taken as the target space range; when the first camera and the second camera are calibrated, a candidate space range in which at least one first camera calibration plate and at least one second camera calibration plate are deployed may be taken as the target space range.
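The selection rule above can be sketched in a few lines of code. This is an illustrative sketch only; the function and label names are hypothetical, not part of the patent.

```python
# Hypothetical sketch of the target-space-range selection described above.
# The device-kind strings and range labels are illustrative.

def select_target_space_range(second_device_kind: str) -> str:
    """Pick the target space range from the two candidates based on the
    kind of the second device in the device to be calibrated."""
    if second_device_kind == "inertial_measurement_unit":
        # Camera + IMU joint calibration: only first-camera plates are needed.
        return "first_target_space_range"
    if second_device_kind == "image_acquisition_device":
        # Camera + camera (e.g. outward + eye movement) calibration: both
        # first-camera and second-camera plates are deployed.
        return "second_target_space_range"
    raise ValueError(f"unknown device kind: {second_device_kind}")
```

For AR/VR/MR/XR glasses, calibrating the outward camera against the IMU would thus select the first range, while calibrating the outward camera against the eye movement camera would select the second.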
Optionally, the device to be calibrated may also be a device with a camera and an IMU, such as a mobile phone, a robot, an unmanned aerial vehicle, etc.
After the target space range is determined, calibration data can be acquired by the device to be calibrated within the target space range, wherein the calibration data includes the image data of the calibration plate acquired by the first device and the optimization data acquired by the second device.
In this embodiment, the device to be calibrated includes a first device and a second device, wherein the first device may be a camera or a video camera, and the second device may be an inertial measurement unit or an image acquisition device. When the second device is an inertial measurement unit, at least one first camera calibration plate is deployed within the target space range. In this case, when acquiring calibration data with the device to be calibrated, the device may be fixed on a fixture of a mechanical arm, and the mechanical arm is controlled to perform six-degree-of-freedom motion within the target space range, thereby driving the device to be calibrated to move within that range and collect calibration data. The calibration data includes the image data collected by the first device and the optimization data collected by the second device; the optimization data includes at least spatial data of the device to be calibrated moving within the target space range, the spatial data including speed data and attitude data of that motion.
Optionally, when the second device is an image acquisition device, at least one first camera calibration plate and at least one second camera calibration plate are deployed within the target space range, wherein the face of the at least one first camera calibration plate containing the mark feature points and the face of the at least one second camera calibration plate containing the mark feature points are disposed opposite each other. In this case, when acquiring calibration data, the device to be calibrated is fixed on a fixture of the mechanical arm, and the mechanical arm is controlled to move the device between the first camera calibration plate and the second camera calibration plate, so that the first device acquires image data of the first camera calibration plate and the second device acquires image data of the second camera calibration plate; that is, the image data of the second camera calibration plate acquired by the second device serves as the optimization data.
Optionally, when acquiring the calibration data, the control unit may first send a data acquisition instruction to the device to be calibrated, wherein the data acquisition instruction is used to control the device to be calibrated to start its data acquisition function, that is, to control the first device and the second device to start acquiring data. The control unit may then control the mechanical arm to drive the device to be calibrated to move within the target space range so as to acquire the calibration data. Afterwards, the collected calibration data can be transmitted to the control unit; alternatively, the device to be calibrated may transmit the collected calibration data to the control unit in real time. The control unit receives the calibration data sent by the device to be calibrated and determines the calibration parameters of the device based on that data.
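The acquisition sequence above — start capture, drive the arm through its trajectory, collect frames and IMU samples, return the data to the control unit — can be sketched as follows. All class, method and attribute names here are hypothetical; the real device and arm interfaces are not specified in the patent.

```python
# Illustrative sketch (hypothetical API) of the acquisition sequence:
# the control unit starts data capture, drives the arm, then gathers data.

class CalibrationSession:
    def __init__(self, device, arm):
        self.device = device   # device to be calibrated (cameras + IMU)
        self.arm = arm         # six-degree-of-freedom mechanical arm

    def acquire(self, trajectory):
        """Run one acquisition pass and return the calibration data."""
        self.device.start_capture()          # the data acquisition instruction
        frames, imu_samples = [], []
        for pose in trajectory:              # six-DOF motion in the range
            self.arm.move_to(pose)
            frames.append(self.device.grab_images())
            imu_samples.extend(self.device.read_imu())
        self.device.stop_capture()
        # Calibration data handed back to the control unit.
        return frames, imu_samples
```

In a real system the capture could equally stream to the control unit while the arm moves, matching the "real time" transmission variant described above.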
In this embodiment, the calibration plate may be a first camera calibration plate or a second camera calibration plate. The calibration plate includes a plurality of feature points, divided into normal feature points and mark feature points, wherein the difference between the mark information of the normal feature points and the mark information of the mark feature points includes at least one of: mark structure, mark color, mark gray value and mark shape.
For example, the normal feature points on the calibration plate may consist of patterns of identical size and fixed pitch, while the mark feature points are distinguished from the normal feature points by their mark information; the mark feature points constitute mark bits on the calibration plate. For instance, the normal feature points may be black dots of identical size and fixed pitch, and the mark feature points in the mark bits may be dots of a different size, dots with rings, dots with squares, or dots with triangles; the color of the mark feature points may also differ from that of the normal feature points. As shown in fig. 5, the mark feature points in the mark bits may thus be distinguished from the normal feature points by structure, color, shape, or combinations of dots and circles.
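A toy model of such a dot-grid plate can make the layout concrete: uniformly spaced normal dots, with the mark bits flagged in a central region (as motivated in the next paragraph). The grid size, pitch and marker span below are illustrative values, not from the patent.

```python
import numpy as np

# Sketch of a dot-grid calibration plate: normal dots at fixed pitch,
# with mark feature points flagged in the central region of the plate.
# rows/cols/pitch/marker_span are illustrative, not specified values.

def make_board(rows=7, cols=9, pitch=20.0, marker_span=1):
    """Return (points, is_marker): Nx2 dot centers (in mm) and a boolean
    array flagging the central mark feature points."""
    ii, jj = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    points = np.stack([jj.ravel() * pitch, ii.ravel() * pitch], axis=1)
    ci, cj = rows // 2, cols // 2            # center of the plate
    is_marker = (np.abs(ii.ravel() - ci) <= marker_span) & \
                (np.abs(jj.ravel() - cj) <= marker_span)
    return points, is_marker
```

With the defaults this yields a 7x9 grid of 63 dots whose central 3x3 block plays the role of the mark bits; in a real plate those dots would additionally differ in size, shape or color as described above.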
In the joint calibration of a camera and an inertial measurement unit, the device needs to undergo large rotations and translations. When the included angle between the optical axis of the camera and the normal of the calibration plate is large, the mark feature points often appear at the edge of the image or even move out of it, so that the mark feature points on the calibration plate may fail to be detected or be falsely detected, causing the calibration to fail. Accordingly, based on the camera arrangement of the device to be calibrated, the present scheme on the one hand deploys a plurality of calibration plates at suitable angles, and on the other hand places the mark feature points in the middle area of the calibration plate, so that the image data collected by the camera contains the complete mark feature points and subsequent false or missed detection of feature points is avoided. Even when the camera captures only part of the calibration plate, the index of each detected dot can be computed from the position of the mark, a correct mapping to the calibration plate in the real three-dimensional world can be established, and the calibration parameters can be obtained through an optimization algorithm.
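The index recovery under partial visibility reduces to a grid offset: once one detected dot is identified as a known mark bit, every other detected dot's board index follows from its row/column offset relative to that mark. A minimal sketch, with hypothetical names:

```python
# Sketch of dot-index recovery when only part of the plate is visible.
# One detected dot has been identified as a mark bit whose position on
# the full plate is known; the rest follow by grid offset.

def recover_indices(marker_board_index, marker_grid_pos, detected_grid_pos):
    """marker_board_index: (row, col) of the mark dot on the full plate.
    marker_grid_pos: the mark dot's (row, col) within the detected partial grid.
    detected_grid_pos: (row, col) of every detected dot in that partial grid.
    Returns the full-plate (row, col) index of every detected dot."""
    dr = marker_board_index[0] - marker_grid_pos[0]
    dc = marker_board_index[1] - marker_grid_pos[1]
    return [(r + dr, c + dc) for r, c in detected_grid_pos]
```

For example, if the mark dot sits at (3, 4) on the plate but at (1, 1) in the partial detection, every detected dot's index is shifted by (+2, +3).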
The calibration data includes the image data acquired by the first device and the optimization data acquired by the second device. When the second device is an inertial measurement unit, the optimization data includes at least spatial data of the device to be calibrated moving within the target space range, the spatial data including attitude data and speed data of that motion; when the second device is an image acquisition device, the optimization data includes at least the image data of the second camera calibration plate acquired by that device. On this basis, after the calibration data acquired by the device to be calibrated is obtained, the image data acquired by the first device may first be detected to determine the position information of the mark feature points, wherein the position information indicates both the position coordinates of the mark feature points in three-dimensional space and their position coordinates in the image coordinate system.
In this embodiment, after the image data acquired by the first device is obtained, the image data may be input to a visual detection module for detection. Since the image data includes images of the first camera calibration plate, and the first camera calibration plate contains mark feature points whose features differ significantly from those of the normal feature points, the visual detection module can determine the position information of the mark feature points in the image data.
For example, each feature point on the camera calibration plate corresponds to an index number, which determines the position of that feature point on the plate. After the image data is input to the visual detection module, the module can determine, for each mark feature point, its position coordinates in the image coordinate system and its index number, the index number indicating the position of that mark feature point on the calibration plate. Once the position of each mark feature point on the plate is known, its position coordinates in three-dimensional space can be determined. The position coordinates of each mark feature point in the image coordinate system, together with its position coordinates in three-dimensional space, then constitute the position information used to locate that point.
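Since the plate geometry is known, turning an index number into three-dimensional coordinates is a direct lookup: on a planar plate with fixed dot pitch, index (row, col) maps to (col·pitch, row·pitch, 0) in the plate frame. A hedged sketch of building the 2D-3D pairs (names and pitch illustrative):

```python
# Sketch of forming the feature point pairs described above: each detected
# dot's plate index gives its 3D position on the planar plate, which is
# paired with its detected image coordinates. Pitch is illustrative.

def build_point_pairs(detections, pitch=20.0):
    """detections: list of ((row, col) plate index, (u, v) image coords).
    Returns ((X, Y, Z) plate-frame coords, (u, v)) pairs; the plate is
    planar, so Z = 0."""
    pairs = []
    for (row, col), (u, v) in detections:
        obj = (col * pitch, row * pitch, 0.0)  # 3D point on the plate
        pairs.append((obj, (u, v)))
    return pairs
```

The resulting list is exactly the "point pair set" fed to the parameter optimization module in the following paragraphs.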
After the position information of each mark feature point on the calibration plate is determined, synchronous parameter optimization can be performed by combining the position information with the optimization data to obtain the calibration parameters of the device to be calibrated, wherein the calibration parameters include the internal and external parameters of the first device (the image acquisition device); when the second device is an inertial measurement unit, the calibration parameters also include the internal and external parameters of the second device, and when the second device is an image acquisition device, the calibration parameters likewise include the internal and external parameters of the second device.
In this embodiment, for each mark feature point in the image data acquired by the first device, its position coordinates in the image coordinate system and its position coordinates in the three-dimensional space coordinate system may be taken as a feature point pair, and the feature point pairs corresponding to the plurality of mark feature points form a point pair set. When the second device is an inertial measurement unit, the optimization data acquired by it includes at least spatial data of the device to be calibrated moving within the target space range. On this basis, the point pair set formed by the plurality of feature point pairs and the spatial data acquired by the second device can be input to a parameter optimization module for parameter optimization calculation to obtain the calibration parameters of the device to be calibrated, including the internal and external parameters of the first device and the internal and external parameters of the second device.
Optionally, when the second device is an image acquisition device, the position coordinates of each mark feature point in the image data acquired by the second device can be determined in both the three-dimensional space coordinate system and the image coordinate system by the method described above, forming a plurality of feature point pairs that constitute a second point pair set. The point pair set corresponding to the feature points on the calibration plate in the image data acquired by the first device and the point pair set corresponding to the feature points on the calibration plate in the image data acquired by the second device are then input to the parameter optimization module for optimization calculation to obtain the calibration parameters of the device to be calibrated, including the internal and external parameters of the first device and the internal and external parameters of the second device.
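The patent does not specify the cost function inside the parameter optimization module, but a standard choice for such camera calibration problems is reprojection error: a pinhole model projects each 3D plate point with the current intrinsics and plate-to-camera pose, and the optimizer adjusts the parameters to shrink the residuals against the detected image coordinates. The sketch below shows only this residual; a real implementation would add lens distortion and, for camera-IMU calibration, cost terms on the spatial data. All names are illustrative.

```python
import numpy as np

# Sketch of a reprojection residual such as a parameter optimization
# module might minimize. Pinhole model only; distortion and IMU terms
# omitted. This is an assumed formulation, not the patent's own.

def reprojection_residuals(object_pts, image_pts, fx, fy, cx, cy, R, t):
    """object_pts: Nx3 plate-frame points; image_pts: Nx2 detections.
    (fx, fy, cx, cy): intrinsics; (R, t): plate-to-camera pose.
    Returns the length-2N residual vector (observed minus projected)."""
    cam = object_pts @ R.T + t               # transform into camera frame
    u = fx * cam[:, 0] / cam[:, 2] + cx      # pinhole projection
    v = fy * cam[:, 1] / cam[:, 2] + cy
    return (image_pts - np.stack([u, v], axis=1)).ravel()
```

Stacking these residuals over all frames (and, when the second device is an IMU, over the attitude and speed measurements) gives the joint least-squares problem whose solution is the set of internal and external parameters.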
Through the above steps, at least one calibration plate is deployed within the target space range, and calibration data within that range can be acquired by the device to be calibrated, the calibration data including the image data of the calibration plate and the optimization data. By detecting the image data in the calibration data, the position information of the mark feature points on the calibration plate can be determined. Since this position information represents the position coordinates of the mark feature points both in the three-dimensional space coordinate system and in the image coordinate system, the positions of the mark feature points can be determined accurately. Performing parameter optimization for the device to be calibrated based on the position information of the mark feature points and the optimization data acquired by the second device therefore yields accurate calibration parameters, achieving the technical effect of improving the accuracy of the calibration parameters and solving the technical problem of inaccurate device parameter calibration in the related art.
The above description is only of the preferred embodiments of the present utility model and is not intended to limit the present utility model, but various modifications and variations can be made to the present utility model by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present utility model should be included in the protection scope of the present utility model.

Claims (10)

1. A calibration system for an apparatus, the calibration system comprising: the device comprises a control unit, equipment to be calibrated and a calibration plate, wherein the equipment to be calibrated comprises an image acquisition device and an inertial measurement unit;
the image acquisition equipment is used for acquiring image data of the calibration plate in the target space range;
the inertial measurement unit is deployed in the equipment to be calibrated and is used for collecting the space data of the equipment to be calibrated moving in the target space range;
the control unit is connected with the image acquisition device and the inertial measurement unit and is used for converting the image data and the space data into calibration parameters of the device to be calibrated.
2. The calibration system of claim 1, wherein the calibration system further comprises:
and one end of the mechanical arm is connected with the control unit, and the other end of the mechanical arm is fixed with the equipment to be calibrated and used for controlling the equipment to be calibrated to move in the target space range.
3. The calibration system of claim 1, wherein the calibration plate is deployed within the target spatial range.
4. A calibration system according to claim 3, wherein the calibration plate comprises normal feature points and mark feature points, wherein the difference between the mark information of the normal feature points and the mark information of the mark feature points comprises at least one of: mark structure, mark color, mark gray value, mark shape.
5. A calibration system according to claim 3, wherein the calibration plate comprises an outward facing camera calibration plate and an eye movement camera calibration plate.
6. The calibration system of claim 5, wherein the image acquisition device comprises:
the outward camera is arranged at the outer side of the equipment to be calibrated and is used for collecting image data of the outward camera calibration plate;
and the eye movement camera is arranged at the inner side of the equipment to be calibrated and is used for collecting the image data of the eye movement camera calibration plate.
7. The calibration system of claim 6, wherein the target spatial range comprises a first target spatial range in which the outward facing camera and the inertial measurement unit are jointly calibrated, and a second target spatial range in which the eye movement camera and the inertial measurement unit are jointly calibrated, wherein the first target spatial range deploys at least one outward facing camera calibration plate and the second target spatial range deploys at least one outward facing camera calibration plate and one eye movement camera calibration plate.
8. The calibration system of claim 7, wherein the outward facing camera calibration plate and the eye movement camera calibration plate within the second target spatial range are disposed in parallel, with their calibration patterns disposed opposite each other.
9. The calibration system of claim 7, wherein an angle between the outward facing camera and the outward facing camera calibration plate is within a target threshold range within the first target spatial range; and in the second target space range, the outward camera and the outward camera calibration plate are arranged oppositely, and the eye movement camera and the eye movement camera calibration plate are arranged oppositely.
10. The calibration system according to claim 1, characterized in that the control unit is connected to the device to be calibrated via a data line or a wireless network.
CN202321091245.0U 2023-05-05 2023-05-05 Calibration system of equipment Active CN220170495U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202321091245.0U CN220170495U (en) 2023-05-05 2023-05-05 Calibration system of equipment

Publications (1)

Publication Number Publication Date
CN220170495U true CN220170495U (en) 2023-12-12

Family

ID=89061073

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202321091245.0U Active CN220170495U (en) 2023-05-05 2023-05-05 Calibration system of equipment

Country Status (1)

Country Link
CN (1) CN220170495U (en)

Similar Documents

Publication Publication Date Title
CN110728715B (en) Intelligent inspection robot camera angle self-adaptive adjustment method
CN112907676B (en) Calibration method, device and system of sensor, vehicle, equipment and storage medium
US7423666B2 (en) Image pickup system employing a three-dimensional reference object
CN110146038B (en) Distributed monocular camera laser measuring device and method for assembly corner of cylindrical part
CN110246185B (en) Image processing method, device, system, storage medium and calibration system
CN102778207B (en) A kind of measuring method, Apparatus and system of structural member ess-strain
CN110470226B (en) Bridge structure displacement measurement method based on unmanned aerial vehicle system
CN112655024B (en) Image calibration method and device
CN109448054A (en) The target Locate step by step method of view-based access control model fusion, application, apparatus and system
JP2013115540A (en) On-vehicle camera system, and calibration method and program for same
CN102798350A (en) Method, device and system for measuring deflection of arm support
CN110910460A (en) Method and device for acquiring position information and calibration equipment
CN109269525B (en) Optical measurement system and method for take-off or landing process of space probe
CN109920009B (en) Control point detection and management method and device based on two-dimensional code identification
CN107917700A (en) The 3 d pose angle measuring method of target by a small margin based on deep learning
KR101203816B1 (en) Robot fish localization system using artificial markers and method of the same
CN110991306B (en) Self-adaptive wide-field high-resolution intelligent sensing method and system
JP5019478B2 (en) Marker automatic registration method and system
CN205664784U (en) Need not three -dimensional scanning system at object surface paste mark point
CN220170495U (en) Calibration system of equipment
CN111866490A (en) Depth image imaging system and method
CN109682312B (en) Method and device for measuring length based on camera
CN111649716A (en) Space point-to-point distance measuring and calculating method based on panoramic image
CN112884832B (en) Intelligent trolley track prediction method based on multi-view vision
CN112750165B (en) Parameter calibration method, intelligent driving method, device, equipment and storage medium thereof

Legal Events

Date Code Title Description
GR01 Patent grant