CN116577072A - Calibration method, device, system and storage medium of equipment - Google Patents

Calibration method, device, system and storage medium of equipment

Info

Publication number
CN116577072A
Authority
CN
China
Prior art keywords
calibration
equipment
data
calibrated
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310508727.XA
Other languages
Chinese (zh)
Inventor
韦盛斌
欧阳高
郭亮亮
罗敏辉
王进
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Rainbow Software Co ltd
Original Assignee
Rainbow Software Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Rainbow Software Co ltd filed Critical Rainbow Software Co ltd
Priority to CN202310508727.XA priority Critical patent/CN116577072A/en
Publication of CN116577072A publication Critical patent/CN116577072A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01MTESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G01M11/02Testing optical properties
    • G01M11/0242Testing optical properties by measuring geometrical properties or aberrations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a calibration method, apparatus, and system for a device, and a storage medium. The calibration method comprises: determining a target spatial range from candidate spatial ranges according to the type of the device to be calibrated, wherein at least one calibration plate is deployed in the candidate spatial ranges, and the device to be calibrated comprises a first device and a second device; acquiring calibration data through the device to be calibrated within the target spatial range, wherein the calibration data comprise image data of the calibration plate collected by the first device and optimization data collected by the second device; detecting the image data in the calibration data to determine the position information of the marker feature points; and performing joint parameter optimization with the position information and the optimization data to obtain the calibration parameters of the device to be calibrated. The invention solves the technical problem of inaccurate device parameter calibration in the related art.

Description

Calibration method, device, system and storage medium of equipment
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a method, an apparatus, a system, and a storage medium for calibrating a device.
Background
With the rise of the "metaverse" concept, demand for Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR/XR) devices continues to grow. These devices are typically equipped with multiple cameras and inertial sensors (Inertial Measurement Units, IMUs) for scene perception, device positioning, virtual-real fusion, user interaction, and the like. To avoid distortion in camera imaging, the parameters of the cameras and the IMU on such a device are usually calibrated in advance.
In the related art, joint multi-camera and IMU calibration usually uses a flat plate carrying a pattern array with fixed spacing as the calibration plate. However, such a calibration plate is not robust to blur: when the device moves quickly, the images of the calibration plate acquired by the camera suffer from motion blur, and if the camera uses a fisheye lens, corner detection near the image edges easily fails. As a result, the calibration of the camera's intrinsic parameters, particularly the distortion parameters, is inaccurate, which degrades the accuracy of the camera's subsequent output.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
The embodiments of the invention provide a calibration method, apparatus, and system for a device, and a storage medium, which at least solve the technical problem of inaccurate device parameter calibration in the related art.
According to one aspect of the embodiments of the invention, a calibration method for a device is provided. The method may include: determining a target spatial range from candidate spatial ranges according to the type of the device to be calibrated, wherein at least one calibration plate is deployed in the candidate spatial ranges, and the device to be calibrated comprises a first device and a second device; acquiring calibration data through the device to be calibrated within the target spatial range, wherein the calibration data comprise image data of the calibration plate collected by the first device and optimization data collected by the second device; detecting the image data in the calibration data to determine the position information of the marker feature points; and jointly optimizing parameters with the position information and the optimization data to obtain the calibration parameters of the device to be calibrated.
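The four-step flow described above can be sketched as minimal Python; every function and parameter name here is an illustrative placeholder, since the patent does not prescribe any API:

```python
# Sketch of the claimed four-step calibration flow.
# All names are illustrative placeholders, not from the patent.

def choose_target_range(second_device_kind, candidate_ranges):
    # Step 1: an IMU second device selects the range holding only
    # first-camera plates; a camera second device selects the range
    # that also holds second-camera plates.
    key = "first" if second_device_kind == "imu" else "second"
    return candidate_ranges[key]

def calibrate(second_device_kind, candidate_ranges, collect, detect, optimize):
    target = choose_target_range(second_device_kind, candidate_ranges)  # Step 1
    image_data, optimization_data = collect(target)                     # Step 2
    positions = [detect(img) for img in image_data]                     # Step 3
    return optimize(positions, optimization_data)                       # Step 4
```

Here `collect`, `detect`, and `optimize` stand in for the data acquisition, marker detection, and joint optimization described in the following sections.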
Optionally, determining the target spatial range from the candidate spatial ranges according to the type of the device to be calibrated includes: the candidate spatial range comprises a first target spatial range and a second target spatial range, wherein at least one first camera calibration plate is arranged in the first target spatial range, and at least one first camera calibration plate and at least one second camera calibration plate are arranged in the second target spatial range; when the type of the second equipment is an inertial measurement unit, selecting the first target space range as a target space range; when the type of the second device is an image capturing device, the second target spatial range is selected as the target spatial range.
Optionally, the first target space range includes a plurality of preset road mark points, and the equipment to be calibrated moves according to the preset road mark points, and calibration data is collected in the moving process.
Optionally, in the second target space range, the first camera calibration plate and the second camera calibration plate are parallel, and the calibration plate patterns are disposed opposite to each other.
Optionally, each marker feature point on the calibration plate corresponds to an index number, and detecting the image data in the calibration data to determine the position information of the marker feature points includes: detecting the image data in the calibration data to obtain the index numbers of the marker feature points on the calibration plate; determining the position coordinates of the marker feature points in a three-dimensional spatial coordinate system based on the index numbers; determining the position coordinates of the marker feature points in an image coordinate system based on the positions of the marker feature points in the image data; and taking the position coordinates in the three-dimensional spatial coordinate system together with the position coordinates in the image coordinate system as the position information of the marker feature points.
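The index-number-to-3-D mapping can be sketched as follows, under the assumption (not fixed by the patent) that the feature points lie on a planar grid with fixed pitch and row-major index numbering:

```python
import numpy as np

# Sketch: map a feature point's index number to its 3-D position on a
# planar calibration plate with a fixed dot pitch. The row-major layout
# and the plate's own z = 0 plane are assumptions for illustration.
def index_to_world(index, cols, pitch_m):
    row, col = divmod(index, cols)
    # Origin at the first dot; the plate defines its own z = 0 plane.
    return np.array([col * pitch_m, row * pitch_m, 0.0])

# Example: dot #14 on a 10-column plate with 20 mm pitch.
p = index_to_world(14, cols=10, pitch_m=0.02)  # -> [0.08, 0.02, 0.0]
```

Together with the detected pixel position, this 3-D coordinate forms the 2D-3D correspondence consumed by the optimization step.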
Optionally, obtaining calibration data by the device to be calibrated includes: transmitting a data acquisition instruction to equipment to be calibrated, wherein the data acquisition instruction is used for instructing the equipment to be calibrated to start a data acquisition function so as to acquire image data and optimized data; and receiving image data and optimization data fed back by the equipment to be calibrated, wherein the image data and the optimization data form calibration data.
Optionally, the calibration plate includes a normal feature point and a mark feature point, wherein a difference between the mark information of the normal feature point and the mark information of the mark feature point includes at least one of: mark structure, mark color, mark gray value, mark shape.
Optionally, the marking information of the marking feature points in different marking plates is different; the arrangement angles among the plurality of calibration plates are determined according to the arrangement positions and the view field angles between the first equipment and the second equipment, so that the first equipment and the second equipment can acquire complete marking information of marking feature points.
Optionally, the image capture device is a rolling shutter camera, and the calibration parameter includes a time compensation parameter of the rolling shutter.
Optionally, the image capturing device is a fisheye camera, and the calibration parameter includes a vignetting correction parameter.
Optionally, the calibration method further comprises: determining the gray values of the marker feature points in the image data and the position coordinates of the marker feature points in the image coordinate system; and performing polynomial fitting on the gray values and the position coordinates to obtain the vignetting correction parameters.
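The polynomial fit might look like the following sketch, which models the gray value of the marker feature points as a polynomial in radial distance from the image center. The polynomial order and the radial parameterization are assumptions, not specified by the patent:

```python
import numpy as np

# Sketch of a vignetting-correction fit: model observed gray value as a
# polynomial in radial distance from the image center, then invert it as
# a per-pixel brightness gain. Order and parameterization are assumed.
def fit_vignetting(gray_values, coords, center, order=2):
    r = np.linalg.norm(np.asarray(coords, float) - np.asarray(center), axis=1)
    # Least-squares polynomial fit of gray value versus radius.
    return np.polyfit(r, gray_values, order)

def correction_gain(poly, coord, center):
    r = np.linalg.norm(np.asarray(coord, float) - np.asarray(center))
    # Brighten a pixel in proportion to the modeled falloff at its radius.
    return np.polyval(poly, 0.0) / np.polyval(poly, r)
```

Multiplying each pixel by `correction_gain` at its own radius would flatten the modeled falloff.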
According to another aspect of the embodiment of the invention, a calibration device of the equipment is also provided. The device comprises: the first determining unit is used for determining a target space range from candidate space ranges according to the type of equipment to be calibrated, wherein at least one calibration plate is arranged in the candidate space range, and the equipment to be calibrated comprises first equipment and second equipment; the acquisition unit is used for acquiring calibration data through equipment to be calibrated in a target space range, wherein the calibration data comprise image data acquired by first equipment on a calibration plate and optimization data acquired by second equipment; the second determining unit is used for detecting the image data in the calibration data and determining the position information of the marking characteristic points; and the optimizing unit is used for carrying out synchronous parameter optimization by combining the position information and the optimizing data to obtain the calibration parameters of the equipment to be calibrated.
According to another aspect of the embodiment of the application, a calibration system of the device is also provided. The calibration system may include: the device comprises a control unit, equipment to be calibrated and a calibration plate, wherein the equipment to be calibrated comprises an image acquisition device and an inertial measurement unit; the image acquisition equipment is used for acquiring image data of the calibration plate in the target space range; the inertial measurement unit is deployed in the equipment to be calibrated and is used for collecting the space data of the equipment to be calibrated moving in the target space range; the control unit is connected with the image acquisition device and the inertial measurement unit and is used for converting the image data and the space data into calibration parameters of the device to be calibrated.
According to another aspect of the embodiment of the present application, there is further provided a computer readable storage medium, where the computer readable storage medium includes a stored computer program, where the computer program when executed controls a device in which the computer readable storage medium is located to execute the calibration method of any one of the above devices.
In the present application, a target spatial range is determined from candidate spatial ranges according to the type of the device to be calibrated, wherein at least one calibration plate is deployed in the candidate spatial ranges and the device to be calibrated comprises a first device and a second device. Within the target spatial range, calibration data are acquired through the device to be calibrated, the calibration data comprising image data of the calibration plate collected by the first device and optimization data collected by the second device. The image data in the calibration data are detected to determine the position information of the marker feature points, and the parameters are jointly optimized with the position information and the optimization data to obtain the calibration parameters of the device to be calibrated. That is, with at least one calibration plate deployed in the target spatial range, the device to be calibrated can acquire calibration data within that range, and the image data can be detected to determine the position information of the marker feature points on the calibration plate. Because this position information represents the position coordinates of the marker feature points in the three-dimensional spatial coordinate system, optimizing the parameters of the device to be calibrated based on this position information together with the optimization data collected by the second device yields accurate calibration parameters. This achieves the technical effect of improving the accuracy of the calibration parameters and solves the technical problem of inaccurate device parameter calibration in the related art.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1 is a flow chart of a method of calibrating a device according to an embodiment of the application;
FIG. 2 is a schematic illustration of a calibration plate according to an embodiment of the application;
FIG. 3 is a schematic diagram of a flag bit according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a calibration system for an apparatus according to an embodiment of the application;
FIG. 5 is a schematic illustration of images taken by VR glasses at several landmark points in accordance with an embodiment of the present application;
FIG. 6 is a schematic diagram of two calibration plates adapted to the calibration requirements of VR glasses in accordance with an embodiment of the application;
FIG. 7 is a schematic diagram of an eye movement camera calibration environment in accordance with an embodiment of the application;
FIG. 8 is a schematic illustration of a calibration plate image captured by a camera within a second target spatial range in accordance with an embodiment of the present application;
FIG. 9 is a schematic diagram of a calibration device of an apparatus according to an embodiment of the application.
Detailed Description
In order that those skilled in the art will better understand the present application, the technical solutions in the embodiments of the present application are clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments of the present application without inventive effort shall fall within the scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
To facilitate an understanding of the invention by those skilled in the art, some terms or nouns involved in the various embodiments of the invention are explained below:
an image acquisition device combines a lens with an area-array sensor and may be, for example, a camera or a video camera;
an Inertial Measurement Unit (IMU) is a device mainly used for detecting and measuring the three-axis attitude angles (or angular rates) and accelerations of an object;
camera calibration: in image measurement and machine vision applications, a geometric model of camera imaging must be established in order to determine the correlation between the three-dimensional geometric position of a point on the surface of a spatial object and its corresponding point in an image, and the parameters of this geometric model are the camera parameters. In most cases these parameters (intrinsic parameters, extrinsic parameters, and distortion parameters) must be obtained through experiment and computation, and the process of solving for them is called camera calibration;
joint calibration obtains, through calibration, the intrinsic and extrinsic parameters of each camera, the intrinsic and extrinsic parameters of the IMU, the camera rolling-shutter compensation, and the vignetting correction parameters. The camera intrinsics, such as focal length, optical-center coordinates, and distortion, describe the imaging law; the IMU intrinsics, such as zero bias, scale-factor errors, and installation errors, describe the intrinsic properties of the IMU; and the camera/IMU extrinsics describe the rotation and translation of the camera/IMU in the world coordinate system, that is, the transformation from the camera/IMU coordinates to a unified world coordinate system.
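As an illustration of where these parameter groups enter the imaging model, the following minimal pinhole projection with radial distortion shows the roles of the intrinsics (fx, fy, cx, cy, k1, k2) and the extrinsics (R, t). This simplified model is for illustration only and is not the patent's exact formulation:

```python
import numpy as np

# Minimal pinhole projection with radial distortion. Illustrates the
# intrinsic (fx, fy, cx, cy, k1, k2) and extrinsic (R, t) parameters
# that a joint calibration estimates; a simplified model, not the
# patent's own formulation.
def project(point_world, R, t, fx, fy, cx, cy, k1=0.0, k2=0.0):
    Xc = R @ point_world + t               # world -> camera (extrinsics)
    x, y = Xc[0] / Xc[2], Xc[1] / Xc[2]    # perspective division
    r2 = x * x + y * y
    d = 1.0 + k1 * r2 + k2 * r2 * r2       # radial distortion factor
    u = fx * x * d + cx                    # intrinsics map to pixels
    v = fy * y * d + cy
    return np.array([u, v])
```

Calibration fits these parameters so that projected 3-D plate points match their detected pixel positions.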
The embodiments of the invention described below can be applied to various display devices capable of realizing augmented reality, virtual reality, or mixed reality functions, for example, head-mounted display devices such as HoloLens (AR glasses) and Meta Quest 2 (VR glasses), or image capture devices such as cameras.
Example 1
According to an embodiment of the present application, an embodiment of a device calibration method is provided. It should be noted that the steps shown in the flowchart of the figures may be performed in a computer system, such as one executing a set of computer-executable instructions, and that although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from that shown or described herein.
FIG. 1 is a flow chart of a method of calibrating a device according to an embodiment of the application, as shown in FIG. 1, the method comprising the steps of:
step S101, determining a target space range from candidate space ranges according to the type of equipment to be calibrated.
In the technical solution of step S101, the device to be calibrated includes a first device and a second device. The first device may be an image capture device, for example a camera or a video camera, and the second device may be an inertial measurement unit (IMU) or an image capture device. On this basis, a target spatial range may be determined from the candidate spatial ranges according to the type of the device to be calibrated, where at least one calibration plate is deployed in the candidate spatial ranges.
In this embodiment, the candidate spatial ranges include a first target spatial range, in which at least one first-camera calibration plate is deployed, and a second target spatial range, in which at least one first-camera calibration plate and at least one second-camera calibration plate are deployed. On this basis, the first target spatial range is selected as the target spatial range when the second device is an inertial measurement unit, and the second target spatial range is selected when the second device is an image capture device.
For example, when the device to be calibrated is one of AR/VR/MR/XR glasses, since the AR/VR/MR/XR glasses include a first camera, a second camera, and an inertial measurement unit, wherein the first camera may be an outward facing camera, and the second camera may be an eye movement camera, based on which, when the first camera and the inertial measurement unit are calibrated, a candidate spatial range where at least one first camera calibration plate is disposed may be taken as a target spatial range; when the first camera and the second camera are calibrated, a candidate spatial range in which at least one first camera calibration plate and at least one second camera calibration plate are disposed may be taken as a target spatial range.
Optionally, the device to be calibrated may also be a device with a camera and an IMU, such as a mobile phone, a robot, an unmanned aerial vehicle, etc.
Step S102, obtaining calibration data through equipment to be calibrated in the target space range.
In the technical solution of step S102, after the target spatial range is determined, calibration data may be acquired by the device to be calibrated within the target spatial range, where the calibration data include image data of the calibration plate collected by the first device and optimization data collected by the second device.
In this embodiment, the device to be calibrated includes a first device and a second device, where the first device may be a camera or a video camera and the second device may be an inertial measurement unit or an image capture device. When the second device is an inertial measurement unit, at least one first-camera calibration plate is deployed in the target spatial range. To acquire the calibration data, the device to be calibrated may be fixed on a fixture of a mechanical arm, and the mechanical arm is controlled to perform six-degree-of-freedom movement within the target spatial range, driving the device to be calibrated to move and collect calibration data. The calibration data include image data collected by the first device and optimization data collected by the second device; the optimization data include at least spatial data of the device's movement within the target spatial range, such as its attitude and velocity. For example, the spatial data collected by the inertial measurement unit include at least the three-axis acceleration and the three-axis angular velocity.
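The IMU intrinsics named earlier (zero bias, scale-factor error, installation error) can be illustrated with a minimal correction model applied to a raw three-axis reading; the exact model used by the patent is not specified, so this is only a sketch:

```python
import numpy as np

# Sketch of a simple IMU intrinsic model: a raw 3-axis reading is
# corrected by a combined scale-factor/misalignment matrix and a zero
# bias, matching the "zero bias / scale-factor error / installation
# error" parameters described above. The exact model is an assumption.
def correct_imu(raw, scale_misalignment, bias):
    return scale_misalignment @ (np.asarray(raw, float) - bias)
```

With an identity matrix and zero bias, the reading passes through unchanged; joint calibration estimates both terms so that the corrected accelerations and angular rates are consistent with the camera observations.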
In addition, the device to be calibrated is usually provided with an eye-tracking camera for capturing eye movement and gaze direction. The eye-tracking camera is typically mounted on the inner side of the glasses to photograph the user's eyes, while the outward-facing camera is mounted on the outer side of the glasses to photograph the application scene. Calibrating the extrinsic parameters between the eye-tracking camera and the outward-facing cameras is a necessary condition for converting the gaze from the eye-tracking camera's coordinate system into the glasses' coordinate system, and a precondition for subsequent interaction. However, since the eye-tracking camera and the outward-facing camera are designed back to back, their viewing directions are opposite and share no overlapping area, so the extrinsic calibration between cameras with opposite viewing directions cannot be completed during the joint calibration of the first camera and the inertial measurement unit. Therefore, the application also deploys a multi-camera calibration scene within the target spatial range to achieve one-stop calibration.
Optionally, when the second device is an image capture device, the first device and the second device in the device to be calibrated are designed back to back. Accordingly, at least one first-camera calibration plate and at least one second-camera calibration plate may be deployed in the target spatial range, each carrying marker feature points, with the marked faces of the two plates facing each other. When the device to be calibrated moves between the first-camera calibration plate and the second-camera calibration plate, the first device captures image data of the first-camera calibration plate and the second device captures image data of the second-camera calibration plate.
For example, the device to be calibrated may be fixed on a fixture of the mechanical arm, and the mechanical arm is controlled to move the device to be calibrated between the first camera calibration plate and the second camera calibration plate, so that the first device collects image data of the first camera calibration plate, and the second device collects image data of the second camera calibration plate, that is, the calibration data includes image data of the first camera calibration plate collected by the first device and image data of the second camera calibration plate collected by the second device.
Optionally, when the calibration data are acquired, the control unit may first send a data acquisition instruction to the device to be calibrated, where the instruction controls the first device and the second device in the device to be calibrated to start their data acquisition functions. The control unit may then control the mechanical arm to drive the device to be calibrated to move within the target spatial range so as to collect calibration data there. The collected calibration data may afterwards be transmitted to the control unit, or the device to be calibrated may transmit them to the control unit in real time; the control unit receives the calibration data and determines the calibration parameters of the device to be calibrated based on them.
In this embodiment, the calibration plate may be a first camera calibration plate or a second camera calibration plate, and the calibration plate includes a plurality of feature points, and the plurality of feature points are divided into a normal feature point and a mark feature point, where a difference between mark information of the normal feature point and mark information of the mark feature point includes at least one of: mark structure, mark color, mark gray value, mark shape.
For example, the common feature points on the calibration plate may consist of patterns of identical size and fixed pitch, while the marker feature points are distinguished from them by their marking information; the marker feature points constitute the marker bits on the calibration plate. For instance, the common feature points may be black dots of identical size and fixed pitch, and the marker feature points in the marker bits may be dots of a different size, dots with rings, dots with squares, or dots with triangles; the color of the marker feature points may also differ from that of the common feature points. FIG. 2 is a schematic diagram of a calibration plate according to an embodiment of the invention; as shown in FIG. 2, the marker feature points in the marker bits clearly differ from the common feature points on the same calibration plate (only one marker feature point is shown in FIG. 2). In addition, the marker feature points can be further distinguished by other shapes or colors; FIG. 3 is a schematic diagram of a marker bit according to an embodiment of the invention, and as shown in FIG. 3, the marker feature points in a marker bit can be distinguished from the common feature points by structure, color, shape, or dots/circles.
In the joint calibration of the camera and the inertial measurement unit, the device must rotate and translate over large ranges. When the angle between the camera's optical axis and the normal of the calibration plate is large, the marker feature points often appear at the image edge or even move out of the image, so detection of the marker feature points on the calibration plate fails or produces false detections, causing the calibration to fail. On one hand, the application arranges a plurality of calibration plates at appropriate angles according to the camera layout of the device to be calibrated; on the other hand, the marker feature points are placed in the middle area of each calibration plate, so that the image data collected by the camera contain complete marker feature points and subsequent false or missed detections are avoided. Even when the camera captures only part of the calibration plate, the serial number of each detected dot can be computed from the marker bit, a correct mapping to the calibration plate in the real three-dimensional world can be established, and the calibration parameters are obtained through an optimization algorithm.
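The recovery of a dot's serial number from a marker bit can be sketched as follows, assuming a row-major index layout and a known grid offset between a detected dot and the marker bit (both assumptions for illustration; the patent does not fix the indexing scheme):

```python
# Sketch: recover a dot's global index from its grid offset relative to
# a detected marker bit whose plate index is known. Assumes dots are
# detected on a regular row-major grid; all numbers are illustrative.
def dot_index(marker_index, cols, d_row, d_col):
    m_row, m_col = divmod(marker_index, cols)
    return (m_row + d_row) * cols + (m_col + d_col)

# Example: marker bit at index 55 on a 10-column plate; a dot one row
# below and two columns to the left of it has index 63.
idx = dot_index(55, cols=10, d_row=1, d_col=-2)
```

Once each visible dot has its global index, its 3-D position on the plate follows from the index-to-coordinate mapping, even for partial views of the plate.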
It should be noted that, provided the image acquisition device can capture the marker feature points on the calibration plate, the number of marker bits on one calibration plate may be determined according to the camera layout and field of view of the device actually to be calibrated. For example, one calibration plate may carry one or more marker bits, but the design of the marker bits must guarantee that the orientation of the calibration plate is unique, that is, the placement direction of the calibration plate can be distinguished from the marker bits on it. The application does not limit the number of marker bits on the calibration plate.
Step S103, detecting the image data in the calibration data, and determining the position information of the marker feature points.
In the technical solution of step S103 above, as described earlier, the calibration data include the image data collected by the first device and the optimization data collected by the second device. When the second device is an inertial measurement unit, the optimization data include at least spatial data of the device to be calibrated moving within the target spatial range, the spatial data comprising attitude data and speed data of that movement; when the second device is an image acquisition device, the optimization data include at least image data of the second camera calibration plate collected by the second device. On this basis, after the calibration data collected by the device to be calibrated are obtained, the image data collected by the first device may first be detected to determine the position information of the marker feature points, where the position information indicates both the position coordinates of the marker feature points in three-dimensional space and their position coordinates in the image coordinate system.
In this embodiment, after the image data collected by the first device are obtained, the image data may be input into the visual detection module for detection. Since the image data contain images of the first camera calibration plate, and the marker feature points on that plate have features clearly distinct from the common feature points, the visual detection module can determine the position information of the marker feature points in the image data.
For example, each feature point on the camera calibration plate corresponds to an index number that determines the point's position on the plate. After the image data are input into the visual detection module, the module determines, for each marker feature point, its position coordinates in the image coordinate system and its index number, the index number indicating the point's position on the calibration plate. Once the position of a marker feature point on the plate is known, its position coordinates in the three-dimensional space coordinate system can be determined from the index number, and the point's coordinates in the image coordinate system together with its coordinates in three-dimensional space constitute its position information.
And step S104, carrying out synchronous parameter optimization by combining the position information and the optimization data to obtain the calibration parameters of the equipment to be calibrated.
In the technical solution of step S104 above, after the position information of each marker feature point on the calibration plate has been determined, joint parameter optimization may be performed by combining the position information with the optimization data to obtain the calibration parameters of the device to be calibrated. The calibration parameters include the internal and external parameters of the first device (the image acquisition device); when the second device is an inertial measurement unit, the calibration parameters also include the internal and external parameters of the second device, and likewise when the second device is an image acquisition device.
In this embodiment, for each marker feature point in the image data collected by the first device, its position coordinates in the image coordinate system and its position coordinates in the three-dimensional space coordinate system may be taken as one feature point pair, and the feature point pairs of multiple marker feature points form a point-pair set. When the second device is an inertial measurement unit, its optimization data include at least spatial data of the device to be calibrated moving within the target spatial range; the point-pair set and these spatial data may then be input into the parameter optimization module for parameter optimization calculation, yielding the calibration parameters of the device to be calibrated, namely the internal and external parameters of the first device and of the second device.
For example, when the parameter optimization module performs the optimization based on the feature point pairs and the spatial data collected by the second device, the movement trajectory of the device to be calibrated within the first target spatial range can be determined from the feature point pairs and synthesized into a continuous trajectory. Time-synchronization calculation between this trajectory and the trajectory determined from the spatial data collected by the IMU allows a nonlinear optimization problem over the first device (the image acquisition device) and the IMU to be constructed; solving it yields the external parameters between the first device and the IMU as well as the internal parameters of the IMU.
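As a hedged sketch of the time-synchronization step mentioned above: one common approach is to cross-correlate a motion signal derived from the camera trajectory with the corresponding IMU signal and take the lag of the correlation peak as the time offset. The function below illustrates this idea under stated assumptions and is not the patent's actual algorithm; both signals are assumed resampled to a common period `dt`.

```python
import numpy as np

def estimate_time_offset(cam_rate, imu_rate, dt):
    """Estimate the camera-IMU time offset by cross-correlating a
    camera-derived motion signal (e.g. angular-speed magnitude) with
    the corresponding IMU gyroscope signal.

    Illustrative sketch: assumes both signals sampled at period `dt`.
    Positive result means the IMU signal lags the camera signal.
    """
    cam = cam_rate - cam_rate.mean()
    imu = imu_rate - imu_rate.mean()
    corr = np.correlate(imu, cam, mode="full")
    lag = np.argmax(corr) - (len(cam) - 1)   # samples of delay
    return lag * dt

# Synthetic check: an IMU signal delayed by 5 samples (dt = 0.01 s).
t = np.arange(0, 2, 0.01)
cam = np.sin(2 * np.pi * t)
imu = np.roll(cam, 5)
offset = estimate_time_offset(cam, imu, 0.01)
```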
Optionally, when the second device is an image acquisition device, the method described in step S103 may likewise be used to determine, for each marker feature point in the image data collected by the second device, its position coordinates in the three-dimensional space coordinate system and in the image coordinate system, forming feature point pairs and, from them, a point-pair set. The point-pair set corresponding to the feature points on the calibration plate in the image data collected by the first device and the point-pair set corresponding to the feature points on the calibration plate in the image data collected by the second device are then input into the parameter optimization module for optimization calculation, yielding the calibration parameters of the device to be calibrated, namely the internal and external parameters of the first device and of the second device.
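The optimizer described in step S104 essentially minimizes reprojection error over such point-pair sets. The minimal numpy sketch below shows the residual being minimized; the pinhole model, the symbol names, and the omission of lens distortion are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def reprojection_residuals(K, R, t, board_pts, img_pts):
    """Residuals minimized during calibration: project each 3D board
    point through a pinhole model (intrinsics K, extrinsics R, t) and
    subtract the detected image coordinates.

    Illustrative sketch; real calibrators also model lens distortion.
    """
    cam = R @ board_pts.T + t.reshape(3, 1)   # world -> camera frame
    uv = (K @ cam)[:2] / cam[2]               # perspective divide
    return (uv.T - img_pts).ravel()

# With identity pose and f = 500, center (320, 240), a board point at
# (0.1, 0.2, 1.0) projects exactly onto the detected pixel (370, 340),
# so the residual vanishes.
K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
R, t = np.eye(3), np.zeros(3)
board = np.array([[0.1, 0.2, 1.0]])
img = np.array([[370.0, 340.0]])
res = reprojection_residuals(K, R, t, board, img)
```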
Through the above steps, at least one calibration plate is deployed in the target spatial range, and the calibration data within that range, comprising image data and optimization data, can be obtained by the device to be calibrated. Detecting the image data determines the position information of the marker feature points on the calibration plate appearing in the image data. Because this position information gives the position coordinates of the marker feature points both in the three-dimensional space coordinate system and in the image coordinate system, the positions of the marker feature points can be determined accurately; performing parameter optimization based on this position information together with the optimization data collected by the second device therefore yields accurate calibration parameters for the device to be calibrated. This achieves the technical effect of improving the accuracy of the calibration parameters and solves the technical problem of inaccurate parameter calibration of devices in the related art.
The above method of this embodiment is further exemplified below.
As an optional embodiment, step S101, determining the target space range from the candidate space ranges according to the type of the device to be calibrated, includes: the candidate spatial range comprises a first target spatial range and a second target spatial range, wherein at least one first camera calibration plate is arranged in the first target spatial range, and at least one first camera calibration plate and at least one second camera calibration plate are arranged in the second target spatial range; when the type of the second equipment is an inertial measurement unit, selecting the first target space range as a target space range; when the type of the second device is an image capturing device, the second target spatial range is selected as the target spatial range.
In this embodiment, the device to be calibrated includes the first device, which is an image acquisition device, and the second device, which may be an inertial measurement unit or an image acquisition device. Accordingly, when determining the target spatial range from the candidate spatial ranges by the type of the device to be calibrated: if the second device is an inertial measurement unit, the first target spatial range, in which at least one first camera calibration plate is deployed, may be taken as the target spatial range; if the second device is an image acquisition device, the second target spatial range, in which at least one first camera calibration plate and at least one second camera calibration plate are deployed, may be taken as the target spatial range.
For example, when the device to be calibrated is a pair of AR/VR/MR/XR glasses, the glasses include a first camera, a second camera and an inertial measurement unit, where the first camera may be an outward-facing camera and the second camera an eye movement camera. When calibrating the first camera and the inertial measurement unit, the multiple first cameras usually do not share enough common viewing area, that is, the cameras' orientations differ considerably, so the calibration plates must cover a sufficiently large area. Enlarging a single calibration plate can achieve this, but when the angle between a camera's optical axis and the plate's normal is large, detection of the marker feature points on the plate easily fails or produces false detections, leading to calibration failure. In this case, multiple calibration plates arranged at suitable angles can be used instead to meet the calibration requirements of the device to be calibrated. Therefore, when calibrating the first camera and the inertial measurement unit, the first target spatial range, in which at least one first camera calibration plate is deployed, may be determined as the target spatial range.
For another example, when calibrating the first camera and the second camera, the two cameras are designed back to back in the device to be calibrated, that is, their viewing directions are opposite and share no overlapping area. Therefore, the candidate spatial range in which at least one first camera calibration plate and at least one second camera calibration plate are deployed can be taken as the target spatial range, with the faces bearing the marker feature points on the first and second camera calibration plates arranged opposite each other. In this way, when the device to be calibrated moves between the first camera calibration plate and the second camera calibration plate, the first camera can collect image data of the first camera calibration plate while the second camera collects image data of the second camera calibration plate.
As an optional implementation manner, the first target spatial range includes a plurality of preset landmark points; the device to be calibrated moves along the preset landmark points and collects the calibration data during the movement.
In this embodiment, the first target spatial range includes a plurality of preset landmark points, and the moving route of the device to be calibrated within the first target spatial range is formed by connecting these landmark points; as the device to be calibrated moves along this route, it collects calibration data. In addition, the device to be calibrated is moved along the landmark points in a given order.
For example, the mechanical arm can be controlled to drive the device to be calibrated from its initial position through the plurality of preset landmark points. While the device moves along the route formed by the landmark points in the set order, the control unit can send a data acquisition instruction to the device to be calibrated, which collects calibration data upon receiving the instruction.
As an alternative embodiment, in the second target space range, the first camera calibration plate and the second camera calibration plate are parallel, and the calibration plate patterns are disposed opposite to each other.
In this embodiment, at least one first camera calibration plate and at least one second camera calibration plate are deployed in the second target spatial range, each bearing marker feature points that form the calibration plate pattern. Since the first camera and the second camera are designed back to back in the device to be calibrated, their viewing directions are opposite. To ensure that both cameras can collect the marker feature points on the plates when the device to be calibrated moves into the second target spatial range, the first camera calibration plate and the second camera calibration plate are arranged parallel to each other with their calibration plate patterns facing each other.
As an optional implementation manner, the marking feature points on the calibration board are corresponding to index numbers, and step S103, detecting the image data in the calibration data, and determining the position information of the marking feature points in the image data, includes: detecting image data in the calibration data to obtain index numbers of the marked feature points on the calibration plate; determining position coordinates of the marking feature points under a three-dimensional space coordinate system based on the index numbers; determining position coordinates of the mark feature points under an image coordinate system based on the positions of the mark feature points in the image data; position coordinates of the marker feature points in the three-dimensional space coordinate system and position coordinates of the marker feature points in the image coordinate system are determined as position information of the marker feature points.
In this embodiment, the marker feature points on the calibration plate correspond to index numbers. When the image data in the calibration data are detected, they can be input into a visual detection model trained in advance on image samples, which detects the index number of each marker feature point on the calibration plate appearing in the image data. Specifically, the visual detection model can infer, from the local combined structure, which of the several marker bits has been detected, and from that marker bit accurately infer the indexes and coordinates of the other dots in the image captured by the camera. After the index numbers of the marker feature points in the image data are detected, their position coordinates in the three-dimensional space coordinate system can be determined from the index numbers corresponding to the points on the calibration plate.
In this embodiment, since the image data is acquired by collecting the calibration plate disposed in the target space, the position of the marker feature point in the image can be clearly determined based on the image data, and then an image coordinate system can be established, and the position coordinates of the marker feature point in the image coordinate system can be determined.
After determining the position coordinates of the marker feature points in the three-dimensional space coordinate system and the position coordinates of the marker feature points in the image coordinate system, the position coordinates of the marker feature points in the three-dimensional space coordinate system and the position coordinates of the marker feature points in the image coordinate system may be determined as the position information of the marker feature points. According to the method, the position information of each marking characteristic point on the calibration plate can be determined, after the position information of each marking characteristic point on the calibration plate is determined, the position coordinate of each marking characteristic point under the three-dimensional space coordinate system and the position coordinate of the marking characteristic point under the image coordinate system can be regarded as one coordinate point pair, a plurality of coordinate point pairs are further obtained, and the plurality of coordinate point pairs are regarded as a point pair set.
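The assembly of the coordinate point pairs described above can be sketched as follows; the row-major grid numbering, the parameter names, and the millimeter pitch are illustrative assumptions rather than the patent's actual layout.

```python
import numpy as np

def build_point_pairs(indices, img_coords, cols, pitch_mm):
    """Pair each detected feature point's 3D position on the
    calibration plate (derived from its index number, plate plane
    taken as Z = 0) with its 2D image coordinates, yielding the
    point-pair set fed to the parameter optimizer.

    Illustrative sketch: assumes row-major numbering on a regular grid
    with spacing `pitch_mm` and `cols` columns.
    """
    pairs = []
    for idx, uv in zip(indices, img_coords):
        row, col = divmod(idx, cols)
        xyz = (col * pitch_mm, row * pitch_mm, 0.0)
        pairs.append((xyz, tuple(uv)))
    return pairs

# Two detected points on a 5-column plate with 20 mm pitch: index 6
# sits at row 1, col 1, i.e. board position (20, 20, 0).
pairs = build_point_pairs([0, 6], [(10.0, 12.0), (50.0, 52.0)],
                          cols=5, pitch_mm=20.0)
```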
As an optional implementation manner, step S101, obtaining calibration data by the device to be calibrated, includes: transmitting a data acquisition instruction to equipment to be calibrated, wherein the data acquisition instruction is used for instructing the equipment to be calibrated to start a data acquisition function so as to acquire image data and optimized data; and receiving image data and optimization data fed back by the equipment to be calibrated, wherein the image data and the optimization data form calibration data.
In this embodiment, the control unit may establish a communication connection with the device to be calibrated through a data line and/or a wireless network, based on which, when obtaining calibration data, the control unit may send a data acquisition instruction to the device to be calibrated, and receive image data and optimization data returned by the device to be calibrated, where the image data and the optimization data may constitute the calibration data.
For example, when the device to be calibrated moves within the first target spatial range, the control unit may send a data acquisition instruction through the data line and/or the wireless network. Upon receiving it, the device to be calibrated starts its data acquisition functions: the first device collects image data, and the second device (the inertial measurement unit) collects spatial data of the device's movement within the target spatial range, the spatial data including attitude data and speed data of that movement. The device to be calibrated then transmits the collected image data and spatial data to the control unit through the data line and/or the wireless network, and the control unit receives them and determines them as the calibration data.
For another example, when the device to be calibrated is within the second target space range, the control unit may send a data acquisition instruction to the device to be calibrated, after receiving the data acquisition instruction, the device to be calibrated may start an image acquisition function of the first device to acquire image data of the first camera calibration board within the second target space range, and start an image acquisition function of the second device (image acquisition device) to acquire image data of the second camera calibration board within the second target space range, and then send the image data acquired by the first device and the second device to the control unit, and the control unit may receive the image data returned by the device to be calibrated and use the received image data as calibration data.
As an alternative embodiment, the calibration plate comprises a common feature point and a marking feature point, wherein the difference between the marking information of the common feature point and the marking information of the marking feature point comprises at least one of the following: mark structure, mark color, mark gray value, mark shape.
In this embodiment, the calibration plate includes common feature points and marker feature points, whose marking information differs. For example, the common feature points may consist of patterns of identical size arranged at a fixed pitch, while the marker feature points differ from them in structure, color, gray value, shape or other marking information; the marker feature points on the calibration plate constitute marker bits. For example, the common feature points may be black dots of identical size at a fixed pitch, and the marker feature points in the marker bits may be dots of a different size, dots combined with rings, dots combined with squares, or dots combined with triangles; the color of the marker feature points may also differ from that of the common feature points.
As an alternative embodiment, the marking information of the marking feature points in different marking plates is different; the arrangement angles among the plurality of calibration plates are determined according to the arrangement positions and the view field angles between the first equipment and the second equipment, so that the first equipment and the second equipment can acquire complete marking information of marking feature points.
In this embodiment, the marking information of the marking feature points in the different calibration plates may be different, for example, the marking information of the calibration plate a and the calibration plate B may be different, wherein the difference in the marking information is at least reflected in: mark structure, mark color, mark gray value, or mark shape. The distribution angle among the plurality of calibration plates can be determined according to the arrangement positions and the view angles of the first equipment and the second equipment, so that the first equipment and the second equipment can acquire complete marking information of marking characteristic points.
For example, when the second device is an image capturing device, since the first device and the second device in the device to be calibrated are disposed back to back, a surface of at least one first camera calibration board disposed in the second target space range, which contains the marking feature points, and a surface of at least one second camera calibration board, which contains the marking feature points, are disposed opposite to each other, so as to ensure that when the device to be calibrated moves between the first camera calibration board and the second camera calibration board, the first device and the second device can both capture complete marking information of the marking feature points on the calibration boards.
As an alternative embodiment, the image acquisition device is a rolling shutter camera, and the calibration parameter includes a time compensation parameter of the rolling shutter.
In this embodiment, when the image capturing device on the device to be calibrated is a rolling shutter camera, the rolling shutter camera may also capture image data of the calibration board when the device to be calibrated moves within the target space range, and after capturing the image data, the image data may be processed based on the visual detection model and the parameter optimization model, and a time compensation parameter of the rolling shutter may be obtained.
As an alternative embodiment, the image capturing device is a fisheye camera, and the calibration parameter includes a vignetting correction parameter.
In this embodiment, when the image acquisition device on the device to be calibrated is a fisheye camera, the images it captures of the calibration plate typically exhibit vignetting (dark corners) at the image edges. As the device to be calibrated moves within the target spatial range, marker bits can appear at various positions in the image data of the calibration plate captured by the fisheye camera; processing these image data with the visual detection model and the parameter optimization model then yields the vignetting correction parameters, which compensate for the darkening at the image edges.
As an alternative embodiment, the gray values of the marker feature points in the image data and their position coordinates in the image coordinate system are determined, and polynomial fitting of the gray values against the position coordinates yields the vignetting correction parameters.
In this embodiment, when determining the vignetting correction parameters, the gray values of the white areas inside the marker bits in each frame of the image data collected by the fisheye camera, together with the position coordinates of the marker feature points in those marker bits in the image coordinate system, may be determined first; polynomial fitting of the determined gray values against the position coordinates then yields the vignetting correction parameters.
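This fitting step might look as follows. Modeling the gray value as a polynomial in the radial distance from the image center is one plausible reading of "polynomial fitting on the gray value and the position coordinates"; all names and values here are illustrative assumptions.

```python
import numpy as np

def fit_vignetting(gray, xy, center, degree=2):
    """Fit gray value as a polynomial in radial distance from the
    image center, giving a vignetting (dark-corner) correction curve.

    `gray`: measured gray values of the white areas inside the marker
    bits; `xy`: their image coordinates. The radial polynomial model
    and the parameter names are illustrative assumptions.
    """
    r = np.linalg.norm(xy - center, axis=1)
    return np.polyfit(r, gray, degree)

# Synthetic measurements following gray = 255 - 0.0005 * r**2, so the
# fitted quadratic should recover those coefficients.
center = np.array([320.0, 240.0])
xy = np.array([[320.0, 240.0], [420.0, 240.0], [320.0, 440.0], [620.0, 440.0]])
gray = 255 - 0.0005 * np.linalg.norm(xy - center, axis=1) ** 2
coef = fit_vignetting(gray, xy, center)
```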
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system such as a set of computer executable instructions, and that although a logical order is illustrated in the flowcharts, in some cases the steps illustrated or described may be performed in an order other than that illustrated herein.
According to the embodiment of the invention, the calibration system of the device includes a control unit, a device to be calibrated and a calibration plate, where the device to be calibrated includes an image acquisition device and an inertial measurement unit. The image acquisition device is used to collect image data of a calibration plate within a target spatial range, and the inertial measurement unit is deployed in the device to be calibrated to collect spatial data of the device's movement within the target spatial range, the spatial data including attitude data and speed data of that movement; the control unit is connected with the image acquisition device and the inertial measurement unit, and converts the image data and the spatial data into calibration parameters of the device to be calibrated.
Fig. 4 is a schematic diagram of a calibration system of a device according to an embodiment of the present invention. As shown in Fig. 4, the calibration system 400 includes a control unit 401, a device to be calibrated 402, and calibration plates 403. The device to be calibrated 402 includes an image acquisition device 4021 and an inertial measurement unit 4022, and the image acquisition device 4021 includes an outward camera 40211 and an eye movement camera 40212. The calibration plates 403 include calibration plates 4031 and 4032 deployed in the first target spatial range, and an outward camera calibration plate 4033 and an eye movement camera calibration plate 4034 deployed in the second target spatial range. The calibration plates 4031 and 4032 in the first target spatial range are arranged at a certain angle to each other, and in the second target spatial range the outward camera calibration plate 4033 is arranged opposite the face of the eye movement camera calibration plate 4034 that bears the marker feature points.
The calibration system 400 of the device further comprises a mechanical arm 404, wherein one end of the mechanical arm 404 is connected with the control unit 401, the other end of the mechanical arm 404 is connected with the device to be calibrated 402, and the mechanical arm 404 is used for driving the device to be calibrated 402 to move within a first target space range or moving the device to be calibrated 402 into a second target space range under the control of the control unit 401.
When the mechanical arm 404 is at the joint calibration position of the outward camera and the IMU, the device to be calibrated 402 is within the first target spatial range; the mechanical arm 404 drives the device to move within that range, during which the outward camera 40211 collects image data of the calibration plates 4031 and 4032 and the IMU 4022 collects spatial data of the device's movement, including attitude data and speed data of the movement within the first target spatial range. When the mechanical arm 404 is at the joint calibration position of the eye movement camera and the IMU, the device to be calibrated 402 is within the second target spatial range; the mechanical arm 404 moves the device into that range, where the outward camera 40211 collects image data of the outward camera calibration plate 4033 and the eye movement camera 40212 collects image data of the eye movement camera calibration plate 4034.
After the image data and the spatial data in the first target space range are acquired, they may be used as calibration data, and the calibration data are then processed to determine the internal and external parameters of the outward-facing camera 40211 and the internal and external parameters of the IMU; the processing of the calibration data may refer to the methods described in step S103 and step S104 above, and is not repeated here. Similarly, after the image data in the second target space range are collected, they may be used as calibration data and processed to determine the internal and external parameters of the eye movement camera; the processing may likewise refer to the methods described in step S103 and step S104.
Example two
The technical solution of the embodiment of the present application is illustrated below with reference to a preferred embodiment.
With the rise of the "metaverse" concept, demand for augmented reality devices, virtual reality devices, and mixed reality devices continues to rise. These devices are typically equipped with multiple cameras and IMUs for scene perception, device positioning, virtual-real integration, user interaction, and the like. To avoid distortion in the imaging of a virtual reality device, it is usually necessary to pre-calibrate the internal and external parameters of each camera in the device, as well as the internal and external parameters of the IMU. The internal parameters of a camera include the focal length, optical center coordinates, and distortion coefficients, and describe the imaging rule of the camera; the internal parameters of the IMU include zero offset, scale factor error, and installation error, and describe the intrinsic properties of the IMU; the external parameters of the camera/IMU describe its rotation and translation in world coordinates, that is, the conversion relationship between the camera/IMU coordinate system and the world coordinate system.
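The roles of the internal and external parameters can be made concrete with a minimal projection sketch (pure Python/NumPy, generic parameter names, distortion omitted for brevity — an illustration of the standard pinhole model, not the patent's algorithm):

```python
import numpy as np

def project(point_w, R, t, fx, fy, cx, cy):
    """Map a 3D world point to pixel coordinates.

    Extrinsics (R, t) rotate/translate the world point into the camera
    frame; intrinsics (fx, fy, cx, cy) map the normalized image plane
    to pixels. Lens distortion is omitted here for brevity.
    """
    pc = R @ np.asarray(point_w, dtype=float) + t  # world -> camera frame
    u = fx * pc[0] / pc[2] + cx                    # perspective divide + intrinsics
    v = fy * pc[1] / pc[2] + cy
    return u, v

# A point on the optical axis lands on the optical center (cx, cy):
u, v = project([0.0, 0.0, 5.0], np.eye(3), np.zeros(3), 500, 500, 320, 240)
```

Calibration is the inverse problem: given many observed (u, v) for known 3D points, recover (fx, fy, cx, cy), the distortion coefficients, and (R, t).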
At present, in the joint calibration of multiple cameras and an IMU, a flat board carrying a pattern array at fixed intervals is generally adopted as the calibration board, and the handheld device to be calibrated is moved (translated and rotated) in front of the board. However, because the anti-blur capability of such a calibration board is weak, motion blur appears in the board images acquired by the camera when the device moves quickly; if the camera uses a fisheye lens, corner detection at the image edges easily fails. This makes the calibration of the camera's internal parameters, especially the distortion parameters, inaccurate, and thus affects the accuracy of the camera's working results.
To address this, the embodiment of the present application provides a calibration method for a device: at least one calibration board is deployed in a target space range, and calibration data in the target space range are obtained through the device to be calibrated, the calibration data including image data collected of the calibration board and optimization data. By detecting the image data in the calibration data, the position information of the marking feature points on the calibration board included in the image data can be determined, indicating the position coordinates of the marking feature points under the three-dimensional space coordinate system and under the image coordinate system.
In this embodiment, when the device to be calibrated is a pair of AR/VR/MR/XR glasses, the glasses include an outward-facing camera, an eye movement camera, and an inertial measurement unit, where the outward-facing camera is disposed on the outside of the glasses and the eye movement camera on the inside. Based on this, the calibration of the AR/VR/MR/XR glasses includes: calibration of the outward-facing camera and the inertial measurement unit, and calibration of the external parameters of the eye movement camera.
The calibration method of the outward camera and the inertial measurement unit is further described below.
In this embodiment, two calibration boards may be disposed in the first target space range, as shown in fig. 4, at a certain included angle to each other; the calibration boards include marking feature points. The control unit is connected with the device to be calibrated (the AR/VR/MR/XR glasses) through a data line or a wireless connection. Based on this, the glasses may be fixed on the head of the mechanical arm, which is controlled by the control unit and can drive the device to be calibrated through free motion with six degrees of freedom within the first target space range; for example, the control unit may control the mechanical arm to drive the device through a plurality of preset landmark points from an initial position to a final position. While moving along this path, the glasses collect calibration data: the outward-facing camera acquires image data of the calibration boards in the first target space range, and the inertial measurement unit acquires spatial data of the device moving in the first target space range, including attitude data and speed data during the movement.
After the image data and the spatial data are collected, the AR/VR/MR/XR glasses transmit them to the control unit. The control unit inputs the image data into the visual detection module, which detects the index numbers of the marking feature points of the calibration board included in the image data. From an index number, the three-dimensional coordinates of the marking feature point in the real world can be determined; from the position of the marking feature point in the image data, its image coordinates under the image coordinate system can be determined. The image coordinates and three-dimensional coordinates corresponding to each marking feature point are taken as one feature point pair, and the feature point pairs form a point pair set. The point pair set and the spatial data collected by the inertial measurement unit are then input into the parameter optimization module to determine the internal and external parameters of the outward-facing camera.
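The index-number-to-coordinates step above can be sketched as follows. The dot grid geometry (`cols`, `pitch_mm`, row-major indexing) is an illustrative assumption — the patent does not specify the board layout:

```python
import numpy as np

def board_point_3d(index, cols=10, pitch_mm=20.0):
    """Map a dot's index number to its 3D coordinate in the board frame.

    Assumes row-major indexing on a planar grid of dots (Z = 0 on the
    board); `cols` and `pitch_mm` are illustrative values.
    """
    row, col = divmod(index, cols)
    return np.array([col * pitch_mm, row * pitch_mm, 0.0])

def build_point_pairs(detections):
    """detections: list of (index, (u, v)) from the visual detection module.

    Returns (object_points Nx3, image_points Nx2) — the point pair set
    handed to the parameter optimization module.
    """
    obj = np.array([board_point_3d(i) for i, uv in detections])
    img = np.array([uv for i, uv in detections], dtype=float)
    return obj, img

obj, img = build_point_pairs([(0, (100.0, 101.0)), (11, (153.0, 150.0))])
```

With such point pair sets from many frames, plus the IMU stream, a bundle-adjustment-style optimizer can jointly refine the camera and IMU parameters.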
Fig. 5 is a schematic diagram of an image captured by VR glasses at a plurality of landmark points and a visual detection result of a feature point of a calibration board according to an embodiment of the present invention.
Optionally, if the outward camera carried by the equipment to be calibrated is a rolling shutter camera, the time compensation parameter of the rolling shutter camera can be determined based on the method.
Optionally, if the outward-facing camera carried by the device to be calibrated is a fisheye camera, which exhibits vignetting (a dark angle) at the edge of the image, the vignetting correction parameters of the fisheye camera can also be determined with this calibration method. For example, while the mechanical arm drives the device to be calibrated through its motion, the marker bits formed by the marking feature points appear at various positions of the image. Since the objective brightness of the marker bits in the real world is fixed, for a single camera the algorithm records the gray value gray_i of the white area inside the marker bits of each frame together with its image coordinates (x_i, y_i); polynomial fitting with the following formula then yields the vignetting correction parameters of the image:

gray_predict(u, v) = gray_measure(u, v) * (1 + k1*r + k2*r^2 + k3*r^3)

where gray_predict(u, v) is the gray value after vignetting correction, gray_measure(u, v) is the gray value of the original image, r = sqrt((u - u0)^2 + (v - v0)^2), u0 and v0 are the brightness center of the image to be fitted, and k1, k2, k3 are the fitting parameters.
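Once a reference gray value is fixed (e.g. the gray measured near the brightness center), the model is linear in k1, k2, k3 and reduces to ordinary least squares. A sketch under that assumption — the reference-gray convention and the helper name are ours, not the patent's:

```python
import numpy as np

def fit_vignetting(grays, coords, center, ref_gray):
    """Fit k1..k3 in  gray_predict = gray_measure * (1 + k1*r + k2*r^2 + k3*r^3).

    Assumes every sample should correct to ref_gray, which makes the
    model linear in k:  gray_i * (k1*r + k2*r^2 + k3*r^3) = ref_gray - gray_i.
    """
    u0, v0 = center
    g = np.asarray(grays, dtype=float)
    uv = np.asarray(coords, dtype=float)
    r = np.hypot(uv[:, 0] - u0, uv[:, 1] - v0)  # radius from brightness center
    A = g[:, None] * np.stack([r, r**2, r**3], axis=1)
    k, *_ = np.linalg.lstsq(A, ref_gray - g, rcond=None)
    return k  # (k1, k2, k3)

# Synthetic check: samples generated with k1 = 1e-3, k2 = k3 = 0
rs = np.array([10.0, 50.0, 100.0, 150.0, 200.0, 250.0])
grays = 200.0 / (1.0 + 1e-3 * rs)
k = fit_vignetting(grays, np.stack([rs, np.zeros(6)], axis=1), (0.0, 0.0), 200.0)
```

On the synthetic samples the fit recovers k1 ≈ 1e-3 with k2, k3 ≈ 0, matching the generating model.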
It should be noted that the calibration board in the above calibration environment includes common feature points and marking feature points, where the marking information distinguishing them includes at least one of: mark structure, mark color, mark gray value, and mark shape. For a device to be calibrated whose cameras face in substantially the same direction (such as a mobile phone or some AR glasses), only one calibration board of suitable area is needed. As shown in fig. 2, the calibration board is composed of dots and includes two marker bits, each composed of large dots and rings that differ from the common dots. The algorithm identifies the special dots in the marker bits and infers, from their local combined structure, which of the several marker bits has been detected; through the marker bit, the indices and coordinates of the other dots in the camera image can be accurately deduced, and this information is used to estimate the calibration parameters. Fig. 3 shows samples of some marker bits; marker bits can also be distinguished by other discriminative information such as color and shape.
It should be noted that there is often not enough common view area between the several cameras of some devices to be calibrated (such as VR glasses), that is, the camera orientations differ greatly, so a larger coverage area of the calibration board is required. One option is to increase the size of a single calibration board, but when the included angle between the camera's optical axis and the board's normal is large, feature point detection on the board easily fails or yields errors, and the calibration fails. Therefore, multiple calibration boards can instead be arranged at suitable angles according to the camera arrangement of the device to be calibrated. Fig. 6 is a schematic diagram of meeting the calibration requirements of VR glasses with two calibration boards according to an embodiment of the present invention. As shown in fig. 6, the algorithm identifies the ring/dot composition of the marker-bit patterns on each board, so the two boards can be distinguished: for example, marker bits comprising one ring plus two big dots indicate a type-A calibration board, while two rings plus one big dot indicate a type-B calibration board. The boards are distinguished essentially through their marker bits, so they can also be distinguished by other discriminative information such as the color, shape, or structure of the marker bits.
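The two-board example can be sketched as a trivial rule on the detected marker-bit composition. The labels ('ring', 'big_dot') and the rule mirror the fig. 6 example and are illustrative, not a fixed format:

```python
def classify_board(marker_bits):
    """Classify which calibration board a detection belongs to.

    marker_bits: list of special-element types detected among the
    board's marker bits, e.g. ['ring', 'big_dot', 'big_dot'].
    Following the fig. 6 example: one ring + two big dots -> type A,
    two rings + one big dot -> type B. A real system could equally
    discriminate by color, shape, or structure.
    """
    rings = marker_bits.count('ring')
    big_dots = marker_bits.count('big_dot')
    if rings == 1 and big_dots == 2:
        return 'A'
    if rings == 2 and big_dots == 1:
        return 'B'
    return 'unknown'
```

Once a board is identified, its own index-to-3D-coordinate table is used when building the point pair set, so detections from the two boards are never mixed.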
The calibration method of the eye movement camera is further described below.
In this embodiment, at least one outward camera calibration board and at least one eye movement camera calibration board may be disposed in the second target space range. Since the outward-facing camera and the eye movement camera of the device to be calibrated are designed back to back, their viewing directions are opposite; based on this, the two boards are disposed so that their faces containing the marking feature points are opposite each other. Fig. 7 is a schematic view of an eye movement camera calibration environment according to an embodiment of the present invention; as shown in fig. 7, the eye movement camera captures image data of the eye movement camera calibration board, and the outward-facing camera captures image data of the outward camera calibration board. Both boards include marking feature points, and after the boards are designed and installed, the relative positional relationship between the two boards can be obtained through measurement, giving the 3D coordinates of the marking feature points on both boards under the same coordinate system.
For example, the visual detection module detects the image coordinates of the marking feature points on the calibration boards and their corresponding index numbers; the three-dimensional coordinates of the marking feature points in the real world are calculated from the index numbers, and point pair sets of image coordinates and three-dimensional coordinates for the marking feature points on each board are obtained as before. The parameter optimization module then uses these point pair sets to solve for the external parameters of the eye movement camera and the outward-facing camera. In this embodiment, the outward-facing camera may be any one of the aforementioned cameras calibrated jointly with the IMU; generally, the most suitable camera is selected as the outward-facing camera according to its installation position and installation angle.
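Once each camera's pose relative to its own board has been solved (e.g. by PnP from its point pair set) and the board-to-board transform has been measured, the eye-to-outward extrinsic follows by composing homogeneous transforms. A sketch under those assumptions (the argument names are ours):

```python
import numpy as np

def se3(R, t):
    """Pack rotation R (3x3) and translation t (3,) into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def eye_to_outward_extrinsic(T_out_from_oboard, T_eye_from_eboard, T_oboard_from_eboard):
    """Compose the transform mapping eye-camera coordinates to
    outward-camera coordinates:

      T_out_from_eye = T_out_from_oboard @ T_oboard_from_eboard
                       @ inv(T_eye_from_eboard)

    where the first two arguments are board-to-camera poses from PnP
    and the third is the measured transform between the two boards.
    """
    return T_out_from_oboard @ T_oboard_from_eboard @ np.linalg.inv(T_eye_from_eboard)

I3 = np.eye(3)
T = eye_to_outward_extrinsic(
    se3(I3, [0.0, 0.0, 1.0]),  # outward board 1 m in front of outward camera
    se3(I3, [0.0, 0.0, 2.0]),  # eye board 2 m in front of eye camera
    se3(I3, [0.0, 0.0, 3.0]),  # measured eye-board -> outward-board offset
)
```

The identity rotations keep the numbers readable; with real back-to-back cameras the PnP rotations would differ by roughly 180 degrees, but the composition chain is unchanged.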
Fig. 8 is a schematic diagram of calibration board images captured in the second target space range according to an embodiment of the present application. As shown in fig. 8, the left image 801 is a calibration board image captured by the outward-facing camera of the VR glasses, the middle image 802 is captured by one eye movement camera, and the right image 803 by the other eye movement camera.
In this embodiment, the mechanical arm drives the device to be calibrated through a plurality of preset landmark points while the device simultaneously collects image data and IMU data, and a calibration algorithm yields the calibration parameters of the camera and the IMU, including the internal and external parameters of both; in addition, the calibration scheme can simultaneously achieve rolling shutter time compensation calibration and image vignetting correction calibration. Moving the device to be calibrated to the eye movement camera calibration position with the mechanical arm allows the external parameters of the eye movement camera and the outward-facing camera of the AR/VR/MR glasses to be calibrated, providing a coordinate conversion relationship for eye movement in a virtual scene.
Example three
According to the embodiment of the application, a calibration device of the equipment is also provided. It should be noted that the calibration device of the apparatus may be used to perform the calibration method of the apparatus in embodiment 1.
FIG. 9 is a schematic diagram of a calibration device of an apparatus according to an embodiment of the invention. As shown in fig. 9, the calibration device 900 of the apparatus may include: a first determining unit 901, an acquiring unit 902, a second determining unit 903, an optimizing unit 904.
The first determining unit 901 is configured to determine a target space range from candidate space ranges according to a type of a device to be calibrated, where at least one calibration board is disposed in the candidate space range, and the device to be calibrated includes a first device and a second device.
And the acquiring unit 902 is configured to acquire calibration data through the device to be calibrated in the target space range, where the calibration data includes image data acquired by the first device on the calibration board and optimization data acquired by the second device.
A second determining unit 903, configured to detect image data in the calibration data and determine position information of the marker feature point.
And the optimizing unit 904 is used for carrying out synchronous parameter optimization by combining the position information and the optimizing data to obtain the calibration parameters of the equipment to be calibrated.
Optionally, the candidate spatial range includes a first target spatial range in which at least one first camera calibration plate is disposed and a second target spatial range in which at least one first camera calibration plate and at least one second camera calibration plate are disposed, the first determining unit 901 includes: the first selection module is used for selecting the first target space range as the target space range when the type of the second equipment is the inertial measurement unit; and the second selection module is used for selecting the second target space range as the target space range when the type of the second equipment is the image acquisition equipment.
Optionally, the marking feature points on the calibration board correspond to index numbers, and the second determining unit 903 includes: the acquisition module, used for acquiring the index numbers of the marking feature points on the calibration board by detecting the image data in the calibration data; the first determining module, used for determining the position coordinates of the marking feature points under the three-dimensional space coordinate system based on the index numbers; the second determining module, used for determining the position coordinates of the marking feature points under the image coordinate system based on the positions of the marking feature points in the image data; and the third determining module, used for determining the position coordinates of the marking feature points under the three-dimensional space coordinate system and under the image coordinate system as the position information of the marking feature points.
Optionally, the acquisition unit 902 includes: the transmitting module, used for transmitting a data acquisition instruction to the device to be calibrated, where the data acquisition instruction instructs the device to be calibrated to start a data acquisition function so as to acquire the image data and the optimization data; and the receiving module, used for receiving the image data and the optimization data fed back by the device to be calibrated, where the image data and the optimization data form the calibration data.
Optionally, the apparatus 900 further includes: a third determining unit for determining a gray value of the marker feature point in the image data and a position coordinate of the marker feature point under the image coordinate system; and the fitting unit is used for performing polynomial fitting on the gray value and the position coordinates to obtain the dark angle correction parameters.
Example four
According to an embodiment of the present invention, there is also provided a computer-readable storage medium including a stored program, where, when the program runs, the device in which the computer-readable storage medium is located is controlled to execute the calibration method of the device in embodiment 1.
Example five
According to an embodiment of the present application, there is also provided a processor for running a program, wherein the program executes the calibration method of the apparatus in embodiment 1.
The foregoing embodiment numbers of the present application are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
In the foregoing embodiments of the present application, the descriptions of the embodiments are emphasized, and for a portion of this disclosure that is not described in detail in this embodiment, reference is made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technology may be implemented in other manners. The above-described embodiments of the apparatus are merely exemplary, and the division of the units, for example, may be a logic function division, and may be implemented in another manner, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interfaces, units or modules, or may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied essentially or in part or all of the technical solution or in part in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a removable hard disk, a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely a preferred embodiment of the present invention, and it should be noted that those skilled in the art may make modifications and adaptations without departing from the principles of the present invention, and such modifications and adaptations are also intended to fall within the scope of the present invention.

Claims (14)

1. A method of calibrating a device, comprising:
determining a target space range from candidate space ranges according to the types of equipment to be calibrated, wherein at least one calibration plate is deployed in the candidate space range, and the equipment to be calibrated comprises first equipment and second equipment;
in the target space range, acquiring calibration data through the equipment to be calibrated, wherein the calibration data comprises image data acquired by the first equipment on the calibration plate and optimization data acquired by the second equipment;
detecting the image data in the calibration data, and determining the position information of the mark characteristic points;
and carrying out synchronous parameter optimization by combining the position information and the optimization data to obtain the calibration parameters of the equipment to be calibrated.
2. The calibration method according to claim 1, wherein determining the target spatial range from the candidate spatial ranges according to the kind of the equipment to be calibrated comprises:
The candidate space range comprises a first target space range and a second target space range, wherein at least one first camera calibration plate is arranged in the first target space range, and at least one first camera calibration plate and at least one second camera calibration plate are arranged in the second target space range;
when the type of the second equipment is an inertial measurement unit, selecting the first target space range as the target space range;
and when the type of the second device is an image acquisition device, selecting the second target space range as the target space range.
3. The calibration method according to claim 2, wherein the first target space range includes a plurality of preset landmark points, and the equipment to be calibrated moves according to the preset landmark points, the calibration data being collected during the movement.
4. The calibration method according to claim 2, wherein in the second target spatial range, the first camera calibration plate and the second camera calibration plate are parallel, and the calibration plate patterns are disposed opposite.
5. The calibration method according to claim 1, wherein the marking feature points on the calibration board are provided with index numbers, the detecting the image data in the calibration data, and determining the position information of the marking feature points in the image data, includes:
Detecting the image data in the calibration data to obtain the index number of the marked characteristic point on the calibration plate;
determining position coordinates of the marking feature points under a three-dimensional space coordinate system based on the index numbers;
determining position coordinates of the marking feature points under an image coordinate system based on the positions of the marking feature points in the image data;
and determining the position coordinates of the marking characteristic points in the three-dimensional space coordinate system and the position coordinates of the marking characteristic points in the image coordinate system as the position information of the marking characteristic points.
6. The calibration method according to claim 1, wherein the obtaining calibration data by the device to be calibrated comprises:
transmitting a data acquisition instruction to the equipment to be calibrated, wherein the data acquisition instruction is used for instructing the equipment to be calibrated to start a data acquisition function so as to acquire the image data and the optimization data;
and receiving the image data and the optimization data fed back by the equipment to be calibrated, wherein the image data and the optimization data form the calibration data.
7. The calibration method according to claim 1, wherein the calibration plate comprises a normal feature point and the marker feature point, wherein the difference between the marker information of the normal feature point and the marker information of the marker feature point includes at least one of: mark structure, mark color, mark gray value, mark shape.
8. The calibration method according to claim 5, wherein the marking information of the marking feature points in the different calibration plates is different; the arrangement angles among the plurality of calibration plates are determined according to the arrangement positions and the view field angles between the first equipment and the second equipment, so that the first equipment and the second equipment can acquire complete marking information of the marking characteristic points.
9. The calibration method of claim 1, wherein the image capture device is a rolling shutter camera, and the calibration parameters include time compensation parameters of the rolling shutter.
10. The calibration method according to claim 1, wherein the image capturing device is a fisheye camera, and the calibration parameters include a vignetting correction parameter.
11. The calibration method according to claim 10, characterized in that the method further comprises:
determining gray values of marking feature points in the image data and position coordinates of the marking feature points under an image coordinate system;
and performing polynomial fitting on the gray value and the position coordinate to obtain the dark angle correction parameter.
12. A calibration device for an apparatus, comprising:
The first determining unit is used for determining a target space range from candidate space ranges according to the types of equipment to be calibrated, wherein at least one calibration plate is arranged in the candidate space ranges, and the equipment to be calibrated comprises first equipment and second equipment;
the acquisition unit is used for acquiring calibration data through the equipment to be calibrated in the target space range, wherein the calibration data comprise image data acquired by the first equipment on the calibration plate and optimization data acquired by the second equipment;
the second determining unit is used for detecting the image data in the calibration data and determining the position information of the marking characteristic points;
and the optimizing unit is used for carrying out synchronous parameter optimization by combining the position information and the optimizing data to obtain the calibration parameters of the equipment to be calibrated.
13. A calibration system for an apparatus, the calibration system comprising: the device comprises a control unit, equipment to be calibrated and a calibration plate, wherein the equipment to be calibrated comprises an image acquisition device and an inertial measurement unit;
the image acquisition equipment is used for acquiring image data of the calibration plate in the target space range;
The inertial measurement unit is deployed in the equipment to be calibrated and is used for collecting the space data of the equipment to be calibrated moving in the target space range;
the control unit is connected with the image acquisition device and the inertia measurement unit and is used for converting the image data and the space data into calibration parameters of the device to be calibrated.
14. A computer readable storage medium, characterized in that the computer readable storage medium comprises a stored computer program, wherein the computer program, when run, controls a device in which the computer readable storage medium is located to perform a calibration method of a device according to any one of claims 1 to 11.
CN202310508727.XA 2023-05-05 2023-05-05 Calibration method, device, system and storage medium of equipment Pending CN116577072A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310508727.XA CN116577072A (en) 2023-05-05 2023-05-05 Calibration method, device, system and storage medium of equipment


Publications (1)

Publication Number Publication Date
CN116577072A true CN116577072A (en) 2023-08-11

Family

ID=87535218

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310508727.XA Pending CN116577072A (en) 2023-05-05 2023-05-05 Calibration method, device, system and storage medium of equipment

Country Status (1)

Country Link
CN (1) CN116577072A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117147582A (en) * 2023-08-31 2023-12-01 上海百琪迈科技(集团)有限公司 Fabric flaw identification device and control method thereof
CN117147582B (en) * 2023-08-31 2024-05-17 上海百琪迈科技(集团)有限公司 Fabric flaw identification device and control method thereof

Similar Documents

Publication Publication Date Title
CN106643699B (en) Space positioning device and positioning method in virtual reality system
CN110728715B (en) Intelligent inspection robot camera angle self-adaptive adjustment method
CN103020952B (en) Messaging device and information processing method
JP7059355B2 (en) Equipment and methods for generating scene representations
CN112655024B (en) Image calibration method and device
WO2015190204A1 (en) Pupil detection system, gaze detection system, pupil detection method, and pupil detection program
JP2017112602A (en) Image calibrating, stitching and depth rebuilding method of panoramic fish-eye camera and system thereof
CN107289931B (en) A kind of methods, devices and systems positioning rigid body
US20110249117A1 (en) Imaging device, distance measuring method, and non-transitory computer-readable recording medium storing a program
CN113808160B (en) Sight direction tracking method and device
CN111968228B (en) Augmented reality self-positioning method based on aviation assembly
WO2008144370A1 (en) Camera-projector duality: multi-projector 3d reconstruction
CN114714356A (en) Method for accurately detecting calibration error of hand eye of industrial robot based on binocular vision
CN108154533A (en) A kind of position and attitude determines method, apparatus and electronic equipment
CN113691788A (en) Projection method of projector system
CN106767526A (en) A kind of colored multi-thread 3-d laser measurement method based on the projection of laser MEMS galvanometers
CN116577072A (en) Calibration method, device, system and storage medium of equipment
US20210133517A1 (en) Systems and methods for generating composite sets of data from different sensors
CN111199576B (en) Outdoor large-range human body posture reconstruction method based on mobile platform
JP5173551B2 (en) Vehicle perimeter monitoring apparatus and camera mounting position / posture information setting correction method applied thereto
CN110720113A (en) Parameter processing method and device, camera equipment and aircraft
KR20120108256A (en) Robot fish localization system using artificial markers and method of the same
CN104641395B (en) Image processing equipment and image processing method
WO2016194179A1 (en) Imaging device, endoscope and imaging method
CN113701750A (en) Fusion positioning system of underground multi-sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination