CN115345942A - Space calibration method and device, computer equipment and storage medium - Google Patents

Space calibration method and device, computer equipment and storage medium

Info

Publication number
CN115345942A
CN115345942A (application number CN202210897309.XA)
Authority
CN
China
Prior art keywords
target
pose
camera
coordinate system
calibration plate
Prior art date
Legal status (an assumption, not a legal conclusion): Pending
Application number
CN202210897309.XA
Other languages
Chinese (zh)
Inventor
姜文波
赵贵华
谭阳
蔺飞
范晓轩
熊伟
曾义
Current Assignee
China Media Group
Original Assignee
China Media Group
Priority date
Filing date
Publication date
Application filed by China Media Group filed Critical China Media Group
Priority to CN202210897309.XA
Publication of CN115345942A
Legal status: Pending

Classifications

    • G — PHYSICS › G06 — COMPUTING; CALCULATING OR COUNTING › G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration › G06T 7/85 — Stereo camera calibration
    • G06T 19/00 — Manipulating 3D models or images for computer graphics › G06T 19/006 — Mixed reality
    • G06T 7/70 — Determining position or orientation of objects or cameras › G06T 7/73 — using feature-based methods
    • G06T 2207/20 — Special algorithmic details › G06T 2207/20221 — Image fusion; image merging
    • G06T 2207/30 — Subject of image; context of image processing › G06T 2207/30244 — Camera pose


Abstract

The embodiments of the present application disclose a space calibration method and apparatus, a computer device, and a storage medium, relating to the technical field of image processing. After the pose information of the camera tracker and the calibration plate tracker is obtained, the local coordinate system of the camera and the world coordinate system of the capture system are chained through the calibration plate, yielding the pose of the camera's local coordinate system in the world coordinate system. This addresses the low spatial calibration accuracy of the prior art and improves calibration precision.

Description

Space calibration method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a space calibration method and apparatus, a computer device, and a storage medium.
Background
With the rapid development of television program production technology, spatial tracking of broadcast cameras has matured to support more diverse production styles. In the prior art, an active positioning device can be fixed to the camera; a positioning system obtains the position of this device, and because the camera is rigidly attached to it, the pose of the camera is obtained.
Disclosure of Invention
In order to solve the foregoing technical problem, embodiments of the present application provide a method and an apparatus for space calibration, a computer device, and a storage medium.
In a first aspect of an embodiment of the present application, a spatial calibration method is provided, where the method includes:
acquiring an initial pose of an object to be calibrated in the camera coordinate system of a target camera;
acquiring the target camera tracker pose of a target feature point in the current calibration plate in the camera coordinate system, and the calibration plate tracker pose of the reflective point of that feature point in the calibration plate coordinate system;
converting between the camera coordinate system and the calibration plate coordinate system according to the calibration plate tracker pose and the target camera tracker pose corresponding to the target feature point, and determining the relative pose of the initial pose in the calibration plate coordinate system;
and determining the target pose of the initial pose of the object to be calibrated in the world coordinate system according to the position information of the target feature point in the world coordinate system.
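The chaining of coordinate systems described in these steps can be sketched with homogeneous 4×4 transforms. The sketch below is illustrative only, not the patented implementation; the frame names and pose values are hypothetical.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a rotation matrix and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical poses:
# T_plate_cam: pose of the camera frame expressed in the calibration plate frame.
# T_world_plate: pose of the plate frame expressed in the world frame.
T_plate_cam = make_transform(np.eye(3), np.array([0.5, 0.0, 1.0]))
T_world_plate = make_transform(np.eye(3), np.array([2.0, 3.0, 0.0]))

# Chaining the two transforms through the plate yields the camera pose in the world frame.
T_world_cam = T_world_plate @ T_plate_cam

point_cam = np.array([0.0, 0.0, 0.0, 1.0])   # camera origin, homogeneous coordinates
point_world = T_world_cam @ point_cam
```

With identity rotations, the camera origin lands at the sum of the two translations, which is the "series connection" of camera, plate, and world frames the claims describe.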
In an optional embodiment of the present application, determining the target pose of the initial pose of the object to be calibrated in the world coordinate system further includes:
acquiring an initial extrinsic parameter matrix and initial intrinsic parameters of the target camera;
calculating the reprojection error of the target feature point in the current calibration plate according to the initial extrinsic matrix and initial intrinsic parameters;
if the reprojection error is greater than or equal to a preset threshold, adjusting the pose of the target camera and recalculating the extrinsic matrix and intrinsic parameters;
recalculating the reprojection error of the target feature point from the current extrinsic matrix and intrinsic parameters until the error is smaller than the preset threshold, then taking the current extrinsic matrix as the target extrinsic matrix and the current intrinsic parameters as the target intrinsic parameters;
and acquiring the target pose from the target extrinsic matrix and the target intrinsic parameters.
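The reprojection error that drives this iteration can be sketched as follows. This is a minimal pinhole model, assuming no lens distortion; the intrinsics, pose, and points below are synthetic assumptions, not values from the patent.

```python
import numpy as np

def project(K, R, t, points_3d):
    """Project Nx3 world points through a pinhole camera: x = K (R X + t),
    followed by the perspective division."""
    cam = points_3d @ R.T + t          # world frame -> camera frame
    uv = cam @ K.T                     # apply intrinsic matrix K
    return uv[:, :2] / uv[:, 2:3]      # perspective divide to pixel coordinates

def reprojection_error(K, R, t, points_3d, observed_2d):
    """Mean Euclidean pixel distance between projected and observed points."""
    return np.mean(np.linalg.norm(project(K, R, t, points_3d) - observed_2d, axis=1))

# Synthetic intrinsics and pose for a self-consistency check.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, 5.0])
pts = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.0]])
obs = project(K, R, t, pts)            # perfect observations -> error near zero
```

In the optimization loop of the claim, the parameters are adjusted until this error drops below the preset threshold.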
In an optional embodiment of the present application, acquiring the initial extrinsic matrix and initial intrinsic parameters of the target camera includes:
acquiring a calibration plate image of the current calibration plate;
acquiring position information of the target feature point in the calibration plate image;
acquiring the size parameter of the current calibration plate;
and calculating the initial intrinsic parameters and initial extrinsic matrix of the target camera according to the position information and the size parameter of the current calibration plate.
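The role of the size parameter can be illustrated by constructing the plate's feature points in its own frame: the grid spacing fixes the metric scale of the calibration. The function name and grid dimensions below are hypothetical; given these object points and the detected 2-D corners, a homography-based method such as Zhang's (e.g. OpenCV's `calibrateCamera`) recovers the intrinsics and extrinsics.

```python
import numpy as np

def board_object_points(rows, cols, square_size):
    """3-D coordinates of the calibration plate's feature points in the plate
    frame: a rows x cols grid on the Z = 0 plane, spaced square_size apart."""
    grid = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2).astype(float)
    pts = np.zeros((rows * cols, 3))
    pts[:, :2] = grid * square_size    # the size parameter sets the metric scale
    return pts

# A hypothetical 9x6 plate with 25 mm squares.
plate_pts = board_object_points(rows=6, cols=9, square_size=0.025)
```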
In an optional embodiment of the present application, obtaining the position information of the preset feature point in the current calibration plate image includes:
determining a target area in the current calibration plate image;
calculating the position information of the target feature point within the target area;
where the target area is an area that differs from its adjacent areas among the plurality of image areas of the calibration plate image.
In an alternative embodiment of the present application, determining the target area in the calibration plate image includes:
obtaining a characteristic parameter of each area in the calibration plate image by enhancing the contrast of the image, where the characteristic parameter includes at least one of a color value and a gray value;
traversing each area in the calibration plate image and calculating the difference between its characteristic parameter and those of the adjacent areas;
and if the difference for the current area is greater than a preset value, determining the current area to be the target area.
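The neighbour-difference test above can be sketched on a gray-value grid. This is one plausible reading of the claim, not the patented implementation; the grid size and threshold are arbitrary assumptions.

```python
import numpy as np

def find_target_regions(gray, grid=(4, 4), threshold=30.0):
    """Split a grayscale image into grid cells and flag cells whose mean gray
    value differs from the mean of their 4-neighbours by more than threshold."""
    h, w = gray.shape
    gh, gw = h // grid[0], w // grid[1]
    means = np.array([[gray[r*gh:(r+1)*gh, c*gw:(c+1)*gw].mean()
                       for c in range(grid[1])] for r in range(grid[0])])
    targets = []
    for r in range(grid[0]):
        for c in range(grid[1]):
            neighbours = []
            if c > 0:           neighbours.append(means[r, c-1])
            if c < grid[1] - 1: neighbours.append(means[r, c+1])
            if r > 0:           neighbours.append(means[r-1, c])
            if r < grid[0] - 1: neighbours.append(means[r+1, c])
            if abs(means[r, c] - np.mean(neighbours)) > threshold:
                targets.append((r, c))
    return targets

# A synthetic 8x8 image: uniform except one bright top-left region.
gray = np.full((8, 8), 100.0)
gray[0:2, 0:2] = 200.0
targets = find_target_regions(gray, grid=(4, 4), threshold=50.0)
```

Only the cell that stands out from its neighbours is flagged, mirroring the "difference greater than a preset value" condition.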
In an optional embodiment of the present application, acquiring the calibration plate tracker pose of the reflective point of the target feature point in the calibration plate coordinate system includes:
acquiring position information of the reflective point corresponding to at least one feature point in the current calibration plate;
and generating the calibration plate tracker pose from the position information of the reflective point in the calibration plate coordinate system.
In an optional embodiment of the present application, determining the relative pose of the initial pose in the calibration plate coordinate system includes:
calculating coordinate conversion parameters of the target camera tracker pose in the calibration plate coordinate system according to the calibration plate tracker pose and the target camera tracker pose corresponding to the target feature point;
and determining the relative pose of the initial pose in the calibration plate coordinate system according to the coordinate conversion parameters.
In an alternative embodiment of the present application, determining the target pose of the relative pose in the world coordinate system includes:
acquiring an initial pose of the target feature point in the calibration plate in the world coordinate system;
calculating the target coordinates of the target feature point in the calibration plate coordinate system according to the camera's intrinsic and extrinsic matrices;
and determining the target pose of the relative pose in the world coordinate system through the coordinate transformation between the initial pose and the target coordinates.
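Finding the "coordinate transformation between the initial pose and the target coordinates" amounts to fitting the rigid transform that maps one set of feature-point coordinates onto the other. A standard least-squares solution is the SVD-based Kabsch method — an assumption, since the patent does not name its solver:

```python
import numpy as np

def fit_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) with dst ~= src @ R.T + t,
    computed by the SVD-based Kabsch method on Nx3 point sets."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T      # guard against a reflection
    t = c_dst - R @ c_src
    return R, t
```

Given at least three non-collinear corresponding points in the two frames, the recovered (R, t) is exactly the transform that carries the relative pose into the world coordinate system.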
In an optional embodiment of the present application, acquiring the target camera tracker pose of the preset feature point in the camera coordinate system includes:
acquiring the position information of the target feature points collected by the target camera tracker;
and generating the target camera tracker pose according to the pose of that position information in the camera coordinate system.
In a second aspect of the embodiments of the present application, there is provided a space calibration apparatus, including:
the positioning module is used for acquiring an initial pose of an object to be calibrated in a camera coordinate system of the target camera;
the acquisition module is used for acquiring the target camera tracker pose of a target feature point in the current calibration plate in the camera coordinate system and the calibration plate tracker pose of the reflective point of that feature point in the calibration plate coordinate system;
the parameter calculation module is used for converting between the camera coordinate system and the calibration plate coordinate system according to the calibration plate tracker pose and the target camera tracker pose corresponding to the target feature point, and determining the relative pose of the initial pose in the calibration plate coordinate system;
and the calibration module is used for determining the target pose of the initial pose of the object to be calibrated in the world coordinate system according to the position information of the target feature point in the world coordinate system.
In a third aspect of the embodiments of the present application, there is provided a computer device comprising a memory storing a computer program and a processor that implements the steps of any of the methods above when executing the computer program.
In a fourth aspect of the embodiments of the present application, there is provided a computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method as in any one of the above.
According to the space calibration method, the camera tracker and the calibration plate tracker realize multi-dimensional positioning of the target positioning point, which improves the accuracy of the computed intrinsic and extrinsic matrices of the target camera. From these matrices, the initial pose of the target object in the camera's local coordinate system can be calculated and then converted into a pose in the world coordinate system, ultimately improving spatial calibration accuracy. The calibration process provided by the embodiments of the present application is simple, convenient to operate, and highly accurate.
Through spatial calibration of the target camera, its pose information in the current space can be acquired and processed by a virtual rendering engine, which provides the virtual camera in the engine with the real camera's pose in three-dimensional space along with the target camera's intrinsic and extrinsic matrices. Combined with real-time rendering, keying, and compositing technologies, the motion and framing of the virtual and real cameras are synchronized, so that live footage and virtual elements can be composited in real time. The camera can thus continuously obtain its pose in the world coordinate system while moving freely in a large space, facilitating its association with virtual scenes in broadcast television production.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic view of an application scenario of the spatial calibration method according to an embodiment of the present application;
fig. 2 is a first flowchart of the spatial calibration method according to an embodiment of the present application;
fig. 3 is a schematic diagram of a calibration plate in the spatial calibration method according to an embodiment of the present application;
fig. 4 is a second flowchart of the spatial calibration method according to an embodiment of the present application;
fig. 5 is a third flowchart of the spatial calibration method according to an embodiment of the present application;
fig. 6 is a fourth flowchart of the spatial calibration method according to an embodiment of the present application;
fig. 7 is a fifth flowchart of the spatial calibration method according to an embodiment of the present application;
fig. 8 is a sixth flowchart of the spatial calibration method according to an embodiment of the present application;
fig. 9 is a seventh flowchart of the spatial calibration method according to an embodiment of the present application;
fig. 10 is an eighth flowchart of the spatial calibration method according to an embodiment of the present application;
fig. 11 is a ninth flowchart of the spatial calibration method according to an embodiment of the present application;
fig. 12 is a first schematic structural diagram of the space calibration apparatus according to an embodiment of the present application;
fig. 13 is a second schematic structural diagram of the space calibration apparatus according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
In the course of implementing the present application, the inventors found that current methods for spatially calibrating a camera suffer from poor precision. In view of this problem, an embodiment of the present application provides a spatial calibration method.
The scheme in the embodiments of the present application can be implemented in various computer languages, such as the object-oriented programming language Java and the scripting language JavaScript.
To make the technical solutions and advantages of the embodiments of the present application clearer, exemplary embodiments are described in further detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present application, not an exhaustive list. It should be noted that, in the absence of conflict, the embodiments and the features of the embodiments in the present application may be combined with each other.
In the prior art, an active calibration method for a camera may proceed as follows: an active positioning device is fixed to the camera, a positioning system obtains the position of the active positioning device, and since the camera is rigidly connected to the positioning device, the pose of the camera is obtained. Active calibration generally requires fixed marker points installed in the scene to determine the camera position. This approach depends on the fixed markers and a fixed venue: if few markers are deployed, or the markers are scattered, the spatial span between the initial position and the current position is large when calculating the camera pose, and positioning accuracy is poor.
In view of this, to solve the technical problem of low spatial calibration precision, an embodiment of the present application provides a spatial calibration method. A camera tracker is arranged on the surface of the camera, the pose information of the camera tracker and of the calibration plate tracker is captured, the camera's local coordinate system and the world coordinate system are chained through the calibration plate, and the pose of the camera's local coordinate system in the world coordinate system is finally obtained.
The method achieves fast and accurate camera calibration by combining passively positioned camera calibration with a passive marker point capture system.
The following briefly describes the application environment of the spatial calibration method provided by the embodiment of the present application:
The method is applied to a space calibration system 10, which includes a hardware system 101, a software system 102, and an auxiliary system 103; the software system 102 is connected to the hardware system 101 and the auxiliary system 103 respectively. The hardware system 101 controls the camera to acquire the position information of the feature points in the calibration plate and the calibration plate image according to control instructions from the software system 102, and transmits the acquired data back to the software system 102.
Referring to fig. 1, the hardware system mentioned in the present application includes at least a camera, a camera tracker disposed on the surface of the camera, at least one calibration plate, and a control system connected to the camera, the camera tracker, and the calibration plate respectively. The control system collects the reflective points of the target feature points generated when the camera shoots the feature points in the calibration plate; these reflective points form a rigid body, which is recorded as the calibration plate tracker.
The software system in the space calibration system controls the hardware system to collect the calibration plate image and the position information of the feature points in the calibration plate, controls the auxiliary system to collect the pose data of the feature points, and converts the position information of the target object between different coordinate systems according to the position information, the pose data, and the calibration plate image data. This realizes the spatial calibration of the target object, optimizes the conversion process between coordinate systems, and improves calibration precision.
The software system in the space calibration system at least comprises an acquisition controller, an image processor, a camera parameter calculator, a camera parameter optimizer, a camera local pose calculator, a camera local pose optimizer and the like.
The acquisition controller controls the camera to capture images of the calibration plate as it moves freely in space, and controls the capture system to acquire the positions and orientations of the camera tracker and the calibration plate tracker in real time. The image processor preprocesses the calibration plate images acquired by the camera and extracts the position coordinates of the checkerboard feature points. The camera parameter calculator computes the camera's extrinsic matrices relative to the several calibration plates. The camera parameter optimizer optimizes the camera's intrinsic parameters and extrinsic matrices to minimize the reprojection error. The camera local pose calculator computes the pose of the camera coordinate system relative to the camera tracker coordinate system. The camera local pose optimizer optimizes the camera's local pose to obtain the pose with the minimum reprojection error.
The auxiliary capture system in the space calibration system captures the pose of the camera tracker and the pose of the calibration plate tracker in the hardware system and sends them to the software system. It comprises an auxiliary hardware system (an infrared camera array and a server) and an auxiliary software system (passive marker capture software); the auxiliary software system controls the auxiliary hardware system to collect the reflective points on the calibration plate. The infrared cameras may include multiple near-infrared cameras evenly distributed around the large camera motion space to ensure the camera's movements always remain within their capture zone. When the near-infrared multi-camera setup observes the calibration plate, the coordinates of multiple passive reflective points can be obtained in real time, e.g., at a frequency of 120 Hz, and the rigid-body pose formed by the reflective points is tracked.
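Recovering a reflective point's 3-D coordinates from two or more synchronized infrared views can be sketched with standard linear (DLT) triangulation. The projection matrices, baseline, and pixel coordinates below are synthetic assumptions for illustration, not values from the patent.

```python
import numpy as np

def triangulate_point(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation: recover a 3-D marker position from its
    pixel observations uv1, uv2 under two 3x4 camera projection matrices."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                      # null-space vector, homogeneous coordinates
    return X[:3] / X[3]

# Two synthetic cameras sharing intrinsics K, separated by a 1 m baseline.
K = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.2, 0.1, 5.0, 1.0])
uv1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
uv2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
X = triangulate_point(P1, P2, uv1, uv2)
```

Running this per marker at the capture rate (e.g. 120 Hz) yields the point cloud from which the rigid-body pose of the tracker is fitted.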
Referring to fig. 2, the following embodiment takes the control system as the execution subject and applies the spatial calibration method provided by the embodiment of the present application to the control system, taking the spatial calibration of the target camera as a specific example. The method includes the following steps 201 to 204:
step 201, acquiring an initial pose of an object to be calibrated in a camera coordinate system of the target camera.
The camera coordinate system takes the optical center of the target camera's lens as its origin; through this coordinate system, object position information can be determined from the camera's point of view.
The initial pose includes at least position information and orientation information, so that the translation and rotation of the object can be described in a three-dimensional coordinate system.
The object to be calibrated can be a tracker deployed on the surface of the camera, and the pose of the local coordinate system of the camera in the world coordinate system can be calculated by acquiring the initial pose of the camera tracker, so that the spatial calibration of the camera is completed.
Step 202, acquiring the target camera tracker pose of the target feature point in the current calibration plate in the camera coordinate system, and the calibration plate tracker pose of the reflective point of that feature point in the calibration plate coordinate system.
The target camera tracker mentioned in the embodiments of the present application refers to a tracker disposed on the surface of the target camera, which can be used to locate it. The target camera tracker pose is the pose generated by the camera tracker when shooting the target feature points in the calibration plate. The reflective point of a target feature point is the reflective point generated when the camera photographs that feature point of the calibration plate. The rigid-body pose formed by the reflective points is recorded as the calibration plate tracker, which makes it convenient to track the pose information of the calibration plate's reflective points: the pose of the rigid body in the calibration plate coordinate system is determined, and the calibration plate tracker pose is generated from it.
As shown in fig. 3, the calibration plate mentioned in the embodiment of the present application includes a plurality of feature points and reflection points corresponding to the feature points, where a solid black circle in the drawing is a feature point, and a hollow white circle in the drawing is a reflection point corresponding to a feature point.
The calibration plate tracker pose and the target camera tracker pose can be acquired at the same time, which guarantees the temporal synchronization of the two poses obtained by the system and thereby improves the accuracy of the spatial computation.
The infrared cameras in the auxiliary tracking system can be controlled to capture, in real time, the passive feature points that reflect infrared light, and the positions of these points are triangulated to obtain the target camera tracker pose.
By acquiring the calibration plate tracker pose and the target camera tracker pose, the pose information of multiple target positioning points can be captured; the additional captured pose information provides more reference data for calculating the relative pose of the initial pose in the calibration plate coordinate system, ultimately improving the accuracy of the spatial calibration.
Step 203, converting between the camera coordinate system and the calibration plate coordinate system according to the calibration plate tracker pose and the target camera tracker pose corresponding to the target feature points, and determining the relative pose of the initial pose in the calibration plate coordinate system.
The conversion relation of the feature points from the camera coordinate system to the calibration plate coordinate system can be determined from the pose information of the same feature point in the two coordinate systems, that is, how the relative pose in the calibration plate coordinate system is obtained from the initial pose by translation and rotation.
Step 204, determining the target pose of the initial pose of the object to be calibrated in the world coordinate system according to the position information of the target feature points in the world coordinate system.
The local coordinate system of the camera and the world coordinate system of the capture system are chained through the calibration plate to obtain the pose of the camera's local coordinate system in the world coordinate system. In the spatial calibration method provided by the embodiment of the present application, a camera tracker is deployed on the target camera; the pose information of the camera tracker and the position information of the calibration plate tracker are obtained, and the camera's local coordinate system and the capture system's world coordinate system are chained through the calibration plate, yielding the pose of the camera's local coordinate system in the world coordinate system. The camera tracker and the calibration plate tracker realize multi-dimensional positioning of the target positioning point, which improves the accuracy of the computed intrinsic and extrinsic matrices of the target camera; from these matrices, the initial pose of the target object in the camera's local coordinate system can be calculated and converted into a pose in the world coordinate system, ultimately improving spatial calibration accuracy. The calibration process is simple, convenient to operate, and highly accurate.
Through spatial calibration of the target camera, its pose information in the current space can be acquired and processed by a virtual rendering engine, which provides the virtual camera in the engine with the real camera's pose in three-dimensional space along with the target camera's intrinsic and extrinsic matrices. Combined with real-time rendering, keying, and compositing technologies, the motion and framing of the virtual and real cameras are synchronized, so that live footage and virtual elements can be composited in real time. The camera can thus continuously obtain its pose in the world coordinate system while moving freely in a large space, facilitating its association with virtual scenes in broadcast television production.
Referring to fig. 4, in an optional embodiment of the present application, the step 202 of acquiring the pose of the target camera tracker of the target feature point in the camera coordinate system includes the following steps 401 to 402:
step 401, obtaining position information of the target feature point collected by the target camera tracker.
The target feature points mentioned in the embodiments of the present application may be a plurality of feature points in the current calibration board. According to the embodiment of the application, a plurality of calibration plates can be deployed in the current space, namely the large environment, and the relative pose of the target camera relative to the calibration plates can be conveniently calculated by capturing the position information of the feature points in the calibration plates.
And step 402, generating the pose of the target camera tracker according to the position information in the camera coordinate system.
According to the embodiment of the application, the conversion relation between the camera coordinate system and the calibration plate coordinate system can be established by capturing the pose of the target camera tracker, so that the relative pose in the calibration plate coordinate system can be determined according to the initial pose in the camera coordinate system, the target pose of a target object in a world coordinate system is determined through the calibration plate, and finally space calibration is completed; by acquiring the pose of the target camera tracker, multi-dimensional positioning can be realized, so that the accuracy of space calibration is improved.
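The chaining of the camera local coordinate system to the world coordinate system through the calibration plate can be sketched with homogeneous 4x4 transforms. This is a minimal numpy sketch; the rotation and translation values are illustrative placeholders, not measured poses from the method:

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Illustrative pose of the calibration plate in the world coordinate system
T_world_plate = make_transform(np.eye(3), np.array([1.0, 2.0, 0.0]))

# Illustrative pose of the camera in the calibration plate coordinate system
theta = np.pi / 2  # 90 degree rotation about z
R_pc = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                 [np.sin(theta),  np.cos(theta), 0.0],
                 [0.0, 0.0, 1.0]])
T_plate_camera = make_transform(R_pc, np.array([0.5, 0.0, 0.0]))

# Chaining through the plate: camera pose in the world coordinate system
T_world_camera = T_world_plate @ T_plate_camera

# A point at the camera origin maps to the camera's position in world coordinates
p_cam = np.array([0.0, 0.0, 0.0, 1.0])
p_world = T_world_camera @ p_cam
print(p_world[:3])  # camera origin expressed in world coordinates
```

Composing further transforms (e.g. tracker-to-plate, plate-to-capture-system) follows the same matrix-product pattern.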
Referring to fig. 5, in an alternative embodiment of the present application, the step 202 of obtaining the pose of the calibration plate tracker of the reflection point of the target feature point in the calibration plate coordinate system includes the following steps 501 to 502:
step 501, obtaining position information of a light reflecting point corresponding to at least one characteristic point in a current calibration board.
The reflection points refer to reflection points generated by target feature points when the target camera captures the target feature points in the current calibration plate.
And 502, generating the position and the pose of the calibration plate tracker according to the position information of the reflection point in a coordinate system of the calibration plate.
And determining the specific parameter values of the reflection point position information in the calibration plate coordinate system to generate the pose of the calibration plate tracker.
According to the embodiment of the application, the conversion relation between the camera coordinate system and the calibration plate coordinate system can be established by capturing the pose of the calibration plate tracker, so that the relative pose in the calibration plate coordinate system can be determined according to the initial pose in the camera coordinate system, the target pose of a target object in the world coordinate system is determined through the calibration plate, and finally space calibration is completed; by acquiring the position and pose of the calibration plate tracker, multi-dimensional positioning can be realized, so that the precision of space calibration is improved; and the local coordinate system of the camera and the world coordinate system of the capture system are connected in series through the calibration plate, so that the pose of the same object in different coordinate systems can be calculated conveniently, and the space calibration efficiency is improved.
Referring to fig. 6, in an alternative embodiment of the present application, the determining the relative pose of the initial pose in the calibration plate coordinate system in step 203 includes the following steps 601-602:
step 601, calculating coordinate conversion parameters of the target camera tracker pose in a calibration plate coordinate system according to the calibration plate tracker pose and the target camera tracker pose corresponding to the target feature points.
The coordinate transformation parameters include at least rotation, translation and scale parameters.
The target feature points are converted from the target camera coordinate system into the calibration plate coordinate system, that is, three-dimensional points are projected onto the two-dimensional imaging plane to form image points; the displacement vector and rotation matrix of this conversion are acquired, and the coordinate conversion parameters of the target camera tracker pose in the calibration plate coordinate system are calculated.
And step 602, determining the relative pose of the initial pose in the coordinate system of the calibration plate according to the coordinate conversion parameters.
By acquiring the coordinate conversion parameters, how the initial pose must be rotated and translated can be calculated, and the relative pose in the calibration plate coordinate system is finally obtained.
According to the method and the device, the conversion relation between different coordinate systems is established through the pose of the same characteristic point in the calibration plate and the pose of the same characteristic point in the camera coordinate system, and the efficiency of calculating the initial pose of the target object in the local coordinate system of the camera and converting the initial pose of the target object into the pose of the target object in the world coordinate system is improved.
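The projection of three-dimensional points onto the two-dimensional imaging plane mentioned in step 601 can be sketched with the pinhole model. The intrinsic matrix K and the pose R, t below are illustrative assumptions, not calibrated values:

```python
import numpy as np

def project_points(points_3d, K, R, t):
    """Project Nx3 points in the plate coordinate system to pixel coordinates
    via the pinhole model: p ~ K (R X + t)."""
    X_cam = points_3d @ R.T + t      # rotate and translate into the camera frame
    x = X_cam @ K.T                  # apply the intrinsic matrix
    return x[:, :2] / x[:, 2:3]      # perspective division

# Illustrative intrinsics: focal length 800 px, principal point (320, 240)
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                        # camera axes aligned with the plate
t = np.array([0.0, 0.0, 2.0])        # plate 2 m in front of the camera

pts = np.array([[0.0, 0.0, 0.0],
                [0.1, 0.0, 0.0]])
proj = project_points(pts, K, R, t)
print(proj)  # [[320. 240.] [360. 240.]]
```

The rotation matrix and displacement vector of this projection are exactly the coordinate conversion parameters described in step 601.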
Referring to fig. 7, in an optional embodiment of the present application, the embodiment of the present application further discloses that the calculation accuracy of the target pose is optimized, including the following steps 701 to 705:
and 701, acquiring an initial external parameter matrix and initial camera internal parameters of the target camera.
The camera internal parameters are used to indicate physical parameters of the target camera's imaging, and at least comprise: principal point, focal length, radial lens distortion, tangential (decentering) lens distortion, and other systematic error parameters.
The camera external parameter is used for indicating a conversion relation between coordinate systems, and at least comprises the following components: rotation matrix, relative displacement. The rotation matrix is the rotation angle of the camera in the world coordinate system, namely the camera can be parallel to the world coordinate system through the transformation of the rotation matrix; the relative displacement is the displacement of the camera in the world coordinate system, namely the world coordinate system can be obtained by the camera through relative translation after passing through the rotation matrix.
According to the embodiment of the application, the initial external reference matrix and the initial camera internal reference of the target camera can be calculated by acquiring the coordinates of the feature points in the calibration plate coordinate system and the real physical size coordinates of the calibration plate.
For example, the initial external parameter matrix and initial camera internal parameters of the target camera can be solved by applying Zhang's planar calibration algorithm (Zhang Zhengyou calibration), and the calculation steps include:
Firstly, the homography matrix H equation between the calibration plate plane and the camera imaging plane is established. Secondly, the homography matrix H is solved; since H is defined only up to scale, it has 8 degrees of freedom, so at least 4 points on the calibration plate are needed to establish the equations, and because the number of feature points on a typical calibration plate is far greater than 4, an over-determined system is established and solved. Then, the orthogonality constraints among the components of the rotation matrix are applied, where K represents the camera internal parameters. Finally, matrix decomposition is carried out by the square root (Cholesky) method to obtain the internal parameter matrix, and the rotation matrix and translation vector of each calibration plate image are obtained.
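The homography-solving step can be sketched with the direct linear transform (DLT), which forms two equations per correspondence and takes the null vector of the resulting system via SVD. This is a minimal numpy sketch of that one step (the plate and image points are illustrative), not the full Zhang calibration pipeline:

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate H (defined up to scale, 8 DOF) from >= 4 point
    correspondences src -> dst using the DLT null-space method."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the right-singular vector of the smallest singular value
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the scale ambiguity

# Four plate-plane points and their (illustrative) image projections
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(10, 10), (110, 12), (108, 112), (12, 110)]
H = estimate_homography(src, dst)

# Verify: H maps a plate point onto its image point
p = H @ np.array([1.0, 0.0, 1.0])
print(p[:2] / p[2])  # approx (110, 12)
```

With more than 4 points the same SVD solves the over-determined system in the least-squares sense, as the description above requires.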
And step 702, calculating the reprojection error of the target characteristic point in the current calibration plate according to the initial external parameter matrix and the initial camera internal parameter.
The reprojection error refers to the difference between the observed image position of a real three-dimensional space point and its reprojected position on the image plane.
The reprojection error of the target feature point can be obtained by calculating the residual value between the coordinate point of the calibration plate in the camera coordinate system, which is reprojected to the image plane, and the image coordinate in the calibration plate coordinate system.
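The residual computation described above can be sketched as follows, assuming a distortion-free pinhole model; the K, R, t and point values are illustrative assumptions:

```python
import numpy as np

def reprojection_rmse(points_3d, observed_2d, K, R, t):
    """RMS reprojection error between observed image points and the
    projections of the plate's 3D points under the current K, R, t."""
    X_cam = points_3d @ R.T + t
    x = X_cam @ K.T
    projected = x[:, :2] / x[:, 2:3]
    residuals = projected - observed_2d
    return float(np.sqrt(np.mean(np.sum(residuals ** 2, axis=1))))

K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, 2.0])
pts = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0]])

# Perfect observations give zero error; a 1-pixel offset gives error 1.0
obs = np.array([[320.0, 240.0], [360.0, 240.0]])
print(reprojection_rmse(pts, obs, K, R, t))                # 0.0
print(reprojection_rmse(pts, obs + [1.0, 0.0], K, R, t))   # 1.0
```

Comparing this value against the preset threshold drives the adjust-and-recalculate loop of steps 703 and 704.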
And 703, if the reprojection error is greater than or equal to a preset threshold value, adjusting the posture of the target camera, and recalculating the external parameter matrix and the camera internal parameters.
And the posture of the target camera is adjusted so that the reprojected points coincide as closely as possible with the observed feature points in the current calibration plate.
And after the posture of the target camera is adjusted, position information and pose information of the calibration plate are obtained again, and a target external parameter matrix and a target camera internal parameter are calculated.
According to the embodiment of the application, the posture of the target camera can be optimized by using a bundle-adjustment nonlinear optimization algorithm to obtain the camera pose with the minimum reprojection error, and the camera pose can be optimized by using the Levenberg-Marquardt (LM) algorithm.
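The Levenberg-Marquardt iteration (damped Gauss-Newton) can be sketched on a deliberately simplified stand-in problem: refining a 2D rotation angle and translation against observed points, rather than the full 6-DOF camera pose with intrinsics. The structure (residual, Jacobian, damped normal equations, accept/reject step) matches the full bundle-adjustment case:

```python
import numpy as np

def lm_refine_pose_2d(src, obs, iters=50):
    """Refine (theta, tx, ty) so that R(theta) @ p + t matches obs,
    using the Levenberg-Marquardt damped Gauss-Newton scheme."""
    theta, tx, ty = 0.0, 0.0, 0.0
    lam = 1e-3
    for _ in range(iters):
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, -s], [s, c]])
        r = (src @ R.T + np.array([tx, ty]) - obs).ravel()  # residual vector
        # Jacobian of each residual component w.r.t. (theta, tx, ty)
        J = np.zeros((r.size, 3))
        dR = np.array([[-s, -c], [c, -s]])                  # dR/dtheta
        J[:, 0] = (src @ dR.T).ravel()
        J[0::2, 1] = 1.0                                    # x residuals vs tx
        J[1::2, 2] = 1.0                                    # y residuals vs ty
        # Damped normal equations: (J^T J + lam I) dx = -J^T r
        dx = np.linalg.solve(J.T @ J + lam * np.eye(3), -J.T @ r)
        cand = (theta + dx[0], tx + dx[1], ty + dx[2])
        c2, s2 = np.cos(cand[0]), np.sin(cand[0])
        r2 = (src @ np.array([[c2, -s2], [s2, c2]]).T
              + np.array(cand[1:]) - obs).ravel()
        if r2 @ r2 < r @ r:          # accept the step, relax the damping
            theta, tx, ty = cand
            lam *= 0.5
        else:                        # reject the step, increase the damping
            lam *= 10.0
    return theta, tx, ty

src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
c, s = np.cos(0.3), np.sin(0.3)
obs = src @ np.array([[c, -s], [s, c]]).T + np.array([0.5, -0.2])
theta, tx, ty = lm_refine_pose_2d(src, obs)
print(round(theta, 3), round(tx, 3), round(ty, 3))  # approx 0.3 0.5 -0.2
```

In the full method, the residual would be the reprojection error of step 702 and the parameters would include the camera pose (and possibly intrinsics).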
Step 704, calculating a reprojection error of the target feature point in the current calibration board according to the external parameter matrix and the camera internal parameters of the current target camera until the reprojection error is smaller than a preset threshold, determining the current external parameter matrix as a target external parameter matrix, and determining the current camera internal parameters as the target camera internal parameters.
In the embodiment of the application, making the reprojection error smaller than the preset threshold serves as the optimization target, that is, minimizing the residual between the reprojection of the calibration plate's 3D coordinate points onto the image plane and the 2D image coordinates, so as to reduce the error when the feature points are converted from the calibration plate coordinate system to the target pose in the world coordinate system.
Step 705, acquiring a target pose according to the target external parameter matrix and the target camera internal parameter.
In the method provided by the embodiment of the application, the initial pose serves as the initial value of the optimization parameters, the 3D feature point coordinates in the checkerboard calibration plate's own coordinate system serve as the true values, the 3D feature point coordinate distance error serves as the objective function, and the final local pose of the camera is obtained through iterative optimization.
According to the optimization process provided by the embodiment of the application, the accuracy of the internal reference matrix and the external reference matrix of the camera can be optimized by adjusting the posture of the camera, the initial pose of the target object in the local coordinate system of the camera is calculated through the optimized internal reference matrix and the optimized external reference matrix of the camera, and the initial pose is converted into the pose in the world coordinate system, so that the accuracy of space calibration is further improved.
Referring to fig. 8, in an optional embodiment of the present application, the step 701 of acquiring the initial external reference matrix and the initial camera internal reference of the target camera includes the following steps 801 to 804:
step 801, obtaining a calibration board image of the current calibration board.
And acquiring the calibration board image of the current calibration board within the acquisition range of the camera.
And step 802, acquiring position information of the target feature point in the calibration plate image.
The position information of the target feature point refers to the position information of the target feature point in the coordinate system of the calibration plate.
And step 803, acquiring the dimension parameter of the current calibration plate.
The size parameter refers to the actual physical size of the current calibration plate.
The size parameters of the current calibration plate can be obtained through actual measurement, or can be obtained by deploying the calibration plate with a preset size in advance and reading the preset size.
And step 804, calculating the initial camera internal parameters and external parameter matrix of the target camera according to the position information and the size parameter of the current calibration plate.
Wherein the camera intrinsic parameters are used for indicating imaging parameters of the camera; the external reference matrix is used for indicating how to convert the positions of the target objects in the world coordinate system and the calibration board coordinate system.
The specific calculation process may be as described in step 701 above, and is not described herein again.
According to the method provided by the embodiment of the application, the initial pose of the target object in the local coordinate system of the camera can be calculated through the internal reference matrix and the external reference matrix of the camera, and the initial pose is converted into the pose in the world coordinate system, so that the space calibration accuracy is further improved.
Referring to fig. 9, in an alternative embodiment of the present application, the step 802 of obtaining the position information of the target feature point in the calibration board image includes the following steps 901 to 902:
and step 901, determining a target area in the calibration plate image.
By collecting a calibration plate moving in space, a calibration plate image is generated.
The calibration plate image can be divided into a plurality of areas, and in the plurality of areas of the calibration plate image, the target area is an area which is obviously different from an adjacent area in the calibration plate image, for example, a gray value is larger than the adjacent area.
When calculating the target area, a blob detection algorithm may be used to extract the target area in the calibration plate, such as an elliptical blob area.
And step 902, calculating position information of the target characteristic points in the target area.
For example, the blob centroid coordinates can be calculated from the image moments, and the position information of the target feature point is then obtained from the blob centroid coordinates.
The image moment can be used for indicating the global feature of the calibration board image and can also indicate the image information of the calibration board image, and the image information at least comprises information such as image size, position, direction and shape.
The centroid coordinate refers to the intensity-weighted average position of the points in the region.
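The image-moment centroid computation can be sketched as follows. The raw moments m00, m10, m01 are standard; the synthetic 3x3 blob is an illustrative input:

```python
import numpy as np

def blob_centroid(img):
    """Centroid of a grayscale/binary blob from raw image moments:
    m00 = sum(I), m10 = sum(x*I), m01 = sum(y*I); centroid = (m10/m00, m01/m00)."""
    ys, xs = np.indices(img.shape)
    m00 = img.sum()
    m10 = (xs * img).sum()
    m01 = (ys * img).sum()
    return float(m10 / m00), float(m01 / m00)

# A 3x3 bright blob centered at (x=3, y=2) in a 5x6 image
img = np.zeros((5, 6))
img[1:4, 2:5] = 1.0
print(blob_centroid(img))  # (3.0, 2.0)
```

Because the moments are intensity-weighted, the same function gives sub-pixel centroids for non-binary blobs, which is what makes moment-based feature localization accurate.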
According to the embodiment provided by the application, the target area is determined in the calibration plate image, the areas different from other areas of the calibration plate image can be rapidly distinguished, and the position information of the target feature point is calculated in the target area, so that the target feature point can be more accurately identified.
Referring to fig. 10, in an alternative embodiment of the present application, the step 901 determines the target area in the calibration board image, and includes the following steps 1001 to 1003:
step 1001, obtaining characteristic parameters of each area in the calibration plate image by performing contrast increasing processing on the calibration plate image.
The contrast enhancement processing may be for improving the picture contrast or improving the overall balance of the picture.
For example, the contrast of the calibration plate image may be enhanced by image gray scale conversion, or histogram mapping process.
Dividing the calibration board image into N areas according to a preset rule, and sequentially calculating the characteristic parameters of each area in the N areas.
The characteristic parameter includes at least one of a color value and a gray value.
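One of the gray-scale conversion options mentioned above, a linear contrast stretch, can be sketched as follows (a minimal numpy sketch; the 2x2 input image is illustrative):

```python
import numpy as np

def stretch_contrast(img):
    """Linearly stretch gray values to the full 0-255 range."""
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    if hi == lo:                      # flat image: nothing to stretch
        return np.zeros_like(img, dtype=np.uint8)
    return ((img - lo) / (hi - lo) * 255.0).round().astype(np.uint8)

img = np.array([[100, 120],
                [140, 160]], dtype=np.uint8)
print(stretch_contrast(img))  # [[0 85] [170 255]]
```

Histogram equalization or histogram mapping would serve the same purpose of increasing the gray-value differences between areas before the per-area comparison.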
Step 1002, traversing each region in the calibration plate image, and calculating the difference value between the characteristic parameter of each region and the characteristic parameter in the adjacent region.
The difference in the calculated characteristic parameters may be calculated by calculating a gray value, or pixel value, of each region.
The route of traversing the calibration board image can be traversed in a Z shape, and can also be traversed according to a preset route.
And step 1003, if the difference value of the current area is greater than a preset value, determining the current area as the target area.
The preset value can be set according to a user or according to a historical comparison result.
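Steps 1001 to 1003 can be sketched together as below. The grid size, the preset threshold, and the choice of "mean gray value brighter than the neighbouring areas" (following the example given earlier for the target area) are illustrative assumptions:

```python
import numpy as np

def find_target_blocks(img, n=4, preset=50.0):
    """Split img into an n x n grid, compare each block's mean gray value
    with the mean of its 4-neighbours, and flag blocks exceeding preset."""
    h, w = img.shape
    bh, bw = h // n, w // n
    means = np.array([[img[i*bh:(i+1)*bh, j*bw:(j+1)*bw].mean()
                       for j in range(n)] for i in range(n)])
    targets = []
    for i in range(n):                      # traverse every region in order
        for j in range(n):
            neigh = [means[i+di, j+dj]
                     for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1))
                     if 0 <= i+di < n and 0 <= j+dj < n]
            # flag regions whose gray value exceeds the neighbourhood mean
            if means[i, j] - np.mean(neigh) > preset:
                targets.append((i, j))
    return targets

img = np.zeros((40, 40))
img[10:20, 20:30] = 255.0   # one bright block, landing in grid cell (1, 2)
print(find_target_blocks(img))  # [(1, 2)]
```

The traversal here is a simple row-major (Z-shaped) scan; any preset route would work as the text notes.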
In the embodiment provided by the application, by performing contrast enhancement processing on the calibration plate image, the target area can be determined more accurately and the contrast of the target feature points in the calibration plate image is highlighted, so as to improve recognizability in the calibration plate and calculate the coordinates of the feature points more efficiently.
Referring to fig. 11, in an alternative embodiment of the present application, determining the target pose of the relative pose in the world coordinate system includes the following steps 1101-1103:
and 1101, acquiring an initial pose of the target feature point in the calibration board in a world coordinate system.
And acquiring the initial pose of the target feature point in a world coordinate system, namely the initial pose in the current space through an infrared camera.
And 1102, calculating target coordinates of the target feature points in a coordinate system of a calibration plate according to the internal reference matrix and the external reference matrix of the camera.
And acquiring the target coordinates of the target feature points in the coordinate system of the calibration plate by calculating the positions of the target feature points when the target feature points are projected to the current calibration plate.
When calculating the pose of the target camera in the capture system coordinate system, the calibration plate serves as a bridge associating the target camera with the capture system, linking camera coordinates with world coordinates. By actually measuring the projected positions of the target feature points, the relative pose relationship between the calibration plate tracker and the calibration plate coordinate system can be obtained; a complete chain of relationships is thereby established, yielding the relative pose of the target camera coordinate system with respect to the camera tracker coordinate system, that is, the local pose of the camera, and the pose of the target camera coordinate system in the world coordinate system of the current space.
And 1103, determining the target pose of the relative pose in the world coordinate system through coordinate conversion between the initial pose and the target coordinate.
The coordinate transformation refers to a process of transforming from one coordinate system to another coordinate system, and by the transformation process, the method determines how to transform the coordinate system of the calibration plate into a position under the world coordinate system through rotation and translation, so as to determine the target pose of the relative pose in the world coordinate system.
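One common way to recover such a rotation-plus-translation between two coordinate systems from matched point pairs is the Kabsch (SVD) method; the patent does not name a specific algorithm, so this is a hedged sketch with illustrative points:

```python
import numpy as np

def rigid_transform(src, dst):
    """Find R, t such that dst ≈ src @ R.T + t (Kabsch algorithm)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)         # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # guard against reflections
    t = cd - R @ cs
    return R, t

# Points in the calibration plate coordinate system (illustrative)
src = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
# The same points after a known rotation about z and a translation
theta = 0.5
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
dst = src @ Rz.T + np.array([1.0, 2.0, 3.0])

R, t = rigid_transform(src, dst)
print(np.allclose(R, Rz), np.allclose(t, [1.0, 2.0, 3.0]))  # True True
```

Given R and t, converting the relative pose from the calibration plate coordinate system into the world coordinate system is exactly the rotation-and-translation step described above.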
According to the embodiment provided by the application, multi-dimensional positioning can be realized by acquiring the position and the posture of the calibration plate tracker, so that the precision of space calibration is improved; and the calibration plate is connected with the local coordinate system of the camera and the world coordinate system of the capture system in series, so that the pose of the same object in different coordinate systems can be calculated conveniently, and the efficiency of space calibration is improved.
In the embodiment provided by the application, the accuracy of the camera local pose can be optimized: the camera local pose serves as the initial value of the optimization parameters, the 3D feature point coordinates in the checkerboard's own coordinate system serve as the true values, and the 3D feature point coordinate distance error serves as the objective function; the transformation matrices among the calibration plate coordinate system, the camera coordinate system, the camera tracker coordinate system, the capture system coordinate system, and the calibration plate tracker coordinate system are calculated, the 3D feature point coordinates are transformed into the corresponding coordinate system to obtain observed values, and the final local pose of the camera is obtained through iterative optimization.
It should be understood that, although the steps in the flowchart are shown in order as indicated by the arrows, the steps are not necessarily performed in that order. Unless explicitly stated herein, the steps are not strictly limited to the order illustrated and may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or multiple stages that are not necessarily performed at the same time but may be performed at different times, and these sub-steps or stages are not necessarily executed sequentially but may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
Referring to fig. 12, an embodiment of the present application provides a space calibration apparatus 120, including:
the positioning module 1201 is used for acquiring an initial pose of an object to be calibrated in a camera coordinate system of the target camera;
the acquisition module 1202 is configured to acquire a pose of a target camera tracker of a target feature point in a current calibration plate in a camera coordinate system and a pose of a calibration plate tracker of a reflection point of the target feature point in the calibration plate coordinate system;
a parameter calculation module 1203, configured to perform conversion between a camera coordinate system and a calibration plate coordinate system according to the calibration plate tracker pose and the target camera tracker pose corresponding to the target feature point, and determine a relative pose of the initial pose in the calibration plate coordinate system;
and the calibration module 1204 is configured to determine, according to the position information of the target feature point in the world coordinate system, a target pose of the initial pose of the object to be calibrated in the world coordinate system.
Referring to fig. 13, in an alternative embodiment of the present application, the space calibration apparatus 120 further includes an optimization module 1205, where the optimization module 1205 is configured to:
acquiring an initial external parameter matrix and an initial camera internal parameter of the target camera;
calculating a reprojection error of the target feature point in the current calibration plate according to the initial external reference matrix and the initial camera internal reference;
if the reprojection error is larger than or equal to a preset threshold value, adjusting the posture of the target camera, and recalculating the external parameter matrix and the camera internal parameters;
calculating the reprojection error of the target feature points in the current calibration plate according to the external parameter matrix and camera internal parameters of the current target camera until the reprojection error is smaller than the preset threshold, determining the current external parameter matrix as the target external parameter matrix, and determining the current camera internal parameters as the target camera internal parameters;
and acquiring the pose of the target according to the external reference matrix of the target and the internal reference of the target camera.
In an optional embodiment of the present application, the optimization module 1205 in the space calibration apparatus 120 is further configured to:
acquiring a calibration plate image of a current calibration plate;
acquiring position information of a target feature point in a calibration plate image;
acquiring size parameters of a current calibration plate;
and calculating the camera internal reference and external reference matrixes of the target camera according to the position information and the size parameters of the current calibration plate.
In an optional embodiment of the present application, the optimization module 1205 in the spatial calibration apparatus 120 is further configured to:
determining a target area in the current calibration plate image;
calculating position information of the target feature points in the target area;
the target area is an area different from an adjacent area in a plurality of image areas of the calibration board image.
In an optional embodiment of the present application, the optimization module 1205 in the space calibration apparatus 120 is further configured to:
obtaining a characteristic parameter of each area in the calibration plate image by performing contrast increasing processing on the calibration plate image, wherein the characteristic parameter comprises at least one of a color value and a gray value;
traversing each area in the calibration plate image, and calculating the difference value between the characteristic parameter of each area and the characteristic parameter in the adjacent area;
and if the difference value of the current area is greater than the preset value, determining the current area as the target area.
In an optional embodiment of the present application, the acquisition module 1202 in the spatial calibration apparatus 120 is further configured to:
acquiring position information of a light reflecting point corresponding to at least one characteristic point in the current calibration plate;
and generating the position and the pose of the calibration plate tracker according to the position information of the reflection point in the coordinate system of the calibration plate.
In an optional embodiment of the present application, the acquisition module 1202 in the spatial calibration apparatus 120 is further configured to:
acquiring position information of the target feature points collected by the target camera tracker;
and generating the pose of the target camera tracker according to the pose of the position information in the camera coordinate system.
In an optional embodiment of the present application, the parameter calculating module 1203 in the space calibrating apparatus 120 is further configured to:
calculating coordinate conversion parameters of the target camera tracker pose in a calibration plate coordinate system according to the calibration plate tracker pose and the target camera tracker pose corresponding to the target feature points;
and determining the relative pose of the initial pose in the coordinate system of the calibration plate according to the coordinate conversion parameters.
In an optional embodiment of the present application, the calibration module 1204 in the space calibration apparatus 120 is further configured to:
acquiring an initial pose of a target feature point in a calibration plate in a world coordinate system;
calculating the target coordinates of the target feature points in a coordinate system of a calibration plate according to the internal reference matrix and the external reference matrix of the camera;
and determining the target pose of the relative pose in the world coordinate system according to the coordinate transformation between the initial pose and the target coordinate.
Through the space calibration apparatus provided by the application, multi-dimensional positioning of the target positioning point can be realized through the camera tracker and the calibration plate tracker; the multi-dimensional positioning improves the accuracy of calculating the camera internal parameter and external parameter matrices of the target camera, the initial pose of the target object in the camera local coordinate system can be calculated through these matrices and converted into the pose in the world coordinate system, and finally the accuracy of space calibration is improved. The space calibration apparatus provided by the application has a simple calibration process, convenient operation, and high positioning accuracy.
For the specific definition of the space calibration apparatus, reference may be made to the definition of the space calibration method above, and details are not described herein again. The modules in the space calibration apparatus may be implemented in whole or in part by software, hardware, or combinations thereof. The modules can be embedded in hardware form in, or be independent of, a processor in the computer device, or be stored in software form in a memory of the computer device, so that the processor can call and execute the operations corresponding to the modules.
In one embodiment, a computer device is provided, and the internal structure of the computer device can be as shown in fig. 14. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used for storing data. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by the processor to implement the space calibration method above; that is, the computer device comprises a memory storing a computer program and a processor that implements the steps of any of the above space calibration methods when executing the computer program.
In an embodiment, a computer-readable storage medium is provided, on which a computer program is stored; when the computer program is executed by a processor, any of the above space calibration methods is implemented.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the preferred embodiments of the present application have been described, those skilled in the art may make additional variations and modifications to these embodiments once they learn of the basic inventive concept. Therefore, the appended claims are intended to be interpreted as covering the preferred embodiments and all alterations and modifications that fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (12)

1. A space calibration method is characterized by comprising the following steps:
acquiring an initial pose of an object to be calibrated in a camera coordinate system of a target camera;
acquiring a target camera tracker pose, in the camera coordinate system, of a target feature point in a current calibration plate, and a calibration plate tracker pose, in the calibration plate coordinate system, of a reflection point of the target feature point;
converting between the camera coordinate system and the calibration plate coordinate system according to the calibration plate tracker pose and the target camera tracker pose corresponding to the target feature point, and determining a relative pose of the initial pose in the calibration plate coordinate system;
and determining a target pose of the relative pose in the world coordinate system according to position information of the target feature point in the world coordinate system, the target pose being the pose of the object to be calibrated in the world coordinate system.
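The chain of coordinate transformations recited in claim 1 (camera frame to calibration plate frame to world frame) can be sketched with homogeneous 4x4 transforms. All poses and numeric values below are hypothetical illustrations, not taken from the patent:

```python
import numpy as np

def make_pose(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical poses: identity rotations and pure translations, for illustration only.
T_cam_obj = make_pose(np.eye(3), np.array([0.0, 0.0, 2.0]))      # object in the camera frame
T_board_cam = make_pose(np.eye(3), np.array([1.0, 0.0, 0.0]))    # camera frame -> plate frame
T_world_board = make_pose(np.eye(3), np.array([5.0, 5.0, 0.0]))  # plate frame -> world frame

# Relative pose of the initial pose in the calibration plate coordinate system.
T_board_obj = T_board_cam @ T_cam_obj
# Target pose of the relative pose in the world coordinate system.
T_world_obj = T_world_board @ T_board_obj

print(T_world_obj[:3, 3])  # -> [6. 5. 2.]
```

Because rigid transforms compose by matrix multiplication, the same pattern extends to arbitrary rotations without changing the code.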
2. The spatial calibration method according to claim 1, wherein the determining of the target pose in the world coordinate system further comprises:
acquiring an initial external parameter matrix and initial camera internal parameters of the target camera;
calculating a reprojection error of a target feature point in the current calibration plate according to the initial external parameter matrix and the initial camera internal parameters;
if the reprojection error is greater than or equal to a preset threshold, adjusting the pose of the target camera, and recalculating the external parameter matrix and the camera internal parameters;
calculating a reprojection error of a target feature point in the current calibration plate according to an external parameter matrix and camera internal parameters of a current target camera until the reprojection error is smaller than a preset threshold value, determining the current external parameter matrix as a target external parameter matrix, and determining the current camera internal parameters as target camera internal parameters;
and acquiring the pose of the target according to the external parameter matrix of the target and the internal parameters of the target camera.
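The reprojection-error check of claim 2 can be illustrated with a plain pinhole model (no lens distortion). The intrinsic matrix, extrinsics, and feature points below are made up for the sketch; a real implementation would re-estimate the parameters and loop until the error drops below the preset threshold:

```python
import numpy as np

def reproject(K, R, t, points_3d):
    """Project 3D plate points through a pinhole model (no distortion)."""
    cam = (R @ points_3d.T).T + t      # plate frame -> camera frame
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:3]      # perspective divide

def reprojection_error(K, R, t, points_3d, observed_2d):
    """Mean Euclidean distance between reprojected and observed feature points."""
    proj = reproject(K, R, t, points_3d)
    return np.mean(np.linalg.norm(proj - observed_2d, axis=1))

# Hypothetical camera internal parameters and external parameters.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, 1.0])
pts = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.0]])

observed = reproject(K, R, t, pts)  # here the observations match the model exactly
err = reprojection_error(K, R, t, pts, observed)
print(err)  # -> 0.0 for a perfectly consistent model
```

In the iteration of the claim, `err` would be recomputed after each adjustment of the camera parameters and compared against the preset threshold.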
3. The spatial calibration method according to claim 2, wherein the obtaining of the initial external parameter matrix and the initial camera internal parameters of the target camera comprises:
acquiring a calibration plate image of a current calibration plate;
acquiring position information of target feature points in the calibration plate image;
acquiring the size parameter of the current calibration plate;
and calculating the initial camera internal parameters and the initial external parameter matrix of the target camera according to the position information and the size parameter of the current calibration plate.
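A common precursor to the computation in claim 3 is generating the plate's 3D corner coordinates from its size parameters. This sketch assumes a planar checkerboard-style plate; the row count, column count, and square size are illustrative, not from the patent:

```python
import numpy as np

def board_object_points(rows, cols, square_size):
    """3D coordinates of calibration-plate corners in the plate's own frame (Z = 0)."""
    grid = np.zeros((rows * cols, 3))
    grid[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2) * square_size
    return grid

# Hypothetical plate: 6x9 inner corners, 25 mm squares.
obj_pts = board_object_points(rows=6, cols=9, square_size=0.025)
print(obj_pts.shape)  # -> (54, 3)
```

These object points, paired with the detected image positions of the feature points, are what a standard solver such as OpenCV's `cv2.calibrateCamera` consumes to estimate the camera internal parameters and the external parameter matrix.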
4. The space calibration method according to claim 3, wherein the obtaining of the position information of the target feature point in the calibration plate image comprises:
determining a target area in the current calibration plate image;
calculating position information of the target feature points in the target area;
wherein the target area is an area differing from an adjacent area among a plurality of image areas of the calibration plate image.
5. The spatial calibration method according to claim 4, wherein the determining the target area in the calibration plate image comprises:
obtaining a characteristic parameter of each area in the calibration plate image by increasing the contrast of the calibration plate image, wherein the characteristic parameter comprises at least one of a color value and a gray value;
traversing each area in the calibration board image, and calculating the difference value between the characteristic parameter of each area and the characteristic parameter in the adjacent area;
and if the difference value of the current area is greater than the preset value, determining the current area as the target area.
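The neighbor-contrast test of claims 4-5 can be sketched in one dimension. The gray values and threshold below are hypothetical, and this sketch adopts one reading of the claim: an area is flagged only when it differs from every adjacent area by more than the preset value:

```python
def find_target_areas(gray_values, threshold):
    """Return indices of areas whose gray value differs from all neighbors by more than threshold."""
    targets = []
    for i, v in enumerate(gray_values):
        neighbors = []
        if i > 0:
            neighbors.append(gray_values[i - 1])
        if i < len(gray_values) - 1:
            neighbors.append(gray_values[i + 1])
        if neighbors and all(abs(v - n) > threshold for n in neighbors):
            targets.append(i)
    return targets

# Hypothetical per-area gray values after contrast enhancement; area 2 stands out.
areas = [120, 118, 240, 121, 119]
print(find_target_areas(areas, threshold=50))  # -> [2]
```

A 2D implementation would apply the same comparison over a grid of image areas instead of a list.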
6. The space calibration method according to claim 1, wherein the obtaining of the calibration plate tracker pose of the reflection point of the target feature point in the calibration plate coordinate system comprises:
acquiring position information of a reflection point corresponding to at least one feature point in the current calibration plate;
and generating the calibration plate tracker pose according to the position information of the reflection point in the calibration plate coordinate system.
7. The spatial calibration method of claim 1, wherein the determining the relative pose of the initial pose in the calibration plate coordinate system comprises:
calculating coordinate conversion parameters of the target camera tracker pose in a calibration plate coordinate system according to the calibration plate tracker pose and the target camera tracker pose corresponding to the target feature points;
and determining the relative pose of the initial pose in a coordinate system of the calibration plate according to the coordinate conversion parameters.
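The coordinate conversion parameters of claim 7 amount to composing one tracker pose with the inverse of the other, when both poses are expressed in the tracking system's common frame. The tracker readings below are hypothetical:

```python
import numpy as np

def invert_pose(T):
    """Invert a rigid 4x4 transform using the closed form, without a general matrix inverse."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

# Hypothetical tracker readings: both poses expressed in the tracking system's frame.
T_track_cam = np.eye(4)
T_track_cam[:3, 3] = [2.0, 0.0, 0.0]
T_track_board = np.eye(4)
T_track_board[:3, 3] = [2.0, 1.0, 0.0]

# Conversion parameters mapping the camera coordinate system into the plate coordinate system.
T_board_cam = invert_pose(T_track_board) @ T_track_cam
print(T_board_cam[:3, 3])  # translation of the camera frame in the plate frame
```

Applying `T_board_cam` to the initial pose then yields the relative pose in the calibration plate coordinate system, as in the second step of the claim.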
8. The spatial calibration method according to claim 1, wherein the determining of the target pose of the relative pose in the world coordinate system comprises:
acquiring an initial pose of a target feature point in the calibration plate in the world coordinate system;
calculating target coordinates of the target feature point in the calibration plate coordinate system according to the camera internal parameters and the external parameter matrix;
and determining the target pose of the relative pose in the world coordinate system through coordinate transformation between the initial pose and the target coordinates.
9. The space calibration method according to claim 1, wherein the obtaining of the target camera tracker pose comprises:
acquiring position information of the target feature point by means of the target camera tracker;
and generating the target camera tracker pose according to the pose of the position information in the camera coordinate system.
10. A spatial calibration apparatus, comprising:
the positioning module is used for acquiring an initial pose of an object to be calibrated in a camera coordinate system of the target camera;
the acquisition module is used for acquiring a target camera tracker pose, in the camera coordinate system, of a target feature point in a current calibration plate, and a calibration plate tracker pose, in the calibration plate coordinate system, of a reflection point of the target feature point;
the parameter calculation module is used for converting between the camera coordinate system and the calibration plate coordinate system according to the calibration plate tracker pose and the target camera tracker pose corresponding to the target feature point, and determining a relative pose of the initial pose in the calibration plate coordinate system;
and the calibration module is used for determining a target pose of the relative pose in the world coordinate system according to position information of the target feature point in the world coordinate system, the target pose being the pose of the object to be calibrated in the world coordinate system.
11. A computer device, comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method according to any one of claims 1 to 9 when executing the computer program.
12. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 9.
CN202210897309.XA 2022-07-28 2022-07-28 Space calibration method and device, computer equipment and storage medium Pending CN115345942A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210897309.XA CN115345942A (en) 2022-07-28 2022-07-28 Space calibration method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210897309.XA CN115345942A (en) 2022-07-28 2022-07-28 Space calibration method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115345942A true CN115345942A (en) 2022-11-15

Family

ID=83950325

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210897309.XA Pending CN115345942A (en) 2022-07-28 2022-07-28 Space calibration method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115345942A (en)


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116524022A (en) * 2023-04-28 2023-08-01 北京优酷科技有限公司 Offset data calculation method, image fusion device and electronic equipment
CN116524022B (en) * 2023-04-28 2024-03-26 神力视界(深圳)文化科技有限公司 Offset data calculation method, image fusion device and electronic equipment
CN117765098A (en) * 2024-02-22 2024-03-26 青岛科技大学 Camera auxiliary calibration method, system, equipment and storage medium

Similar Documents

Publication Publication Date Title
TWI555379B (en) An image calibrating, composing and depth rebuilding method of a panoramic fish-eye camera and a system thereof
US8600192B2 (en) System and method for finding correspondence between cameras in a three-dimensional vision system
CN111750820B (en) Image positioning method and system
CN110782394A (en) Panoramic video rapid splicing method and system
CN114399554B (en) Calibration method and system of multi-camera system
US11488322B2 (en) System and method for training a model in a plurality of non-perspective cameras and determining 3D pose of an object at runtime with the same
KR20160116075A (en) Image processing apparatus having a function for automatically correcting image acquired from the camera and method therefor
CN115345942A (en) Space calibration method and device, computer equipment and storage medium
JPWO2018235163A1 (en) Calibration apparatus, calibration chart, chart pattern generation apparatus, and calibration method
JP2007192832A (en) Calibrating method of fish eye camera
CN112444242A (en) Pose optimization method and device
CN108362205B (en) Space distance measuring method based on fringe projection
CN111461963B (en) Fisheye image stitching method and device
JP2019032218A (en) Location information recording method and device
CN115423863B (en) Camera pose estimation method and device and computer readable storage medium
CN110544278B (en) Rigid body motion capture method and device and AGV pose capture system
CN113643436A (en) Depth data splicing and fusing method and device
JP7489253B2 (en) Depth map generating device and program thereof, and depth map generating system
Gaspar et al. New Dynamic Estimation of Depth from Focus in Active Vision Systems-Data Acquisition, LPV Observer Design, Analysis and Test
KR101673144B1 (en) Stereoscopic image registration method based on a partial linear method
CN116012227A (en) Image processing method, device, storage medium and processor
JP2005275789A (en) Three-dimensional structure extraction method
Georgiev et al. A fast and accurate re-calibration technique for misaligned stereo cameras
WO2021014538A1 (en) Template creation device, object recognition processing device, template creation method, object recognition processing method, and program
CN111489384A (en) Occlusion assessment method, device, equipment, system and medium based on mutual view

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination