CN113052915A - Camera external parameter calibration method and device, augmented reality system, terminal device and storage medium - Google Patents

Camera external parameter calibration method and device, augmented reality system, terminal device and storage medium

Info

Publication number
CN113052915A
CN113052915A (application number CN202110237550.5A)
Authority
CN
China
Prior art keywords
rigid body
camera
coordinate system
central point
acquiring
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110237550.5A
Other languages
Chinese (zh)
Inventor
吴迪云
许秋子
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Realis Multimedia Technology Co Ltd
Original Assignee
Shenzhen Realis Multimedia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Realis Multimedia Technology Co Ltd filed Critical Shenzhen Realis Multimedia Technology Co Ltd
Priority to CN202110237550.5A priority Critical patent/CN113052915A/en
Publication of CN113052915A publication Critical patent/CN113052915A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration

Abstract

The application provides a camera external parameter calibration method, which comprises the following steps: fixing the position of the rigid body, taking one of the reflective mark points on the rigid body as the central point of the rigid body, and acquiring three-dimensional coordinate data of the central point of the rigid body; acquiring a rigid body central point image shot by the current camera in real time, and acquiring pixel coordinate data of the rigid body central point; moving the rigid body to a proper position after each rigid body central point image is collected, and acquiring the three-dimensional coordinates and pixel coordinates of the rigid body central point at the current position; collecting three-dimensional space coordinates of multiple groups of rigid body central points at different positions, simultaneously obtaining the corresponding two-dimensional code corner point coordinates, and calculating the rotation transformation relation R_cw and the translation transformation relation t_cw from the world coordinate system to the camera body coordinate system; and calculating the external parameters of the camera according to R_cw and t_cw.

Description

Camera external parameter calibration method and device, augmented reality system, terminal device and storage medium
Technical Field
The application belongs to the technical field of computers, and particularly relates to a camera external parameter calibration method and device, an augmented reality system, terminal equipment and a storage medium.
Background
Augmented reality (AR) technology is a relatively new technology that promotes the integration of real-world information and virtual-world information. Entity information that is difficult to experience directly within the real-world space is simulated on terminal devices such as computers, and the resulting virtual information content is superimposed on the real world, where it can be effectively applied and perceived by the human senses, producing a sensory experience beyond reality. Once the real environment and virtual objects are superimposed, they can exist in the same picture and space at the same time, which is why AR technology has attracted more and more attention. However, an augmented reality system needs to calculate the position and posture of the real camera in the virtual three-dimensional space, that is, the external parameters of the camera, in order to superimpose real-world information and virtual-world information. How to calculate the external parameters of the camera quickly therefore becomes crucial.
Disclosure of Invention
In view of this, embodiments of the present application provide a method and an apparatus for calibrating external parameters of a camera, an augmented reality system, a terminal device, and a storage medium, which can effectively and quickly obtain the external parameters of the camera.
In a first aspect, an embodiment of the present application provides a method for calibrating external parameters of a camera, where the method includes:
fixing the position of the rigid body, taking one of the reflective mark points on the rigid body as the central point of the rigid body, and acquiring three-dimensional coordinate data of the central point of the rigid body;
acquiring a rigid body central point image shot by a current camera in real time, and acquiring pixel coordinate data of the rigid body central point;
moving the rigid body to a proper position after each rigid body central point image is collected, and acquiring the three-dimensional coordinates and pixel coordinates of the rigid body central point at the current position;
collecting three-dimensional space coordinates of multiple groups of rigid body central points at different positions, simultaneously obtaining the corresponding two-dimensional code corner point coordinates, and calculating the rotation transformation relation R_cw from the world coordinate system to the camera body coordinate system and the translation transformation relation t_cw from the world coordinate system to the camera body coordinate system;
and calculating the external parameters of the camera according to the R_cw and the t_cw.
The three-dimensional coordinate data of the rigid body central point can be transmitted through the VRPN (Virtual-Reality Peripheral Network) communication protocol.
The moving the rigid body to a proper position includes:
moving the rigid body by more than 20cm while ensuring that the rigid body is within the visible range of the camera.
The collecting three-dimensional space coordinates of multiple groups of rigid body central points at different positions includes:
collecting three-dimensional space coordinates of rigid body central points at more than 12 different positions, or collecting three-dimensional space coordinate data of rigid body central points at 9 to 20 different positions.
The rotation transformation relation R_cw and the translation transformation relation t_cw from the world coordinate system to the camera body coordinate system can be calculated according to the EPnP algorithm.
Calculating the external parameters of the camera according to the R_cw and the t_cw includes:
calculating the position t_c and the attitude r_c of the camera in the optical virtual space coordinate system according to the formulas
t_c = -(R_cw)^T · t_cw,  r_c = (R_cw)^T;
the t_c and r_c are the external parameters of the camera.
In a second aspect, an embodiment of the present application provides a camera external parameter calibration apparatus, where the apparatus includes:
The obtaining module is used for fixing the position of the rigid body, taking one of the reflective mark points on the rigid body as the central point of the rigid body, and acquiring three-dimensional coordinate data of the rigid body central point;
the acquisition module is used for acquiring a rigid body central point image shot by a current camera in real time and acquiring pixel coordinate data of the rigid body central point; moving the rigid body to a proper position after each rigid body central point image is collected, and acquiring the three-dimensional coordinates and pixel coordinates of the rigid body central point at the current position;
the calculation module is used for acquiring three-dimensional space coordinates of a plurality of groups of rigid body center points at different positions, acquiring corresponding two-dimensional code corner point coordinates simultaneously, and calculating the rotation transformation relation from a world coordinate system to a camera body coordinate system
Figure BDA0002962540350000031
And translation transformation relation from the world coordinate system to the camera body coordinate system
Figure BDA0002962540350000032
According to the above
Figure BDA0002962540350000033
And said
Figure BDA0002962540350000034
And calculating to obtain the external parameters of the camera.
In a third aspect, an embodiment of the present application provides an augmented reality system, where the augmented reality system includes a rigid body, a camera, and a processor, and the augmented reality system is in communication connection with a large-space optical positioning system to receive position and posture information of the rigid body;
the camera is used for shooting the rigid body and is in communication connection with the processor;
the processor is configured to perform external reference calibration on the camera according to the camera external reference calibration method of any one of the first aspect.
In a fourth aspect, an embodiment of the present application provides a terminal device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the camera external parameter calibration method described in any one of the above first aspects when executing the computer program.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored; the computer program, when executed by a processor, implements the camera external parameter calibration method described in any one of the first aspect.
In a sixth aspect, an embodiment of the present application provides a computer program product, which, when running on a terminal device, causes the terminal device to execute the camera external parameter calibration method described in any one of the first aspect.
It can be understood that the beneficial effects of the second to sixth aspects may be found in the relevant description of the first aspect; for example, the external parameters of the camera can be calculated quickly and effectively by the calibration method of the present application, and the method is simple to operate and low in cost. Details are not repeated here.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly described below. Obviously, the drawings in the following description are only some embodiments of the present application, and those of ordinary skill in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is a schematic flow chart of a camera external reference calibration method according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a camera external reference calibration apparatus according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an augmented reality system provided in an embodiment of the present application;
fig. 4 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to" determining "or" in response to detecting ". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
In the augmented reality technology described in the background art, a rigid body, an inertial device, or the like can be mounted on an AR prop to indirectly track the position and posture of the prop in space. The rigid body data are calculated by a large-space optical positioning system. This system sets up a three-dimensional virtual space coordinate system whose origin position and three-axis orientation are set manually. Within the capture range of the optical system, a number of reflective marker points form a rigid body, whose central point and orientation can likewise be set manually; generally, a single rigid body is formed by 4 to 10 reflective marker points in a fixed three-dimensional spatial relationship. The large-space optical positioning system supports VRPN communication, so the augmented reality system can receive, through the VRPN network communication protocol, the positions and postures of all rigid bodies calculated in the large-space optical positioning system.
To adopt the augmented reality technology, the position and posture of the real camera in the virtual three-dimensional space must be calculated; these pose data are called the external parameters of the camera, and the calibration method of the present application aims to calculate them quickly.
The camera external parameter calibration method provided in the embodiments of the present application may be applied to mobile terminal devices provided with a rigid body, such as a mobile phone, a tablet computer, a wearable device, a vehicle-mounted device, an Augmented Reality (AR)/Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, or a Personal Digital Assistant (PDA); the embodiments of the present application do not limit the specific type of the terminal device.
By way of example and not limitation, when the terminal device is an augmented reality device, the augmented reality device is not only a hardware device but also realizes powerful functions through software support, data interaction, and cloud interaction. Augmented reality devices include full-featured, larger devices that can realize all or part of their functions without relying on a smartphone, among others. Optionally, wearable technology can be applied to intelligently design the augmented reality device and develop a device with the function of acquiring electroencephalogram signals; that is, the augmented reality device and an electroencephalogram signal acquisition device can be combined into a portable device that can acquire electroencephalogram signals and has the augmented reality function.
Fig. 1 shows a schematic flow chart of the camera external parameter calibration method provided in the present application, which, by way of example and not limitation, may be applied to the augmented reality device described above. As shown in fig. 1, the method comprises steps S101-S105, each of which is explained in detail below.
S101, fixing the position of the rigid body, taking one of the reflective mark points on the rigid body as the central point of the rigid body, and acquiring three-dimensional coordinate data of the central point of the rigid body;
in the above step S101, a plurality of reflective marker points are bound to the rigid body, and one of the reflective marker points is used as a central point of the rigid body, and the three-dimensional coordinate data of the central point of the rigid body can be transmitted through the VRPN network communication protocol, so that the three-dimensional coordinate data of the central point of the rigid body can be continuously received and acquired.
S102, acquiring a rigid body central point image shot by a current camera in real time, and acquiring pixel coordinate data of the rigid body central point;
in S102, a camera may be connected to the augmented reality system, and then the current rigid body center point image captured by the camera is collected in real time, and the captured rigid body center point image may be clicked by a mouse in a terminal device such as a computer, so as to obtain the pixel coordinates (u, v) of the rigid body center point at the click position.
S103, moving the rigid body to a proper position after each rigid body central point image is collected, and acquiring the three-dimensional coordinates and pixel coordinates of the rigid body central point at the current position;
in one embodiment, after steps S101 and S102 are performed, the rigid body may be moved by more than 20cm from the previous position while ensuring that the rigid body is within the visible range of the camera, the image of the rigid body center point of the current position continues to be acquired, and the three-dimensional coordinate R of the rigid body center point of the current position may be obtained by clicking the image of the rigid body center pointi=(xi,yi,zi) And pixel coordinate Qi=(Ui,Vi)。
S104, collecting three-dimensional space coordinates of multiple groups of rigid body central points at different positions, simultaneously obtaining the corresponding two-dimensional code corner point coordinates, and calculating the rotation transformation relation R_cw and the translation transformation relation t_cw from the world coordinate system to the camera body coordinate system;
In the above S104, following steps S101 to S103, three-dimensional space coordinates of rigid body central points at 12 or more different positions may be collected, or three-dimensional space coordinate data of rigid body central points at 9 to 20 different positions may be collected, for solving the external parameters of the camera. It should be noted that too few central point coordinates reduce the accuracy of the solved external parameters, while too many introduce redundant computation, which is not conducive to obtaining the external parameters quickly.
For ease of understanding, the following briefly introduces several coordinate systems that are used in the prior art camera calibration process:
the world coordinate system, since the camera can be placed at any position, selects a reference coordinate in the environment to describe the position of the camera and uses it to describe the position of any object in the environment, which is called the world coordinate system.
Camera coordinate system: the coordinate system established on the camera, defined for describing the object position from the camera's perspective, is the middle ring that communicates the world coordinate system and the image/pixel coordinate system.
Image coordinate system: each digital image is an M × N array in a computer, and the numerical value of each element (called pixel) in the image of M rows and N columns is the gray value of an image point. A rectangular coordinate system u, v is defined in the image, and the coordinates (u, v) of each pixel are the column number and the row number of the pixel in the array, so (u, v) is the coordinates of the image coordinate system in pixel units.
Pixel coordinate system: the coordinate system is introduced for describing the coordinates of the image point (photo) on the digital image after the object is imaged, and is the coordinate system where the information is really read from the camera.
In this embodiment, the optical virtual space coordinate system is the world coordinate system, the image coordinate system is a transition coordinate system between the camera coordinate system and the pixel coordinate system, and both the optical virtual space coordinate system and the image coordinate system are set manually.
The collected multiple sets of coordinate data R_i = (x_i, y_i, z_i) and Q_i = (u_i, v_i), together with the corresponding two-dimensional code corner point coordinates and the known camera internal parameters, are substituted into a preset camera attitude estimation algorithm, yielding the rotation transformation relation R_cw from the world coordinate system to the camera body coordinate system and the translation transformation relation t_cw from the world coordinate system to the camera body coordinate system. The preset camera attitude estimation algorithm may be the EPnP (Efficient Perspective-n-Point) algorithm.
S105, calculating the external parameters of the camera according to the R_cw and the t_cw, namely calculating the position t_c and the attitude r_c of the camera in the optical virtual space coordinate system according to the formulas
t_c = -(R_cw)^T · t_cw,  r_c = (R_cw)^T,
where (R_cw)^T is the transpose (and, for a rotation matrix, the inverse) of R_cw. The pose data t_c and r_c are the external parameters of the camera.
Fig. 2 shows a block diagram of a camera external parameter calibration apparatus 200 provided in the embodiment of the present application, corresponding to the camera external parameter calibration method described in the foregoing embodiment; for convenience of description, only the portions related to the embodiment of the present application are shown.
Referring to fig. 2, the apparatus includes:
an obtaining module 201, configured to fix a position of a rigid body, use one of the reflective mark points on the rigid body as a central point of the rigid body, and obtain three-dimensional coordinate data of the rigid body central point;
the acquisition module 202 is configured to acquire a rigid body center point image shot by a current camera in real time and acquire pixel coordinate data of the rigid body center point; moving the rigid body to a proper position after each rigid body central point image is collected, and acquiring the three-dimensional coordinates and pixel coordinates of the rigid body central point at the current position;
a calculating module 203, configured to collect three-dimensional space coordinates of multiple groups of rigid body central points at different positions, obtain the corresponding two-dimensional code corner point coordinates, calculate the rotation transformation relation R_cw and the translation transformation relation t_cw from the world coordinate system to the camera body coordinate system, and calculate the external parameters of the camera according to the R_cw and the t_cw.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Referring to fig. 3, on the basis of the method embodiment, the present application further provides an augmented reality system 300, where the augmented reality system 300 includes a rigid body 301, a camera 302 and a processor 303, and the augmented reality system 300 is connected in communication with the large-space optical positioning system to receive position and posture information of the rigid body 301;
the camera 302 is used for shooting the rigid body 301 and is in communication connection with the processor 303;
the processor 303 is configured to perform external reference calibration on the camera 302 according to the steps in the above-described embodiment of the camera external reference calibration method.
It is understood that in the augmented reality system 300, the rigid body 301 may be moved to a suitable position, then the position and posture information of the rigid body 301 is calculated by the large space optical positioning system and sent to the processor 303 in the augmented reality system 300, after the rigid body 301 is photographed by the camera 302, the image information is also sent to the processor 303, and after the processor 303 receives the information, the camera 302 is externally calibrated according to the steps in the above-mentioned camera external parameter calibration method embodiment.
Fig. 4 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 4, the terminal device 4 of this embodiment includes: at least one processor 40 (only one processor is shown in fig. 4), a memory 41, and a computer program 42 stored in the memory 41 and executable on the at least one processor 40, wherein the processor 40 executes the computer program 42 to implement the steps in the above-mentioned embodiment of the camera external reference calibration method.
The terminal device 4 may be a computing device such as a desktop computer, a notebook, a palm computer, or a cloud server, and in particular may be the augmented reality device described above. The terminal device may include, but is not limited to, a processor 40 and a memory 41. Those skilled in the art will appreciate that fig. 4 is merely an example of the terminal device 4 and does not constitute a limitation of the terminal device 4, which may include more or fewer components than shown, or combine certain components, or have different components, such as an input-output device, a network access device, and the like.
The processor 40 may be a Central Processing Unit (CPU), or another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 41 may in some embodiments be an internal storage unit of the terminal device 4, such as a hard disk or a memory of the terminal device 4. The memory 41 may be an external storage device of the terminal device 4 in other embodiments, such as a plug-in hard disk provided on the terminal device 4, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. Further, the memory 41 may also include both an internal storage unit of the terminal device 4 and an external storage device. The memory 41 is used for storing an operating system, application programs, a BootLoader (BootLoader), data, and other programs, such as program codes of the computer programs, and the like. The memory 41 may also be used to temporarily store data that has been output or is to be output.
An embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored; when the computer program is executed by a processor, the steps in the above-mentioned camera external parameter calibration method embodiment may be implemented.
The embodiment of the present application provides a computer program product, which, when running on a mobile terminal, enables the mobile terminal to execute the steps in the above-mentioned camera external parameter calibration method embodiment.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or device capable of carrying computer program code to a photographing apparatus/terminal apparatus, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash disk, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, computer-readable media may not be electrical carrier signals or telecommunications signals.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above-described apparatus/network device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A camera external parameter calibration method, characterized by comprising the following steps:
fixing the position of the rigid body, taking one of the reflective mark points on the rigid body as the central point of the rigid body, and acquiring three-dimensional coordinate data of the central point of the rigid body;
acquiring a rigid body central point image shot by a current camera in real time, and acquiring pixel coordinate data of the rigid body central point;
moving the rigid body to a proper position after each rigid body central point image is collected, and acquiring the three-dimensional coordinates and pixel coordinates of the rigid body central point at the current position;
collecting three-dimensional space coordinates of multiple groups of rigid body central points at different positions, simultaneously obtaining the corresponding two-dimensional code corner point coordinates, and calculating the rotation transformation relation R_cw and the translation transformation relation t_cw from the world coordinate system to the camera body coordinate system;
and calculating the external parameters of the camera according to the R_cw and the t_cw.
2. The camera external parameter calibration method according to claim 1, wherein the three-dimensional coordinate data of the rigid body central point can be transmitted through a VRPN network communication protocol.
3. The camera external parameter calibration method according to claim 1, wherein the moving the rigid body to a proper position comprises:
moving the rigid body by more than 20cm while ensuring that the rigid body is within the visible range of the camera.
4. The camera external parameter calibration method according to claim 1, wherein the collecting three-dimensional space coordinates of multiple groups of rigid body central points at different positions comprises:
and acquiring three-dimensional space coordinates of the center points of the rigid bodies at more than 12 groups of different positions, or acquiring three-dimensional space coordinate data of the center points of the rigid bodies at 9 to 20 groups of different positions.
5. The camera external parameter calibration method according to claim 1, wherein the rotation transformation relation R_cw from the world coordinate system to the camera body coordinate system and the translation transformation relation t_cw from the world coordinate system to the camera body coordinate system are calculated according to the EPnP algorithm.
6. The camera external parameter calibration method according to any one of claims 1 to 5, wherein the calculating the external parameters of the camera according to the R_cw and the t_cw comprises:
calculating the position t_c and the attitude r_c of the camera in the optical virtual space coordinate system according to the formulas
t_c = -(R_cw)^T · t_cw,  r_c = (R_cw)^T;
the t_c and r_c are the external parameters of the camera.
7. A camera external parameter calibration device, characterized by comprising:
the obtaining module is used for fixing the position of the rigid body, taking one of the reflective mark points on the rigid body as the central point of the rigid body, and acquiring three-dimensional coordinate data of the rigid body central point;
the acquisition module is used for acquiring a rigid body central point image shot by a current camera in real time and acquiring pixel coordinate data of the rigid body central point; moving the rigid body to a proper position after each rigid body central point image is collected, and acquiring the three-dimensional coordinates and pixel coordinates of the rigid body central point at the current position;
the calculation module is used for acquiring three-dimensional space coordinates of a plurality of groups of rigid body center points at different positions, acquiring corresponding two-dimensional code corner point coordinates simultaneously, and calculating the rotation transformation relation from a world coordinate system to a camera body coordinate system
Figure FDA0002962540340000024
And translation transformation relation from the world coordinate system to the camera body coordinate system
Figure FDA0002962540340000025
According to the above
Figure FDA0002962540340000026
And said
Figure FDA0002962540340000027
And calculating to obtain the external parameters of the camera.
8. An augmented reality system comprising a rigid body, a camera and a processor, the augmented reality system communicatively coupled with a large space optical positioning system to receive position and attitude information of the rigid body;
the camera is used for shooting the rigid body and is in communication connection with the processor;
the processor is used for externally calibrating the camera according to the camera external reference calibration method of any one of claims 1-6.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 6 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 6.
CN202110237550.5A 2021-03-04 2021-03-04 Camera external parameter calibration method and device, augmented reality system, terminal device and storage medium Pending CN113052915A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110237550.5A CN113052915A (en) 2021-03-04 2021-03-04 Camera external parameter calibration method and device, augmented reality system, terminal device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110237550.5A CN113052915A (en) 2021-03-04 2021-03-04 Camera external parameter calibration method and device, augmented reality system, terminal device and storage medium

Publications (1)

Publication Number Publication Date
CN113052915A true CN113052915A (en) 2021-06-29

Family

ID=76509739

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110237550.5A Pending CN113052915A (en) 2021-03-04 2021-03-04 Camera external parameter calibration method and device, augmented reality system, terminal device and storage medium

Country Status (1)

Country Link
CN (1) CN113052915A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113781575A (en) * 2021-08-09 2021-12-10 上海奥视达智能科技有限公司 Camera parameter calibration method, device, terminal and storage medium
CN113781575B (en) * 2021-08-09 2024-01-12 上海奥视达智能科技有限公司 Calibration method and device for camera parameters, terminal and storage medium
CN113822943A (en) * 2021-09-17 2021-12-21 中汽创智科技有限公司 External parameter calibration method, device and system of camera and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination