CN111429531A - Calibration method, calibration device and non-volatile computer-readable storage medium - Google Patents


Info

Publication number
CN111429531A
Authority
CN
China
Prior art keywords
calibration
image
calibration image
camera
imaging system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010330832.5A
Other languages
Chinese (zh)
Inventor
陈彪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010330832.5A priority Critical patent/CN111429531A/en
Publication of CN111429531A publication Critical patent/CN111429531A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/80: Geometric correction

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Studio Devices (AREA)

Abstract

The application provides a calibration method. The calibration method comprises: when no optical element is arranged on the light path, shooting a calibration pattern through the imaging system to obtain a first calibration image; when the optical element is arranged on the light path, shooting the calibration pattern through the imaging system to obtain a second calibration image; correcting the second calibration image according to a first mapping matrix determined from the first calibration image and the second calibration image to obtain a third calibration image; and calibrating the internal parameters and external parameters of the imaging system according to the third calibration image. The application also discloses a calibration device and a non-volatile computer-readable storage medium. Because the second calibration image is corrected according to the first mapping matrix determined from the first and second calibration images, the corrected third calibration image is free of the distortion introduced by the optical element, so the calibrated imaging system achieves higher imaging quality.

Description

Calibration method, calibration device and non-volatile computer-readable storage medium
Technical Field
The present disclosure relates to the field of machine vision calibration technologies, and in particular, to a calibration method, a calibration apparatus, and a non-volatile computer-readable storage medium.
Background
An ideal camera lens introduces no distortion; in practice, however, distortion is introduced by limits on lens manufacturing accuracy, deviations in the assembly process, and similar problems. Distortion reduces coordinate accuracy on the camera's imaging plane and deforms the original image, so that photographed objects appear visibly warped and image quality is relatively poor.
Disclosure of Invention
Embodiments of the present application provide a calibration method, a calibration apparatus, and a non-volatile computer-readable storage medium.
The calibration method of the embodiment of the application comprises the steps that when an optical element is not arranged on an optical path, an image is shot through an imaging system to obtain a first calibration image; when the optical element is arranged on the optical path, shooting an image through the imaging system to obtain a second calibration image, wherein the optical element can transmit light rays, and shooting parameters of the imaging system when shooting the first calibration image and the corresponding second calibration image are the same; determining a first mapping matrix according to the first calibration image and the second calibration image, and correcting the second calibration image according to the first mapping matrix to obtain a third calibration image; and calibrating the internal parameters and the external parameters of the imaging system according to the third calibration image.
The calibration device of the embodiment of the application comprises an optical element, an imaging system and a processor; the processor is used for shooting an image through an imaging system to obtain a first calibration image when an optical element is not arranged on an optical path, shooting an image through the imaging system to obtain a second calibration image when the optical element is arranged on the optical path, wherein the optical element can transmit light, shooting parameters of the imaging system when the imaging system shoots the first calibration image and the corresponding second calibration image are the same, determining a first mapping matrix according to the first calibration image and the second calibration image, correcting the second calibration image according to the first mapping matrix to obtain a third calibration image, and calibrating internal parameters and external parameters of the imaging system according to the third calibration image.
A non-transitory computer-readable storage medium containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the calibration method. The calibration method comprises the steps that when no optical element is arranged on an optical path, an image is shot through an imaging system to obtain a first calibration image; when the optical element is arranged on the optical path, shooting an image through the imaging system to obtain a second calibration image, wherein the optical element can transmit light rays, and shooting parameters of the imaging system when shooting the first calibration image and the corresponding second calibration image are the same; determining a first mapping matrix according to the first calibration image and the second calibration image, and correcting the second calibration image according to the first mapping matrix to obtain a third calibration image; and calibrating the internal parameters and the external parameters of the imaging system according to the third calibration image.
According to the calibration method, the calibration device, and the non-volatile computer-readable storage medium of the present application, the imaging system captures images both without the optical element and with the optical element arranged on the light path, obtaining the first calibration image and the second calibration image respectively. The second calibration image is then corrected according to the first mapping matrix determined from the first and second calibration images, which eliminates the distortion introduced into the second calibration image by the optical element. The corrected third calibration image therefore has high image coordinate accuracy, which facilitates image registration and three-dimensional coordinate reconstruction, improves the calibration effect, improves the imaging quality of the calibrated imaging system, and ensures the user's viewing experience.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic flow chart of a calibration method according to some embodiments of the present application;
FIG. 2 is a schematic plan view of a calibration apparatus according to certain embodiments of the present application;
FIG. 3 is a schematic plan view from another perspective of a calibration device according to certain embodiments of the present application;
FIG. 4 is a schematic flow chart diagram of a calibration method according to some embodiments of the present application;
FIGS. 5 and 6 are schematic illustrations of a calibration method according to certain embodiments of the present application; and
FIG. 7 is a schematic diagram of a connection between a processor and a computer-readable storage medium according to some embodiments of the present application.
Detailed Description
Embodiments of the present application will be further described below with reference to the accompanying drawings. The same or similar reference numbers in the drawings identify the same or similar elements or elements having the same or similar functionality throughout. In addition, the embodiments of the present application described below in conjunction with the accompanying drawings are exemplary and are only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the present application.
Referring to fig. 1 to 3, a calibration method according to an embodiment of the present disclosure includes the following steps:
011: when the optical element 20 is not arranged on the optical path, shooting an image through the imaging system 30 to obtain a first calibration image;
012: when the optical element 20 is arranged on the optical path, an image is captured by the imaging system 30 to obtain a second calibration image, wherein the optical element 20 can transmit light, and the capturing parameters of the imaging system 30 when capturing the first calibration image and the corresponding second calibration image are the same;
013: determining a first mapping matrix according to the first calibration image and the second calibration image, and correcting the second calibration image according to the first mapping matrix to obtain a third calibration image; and
014: and calibrating the internal parameters and the external parameters of the imaging system according to the third calibration image.
In certain embodiments, calibration apparatus 100 includes an optical element 20, an imaging system 30, and a processor 40. The processor 40 is configured to capture an image through the imaging system 30 to obtain a first calibration image when the optical element 20 is not disposed on the optical path, capture an image through the imaging system 30 to obtain a second calibration image when the optical element 20 is disposed on the optical path, determine a first mapping matrix according to the first calibration image and the second calibration image, correct the second calibration image according to the first mapping matrix to obtain a third calibration image, and calibrate internal parameters and external parameters of the imaging system according to the third calibration image. That is, steps 011 and 012 can be implemented by imaging system 30 in conjunction with processor 40, and steps 013 and 014 can be implemented by processor 40.
Specifically, cameras are widely used and play an important role in photography, machine vision, Augmented Reality (AR), and other fields. A camera exhibits distortion because of limits on lens manufacturing precision, deviations in the assembly process, and similar problems; visually, distortion appears as deformation of the photographed object. A camera is therefore generally calibrated before leaving the factory, and the calibration process includes distortion correction. Of course, camera calibration not only corrects distortion but also establishes a one-to-one correspondence between three-dimensional coordinates in the real world and image coordinates in the captured image. The internal parameters and external parameters determined by calibration can be used to build an imaging model that relates an object point in the world coordinate system to its image point in the image coordinate system. The camera model is a geometric abstraction of a real camera, and the imaging process is a perspective projection of spatial points. The camera parameters (internal and external) determine the specific mapping of this projection; as long as the parameters and the coordinates of a spatial point are known, the corresponding image point coordinates can be calculated. Correcting distortion so that image points correspond one-to-one with three-dimensional coordinates in the world coordinate system is therefore an important step in camera calibration.
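As a concrete illustration of the imaging model described above, the following is a minimal numpy sketch of pinhole projection. The intrinsic matrix K and the extrinsics R, T here are illustrative values chosen for the example, not parameters from the patent:

```python
import numpy as np

# Illustrative intrinsic matrix K (focal lengths fx, fy in pixels; principal point u0, v0).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Illustrative extrinsics: identity rotation, camera displaced 1 m along the optical axis.
R = np.eye(3)
T = np.array([0.0, 0.0, 1.0])

def project(point_world):
    """Map a 3D world point to pixel coordinates via the pinhole model."""
    p_cam = R @ point_world + T          # world coordinates -> camera coordinates
    uvw = K @ p_cam                      # camera coordinates -> homogeneous pixel coordinates
    return uvw[:2] / uvw[2]              # perspective division

# A point on the optical axis projects to the principal point.
print(project(np.array([0.0, 0.0, 1.0])))  # -> [320. 240.]
```

Calibration is the inverse problem: recovering K, R, and T from known world points and their observed image points.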
Existing calibration models do not consider that other optical elements 20 may be arranged on the light path of the camera. For example, in the AR field, AR glasses are generally provided with a tracking camera used for three-dimensional reconstruction of the photographed scene, so that virtual scenes, characters, and the like can be accurately superimposed on the real world, combining the virtual and the real. To ensure the accuracy of this combination and the user's viewing experience, accurate calibration of the tracking camera is essential. In conventional AR glasses, a transparent optical element 20 (such as a lens) is generally provided as a protective cover for the tracking camera. It can be understood that, because different manufacturers have different ID design requirements, the form of the protective cover varies widely; the optical element 20 can be a protective cover of any form, as long as the cover is light-transmissive. Adding the protective cover introduces asymmetric distortion, and if this distortion is not corrected, calibration accuracy is seriously affected. The calibration method of the present application is described below taking the tracking camera as an example of the imaging system 30.
In the calibration device 100 of the present application, when the optical element 20 (such as the protective cover) is not arranged on the light path, the tracking camera captures an image to obtain a first calibration image, which is unaffected by the protective cover. When the optical element 20 is arranged on the light path, the tracking camera captures an image to obtain a second calibration image. The protective cover may cover the entire field of view of the tracking camera, so that all light entering the tracking camera passes through it, or it may cover only part of the field of view, so that only part of the light entering the tracking camera passes through it; the actual position of the protective cover depends on the design of the particular AR glasses. The protective cover may also be only partially transmissive, e.g., with a transmittance of 70%, 80%, 90%, 95%, etc.
The captured image may be obtained by photographing a predetermined calibration pattern on the calibration plate 10 of the calibration device 100, or by directly photographing a scene. To facilitate obtaining feature points and to reduce the computation needed to identify them, the first calibration image and the second calibration image in the embodiments of the present application are obtained by photographing the predetermined calibration pattern on the calibration plate 10. The calibration pattern is not limited: for example, it can be a checkerboard image, with feature points extracted by identifying the corner points of the checkerboard; or a pattern of white mark points on a black background, with feature points determined by identifying the mark points; or a pattern containing a plurality of circles, with feature points identified from the circle centers.
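For the checkerboard option above, each inner corner is typically assigned a world coordinate on the board plane (Z = 0) before calibration. A minimal sketch, with an assumed 7 x 5 corner grid and 20 mm squares (both values illustrative):

```python
import numpy as np

def chessboard_object_points(cols, rows, square_size_mm):
    """World coordinates of the inner chessboard corners, with Z = 0 on the board plane."""
    pts = np.zeros((rows * cols, 3))
    # Grid of (x, y) corner indices, row-major, matching common calibration tutorials.
    grid = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2)
    pts[:, :2] = grid * square_size_mm
    return pts

corners = chessboard_object_points(7, 5, 20.0)
print(corners.shape)  # (35, 3)
print(corners[8])     # corner in the second row, second column -> [20. 20.  0.]
```

These world coordinates are later paired with the detected image coordinates of the same corners to solve for the camera parameters.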
It should be noted that, to ensure that the difference between the first calibration image and the second calibration image is caused essentially only by whether the protective cover is present, the number of other variables between the two captures should be reduced as much as possible. Ideally the only variable is the presence of the protective cover; any additional variables should be limited to shooting parameters that do not substantially affect the calibration result (such as shooting angle, shooting distance, or exposure parameters) and environmental conditions (such as light-source type (natural light, artificial light, etc.), illumination, weather, and shooting time) that remain basically unchanged, so that the difference between the first calibration image and the second calibration image is essentially caused by the influence of the protective cover. In addition, the parts of the first and second calibration images corresponding to the calibration pattern should be located as close as possible to the central area of the tracking camera's field of view, to reduce the influence of distortion at the edge of the field of view.
The processor 40 then determines a first mapping matrix from the captured first calibration image and the captured second calibration image, the first mapping matrix representing the mapping between the image coordinates of the first calibration image and those of the second calibration image.
To reduce errors and ensure the accuracy and completeness of the first mapping matrix, a plurality of first calibration images and a plurality of second calibration images can be captured, and the first mapping matrix determined from the plurality of first calibration images and the corresponding plurality of second calibration images. The shooting parameters of the plurality of first calibration images are not all the same: for example, the shooting angle may be the same while the shooting distance and exposure parameters differ; or the shooting distance may be the same while the shooting angle and exposure parameters differ; or the shooting angle, shooting distance, and exposure parameters may all differ. The same applies to the plurality of second calibration images. The shooting parameters of each first calibration image are the same as those of its corresponding second calibration image; each first calibration image and the corresponding second calibration image with the same shooting parameters form a group, and the first mapping matrix can be determined accurately and completely from the multiple groups of first and second calibration images.
The processor 40 is connected to the tracking camera. After the processor 40 acquires each group consisting of a first calibration image and the corresponding second calibration image, it can determine the mapping relationship between the image coordinates of the first calibration image and those of the second calibration image by identifying the image coordinates of the same feature points of the calibration pattern in both images, thereby determining the first mapping matrix.
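One simplified way such a mapping matrix could be estimated from matched feature coordinates is the direct linear transform (DLT) for a 3 x 3 homography. This is an illustrative stand-in under the assumption that the mapping is well modeled by a homography, not the patent's exact procedure; a production pipeline would typically use a robust estimator (e.g. RANSAC-based homography fitting):

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate a 3x3 matrix H with dst ~ H @ src in homogeneous coordinates (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on the 9 entries of H.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(A))
    H = vt[-1].reshape(3, 3)   # null-space vector = homography up to scale
    return H / H[2, 2]

# Synthetic check: a known pure-translation "distortion" is recovered exactly.
src = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [2, 3]], float)
dst = src + [5.0, -2.0]
H = fit_homography(src, dst)
```

Here `H` recovers the translation: its last column is approximately (5, -2, 1).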
In addition, the protective cover may affect the propagation direction of light, so the field of view of the tracking camera without the protective cover on the light path may differ from its field of view with the protective cover. The processor 40 may therefore first identify the overlapping portion of the first calibration image and the second calibration image (i.e., the portion corresponding to the overlap of the field of view without the protective cover and the field of view with the protective cover). The feature points of the calibration pattern in this overlapping portion have corresponding image coordinates in both the first calibration image and the second calibration image, which yields the mapping relationship between the image coordinates of the first calibration image and those of the second calibration image. The first mapping matrix can thus be accurately determined from the overlapping portions of the first and second calibration images.
The first mapping matrix may comprise a mapping matrix from the first calibration image to the second calibration image, and/or a mapping matrix from the second calibration image to the first calibration image, and in order to achieve calibration of the tracking camera of the AR glasses with the protective cover, the first mapping matrix of the present application comprises a mapping matrix from the second calibration image to the first calibration image.
Then, the processor 40 corrects the second calibration image according to the first mapping matrix to obtain a third calibration image. The correction process is as follows: the processor 40 maps all image coordinates of the second calibration image through the first mapping matrix to obtain the third calibration image. The first mapping matrix eliminates the distortion introduced by the protective cover, so the second calibration image is mapped into a third calibration image free of the cover's influence.
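The coordinate-mapping step can be sketched as follows; the mapping matrix is assumed here to be a 3 x 3 homography, and the matrices used are illustrative:

```python
import numpy as np

def apply_mapping(H, points):
    """Map 2D image coordinates through matrix H, with homogeneous normalization."""
    pts_h = np.hstack([points, np.ones((len(points), 1))])  # to homogeneous coordinates
    mapped = pts_h @ H.T                                    # apply H to every point
    return mapped[:, :2] / mapped[:, 2:3]                   # perspective divide

pts = np.array([[10.0, 20.0], [100.0, 50.0]])

# An identity mapping leaves coordinates unchanged (sanity check).
print(apply_mapping(np.eye(3), pts))

# A translation-only "correction" shifts every coordinate by the same offset.
H_shift = np.array([[1.0, 0.0, 5.0], [0.0, 1.0, -2.0], [0.0, 0.0, 1.0]])
print(apply_mapping(H_shift, pts))
```

In an image pipeline this per-coordinate mapping would drive a resampling step (remapping pixel values), which is omitted here.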
Finally, based on the third calibration image, the processor 40 calibrates the internal and external parameters of the tracking camera. The internal parameters of the tracking camera may include a focal length, distortion parameters, etc., and the external parameters may include a rotation matrix and a translation matrix. Because the third calibration image eliminates the distortion introduced by the protective cover, the image coordinate accuracy of the third calibration image is higher, and the calibration effect can be improved.
According to the calibration method and the calibration device 100, the imaging system 30 captures images both without and with the optical element 20 to obtain the first calibration image and the second calibration image respectively, so that the second calibration image can be corrected according to the first mapping matrix determined from the first and second calibration images, removing the distortion introduced by the optical element 20. The corrected third calibration image therefore has high image coordinate accuracy, which facilitates image registration and three-dimensional coordinate reconstruction, improves the calibration effect, improves the imaging quality of the calibrated imaging system 30, and ensures the user's viewing experience.
In some embodiments, the imaging system 30 is a binocular imaging system including a first camera 31 and a second camera 32, the first camera 31 and the second camera 32 taking images simultaneously.
The binocular imaging system has a first camera 31 and a second camera 32. Compared with an imaging system 30 having only one camera, the first camera 31 and the second camera 32 must be calibrated separately; the calibration process has been discussed above and is not repeated here. After calibrating the internal parameters and external parameters of the first camera 31 and the second camera 32, the relative position relationship between them must be determined to complete the calibration of the entire binocular imaging system. During calibration, the first camera 31 and the second camera 32 capture the calibration pattern simultaneously, so that the difference between the overlapping portions of the second calibration image of the first camera 31 and the second calibration image of the second camera 32 is essentially determined by the relative position of the two cameras. The relative position relationship between the first camera 31 and the second camera 32 can thus be accurately determined from the difference between the two second calibration images captured simultaneously.
Referring to fig. 2 to 4, in some embodiments, the calibration method further includes:
015: the rotation matrix and the translation matrix between the first camera 31 and the second camera 32 are determined according to the external parameters of the first camera 31 and the second camera 32.
In some embodiments, the processor 40 is further configured to determine a rotation matrix and a translation matrix between the first camera 31 and the second camera 32 according to the extrinsic parameters of the first camera 31 and the second camera 32. That is, step 015 may be implemented by processor 40.
Specifically, camera calibration involves four coordinate systems. Each digital image captured by the camera is stored in the computer as an M x N array, and the value of each element (called a pixel) of the array is the gray level (brightness) of the image at that point. As shown in fig. 5, a rectangular coordinate system u, v is defined on the image with its origin at the top-left corner of the image; the coordinates (u, v) of any pixel are the column number and row number of that pixel in the array.
If the coordinates of O1 in the uv coordinate system are (u0, v0), and the physical dimensions of each pixel along the X-axis and Y-axis are dX and dY, then the pixel coordinates (u, v) of any pixel and its image physical coordinates (X, Y) are related by: u = X/dX + u0; v = Y/dY + v0. For convenience, this can be expressed in homogeneous-coordinate form as:

    [u]   [1/dX    0    u0] [X]
    [v] = [  0   1/dY   v0] [Y]
    [1]   [  0     0     1] [1]
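A small worked example of the pixel-coordinate relation between image physical coordinates and pixel coordinates; the sensor values below (3 um square pixels, principal point at (320, 240), coordinates in micrometers) are assumed for illustration:

```python
def physical_to_pixel(X, Y, dX, dY, u0, v0):
    """u = X/dX + u0, v = Y/dY + v0: image physical coordinates -> pixel coordinates."""
    return X / dX + u0, Y / dY + v0

# A point 300 um right of the optical axis, with 3 um pixels, lands
# 100 pixels right of the principal point.
u, v = physical_to_pixel(300.0, 0.0, 3.0, 3.0, 320.0, 240.0)
print(u, v)  # 420.0 240.0
```

This is the same linear relation as the homogeneous matrix form above, written out per coordinate.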
Camera coordinate system: the origin of the camera coordinate system is at the optical center of the camera; the X-axis and Y-axis are parallel to the X-axis and Y-axis of the image physical coordinate system, and the Z-axis is the optical axis of the camera, perpendicular to the image plane. The intersection of the optical axis with the image plane is the origin of the image physical coordinate system. The camera imaging geometry is shown in fig. 6, in the form of a pinhole imaging model; in the figure, Oxy is the camera coordinate system and OO1 is the focal length of the camera.
World coordinate system: since the camera can be installed at any position in the environment, a reference coordinate system should be selected in the environment to describe the position of the camera and of any other object in the environment; this coordinate system is called the world coordinate system. It consists of the Xw, Yw, and Zw axes, as shown in fig. 7. The relationship between the camera coordinate system and the world coordinate system can be described by a rotation matrix R and a translation matrix T. If the homogeneous coordinates of a point P in space are (Xw, Yw, Zw, 1)^T in the world coordinate system and (x, y, z, 1)^T in the camera coordinate system, then:

    [x]   [ R   T ] [Xw]
    [y] = [       ] [Yw]
    [z]   [ 0   1 ] [Zw]
    [1]             [ 1]

where R is a 3 x 3 orthogonal rotation matrix and T is a 3 x 1 translation matrix.
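A minimal numpy illustration of this rigid transformation between world and camera coordinates; the rotation (90 degrees about the Z-axis) and translation used here are illustrative:

```python
import numpy as np

# Illustrative extrinsics: 90-degree rotation about the Z-axis, plus a translation.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
T = np.array([1.0, 2.0, 3.0])

def world_to_camera(p_world):
    """p_camera = R @ p_world + T, the non-homogeneous form of the relation above."""
    return R @ p_world + T

# R is orthogonal (R @ R.T = I), as a rotation matrix must be.
p = world_to_camera(np.array([1.0, 0.0, 0.0]))
print(p)  # [1. 3. 3.]
```

The homogeneous 4 x 4 form above packs exactly this R and T into a single matrix so that the transform becomes one matrix multiplication.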
The calibration process of the camera acquires the internal parameters (focal length OO1, distortion parameters, etc.) and external parameters (rotation matrix R and translation matrix T) of the camera. Specifically, a world coordinate system is established from the calibration board 10. The calibration pattern on the calibration board 10 may be a checkerboard image with a plurality of corner points; after the world coordinate system is established, the coordinates of each corner point can be determined, and then the image coordinates of each corner point in the third calibration image are determined. From the one-to-one correspondence between the coordinates of each corner point in the real world (i.e., the world coordinate system) and its image coordinates in the image, the internal parameters and external parameters of the camera can be calculated according to the coordinate-transformation formula.
After the calibration of the first camera 31 and the second camera 32 is completed, their internal parameters and external parameters are known. The relative position relationship between the first camera 31 and the second camera 32 must then be obtained so that the images captured by the two cameras can be rectified (that is, the imaging origins of the left and right images coincide, the optical axes of the first camera 31 and the second camera 32 are parallel, the left and right imaging planes are coplanar, and the epipolar lines are aligned). Because the external parameters of the first camera 31 and the second camera 32 were each computed during monocular calibration, the processor 40 can calculate the relative position relationship (that is, the rotation matrix and translation matrix) between the first camera 31 and the second camera 32 from those external parameters.
In one example, assume that there is a point P0 in space whose coordinates in the world coordinate system are PW (i.e., the third coordinate of the feature point in the world coordinate system), and whose coordinates in the camera coordinate systems of the first camera 31 and the second camera 32 are Pl and Pr, respectively. Pl and PW satisfy Pl = Rl·PW + Tl; Pr and PW satisfy Pr = Rr·PW + Tr; and Pl and Pr satisfy Pr = R·Pl + T. By combining the above formulas, it can be deduced that R = Rr·Rl^T and T = Tr − R·Tl, where Rl and Tl are the rotation matrix and translation matrix of the first camera 31 relative to the calibration object obtained by monocular calibration, Rr and Tr are the rotation matrix and translation matrix of the second camera 32 relative to the calibration object obtained by monocular calibration, and R and T are the rotation matrix and translation matrix between the first camera 31 and the second camera 32, respectively. Therefore, to calculate the rotation matrix R and the translation matrix T, only the external parameters (Rl and Tl) of the first camera 31 and the external parameters (Rr and Tr) of the second camera 32 need to be determined; from these external parameters, the rotation matrix R and the translation matrix T between the first camera 31 and the second camera 32 can be accurately determined.
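The derivation above can be checked numerically. The extrinsics below are assumed example values (a small rotation about Y and a 6 cm baseline), not values from the patent; the sketch computes R = Rr·Rlᵀ and T = Tr − R·Tl and verifies that Pr = R·Pl + T holds for an arbitrary world point:

```python
import numpy as np

def stereo_extrinsics(Rl, Tl, Rr, Tr):
    """Rotation and translation between two cameras from their
    per-camera (monocular) extrinsics relative to the same board."""
    R = Rr @ Rl.T          # R = Rr * Rl^T
    T = Tr - R @ Tl        # T = Tr - R * Tl
    return R, T

# Hypothetical extrinsics: left camera axis-aligned, right camera
# rotated 5 degrees about Y and shifted 6 cm along X.
a = np.deg2rad(5.0)
Rl, Tl = np.eye(3), np.array([0.0, 0.0, 1.0])
Rr = np.array([[ np.cos(a), 0.0, np.sin(a)],
               [ 0.0,       1.0, 0.0      ],
               [-np.sin(a), 0.0, np.cos(a)]])
Tr = np.array([0.06, 0.0, 1.0])

R, T = stereo_extrinsics(Rl, Tl, Rr, Tr)

# Consistency check: for any world point PW,
# Pr = Rr @ PW + Tr must equal R @ Pl + T with Pl = Rl @ PW + Tl.
PW = np.array([0.3, -0.2, 2.0])
Pl = Rl @ PW + Tl
Pr = Rr @ PW + Tr
print(np.allclose(Pr, R @ Pl + T))   # True
```

The check passes for any choice of PW, since the two formulas are algebraically equivalent by construction.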
Referring to fig. 7, one or more non-transitory computer-readable storage media 300 containing computer-executable instructions 302 are provided according to embodiments of the present application. When the computer-executable instructions 302 are executed by one or more processors 40, the processors 40 are caused to perform the calibration method according to any of the embodiments described above.
For example, referring to fig. 1-3, when the computer-executable instructions 302 are executed by one or more processors 40, the processor 40 is caused to perform the steps of:
011: shooting the calibration pattern through the imaging system, with the optical element 20 not disposed on the optical path, to acquire a first calibration image;
012: shooting the calibration pattern through the imaging system with the optical element 20 disposed on the optical path to obtain a second calibration image, wherein the optical element 20 can transmit light, and the shooting parameters of the imaging system when shooting the first calibration image and the corresponding second calibration image are the same;
013: correcting the second calibration image according to a first mapping matrix determined from the first calibration image and the second calibration image to obtain a third calibration image; and
014: calibrating the internal parameters and the external parameters of the imaging system according to the third calibration image.
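Steps 011–013 hinge on estimating the first mapping matrix from point correspondences between the first and second calibration images. A minimal sketch in pure NumPy (the corner coordinates are hypothetical; a production pipeline would detect checkerboard corners automatically, e.g. with OpenCV) estimates a homography by direct linear transform and uses it to map second-image corners back to their first-image positions:

```python
import numpy as np

def find_homography(src, dst):
    """Estimate a 3x3 homography H with dst ~ H @ src (DLT, 4+ points)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1,  0,  0,  0, u * x, u * y, u])
        A.append([ 0,  0,  0, -x, -y, -1, v * x, v * y, v])
    # The null vector of A (last right singular vector) holds H's entries
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_h(H, pts):
    """Apply homography H to an (N, 2) array of points."""
    p = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return p[:, :2] / p[:, 2:3]

# Hypothetical corner positions: first image (no optical element) vs
# second image (shifted/skewed by the optical element on the path)
first  = np.array([[0, 0], [100, 0], [100, 100], [0, 100]], dtype=float)
second = np.array([[5, 3], [108, 2], [112, 104], [3, 107]], dtype=float)

# The first mapping matrix maps second-image corners to first-image ones
H = find_homography(second, first)
corrected = apply_h(H, second)   # corner positions in the third calibration image
print(np.allclose(corrected, first, atol=1e-6))   # True
```

Warping every pixel of the second calibration image by H (rather than only the corner points, as here) would yield the third calibration image described in step 013.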
For another example, referring to fig. 2-4, when the computer-executable instructions 302 are executed by one or more processors 40, the processors 40 may further perform the steps of:
015: the rotation matrix and the translation matrix between the first camera 31 and the second camera 32 are determined according to the external parameters of the first camera 31 and the second camera 32.
In the description herein, reference to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example" or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the various embodiments or examples described in this specification, and features of different embodiments or examples, can be combined by one skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (11)

1. A calibration method, comprising:
shooting an image through an imaging system when an optical element is not arranged on an optical path, to obtain a first calibration image;
when the optical element is arranged on the optical path, shooting an image through the imaging system to obtain a second calibration image, wherein the optical element can transmit light rays, and shooting parameters of the imaging system when shooting the first calibration image and the corresponding second calibration image are the same;
determining a first mapping matrix according to the first calibration image and the second calibration image, and correcting the second calibration image according to the first mapping matrix to obtain a third calibration image; and
calibrating internal parameters and external parameters of the imaging system according to the third calibration image.
2. The calibration method according to claim 1, wherein there are a plurality of first calibration images and a plurality of second calibration images, the shooting parameters of the plurality of first calibration images are not all identical, and the shooting parameters of the plurality of second calibration images are not all identical; the determining a first mapping matrix according to the first calibration image and the second calibration image comprises:
determining the first mapping matrix according to the plurality of first calibration images and the corresponding plurality of second calibration images.
3. A calibration method according to claim 1 or 2, wherein said determining a first mapping matrix from said first calibration image and said second calibration image comprises:
identifying an overlapping portion of the first calibration image and the second calibration image; and
determining the first mapping matrix according to the overlapping portion of the first calibration image and the overlapping portion of the second calibration image.
4. The calibration method according to claim 1, wherein the imaging system is a binocular imaging system, the binocular imaging system comprises a first camera and a second camera, and the first camera and the second camera capture images simultaneously.
5. The calibration method according to claim 4, further comprising:
determining a rotation matrix and a translation matrix between the first camera and the second camera according to the external parameters of the first camera and the second camera.
6. A calibration device, characterized in that the calibration device comprises an optical element, an imaging system and a processor; the processor is used for: shooting an image through the imaging system when the optical element is not arranged on an optical path to obtain a first calibration image; shooting an image through the imaging system when the optical element is arranged on the optical path to obtain a second calibration image, wherein the optical element can transmit light, and shooting parameters of the imaging system when shooting the first calibration image and the corresponding second calibration image are the same; determining a first mapping matrix according to the first calibration image and the second calibration image; correcting the second calibration image according to the first mapping matrix to obtain a third calibration image; and calibrating internal parameters and external parameters of the imaging system according to the third calibration image.
7. The calibration device according to claim 6, wherein there are a plurality of first calibration images and a plurality of second calibration images, the shooting parameters of the plurality of first calibration images are not all identical, and the shooting parameters of the plurality of second calibration images are not all identical; the processor is further configured to determine the first mapping matrix according to the plurality of first calibration images and the corresponding plurality of second calibration images.
8. The calibration device according to claim 6 or 7, wherein the processor is further configured to identify an overlapping portion of the first calibration image and the second calibration image, and to determine the first mapping matrix according to the overlapping portion of the first calibration image and the overlapping portion of the second calibration image.
9. The calibration device according to claim 6, wherein the imaging system is a binocular imaging system, the binocular imaging system includes a first camera and a second camera, and the first camera and the second camera capture images simultaneously.
10. The calibration apparatus according to claim 9, wherein the processor is further configured to determine a rotation matrix and a translation matrix between the first camera and the second camera according to the external parameters of the first camera and the second camera.
11. A non-transitory computer-readable storage medium containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform a calibration method as claimed in any one of claims 1 to 5.
CN202010330832.5A 2020-04-24 2020-04-24 Calibration method, calibration device and non-volatile computer-readable storage medium Pending CN111429531A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010330832.5A CN111429531A (en) 2020-04-24 2020-04-24 Calibration method, calibration device and non-volatile computer-readable storage medium


Publications (1)

Publication Number Publication Date
CN111429531A true CN111429531A (en) 2020-07-17

Family

ID=71556648

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010330832.5A Pending CN111429531A (en) 2020-04-24 2020-04-24 Calibration method, calibration device and non-volatile computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN111429531A (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11355813A (en) * 1998-06-04 1999-12-24 Honda Motor Co Ltd Device for deciding internal parameters of camera
JP2007158639A (en) * 2005-12-05 2007-06-21 Alpine Electronics Inc Car driving support apparatus
JP2009098839A (en) * 2007-10-16 2009-05-07 Honda Motor Co Ltd Image processing apparatus, image processing program, and image processing method
CN102013099A (en) * 2010-11-26 2011-04-13 中国人民解放军国防科学技术大学 Interactive calibration method for external parameters of vehicle video camera
CN107783310A (en) * 2017-11-08 2018-03-09 凌云光技术集团有限责任公司 A kind of scaling method and device of post lens imaging system
CN108830905A (en) * 2018-05-22 2018-11-16 苏州敏行医学信息技术有限公司 The binocular calibration localization method and virtual emulation of simulating medical instrument cure teaching system
CN110599412A (en) * 2019-08-15 2019-12-20 中国科学院遥感与数字地球研究所 Remote sensing data processing method and system based on unmanned aerial vehicle


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Jin Hui: "Research on Image Correction Technology for Omnidirectional Visual Navigation" *
Jin Hui: "Research on Image Correction Technology for Omnidirectional Visual Navigation", China Master's Theses Full-text Database, page 43 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112348756A (en) * 2020-11-04 2021-02-09 深圳市杰恩世智能科技有限公司 Image distortion correction method
CN112598751A (en) * 2020-12-23 2021-04-02 Oppo(重庆)智能科技有限公司 Calibration method and device, terminal and storage medium
CN112964184A (en) * 2021-04-12 2021-06-15 西华大学 Oil film thickness measuring device and measuring method based on surface friction resistance experiment
CN112964184B (en) * 2021-04-12 2022-07-01 西华大学 Oil film thickness measuring device and measuring method based on surface friction resistance experiment
CN114419169A (en) * 2022-01-25 2022-04-29 北京理工大学 Optimization method and system for high-precision ultra-wide dynamic imaging of multiband camera
CN114419169B (en) * 2022-01-25 2024-05-28 北京理工大学 Optimization method and system for high-precision ultra-wide dynamic imaging of multiband camera
CN117876235A (en) * 2023-12-22 2024-04-12 深圳市富创优越科技有限公司 Method, device, terminal equipment and storage medium for ring-looking splicing

Similar Documents

Publication Publication Date Title
CN109272478B (en) Screen projection method and device and related equipment
CN111429531A (en) Calibration method, calibration device and non-volatile computer-readable storage medium
CN110689581B (en) Structured light module calibration method, electronic device and computer readable storage medium
CN106127745B (en) The combined calibrating method and device of structure light 3 D vision system and line-scan digital camera
US20170127045A1 (en) Image calibrating, stitching and depth rebuilding method of a panoramic fish-eye camera and a system thereof
CN112655024B (en) Image calibration method and device
CN111402344A (en) Calibration method, calibration device and non-volatile computer-readable storage medium
CN110335307B (en) Calibration method, calibration device, computer storage medium and terminal equipment
CN108020175B (en) multi-grating projection binocular vision tongue surface three-dimensional integral imaging method
CN110099267A (en) Trapezoidal correcting system, method and projector
CN108364252A (en) A kind of correction of more fish eye lens panorama cameras and scaling method
CN106952219B (en) Image generation method for correcting fisheye camera based on external parameters
CN107527336B (en) Lens relative position calibration method and device
CN113298886B (en) Calibration method of projector
CN110006634B (en) Viewing field angle measuring method, viewing field angle measuring device, display method and display equipment
CN110505468B (en) Test calibration and deviation correction method for augmented reality display equipment
US11380063B2 (en) Three-dimensional distortion display method, terminal device, and storage medium
CN106886976B (en) Image generation method for correcting fisheye camera based on internal parameters
CN111009030A (en) Multi-view high-resolution texture image and binocular three-dimensional point cloud mapping method
CN115830103A (en) Monocular color-based transparent object positioning method and device and storage medium
CN108109111A (en) Pass through the method for the more fish eye lens panorama cameras of software and hardware combining assembly and adjustment
CN112950727B (en) Large-view-field multi-target simultaneous ranging method based on bionic curved compound eye
US11284052B2 (en) Method for automatically restoring a calibrated state of a projection system
CN113963065A (en) Lens internal reference calibration method and device based on external reference known and electronic equipment
CN111047651B (en) Method for correcting distorted image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination