CN111630851A - Camera calibration device, system and method, and device with storage function

Info

Publication number: CN111630851A
Application number: CN201880087316.5A
Authority: CN (China)
Prior art keywords: camera, coordinate system, coordinates, cameras, reference point
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 阳光
Current Assignee: Shenzhen A&E Intelligent Technology Institute Co Ltd
Original Assignee: Shenzhen A&E Intelligent Technology Institute Co Ltd
Application filed by Shenzhen A&E Intelligent Technology Institute Co Ltd
Publication of CN111630851A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30: Image reproducers
    • H04N 13/327: Calibration thereof

Abstract

The application discloses a camera calibration device, system and method, and a device with a storage function. Each camera photographs a reference point within its shooting range to acquire the reference point's coordinates in the image coordinate system; from these image coordinates, the camera's coordinates in the world coordinate system are derived, so that the positional relationship between the cameras can be obtained quickly in each frame, further improving shooting precision.

Description

Camera calibration device, system and method, and device with storage function [ technical field ]
The present application relates to the field of vision technologies, and in particular, to a camera calibration apparatus, system, method, and apparatus with a storage function.
[ background of the invention ]
A binocular camera is a device of growing interest that provides stereoscopic vision. Based on the binocular parallax principle, it acquires two different images of the same measured object from different positions and then computes the three-dimensional position of the photographed object relative to the camera. In general, the two lenses of a binocular camera have mutually parallel optical axes, are arranged side by side in a direction perpendicular to those axes, have viewing windows separated from each other, and image the same object from different viewing angles. Binocular stereo vision measurement offers high efficiency, adequate precision, a simple system structure, and low cost, making it well suited to online, non-contact product detection and quality control on a manufacturing site.
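As a concrete illustration of the parallax principle described above, a minimal sketch (all names and numbers here are illustrative, not taken from the patent): for two parallel cameras with focal length f and baseline B, a point's depth is Z = f * B / d, where d is the horizontal disparity between its two image positions.

```python
def depth_from_disparity(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point seen by two parallel cameras: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    return f_px * baseline_m / disparity_px

# Hypothetical values: 800 px focal length, 0.1 m baseline, 50 px disparity.
z = depth_from_disparity(800.0, 0.1, 50.0)
print(z)  # -> 1.6 (meters)
```

The formula also shows why the extrinsics matter: if the baseline B drifts, every depth computed from it drifts with it, which is the failure mode the next paragraph describes.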
In practice, however, if the relative position between the two cameras changes, extrinsic parameters such as the translation and rotation matrices between them change as well. The relative positional relationship cannot be recalibrated immediately when such a change occurs, so the acquired matching data cannot be used effectively. In most cases a binocular system therefore requires a very stable fixed mount, which adds cost and still cannot guarantee that changes in the relative position between the cameras are handled effectively.
Therefore, it is necessary to provide a device, a system, a method for calibrating a camera and a device having a storage function to solve the above problems.
[ summary of the invention ]
The technical problem mainly solved by the present application is to provide a camera calibration device, system and method, and a device with a storage function, capable of rapidly acquiring the positional relationship between cameras in each frame when their relative position changes.
In order to solve the above technical problem, a first technical solution adopted by the present application is to provide a camera calibration device, which includes a camera, a substrate located within a shooting range of the camera, and a processor coupled to the camera, wherein the substrate is provided with an identification layer, and the identification layer includes at least one reference point.
In order to solve the above technical problem, a second technical solution adopted by the present application is to provide a camera calibration system, where the camera calibration system includes a camera calibration device, the camera calibration device includes a camera, a substrate located within a shooting range of the camera, and a processor coupled to the camera, where the substrate is provided with an identification layer, and the identification layer includes at least one reference point.
In order to solve the above technical problem, a third technical solution adopted by the present application is to provide a camera calibration method, including: the camera shoots a reference point within the shooting range of the camera to acquire the coordinate of the reference point in an image coordinate system; and obtaining the coordinates of the camera in the reference coordinate system through the coordinates of the reference point in the image coordinate system and the reference coordinate system so as to obtain the position relation between the cameras.
In order to solve the above technical problem, a fourth technical solution adopted by the present application is to provide an apparatus having a storage function, where the apparatus stores program data, and the program data can be executed to implement the following method: the camera shoots a reference point within the shooting range of the camera to acquire the coordinate of the reference point in an image coordinate system; and obtaining the coordinates of the camera in the reference coordinate system through the coordinates of the reference point in the image coordinate system and the reference coordinate system so as to obtain the position relation between the cameras.
The beneficial effect of this application is: unlike the prior art, the camera calibration device of the present application includes a camera, a substrate located within the camera's shooting range, and a processor coupled to the camera, where the substrate is provided with an identification layer that includes at least one reference point. The camera photographs the reference point within its shooting range to acquire the reference point's coordinates in the image coordinate system; from these, the camera's coordinates in the world coordinate system are derived, so that the positional relationship between the cameras can be obtained quickly in each frame, further improving shooting precision.
[ description of the drawings ]
FIG. 1 is a schematic structural diagram of an embodiment of an apparatus for camera calibration provided in the present application;
FIG. 2 is a schematic top view of one embodiment of the transparent glass substrate with the marking layer of FIG. 1;
FIG. 3 is a schematic flow chart of a camera calibration method provided in the present application;
FIG. 4 is a schematic diagram of a system for camera calibration according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an embodiment of an apparatus with a storage function provided in the present application.
[ detailed description ] embodiments
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Camera calibration covers intrinsic and extrinsic parameters. The intrinsic parameters mainly include the principal point coordinates, focal length, and radial and tangential distortion coefficients; the extrinsic parameters mainly include the rotation and translation matrices. In actual use the positional relationship between cameras changes easily, that is, the extrinsic parameters change while the intrinsic parameters stay fixed, so the positional relationship between the cameras must be reacquired to maintain shooting precision. To acquire this relationship quickly in each frame, the present application places a substrate bearing an identification layer within the cameras' shooting range and uses reference points on that layer to obtain each camera's coordinates in a world coordinate system, from which the positional relationship between the cameras follows. Hereinafter, the present application is described using the example of obtaining the positional relationship between two cameras.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an embodiment of the camera calibration apparatus provided in the present application. As shown in fig. 1, the apparatus includes a substrate 103, a first camera 101 and a second camera 102 located on one side of the substrate 103, and a processor (not shown in the figure) coupled to the first camera 101 and the second camera 102. The substrate 103 is located within the shooting range of the first camera 101 and the second camera 102, which together form a binocular camera. In the present application the substrate 103 is a planar, transparent glass substrate. An identification layer 1031 is provided on the substrate 103 and includes at least one reference point; the identification layer 1031 divides the transparent glass substrate 103 into a plurality of rectangular unit grids, and the at least one reference point is located at an intersection of those grids.
Further, the identification layer 1031 is disposed on the surface of the transparent glass substrate 103; in other embodiments it may instead be disposed inside the substrate, or partly on the surface and partly inside. Further, the identification layer 1031 is a transflective stripe of the kind used in segment-display technology, i.e. light incident on the stripe is partly reflected and partly refracted.
In this embodiment the transparent substrate is a glass substrate; in other embodiments its material may be sapphire, silicon carbide, or an organic transparent body, which is not limited here. In an alternative embodiment, referring to fig. 2, fig. 2 is a schematic top view of an embodiment of the transparent glass substrate with the identification layer in fig. 1. As shown in fig. 2, the surface of the transparent glass substrate 103 is coated with an identification layer 1031 that divides the substrate into a plurality of rectangular unit cells. This division into rectangular unit grids is only one specific embodiment; in other embodiments the identification layer 1031 may take other patterns.
In a specific embodiment, with reference to figs. 1 and 2, to quickly obtain the positional relationship between the cameras in each frame, the identification layer 1031 on the surface of the transparent glass substrate 103 divides the substrate into rectangular unit grids of identical physical dimensions. Reference points Q1 and Q2 are selected at intersections of these grids, with Q1 within the shooting range of the first camera 101 and Q2 within the shooting range of the second camera 102. The plane of the reference coordinate system is chosen as the plane of the transparent glass substrate 103, and this reference coordinate system serves as the world coordinate system; since the physical size of each rectangular unit grid is known, the coordinates of Q1 and Q2 in the world coordinate system are known.
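Because each grid cell's physical size is known, the world coordinates of any grid intersection follow directly. A minimal sketch (the function name, origin choice, and cell sizes are illustrative assumptions, not from the patent), with the substrate plane taken as Z = 0:

```python
def grid_intersection_world(col: int, row: int, cell_w: float, cell_h: float) -> tuple:
    """World coordinates (X', Y', Z') of the grid intersection at (col, row),
    taking the world origin at one corner of the identification-layer grid
    and the substrate plane as Z = 0."""
    return (col * cell_w, row * cell_h, 0.0)

# E.g. with hypothetical 5 mm x 5 mm cells, the intersection 3 columns over
# and 2 rows up from the origin:
q1_world = grid_intersection_world(3, 2, 0.005, 0.005)
```

This is why the reference points' world coordinates count as "known quantities" in the derivation below: they are fixed by the manufactured grid, not measured per frame.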
By photographing Q1 with the first camera 101, the coordinates of Q1 in the image coordinate system are obtained; from these image coordinates and the intrinsic parameters of the first camera 101, the coordinates of Q1 in the camera coordinate system of the first camera 101 follow. With the optical center of the first camera 101 selected as the origin of its camera coordinate system, the rotation-translation relationship between the world coordinate system and that camera coordinate system is obtained from the coordinates of Q1 in each, giving the coordinates of the first camera 101 in the world coordinate system. Likewise, photographing Q2 with the second camera 102 yields Q2's coordinates in the image coordinate system; from these and the intrinsic parameters of the second camera 102, Q2's coordinates in the camera coordinate system of the second camera 102 follow, and with the optical center of the second camera 102 as origin, the rotation-translation relationship between the world coordinate system and that camera coordinate system yields the coordinates of the second camera 102 in the world coordinate system. The positional relationship between the first camera 101 and the second camera 102 is then obtained from their coordinates in the world coordinate system.
Specifically, a camera coordinate system is established with the optical center of the first camera 101 as its origin. Let the coordinates of the reference point Q1 in this camera coordinate system be (X1, Y1, Z1); this point projects along a light ray to the point q1(x1, y1, f1) in the image coordinate system, where f1 is the focal length (an intrinsic parameter) of the first camera 101. The plane of the image coordinate system is perpendicular to the optical axis of the first camera 101 at distance f1 from its optical center, and the similar-triangle relationship gives the proportions x1/f1 = X1/Z1 and y1/f1 = Y1/Z1. In matrix form:
\[ Z_1 \begin{pmatrix} x_1 \\ y_1 \\ f_1 \end{pmatrix} = f_1 \begin{pmatrix} X_1 \\ Y_1 \\ Z_1 \end{pmatrix} \]
That is, the coordinates (x1, y1, f1) of Q1 in the image coordinate system are acquired by photographing Q1 with the first camera 101, and the coordinates (X1, Y1, Z1) of Q1 in the camera coordinate system of the first camera 101 are then obtained from (x1, y1, f1). The coordinates of Q1 in the world coordinate system are known quantities, denoted (X1', Y1', Z1'). Let the rotation matrix between the world coordinate system and the camera coordinate system of the first camera 101 be R1 (a 3 × 3 matrix) and the translation vector be t1; then the world coordinate system and the camera coordinate system of the first camera 101 have the following relationship:
\[ \begin{pmatrix} X_1 \\ Y_1 \\ Z_1 \\ 1 \end{pmatrix} = \begin{pmatrix} R_1 & t_1 \\ O^T & 1 \end{pmatrix} \begin{pmatrix} X_1' \\ Y_1' \\ Z_1' \\ 1 \end{pmatrix} = L_{W1} \begin{pmatrix} X_1' \\ Y_1' \\ Z_1' \\ 1 \end{pmatrix} \]
where L_{W1} represents the rotation-translation relationship between the world coordinate system and the camera coordinate system of the first camera 101, O is the vector (0, 0, 0), and O^T is its transpose. That is, the rotation-translation relationship L_{W1} between the world coordinate system and the camera coordinate system of the first camera 101 is obtained from the coordinates (X1', Y1', Z1') of the reference point Q1 in the world coordinate system and its coordinates (X1, Y1, Z1) in the camera coordinate system. The first camera 101 is located at the origin of its own camera coordinate system, i.e. its coordinates in that system are known, so its coordinates in the world coordinate system follow by applying the rotation-translation relationship L_{W1}.
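The projection relation above and its inversion can be sketched as follows. This is a hedged illustration: the variable names and numeric values are mine, and inverting the projection requires the depth Z, which the patent's method supplies indirectly via the known world coordinates of the reference point.

```python
import numpy as np

def project(point_cam: np.ndarray, f: float) -> np.ndarray:
    """(X, Y, Z) in the camera frame -> (x, y, f) on the image plane,
    using x/f = X/Z and y/f = Y/Z from the similar-triangle relation."""
    X, Y, Z = point_cam
    return np.array([f * X / Z, f * Y / Z, f])

def back_project(point_img: np.ndarray, f: float, Z: float) -> np.ndarray:
    """Invert the projection given a known depth Z:
    (X, Y, Z) = (Z / f) * (x, y, f)."""
    return point_img * (Z / f)

p_cam = np.array([0.2, -0.1, 2.0])            # hypothetical camera-frame point
p_img = project(p_cam, f=0.008)               # 8 mm focal length (assumed)
p_back = back_project(p_img, f=0.008, Z=2.0)  # recovers p_cam exactly
```

The round trip only closes because Z is supplied externally; a single image coordinate determines the point up to an unknown scale along its ray.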
Similarly, a camera coordinate system is established with the optical center of the second camera 102 as its origin. Let the coordinates of the reference point Q2 in this camera coordinate system be (X2, Y2, Z2); this point projects along a light ray to the point q2(x2, y2, f2) in the image coordinate system, where f2 is the focal length of the second camera 102. The plane of the image coordinate system is perpendicular to the optical axis of the second camera 102 at distance f2 from its optical center, and the similar-triangle relationship gives x2/f2 = X2/Z2 and y2/f2 = Y2/Z2. In matrix form:
\[ Z_2 \begin{pmatrix} x_2 \\ y_2 \\ f_2 \end{pmatrix} = f_2 \begin{pmatrix} X_2 \\ Y_2 \\ Z_2 \end{pmatrix} \]
That is, the coordinates (x2, y2, f2) of Q2 in the image coordinate system are acquired by photographing Q2 with the second camera 102, and the coordinates (X2, Y2, Z2) of Q2 in the camera coordinate system of the second camera 102 are then obtained from (x2, y2, f2). The coordinates of Q2 in the world coordinate system are known quantities, denoted (X2', Y2', Z2'). Let the rotation matrix between the world coordinate system and the camera coordinate system of the second camera 102 be R2 (a 3 × 3 matrix) and the translation vector be t2; then the world coordinate system and the camera coordinate system of the second camera 102 have the following relationship:
\[ \begin{pmatrix} X_2 \\ Y_2 \\ Z_2 \\ 1 \end{pmatrix} = \begin{pmatrix} R_2 & t_2 \\ O^T & 1 \end{pmatrix} \begin{pmatrix} X_2' \\ Y_2' \\ Z_2' \\ 1 \end{pmatrix} = L_{W2} \begin{pmatrix} X_2' \\ Y_2' \\ Z_2' \\ 1 \end{pmatrix} \]
where L_{W2} represents the rotation-translation relationship between the world coordinate system and the camera coordinate system of the second camera 102. That is, L_{W2} is obtained from the coordinates (X2', Y2', Z2') of the reference point Q2 in the world coordinate system and its coordinates (X2, Y2, Z2) in the camera coordinate system. The second camera 102 is located at the origin of its own camera coordinate system, i.e. its coordinates in that system are known, so its coordinates in the world coordinate system follow by applying the rotation-translation relationship L_{W2}.
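The 4 × 4 rotation-translation matrices L_{W1} and L_{W2} can be assembled and applied as below. This is a sketch under assumed values (the pose numbers are illustrative only); note that with camera coordinates = L_W · world coordinates, a camera's position in the world frame is the image of the camera-frame origin under the inverse of L_W.

```python
import numpy as np

def make_L(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Assemble L = [[R, t], [O^T, 1]] as in the text, with O = (0, 0, 0)."""
    L = np.eye(4)
    L[:3, :3] = R
    L[:3, 3] = t
    return L

# Hypothetical pose: 90-degree rotation about Z plus a 1 m shift along X.
R1 = np.array([[0., -1., 0.],
               [1.,  0., 0.],
               [0.,  0., 1.]])
t1 = np.array([1.0, 0.0, 0.0])
L_W1 = make_L(R1, t1)

# World coordinates of a reference point -> its camera-frame coordinates.
q_world = np.array([0.5, 0.25, 0.0, 1.0])   # homogeneous
q_cam = L_W1 @ q_world

# The camera sits at the origin of its own frame; its world position is
# that origin mapped back through the inverse of L_W1.
cam_center_world = np.linalg.inv(L_W1) @ np.array([0.0, 0.0, 0.0, 1.0])
```

Keeping the bottom row (O^T, 1) is what lets rotation and translation compose by plain matrix multiplication, which the next step relies on.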
Once the coordinates of the first camera 101 and the second camera 102 in the same world coordinate system are obtained, the positional relationship between them follows, so the extrinsic parameters of the binocular camera can be quickly recalibrated in each frame.
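With both world poses in hand, the camera-to-camera extrinsics follow by composing one pose with the inverse of the other. A minimal sketch, assuming the convention camera_i coordinates = L_Wi · world coordinates from the derivation above; the example matrices are hypothetical:

```python
import numpy as np

def relative_pose(L_W1: np.ndarray, L_W2: np.ndarray) -> np.ndarray:
    """Transform taking camera-1 coordinates to camera-2 coordinates:
    cam2 = L_W2 @ world = L_W2 @ inv(L_W1) @ cam1."""
    return L_W2 @ np.linalg.inv(L_W1)

# Hypothetical: camera 1 at the world origin, camera 2 shifted 0.1 m along X
# (so its translation vector t2 is -0.1 along X, since cam2 = world + t2).
L_W1 = np.eye(4)
L_W2 = np.eye(4)
L_W2[:3, 3] = [-0.1, 0.0, 0.0]

L_12 = relative_pose(L_W1, L_W2)
R_12, t_12 = L_12[:3, :3], L_12[:3, 3]   # extrinsic rotation and translation
```

Re-running this composition each frame is the "quick recalibration" the text describes: no mechanical rigidity between the cameras is assumed, only visibility of the substrate.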
In this embodiment, a time interval between two successive calibrations of the binocular camera may be set; the specific interval is determined by actual conditions and is not specifically limited here.
In this embodiment, the device for calibrating a camera includes two cameras, and in other embodiments, the device may further include three, four or more cameras, which is not specifically limited herein.
Two reference points Q1 and Q2 are selected in the present embodiment, where the reference point Q1 is within the shooting range of the first camera 101 and the reference point Q2 is within the shooting range of the second camera 102. In other embodiments, only one reference point may be selected on the identification layer, and the one reference point is located in both the shooting range of the first camera and the shooting range of the second camera; or more than two reference points are selected on the identification layer, wherein at least one reference point is positioned in the shooting range of the first camera, and at least one reference point is positioned in the shooting range of the second camera.
In this embodiment, the processor is disposed outside the first camera 101 and the second camera 102 and coupled to the two cameras 101 and 102, and in other embodiments, the processor may be directly installed inside the first camera 101 or the second camera 102, which is not limited in particular.
As can be seen from the above, the camera calibration device of the present application includes a camera, a substrate located within the camera's shooting range, and a processor coupled to the camera, where the substrate is provided with an identification layer that includes at least one reference point. The camera photographs the reference point within its shooting range to acquire the reference point's coordinates in the image coordinate system; from these, the camera's coordinates in the world coordinate system are derived, so that the positional relationship between the cameras can be obtained quickly in each frame, further improving shooting precision.
Referring to fig. 3, fig. 3 is a schematic flow chart of a camera calibration method provided in the present application. The specific steps of the method are described in detail below.
Step 301: the camera photographs a reference point located within its photographing range to acquire coordinates of the reference point in an image coordinate system.
In this embodiment, an example is described in which two cameras each photograph a reference point located within their respective shooting ranges; a larger number of cameras may be used in other embodiments, and the number is not limited here.
In one embodiment, referring to figs. 1 and 2, the plane of the reference coordinate system is chosen as the plane of the transparent glass substrate 103, and this reference coordinate system serves as the world coordinate system. The identification-layer pattern on the substrate 103 is a set of rectangular unit grids, and two grid intersections are selected as reference points: Q1 within the shooting range of the first camera 101 and Q2 within the shooting range of the second camera 102. Since the physical dimensions of each rectangular unit grid are known, the world coordinates of Q1 and Q2 are known, denoted (X1', Y1', Z1') and (X2', Y2', Z2') respectively. The first camera 101 photographs Q1 to acquire its image coordinates (x1, y1, f1), and the second camera 102 photographs Q2 to acquire its image coordinates (x2, y2, f2); the two cameras 101 and 102 then transmit the image coordinates of the two reference points Q1 and Q2 to the processor.
Step 302: and obtaining the coordinates of the camera in the reference coordinate system through the coordinates of the reference point in the image coordinate system and the reference coordinate system so as to obtain the position relation between the cameras.
With the image coordinates of the two reference points Q1 and Q2 from step 301, the processor computes the coordinates of Q1 in the camera coordinate system of the first camera 101 from Q1's image coordinates and the intrinsic parameters of the first camera 101, then obtains the rotation-translation relationship between the world coordinate system and that camera coordinate system from Q1's coordinates in each, yielding the coordinates of the first camera 101 in the world coordinate system. Similarly, the processor computes the coordinates of Q2 in the camera coordinate system of the second camera 102 from Q2's image coordinates and the intrinsic parameters of the second camera 102, obtains the corresponding rotation-translation relationship, and yields the coordinates of the second camera 102 in the world coordinate system. The positional relationship between the first camera 101 and the second camera 102 then follows from their coordinates in the world coordinate system.
Specifically, with continued reference to figs. 1 and 2, the camera coordinate system is established with the optical center of the first camera 101 as its origin, the coordinates of the reference point Q1 in that system are (X1, Y1, Z1), and this point projects along a light ray to the point q1(x1, y1, f1) in the image coordinate system. The similar-triangle relationship gives the proportions x1/f1 = X1/Z1 and y1/f1 = Y1/Z1, or in matrix form:
\[ Z_1 \begin{pmatrix} x_1 \\ y_1 \\ f_1 \end{pmatrix} = f_1 \begin{pmatrix} X_1 \\ Y_1 \\ Z_1 \end{pmatrix} \]
That is, the coordinates (X1, Y1, Z1) of Q1 in the camera coordinate system of the first camera 101 are obtained from its image coordinates (x1, y1, f1). The coordinates of Q1 in the world coordinate system are known quantities, denoted (X1', Y1', Z1'). Let the rotation matrix between the world coordinate system and the camera coordinate system of the first camera 101 be R1 and the translation vector be t1; then the following relationship holds between the two coordinate systems:
\[ \begin{pmatrix} X_1 \\ Y_1 \\ Z_1 \\ 1 \end{pmatrix} = \begin{pmatrix} R_1 & t_1 \\ O^T & 1 \end{pmatrix} \begin{pmatrix} X_1' \\ Y_1' \\ Z_1' \\ 1 \end{pmatrix} = L_{W1} \begin{pmatrix} X_1' \\ Y_1' \\ Z_1' \\ 1 \end{pmatrix} \]
That is, the rotation-translation relationship L_{W1} between the world coordinate system and the camera coordinate system of the first camera 101 is obtained from the coordinates (X1', Y1', Z1') of Q1 in the world coordinate system and its coordinates (X1, Y1, Z1) in the camera coordinate system. The first camera 101 is located at the origin of its own camera coordinate system, i.e. its coordinates in that system are known, so its coordinates in the world coordinate system follow by applying the rotation-translation relationship L_{W1}.
Similarly, the processor computes the world coordinates of the second camera 102 by the same steps used for the first camera 101. The positional relationship between the two cameras 101 and 102 is then obtained from their coordinates in the same world coordinate system, so the extrinsic parameters of the binocular camera can be quickly recalibrated in each frame.
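The step "obtain the rotation-translation relationship from the reference point's coordinates in the world and camera coordinate systems" is stated above for a single point, but recovering a full R and t generally requires at least three non-collinear correspondences. A hedged sketch of one standard way to perform this fit (a least-squares Kabsch/SVD solution; this is my illustration, not the patent's own procedure):

```python
import numpy as np

def fit_rigid_transform(world_pts: np.ndarray, cam_pts: np.ndarray):
    """Least-squares R, t with cam = R @ world + t, via the Kabsch/SVD method.
    Rows of world_pts and cam_pts are corresponding 3-D points; at least
    three non-collinear correspondences are assumed."""
    cw, cc = world_pts.mean(axis=0), cam_pts.mean(axis=0)
    H = (world_pts - cw).T @ (cam_pts - cc)       # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflections
    R = Vt.T @ D @ U.T
    t = cc - R @ cw
    return R, t

# Self-check with a synthetic pose (values hypothetical): rotate 30 degrees
# about Z, shift, then recover the pose from four substrate points.
th = np.deg2rad(30.0)
R_true = np.array([[np.cos(th), -np.sin(th), 0.0],
                   [np.sin(th),  np.cos(th), 0.0],
                   [0.0,         0.0,        1.0]])
t_true = np.array([0.1, -0.2, 0.3])
world = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [1., 1., 0.5]])
cam = world @ R_true.T + t_true
R_est, t_est = fit_rigid_transform(world, cam)
```

In a per-frame loop, this fit would run once per camera using the grid intersections visible to it, after which the relative pose between the cameras follows as described above.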
As can be seen from the above, the camera calibration device of the present application includes a camera, a substrate located within the camera's shooting range, and a processor coupled to the camera, where the substrate is provided with an identification layer that includes at least one reference point. The camera photographs the reference point within its shooting range to acquire the reference point's coordinates in the image coordinate system; from these, the camera's coordinates in the world coordinate system are derived, so that the positional relationship between the cameras can be obtained quickly in each frame, further improving shooting precision.
Referring to fig. 4, fig. 4 is a schematic structural diagram of an embodiment of a camera calibration system provided in the present application. The system 40 for camera calibration comprises a device 401 for camera calibration, wherein the device 401 for camera calibration comprises a camera, a substrate located in a shooting range of the camera, and a processor coupled to the camera, wherein the substrate is provided with an identification layer, and the identification layer comprises at least one reference point.
Referring to fig. 5, fig. 5 is a schematic structural diagram of an embodiment of a device with a storage function provided in the present application. The device 50 with storage function stores program data 501, the program data 501 being executable to implement the following method: the camera shoots a reference point within the shooting range of the camera to acquire the coordinate of the reference point in an image coordinate system; and obtaining the coordinates of the camera in the reference coordinate system through the coordinates of the reference point in the image coordinate system and the reference coordinate system so as to obtain the position relation between the cameras.
The beneficial effects of the present application are as follows. Different from the prior art, the camera calibration device of the present application includes a camera, a substrate located within the shooting range of the camera, and a processor coupled to the camera, where the substrate is provided with a marking layer, and the marking layer includes at least one reference point. The camera photographs the reference point within its shooting range to obtain the coordinates of the reference point in the image coordinate system; from these image coordinates, the coordinates of the camera in the world coordinate system are then obtained, so that the positional relationship between the cameras can be quickly determined in every frame, further improving shooting accuracy.
The above description is only an embodiment of the present application and is not intended to limit the scope of the present application. All equivalent structural or process modifications made using the content of this specification and the accompanying drawings, whether applied directly or indirectly in other related technical fields, likewise fall within the scope of protection of the present application.

Claims (16)

  1. A camera calibration device, comprising a camera, a substrate located within a shooting range of the camera, and a processor coupled to the camera, wherein the substrate is provided with a marking layer, and the marking layer comprises at least one reference point.
  2. The device of claim 1, wherein the substrate is a transparent glass substrate.
  3. The device according to claim 2, wherein the marking layer is located on the surface and/or inside of the transparent glass substrate.
  4. The device of claim 1, wherein the number of cameras is two, and both cameras are coupled to the processor.
  5. The device of claim 4, wherein the substrate is a planar substrate, the marking layer of the planar substrate is provided with two reference points, and the two reference points are respectively located within the shooting ranges of the two cameras.
  6. The device of claim 1, wherein the pattern of the marking layer on the substrate is a rectangular grid, and the reference point is an intersection point of the rectangular grid.
  7. The device of claim 1, wherein the marking layer is a transflective light strip.
  8. A camera calibration system, comprising a camera calibration device, wherein the device comprises a camera, a substrate located within a shooting range of the camera, and a processor coupled to the camera, the substrate is provided with a marking layer, and the marking layer comprises at least one reference point.
  9. A camera calibration method, comprising:
    photographing, by a camera, a reference point located within a shooting range of the camera to acquire coordinates of the reference point in an image coordinate system; and
    obtaining coordinates of the camera in a reference coordinate system from the coordinates of the reference point in the image coordinate system and in the reference coordinate system, so as to obtain a positional relationship between cameras.
  10. The method of claim 9, wherein the substrate is a transparent glass substrate.
  11. The method of claim 9, wherein the substrate is a planar substrate.
  12. The method of claim 9, wherein the pattern of the marking layer is a rectangular grid, and the reference point is an intersection point of the rectangular grid.
  13. The method of claim 9, wherein the marking layer is a transflective light strip.
  14. The method according to claim 9, wherein the number of cameras is two, the number of reference points is two, the two reference points are respectively located within the shooting ranges of the two cameras, and the step of photographing the reference points located within the shooting range to acquire the coordinates of the reference points in the image coordinate system comprises:
    photographing, by the two cameras, the two reference points within their respective shooting ranges to respectively acquire the coordinates of the two reference points in the image coordinate system.
  15. The method according to claim 14, wherein the step of obtaining the coordinates of the cameras in the reference coordinate system from the coordinates of the reference points in the image coordinate system and in the reference coordinate system, so as to obtain the positional relationship between the cameras, comprises:
    respectively acquiring the coordinates of the two reference points in the camera coordinate systems from the coordinates of the two reference points in the image coordinate system; and
    respectively acquiring the coordinates of the two cameras in the world coordinate system from the coordinates of the two reference points in the camera coordinate systems and the coordinates of the two reference points in the world coordinate system, so as to obtain the positional relationship between the two cameras.
  16. A device with a storage function, wherein the device stores program data executable to implement a method comprising:
    photographing, by a camera, a reference point located within a shooting range of the camera to acquire coordinates of the reference point in an image coordinate system; and
    obtaining coordinates of the camera in a reference coordinate system from the coordinates of the reference point in the image coordinate system and in the reference coordinate system, so as to obtain a positional relationship between cameras.
CN201880087316.5A 2018-10-10 2018-10-10 Camera calibration device, system and method and device with storage function Pending CN111630851A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/109737 WO2020073261A1 (en) 2018-10-10 2018-10-10 Camera calibration apparatus, system and method, and apparatus with storage function

Publications (1)

Publication Number Publication Date
CN111630851A true CN111630851A (en) 2020-09-04

Family

ID=70164291

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880087316.5A Pending CN111630851A (en) 2018-10-10 2018-10-10 Camera calibration device, system and method and device with storage function

Country Status (2)

Country Link
CN (1) CN111630851A (en)
WO (1) WO2020073261A1 (en)

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101013505A (en) * 2007-02-05 2007-08-08 武汉大学 Camera calibration method and calibration apparatus thereof
US20080303906A1 (en) * 2004-05-24 2008-12-11 Centre De Recherche Industrielle Du Quebec Camera calibrating apparatus and method
CN202057299U (en) * 2010-12-14 2011-11-30 李军 Image calibration template capable of automatically establishing image coordinate system
CN102788552A (en) * 2012-02-28 2012-11-21 王锦峰 Linear coordinate calibration method
CN202814413U (en) * 2012-09-28 2013-03-20 北京航天计量测试技术研究所 High-precision calibration plate for calibrating camera parameters
CN103632364A (en) * 2013-11-06 2014-03-12 同济大学 Camera spatial position relation calibration device in multi-camera photographing measurement system
CN103679693A (en) * 2013-01-25 2014-03-26 杭州先临三维科技股份有限公司 Multi-camera single-view calibration device and calibration method thereof
US20150029345A1 (en) * 2012-01-23 2015-01-29 Nec Corporation Camera calibration device, camera calibration method, and camera calibration program
CN104375375A (en) * 2014-11-17 2015-02-25 国家电网公司 Method and device for calibrating visible light camera and thermal infrared imager camera through checkerboard
CN104680535A (en) * 2015-03-06 2015-06-03 南京大学 Calibration target, calibration system and calibration method for binocular direct-vision camera
CN105469418A (en) * 2016-01-04 2016-04-06 中车青岛四方机车车辆股份有限公司 Photogrammetry-based wide-field binocular vision calibration device and calibration method
CN106713897A (en) * 2017-02-27 2017-05-24 驭势科技(北京)有限公司 Binocular camera and self-calibration method for binocular camera
CN107146254A (en) * 2017-04-05 2017-09-08 西安电子科技大学 The Camera extrinsic number scaling method of multicamera system
CN206863818U (en) * 2017-06-20 2018-01-09 成都通甲优博科技有限责任公司 A kind of calibrating template
CN108171758A (en) * 2018-01-16 2018-06-15 重庆邮电大学 Polyphaser scaling method based on minimum time principle and transparent glass scaling board

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105631853B (en) * 2015-11-06 2018-01-30 湖北工业大学 Vehicle-mounted binocular camera demarcation and Verification method
CN108416791B (en) * 2018-03-01 2021-07-23 燕山大学 Binocular vision-based parallel mechanism moving platform pose monitoring and tracking method

Also Published As

Publication number Publication date
WO2020073261A1 (en) 2020-04-16


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200904