CN113450415A - Imaging device calibration method and device - Google Patents

Imaging device calibration method and device

Info

Publication number
CN113450415A
Authority
CN
China
Prior art keywords
straight line, spatial, space, points, coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010222903.XA
Other languages
Chinese (zh)
Inventor
丁银章
陈康平
赖百胜
黄建强
华先胜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN202010222903.XA
Publication of CN113450415A
Status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application discloses an imaging device calibration method and device. The method comprises: acquiring, on a target image captured by an imaging device to be calibrated, two planar straight lines corresponding to two parallel spatial straight lines; acquiring known distances between a plurality of spatial points on one of the spatial straight lines and the known distance between the two spatial straight lines; and calibrating the imaging device to be calibrated by taking the spatial straight lines as the calibration reference object, in combination with the planar straight lines and the known distances. Because the method uses parallel spatial straight lines as the calibration reference object, and such lines and the spatial points on them are easy to obtain in a traffic scene, it solves the problem that the existing camera calibration process for traffic scenes is restricted by the limited choice of calibration reference objects.

Description

Imaging device calibration method and device
Technical Field
The present application relates to the field of image processing, and in particular to an imaging device calibration method. It also relates to an imaging device calibration apparatus and an electronic device.
Background
In applications such as computer vision, image measurement and three-dimensional scene reconstruction, a geometric model of camera imaging must be established in order to correct lens distortion, determine the conversion between physical sizes in three-dimensional space and pixel sizes in the image, and determine the relationship between the three-dimensional position of a spatial object (or a point on its surface) and the coordinates of its corresponding pixel in the image. The parameters of this geometric model are the camera parameters, and the process of solving for them is called camera calibration or video camera calibration. The accuracy of the calibration result and the stability of the algorithm directly affect the accuracy of the camera's subsequent work.
The camera calibration process mainly consists of establishing correspondences between points with known coordinates on a calibration reference object and the pixels of that reference object in the image, and then obtaining the internal and external parameters of the camera model with a predetermined algorithm. According to the calibration reference object used, the process can be divided into camera calibration based on a three-dimensional target, calibration based on a two-dimensional planar target, calibration based on radial constraints, and so on.
Existing camera calibration methods for traffic scenes mainly include the following two:
Method 1: use rectangles on parallel lane lines as the calibration reference object to calibrate the focal length and orientation information of the camera.
Method 2: find on the road a group of three parallel lines with known spacing, together with a straight line of known slope intersecting them, as the known calibration reference object, and calibrate the focal length and orientation information of the camera.
The above methods have the following disadvantages:
Method 1 selects a rectangle on the road as the calibration reference object; such a rectangle is not easy to obtain in most traffic scenes, so the choice of calibration reference object is limited and the camera calibration process is therefore restricted.
Method 2 selects three parallel lines and a straight line of known slope intersecting them as the calibration reference object; in a traffic scene the intersecting straight line is difficult to obtain, so the choice of calibration reference object is again limited and the camera calibration process is restricted.
Disclosure of Invention
The embodiments of the application provide an imaging device calibration method, an imaging device calibration apparatus and an electronic device, aiming to solve the problem that the existing camera calibration process for traffic scenes is restricted by the limited choice of calibration reference objects.
The embodiment of the application provides an imaging device calibration method, which comprises the following steps:
acquiring two plane straight lines corresponding to two parallel space straight lines on a target image shot by imaging equipment to be calibrated;
acquiring known distances between a plurality of spatial points on the spatial straight line and acquiring known distances between the two spatial straight lines;
and taking the space straight line as a calibration reference object, and calibrating the imaging equipment to be calibrated by combining the plane straight line and the known distance.
Optionally, the obtaining two plane straight lines corresponding to two parallel space straight lines on a target image captured by an imaging device to be calibrated includes: selecting a first plane straight line and a second plane straight line from a target image shot by imaging equipment to be calibrated, wherein the first plane straight line corresponds to a first space straight line in the real world, the second plane straight line corresponds to a second space straight line in the real world, and the first space straight line and the second space straight line are parallel to each other;
the obtaining the known distances between the plurality of spatial points on the spatial straight line and the obtaining the known distances between the two spatial straight lines comprises: taking the known distances among a plurality of preset space points on the first space straight line as a target space distance, and taking the distance between the first space straight line and the second space straight line as a parallel line distance;
the calibrating the imaging device to be calibrated by taking the space straight line as a calibration reference object and combining the plane straight line and the known distance comprises the following steps: and obtaining a mapping relation between a pixel coordinate system corresponding to the target image and a world coordinate system corresponding to the real world according to the target space distance, the parallel line distance, the second plane straight line and pixel coordinate data of a plurality of plane points corresponding to the plurality of space points on the first plane straight line.
Optionally, the predetermined plurality of spatial points on the first spatial straight line include: a first space point, a second space point and a third space point which are sequentially distributed on the first space straight line;
a plurality of planar points on the first planar line corresponding to the plurality of spatial points comprises: a first plane point corresponding to the first spatial point, a second plane point corresponding to the second spatial point, and a third plane point corresponding to the third spatial point;
correspondingly, the target space distance includes: a first spatial distance between the first spatial point and the second spatial point, and a second spatial distance between the first spatial point and the third spatial point.
Optionally, the obtaining, according to the target spatial distance, the parallel line distance, the second planar straight line, and pixel coordinate data of a plurality of planar points on the first planar straight line corresponding to the plurality of spatial points, a mapping relationship between a pixel coordinate system corresponding to the target image and a world coordinate system corresponding to the real world includes:
calculating, according to the first spatial distance, the second spatial distance, the parallel line distance, the second planar straight line and the pixel coordinate data of the plurality of plane points on the first planar straight line corresponding to the plurality of spatial points, the camera focal length, a first included angle between the imaging device to be calibrated and the XwOYw plane of the world coordinate system, and a second included angle between the first spatial straight line or the second spatial straight line and the Xw axis of the XwOYw plane;
deriving an internal reference matrix from the pixel coordinate system to the world coordinate system according to the camera focal length;
and deriving an external parameter matrix from the pixel coordinate system to the world coordinate system according to the first included angle and the second included angle.
Optionally, the calculating, according to the first spatial distance, the second spatial distance, the parallel line distance, the second planar straight line and the pixel coordinate data of the plurality of plane points on the first planar straight line corresponding to the plurality of spatial points, of the camera focal length, the first included angle between the imaging device to be calibrated and the XwOYw plane of the world coordinate system, and the second included angle between the first spatial straight line or the second spatial straight line and the Xw axis of the XwOYw plane includes:
performing straight line fitting based on pixel coordinates of a plurality of points on the second plane straight line to obtain the slope and intercept of the second plane straight line;
determining the plane where the first spatial straight line and the second spatial straight line are located as the XwOYw plane of the world coordinate system, and obtaining, according to the slope and intercept of the second planar straight line, intersection pixel coordinate data corresponding to the spatial intersection point of the second spatial straight line with the line through one of the first spatial point, the second spatial point and the third spatial point along the Xw-axis direction of the XwOYw plane;
and solving by using a pinhole imaging model and a similar triangle principle by using the first space distance, the second space distance, the distance between the parallel lines, the intersection point pixel coordinate data and the pixel coordinate data of a plurality of plane points corresponding to the plurality of space points on the first plane straight line as known data to obtain the camera focal length, the first included angle and the second included angle.
Optionally, the method further includes:
taking the known distance of the imaging device to be calibrated in the real world relative to the plane where the first space straight line and the second space straight line are located as the height of the imaging device to be calibrated;
according to the height of the imaging equipment to be calibrated, carrying out position transformation on the world coordinate system to obtain a building information model coordinate system;
obtaining the spatial coordinate data of a plurality of preset spatial points on the first spatial straight line according to the pixel coordinate data of a plurality of plane points on the first planar straight line corresponding to the plurality of spatial points and the mapping relation between the pixel coordinate system corresponding to the target image and the world coordinate system corresponding to the real world;
obtaining model coordinate data of a plurality of space points corresponding to the building information model coordinate system according to the preset space coordinate data of the space points on the first space straight line;
obtaining model coordinate data of the plurality of spatial points corresponding to the vertical projection points of the second spatial straight line;
obtaining pixel coordinate data of the target image corresponding to the vertical projection point according to the mapping relation between the pixel coordinate system corresponding to the target image and the world coordinate system corresponding to the real world;
and obtaining a mapping relation between the pixel coordinate system corresponding to the target image and the building information model coordinate system according to the model coordinate data of the plurality of spatial points corresponding to the building information model coordinate system, the model coordinate data of the vertical projection point corresponding to the building information model coordinate system, the pixel coordinate data of the vertical projection point corresponding to the target image and the pixel coordinate data of the plurality of plane points corresponding to the plurality of spatial points.
Optionally, the performing position transformation on the world coordinate system according to the height of the imaging device to be calibrated to obtain a building information model coordinate system includes:
translating, according to the height of the imaging device to be calibrated, the origin of the world coordinate system to the vertical intersection point of the imaging device to be calibrated with the XwOYw plane of the world coordinate system, to obtain the target origin of the building information model coordinate system.
Optionally, the obtaining, according to the spatial coordinate data of the plurality of spatial points preset on the first spatial straight line, the model coordinate data of the plurality of spatial points corresponding to the building information model coordinate system includes:
obtaining relative position information between a plurality of preset space points on the first space straight line and the target origin;
and obtaining model coordinate data of the plurality of spatial points corresponding to the building information model coordinate system according to the spatial coordinate data of the plurality of spatial points preset on the first spatial straight line and the relative position information.
Optionally, the obtaining model coordinate data of the plurality of spatial points corresponding to the vertical projection point of the second spatial straight line includes: obtaining relative position information between the vertical projection point and the target origin according to the relative position information between the plurality of space points and the target origin; and obtaining model coordinate data of the vertical projection point according to the relative position information between the vertical projection point and the target origin point.
Optionally, the method further includes: and performing edge extraction on the image shot by the imaging equipment to be calibrated to obtain the target image.
Optionally, the first spatial straight line and the second spatial straight line are lane lines in a traffic scene.
The embodiment of the present application further provides an imaging device calibration apparatus, including:
the straight line acquisition unit is used for acquiring two planar straight lines corresponding to two parallel spatial straight lines on a target image captured by the imaging device to be calibrated;
a distance data acquisition unit for acquiring known distances between a plurality of spatial points on the spatial straight line and acquiring known distances between the two spatial straight lines;
and the calibration unit is used for calibrating the imaging equipment to be calibrated by taking the space straight line as a calibration reference object and combining the plane straight line and the known distance.
The embodiment of the application also provides an electronic device, which comprises a processor and a memory, wherein:
the memory is configured to store one or more computer instructions, and the one or more computer instructions are executed by the processor to perform operations comprising:
acquiring two plane straight lines corresponding to two parallel space straight lines on a target image shot by imaging equipment to be calibrated;
acquiring known distances between a plurality of spatial points on the spatial straight line and acquiring known distances between the two spatial straight lines;
and taking the space straight line as a calibration reference object, and calibrating the imaging equipment to be calibrated by combining the plane straight line and the known distance.
Compared with the prior art, the embodiment of the application has the following advantages:
the imaging device calibration method provided by the embodiment of the application obtains two plane straight lines corresponding to two parallel space straight lines on a target image shot by imaging devices to be calibrated; acquiring known distances between a plurality of spatial points on the spatial straight line and acquiring known distances between the two spatial straight lines; and taking the space straight line as a calibration reference object, and calibrating the imaging equipment to be calibrated by combining the plane straight line and the known distance. The method takes the parallel space straight line as the calibration reference object, and the parallel space straight line and the space points on the parallel space straight line are easy to obtain in the traffic scene, so that the problem that the camera calibration process is limited due to the limitation of the selected calibration reference object in the existing camera calibration process aiming at the traffic scene can be solved by using the method.
Drawings
Fig. 1 is a flowchart of an imaging device calibration method according to a first embodiment of the present disclosure;
FIG. 1-A is a schematic diagram of a relationship between a pixel coordinate system and a world coordinate system according to a first embodiment of the present application;
FIG. 1-B is a schematic diagram of a transformation of a world coordinate system and a BIM coordinate system provided in a first embodiment of the present application;
FIG. 1-C is a schematic diagram of the relative positions of the calibration point and the imaging device to be calibrated in the BIM coordinate system provided in the first embodiment of the present application;
fig. 2 is a block diagram of a unit of a calibration apparatus of an imaging device according to a second embodiment of the present application;
fig. 3 is a schematic logical structure diagram of an electronic device according to a third embodiment of the present application.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. This application is capable of implementation in many different ways than those herein set forth and of similar import by those skilled in the art without departing from the spirit of this application and is therefore not limited to the specific implementations disclosed below.
Aiming at an imaging device calibration scene, in order to reduce the complexity of a calibration process, the application provides an imaging device calibration method, an imaging device calibration device corresponding to the method and electronic equipment. The following provides embodiments to explain the method, apparatus, and electronic device in detail.
A first embodiment of the present application provides an imaging device calibration method. The method may be executed by a computing device application used to calibrate a camera or video camera. Fig. 1 is a flowchart of the imaging device calibration method provided in the first embodiment of the present application, and the method provided in this embodiment is described in detail below with reference to fig. 1. The following embodiments are described to illustrate the principles of the method and are not intended to limit its practical use.
As shown in fig. 1, the method for calibrating an imaging device provided in this embodiment includes the following steps:
s101, two parallel plane straight lines corresponding to two space straight lines on a target image shot by imaging equipment to be calibrated are obtained.
As noted in the background, in applications such as computer vision, image measurement and three-dimensional scene reconstruction, a geometric model of camera imaging must be established in order to correct lens distortion, determine the conversion between physical sizes in three-dimensional space and pixel sizes in the image, and determine the relationship between the three-dimensional position of a spatial object (or a point on its surface) and the coordinates of its corresponding pixel in the image; the parameters of this geometric model are the camera parameters, and the process of solving for them is camera calibration or video camera calibration.
In this embodiment, two plane straight lines corresponding to two parallel spatial straight lines on a target image captured by an imaging device to be calibrated are obtained, specifically: the method comprises the steps of selecting a first plane straight line and a second plane straight line in a target image shot by imaging equipment to be calibrated, wherein the first plane straight line corresponds to a first space straight line in the real world, the second plane straight line corresponds to a second space straight line in the real world, and the first space straight line and the second space straight line are parallel to each other. The first space straight line and the second space straight line are calibration reference objects preset in a real space, the calibration reference objects refer to predetermined movable objects in the shooting range of the imaging device to be calibrated and are used for obtaining matched space coordinate information and image coordinate information in the calibration process of the imaging device, for example, the imaging device to be calibrated is a camera which is arranged at an airport, and the calibration reference objects can be airplanes in the process of taking off or landing with determined flight information; the camera is arranged on a road, and the calibration reference object can be a running vehicle; the camera is arranged in a market, and the calibration reference object can be a movable robot provided with a positioning sensor or a person wearing the positioning sensor. In this embodiment, the first spatial straight line and the second spatial straight line may be markers such as parallel lane lines on a highway, a straight line where a street lamp is located, or a reflective strip at a road edge. In the present embodiment, the first spatial straight line and the second spatial straight line are preferably lane lines in a traffic scene.
It should be noted that before the first planar straight line and the second planar straight line are obtained, edge extraction is further performed on the image captured by the imaging device to be calibrated to obtain the target image. For example, a bilateral filter is adopted to perform noise reduction on an original image captured by a camera, and an edge detection method such as a Canny operator or a Sobel operator, a Prewitt operator and the like is applied to perform edge detection on the original image subjected to noise reduction processing, so as to extract an edge position of the original image and obtain the target image.
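As an illustrative sketch of this preprocessing step (not taken from the application itself): the following assumes OpenCV, and the filter and threshold parameters are arbitrary example values.

```python
import cv2

def extract_target_image(raw_bgr):
    """Denoise a captured frame and return its edge map as the target image."""
    # Bilateral filtering reduces noise while preserving lane-line edges.
    denoised = cv2.bilateralFilter(raw_bgr, 9, 75, 75)
    gray = cv2.cvtColor(denoised, cv2.COLOR_BGR2GRAY)
    # Canny edge detection; Sobel or Prewitt operators could be substituted, as the text notes.
    edges = cv2.Canny(gray, 50, 150)
    return edges
```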
S102, obtaining known distances between a plurality of spatial points on the spatial straight line, and obtaining the known distances between the two spatial straight lines.
In this embodiment, obtaining the known distances between the plurality of spatial points on the spatial straight line and the known distance between the two spatial straight lines may specifically refer to: taking the known distances between a plurality of preset spatial points on the first spatial straight line as the target spatial distances, and taking the distance between the first spatial straight line and the second spatial straight line as the parallel line distance. The preset plurality of spatial points on the first spatial straight line may be a first spatial point, a second spatial point and a third spatial point distributed in sequence on the first spatial straight line, such as the first spatial point A', the second spatial point B' and the third spatial point C' shown in fig. 1-A; in a traffic scene, these points may be points preset on a lane line or on a reflective strip at the road edge. The known distances between the preset spatial points may be a first spatial distance between the first spatial point and the second spatial point (the distance d1 between A' and B') and a second spatial distance between the first spatial point and the third spatial point (the distance d2 between A' and C'). In this embodiment, the first spatial distance and the second spatial distance are known data, that is, the spatial points are points whose mutual distances are known. The parallel line distance dw may be the perpendicular distance between the two parallel lane lines, acquired in advance.
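For the code sketches that follow, the known quantities described above can be gathered as in the illustration below; the variable names (d1, d2, dw, the pixel points A, B, C and the sampled second-line points) are assumptions introduced for these examples, and the numeric values are placeholders.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class CalibrationInput:
    d1: float  # known distance between A' and B' on the first spatial line
    d2: float  # known distance between A' and C' on the first spatial line
    dw: float  # perpendicular distance between the two parallel spatial lines
    A: Tuple[float, float]  # pixel coordinates (h1, v1) of plane point A
    B: Tuple[float, float]  # pixel coordinates (h2, v2) of plane point B
    C: Tuple[float, float]  # pixel coordinates (h3, v3) of plane point C
    second_line_pixels: Tuple[Tuple[float, float], ...]  # sampled points on the second planar line

# Placeholder values for illustration only.
inputs = CalibrationInput(
    d1=6.0, d2=12.0, dw=3.75,
    A=(412.0, 690.0), B=(455.0, 610.0), C=(487.0, 548.0),
    second_line_pixels=((820.0, 700.0), (790.0, 640.0), (765.0, 590.0)),
)
```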
S103, taking the space straight line as a calibration reference object, and calibrating the imaging device to be calibrated by combining the plane straight line and the known distance.
In this embodiment, the spatial straight line is used as a calibration reference object, and the imaging device to be calibrated is calibrated by combining the planar straight line and the known distance, which may specifically be: and obtaining a mapping relation between a pixel coordinate system corresponding to the target image and a world coordinate system corresponding to the real world according to the target space distance, the parallel line distance, the second plane straight line and pixel coordinate data of a plurality of plane points on the first plane straight line corresponding to the plurality of space points.
The plurality of plane points on the first planar straight line corresponding to the plurality of spatial points may refer to: a first plane point corresponding to the first spatial point, a second plane point corresponding to the second spatial point, and a third plane point corresponding to the third spatial point. As shown in fig. 1-A, the first plane point is A, the second plane point is B and the third plane point is C, and in the pixel coordinate system the pixel coordinate data of the first, second and third plane points are all known data.
In this embodiment, the mapping relationship between the pixel coordinate system corresponding to the target image and the world coordinate system corresponding to the real world may be obtained as follows:
First, according to the first spatial distance, the second spatial distance, the parallel line distance, the second planar straight line and the pixel coordinate data of the plurality of plane points on the first planar straight line corresponding to the plurality of spatial points, the following quantities are obtained by calculation: the camera focal length, a first included angle between the imaging device to be calibrated and the XwOYw plane of the world coordinate system, and a second included angle between the first spatial straight line or the second spatial straight line and the Xw axis of the XwOYw plane. The process specifically comprises the following:
and performing straight line fitting based on the pixel coordinates of the plurality of points on the second plane straight line to obtain the slope and the intercept of the second plane straight line. For example, a linear equation y ═ kx + b corresponding to the second planar line is fitted by the least square method based on the pixel coordinates of 10 or more points on the second planar line, and the slopes and intercepts of the second planar line are corresponding to k and b.
The plane where the first spatial straight line and the second spatial straight line are located is determined as the XwOYw plane of the world coordinate system, and, according to the slope and intercept of the second planar straight line and the pixel coordinates of the plurality of points on that line, the pixel coordinate data in the pixel coordinate system corresponding to the target image are obtained for the spatial intersection point of the second spatial straight line with the line through one of the first, second and third spatial points along the Xw-axis direction of the XwOYw plane. As shown in fig. 1-A, A, B and C denote the plane points on the first planar straight line, and D denotes the plane intersection point in the target image corresponding to the spatial intersection point D', where D' is the intersection of the second spatial straight line with the line through A' along the Xw-axis direction. The pixel coordinates of these plane points are A(h1, v1), B(h2, v2), C(h3, v3) and D(h4, v4), and the image centre coordinates are (h0, v0). Since A' and D' differ only in their position along the Xw axis, v4 = v1, and h4 = (v1 - b)/k.
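Following the relation stated above (v4 = v1 and h4 = (v1 - b)/k), the pixel coordinates of the plane intersection point D follow directly from the fitted slope and intercept; a small sketch, reusing the assumed names from the previous examples:

```python
def intersection_pixel(A_pixel, k, b):
    """Pixel coordinates (h4, v4) of D, the image of the spatial intersection point D'."""
    h1, v1 = A_pixel
    v4 = v1                # A and D share the same vertical pixel coordinate
    h4 = (v1 - b) / k      # D lies on the fitted second planar line v = k*h + b
    return h4, v4
```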
Then, using the pinhole imaging model and the principle of similar triangles, the camera focal length f, the first included angle θ and the second included angle α are solved with the first spatial distance d1, the second spatial distance d2, the parallel line distance dw, the pixel coordinate data D((v1 - b)/k, v1) of the spatial intersection point in the pixel coordinate system corresponding to the target image, and the pixel coordinate data A(h1, v1), B(h2, v2), C(h3, v3) of the plane points corresponding to the spatial points on the first planar straight line as known data. As shown in fig. 1-A, based on the positional relationship among the spatial points, the plane points, the spatial intersection point, the plane intersection point D, the focal point corresponding to the imaging device to be calibrated (point P' in fig. 1-A) and the optical centre of the imaging device to be calibrated (point L in fig. 1-A), a set of similar-triangle proportional relations and a trigonometric relation are obtained from the pinhole imaging model and the principle of similar triangles, and the known data d1, d2, dw, D((v1 - b)/k, v1), A(h1, v1), B(h2, v2), C(h3, v3) are substituted into them to obtain the camera focal length f (= P'L), the first included angle θ and the second included angle α. For example, the proportional relation is P'L/PL = CL/C'L = BL/B'L = AL/A'L = DL/D'L, and the trigonometric relation is dw = A'D' × sin α; substituting the known data yields the following derived formulas:
[Eight derived formulas, presented as equation images in the original publication, relate the known data d1, d2, dw and the pixel coordinates to the unknowns f, θ and α through the intermediate variables t1, λ, h12, h13 and d13.]
In the above formulas, t1, λ, h12, h13 and d13 are intermediate variables obtained by combining the similar-triangle proportional relations; the camera focal length f, the first included angle θ and the second included angle α can then be calculated from the derived formulas.
And then, deriving an internal reference matrix from the pixel coordinate system to the world coordinate system according to the camera focal length, and deriving an external reference matrix from the pixel coordinate system to the world coordinate system according to the first included angle and the second included angle.
For example, the mapping matrix P may be represented as:
P = K [ R | T ]
taking the coordinate of the central point of the target image as the coordinate of the principal point, the internal reference matrix K may be:
K = [ f  0  h0 ]
    [ 0  f  v0 ]
    [ 0  0  1  ]
the external reference matrix may be:
[ R | T ]
the external parameter matrix comprises a rotation component R and a translation component T, the preset rolling angle of the camera is 0, and the first included angle theta and the second included angle alpha obtained by the calculation are combined to indicate that the three direction angles of the camera are determined, so that R can be obtained. The translation component may be determined based on the position of the camera in the BIM coordinate system, and assuming that the coordinate values of the camera in the BIM coordinate system are (x, y, z), T ═ x, y, z.
In this embodiment, after obtaining the mapping relationship between the pixel coordinate system corresponding to the target image and the world coordinate system corresponding to the real world in the above process, the mapping relationship between the pixel coordinate system and the Building Information Model (BIM) coordinate system may also be obtained, and the process specifically includes the following steps:
A. taking the known distance of the imaging device to be calibrated in the real world relative to the plane where the first space straight line and the second space straight line are located as the height H of the imaging device to be calibrated; in this embodiment, the plane where the first spatial straight line and the second spatial straight line are located may be a road plane corresponding to the lane line, and the height of the imaging device to be calibrated may be a height of the camera relative to the road plane.
B. And according to the height H of the imaging device to be calibrated, position transformation is performed on the world coordinate system to obtain the Building Information Model (BIM) coordinate system. In this embodiment the transformation is as shown in fig. 1-B: the origin of the world coordinate system is translated to the vertical intersection point of the imaging device to be calibrated with the XwOYw plane of the world coordinate system, which gives the target origin of the building information model coordinate system. The transformation requires calculating the translation vector between the world coordinate system and the building information model coordinate system from the height H of the imaging device to be calibrated; because the translation is only along the Y axes (Yw and Yb), the translation distance is OO', and OO' = H × tan(θ) can be calculated from the known height H of the imaging device to be calibrated.
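A one-line illustration of the translation described here, OO' = H × tan(θ); the function name is an assumption.

```python
import math

def world_to_bim_offset(H, theta):
    """Distance OO' by which the world origin is shifted along the Yw axis to reach
    the vertical foot of the imaging device, i.e. the target origin of the BIM coordinate system."""
    return H * math.tan(theta)
```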
C. The spatial coordinate data of the plurality of spatial points A', B', C' preset on the first spatial straight line are obtained according to the pixel coordinate data A(h1, v1), B(h2, v2), C(h3, v3) of the plurality of plane points corresponding to those spatial points on the first planar straight line, together with the mapping relationship between the pixel coordinate system and the world coordinate system obtained in step S103.
D. Model coordinate data of the plurality of spatial points in the building information model coordinate system are obtained according to the spatial coordinate data of the preset spatial points A', B', C'. As shown in fig. 1-C, the three-dimensional height of the imaging device to be calibrated in the BIM coordinate system can be measured in the BIM; the relative position information between the spatial points A', B', C' and the target origin O' of the building information model coordinate system is determined from the relationship between this three-dimensional height and the real height H of the imaging device to be calibrated, and the model coordinate data of the points A'', B'', C'' in the building information model coordinate system corresponding to A', B', C' are obtained from the spatial coordinate data of A', B', C' and the relative position information. Measuring the three-dimensional height of the imaging device in the BIM and combining it with the real height H makes it possible to deduce the relative position of the calibration reference object in the BIM coordinate system, providing a reliable basis for marking the calibration reference object in that coordinate system.
E. Model coordinate data of the vertical projection points D', E', F' of the spatial points A', B', C' onto the second spatial straight line are obtained. As shown in fig. 1-C, the relative position information between the vertical projection points D', E', F' and the target origin O' is obtained from the relative position information between the spatial points A', B', C' and the target origin O'; the model coordinate data of the points D'', E'', F'' in the building information model coordinate system corresponding to D', E', F' are then obtained from that relative position information.
F. The pixel coordinate data of the vertical projection points D', E', F' in the target image are determined according to the mapping relationship between the pixel coordinate system and the world coordinate system obtained in step S103.
G. The mapping relationship between the pixel coordinate system corresponding to the target image and the building information model coordinate system is obtained from: the model coordinate data A'', B'', C'' of the spatial points A', B', C' in the building information model coordinate system; the model coordinate data D'', E'', F'' of the vertical projection points D', E', F' in that coordinate system; the pixel coordinate data of D', E', F' in the target image; and the pixel coordinate data of the plane points A, B, C corresponding to A', B', C'. In this embodiment, the corresponding homography matrix can be calculated under the assumption of coplanarity to obtain the mapping from the pixel coordinate system to the BIM coordinate system: a homography expresses a linear transformation between planes, so in a road traffic scene it can express the transformation between the road plane and the corresponding road plane in the image. Expressed as a formula, X = H'x, where x is a pixel coordinate point in the target image, X is a coordinate point in the BIM coordinate system, and H' is a 3 × 3 matrix defined up to scale, so solving for its unknowns requires at least 4 point pairs. It should be noted that in this embodiment a projection matrix can also be used instead of the homography matrix to compute the mapping relationship; compared with the homography, the projection matrix has a direct physical meaning, namely, under the pinhole camera model, computing the intersection of the ray from the camera with the road plane. The homography computes the transformation in the form X = H'x, while the projection matrix computes it in the form X = P'x, where P' is the projection matrix. When solving for the homography matrix, the equations are constructed from the 6 point pairs; the projection matrix can be obtained by composing two transformations: first the transformation from the BIM coordinate system to the world coordinate system (solved using the spatial coordinates of the points A', B', C', D' and their corresponding BIM three-dimensional coordinates), then combining it with the previously obtained mapping between the pixel coordinate system and the world coordinate system to obtain the projection transformation from the BIM coordinate system to the pixel coordinate system.
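A hedged sketch of estimating the pixel-to-BIM homography from the six coplanar point pairs (A, B, C and the images of the projections D', E', F') with OpenCV; the input arrays are placeholders, and the method value 0 selects a plain least-squares solution rather than RANSAC.

```python
import cv2
import numpy as np

def pixel_to_bim_homography(pixel_pts, bim_pts):
    """Estimate H' such that X ~ H' x for coplanar road-plane points.

    pixel_pts: (N, 2) pixel coordinates of A, B, C and of D', E', F' in the target image.
    bim_pts:   (N, 2) corresponding planar BIM coordinates on the road plane.
    At least 4 correspondences are needed; the 6 used here give a least-squares estimate.
    """
    src = np.asarray(pixel_pts, dtype=np.float64)
    dst = np.asarray(bim_pts, dtype=np.float64)
    H, _ = cv2.findHomography(src, dst, 0)
    return H
```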
Three-dimensional information can be generated from a two-dimensional image through the mapping relation between the pixel coordinate system and a Building Information Model (BIM) coordinate system, for example, in the application of a face 3D recognition technology, a face three-dimensional model can be directly generated from a face image shot by a camera based on the mapping relation.
According to the imaging device calibration method provided by the embodiment, two parallel lines with known intervals in a real space and a space point with known distance on one of the parallel lines are used as a calibration reference object, the calibration reference object is easy to obtain in a traffic monitoring scene, and the obtained distance scale information is more accurate, so that the mapping relation between a pixel coordinate system and a world coordinate system is accurately and efficiently constructed. By using the method, the problem that the camera calibration process is limited due to the fact that the selected calibration reference object is limited in the existing camera calibration process aiming at the traffic scene can be solved.
In addition, the method for positioning the calibration reference object in the BIM measures the height information of the imaging device to be calibrated in the BIM coordinate system through the BIM, and can deduce the relative position of the calibration reference object in the BIM coordinate system by combining the real height of the imaging device to be calibrated, thereby providing a reliable basis for marking the calibration reference object in the BIM coordinate system, obtaining the accurate coordinate data of the calibration reference object in the BIM coordinate system, constructing the mapping relation between the monitoring image and the BIM, realizing the fusion of the information sensed by the camera into the three-dimensional model, and forming a global dynamic monitoring system.
A second embodiment of the present application further provides an imaging device calibration apparatus. Since the apparatus embodiment is substantially similar to the method embodiment, its description is relatively brief; for details of the related technical features, refer to the corresponding description of the method embodiment provided above. The following description of the apparatus embodiment is only illustrative.
Refer to fig. 2 for an understanding of this embodiment; fig. 2 is a block diagram of the units of the apparatus provided in this embodiment. As shown in fig. 2, the apparatus provided in this embodiment includes:
the straight line acquisition unit 201 is configured to acquire two planar straight lines corresponding to two parallel spatial straight lines on a target image captured by an imaging device to be calibrated;
a distance data obtaining unit 202, configured to obtain known distances between a plurality of spatial points on the spatial straight line, and obtain known distances between the two spatial straight lines;
the calibration unit 203 is configured to calibrate the imaging device to be calibrated by using the spatial straight line as a calibration reference and combining the planar straight line and the known distance.
Optionally, the obtaining two plane straight lines corresponding to two parallel space straight lines on a target image captured by an imaging device to be calibrated includes: selecting a first plane straight line and a second plane straight line from a target image shot by imaging equipment to be calibrated, wherein the first plane straight line corresponds to a first space straight line in the real world, the second plane straight line corresponds to a second space straight line in the real world, and the first space straight line and the second space straight line are parallel to each other;
the obtaining the known distances between the plurality of spatial points on the spatial straight line and the obtaining the known distances between the two spatial straight lines comprises: taking the known distances among a plurality of preset space points on the first space straight line as a target space distance, and taking the distance between the first space straight line and the second space straight line as a parallel line distance;
the calibrating the imaging device to be calibrated by taking the space straight line as a calibration reference object and combining the plane straight line and the known distance comprises the following steps: and obtaining a mapping relation between a pixel coordinate system corresponding to the target image and a world coordinate system corresponding to the real world according to the target space distance, the parallel line distance, the second plane straight line and pixel coordinate data of a plurality of plane points corresponding to the plurality of space points on the first plane straight line.
Optionally, the predetermined plurality of spatial points on the first spatial straight line include: a first space point, a second space point and a third space point which are sequentially distributed on the first space straight line;
a plurality of planar points on the first planar line corresponding to the plurality of spatial points comprises: a first plane point corresponding to the first spatial point, a second plane point corresponding to the second spatial point, and a third plane point corresponding to the third spatial point;
correspondingly, the target space distance includes: a first spatial distance between the first spatial point and the second spatial point, and a second spatial distance between the first spatial point and the third spatial point.
Optionally, the obtaining, according to the target spatial distance, the parallel line distance, the second planar straight line, and pixel coordinate data of a plurality of planar points on the first planar straight line corresponding to the plurality of spatial points, a mapping relationship between a pixel coordinate system corresponding to the target image and a world coordinate system corresponding to the real world includes:
calculating, according to the first spatial distance, the second spatial distance, the parallel line distance, the second planar straight line and the pixel coordinate data of the plurality of plane points on the first planar straight line corresponding to the plurality of spatial points, the camera focal length, a first included angle between the imaging device to be calibrated and the XwOYw plane of the world coordinate system, and a second included angle between the first spatial straight line or the second spatial straight line and the Xw axis of the XwOYw plane;
deriving an internal reference matrix from the pixel coordinate system to the world coordinate system according to the camera focal length;
and deriving an external parameter matrix from the pixel coordinate system to the world coordinate system according to the first included angle and the second included angle.
Optionally, the calculating, according to the first spatial distance, the second spatial distance, the parallel line distance, the second planar straight line and the pixel coordinate data of the plurality of plane points on the first planar straight line corresponding to the plurality of spatial points, of the camera focal length, the first included angle between the imaging device to be calibrated and the XwOYw plane of the world coordinate system, and the second included angle between the first spatial straight line or the second spatial straight line and the Xw axis of the XwOYw plane includes:
performing straight line fitting based on pixel coordinates of a plurality of points on the second plane straight line to obtain the slope and intercept of the second plane straight line;
determining the plane where the first spatial straight line and the second spatial straight line are located as the XwOYw plane of the world coordinate system, and obtaining, according to the slope and intercept of the second planar straight line and the pixel coordinates of the plurality of points on the second planar straight line, the pixel coordinate data, in the pixel coordinate system corresponding to the target image, of the spatial intersection point of the second spatial straight line with the line through one of the first, second and third spatial points along the Xw-axis direction of the XwOYw plane;
and solving, using the pinhole imaging model and the principle of similar triangles, with the first spatial distance, the second spatial distance, the parallel line distance, the pixel coordinate data of the spatial intersection point in the pixel coordinate system corresponding to the target image, and the pixel coordinate data of the plurality of plane points corresponding to the plurality of spatial points on the first planar straight line as known data, to obtain the camera focal length, the first included angle and the second included angle.
Optionally, the method further includes:
taking the known distance of the imaging device to be calibrated in the real world relative to the plane where the first space straight line and the second space straight line are located as the height of the imaging device to be calibrated;
according to the height of the imaging equipment to be calibrated, carrying out position transformation on the world coordinate system to obtain a building information model coordinate system;
obtaining the spatial coordinate data of a plurality of preset spatial points on the first spatial straight line according to the pixel coordinate data of a plurality of plane points on the first planar straight line corresponding to the plurality of spatial points and the mapping relation between the pixel coordinate system corresponding to the target image and the world coordinate system corresponding to the real world;
obtaining model coordinate data of a plurality of space points corresponding to the building information model coordinate system according to the preset space coordinate data of the space points on the first space straight line;
obtaining model coordinate data of the plurality of spatial points corresponding to the vertical projection points of the second spatial straight line;
obtaining pixel coordinate data of the target image corresponding to the vertical projection point according to the mapping relation between the pixel coordinate system corresponding to the target image and the world coordinate system corresponding to the real world;
and obtaining a mapping relation between the pixel coordinate system corresponding to the target image and the building information model coordinate system according to the model coordinate data of the plurality of spatial points corresponding to the building information model coordinate system, the model coordinate data of the vertical projection point corresponding to the building information model coordinate system, the pixel coordinate data of the vertical projection point corresponding to the target image and the pixel coordinate data of the plurality of plane points corresponding to the plurality of spatial points.
Optionally, the performing position transformation on the world coordinate system according to the height of the imaging device to be calibrated to obtain a building information model coordinate system includes:
translating, according to the height of the imaging device to be calibrated, the origin of the world coordinate system to the vertical intersection point of the imaging device to be calibrated with the XwOYw plane of the world coordinate system, to obtain the target origin of the building information model coordinate system.
Optionally, the obtaining, according to the spatial coordinate data of the plurality of spatial points preset on the first spatial straight line, the model coordinate data of the plurality of spatial points corresponding to the building information model coordinate system includes:
obtaining relative position information between a plurality of preset space points on the first space straight line and the target origin;
and obtaining model coordinate data of the plurality of spatial points corresponding to the building information model coordinate system according to the spatial coordinate data of the plurality of spatial points preset on the first spatial straight line and the relative position information.
Optionally, the obtaining model coordinate data of the plurality of spatial points corresponding to the vertical projection point of the second spatial straight line includes:
obtaining relative position information between the vertical projection point and the target origin according to the relative position information between the plurality of space points and the target origin;
and obtaining model coordinate data of the vertical projection point according to the relative position information between the vertical projection point and the target origin point.
Optionally, the method further includes: and performing edge extraction on the image shot by the imaging equipment to be calibrated to obtain the target image.
Optionally, the first spatial straight line and the second spatial straight line are lane lines in a traffic scene.
The imaging device calibration device provided by the embodiment takes two parallel lines with known distance in a real space and a space point with known distance on one of the parallel lines as a calibration reference object, the calibration reference object is easy to obtain in a traffic monitoring scene, and the obtained distance scale information is more accurate, so that the mapping relation between a pixel coordinate system and a world coordinate system is accurately and efficiently constructed. By using the device, the problem that the camera calibration process is limited due to the fact that the selected calibration reference object is limited in the existing camera calibration process aiming at the traffic scene can be solved.
The foregoing embodiments provide an imaging device calibration method and an imaging device calibration apparatus. In addition, a third embodiment of the present application provides an electronic device. Since this embodiment is basically similar to the method embodiment, it is described relatively briefly; for details of the related technical features, refer to the corresponding description of the method embodiment provided above. The electronic device embodiment is as follows:
For an understanding of this embodiment, please refer to fig. 3, which is a schematic diagram of the electronic device provided in this embodiment.
As shown in fig. 3, the electronic device includes: a processor 301; a memory 302;
the memory 302 is used for storing an imaging device calibration program, and when the program is read and executed by the processor, the program performs the following operations:
acquiring two plane straight lines corresponding to two parallel space straight lines on a target image shot by imaging equipment to be calibrated;
acquiring known distances between a plurality of spatial points on the spatial straight line and acquiring known distances between the two spatial straight lines;
and taking the space straight line as a calibration reference object, and calibrating the imaging equipment to be calibrated by combining the plane straight line and the known distance.
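For orientation, the three operations above consume the following pieces of data; the container below is only a schematic summary with illustrative field names and values (a lane spacing of 3.75 m and point spacings of 6 m and 15 m are assumptions, not taken from the disclosure).

```python
from dataclasses import dataclass
from typing import List, Tuple

# Schematic container for the calibration inputs listed above; all field names
# and values are illustrative and not taken from the disclosure.
@dataclass
class CalibrationInput:
    line1_pixels: List[Tuple[float, float]]  # plane points corresponding to the spatial points on line 1
    line2_pixels: List[Tuple[float, float]]  # points sampled on the plane straight line for line 2
    spacings_m: Tuple[float, float]          # first and second spatial distances along line 1
    parallel_gap_m: float                    # known distance between the two parallel space lines

inp = CalibrationInput(
    line1_pixels=[(412, 655), (430, 540), (447, 402)],
    line2_pixels=[(655, 610), (640, 505), (628, 410)],
    spacings_m=(6.0, 15.0),
    parallel_gap_m=3.75,
)
```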
Optionally, the obtaining two plane straight lines corresponding to two parallel space straight lines on a target image captured by an imaging device to be calibrated includes: selecting a first plane straight line and a second plane straight line from a target image shot by imaging equipment to be calibrated, wherein the first plane straight line corresponds to a first space straight line in the real world, the second plane straight line corresponds to a second space straight line in the real world, and the first space straight line and the second space straight line are parallel to each other;
the obtaining the known distances between the plurality of spatial points on the spatial straight line and the obtaining the known distances between the two spatial straight lines comprises: taking the known distances among a plurality of preset space points on the first space straight line as a target space distance, and taking the distance between the first space straight line and the second space straight line as a parallel line distance;
the calibrating the imaging device to be calibrated by taking the space straight line as a calibration reference object and combining the plane straight line and the known distance comprises the following steps: and obtaining a mapping relation between a pixel coordinate system corresponding to the target image and a world coordinate system corresponding to the real world according to the target space distance, the parallel line distance, the second plane straight line and pixel coordinate data of a plurality of plane points corresponding to the plurality of space points on the first plane straight line.
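The mapping relation referred to here is, in the standard pinhole formulation (written out for reference, not quoted from the disclosure), the relation between homogeneous pixel coordinates and world coordinates through the intrinsic matrix K and the extrinsic parameters [R | t]:

$$
s\begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
= K\,[R \mid t]\begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix},
\qquad
K = \begin{bmatrix} f & 0 & c_x \\ 0 & f & c_y \\ 0 & 0 & 1 \end{bmatrix}
$$

where s is a scale factor and (c_x, c_y) is the principal point, commonly taken as the image centre when it is not otherwise calibrated.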
Optionally, the predetermined plurality of spatial points on the first spatial straight line include: a first space point, a second space point and a third space point which are sequentially distributed on the first space straight line;
a plurality of planar points on the first planar line corresponding to the plurality of spatial points comprises: a first plane point corresponding to the first spatial point, a second plane point corresponding to the second spatial point, and a third plane point corresponding to the third spatial point;
correspondingly, the target space distance includes: a first spatial distance between the first spatial point and the second spatial point, and a second spatial distance between the first spatial point and the third spatial point.
Optionally, the obtaining, according to the target spatial distance, the parallel line distance, the second planar straight line, and pixel coordinate data of a plurality of planar points on the first planar straight line corresponding to the plurality of spatial points, a mapping relationship between a pixel coordinate system corresponding to the target image and a world coordinate system corresponding to the real world includes:
calculating and obtaining, according to the first space distance, the second space distance, the parallel line distance, the second plane straight line and the pixel coordinate data of the plurality of plane points on the first plane straight line corresponding to the plurality of space points, the camera focal length, a first included angle between the imaging device to be calibrated and the XwOYw plane of the world coordinate system, and a second included angle between the first space straight line or the second space straight line and the Xw axis of the XwOYw plane;
deriving an internal reference matrix from the pixel coordinate system to the world coordinate system according to the camera focal length;
and deriving an external parameter matrix from the pixel coordinate system to the world coordinate system according to the first included angle and the second included angle.
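A hedged sketch of how an intrinsic matrix and an extrinsic rotation might be assembled from the three quantities named above. The disclosure does not give the exact matrix composition; the sketch assumes a square-pixel camera with the principal point at the image centre, models the first included angle as a tilt about the camera x-axis and the second included angle as a rotation about the world Z axis, and composes them in that order. All numeric values are hypothetical.

```python
import numpy as np

def intrinsic_matrix(f, image_w, image_h):
    """Pinhole intrinsics with square pixels and principal point at the image centre."""
    return np.array([[f, 0.0, image_w / 2.0],
                     [0.0, f, image_h / 2.0],
                     [0.0, 0.0, 1.0]])

def extrinsic_rotation(tilt, yaw):
    """One conventional composition: rotate by the second included angle (yaw,
    lane line vs. Xw axis) about Z, then by the first included angle (tilt,
    device vs. XwOYw plane) about X."""
    rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(tilt), -np.sin(tilt)],
                   [0.0, np.sin(tilt), np.cos(tilt)]])
    rz = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw), np.cos(yaw), 0.0],
                   [0.0, 0.0, 1.0]])
    return rx @ rz

K = intrinsic_matrix(f=1200.0, image_w=1920, image_h=1080)   # hypothetical focal length in pixels
R = extrinsic_rotation(tilt=np.radians(18.0), yaw=np.radians(4.0))
print(K, R, sep="\n")
```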
Optionally, the calculating and obtaining, according to the first space distance, the second space distance, the parallel line distance, the second plane straight line, and the pixel coordinate data of the plurality of plane points on the first plane straight line corresponding to the plurality of space points, the camera focal length, the first included angle between the imaging device to be calibrated and the XwOYw plane of the world coordinate system, and the second included angle between the first space straight line or the second space straight line and the Xw axis of the XwOYw plane includes:
performing straight line fitting based on pixel coordinates of a plurality of points on the second plane straight line to obtain the slope and intercept of the second plane straight line;
determining the plane where the first space straight line and the second space straight line are located as the XwOYw plane of the world coordinate system, and obtaining, according to the slope and intercept of the second plane straight line and the pixel coordinates of the plurality of points on the second plane straight line, pixel coordinate data, in the pixel coordinate system corresponding to the target image, of the spatial intersection point of one of the first space point, the second space point and the third space point with the second space straight line along the Xw axis direction of the XwOYw plane;
taking the first space distance, the second space distance, the parallel line distance, the pixel coordinate data of the spatial intersection point in the pixel coordinate system corresponding to the target image, and the pixel coordinate data of the plurality of plane points on the first plane straight line corresponding to the plurality of space points as known data, and solving by using the pinhole imaging model and the principle of similar triangles to obtain the camera focal length, the first included angle and the second included angle.
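The straight-line fitting mentioned in the first sub-step can be done with an ordinary least-squares fit; the sketch below recovers the slope and intercept of the second plane straight line from sampled pixel coordinates (the samples are hypothetical). The subsequent solve via the pinhole model and similar triangles is specific to the disclosure and is not reproduced here.

```python
import numpy as np

# Hypothetical pixel samples (u, v) lying roughly on the second plane straight line.
u = np.array([655.0, 648.0, 640.0, 633.0, 628.0])
v = np.array([610.0, 560.0, 505.0, 455.0, 410.0])

# Least-squares fit of v = slope * u + intercept.
slope, intercept = np.polyfit(u, v, deg=1)
print(f"slope = {slope:.3f}, intercept = {intercept:.1f}")
```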
Optionally, the method further includes:
taking the known distance of the imaging device to be calibrated in the real world relative to the plane where the first space straight line and the second space straight line are located as the height of the imaging device to be calibrated;
according to the height of the imaging equipment to be calibrated, carrying out position transformation on the world coordinate system to obtain a building information model coordinate system;
obtaining the spatial coordinate data of a plurality of preset spatial points on the first spatial straight line according to the pixel coordinate data of a plurality of plane points on the first planar straight line corresponding to the plurality of spatial points and the mapping relation between the pixel coordinate system corresponding to the target image and the world coordinate system corresponding to the real world;
obtaining model coordinate data of a plurality of space points corresponding to the building information model coordinate system according to the preset space coordinate data of the space points on the first space straight line;
obtaining model coordinate data of the plurality of spatial points corresponding to the vertical projection points of the second spatial straight line;
obtaining pixel coordinate data of the target image corresponding to the vertical projection point according to the mapping relation between the pixel coordinate system corresponding to the target image and the world coordinate system corresponding to the real world;
and obtaining a mapping relation between the pixel coordinate system corresponding to the target image and the building information model coordinate system according to the model coordinate data of the plurality of spatial points corresponding to the building information model coordinate system, the model coordinate data of the vertical projection point corresponding to the building information model coordinate system, the pixel coordinate data of the vertical projection point corresponding to the target image and the pixel coordinate data of the plurality of plane points corresponding to the plurality of spatial points.
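The vertical (perpendicular) projection onto the second spatial straight line used in this optional branch reduces, on the ground plane, to projecting 2-D points onto a line. A minimal sketch with hypothetical coordinates (the second lane line offset by 3.75 m from the first and running along the Y direction):

```python
import numpy as np

def project_onto_line(points, line_pt, line_dir):
    """Perpendicularly project 2-D ground-plane points onto the straight line
    through line_pt with direction line_dir (normalised internally)."""
    d = np.asarray(line_dir, dtype=float)
    d = d / np.linalg.norm(d)
    p0 = np.asarray(line_pt, dtype=float)
    pts = np.asarray(points, dtype=float)
    t = (pts - p0) @ d                 # signed distance along the line direction
    return p0 + np.outer(t, d)         # foot points of the perpendiculars

# Hypothetical values: points on the first lane line projected onto the second.
line1_pts = [[0.0, 0.0], [0.0, 6.0], [0.0, 15.0]]
print(project_onto_line(line1_pts, line_pt=[3.75, 0.0], line_dir=[0.0, 1.0]))
```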
Optionally, the performing position transformation on the world coordinate system according to the height of the imaging device to be calibrated to obtain a building information model coordinate system includes:
according to the height of the imaging device to be calibrated, transforming the origin of the world coordinate system to the position of the vertical intersection point between the imaging device to be calibrated and the XwOYw plane of the world coordinate system, and obtaining the target origin of the building information model coordinate system.
Optionally, the obtaining, according to the spatial coordinate data of the plurality of spatial points preset on the first spatial straight line, the model coordinate data of the plurality of spatial points corresponding to the building information model coordinate system includes:
obtaining relative position information between a plurality of preset space points on the first space straight line and the target origin;
and obtaining model coordinate data of the plurality of spatial points corresponding to the building information model coordinate system according to the spatial coordinate data of the plurality of spatial points preset on the first spatial straight line and the relative position information.
Optionally, the obtaining model coordinate data of the plurality of spatial points corresponding to the vertical projection point of the second spatial straight line includes:
obtaining relative position information between the vertical projection point and the target origin according to the relative position information between the plurality of space points and the target origin;
and obtaining model coordinate data of the vertical projection point according to the relative position information between the vertical projection point and the target origin point.
Optionally, the method further includes: and performing edge extraction on the image shot by the imaging equipment to be calibrated to obtain the target image.
Optionally, the first spatial straight line and the second spatial straight line are lane lines in a traffic scene.
The electronic device provided by this embodiment takes two parallel lines with a known distance between them in real space, together with spatial points with known distances on one of the parallel lines, as the calibration reference object. Such a reference object is easy to obtain in a traffic monitoring scene, and the distance scale information obtained from it is more accurate, so that the mapping relation between the pixel coordinate system and the world coordinate system can be constructed accurately and efficiently. By using the electronic device, the problem that the camera calibration process is restricted by the limited choice of calibration reference objects in existing camera calibration for traffic scenes can be solved.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, computer readable media does not include transitory computer readable media (transitory media), such as modulated data signals and carrier waves.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Although the present application has been described with reference to preferred embodiments, they are not intended to limit the present application. Those skilled in the art can make variations and modifications without departing from the spirit and scope of the present application; therefore, the scope of protection of the present application should be determined by the claims that follow.

Claims (13)

1. An imaging device calibration method, comprising:
acquiring two plane straight lines corresponding to two parallel space straight lines on a target image shot by imaging equipment to be calibrated;
acquiring known distances between a plurality of spatial points on the spatial straight line and acquiring known distances between the two spatial straight lines;
and taking the space straight line as a calibration reference object, and calibrating the imaging equipment to be calibrated by combining the plane straight line and the known distance.
2. The imaging device calibration method according to claim 1, wherein the obtaining of two plane straight lines corresponding to two parallel spatial straight lines on a target image captured by the imaging device to be calibrated comprises:
selecting a first plane straight line and a second plane straight line from a target image shot by imaging equipment to be calibrated, wherein the first plane straight line corresponds to a first space straight line in the real world, the second plane straight line corresponds to a second space straight line in the real world, and the first space straight line and the second space straight line are parallel to each other;
the obtaining the known distances between the plurality of spatial points on the spatial straight line and the obtaining the known distances between the two spatial straight lines comprises:
taking the known distances among a plurality of preset space points on the first space straight line as a target space distance, and taking the distance between the first space straight line and the second space straight line as a parallel line distance;
the calibrating the imaging device to be calibrated by taking the space straight line as a calibration reference object and combining the plane straight line and the known distance comprises the following steps:
and obtaining a mapping relation between a pixel coordinate system corresponding to the target image and a world coordinate system corresponding to the real world according to the target space distance, the parallel line distance, the second plane straight line and pixel coordinate data of a plurality of plane points corresponding to the plurality of space points on the first plane straight line.
3. The imaging apparatus calibration method according to claim 2, wherein the predetermined plurality of spatial points on the first spatial straight line include: a first space point, a second space point and a third space point which are sequentially distributed on the first space straight line;
a plurality of planar points on the first planar line corresponding to the plurality of spatial points comprises: a first plane point corresponding to the first spatial point, a second plane point corresponding to the second spatial point, and a third plane point corresponding to the third spatial point;
correspondingly, the target space distance includes: a first spatial distance between the first spatial point and the second spatial point, and a second spatial distance between the first spatial point and the third spatial point.
4. The imaging device calibration method according to claim 3, wherein said obtaining a mapping relationship between a pixel coordinate system corresponding to the target image and a world coordinate system corresponding to the real world according to the target spatial distance, the parallel line distance, the second planar straight line, and pixel coordinate data of a plurality of planar points corresponding to the plurality of spatial points on the first planar straight line comprises:
calculating and obtaining, according to the first space distance, the second space distance, the parallel line distance, the second plane straight line and the pixel coordinate data of the plurality of plane points on the first plane straight line corresponding to the plurality of space points, the camera focal length, a first included angle between the imaging device to be calibrated and the XwOYw plane of the world coordinate system, and a second included angle between the first space straight line or the second space straight line and the Xw axis of the XwOYw plane;
deriving an internal reference matrix from the pixel coordinate system to the world coordinate system according to the camera focal length;
and deriving an external parameter matrix from the pixel coordinate system to the world coordinate system according to the first included angle and the second included angle.
5. The imaging device calibration method according to claim 4, wherein the calculating and obtaining, according to the first space distance, the second space distance, the parallel line distance, the second plane straight line, and the pixel coordinate data of the plurality of plane points on the first plane straight line corresponding to the plurality of space points, the camera focal length, the first included angle between the imaging device to be calibrated and the XwOYw plane of the world coordinate system, and the second included angle between the first space straight line or the second space straight line and the Xw axis of the XwOYw plane comprises:
performing straight line fitting based on pixel coordinates of a plurality of points on the second plane straight line to obtain the slope and intercept of the second plane straight line;
determining the plane where the first space straight line and the second space straight line are located as the XwOYw plane of the world coordinate system, and obtaining, according to the slope and intercept of the second plane straight line and the pixel coordinates of the plurality of points on the second plane straight line, pixel coordinate data, in the pixel coordinate system corresponding to the target image, of the spatial intersection point of one of the first space point, the second space point and the third space point with the second space straight line along the Xw axis direction of the XwOYw plane;
taking the first space distance, the second space distance, the parallel line distance, the pixel coordinate data of the spatial intersection point in the pixel coordinate system corresponding to the target image, and the pixel coordinate data of the plurality of plane points on the first plane straight line corresponding to the plurality of space points as known data, and solving by using the pinhole imaging model and the principle of similar triangles to obtain the camera focal length, the first included angle and the second included angle.
6. The imaging apparatus calibration method according to claim 2, further comprising:
taking the known distance of the imaging device to be calibrated in the real world relative to the plane where the first space straight line and the second space straight line are located as the height of the imaging device to be calibrated;
according to the height of the imaging equipment to be calibrated, carrying out position transformation on the world coordinate system to obtain a building information model coordinate system;
obtaining the spatial coordinate data of a plurality of preset spatial points on the first spatial straight line according to the pixel coordinate data of a plurality of plane points on the first planar straight line corresponding to the plurality of spatial points and the mapping relation between the pixel coordinate system corresponding to the target image and the world coordinate system corresponding to the real world;
obtaining model coordinate data of a plurality of space points corresponding to the building information model coordinate system according to the preset space coordinate data of the space points on the first space straight line;
obtaining model coordinate data of the plurality of spatial points corresponding to the vertical projection points of the second spatial straight line;
obtaining pixel coordinate data of the target image corresponding to the vertical projection point according to the mapping relation between the pixel coordinate system corresponding to the target image and the world coordinate system corresponding to the real world;
and obtaining a mapping relation between the pixel coordinate system corresponding to the target image and the building information model coordinate system according to the model coordinate data of the plurality of spatial points corresponding to the building information model coordinate system, the model coordinate data of the vertical projection point corresponding to the building information model coordinate system, the pixel coordinate data of the vertical projection point corresponding to the target image and the pixel coordinate data of the plurality of plane points corresponding to the plurality of spatial points.
7. The imaging device calibration method according to claim 6, wherein the obtaining of the building information model coordinate system by performing position transformation on the world coordinate system according to the height of the imaging device to be calibrated comprises:
according to the height of the imaging device to be calibrated, transforming the origin of the world coordinate system to the position of the vertical intersection point between the imaging device to be calibrated and the XwOYw plane of the world coordinate system, and obtaining the target origin of the building information model coordinate system.
8. The imaging device calibration method according to claim 7, wherein obtaining model coordinate data of the plurality of spatial points corresponding to the building information model coordinate system according to spatial coordinate data of a plurality of spatial points preset on the first spatial straight line comprises:
obtaining relative position information between a plurality of preset space points on the first space straight line and the target origin;
and obtaining model coordinate data of the plurality of spatial points corresponding to the building information model coordinate system according to the spatial coordinate data of the plurality of spatial points preset on the first spatial straight line and the relative position information.
9. The imaging device calibration method according to claim 8, wherein said obtaining model coordinate data of the plurality of spatial points corresponding to the perpendicular projection point of the second spatial straight line comprises:
obtaining relative position information between the vertical projection point and the target origin according to the relative position information between the plurality of space points and the target origin;
and obtaining model coordinate data of the vertical projection point according to the relative position information between the vertical projection point and the target origin point.
10. The imaging apparatus calibration method according to claim 1, further comprising:
and performing edge extraction on the image shot by the imaging equipment to be calibrated to obtain the target image.
11. The imaging device calibration method according to claim 1, wherein the spatial straight line is a lane line in a traffic scene.
12. An imaging device calibration apparatus, comprising:
the straight line acquisition unit is used for acquiring two plane straight lines corresponding to two parallel space straight lines on a target image shot by the imaging device to be calibrated;
a distance data acquisition unit for acquiring known distances between a plurality of spatial points on the spatial straight line and acquiring known distances between the two spatial straight lines;
and the calibration unit is used for calibrating the imaging equipment to be calibrated by taking the space straight line as a calibration reference object and combining the plane straight line and the known distance.
13. An electronic device comprising a processor and a memory; wherein,
the memory is configured to store one or more computer instructions, and the one or more computer instructions are executed by the processor to implement the method of any one of claims 1 to 11.
CN202010222903.XA 2020-03-26 2020-03-26 Imaging device calibration method and device Pending CN113450415A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010222903.XA CN113450415A (en) 2020-03-26 2020-03-26 Imaging device calibration method and device

Publications (1)

Publication Number Publication Date
CN113450415A true CN113450415A (en) 2021-09-28

Family

ID=77807282

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010222903.XA Pending CN113450415A (en) 2020-03-26 2020-03-26 Imaging device calibration method and device

Country Status (1)

Country Link
CN (1) CN113450415A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115063290A (en) * 2022-08-17 2022-09-16 腾讯科技(深圳)有限公司 Image processing method, device, equipment, system and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101409009A (en) * 2008-11-05 2009-04-15 青岛海信电子产业控股股份有限公司 Method and system for road surface calibration
CN101727671A (en) * 2009-12-01 2010-06-09 湖南大学 Single camera calibration method based on road surface collinear three points and parallel line thereof
WO2018196391A1 (en) * 2017-04-28 2018-11-01 华为技术有限公司 Method and device for calibrating external parameters of vehicle-mounted camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination