CN117523004A - Camera external parameter calibration method, device, equipment and vehicle

Info

Publication number: CN117523004A
Application number: CN202311605707.0A
Authority: CN (China)
Legal status: Pending
Prior art keywords: image, camera, pixel, coordinate, mapping
Inventors: 李政斌, 程风, 刘忠泽, 余东应, 周珣, 万国伟, 朱振广
Original/current assignee: Beijing Baidu Netcom Science and Technology Co Ltd
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd; priority to CN202311605707.0A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image


Abstract

The disclosure provides a camera external parameter calibration method, device, equipment and vehicle, relating to the technical field of artificial intelligence, and in particular to the technical fields of automatic driving, map positioning and the like. The camera external parameter calibration method includes: for the same calibration scene, acquiring a first original image captured by a first camera and a second original image captured by a second camera; acquiring a first mapped image and a second mapped image; performing matching processing on the first mapped image and the second mapped image to determine a first pixel and a second pixel that match each other; acquiring a position coordinate offset based on a first position coordinate of the first pixel and a second position coordinate of the second pixel; and determining an external parameter between the first camera and the second camera based on the position coordinate offset and the focal length of the second camera. The method and device can improve the accuracy of camera external parameter calibration.

Description

Camera external parameter calibration method, device, equipment and vehicle
Technical Field
The disclosure relates to the technical field of artificial intelligence, in particular to the technical fields of automatic driving, map positioning and the like, and more particularly to a camera external parameter calibration method, device, equipment and vehicle.
Background
A vehicle (e.g., an autonomous vehicle) carries various types of cameras on its forward support, including a short-focus camera for identifying nearby objects and a long-focus camera for identifying distant objects when the vehicle travels on a high-speed or open road.
In order to ensure normal operation of the perception function and safe, stable running of the vehicle, the transformation relation between the long-focus camera and the short-focus camera needs to be accurately calculated, namely external parameter calibration.
In the related art, camera external parameter calibration is generally performed using a calibration-room-based method.
Disclosure of Invention
The disclosure provides a camera external parameter calibration method, device, equipment and vehicle.
According to an aspect of the present disclosure, there is provided a camera external parameter calibration method, including: acquiring, for the same calibration scene, a first original image captured by a first camera and a second original image captured by a second camera; acquiring a first mapped image based on the first original image, and acquiring a second mapped image based on the second original image, wherein the first mapped image and the second mapped image are located in the same coordinate system; performing matching processing on the first mapped image and the second mapped image to determine a first pixel and a second pixel that match each other; acquiring a position coordinate offset based on a first position coordinate of the first pixel in the first mapped image and a second position coordinate of the second pixel in the second mapped image; and determining an external parameter between the first camera and the second camera based on the position coordinate offset and the focal length of the second camera.
According to another aspect of the present disclosure, there is provided a camera external parameter calibration apparatus, including: an acquisition module for acquiring, for the same calibration scene, a first original image captured by the first camera and a second original image captured by the second camera; a mapping module for acquiring a first mapped image based on the first original image and a second mapped image based on the second original image, wherein the first mapped image and the second mapped image are located in the same coordinate system; a matching module for performing matching processing on the first mapped image and the second mapped image to determine a first pixel and a second pixel that match each other; a deviation module for acquiring a position coordinate offset based on a first position coordinate of the first pixel in the first mapped image and a second position coordinate of the second pixel in the second mapped image; and a determining module for determining an external parameter between the first camera and the second camera based on the position coordinate offset and the focal length of the second camera.
According to another aspect of the present disclosure, there is provided a vehicle including a first camera and a second camera, wherein an external parameter between the first camera and the second camera is calibrated using the method according to any one of the above aspects.
According to another aspect of the present disclosure, there is provided an electronic device including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of the above aspects.
According to another aspect of the present disclosure, there is provided a non-transitory computer readable storage medium storing computer instructions for causing the computer to perform the method according to any one of the above aspects.
According to another aspect of the present disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, implements a method according to any of the above aspects.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are provided for a better understanding of the present solution and are not to be construed as limiting the present disclosure, in which:
FIG. 1 is a schematic diagram according to a first embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a camera external parameter calibration scenario provided according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a first image coordinate system and a second image coordinate system provided in accordance with an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of the overall architecture of a camera external parameter calibration process provided according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram according to a second embodiment of the present disclosure;
FIG. 6 is a schematic illustration of two original images acquired by two cameras, respectively, provided in accordance with an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of acquiring a window image based on a first mapped image provided in accordance with an embodiment of the present disclosure;
FIG. 8 is a schematic diagram of optimal matching locations of a window image and a template image provided in accordance with an embodiment of the present disclosure;
FIG. 9 is a schematic diagram according to a third embodiment of the present disclosure;
FIG. 10 is a schematic diagram according to a fourth embodiment of the present disclosure;
FIG. 11 is a schematic diagram of an electronic device for implementing the camera external parameter calibration method according to an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings; various details of the embodiments of the present disclosure are included to facilitate understanding and should be considered merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted from the following description for clarity and conciseness.
In the calibration-room-based method, a three-dimensional scanner is used to scan and model the calibration room in advance; the model contains the position information of the marker codes of planar targets. The vehicle enters the calibration room, each camera extracts the corner points of the marker codes of the planar targets in its image and matches them against the scan model, and the external parameters of the camera relative to the calibration room are obtained with the minimum reprojection error as the constraint.
The calibration-room-based method can obtain a good calibration result for a short-focus camera. For a long-focus camera, however, the long focal length prevents focusing at the short distances available in a calibration room, so imaging is blurred; moreover, the visible area captured is small, so few marker-code corner points can be extracted and their positions are inaccurate, which is unfavorable for corner matching and accurate external parameter acquisition.
Therefore, for the long-focus camera, the external parameters relative to the calibration room cannot be accurately obtained, and consequently the external parameters of the long-focus camera relative to the short-focus camera cannot be accurately obtained either.
In order to improve the accuracy of camera external parameter calibration, the present disclosure provides the following embodiments.
Fig. 1 is a schematic diagram of a first embodiment of the present disclosure, and the present embodiment provides a camera external parameter calibration method, which includes:
101. For the same calibration scene, acquire a first original image captured by the first camera and a second original image captured by the second camera.
102. Acquire a first mapped image based on the first original image, and acquire a second mapped image based on the second original image, wherein the first mapped image and the second mapped image are located in the same coordinate system.
103. Perform matching processing on the first mapped image and the second mapped image to determine a first pixel and a second pixel that match each other.
104. Acquire a position coordinate offset based on a first position coordinate of the first pixel in the first mapped image and a second position coordinate of the second pixel in the second mapped image.
105. Determine an external parameter between the first camera and the second camera based on the position coordinate offset and the focal length of the second camera.
In this embodiment, the first original image and the second original image are images of the same calibration scene, so pixels that match each other exist in the two images, and these matched pixels can be obtained through image mapping and image matching. Because the position coordinate offset between the matched pixels is caused by the relative relation between the cameras, that relative relation can be determined based on the position coordinate offset, thereby realizing external parameter calibration between the cameras. Performing external parameter calibration through the relation between images is not restricted by a calibration room and can improve the accuracy of camera external parameter calibration.
The first camera and the second camera refer to two cameras with relative position relation and/or relative posture relation. The relative positional relationship may be characterized by a relative translational amount and the relative attitude relationship may be characterized by a relative rotational angle.
Because the first camera and the second camera acquire images of the same calibration scene, they are two cameras with close deployment positions and optical centers that are close to each other. For example, the first camera and the second camera are two cameras mounted in close proximity on the forward support of a vehicle.
Further, the first camera and the second camera may be two cameras with different focal lengths on the forward support: the first camera, with the longer focal length, may be called the long-focus (tele) camera, and the second camera, with the shorter focal length, may be called the short-focus camera. Specifically, the focal length of the long-focus camera may be 25 mm and the focal length of the short-focus camera may be 6 mm.
For the long-focus camera, in order to improve the imaging effect, the calibration scene may specifically be an outdoor scene; for example, a natural scene with an effective field of view more than 150 meters from the vehicle may be selected, with the field of view within 20 m to the left and right of the vehicle unobstructed, so that the long-focus camera focuses accurately. Meanwhile, the field of view should contain as many objects with obvious contours and color contrast as possible, and scenes with simple textures (such as road and sky) should be avoided, so as to ensure the accuracy of template matching and avoid mismatching.
Taking as an example a long-focus camera and a short-focus camera mounted close to each other on the forward support of a vehicle, the calibration scene may be selected as the distant outdoor scene described above.
In the case of a distant outdoor scene, the relative translation between the closely mounted long-focus and short-focus cameras can be ignored, and only the relative rotation angle needs to be calibrated; the relative roll angle (roll) can also be ignored, so the relative rotation angles to be calibrated comprise the relative heading angle (yaw) and the relative pitch angle (pitch).
In addition, in the above case, the long-focus camera and the short-focus camera have a narrow-baseline characteristic, meaning that the two cameras are positioned close together without large rotation. In this case, the neighborhoods of corresponding pixels in the first original image captured by the long-focus camera and the second original image captured by the short-focus camera are similar; therefore, image matching can be performed with a neighborhood matching algorithm to obtain a first pixel and a second pixel that match each other.
Because the first original image and the second original image are acquired by two cameras and are located in different coordinate systems, the two original images need to be mapped to the same coordinate system for subsequent image matching. The same coordinate system may be an image coordinate system, a pixel coordinate system, etc., and the pixel coordinate system is taken as an example in this embodiment.
After the first mapped image and the second mapped image in the same coordinate system are obtained, the two images can be matched. Because of the neighborhood similarity described above, a neighborhood matching algorithm may be used for matching. The image matching may employ a feature-based image matching algorithm or a gray-scale-based image matching algorithm, etc. The present embodiment takes a gray-scale-based image matching algorithm as an example; specifically, the normalized cross-correlation (Normalized Cross Correlation, NCC) matching algorithm may be selected. The NCC matching algorithm is a gray-scale-based matching algorithm that can find the portion of the template image that best matches the window image.
After the first pixel in the first mapping image and the second pixel in the second mapping image which are matched with each other are obtained, the position coordinate offset between the two pixels can be calculated, and then the external parameters between cameras are calculated based on the offset.
Depending on the coordinate system used for mapping, the position coordinate offset may be an image coordinate offset or a pixel coordinate offset.
Taking the case where the external parameters comprise the relative heading angle and the relative pitch angle as an example, the external parameters can be calculated from the position coordinate offset and the focal length of the second camera. Further, if the position coordinate offset is an image coordinate offset, the pixel coordinate offset may first be calculated from the image coordinate offset, and the external parameters then calculated based on the pixel coordinate offset and the focal length components of the second camera (denoted $f_x$ and $f_y$); if the position coordinate offset is a pixel coordinate offset, the external parameters may be calculated directly based on the pixel coordinate offset and the focal length components $f_x$ and $f_y$. The focal length components $f_x$ and $f_y$ are calculated from the focal length $f$ of the second camera.
For a specific calculation process, reference may be made to the following embodiments.
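For orientation before the detailed embodiments, the following Python sketch (using OpenCV and NumPy) strings steps 101-105 together. It is illustrative only: the intrinsic matrices K1 and K2, the focal-length components fx and fy, and the simple central crop used as the window image are assumptions of the sketch, not requirements of the disclosure.

```python
import cv2
import numpy as np

def calibrate_extrinsics(img1, img2, K1, K2, fx, fy):
    """Steps 101-105: relative yaw (psi) and pitch (theta) of the
    long-focus camera 1 with respect to the short-focus camera 2."""
    h2, w2 = img2.shape[:2]

    # Step 102: map the first image into camera 2's pixel coordinate
    # system; the second image already lives in that coordinate system.
    H = K2 @ np.linalg.inv(K1)
    map1 = cv2.warpPerspective(img1, H, (w2, h2))
    map2 = img2

    # Cut a window out of map1 (here simply its central region) and keep
    # the coordinate of the window's top-left corner as the first pixel.
    y0, x0 = h2 // 4, w2 // 4
    window = map1[y0:3 * h2 // 4, x0:3 * w2 // 4]
    u1, v1 = x0, y0

    # Step 103: zero-mean NCC matching; max_loc gives the second pixel.
    res = cv2.matchTemplate(map2, window, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(res)
    u2, v2 = max_loc

    # Step 104: pixel coordinate offset between the matched pixels.
    du, dv = u1 - u2, v1 - v2

    # Step 105: external parameters from the offset and camera 2's
    # focal length components.
    psi = np.arctan(du / fx)     # relative heading angle (yaw)
    theta = np.arctan(dv / fy)   # relative pitch angle
    return psi, theta
```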
For better understanding of the embodiments of the present disclosure, an application scenario of the embodiments is described. The embodiments may be applied to an automatic driving scenario.
Fig. 2 is a schematic diagram of an application scenario for implementing an embodiment of the present disclosure.
As shown in fig. 2, assuming that the first camera and the second camera are represented by a tele camera 201 and a short-focus camera 202, respectively, the tele camera 201 and the short-focus camera 202 take a photograph of the same outdoor scene, respectively, resulting in a first original image and a second original image.
The long-focus camera and the short-focus camera may both be mounted on the forward support of an autonomous vehicle, relatively close to each other. In a long-distance scene, such as an outdoor scene more than 150 meters away, the relative translation between the two cameras mounted on the vehicle is negligible because the scene is far from the vehicle; the external parameters that need calibration comprise the relative heading angle (yaw) and the relative pitch angle (pitch), denoted ψ and θ respectively.
As shown in fig. 3, the coordinate system of the first original image collected by the long-focus camera 201 may be called the first image coordinate system, with origin and coordinate axes denoted O', x', y'; the coordinate system of the second original image collected by the short-focus camera 202 may be called the second image coordinate system, with origin and coordinate axes denoted O, x, y. The positive directions of the x axis and x' axis generally point toward the right of the vehicle body, the positive directions of the y axis and y' axis generally point toward the upper side of the vehicle body (the sky direction), and the positive directions of the three-dimensional z axis and z' axis (not shown in the figure) point toward the front of the vehicle body (the direction of the scene to be imaged). Since there is an installation error between the long-focus camera and the short-focus camera, there is a relative rotation angle, which specifically includes a relative heading angle ψ, the rotation angle of the first image coordinate system counterclockwise around the y axis of the second image coordinate system, and a relative pitch angle θ, the rotation angle of the first image coordinate system counterclockwise around the x axis of the second image coordinate system.
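As an aside, once the relative heading angle ψ and relative pitch angle θ have been calibrated, the relative rotation between the two image coordinate systems can be assembled from elementary rotations. The composition order R = Ry(ψ)·Rx(θ) in the sketch below is an assumption made for illustration; the disclosure itself defines only the two angles.

```python
import numpy as np

def rotation_from_yaw_pitch(psi, theta):
    """Relative rotation of the first camera frame w.r.t. the second.
    Assumes R = Ry(psi) @ Rx(theta); the composition order is an
    assumption of this sketch, not fixed by the disclosure."""
    cy, sy = np.cos(psi), np.sin(psi)
    cp, sp = np.cos(theta), np.sin(theta)
    Ry = np.array([[cy, 0.0, sy],
                   [0.0, 1.0, 0.0],
                   [-sy, 0.0, cy]])   # rotation about the y axis (heading)
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, cp, -sp],
                   [0.0, sp, cp]])    # rotation about the x axis (pitch)
    return Ry @ Rx
```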
In order to calibrate the relative heading angle and the relative pitch angle, the present embodiment determines them based on the first original image captured by the long-focus camera and the second original image captured by the short-focus camera.
As shown in fig. 4, the first original image and the second original image are represented by a long-focus image and a short-focus image, respectively, and the mapped coordinate system is the pixel coordinate system corresponding to the short-focus camera. The long-focus image is mapped to the pixel coordinate system corresponding to the short-focus camera to obtain the first mapped image, and an image crop of the first mapped image is taken to obtain the window image; the short-focus image is mapped to the pixel coordinate system corresponding to the short-focus camera to obtain the second mapped image, which serves as the template image.
After the window image and the template image are obtained, the two images are subjected to matching processing. This embodiment adopts normalized cross-correlation (Normalized Cross Correlation, NCC) matching, which is a matching algorithm based on image gray scale and can find the portion of the template image that best matches the window image.
After NCC matching, two mutually matched pixels are obtained, namely the first pixel, located in the first mapped image, and the second pixel, located in the second mapped image. The pixel coordinate of the first pixel in the first mapped image is called the first pixel coordinate, and the pixel coordinate of the second pixel in the second mapped image is called the second pixel coordinate. The difference between the two pixel coordinates, called the pixel coordinate offset, can then be calculated. The directions of the pixel coordinates are the u direction and the v direction, and the corresponding pixel coordinate offsets are denoted Δu and Δv respectively. The relative heading angle and relative pitch angle of the long-focus camera relative to the short-focus camera are then calculated based on the pixel coordinate offset.
In combination with the application scenario, the present disclosure further provides a camera external parameter calibration method.
Fig. 5 is a schematic diagram of a second embodiment of the present disclosure, where the present embodiment provides a camera external parameter calibration method, the method includes:
501. For the same calibration scene, acquire a first original image captured by the first camera and a second original image captured by the second camera.
For example, taking a long-focus camera and a short-focus camera as an example, as shown in fig. 6, for a certain outdoor scene, two images acquired by the long-focus camera and the short-focus camera are a first original image 601 and a second original image 602, respectively.
To facilitate subsequent processing, after the first original image and the second original image are acquired by the two cameras, they may be subjected to de-distortion processing and converted into gray-scale images, so that the first original image and the second original image used in subsequent processing are de-distorted gray-scale images.
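A minimal pre-processing sketch using OpenCV is shown below; the file names, intrinsic matrices and distortion vectors are hypothetical placeholders, not values from this disclosure.

```python
import cv2
import numpy as np

def preprocess(raw_bgr, K, dist):
    """De-distort the raw image, then convert it to a gray-scale image."""
    undistorted = cv2.undistort(raw_bgr, K, dist)
    return cv2.cvtColor(undistorted, cv2.COLOR_BGR2GRAY)

# Hypothetical inputs: image files and per-camera intrinsics/distortion.
first_raw = cv2.imread("long_focus.png")    # first original image
second_raw = cv2.imread("short_focus.png")  # second original image
K1 = np.array([[4166.7, 0, 960], [0, 4166.7, 540], [0, 0, 1.0]])  # placeholder
K2 = np.array([[1000.0, 0, 960], [0, 1000.0, 540], [0, 0, 1.0]])  # placeholder
dist1 = np.zeros(5)  # placeholder distortion coefficients
dist2 = np.zeros(5)

first_gray = preprocess(first_raw, K1, dist1)    # long-focus camera
second_gray = preprocess(second_raw, K2, dist2)  # short-focus camera
```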
502. Map the first original image to the pixel coordinate system corresponding to the second camera to obtain a first mapped image.
503. Map the second original image to the pixel coordinate system corresponding to the second camera to obtain a second mapped image.
In this embodiment, by mapping both the first original image and the second original image to the pixel coordinate system corresponding to the second camera, the subsequent image matching is facilitated, and the processing accuracy is improved.
For the first original image: the mapping may be performed based on the internal parameters of the first camera and the second camera. The mapping formula can be expressed as:

$$p_1' = K_2 K_1^{-1} p_1$$

where $p_1$ is the (homogeneous) coordinate of pixel $p$ in the first original image, $p_1'$ is the pixel coordinate of the pixel $p'$ corresponding to $p$ in the first mapped image, $K_2$ is the internal parameter (intrinsic) matrix of the short-focus camera, and $K_1$ is the internal parameter matrix of the long-focus camera.
For the second original image: because the second original image and the second mapping image both correspond to the second camera, the mapping of the second original image can be performed by adopting the conversion relationship between the image coordinate system and the pixel coordinate system of the same camera.
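Continuing the pre-processing sketch above (reusing first_gray, second_gray, K1 and K2 from it), the two mappings might look like the following; treating the de-distorted second image as already being in the second camera's pixel frame is a simplifying assumption of the sketch.

```python
import cv2
import numpy as np

# Map the first original image into the second camera's pixel coordinate
# system with the homography H = K2 * K1^-1, i.e. p1' = K2 K1^-1 p1.
h2, w2 = second_gray.shape[:2]
H = K2 @ np.linalg.inv(K1)
first_map = cv2.warpPerspective(first_gray, H, (w2, h2))

# The de-distorted second original image is already expressed in the second
# camera's pixel coordinate system, so its mapping reduces to the identity.
second_map = second_gray
```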
504. Acquire a window image based on the first mapped image, and take the pixel at a preset position in the window image as the first pixel.
505. Perform matching processing on the window image and the second mapped image to obtain a second pixel on the second mapped image that matches the first pixel.
In this embodiment, the window image is acquired based on the first mapping image, and then the window image and the second mapping image are matched, so that the matching effectiveness can be improved, and the processing efficiency can be improved.
For the case where the focal length of the first camera is greater than the focal length of the second camera, for example the first camera being a long-focus camera and the second camera a short-focus camera, the window image may be an inscribed rectangular image of the first mapped image.
As shown in fig. 7, assume that the top-left, bottom-left, top-right and bottom-right pixels of the first mapped image are denoted $p_1'$, $p_2'$, $p_3'$ and $p_4'$ respectively, the window image is denoted $M$, and the top-left and bottom-right pixels of $M$ are denoted $m_1$ and $m_2$. The pixel coordinates of $m_1$ and $m_2$ are then:

$$m_1=(m_{1u},m_{1v})=\bigl(\max(p_{1u}',p_{2u}'),\ \max(p_{1v}',p_{3v}')\bigr)$$

$$m_2=(m_{2u},m_{2v})=\bigl(\min(p_{3u}',p_{4u}'),\ \min(p_{2v}',p_{4v}')\bigr)$$

where $(m_{1u},m_{1v})$ are the pixel coordinates of $m_1$, comprising the u-direction coordinate $m_{1u}$ and the v-direction coordinate $m_{1v}$; $p_{1u}'$ is the u-direction pixel coordinate of $p_1'$ and $p_{1v}'$ is its v-direction pixel coordinate; and the remaining parameters have analogous meanings.
In this embodiment, since the first mapped image corresponds to the camera with the longer focal length, it has a narrower field of view than the second mapped image, and the search during matching is therefore performed in the second mapped image. Taking the inscribed rectangular image of the first mapped image as the window image ensures that a portion related to the window image can be found in the second mapped image, thereby improving the accuracy of image matching and further improving the external parameter calibration effect.
In this embodiment, if the preset position is selected as the top-left position, the above $m_1$ serves as the first pixel.
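The window extraction can be sketched as follows, under the assumption, made explicit here because the formula above is a reconstruction, that the inscribed rectangle is the axis-aligned rectangle bounded by the max/min of the mapped corner coordinates (names carried over from the earlier sketches):

```python
import numpy as np

# Corners of the first original image, pushed through H into the second
# camera's pixel frame (homogeneous coordinates; order: top-left,
# bottom-left, top-right, bottom-right).
h1, w1 = first_gray.shape[:2]
corners = np.array([[0, 0, 1], [0, h1 - 1, 1],
                    [w1 - 1, 0, 1], [w1 - 1, h1 - 1, 1]], dtype=float)
mapped = (H @ corners.T).T
mapped = mapped[:, :2] / mapped[:, 2:]   # back to (u, v)
p1c, p2c, p3c, p4c = mapped              # p1', p2', p3', p4'

# Axis-aligned inscribed rectangle (assumed reading of the formula above).
m1 = np.array([max(p1c[0], p2c[0]), max(p1c[1], p3c[1])])  # top-left
m2 = np.array([min(p3c[0], p4c[0]), min(p2c[1], p4c[1])])  # bottom-right

window = first_map[int(m1[1]):int(m2[1]), int(m1[0]):int(m2[0])]
u1, v1 = m1  # preset position = top-left, so m1 is the first pixel
```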
The matching process described above may specifically employ NCC matching.
As shown in fig. 8, a window image 801 is cut from the first mapped image, and the second mapped image serves as the template image 802; through NCC calculation, the portion of the template image 802 that best matches the window image 801 is found.
In this embodiment, based on NCC matching processing, the first pixel and the second pixel that are matched with each other can be obtained simply and efficiently.
The NCC matching result lies in the range [-1, 1], with 1 representing a perfect match and -1 a complete mismatch. The second pixel is the pixel in the second mapped image that corresponds to the first pixel when the matching result equals 1.
This matching result can be expressed by the following formula:

$$R(u,v)=\frac{\sum_{u',v'}\bigl(M'(u',v')\cdot P_2''(u+u',v+v')\bigr)}{\sqrt{\sum_{u',v'}M'(u',v')^{2}\cdot\sum_{u',v'}P_2''(u+u',v+v')^{2}}}$$

In order to eliminate the influence of different brightness on the template image and the window image, the brightness-corrected images are defined as:

$$M'(u',v')=M(u',v')-\frac{1}{w\cdot h}\sum_{u'',v''}M(u'',v'')$$

$$P_2''(u+u',v+v')=P_2'(u+u',v+v')-\frac{1}{w\cdot h}\sum_{u'',v''}P_2'(u+u'',v+v'')$$

where $M$ is the window image and $P_2'$ is the second mapped image; $M'$ is the brightness-corrected window image and $P_2''$ is the brightness-corrected second mapped image; $R(u,v)$ represents the matching result obtained when the top-left pixel of the brightness-corrected window image $M'$ is moved to position $(u,v)$ of the brightness-corrected template image $P_2''$; $M'(u',v')$ is the pixel gray value at position $(u',v')$ of image $M'$; $P_2''(u+u',v+v')$ is the pixel gray value at position $(u+u',v+v')$ of image $P_2''$; $M(u',v')$ and $M(u'',v'')$ are the pixel gray values at positions $(u',v')$ and $(u'',v'')$ of image $M$; $P_2'(u+u'',v+v'')$ is the pixel gray value at position $(u+u'',v+v'')$ of image $P_2'$; and $w$ and $h$ are the width and height of the window image $M$, respectively.

Based on the above formulas, the second pixel is the pixel at position $(u,v)$ of the second mapped image $P_2'$ for which $R(u,v)=1$.
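The brightness-corrected formula above is the zero-mean normalized cross-correlation that OpenCV computes with cv2.TM_CCOEFF_NORMED, so the search over all (u, v) can be sketched as follows (variable names carried over from the earlier sketches):

```python
import cv2

# res[v, u] corresponds to R(u, v): the score when the window's top-left
# pixel is placed at position (u, v) of the second mapped image.
res = cv2.matchTemplate(second_map, window, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(res)
u2, v2 = max_loc  # position of the best match
```

In practice the maximum of R is taken as the best match rather than requiring R(u,v) = 1 exactly, since noise and resampling keep real scores slightly below 1.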
506. Acquire a first pixel coordinate of the first pixel in the first mapped image and a second pixel coordinate of the second pixel in the second mapped image.
The first pixel is the pixel at the preset position in the window image; for example, if the preset position is the top-left corner, the top-left pixel of the window image is taken as the first pixel.
Referring to FIG. 7, the first pixel is denoted $m_1$, and its pixel coordinate in the first mapped image $P_1'$ can be taken as the first pixel coordinate, denoted $(u_1,v_1)$.
The second pixel is the pixel at position $(u,v)$ of the second mapped image $P_2'$ when $R(u,v)=1$, and the corresponding second pixel coordinate is that $(u,v)$, denoted $(u_2,v_2)$.
Since the first mapped image $P_1'$ and the second mapped image $P_2'$ are both located in the pixel coordinate system corresponding to the second camera, the first pixel coordinate is the pixel coordinate of the first pixel in the pixel coordinate system corresponding to the second camera, and the second pixel coordinate is the pixel coordinate of the second pixel in that same coordinate system.
507. Calculate the difference between the first pixel coordinate and the second pixel coordinate as the pixel coordinate offset.
The pixel coordinate offset comprises an offset $\Delta u$ in the u direction and an offset $\Delta v$ in the v direction, calculated as:

$$(\Delta u,\Delta v)=(u_1-u_2,\ v_1-v_2)$$

where $(u_1,v_1)$ is the first pixel coordinate and $(u_2,v_2)$ is the second pixel coordinate.
In this embodiment, the difference between the first pixel coordinate and the second pixel coordinate is used as the pixel coordinate offset, so that the pixel coordinate offset can be accurately and efficiently obtained, and further the processing efficiency and accuracy are improved.
508. Determine an external parameter of the first camera relative to the second camera based on the pixel coordinate offset and the focal length of the second camera.
Specifically, the pixel coordinate offset includes: pixel coordinate offset in the horizontal axis direction and pixel coordinate offset in the vertical axis direction;
the external parameters comprise: a relative heading angle and a relative pitch angle;
the determining an outlier between the first camera and the second camera based on the position coordinate offset includes:
determining a relative heading angle of the first camera relative to the second camera based on the pixel coordinate offset in the lateral axis direction and a focal length of the second camera in the lateral axis direction;
a relative pitch angle of the first camera relative to the second camera is determined based on the pixel coordinate offset in the longitudinal axis direction and a focal length of the second camera in the longitudinal axis direction.
In this embodiment, based on the pixel coordinate offsets and focal lengths in different directions, the relative heading angle and the relative pitch angle can be simply and efficiently determined.
The external parameters comprise a relative heading angle and a relative pitch angle, denoted $\psi$ and $\theta$ respectively.
The angles may be calculated based on the pixel coordinate offset and the components of the focal length of the short-focus camera in the x and y directions.
The calculation formulas are:

$$\psi=\arctan\frac{\Delta u}{f_x},\qquad\theta=\arctan\frac{\Delta v}{f_y}$$

where $\psi$ and $\theta$ are respectively the relative heading angle and relative pitch angle of the long-focus camera relative to the short-focus camera; $\Delta u$ and $\Delta v$ are the pixel coordinate offsets in the u direction and the v direction, that is, in the horizontal axis direction and the vertical axis direction; and $f_x$ and $f_y$ are the focal lengths of the short-focus camera in the x direction and the y direction, i.e., in the horizontal axis direction and the vertical axis direction, x and y being the coordinate axes of the image coordinate system corresponding to the short-focus camera. The focal length components are calculated as:

$$f_x=\frac{f}{k},\qquad f_y=\frac{f}{l}$$

where $f$ is the focal length of the short-focus camera, $k$ is the size of each pixel of the short-focus camera in the x direction, and $l$ is the size of each pixel in the y direction.
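The formulas transcribe directly into code. In this sketch the 6 mm focal length echoes the example given earlier in this disclosure, while the pixel size is a purely hypothetical value; u1, v1, u2 and v2 are carried over from the matching sketch above.

```python
import numpy as np

f = 6e-3               # focal length of the short-focus camera (6 mm)
k = l = 6e-6           # per-pixel size in x / y (hypothetical, 6 um)

fx, fy = f / k, f / l  # focal length components, in pixels

du, dv = u1 - u2, v1 - v2      # pixel coordinate offsets
psi = np.arctan(du / fx)       # relative heading angle (yaw), radians
theta = np.arctan(dv / fy)     # relative pitch angle, radians
```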
The above is exemplified by acquiring the pixel coordinate offset, and determining the relative heading angle and the relative pitch angle based on the pixel coordinate offset. In some embodiments, an image coordinate offset may also be obtained and a relative heading angle and a relative pitch angle may be determined based on the image coordinate offset.
Specifically, if the first mapped image and the second mapped image are located in the same image coordinate system, the first position coordinates include: a first image coordinate of the first pixel under the same image coordinate system; the second position coordinates include: a second image coordinate of the second pixel under the same image coordinate system; the position coordinate offset includes an image coordinate offset;
The obtaining a position coordinate offset based on a first position coordinate of the first pixel in the first mapping image and a second position coordinate of the second pixel in the second mapping image includes: and calculating the difference value between the first image coordinate and the second image coordinate as the image coordinate offset.
For example, with the first image coordinate denoted $(x_1,y_1)$ and the second image coordinate denoted $(x_2,y_2)$, the image coordinate offset is $(\Delta x,\Delta y)$, where $\Delta x=x_1-x_2$ and $\Delta y=y_1-y_2$.
Further, the image coordinate offset includes: an image coordinate offset in the horizontal axis direction and an image coordinate offset in the vertical axis direction; the external parameters comprise: a relative heading angle and a relative pitch angle; the determining an outlier between the first camera and the second camera based on the position coordinate offset includes: acquiring a pixel coordinate offset in a transverse axis direction based on the image coordinate offset in the transverse axis direction, and determining a relative course angle of the first camera relative to the second camera based on the pixel coordinate offset in the transverse axis direction and the focal length of the second camera in the transverse axis direction; and acquiring a pixel coordinate offset in the longitudinal axis direction based on the image coordinate offset in the longitudinal axis direction, and determining a relative pitch angle of the first camera relative to the second camera based on the pixel coordinate offset in the longitudinal axis direction and the focal length of the second camera in the longitudinal axis direction.
Here $\Delta x$ is the image coordinate offset in the horizontal axis direction and $\Delta y$ is the image coordinate offset in the vertical axis direction. The pixel coordinate offset $\Delta u$ in the horizontal axis direction and the pixel coordinate offset $\Delta v$ in the vertical axis direction can be obtained through the conversion relationship between the common image coordinate system of the first and second mapped images and the pixel coordinate system corresponding to the second camera, and $\psi$ and $\theta$ can then be calculated from $\Delta u$, $\Delta v$, $f_x$ and $f_y$ using the calculation formulas above. The common image coordinate system may be the image coordinate system corresponding to the second camera; correspondingly, the conversion relationship is that between the image coordinate system and the pixel coordinate system of the same camera, for example $u=x/k$, $v=y/l$, where $(u,v)$ is the pixel coordinate, $(x,y)$ is the image coordinate, and $k$ and $l$ are the sizes of each pixel of that camera in the x and y directions.
In this embodiment, by acquiring the image coordinate offset and determining the external parameters between cameras based on the image coordinate offset, external parameter calibration can be performed from the image coordinate angle, thereby improving the diversity of implementation schemes.
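Assuming image coordinates (x1, y1) and (x2, y2) of the matched pixels are available (hypothetical names, reusing k, l, fx and fy from the previous sketch), the image-coordinate variant only adds a unit conversion before the same angle formulas:

```python
import numpy as np

# Hypothetical matched image coordinates in metric units (placeholders).
x1, y1 = 1.20e-3, 0.45e-3   # first pixel, image coordinates
x2, y2 = 1.14e-3, 0.42e-3   # second pixel, image coordinates

dx, dy = x1 - x2, y1 - y2   # image coordinate offsets
du, dv = dx / k, dy / l     # u = x / k, v = y / l  =>  du = dx / k, dv = dy / l
psi = np.arctan(du / fx)    # relative heading angle
theta = np.arctan(dv / fy)  # relative pitch angle
```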
In summary, for the long-focus camera and the short-focus camera mounted on the forward support of an autonomous vehicle, the external parameters of the long-focus camera relative to the short-focus camera are calibrated based on images captured by the two cameras, which solves the inaccuracy of the calibration-room-based approach. Because the two cameras are close to each other, the narrow-baseline characteristic of two forward-looking cameras can be exploited: the cameras are not far apart, their optical centers are close, and the neighborhoods of corresponding points are similar. In a long-distance scene the translation between the cameras is negligible, so only the rotation component needs to be calibrated and the translation does not. Therefore, the classical neighborhood cross-correlation method is used to match the long-focus and short-focus images, quickly calibrating the relative heading angle and relative pitch angle of the long-focus camera relative to the short-focus camera. Thereafter, the external parameters of the long-focus camera relative to the vehicle body and other sensors can be obtained indirectly based on its external parameters relative to the short-focus camera. The method is accurate, efficient, simple and reliable, and provides a guarantee for long-distance perception and stable operation of the unmanned vehicle.
Fig. 9 is a schematic diagram of a third embodiment of the present disclosure, where the present embodiment provides a camera external parameter calibration device 900, and the device includes: an acquisition module 901, a mapping module 902, a matching module 903, a deviation module 904, and a determination module 905.
The acquisition module 901 is used for acquiring a first original image acquired by a first camera and a second original image acquired by a second camera aiming at the same calibration scene; the mapping module 902 is configured to obtain a first mapped image according to the first original image, and obtain a second mapped image based on the second original image; the first mapping image and the second mapping image are located under the same coordinate system; the matching module 903 is configured to perform matching processing on the first mapped image and the second mapped image to determine a first pixel and a second pixel that are matched with each other; the deviation module 904 is configured to obtain a position coordinate offset according to a first position coordinate of the first pixel in the first mapped image and a second position coordinate of the second pixel in the second mapped image; the determining module 905 is configured to determine an external parameter between the first camera and the second camera according to the position coordinate offset and the focal length of the second camera.
In this embodiment, the first original image and the second original image are images of the same calibration scene, so pixels that match each other exist in the two images, and these matched pixels can be obtained through image mapping and image matching. Because the position coordinate offset between the matched pixels is caused by the relative relation between the cameras, that relative relation can be determined based on the position coordinate offset, thereby realizing external parameter calibration between the cameras. Performing external parameter calibration through the relation between images is not restricted by a calibration room and can improve the accuracy of camera external parameter calibration.
In some embodiments, the mapping module 902 is further configured to:
mapping the first original image to a pixel coordinate system corresponding to the second camera to obtain a first mapped image;
and mapping the second original image to a pixel coordinate system corresponding to the second camera to obtain a second mapped image.
In this embodiment, by mapping both the first original image and the second original image to the pixel coordinate system corresponding to the second camera, the subsequent image matching is facilitated, and the processing accuracy is improved.
In some embodiments, the matching module 903 is further configured to:
Acquiring a window image based on the first mapping image, and taking pixels at preset positions in the window image as the first pixels;
and carrying out matching processing on the window image and the second mapping image to obtain a second pixel matched with the first pixel on the second mapping image.
In this embodiment, the window image is acquired based on the first mapping image, and then the window image and the second mapping image are matched, so that the matching effectiveness can be improved, and the processing efficiency can be improved.
In some embodiments, the focal length of the first camera is greater than the focal length of the second camera;
the matching module 903 is further configured to:
and intercepting an inscribed rectangle image of the first mapping image as the window image.
In this embodiment, since the first mapped image corresponds to the camera with the longer focal length, it has a narrower field of view than the second mapped image, and the search during matching is therefore performed in the second mapped image. Taking the inscribed rectangular image of the first mapped image as the window image ensures that a portion related to the window image can be found in the second mapped image, thereby improving the accuracy of image matching and further improving the external parameter calibration effect.
In some embodiments, the matching module 903 is further configured to:
and performing NCC matching processing on the window image and the second mapping image.
In this embodiment, based on NCC matching processing, the first pixel and the second pixel that are matched with each other can be obtained simply and efficiently.
In some embodiments, the first location coordinates include: a first pixel coordinate of the first pixel under a pixel coordinate system corresponding to the second camera;
the second position coordinates include: a second pixel coordinate of the second pixel under a pixel coordinate system corresponding to the second camera;
the position coordinate offset includes a pixel coordinate offset;
the deviation module 904 is further configured to:
and calculating the difference value between the first pixel coordinate and the second pixel coordinate as the pixel coordinate offset.
In this embodiment, the difference between the first pixel coordinate and the second pixel coordinate is used as the pixel coordinate offset, so that the pixel coordinate offset can be accurately and efficiently obtained, and further the processing efficiency and accuracy are improved.
In some embodiments, the pixel coordinate offset includes: pixel coordinate offset in the horizontal axis direction and pixel coordinate offset in the vertical axis direction;
The external parameters comprise: a relative heading angle and a relative pitch angle;
the determining module 905 is further configured to:
determining a relative heading angle of the first camera relative to the second camera based on the pixel coordinate offset in the lateral axis direction and a focal length of the second camera in the lateral axis direction;
a relative pitch angle of the first camera relative to the second camera is determined based on the pixel coordinate offset in the longitudinal axis direction and a focal length of the second camera in the longitudinal axis direction.
In this embodiment, based on the pixel coordinate offsets and focal lengths in different directions, the relative heading angle and the relative pitch angle can be simply and efficiently determined.
In some embodiments, if the first mapped image and the second mapped image are located in the same image coordinate system, the first position coordinates include: a first image coordinate of the first pixel under the same image coordinate system; the second position coordinates include: a second image coordinate of the second pixel under the same image coordinate system; the position coordinate offset includes an image coordinate offset; the deviation module 904 is further configured to: and calculating the difference value between the first image coordinate and the second image coordinate as the image coordinate offset.
In this embodiment, the difference between the first image coordinate and the second image coordinate is used as the image coordinate offset, so that the image coordinate offset can be accurately and efficiently obtained, and further the processing efficiency and accuracy are improved.
In some embodiments, the image coordinate offset includes: an image coordinate offset in the horizontal axis direction and an image coordinate offset in the vertical axis direction; the external parameters comprise: a relative heading angle and a relative pitch angle; the determining module 905 is further configured to:
acquiring a pixel coordinate offset in a transverse axis direction based on the image coordinate offset in the transverse axis direction, and determining a relative course angle of the first camera relative to the second camera based on the pixel coordinate offset in the transverse axis direction and the focal length of the second camera in the transverse axis direction;
and acquiring a pixel coordinate offset in the longitudinal axis direction based on the image coordinate offset in the longitudinal axis direction, and determining a relative pitch angle of the first camera relative to the second camera based on the pixel coordinate offset in the longitudinal axis direction and the focal length of the second camera in the longitudinal axis direction.
Fig. 10 is a schematic diagram of a fourth embodiment according to the present disclosure, which provides a vehicle, such as an autonomous vehicle. The vehicle 1000 includes a first camera 1001 and a second camera 1002, and the external parameters between the first camera and the second camera can be calibrated using the method described above.
It is to be understood that in the embodiments of the disclosure, the same or similar content in different embodiments may be referred to each other.
It can be understood that "first", "second", etc. in the embodiments of the present disclosure are only used for distinguishing, and do not indicate the importance level, the time sequence, etc.
In the technical scheme of the present disclosure, the collection, storage, use, processing, transmission, provision and disclosure of the personal information of users comply with the provisions of relevant laws and regulations and do not violate public order and good morals.
According to embodiments of the present disclosure, the present disclosure also provides an electronic device, a readable storage medium and a computer program product.
Fig. 11 illustrates a schematic block diagram of an example electronic device 1100 that can be used to implement embodiments of the present disclosure. The electronic device 1100 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, servers, blade servers, mainframes, and other appropriate computers. The electronic device 1100 may also represent various forms of mobile apparatuses, such as personal digital assistants, cellular telephones, smartphones, wearable devices, and other similar computing apparatuses. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 11, the electronic device 1100 includes a computing unit 1101 that can perform various appropriate actions and processes according to a computer program stored in a read-only memory (ROM) 1102 or a computer program loaded from a storage unit 1108 into a random access memory (RAM) 1103. In the RAM 1103, various programs and data required for the operation of the electronic device 1100 can also be stored. The computing unit 1101, the ROM 1102 and the RAM 1103 are connected to each other by a bus 1104. An input/output (I/O) interface 1105 is also connected to the bus 1104.
A number of components in the electronic device 1100 are connected to the I/O interface 1105, including: an input unit 1106 such as a keyboard, a mouse, etc.; an output unit 1107 such as various types of displays, speakers, and the like; a storage unit 1108, such as a magnetic disk, optical disk, etc.; and a communication unit 1109 such as a network card, modem, wireless communication transceiver, or the like. The communication unit 1109 allows the electronic device 1100 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunications networks.
The computing unit 1101 may be a variety of general purpose and/or special purpose processing components having processing and computing capabilities. Some examples of the computing unit 1101 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various specialized artificial intelligence (AI) computing chips, various computing units running machine learning model algorithms, a digital signal processor (DSP), and any suitable processor, controller, microcontroller, etc. The computing unit 1101 performs the various methods and processes described above, such as the camera external parameter calibration method. For example, in some embodiments, the camera external parameter calibration method may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 1108. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 1100 via the ROM 1102 and/or the communication unit 1109. When the computer program is loaded into the RAM 1103 and executed by the computing unit 1101, one or more steps of the camera external parameter calibration method described above may be performed. Alternatively, in other embodiments, the computing unit 1101 may be configured to perform the camera external parameter calibration method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be realized in digital electronic circuitry, integrated circuitry, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: being implemented in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor, able to receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out the methods of the present disclosure may be written in any combination of one or more programming languages. The program code may be provided to a processor or controller of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowcharts and/or block diagrams to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine, or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include a client and a server. The client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the defects of high management difficulty and weak service scalability found in traditional physical hosts and VPS ("Virtual Private Server") services. The server may also be a server of a distributed system or a server combined with a blockchain.
It should be appreciated that the various forms of flow shown above may be used, with steps reordered, added, or deleted. For example, the steps recited in the present disclosure may be performed in parallel, sequentially, or in a different order, provided that the desired results of the technical solutions of the present disclosure can be achieved; no limitation is imposed herein.
The above detailed description should not be taken as limiting the scope of the present disclosure. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present disclosure are intended to be included within the scope of the present disclosure.

Claims (22)

1. A camera external parameter calibration method comprises the following steps:
acquiring, for the same calibration scene, a first original image acquired by a first camera and a second original image acquired by a second camera;
acquiring a first mapping image based on the first original image, and acquiring a second mapping image based on the second original image; the first mapping image and the second mapping image are located under the same coordinate system;
performing matching processing on the first mapping image and the second mapping image to determine a first pixel and a second pixel which are matched with each other;
acquiring a position coordinate offset based on a first position coordinate of the first pixel in the first mapping image and a second position coordinate of the second pixel in the second mapping image;
and determining an external parameter between the first camera and the second camera based on the position coordinate offset and the focal length of the second camera.
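For orientation, the following is a minimal sketch of the five claimed steps in Python with OpenCV/NumPy, assuming two co-located forward cameras with known intrinsic matrices K1 and K2, a small relative rotation, distortion-free single-channel images, and an identity initial rotation guess; all function and variable names are illustrative and not taken from this disclosure.

    import cv2
    import numpy as np

    def calibrate_relative_angles(img1, img2, K1, K2):
        # Steps 1-2: bring the first original image into the second camera's
        # pixel coordinate system. For co-located cameras with an identity
        # initial rotation, the mapping is the homography K2 @ inv(K1);
        # img2 is assumed to already be in its own pixel grid.
        h2, w2 = img2.shape[:2]
        mapped1 = cv2.warpPerspective(img1, K2 @ np.linalg.inv(K1), (w2, h2))

        # Step 3: match a central window of the first mapping image against
        # the second mapping image by template matching.
        x0, y0 = w2 // 4, h2 // 4
        window = mapped1[y0:3 * h2 // 4, x0:3 * w2 // 4]
        score = cv2.matchTemplate(img2, window, cv2.TM_CCOEFF_NORMED)
        _, _, _, best = cv2.minMaxLoc(score)  # (x, y) of the best placement

        # Step 4: position coordinate offset between the matched pixels.
        du, dv = best[0] - x0, best[1] - y0

        # Step 5: external parameters from the offset and the second
        # camera's focal lengths (small-rotation model, see claim 7).
        yaw = np.arctan2(du, K2[0, 0])    # relative heading angle, radians
        pitch = np.arctan2(dv, K2[1, 1])  # relative pitch angle, radians
        return yaw, pitch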
2. The method of claim 1, wherein the acquiring a first mapping image based on the first original image, and acquiring a second mapping image based on the second original image comprises:
mapping the first original image to a pixel coordinate system corresponding to the second camera to obtain the first mapping image;
and mapping the second original image to the pixel coordinate system corresponding to the second camera to obtain the second mapping image.
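One plausible realization of this mapping, sketched under the assumption of pinhole cameras with known intrinsics (K1, K2), distortion vectors (D1, D2), and an initial relative rotation guess R0 (identity if none): undistort both originals, then warp the first by the pure-rotation (infinite) homography K2·R0·K1⁻¹ into the second camera's pixel grid. Names are illustrative.

    import cv2
    import numpy as np

    def map_to_second_camera(img1, img2, K1, D1, K2, D2, R0=np.eye(3)):
        h2, w2 = img2.shape[:2]
        und1 = cv2.undistort(img1, K1, D1)  # first original, pinhole geometry
        und2 = cv2.undistort(img2, K2, D2)  # second mapping image
        H = K2 @ R0 @ np.linalg.inv(K1)     # pure-rotation homography
        mapped1 = cv2.warpPerspective(und1, H, (w2, h2))  # first mapping image
        return mapped1, und2

Because both cameras sit on the same mount, translation between them is negligible relative to scene depth, which is a common justification for using a homography rather than a full epipolar model.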
3. The method of claim 2, wherein the performing matching processing on the first mapping image and the second mapping image to determine a first pixel and a second pixel which are matched with each other comprises:
acquiring a window image based on the first mapping image, and taking a pixel at a preset position in the window image as the first pixel;
and carrying out matching processing on the window image and the second mapping image to obtain a second pixel matched with the first pixel on the second mapping image.
4. The method of claim 3, wherein,
the focal length of the first camera is greater than the focal length of the second camera;
the obtaining a window image based on the first mapping image includes:
and intercepting an inscribed rectangle image of the first mapping image as the window image.
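Since the first camera's focal length is larger, its mapped image covers only part of the second camera's frame and carries invalid borders after warping; taking an inscribed rectangle avoids matching against those borders. One way to obtain it, sketched here with illustrative names, is to warp the first image's corners with the mapping homography H and take the axis-aligned rectangle inside the resulting quadrilateral:

    import cv2
    import numpy as np

    def inscribed_window(H, w1, h1):
        # Corners of the first original image, warped into the second
        # camera's pixel coordinate system.
        corners = np.float32([[0, 0], [w1, 0], [w1, h1], [0, h1]]).reshape(-1, 1, 2)
        p = cv2.perspectiveTransform(corners, H).reshape(-1, 2)
        # Approximate axis-aligned rectangle inscribed in the warped quad.
        x0 = int(np.ceil(max(p[0, 0], p[3, 0])))   # left edge
        x1 = int(np.floor(min(p[1, 0], p[2, 0])))  # right edge
        y0 = int(np.ceil(max(p[0, 1], p[1, 1])))   # top edge
        y1 = int(np.floor(min(p[2, 1], p[3, 1])))  # bottom edge
        return x0, y0, x1, y1  # window = mapped1[y0:y1, x0:x1]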
5. The method of claim 3, wherein the carrying out matching processing on the window image and the second mapping image comprises:
and carrying out normalized cross-correlation NCC matching processing on the window image and the second mapping image.
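OpenCV's template matcher provides this directly: TM_CCORR_NORMED is plain normalized cross-correlation, while the zero-mean variant TM_CCOEFF_NORMED is often more tolerant of exposure differences between two cameras. A minimal sketch (names illustrative):

    import cv2

    def ncc_match(window, mapped2):
        # NCC score of the window against every placement in the
        # second mapping image.
        score = cv2.matchTemplate(mapped2, window, cv2.TM_CCORR_NORMED)
        _, best_score, _, best_xy = cv2.minMaxLoc(score)
        return best_xy, best_score  # top-left of best match, NCC value

The second pixel matched to the first pixel is then best_xy shifted by the first pixel's preset position inside the window.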
6. The method of claim 2, wherein,
the first position coordinates include: a first pixel coordinate of the first pixel under a pixel coordinate system corresponding to the second camera;
the second position coordinates include: a second pixel coordinate of the second pixel under a pixel coordinate system corresponding to the second camera;
The position coordinate offset includes a pixel coordinate offset;
the obtaining a position coordinate offset based on a first position coordinate of the first pixel in the first mapping image and a second position coordinate of the second pixel in the second mapping image includes:
and calculating the difference value between the first pixel coordinate and the second pixel coordinate as the pixel coordinate offset.
7. The method of claim 6, wherein,
the pixel coordinate offset includes: pixel coordinate offset in the horizontal axis direction and pixel coordinate offset in the vertical axis direction;
the external parameters comprise: a relative heading angle and a relative pitch angle;
the determining an external parameter between the first camera and the second camera based on the position coordinate offset includes:
determining a relative heading angle of the first camera relative to the second camera based on the pixel coordinate offset in the horizontal axis direction and a focal length of the second camera in the horizontal axis direction;
and determining a relative pitch angle of the first camera relative to the second camera based on the pixel coordinate offset in the vertical axis direction and a focal length of the second camera in the vertical axis direction.
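The geometry behind claim 7, as a sketch: for co-located pinhole cameras differing by a small rotation, a pure yaw of angle θ shifts pixels near the principal point by roughly fx·tan θ horizontally, and a pure pitch by fy·tan φ vertically, so each angle follows from its offset by an arctangent. Below, fx and fy are the second camera's focal lengths in pixels and du, dv the pixel coordinate offsets of claim 6 (names illustrative); for example, a 10-pixel horizontal offset at fx = 2000 px gives atan(10/2000) ≈ 0.29° of relative heading.

    import math

    def angles_from_offset(du, dv, fx, fy):
        rel_heading = math.atan(du / fx)  # relative heading (yaw), radians
        rel_pitch = math.atan(dv / fy)    # relative pitch, radians
        return rel_heading, rel_pitch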
8. The method of claim 1, wherein,
if the first mapping image and the second mapping image are located in the same image coordinate system,
the first position coordinates include: a first image coordinate of the first pixel under the same image coordinate system;
the second position coordinates include: a second image coordinate of the second pixel under the same image coordinate system;
the position coordinate offset includes an image coordinate offset;
the obtaining a position coordinate offset based on a first position coordinate of the first pixel in the first mapping image and a second position coordinate of the second pixel in the second mapping image includes:
and calculating the difference value between the first image coordinate and the second image coordinate as the image coordinate offset.
9. The method of claim 8, wherein,
the image coordinate offset includes: an image coordinate offset in the horizontal axis direction and an image coordinate offset in the vertical axis direction;
the external parameters comprise: a relative heading angle and a relative pitch angle;
the determining an external parameter between the first camera and the second camera based on the position coordinate offset includes:
acquiring a pixel coordinate offset in a horizontal axis direction based on the image coordinate offset in the horizontal axis direction, and determining a relative heading angle of the first camera relative to the second camera based on the pixel coordinate offset in the horizontal axis direction and the focal length of the second camera in the horizontal axis direction;
and acquiring a pixel coordinate offset in the vertical axis direction based on the image coordinate offset in the vertical axis direction, and determining a relative pitch angle of the first camera relative to the second camera based on the pixel coordinate offset in the vertical axis direction and the focal length of the second camera in the vertical axis direction.
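If the shared image coordinate system is metric (e.g., millimetres on the sensor plane), the conversion in claim 9 amounts to dividing the image coordinate offset by the second camera's pixel pitch, after which the claim 7 computation applies unchanged. A sketch under that assumption, with pitch_x and pitch_y as illustrative parameters:

    import math

    def angles_from_image_offset(dx_img, dy_img, pitch_x, pitch_y, fx, fy):
        du = dx_img / pitch_x  # image-coordinate offset -> horizontal pixel offset
        dv = dy_img / pitch_y  # image-coordinate offset -> vertical pixel offset
        return math.atan(du / fx), math.atan(dv / fy)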
10. A camera external parameter calibration device, comprising:
the acquisition module is used for acquiring a first original image acquired by the first camera and a second original image acquired by the second camera aiming at the same calibration scene;
the mapping module is used for acquiring a first mapping image based on the first original image, and acquiring a second mapping image based on the second original image; the first mapping image and the second mapping image are located under the same coordinate system;
the matching module is used for carrying out matching processing on the first mapping image and the second mapping image so as to determine a first pixel and a second pixel which are matched with each other;
the deviation module is used for acquiring a position coordinate offset based on a first position coordinate of the first pixel in the first mapping image and a second position coordinate of the second pixel in the second mapping image;
and the determining module is used for determining the external parameters between the first camera and the second camera based on the position coordinate offset and the focal length of the second camera.
11. The apparatus of claim 10, wherein the mapping module is further used for:
mapping the first original image to a pixel coordinate system corresponding to the second camera to obtain the first mapping image;
and mapping the second original image to the pixel coordinate system corresponding to the second camera to obtain the second mapping image.
12. The apparatus of claim 11, wherein the matching module is further used for:
acquiring a window image based on the first mapping image, and taking a pixel at a preset position in the window image as the first pixel;
and carrying out matching processing on the window image and the second mapping image to obtain a second pixel matched with the first pixel on the second mapping image.
13. The apparatus of claim 12, wherein,
the focal length of the first camera is greater than the focal length of the second camera;
the matching module is further used for:
and intercepting an inscribed rectangle image of the first mapping image as the window image.
14. The apparatus of claim 12, wherein the matching module is further used for:
and carrying out normalized cross-correlation NCC matching processing on the window image and the second mapping image.
15. The apparatus of claim 11, wherein,
the first position coordinates include: a first pixel coordinate of the first pixel under a pixel coordinate system corresponding to the second camera;
the second position coordinates include: a second pixel coordinate of the second pixel under a pixel coordinate system corresponding to the second camera;
the position coordinate offset includes a pixel coordinate offset;
the deviation module is further used for:
and calculating the difference value between the first pixel coordinate and the second pixel coordinate as the pixel coordinate offset.
16. The apparatus of claim 15, wherein,
the pixel coordinate offset includes: pixel coordinate offset in the horizontal axis direction and pixel coordinate offset in the vertical axis direction;
the external parameters comprise: a relative heading angle and a relative pitch angle;
the determining module is further used for:
determining a relative heading angle of the first camera relative to the second camera based on the pixel coordinate offset in the horizontal axis direction and a focal length of the second camera in the horizontal axis direction;
and determining a relative pitch angle of the first camera relative to the second camera based on the pixel coordinate offset in the vertical axis direction and a focal length of the second camera in the vertical axis direction.
17. The apparatus of claim 10, wherein,
if the first mapping image and the second mapping image are located in the same image coordinate system,
the first position coordinates include: a first image coordinate of the first pixel under the same image coordinate system;
the second position coordinates include: a second image coordinate of the second pixel under the same image coordinate system;
the position coordinate offset includes an image coordinate offset;
the deviation module is further used for:
calculating the difference value between the first image coordinate and the second image coordinate as the image coordinate offset.
18. The apparatus of claim 17, wherein,
the image coordinate offset includes: an image coordinate offset in the horizontal axis direction and an image coordinate offset in the vertical axis direction;
the external parameters comprise: a relative heading angle and a relative pitch angle;
the determining module is further used for:
acquiring a pixel coordinate offset in a horizontal axis direction based on the image coordinate offset in the horizontal axis direction, and determining a relative heading angle of the first camera relative to the second camera based on the pixel coordinate offset in the horizontal axis direction and the focal length of the second camera in the horizontal axis direction;
and acquiring a pixel coordinate offset in the vertical axis direction based on the image coordinate offset in the vertical axis direction, and determining a relative pitch angle of the first camera relative to the second camera based on the pixel coordinate offset in the vertical axis direction and the focal length of the second camera in the vertical axis direction.
19. A vehicle, comprising:
a forward mount, and a first camera and a second camera mounted on the forward mount;
wherein external parameters between the first camera and the second camera are calibrated by the method according to any one of claims 1-9.
20. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-9.
21. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1-9.
22. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any of claims 1-9.
CN202311605707.0A 2023-11-28 2023-11-28 Camera external parameter calibration method, device, equipment and vehicle Pending CN117523004A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311605707.0A 2023-11-28 2023-11-28 Camera external parameter calibration method, device, equipment and vehicle

Publications (1)

Publication Number Publication Date
CN117523004A 2024-02-06

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination