CN117315048B - External parameter self-calibration method of vehicle-mounted camera, electronic equipment and storage medium - Google Patents


Info

Publication number: CN117315048B
Authority: CN (China)
Prior art keywords: vanishing, original image, camera, points, characteristic
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number: CN202311560281.1A
Other languages: Chinese (zh)
Other versions: CN117315048A (en)
Inventor: 李兵 (Li Bing)
Current Assignee: DeepRoute AI Ltd
Original Assignee: DeepRoute AI Ltd
Application filed by DeepRoute AI Ltd
Priority to CN202311560281.1A
Publication of application CN117315048A, later granted and published as CN117315048B


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/10: Segmentation; Edge detection
    • G06T 7/13: Edge detection
    • G06T 7/97: Determining parameters from multiple pictures
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00: Road transport of goods or passengers
    • Y02T 10/10: Internal combustion engine [ICE] based vehicles
    • Y02T 10/40: Engine management systems

Abstract

The application discloses an external parameter self-calibration method for a vehicle-mounted camera. The method comprises the following steps: acquiring two original images correspondingly acquired by any two cameras on a vehicle; performing vanishing point detection and feature point detection on the two original images to obtain the vanishing points and feature points in each image; matching the vanishing points across the two original images to obtain matched vanishing point pairs, and determining the vanishing point residual equation corresponding to each vanishing point pair; matching the feature points across the two original images to obtain matched feature point pairs, and determining the feature point residual equation corresponding to each feature point pair; and solving an optimization equation set formed by the vanishing point residual equations and the feature point residual equations to obtain the external parameters of the two cameras. The application also discloses an electronic device and a storage medium. The method improves the accuracy and robustness of camera external parameter calibration.

Description

External parameter self-calibration method of vehicle-mounted camera, electronic equipment and storage medium
Technical Field
The disclosed embodiments of the present application relate to the technical field of intelligent driving of vehicles, and more particularly, to an external parameter self-calibration method of an on-board camera, an electronic device, and a storage medium.
Background
Camera external parameters determine the relative positional relationship between the camera coordinate system and the world coordinate system, which is important in fields where the position of an object must be determined from an image, such as autonomous driving. Multi-camera external parameter calibration is an active problem in both academia and engineering. Calibration methods based on a calibration room, a SLAM algorithm, or vehicle motion are common, but online calibration of camera external parameters still suffers from poor robustness, low precision, and similar problems.
Disclosure of Invention
According to embodiments of the application, an external parameter self-calibration method of a vehicle-mounted camera, an electronic device, and a storage medium are provided to address the above problems of poor robustness and low precision.
The first aspect of the application discloses an external parameter self-calibration method of a vehicle-mounted camera, which comprises the following steps: acquiring two original images correspondingly acquired by any two cameras on a vehicle; performing vanishing point detection on the two original images to obtain the vanishing points in the two original images; performing a matching operation on the vanishing points in the two original images to obtain matched vanishing point pairs, and determining the vanishing point residual equation corresponding to each vanishing point pair; performing feature point detection on the two original images to obtain the feature points in the two original images; performing a matching operation on the feature points in the two original images to obtain matched feature point pairs, and determining the feature point residual equation corresponding to each feature point pair; and solving an optimization equation set formed by the vanishing point residual equations and the feature point residual equations to obtain the external parameters of the two cameras.
In some embodiments, the two cameras comprise a first camera and a second camera, and the two original images comprise a first original image acquired by the first camera and a second original image acquired by the second camera, wherein the first original image and the second original image share a common region. Performing vanishing point detection on the two original images to obtain the vanishing points in the two original images includes: performing vanishing point detection on the first original image to obtain first vanishing points in the first original image; and performing vanishing point detection on the second original image to obtain second vanishing points in the second original image.
In some embodiments, performing a matching operation on vanishing points in the two original images to obtain a matched vanishing point pair in the two images includes: acquiring a first ray direction vector corresponding to a first vanishing point in the first original image under a coordinate system of the first camera; acquiring a second ray direction vector corresponding to a second vanishing point in the second original image under the coordinate system of the second camera; acquiring a corresponding included angle between the first ray direction vector and the second ray direction vector; and obtaining a vanishing point pair matched with a first vanishing point in the first original image and a second vanishing point in the second original image according to the included angle between the first ray direction vector and the second ray direction vector, wherein the vanishing point pair meets a preset condition.
In some embodiments, the preset condition includes that the vanishing point pair only matches each other, and an included angle between a first ray direction vector corresponding to a first vanishing point and a second ray direction vector corresponding to a second vanishing point in the vanishing point pair is smaller than a preset value.
In some embodiments, the vanishing point residual equation for the vanishing point pair is characterized by the included angle for the vanishing point pair.
In some embodiments, the two cameras comprise a first camera and a second camera, and the two original images comprise a first original image acquired by the first camera and a second original image acquired by the second camera, wherein the first original image and the second original image share a common region. Performing feature point detection on the two original images to obtain the feature points in the two original images includes: performing feature point detection on the first original image to obtain first feature points in the first original image; and performing feature point detection on the second original image to obtain second feature points in the second original image. Performing a matching operation on the feature points in the two original images to obtain matched feature point pairs includes: performing feature point matching based on the common region of the first original image and the second original image to obtain feature point pairs in which a first feature point in the first original image matches a second feature point in the second original image.
In some embodiments, determining a feature point residual equation corresponding to the feature point pair based on the feature point pair includes: acquiring a third ray direction vector corresponding to a first characteristic point in the characteristic point pair under a coordinate system of the first camera; acquiring a fourth ray direction vector corresponding to a second characteristic point in the characteristic point pair under the coordinate system of the second camera; and determining a characteristic point residual equation corresponding to the characteristic point pair according to the third ray direction vector and the fourth ray direction vector.
The second aspect of the application discloses an external parameter self-calibration method of a vehicle-mounted camera, which comprises the following steps: acquiring at least two original images acquired by at least two cameras on a vehicle correspondingly; performing a calibration process by using each two original images in the at least two original images to obtain external parameters of two cameras corresponding to the two original images; the calibration process includes the external parameter self-calibration method of the vehicle-mounted camera in the first aspect.
A third aspect of the application discloses an electronic device, including a memory and a processor coupled to each other, where the processor is configured to execute program instructions stored in the memory, so as to implement the method for self-calibrating an external parameter of the vehicle-mounted camera described in the first aspect, or to implement the method for self-calibrating an external parameter of the vehicle-mounted camera described in the second aspect.
A fourth aspect of the present application discloses a non-transitory computer readable storage medium having stored thereon program instructions that, when executed by a processor, implement the method of self-calibrating external parameters of an in-vehicle camera described in the first aspect, or implement the method of self-calibrating external parameters of an in-vehicle camera described in the second aspect.
The beneficial effects of this application are as follows. Two original images correspondingly acquired by any two cameras on a vehicle are acquired; vanishing point detection and feature point detection are performed on the two original images to obtain the vanishing points and feature points in each image; matching operations yield the matched vanishing point pairs and feature point pairs, and the corresponding vanishing point residual equations and feature point residual equations are determined; the optimization equation set formed by these residual equations is then solved to obtain the external parameters of the two cameras. This improves the accuracy and robustness of camera external parameter calibration.
Drawings
The application will be further described with reference to the accompanying drawings and embodiments, in which:
FIG. 1 is a flow chart of an external parameter self-calibration method of an in-vehicle camera according to an embodiment of the present application;
FIG. 2 is a flow chart of an external parameter self-calibration method of an in-vehicle camera according to another embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a nonvolatile computer-readable storage medium according to an embodiment of the present application.
Detailed Description
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
The term "and/or" in this application merely describes an association relation between associated objects and indicates that three relations may exist; for example, A and/or B may indicate: A exists alone, A and B exist together, or B exists alone. In addition, the character "/" herein generally indicates that the associated objects before and after it are in an "or" relationship. Further, "a plurality" herein means two or more. The term "at least one" herein means any one of a plurality or any combination of at least two of a plurality; for example, "including at least one of A, B, and C" may mean including any one or more elements selected from the group consisting of A, B, and C. Furthermore, the terms "first," "second," and "third" in this application are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
In order to enable those skilled in the art to better understand the technical solutions of the present application, the technical solutions of the present application are described in further detail below with reference to the accompanying drawings and the detailed description.
Referring to fig. 1, fig. 1 is a flow chart of an external parameter self-calibration method of an in-vehicle camera according to an embodiment of the disclosure. The execution subject of the method may be an electronic device or an in-vehicle device with computing functionality.
It should be noted that the method of the present application is not limited to the flow sequence shown in fig. 1, provided that substantially the same results are obtained.
In some possible implementations, the method may be implemented by a processor invoking computer readable instructions stored in a memory, as shown in fig. 1, and may include the steps of:
s11: and acquiring two original images acquired by any two cameras on the vehicle.
Two original images correspondingly acquired by any two cameras on the vehicle are acquired. The cameras on the vehicle may be rear-view, front-view, side-view, or surround-view cameras, or may be a long-focal-length camera and a short-focal-length camera with the same viewing angle. For example, the two cameras may be camera l and camera r, where at the same time t, camera l captures original image A and camera r captures original image B.
S12: and detecting vanishing points of the two original images to obtain vanishing points in the two original images.
Vanishing point detection is performed on the two original images correspondingly acquired by the two cameras. A vanishing point is the projection in the image of the point at infinity where a group of parallel lines meet. For example, vanishing point detection is performed on original image A and original image B to obtain the vanishing points in each image. A plurality of vanishing points may be obtained, or a single vanishing point may be determined according to a preset rule; for example, the vanishing point may be the lane line vanishing point, i.e., in visual perception, the point at which the lane lines on the road appear to converge when the vehicle travels into the distance.
S13: and carrying out matching operation on vanishing points in the two original images to obtain matched vanishing point pairs in the two images, and determining a vanishing point residual equation corresponding to the vanishing point pairs based on the vanishing point pairs.
A matching operation is performed on the vanishing points in the two original images. For example, matching the vanishing points in original image A and original image B means determining the corresponding vanishing points between the two images and removing wrongly matched vanishing points, thereby obtaining the matched vanishing point pairs. Based on each vanishing point pair, the corresponding vanishing point residual equation is determined: for example, the vector representations of the two vanishing points in the pair are acquired, and a residual equation concerning the external parameters of the vehicle-mounted cameras is constructed from these vector representations, i.e., the vanishing point residual equation corresponding to the vanishing point pair.
S14: and detecting the characteristic points of the two original images to obtain the characteristic points in the two original images.
Feature point detection is performed on the two original images by using a detection algorithm. For example, 2D feature point detection is performed on original image A and original image B to obtain the 2D feature points in each image.
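The patent does not name a specific detection algorithm. As a minimal illustrative sketch only, a Harris-style corner response can serve as a 2D feature point detector; the function below and its parameter values are assumptions, and in practice a detector such as ORB or SIFT would be typical.

```python
import numpy as np

def harris_response(img, k=0.04):
    """Harris corner response map: high values mark corner-like 2D feature
    points. A minimal numpy stand-in for the patent's unnamed detector."""
    gy, gx = np.gradient(img.astype(float))
    Ixx, Iyy, Ixy = gx * gx, gy * gy, gx * gy

    def box(a):
        # 3x3 box sum with zero padding (structure-tensor smoothing window)
        p = np.pad(a, 1)
        n, m = a.shape
        return sum(p[i:i + n, j:j + m] for i in range(3) for j in range(3))

    Sxx, Syy, Sxy = box(Ixx), box(Iyy), box(Ixy)
    det = Sxx * Syy - Sxy ** 2
    tr = Sxx + Syy
    return det - k * tr ** 2

# A synthetic image with one corner at pixel (10, 10):
img = np.zeros((20, 20))
img[10:, 10:] = 1.0
resp = harris_response(img)
```

Local maxima of `resp` above a threshold would then be kept as the 2D feature points of the image.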
S15: and carrying out matching operation on the characteristic points in the two original images to obtain matched characteristic point pairs in the two images, and determining a characteristic point residual equation corresponding to the characteristic point pairs based on the characteristic point pairs.
A matching operation is performed on the feature points in the two original images. For example, the 2D feature points in original image A and original image B are matched within the common view area of camera l and camera r to obtain the matched feature point pairs; each matched pair can be understood as the feature point of the same object in original image A paired with its feature point in original image B. Based on each feature point pair, the corresponding feature point residual equation is determined: for example, the vector representations of the two feature points in the pair are acquired, and a residual equation concerning the external parameters of the vehicle-mounted cameras is constructed from these vector representations, i.e., the feature point residual equation corresponding to the feature point pair.
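One common way to realize such a feature matching operation is nearest-neighbour descriptor matching with a ratio test. The sketch below is an assumption, not the patent's method; `match_descriptors` and the 0.8 ratio are illustrative choices.

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.8):
    """Match feature descriptors (rows of float arrays) between two images:
    keep a nearest neighbour only if it is clearly better than the
    second-nearest one (Lowe's ratio test), which suppresses wrong matches."""
    matches = []
    for i, d in enumerate(desc_a):
        dist = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dist)
        best, second = order[0], order[1]
        if dist[best] < ratio * dist[second]:
            matches.append((i, int(best)))
    return matches

# Toy descriptors: rows 0 and 1 of image A match rows 0 and 1 of image B.
da = np.array([[0.0, 0.0], [10.0, 10.0]])
db = np.array([[0.1, 0.0], [10.0, 10.1], [5.0, 5.0]])
pairs = match_descriptors(da, db)  # → [(0, 0), (1, 1)]
```

Restricting `desc_a` and `desc_b` to features inside the common view area, as the text describes, further reduces false matches.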
S16: and carrying out optimization solution on an optimization equation set formed by the vanishing point residual equation corresponding to the vanishing point pair and the characteristic point residual equation corresponding to the characteristic point pair to obtain external parameters of any two cameras.
An optimization equation set formed by the vanishing point residual equations corresponding to the vanishing point pairs and the feature point residual equations corresponding to the feature point pairs is solved to obtain the external parameters of the two cameras. That is, a nonlinear optimization equation set is constructed from these residual equations, and solving it optimizes the external parameters of the two cameras; for example, the external parameters may be the rotation external parameters of the cameras.
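The nonlinear optimization step can be sketched as follows, under two stated assumptions: the extrinsic is rotation-only (as the text's example suggests), and each residual is taken as the difference between a rotated first-camera ray and its matched second-camera ray (the patent's exact residual forms and solver are not given). All function names are hypothetical; the solver is a small Gauss-Newton loop with a numerical Jacobian.

```python
import numpy as np

def rodrigues(w):
    """Rotation matrix from an axis-angle vector w (Rodrigues' formula)."""
    th = np.linalg.norm(w)
    if th < 1e-12:
        return np.eye(3)
    k = w / th
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(th) * K + (1.0 - np.cos(th)) * (K @ K)

def residuals(w, rays_l, rays_r):
    """Stack, over all matched pairs (vanishing-point and feature-point alike),
    the difference between each first-camera ray rotated by R(w) and the
    matched second-camera ray, both normalized to unit length."""
    R = rodrigues(w)
    res = []
    for dl, dr in zip(rays_l, rays_r):
        a = R @ dl
        res.extend(a / np.linalg.norm(a) - dr / np.linalg.norm(dr))
    return np.array(res)

def solve_rotation(rays_l, rays_r, iters=50):
    """Gauss-Newton on the axis-angle parameters with a numerical Jacobian;
    returns the optimized rotation external parameter."""
    w = np.zeros(3)
    for _ in range(iters):
        r = residuals(w, rays_l, rays_r)
        J = np.empty((len(r), 3))
        eps = 1e-7
        for k in range(3):
            dw = np.zeros(3)
            dw[k] = eps
            J[:, k] = (residuals(w + dw, rays_l, rays_r) - r) / eps
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        w = w + step
        if np.linalg.norm(step) < 1e-12:
            break
    return rodrigues(w)
```

With synthetic rays generated by a known rotation, `solve_rotation` recovers that rotation; in a real system the translation component, robust loss functions, and a production solver (e.g. Ceres) would normally be added.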
In this embodiment, two original images correspondingly acquired by any two cameras on the vehicle are acquired; vanishing point detection and feature point detection are performed on the two original images to obtain the vanishing points and feature points in each image; matching operations yield the matched vanishing point pairs and feature point pairs, and the corresponding vanishing point residual equations and feature point residual equations are determined; the optimization equation set formed by these residual equations is then solved to obtain the external parameters of the two cameras, which improves the accuracy and robustness of camera external parameter calibration.
In some embodiments, any two cameras include a first camera and a second camera, and the two original images include a first original image captured by the first camera and a second original image captured by the second camera, wherein the first original image and the second original image have portions of the same area.
When the two cameras comprise a first camera and a second camera, the two original images correspondingly acquired are a first original image acquired by the first camera and a second original image acquired by the second camera, wherein the two images share a common region. For example, the first camera and the second camera may be front-facing cameras of the vehicle with the same viewing angle but different focal lengths; the common region of the first original image and the second original image is then the area in front of the vehicle.
Further, vanishing point detection is performed on the two original images to obtain vanishing points in the two original images, including: detecting vanishing points of the first original image to obtain first vanishing points in the first original image; and detecting vanishing points of the second original image to obtain second vanishing points in the second original image.
Vanishing point detection is performed on an original image as follows: a de-distortion operation is performed on the original image to obtain a de-distortion result image corresponding to the original image; straight line detection is performed on the de-distortion result image to obtain a line segment detection result; and vanishing point estimation is performed on the line segment detection result by using a preset algorithm to obtain the vanishing points in the original image. The straight line detection can be completed by the LSD (Line Segment Detector) algorithm, or jointly by the Canny edge detection algorithm and the Hough line detection algorithm, and the vanishing point estimation can be completed by the J-Linkage algorithm.
Vanishing point detection is performed on the first original image as follows: a de-distortion operation is performed on the first original image to obtain a first de-distortion result image; straight line detection is performed on the first de-distortion result image to obtain a line segment detection result; and vanishing point estimation is performed on the line segment detection result by using the J-Linkage algorithm to obtain the first vanishing points in the first original image. Likewise, vanishing point detection is performed on the second original image: a de-distortion operation yields a second de-distortion result image, straight line detection on it yields a line segment detection result, and vanishing point estimation with the J-Linkage algorithm yields the second vanishing points in the second original image.
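A minimal numpy sketch of the vanishing point estimation step, under a simplifying assumption: all input segments already belong to one parallel-line family (the patent's J-Linkage step exists precisely to cluster segments into such families, which this sketch does not do). The vanishing point is the least-squares intersection of the segments' homogeneous lines.

```python
import numpy as np

def line_from_segment(p1, p2):
    """Homogeneous line through two image points: the cross product of the
    points written in homogeneous coordinates (x, y, 1)."""
    a = np.array([p1[0], p1[1], 1.0])
    b = np.array([p2[0], p2[1], 1.0])
    return np.cross(a, b)

def estimate_vanishing_point(segments):
    """Least-squares vanishing point of segments assumed to share one
    parallel-line family: the null vector (smallest singular vector) of the
    stacked line equations L v = 0."""
    L = np.stack([line_from_segment(p1, p2) for p1, p2 in segments])
    # Normalize each line by its (a, b) part so no segment dominates the fit.
    L /= np.linalg.norm(L[:, :2], axis=1, keepdims=True)
    _, _, vt = np.linalg.svd(L)
    v = vt[-1]
    return v[:2] / v[2]  # back from homogeneous to pixel coordinates

# Three segments converging at pixel (100, 50):
segs = [((0.0, 0.0), (100.0, 50.0)),
        ((0.0, 100.0), (100.0, 50.0)),
        ((0.0, 25.0), (50.0, 37.5))]
vp = estimate_vanishing_point(segs)  # ≈ (100, 50)
```

The segments themselves would come from the LSD or Canny-plus-Hough detection described above.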
In some embodiments, performing a matching operation on vanishing points in two original images to obtain a matched vanishing point pair in the two images includes: acquiring a first ray direction vector corresponding to a first vanishing point in a first original image under a coordinate system of a first camera; acquiring a second ray direction vector corresponding to a second vanishing point in a second original image under a coordinate system of a second camera; acquiring a corresponding included angle between the first ray direction vector and the second ray direction vector; and obtaining a vanishing point pair matched with the first vanishing point in the first original image and the second vanishing point in the second original image according to the included angle between the first ray direction vector and the second ray direction vector, wherein the vanishing point pair meets the preset condition.
In particular, the 2D pixel coordinates p = (u, v, 1)^T of a vanishing point in the original image can be obtained, and back projection converts them into a ray direction vector d = K^{-1} p, where K is the internal reference (intrinsic) matrix after image de-distortion.
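The back projection just described can be sketched in numpy as follows; the intrinsic matrix values are illustrative assumptions, not values from the patent.

```python
import numpy as np

def pixel_to_ray(K, uv):
    """Back-project a 2D point (after de-distortion) to a unit ray direction
    in the camera frame: d is proportional to K^{-1} [u, v, 1]^T."""
    d = np.linalg.inv(K) @ np.array([uv[0], uv[1], 1.0])
    return d / np.linalg.norm(d)

# Assumed pinhole intrinsics (focal length 800 px, principal point (640, 360)).
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
d = pixel_to_ray(K, (640.0, 360.0))  # principal point maps to the optical axis
```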
First ray direction vectors corresponding to the first vanishing points are acquired under the coordinate system of the first camera. For example, if m vanishing points p_1^l, ..., p_m^l are detected in the first original image, the ray direction vector of the i-th vanishing point in the first camera coordinate system can be calculated as d_i^l = K_l^{-1} p_i^l, where K_l is the internal reference matrix of the first camera after image de-distortion. Similarly, if n vanishing points p_1^r, ..., p_n^r are detected in the second original image, the ray direction vector of the j-th vanishing point in the second camera coordinate system can be calculated as d_j^r = K_r^{-1} p_j^r, where K_r is the internal reference matrix of the second camera after image de-distortion.
Further, the included angle between a first ray direction vector and a second ray direction vector is acquired. For example, given a first ray direction vector d_i^l, a second ray direction vector d_j^r, and the external parameter of the vehicle-mounted cameras, i.e. the rotation R between the first camera and the second camera, the included angle can be expressed as θ_ij = arccos((R d_i^l) · d_j^r / (‖R d_i^l‖ ‖d_j^r‖)). It can be understood that if the two vanishing points are exactly the vanishing points of the same group of parallel lines and the external parameters are accurate, the included angle θ_ij is 0.
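The included-angle computation between a rotated first-camera ray and a second-camera ray can be sketched as follows (an illustrative numpy helper; the function name is an assumption):

```python
import numpy as np

def ray_angle(R, d_l, d_r):
    """Angle between a first-camera ray rotated into the second camera's frame
    by the extrinsic rotation R, and a second-camera ray."""
    a = R @ d_l
    c = (a @ d_r) / (np.linalg.norm(a) * np.linalg.norm(d_r))
    return np.arccos(np.clip(c, -1.0, 1.0))  # clip guards against fp overshoot

# With identity rotation, rays along +z and the (y = z) diagonal are 45° apart.
theta = ray_angle(np.eye(3), np.array([0.0, 0.0, 1.0]), np.array([0.0, 1.0, 1.0]))
```

For a correctly matched vanishing point pair and accurate extrinsics, this angle is close to zero, which is what the residual below exploits.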
According to the included angle θ_ij between the first ray direction vector and the second ray direction vector, matching judgment is performed to obtain vanishing point pairs in which a first vanishing point in the first original image matches a second vanishing point in the second original image, where each vanishing point pair meets a preset condition. For example, the preset condition may concern the magnitude of the included angle between the first ray direction vector and the second ray direction vector, so that only vanishing point pairs meeting the preset condition are retained.
In some embodiments, the preset condition includes that the vanishing point pair only matches each other, and an included angle between a first ray direction vector corresponding to a first vanishing point in the vanishing point pair and a second ray direction vector corresponding to a second vanishing point is smaller than a preset value.
According to the included angle θ_ij between the first ray direction vector and the second ray direction vector, a vanishing point pair in which a first vanishing point in the first original image matches a second vanishing point in the second original image is obtained, where the vanishing point pair meets the preset condition. The preset condition includes that the two vanishing points in the pair match only each other: there is one and only one vanishing point in the first original image, e.g. the first vanishing point p_i^l, and one and only one vanishing point in the second original image, e.g. the second vanishing point p_j^r, such that p_i^l matches only p_j^r. The preset condition further includes that the included angle between the first ray direction vector corresponding to the first vanishing point and the second ray direction vector corresponding to the second vanishing point is smaller than a preset value; for example, with a preset value of 5 degrees, the included angle θ_ij must be smaller than 5 degrees.
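The mutual-matching rule with the angle threshold can be sketched as follows; this is a numpy illustration (the function name and example rays are assumptions), implementing "each vanishing point matches only one counterpart, and their angle is below the preset value".

```python
import numpy as np

def match_vanishing_points(rays_l, rays_r, R, max_angle_deg=5.0):
    """Mutual-nearest-neighbour matching of vanishing-point rays: a pair
    (i, j) is kept only if i and j each pick the other as their closest
    match by angle, and the angle is below the preset threshold."""
    def angle(dl, dr):
        a = R @ dl
        c = (a @ dr) / (np.linalg.norm(a) * np.linalg.norm(dr))
        return np.arccos(np.clip(c, -1.0, 1.0))

    A = np.array([[angle(dl, dr) for dr in rays_r] for dl in rays_l])
    pairs = []
    for i in range(A.shape[0]):
        j = int(np.argmin(A[i]))          # best candidate for i
        if int(np.argmin(A[:, j])) == i and np.degrees(A[i, j]) < max_angle_deg:
            pairs.append((i, j))          # mutual match under the threshold
    return pairs

# With identity extrinsics, only the aligned pair of rays survives.
rays_l = [np.array([0.0, 0.0, 1.0]), np.array([1.0, 0.0, 0.0])]
rays_r = [np.array([0.0, 0.0, 1.0]), np.array([0.0, 1.0, 0.0])]
pairs = match_vanishing_points(rays_l, rays_r, np.eye(3))  # → [(0, 0)]
```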
In some embodiments, the vanishing point residual equation for a vanishing point pair is characterized by the included angle for the vanishing point pair.
A vanishing point pair in which the first vanishing point in the first original image matches the second vanishing point in the second original image is obtained, for example a successfully matched pair formed by the first vanishing point p1 and the second vanishing point p2, so that the vanishing point residual equation corresponding to the vanishing point pair can be characterized by the included angle corresponding to the vanishing point pair, and may be expressed as e_vp = θ(R·d1, d2), i.e. the included angle between the first ray direction vector d1 rotated by the rotation extrinsic R between the first camera and the second camera, and the second ray direction vector d2. It can be understood that the vanishing point residual equation is a residual equation constructed based on the included angle between the first ray direction vector and the second ray direction vector, obtained via the rotation extrinsic R between the first camera and the second camera.
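The vanishing-point residual for one pair can be written as a small function of the rotation extrinsic. This is a sketch of the residual described above, assuming unit-length ray directions; the name `vp_residual` and the radian units are illustrative choices, not taken from the patent.

```python
import numpy as np

def vp_residual(R_21, d1, d2):
    """Vanishing-point residual: the included angle (radians) between the
    first ray direction rotated into the second camera frame by the
    rotation extrinsic R_21, and the second ray direction. Zero when the
    extrinsic perfectly aligns the matched vanishing directions."""
    v = R_21 @ (d1 / np.linalg.norm(d1))
    w = d2 / np.linalg.norm(d2)
    return float(np.arccos(np.clip(np.dot(v, w), -1.0, 1.0)))
```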
In some embodiments, any two cameras include a first camera and a second camera, and the two original images include a first original image captured by the first camera and a second original image captured by the second camera, wherein the first original image and the second original image have portions of the same area.
Specifically, the two original images acquired by the two cameras are a first original image acquired by the first camera and a second original image acquired by the second camera, wherein the first original image and the second original image share a portion of the same area. For example, the first camera and the second camera are front-view vehicle cameras with the same viewing angle but different focal lengths; in this case, the same area shared by the first original image acquired by the first camera and the second original image acquired by the second camera may be the area in front of the vehicle.
Further, feature point detection is performed on the two original images to obtain feature points in the two original images, including: detecting characteristic points of the first original image to obtain first characteristic points in the first original image; detecting the characteristic points of the second original image to obtain second characteristic points in the second original image; performing matching operation on the feature points in the two original images to obtain matched feature point pairs in the two images, wherein the matching operation comprises the following steps: and performing feature point matching based on the same region of the first original image and the second original image to obtain feature point pairs with the first feature point in the first original image matched with the second feature point in the second original image.
For example, a feature extraction and matching algorithm such as SIFT, SURF, or AKAZE is used to detect 2D feature points in an image: feature point detection is performed on the first original image to obtain the first feature points in the first original image, that is, the feature extraction algorithm detects the 2D feature points in the first original image; likewise, feature point detection is performed on the second original image to obtain the second feature points in the second original image, that is, the feature extraction algorithm detects the 2D feature points in the second original image.
And performing feature point matching on the same region of the first original image and the second original image, for example, performing matching operation on the 2D feature points in the first original image and the second original image to obtain a feature point pair of which the first feature point in the first original image is matched with the second feature point in the second original image, namely obtaining a matched feature point pair in the first original image and the second original image, wherein the matched feature point pair can be understood to be formed by different feature points of the same object in the first original image and the second original image.
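The matching step over the shared region can be sketched with a minimal nearest-neighbour descriptor matcher. This is illustrative only: the patent names SIFT/SURF/AKAZE as the detectors, while the toy `match_descriptors` function below and its Lowe's-ratio threshold of 0.8 are assumptions standing in for a full feature-matching library.

```python
import numpy as np

def match_descriptors(desc1, desc2, ratio=0.8):
    """Toy nearest-neighbour matcher with Lowe's ratio test over feature
    descriptors (one row per feature). Requires at least two candidates
    in desc2 so a runner-up distance exists."""
    desc1 = np.asarray(desc1, dtype=float)
    desc2 = np.asarray(desc2, dtype=float)
    matches = []
    for i, d in enumerate(desc1):
        dist = np.linalg.norm(desc2 - d, axis=1)
        order = np.argsort(dist)
        # ratio test: the best match must clearly beat the runner-up,
        # which suppresses ambiguous pairings of repeated structure
        if dist[order[0]] < ratio * dist[order[1]]:
            matches.append((i, int(order[0])))
    return matches
```

Each returned pair (i, j) plays the role of a matched feature point pair: feature i in the first original image and feature j in the second original image observing the same object.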
In some embodiments, determining a feature point residual equation corresponding to a feature point pair based on the feature point pair includes: acquiring a third ray direction vector corresponding to a first characteristic point in the characteristic point pair under a coordinate system of a first camera; acquiring a fourth ray direction vector corresponding to a second characteristic point in the characteristic point pair under a coordinate system of a second camera; and determining a characteristic point residual equation corresponding to the characteristic point pair according to the third ray direction vector and the fourth ray direction vector.
A third ray direction vector corresponding to the first feature point in the feature point pair is acquired under the coordinate system of the first camera. For example, m feature points are detected in the first original image, and the ray direction vector of the i-th feature point p_i = (u_i, v_i, 1)^T in the first original image under the first camera coordinate system can be calculated as d_i = K^{-1} p_i, where K is the intrinsic (internal reference) matrix after image de-distortion. A fourth ray direction vector corresponding to the second feature point in the feature point pair is acquired under the coordinate system of the second camera. For example, n feature points are detected in the second original image, and the ray direction vector of the j-th feature point q_j = (u_j, v_j, 1)^T in the second original image under the second camera coordinate system can be calculated as d_j = K^{-1} q_j, where K is likewise the intrinsic matrix after image de-distortion.
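The back-projection d = K^{-1} [u, v, 1]^T described above can be written directly. A minimal sketch, assuming an undistorted pixel and a standard pinhole intrinsic matrix; the function name `pixel_ray` and the unit-normalisation are illustrative choices.

```python
import numpy as np

def pixel_ray(K, uv):
    """Ray direction of an undistorted pixel in its camera frame:
    d = K^{-1} [u, v, 1]^T, normalised to unit length."""
    p = np.array([uv[0], uv[1], 1.0])
    d = np.linalg.inv(K) @ p
    return d / np.linalg.norm(d)
```

For example, with focal length 100 and principal point (50, 50), the principal-point pixel back-projects to the optical axis (0, 0, 1).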
The characteristic point residual equation corresponding to the feature point pair is determined according to the third ray direction vector and the fourth ray direction vector. For example, considering the rotation extrinsic R of the vehicle-mounted camera, the characteristic point residual equation can be expressed using the epipolar constraint as e_fp = d_j^T [t]_x R d_i, where [t]_x R is derived from the transformation matrix T between the first camera coordinate system and the second camera coordinate system.
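The epipolar-constraint residual for one feature point pair can be sketched with the essential matrix E = [t]_x R. This is an illustrative sketch, not the patent's exact formulation: `skew` and `epipolar_residual` are assumed helper names, and the translation t is taken as given from the transformation between the two camera frames.

```python
import numpy as np

def skew(t):
    """Skew-symmetric matrix [t]_x, so that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def epipolar_residual(R_21, t_21, d1, d2):
    """Epipolar-constraint residual d2^T E d1 with E = [t]_x R; it is
    zero when the two observation rays and the baseline are coplanar,
    i.e. when the two rays can intersect at one 3D point."""
    E = skew(t_21) @ R_21
    return float(d2 @ E @ d1)
```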
Further, an optimization equation set formed by the vanishing point residual equation corresponding to the vanishing point pair and the characteristic point residual equation corresponding to the feature point pair is solved by optimization to obtain the external parameters of any two cameras. Understandably, the vanishing point constraint and the epipolar constraint are combined to construct a nonlinear optimization equation set from which the rotation extrinsics of any two cameras are obtained, so that on-line calibration of the vehicle-mounted cameras can be realized without depending on special equipment and without manual intervention.
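The combined cost that such a nonlinear optimization would minimise can be sketched as the sum of squared residuals from both constraints, with the rotation extrinsic parameterised by an axis-angle vector. This is a sketch of the cost only, not of the solver: in practice the system would be handed to a nonlinear least-squares solver (e.g. a Levenberg-Marquardt implementation), and all ray directions are assumed unit length.

```python
import numpy as np

def rotvec_to_matrix(w):
    """Rodrigues' formula: axis-angle vector w -> rotation matrix."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    k = w / theta
    K = np.array([[0.0, -k[2], k[1]], [k[2], 0.0, -k[0]], [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def joint_cost(w, t, vp_pairs, fp_pairs):
    """Sum of squared vanishing-point and epipolar residuals for the
    rotation extrinsic parameterised by the axis-angle vector w.
    vp_pairs / fp_pairs are lists of (d1, d2) unit ray directions."""
    R = rotvec_to_matrix(w)
    tx = np.array([[0.0, -t[2], t[1]], [t[2], 0.0, -t[0]], [-t[1], t[0], 0.0]])
    c = 0.0
    for d1, d2 in vp_pairs:  # vanishing-point (included-angle) constraint
        c += np.arccos(np.clip(np.dot(R @ d1, d2), -1.0, 1.0)) ** 2
    for d1, d2 in fp_pairs:  # epipolar constraint d2^T [t]_x R d1
        c += float(d2 @ (tx @ R) @ d1) ** 2
    return c
```

At the true rotation the cost vanishes for noise-free matches, which is what the optimizer exploits when recovering the extrinsic.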
Referring to fig. 2, fig. 2 is a flow chart of an external parameter self-calibration method of a vehicle-mounted camera according to another embodiment of the present application; the method can be applied to an in-vehicle device with computing capabilities. It should be noted that, provided substantially the same results are obtained, the method of the present application is not limited to the flow sequence shown in fig. 2.
In some possible implementations, the method may be implemented by a processor invoking computer readable instructions stored in a memory, as shown in fig. 2, and may include the steps of:
s21: at least two original images acquired by at least two cameras on the vehicle are acquired correspondingly.
At least two original images correspondingly acquired by at least two cameras on the vehicle are obtained. The cameras on the vehicle may be a rear-view camera, a front-view camera, a side-view camera, a surround-view camera, and the like, or may be long-focus and short-focus cameras with the same viewing angle. For example, the at least two cameras may be camera l, camera r, and camera f, where at the same time t camera l captures an original image A, camera r captures an original image B, and camera f captures an original image C.
S22: and carrying out a calibration process by using each two original images in the at least two original images to obtain external parameters of the two cameras corresponding to the two original images.
The calibration process is performed using every two original images of the at least two original images. For example, the calibration process may be performed on original image A and original image B to obtain the extrinsics of the two cameras corresponding to those two images, that is, the extrinsics between camera l and camera r, where the calibration process is the external parameter self-calibration method of the vehicle-mounted camera described above and its details are not repeated. By performing the calibration optimization on every two of the at least two original images in turn, the extrinsics corresponding to the at least two cameras on the vehicle can be obtained.
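Iterating the two-camera calibration over all camera pairs can be sketched with a simple pairwise loop. The `calibrate_pair` callable below stands in for the full self-calibration method described above and is an assumption for illustration.

```python
from itertools import combinations

def calibrate_all(images_by_camera, calibrate_pair):
    """Run the two-camera calibration process over every pair of cameras.
    images_by_camera maps a camera id (e.g. 'l', 'r', 'f') to its image;
    calibrate_pair(img_a, img_b) stands in for the pairwise method above
    and returns the extrinsic between the two corresponding cameras."""
    extrinsics = {}
    for (cam_a, img_a), (cam_b, img_b) in combinations(sorted(images_by_camera.items()), 2):
        extrinsics[(cam_a, cam_b)] = calibrate_pair(img_a, img_b)
    return extrinsics
```

With three cameras l, r, f this yields the three pairwise extrinsics (f, l), (f, r), and (l, r).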
It will be appreciated by those skilled in the art that in the above-described method of the specific embodiments, the written order of steps is not meant to imply a strict order of execution but rather should be construed according to the function and possibly inherent logic of the steps.
Referring to fig. 3, fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device 30 comprises a memory 31 and a processor 32 coupled to each other, the processor 32 being configured to execute program instructions stored in the memory 31 to implement the steps of the above-described embodiments of the external parameter self-calibration method of the vehicle-mounted camera. In one particular implementation scenario, the electronic device 30 may include, but is not limited to, a microcomputer, a server, and the like, which are not limited herein.
Specifically, the processor 32 is configured to control itself and the memory 31 to implement the steps of the external parameter self-calibration method embodiments of the vehicle-mounted camera described above. The processor 32 may also be referred to as a CPU (Central Processing Unit), and may be an integrated circuit chip with signal processing capabilities. The processor 32 may also be a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. In addition, the processor 32 may be implemented jointly by a plurality of integrated circuit chips.
Referring to fig. 4, fig. 4 is a schematic structural diagram of a non-volatile computer readable storage medium according to an embodiment of the present application. The non-transitory computer readable storage medium 40 is used to store a computer program 401, which, when executed by a processor, for example by the processor 32 in the above-described embodiment of fig. 3, implements the steps of the above-described embodiments of the external parameter self-calibration method of the vehicle-mounted camera.
The foregoing description of the various embodiments focuses on the differences between them; for the same or similar parts, the embodiments may be referred to one another, and the details are not repeated herein for brevity.
In the several embodiments provided in this application, it should be understood that the disclosed methods and related devices may be implemented in other ways. For example, the device embodiments described above are merely illustrative: the division of modules or units is merely a logical functional division, and there may be other divisions in actual implementation; units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling or direct coupling or communication connection between the illustrated or discussed elements may be indirect coupling or communication connection through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
Those skilled in the art will readily appreciate that many modifications and variations can be made to the devices and methods while remaining within the teachings of the present application. Accordingly, the scope of protection shall be subject to the appended claims.

Claims (9)

1. The external parameter self-calibration method of the vehicle-mounted camera is characterized by comprising the following steps of:
acquiring two original images acquired by any two cameras on a vehicle correspondingly;
detecting vanishing points of the two original images to obtain vanishing points in the two original images;
performing matching operation on vanishing points in the two original images to obtain a matched vanishing point pair in the two original images, and determining a vanishing point residual equation corresponding to the vanishing point pair based on the vanishing point pair, wherein the vanishing point pair is a vanishing point which is matched with each other in the two original images and is determined according to an included angle between ray direction vectors, and the vanishing point residual equation is characterized by the included angle corresponding to the vanishing point pair;
detecting the characteristic points of the two original images to obtain the characteristic points in the two original images;
performing matching operation on the characteristic points in the two original images to obtain matched characteristic point pairs in the two original images, and determining a characteristic point residual equation corresponding to the characteristic point pairs based on the characteristic point pairs, wherein the characteristic point pairs are characteristic points with the characteristic points matched with each other in the same area of the two original images, and the characteristic point residual equation is determined according to ray direction vectors of the characteristic point pairs;
and carrying out optimization solution on an optimization equation set formed by the vanishing point residual equation corresponding to the vanishing point pair and the characteristic point residual equation corresponding to the characteristic point pair to obtain the external parameters of any two cameras.
2. The method of claim 1, wherein the arbitrary two cameras comprise a first camera and a second camera, the two original images comprising a first original image acquired by the first camera and a second original image acquired by the second camera, wherein the first original image and the second original image have portions of the same area;
detecting vanishing points of the two original images to obtain vanishing points in the two original images, including:
detecting vanishing points of the first original image to obtain first vanishing points in the first original image;
and detecting vanishing points of the second original image to obtain second vanishing points in the second original image.
3. The method according to claim 2, wherein performing a matching operation on vanishing points in the two original images to obtain a matched pair of vanishing points in the two original images comprises:
acquiring a first ray direction vector corresponding to a first vanishing point in the first original image under a coordinate system of the first camera;
acquiring a second ray direction vector corresponding to a second vanishing point in the second original image under the coordinate system of the second camera;
acquiring a corresponding included angle between the first ray direction vector and the second ray direction vector;
and obtaining a vanishing point pair matched with a first vanishing point in the first original image and a second vanishing point in the second original image according to the included angle between the first ray direction vector and the second ray direction vector, wherein the vanishing point pair meets a preset condition.
4. A method according to claim 3, wherein the predetermined condition comprises that the vanishing point pairs only match each other and that an angle between a first ray direction vector corresponding to a first vanishing point of the vanishing point pairs and a second ray direction vector corresponding to a second vanishing point is smaller than a predetermined value.
5. The method of claim 1, wherein the arbitrary two cameras comprise a first camera and a second camera, the two original images comprising a first original image acquired by the first camera and a second original image acquired by the second camera, wherein the first original image and the second original image have portions of the same area;
detecting the characteristic points of the two original images to obtain the characteristic points in the two original images, wherein the characteristic points comprise:
detecting the characteristic points of the first original image to obtain first characteristic points in the first original image;
detecting the characteristic points of the second original image to obtain second characteristic points in the second original image;
performing matching operation on the feature points in the two original images to obtain matched feature point pairs in the two images, wherein the matching operation comprises the following steps:
and performing feature point matching based on the same region of the first original image and the second original image to obtain a feature point pair of which the first feature point in the first original image is matched with the second feature point in the second original image.
6. The method of claim 5, wherein determining a feature point residual equation corresponding to the feature point pair based on the feature point pair comprises:
acquiring a third ray direction vector corresponding to a first characteristic point in the characteristic point pair under a coordinate system of the first camera;
acquiring a fourth ray direction vector corresponding to a second characteristic point in the characteristic point pair under the coordinate system of the second camera;
and determining a characteristic point residual equation corresponding to the characteristic point pair according to the third ray direction vector and the fourth ray direction vector.
7. The external parameter self-calibration method of the vehicle-mounted camera is characterized by comprising the following steps of:
acquiring at least two original images acquired by at least two cameras on a vehicle correspondingly;
performing a calibration process by using each two original images in the at least two original images to obtain external parameters of two cameras corresponding to the two original images;
wherein the calibration procedure comprises the external parameter self-calibration method of the vehicle-mounted camera according to any one of claims 1 to 6.
8. An electronic device comprising a memory and a processor coupled to each other, the processor configured to execute program instructions stored in the memory to implement the method of self-calibrating external parameters of the onboard camera of any of claims 1-7.
9. A non-transitory computer readable storage medium having program instructions stored thereon, wherein the program instructions when executed by a processor implement the method of self-calibration of an on-board camera according to any of claims 1 to 7.
CN202311560281.1A 2023-11-22 2023-11-22 External parameter self-calibration method of vehicle-mounted camera, electronic equipment and storage medium Active CN117315048B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311560281.1A CN117315048B (en) 2023-11-22 2023-11-22 External parameter self-calibration method of vehicle-mounted camera, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN117315048A CN117315048A (en) 2023-12-29
CN117315048B true CN117315048B (en) 2024-04-12

Family

ID=89281311

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311560281.1A Active CN117315048B (en) 2023-11-22 2023-11-22 External parameter self-calibration method of vehicle-mounted camera, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117315048B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110310335A (en) * 2018-03-27 2019-10-08 杭州海康威视数字技术股份有限公司 A kind of camera angle determines method, apparatus, equipment and system
CN111524192A (en) * 2020-04-20 2020-08-11 北京百度网讯科技有限公司 Calibration method, device and system for external parameters of vehicle-mounted camera and storage medium
CN111696160A (en) * 2020-06-22 2020-09-22 深圳市中天安驰有限责任公司 Automatic calibration method and device for vehicle-mounted camera and readable storage medium
CN112348752A (en) * 2020-10-28 2021-02-09 武汉极目智能技术有限公司 Lane line vanishing point compensation method and device based on parallel constraint
CN114549654A (en) * 2022-01-19 2022-05-27 福思(杭州)智能科技有限公司 External parameter calibration method, device, equipment and storage medium for vehicle-mounted camera
CN115578468A (en) * 2022-09-22 2023-01-06 深圳元戎启行科技有限公司 External parameter calibration method and device, computer equipment and storage medium
CN116664691A (en) * 2023-05-25 2023-08-29 嬴彻星创智能科技(上海)有限公司 External parameter calibration method, device, equipment, vehicle and medium of vehicle-mounted camera


Also Published As

Publication number Publication date
CN117315048A (en) 2023-12-29

Similar Documents

Publication Publication Date Title
EP3678096B1 (en) Method for calculating a tow hitch position
JP4943034B2 (en) Stereo image processing device
CN108229406B (en) Lane line detection method, device and terminal
CN109741241B (en) Fisheye image processing method, device, equipment and storage medium
WO2023016271A1 (en) Attitude determining method, electronic device, and readable storage medium
CN112489136B (en) Calibration method, position determination device, electronic equipment and storage medium
EP3690799A1 (en) Vehicle lane marking and other object detection using side fisheye cameras and three-fold de-warping
CN114067001B (en) Vehicle-mounted camera angle calibration method, terminal and storage medium
CN115578468A (en) External parameter calibration method and device, computer equipment and storage medium
CN110673607A (en) Feature point extraction method and device in dynamic scene and terminal equipment
CN117315048B (en) External parameter self-calibration method of vehicle-mounted camera, electronic equipment and storage medium
CN111105465B (en) Camera device calibration method, device, system electronic equipment and storage medium
CN110827337B (en) Method and device for determining posture of vehicle-mounted camera and electronic equipment
EP3629292A1 (en) Reference point selection for extrinsic parameter calibration
CN112419399A (en) Image ranging method, device, equipment and storage medium
CN111656404B (en) Image processing method, system and movable platform
JP7064400B2 (en) Object detection device
JP2021051348A (en) Object distance estimation apparatus and object distance estimation method
JPWO2020122143A1 (en) Measurement system, measurement method, and measurement program
CN111462244A (en) On-line calibration method, system and device for vehicle-mounted all-round-looking system
CN117115242B (en) Identification method of mark point, computer storage medium and terminal equipment
KR101906999B1 (en) Apparatus and method for calculating distance between vehicle and object of adjacent lane
CN117437295A (en) On-line calibration method for external parameters of vehicle-mounted camera, electronic equipment and storage medium
CN112099031B (en) Vehicle distance measuring method and device
CN115147803A (en) Object detection method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant