CN114049404B - Method and device for calibrating external parameters of an in-vehicle camera - Google Patents
Method and device for calibrating external parameters of an in-vehicle camera
- Publication number
- CN114049404B CN114049404B CN202210029319.1A CN202210029319A CN114049404B CN 114049404 B CN114049404 B CN 114049404B CN 202210029319 A CN202210029319 A CN 202210029319A CN 114049404 B CN114049404 B CN 114049404B
- Authority
- CN
- China
- Prior art keywords
- matching
- image
- video
- matching points
- camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/20—Image enhancement or restoration using local operators
- G06T5/30—Erosion or dilatation, e.g. thinning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20036—Morphological image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Abstract
The invention discloses a method and a device for calibrating the external parameters of an in-vehicle camera. According to the method, a template image is obtained in advance, the image acquired after the position of the camera changes is matched with the template image, and the rotation matrix and the translation vector of the camera, namely the camera external parameters, are automatically estimated according to the matching result, so that the camera external parameters can be estimated automatically after the camera position changes, and the efficiency of camera external parameter calibration is improved.
Description
Technical Field
The invention relates to the technical field of computer vision, and in particular to a method and a device for calibrating the external parameters of an in-vehicle camera.
Background
At present, a commonly used camera external parameter calibration method is as follows: the camera is first fixed at a position, a calibration plate is placed at a specific position so that it fills the camera picture and sits as close to the center of the picture as possible, the camera is then controlled to take a picture, and a purpose-built external parameter calibration program calculates the external parameters of the camera at its current position. When the position of the camera changes, whether artificially or not, actively or passively, the previously calibrated external parameters no longer hold and the camera must be re-calibrated to maintain accuracy; however, because this calibration method is complex to operate and time-consuming, re-calibration is severely limited in practice.
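The conventional one-shot procedure described above can be sketched as follows. This is only an illustrative outline, assuming a standard chessboard calibration plate, previously calibrated intrinsics K and distortion coefficients dist, and illustrative board dimensions; it is not the calibration program of the invention.

```python
import cv2
import numpy as np

def calibrate_extrinsics(image_path, K, dist, pattern=(9, 6), square=0.025):
    """One-shot extrinsic calibration from a single picture of a chessboard."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(img, pattern)
    if not found:
        raise RuntimeError("calibration board not fully visible")
    corners = cv2.cornerSubPix(
        img, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    # 3D corner coordinates in the calibration-board coordinate system
    obj = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    obj[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square
    _, rvec, tvec = cv2.solvePnP(obj, corners, K, dist)
    R, _ = cv2.Rodrigues(rvec)      # rotation matrix R
    return R, tvec                  # external parameters [R | T]
```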
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a method and a device for calibrating the external parameters of an in-vehicle camera, which can automatically estimate the external parameters of the camera after the position of the camera has changed and improve the efficiency of camera external parameter calibration.
In order to solve the above technical problem, in a first aspect, an embodiment of the present invention provides a method for calibrating the external parameters of an in-vehicle camera, including:
installing a camera on a pipe column of a steering wheel, installing a calibration plate above a driving position, and enabling the camera to completely shoot the calibration plate and a preset matching area when the steering wheel moves to a first position, a second position, a third position and a fourth position;
when the steering wheel moves to the first position, acquiring a reference image acquired by the camera, and intercepting an image of the matching area from the reference image as a template image;
acquiring a first video acquired by the camera during the movement of the steering wheel from the first position to the second position, acquiring a second video acquired by the camera during the movement of the steering wheel from the third position to the fourth position, and acquiring a third video acquired by the camera during the S-shaped movement of the steering wheel within the first position, the second position, the third position and the fourth position;
and respectively matching each image in the first video, the second video and the third video with the template image to obtain a plurality of first matching points, a plurality of second matching points and a plurality of third matching points, calibrating a rotation matrix of the camera according to all the first matching points and all the second matching points, and calibrating a translation vector of the camera according to all the first matching points, all the second matching points and all the third matching points.
Further, the matching each image in the first video, the second video, and the third video with the template image to obtain a plurality of first matching points, a plurality of second matching points, and a plurality of third matching points, calibrating a rotation matrix of the camera according to all the first matching points and all the second matching points, and calibrating a translation vector of the camera according to all the first matching points, all the second matching points, and all the third matching points specifically includes:
respectively determining the optimal matching proportion of the template image and each image in the first video, the second video and the third video, and respectively matching each image in the first video, the second video and the third video with the template image which is scaled by the corresponding optimal matching proportion to obtain a plurality of first matching points, a plurality of second matching points and a plurality of third matching points;
calibrating the rotation matrix of the camera according to all the first matching points, all the second matching points and the corresponding optimal matching proportion, and calibrating the translation vector of the camera according to all the first matching points, all the second matching points, all the third matching points and the corresponding optimal matching proportion.
Further, the step of capturing the image of the matching region from the reference image as a template image specifically includes:
and carrying out image preprocessing on the reference image to obtain an intermediate image, and intercepting the image of the matching area from the intermediate image as the template image.
Further, the matching each image in the first video, the second video, and the third video with the template image to obtain a plurality of first matching points, a plurality of second matching points, and a plurality of third matching points specifically is:
respectively carrying out image preprocessing on each image in the first video, the second video and the third video to obtain a plurality of first images to be matched, a plurality of second images to be matched and a plurality of third images to be matched, and respectively matching each first image to be matched, each second image to be matched and each third image to be matched with the template image to obtain a plurality of first matching points, a plurality of second matching points and a plurality of third matching points.
Further, the image preprocessing includes edge detection, morphological erosion, and dilation.
In a second aspect, an embodiment of the present invention provides a device for calibrating the external parameters of an in-vehicle camera, including:
the matching area confirming module is used for installing a camera on a pipe column of a steering wheel, installing a calibration plate above a driving position and enabling the camera to completely shoot the calibration plate and a preset matching area when the steering wheel moves to a first position, a second position, a third position and a fourth position;
the template image acquisition module is used for acquiring a reference image acquired by the camera when the steering wheel moves to the first position, and intercepting an image of the matching area from the reference image as a template image;
a multi-directional video acquisition module, configured to acquire a first video captured by the camera during movement of the steering wheel from the first position to the second position, acquire a second video captured by the camera during movement of the steering wheel from the third position to the fourth position, and acquire a third video captured by the camera during S-shaped movement of the steering wheel within the first position, the second position, the third position, and the fourth position;
and the camera external parameter calibration module is used for respectively matching each image in the first video, the second video and the third video with the template image to obtain a plurality of first matching points, a plurality of second matching points and a plurality of third matching points, calibrating a rotation matrix of the camera according to all the first matching points and all the second matching points, and calibrating a translation vector of the camera according to all the first matching points, all the second matching points and all the third matching points.
Further, the matching each image in the first video, the second video, and the third video with the template image to obtain a plurality of first matching points, a plurality of second matching points, and a plurality of third matching points, calibrating a rotation matrix of the camera according to all the first matching points and all the second matching points, and calibrating a translation vector of the camera according to all the first matching points, all the second matching points, and all the third matching points specifically includes:
respectively determining the optimal matching proportion of the template image and each image in the first video, the second video and the third video, and respectively matching each image in the first video, the second video and the third video with the template image which is scaled by the corresponding optimal matching proportion to obtain a plurality of first matching points, a plurality of second matching points and a plurality of third matching points;
calibrating the rotation matrix of the camera according to all the first matching points, all the second matching points and the corresponding optimal matching proportion, and calibrating the translation vector of the camera according to all the first matching points, all the second matching points, all the third matching points and the corresponding optimal matching proportion.
Further, the step of capturing the image of the matching region from the reference image as a template image specifically includes:
and carrying out image preprocessing on the reference image to obtain an intermediate image, and intercepting the image of the matching area from the intermediate image as the template image.
Further, the matching each image in the first video, the second video, and the third video with the template image to obtain a plurality of first matching points, a plurality of second matching points, and a plurality of third matching points specifically is:
respectively carrying out image preprocessing on each image in the first video, the second video and the third video to obtain a plurality of first images to be matched, a plurality of second images to be matched and a plurality of third images to be matched, and respectively matching each first image to be matched, each second image to be matched and each third image to be matched with the template image to obtain a plurality of first matching points, a plurality of second matching points and a plurality of third matching points.
Further, the image preprocessing includes edge detection, morphological erosion, and dilation.
The embodiment of the invention has the following beneficial effects:
the camera is arranged on a pipe column of a steering wheel, a calibration plate is arranged above a driving position, the camera can completely shoot the calibration plate and a preset matching area when the steering wheel moves to a first position, a second position, a third position and a fourth position, a reference image collected by the camera is obtained when the steering wheel moves to the first position, an image of the matching area is intercepted from the reference image and is used as a template image, a first video collected by the camera is obtained in the process that the steering wheel moves from the first position to the second position, a second video collected by the camera is obtained in the process that the steering wheel moves from the third position to the fourth position, a third video collected by the camera is obtained in the process that the steering wheel moves in an S shape in the first position, the second position, the third position and the fourth position, and each image of the first video, the second video and the third video is matched with the template image respectively, and obtaining a plurality of first matching points, a plurality of second matching points and a plurality of third matching points, calibrating a rotation matrix of the camera according to all the first matching points and all the second matching points, and calibrating translation vectors of the camera according to all the first matching points, all the second matching points and all the third matching points to realize the external reference calibration of the vehicle inside the vehicle. Compared with the prior art, the embodiment of the invention matches the acquired image of the camera after the position of the camera is changed with the template image by acquiring the template image in advance, and automatically estimates the rotation matrix and the translation vector of the camera, namely the camera external parameter according to the matching result, so that the camera external parameter can be automatically estimated after the position of the camera is changed, and the camera external parameter calibration efficiency is improved.
Drawings
Fig. 1 is a schematic flow chart of a method for calibrating the external parameters of an in-vehicle camera according to a first embodiment of the present invention;
fig. 2 is a schematic view of a camera mounting position exemplified in the first embodiment of the present invention;
fig. 3 is a schematic view of an exemplary moving position of a steering wheel in the first embodiment of the present invention;
FIG. 4 is a schematic illustration of the formation of a ChArUco Board as exemplified in the first embodiment of the present invention;
fig. 5 is a schematic structural diagram of a device for calibrating the external parameters of an in-vehicle camera according to a second embodiment of the present invention.
Detailed Description
The technical solutions in the present invention will be described clearly and completely with reference to the accompanying drawings, and it is obvious that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that, the step numbers in the text are only for convenience of explanation of the specific embodiments, and do not serve to limit the execution sequence of the steps. The method provided by the embodiment can be executed by the relevant terminal device, and the following description takes a processor as an execution subject as an example.
As shown in FIG. 1, a first embodiment provides a method for calibrating the external parameters of an in-vehicle camera, comprising steps S1-S4:
s1, mounting the camera on a column of a steering wheel, and mounting a calibration plate above a driving position, so that the camera can completely shoot the calibration plate and a preset matching area when the steering wheel moves to a first position, a second position, a third position and a fourth position;
s2, when the steering wheel moves to the first position, acquiring a reference image acquired by the camera, and intercepting an image of the matching area from the reference image as a template image;
s3, acquiring a first video acquired by a camera in the process that the steering wheel moves from the first position to the second position, acquiring a second video acquired by the camera in the process that the steering wheel moves from the third position to the fourth position, and acquiring a third video acquired by the camera in the process that the steering wheel moves in an S shape in the first position, the second position, the third position and the fourth position;
s4, matching each image in the first video, the second video and the third video with the template image to obtain a plurality of first matching points, a plurality of second matching points and a plurality of third matching points, calibrating a rotation matrix of the camera according to all the first matching points and all the second matching points, and calibrating a translation vector of the camera according to all the first matching points, all the second matching points and all the third matching points.
It should be noted that the camera external parameters consist of a rotation matrix R and a translation vector T, generally written as [R | T]; this matrix determines the pose of the camera.
As an example, in step S1, a camera is mounted on the column of the steering wheel at the mounting position shown in fig. 2. When the position of the steering wheel is adjusted, the position of the camera changes: when the steering wheel telescopes back and forth about the point O as a fulcrum and moves between the first position A and the second position B, or between the third position C and the fourth position D, the position of the camera changes; when the steering wheel rotates up and down about the point O as a fulcrum and moves between the first position A and the fourth position D, or between the second position B and the third position C, the position of the camera also changes, as shown in fig. 3.
The calibration plate is arranged above the driving position, so that the camera can completely and clearly shoot the calibration plate and a preset matching area when the steering wheel moves to the first position A, the second position B, the third position C and the fourth position D; for example, an area of the driver's side window is preset as the matching area.
It is understood that a calibration plate is a flat plate carrying a pattern array with a fixed pitch. In computer vision, the calibration plate, together with a calibration algorithm, is used to determine the conversion relation between physical size and pixels, to calculate camera parameters and distortion coefficients, and so on. The most common calibration board is the chessboard-pattern checkerboard.
In a preferred embodiment of this example, the calibration plate is a ChArUco Board, a checkerboard calibration plate with ArUco markers. A schematic of the formation of the ChArUco Board is shown in FIG. 4.
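A ChArUco board like the one in FIG. 4 can be generated with the OpenCV aruco module (opencv-contrib-python); the dictionary, the 5x7 layout and the square/marker sizes below are illustrative assumptions rather than values taken from the patent, and the calls shown are the pre-4.7 OpenCV interface (newer versions use cv2.aruco.CharucoBoard and generateImage).

```python
import cv2

aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
# 5 x 7 squares, 4 cm chessboard squares containing 2 cm ArUco markers
board = cv2.aruco.CharucoBoard_create(5, 7, 0.04, 0.02, aruco_dict)
cv2.imwrite("charuco_board.png", board.draw((500, 700)))
```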
In step S2, the steering wheel is adjusted to move to the first position A, the camera is controlled to shoot the calibration board and the matching area, the reference image collected by the camera is obtained and recorded as I_a, and the image of the matching area cut from the reference image I_a is taken as the template image, recorded as I_t.
In step S3, the calibration board is kept stationary. The steering wheel is adjusted to move from the first position A to the second position B, the camera is controlled to record the calibration plate and the matching area during the movement of the steering wheel, and the first video collected by the camera is obtained and recorded as V_ab. The steering wheel is adjusted to move from the third position C to the fourth position D, the camera is controlled to record the calibration plate and the matching area during the movement, and the second video collected by the camera is obtained and recorded as V_cd. The steering wheel is then adjusted to move in an S-shape within the first position A, the second position B, the third position C and the fourth position D; for example, taking the bisection point between the first position A and the second position B as 0.5AB and, similarly, the bisection point between the fourth position D and the third position C as 0.5DC, the steering wheel moves in an S-shape along the path A → D → 0.5AB → 0.5DC → B → C (to obtain a better effect, AB and DC can be divided more finely and the S-shaped movement extended accordingly). The camera is controlled to record the calibration plate and the matching area during this movement, and the third video collected by the camera is obtained and recorded as V_s.
In step S4, each image in the first video V_ab is matched with the template image I_t to obtain, for each image, the location point that best matches the template image, i.e. the first matching points, denoted [M_ab1(x_ab1, y_ab1), M_ab2(x_ab2, y_ab2), …, M_abm(x_abm, y_abm)]; each image in the second video V_cd is matched with the template image I_t to obtain the second matching points, denoted [M_cd1(x_cd1, y_cd1), M_cd2(x_cd2, y_cd2), …, M_cdn(x_cdn, y_cdn)]; and each image in the third video V_s is matched with the template image I_t to obtain the third matching points, denoted [M_s1(x_s1, y_s1), M_s2(x_s2, y_s2), …, M_sp(x_sp, y_sp)].
It is understood that template matching is a method of pattern recognition in image processing, where a template is a known small image and template matching is a search for objects in a large image.
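A minimal sketch of such a template search with OpenCV is shown below; the use of normalized cross-correlation (TM_CCOEFF_NORMED) and grayscale inputs is an assumption, not something specified by the patent.

```python
import cv2

def match_template(frame_gray, template_gray):
    """Return the best-match location M(x, y) of the template in one frame."""
    result = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc, max_val   # top-left (x, y) of the best match and its score
```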
The camera external parameters can be represented by a 3x3 rotation matrix R and a translation vector T(x_t, y_t, z_t). Instead of a rotation matrix, the rotation relation can also be represented by a rotation vector R_v or a quaternion q(w_q, x_q, y_q, z_q); the three representations can be converted into one another.
The pose of the camera, i.e. its external parameters, is estimated from the images in the video using OpenCV, which yields the rotation vector R_v and the translation vector T from the camera to the coordinate system of the calibration plate. For ease of calculation, a quaternion q is used to represent the rotation relation in the camera external parameters.
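A possible OpenCV pipeline for this pose estimation is sketched below, assuming the ChArUco board of the preferred embodiment, calibrated intrinsics K and dist, the aruco module from opencv-contrib-python, and scipy for the rotation-vector-to-quaternion conversion; none of these specific calls are prescribed by the patent.

```python
import cv2
from scipy.spatial.transform import Rotation

def estimate_pose(frame_gray, board, aruco_dict, K, dist):
    """Estimate the camera pose relative to the ChArUco board for one frame."""
    corners, ids, _ = cv2.aruco.detectMarkers(frame_gray, aruco_dict)
    if ids is None or len(ids) == 0:
        return None
    _, ch_corners, ch_ids = cv2.aruco.interpolateCornersCharuco(
        corners, ids, frame_gray, board)
    ok, rvec, tvec = cv2.aruco.estimatePoseCharucoBoard(
        ch_corners, ch_ids, board, K, dist, None, None)
    if not ok:
        return None
    q = Rotation.from_rotvec(rvec.ravel()).as_quat()   # quaternion (x, y, z, w)
    return q, tvec.ravel()                             # rotation q and translation T
```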
Considering that, when the steering wheel telescopes back and forth about the point O as a fulcrum and moves between the first position A and the second position B, or between the third position C and the fourth position D, the rotation relation between the camera and the calibration plate is unchanged and only the translation changes, the rotation relation q_t can be obtained from the quaternion q_0 at the first position A and the second position B and the quaternion q_1 at the third position C and the fourth position D by a Slerp calculation:
q_t = Slerp(q_0, q_1, t)   (1)
wherein t is a rational number and 0 ≤ t ≤ 1.
It will be appreciated that Slerp is a spherical linear interpolation, a linear interpolation operation of quaternions, primarily used to smooth differences between two quaternions representing rotations.
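A minimal sketch of the Slerp operation of formula (1), operating directly on unit quaternions stored as numpy arrays, is given below; in practice a library routine such as scipy.spatial.transform.Slerp could be used instead.

```python
import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation of unit quaternions, q_t = Slerp(q_0, q_1, t)."""
    q0, q1 = np.asarray(q0, float), np.asarray(q1, float)
    dot = np.dot(q0, q1)
    if dot < 0.0:                      # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:                   # nearly identical: fall back to normalized lerp
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)
```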
The rotation matrix R of the camera is calibrated according to all the first matching points [M_ab1(x_ab1, y_ab1), M_ab2(x_ab2, y_ab2), …, M_abm(x_abm, y_abm)] and all the second matching points [M_cd1(x_cd1, y_cd1), M_cd2(x_cd2, y_cd2), …, M_cdn(x_cdn, y_cdn)].
Specifically, the matching point M(x, y) and t satisfy:
where J is a 3x1 matrix.
t = 0 during the movement of the steering wheel from the first position A to the second position B, and t = 1 during the movement of the steering wheel from the third position C to the fourth position D. For the first video V_ab and the second video V_cd, each image is matched with the template image I_t after image preprocessing to obtain a plurality of first matching points [M_ab1, M_ab2, …, M_abm] and a plurality of second matching points [M_cd1, M_cd2, …, M_cdn], and then:
the following can be obtained:
the t required in formula (1) is determined by substituting formula (4) into formula (2), and q is known0、q1Substituting the obtained t into formula (1), namely Slerp, to obtain a quaternion q representing a rotation relationtThus, the rotation relation q can be obtainedt。
The translation vector T of the camera is calibrated according to all the first matching points [M_ab1(x_ab1, y_ab1), M_ab2(x_ab2, y_ab2), …, M_abm(x_abm, y_abm)], all the second matching points [M_cd1(x_cd1, y_cd1), M_cd2(x_cd2, y_cd2), …, M_cdn(x_cdn, y_cdn)] and all the third matching points [M_s1(x_s1, y_s1), M_s2(x_s2, y_s2), …, M_sp(x_sp, y_sp)].
Specifically, the matching point M (x, y) and the translation vector T satisfy:
where L is a 3x3 matrix.
Each image in the first video V_ab, the second video V_cd and the third video V_s has a corresponding matching point M and translation vector T, so a series of equations can be obtained:
the following can be obtained:
the translation vector T can be obtained by bringing L into formula (5).
From the calculated rotation relation q_t and translation vector T, the external parameters of the camera are calibrated.
According to the embodiment, the template image is obtained in advance, the image acquired after the position of the camera is changed is matched with the template image, and the rotation matrix and the translation vector of the camera, namely the camera external parameter, are automatically estimated according to the matching result, so that the camera external parameter can be automatically estimated after the position of the camera is changed, and the camera external parameter calibration efficiency is improved.
In a preferred embodiment, the matching each image of the first video, the second video, and the third video with the template image to obtain a plurality of first matching points, a plurality of second matching points, and a plurality of third matching points, calibrating the rotation matrix of the camera according to all the first matching points and all the second matching points, and calibrating the translation vector of the camera according to all the first matching points, all the second matching points, and all the third matching points specifically includes: respectively determining the optimal matching proportion of the template image and each image in the first video, the second video and the third video, and respectively matching each image in the first video, the second video and the third video with the template image which is scaled by the corresponding optimal matching proportion to obtain a plurality of first matching points, a plurality of second matching points and a plurality of third matching points; calibrating the rotation matrix of the camera according to all the first matching points, all the second matching points and the corresponding optimal matching proportion, and calibrating the translation vector of the camera according to all the first matching points, all the second matching points, all the third matching points and the corresponding optimal matching proportion.
Illustratively, in practical applications, once the template image I_t, the matrix J in equation (2) and the matrix L in equation (5) have been obtained in advance, the camera external parameters, namely the rotation matrix R and the translation vector T, can be estimated without the aid of a ChArUco Board or other calibration plate, simply by matching the image captured by the camera with the template image I_t to obtain the matching point M(x, y).
When template matching is performed, the following problems may occur: first, matching takes a long time; second, because the camera position differs, the matching region in the acquired image may have a different size, which affects the matching accuracy.
For the first problem, the image can be cropped and downscaled before the subsequent processing and matching, reducing resource consumption.
For the second problem, the template image can be scaled to a plurality of sizes and then matched, so as to find the best matching scale k and the corresponding matching point M.
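A minimal multi-scale matching sketch is given below: the template is resized over a range of scales (the range and step are illustrative choices) and the scale k and location M with the highest matching score are kept.

```python
import cv2
import numpy as np

def multi_scale_match(frame_gray, template_gray, scales=np.linspace(0.6, 1.4, 17)):
    """Return the best matching scale k, the match location M(x, y) and the score."""
    best = (None, None, -1.0)
    for k in scales:
        tmpl = cv2.resize(template_gray, None, fx=k, fy=k)
        if tmpl.shape[0] > frame_gray.shape[0] or tmpl.shape[1] > frame_gray.shape[1]:
            continue                                    # template larger than the frame
        result = cv2.matchTemplate(frame_gray, tmpl, cv2.TM_CCOEFF_NORMED)
        _, score, _, loc = cv2.minMaxLoc(result)
        if score > best[2]:
            best = (k, loc, score)
    return best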
Thus, equation (2) may be modified as follows:
formula (5) can be modified as follows:
The related formulas are modified correspondingly.
in a preferred embodiment, the capturing an image of the matching region from the reference image as a template image specifically includes: and carrying out image preprocessing on the reference image to obtain an intermediate image, and intercepting an image of the matching area from the intermediate image as a template image.
In a preferred embodiment, the matching each image in the first video, the second video, and the third video with the template image to obtain a plurality of first matching points, a plurality of second matching points, and a plurality of third matching points specifically is: the method comprises the steps of conducting image preprocessing on each image in a first video, a second video and a third video respectively to obtain a plurality of first images to be matched, a plurality of second images to be matched and a plurality of third images to be matched, and matching each first image to be matched, each second image to be matched and each third image to be matched with a template image respectively to obtain a plurality of first matching points, a plurality of second matching points and a plurality of third matching points.
In a preferred embodiment, image pre-processing includes edge detection, morphological erosion and dilation.
It will be appreciated that edge detection is a fundamental problem in image processing and computer vision, the purpose of which is to identify points in a digital image where changes in brightness are significant. Morphological processing is a technique of analyzing an image by a computer using basic operations of mathematical morphology, the most basic morphological operations being dilation and erosion.
In the embodiment, before the image of the matching area is cut from the reference image as the template image, edge detection, morphological erosion and dilation are performed on the reference image; and before each image in the video is matched with the template image, edge detection, morphological erosion and dilation are performed on each image in the video, thereby improving the matching accuracy.
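A minimal sketch of this preprocessing chain is shown below; the Canny thresholds, the 3x3 kernel and the exact order of the morphological operations are illustrative assumptions rather than values given by the patent.

```python
import cv2
import numpy as np

def preprocess(image_bgr):
    """Edge detection followed by morphological erosion and dilation."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                        # edge detection
    kernel = np.ones((3, 3), np.uint8)
    eroded = cv2.erode(edges, kernel, iterations=1)         # morphological erosion
    dilated = cv2.dilate(eroded, kernel, iterations=1)      # dilation
    return dilated
```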
Based on the same inventive concept as the first embodiment, the second embodiment provides a device for calibrating the external parameters of an in-vehicle camera, as shown in fig. 5, including: the matching area confirming module 21, which is used for installing the camera on the column of the steering wheel, installing a calibration plate above the driving position, and enabling the camera to completely shoot the calibration plate and a preset matching area when the steering wheel moves to a first position, a second position, a third position and a fourth position; the template image acquisition module 22, which is configured to acquire a reference image collected by the camera when the steering wheel moves to the first position, and to cut an image of the matching area from the reference image as a template image; the multi-directional video acquisition module 23, which is configured to acquire a first video collected by the camera while the steering wheel moves from the first position to the second position, a second video collected by the camera while the steering wheel moves from the third position to the fourth position, and a third video collected by the camera while the steering wheel moves in an S-shape within the first position, the second position, the third position and the fourth position; and the camera external parameter calibration module 24, which is configured to match each image in the first video, the second video and the third video with the template image to obtain a plurality of first matching points, a plurality of second matching points and a plurality of third matching points, to calibrate a rotation matrix of the camera according to all the first matching points and all the second matching points, and to calibrate a translation vector of the camera according to all the first matching points, all the second matching points and all the third matching points.
In a preferred embodiment, the matching each image of the first video, the second video, and the third video with the template image to obtain a plurality of first matching points, a plurality of second matching points, and a plurality of third matching points, calibrating the rotation matrix of the camera according to all the first matching points and all the second matching points, and calibrating the translation vector of the camera according to all the first matching points, all the second matching points, and all the third matching points specifically includes: respectively determining the optimal matching proportion of the template image and each image in the first video, the second video and the third video, and respectively matching each image in the first video, the second video and the third video with the template image which is scaled by the corresponding optimal matching proportion to obtain a plurality of first matching points, a plurality of second matching points and a plurality of third matching points; calibrating the rotation matrix of the camera according to all the first matching points, all the second matching points and the corresponding optimal matching proportion, and calibrating the translation vector of the camera according to all the first matching points, all the second matching points, all the third matching points and the corresponding optimal matching proportion.
In a preferred embodiment, the capturing an image of the matching region from the reference image as a template image specifically includes: and carrying out image preprocessing on the reference image to obtain an intermediate image, and intercepting an image of the matching area from the intermediate image as a template image.
In a preferred embodiment, the matching each image in the first video, the second video, and the third video with the template image to obtain a plurality of first matching points, a plurality of second matching points, and a plurality of third matching points specifically is: the method comprises the steps of conducting image preprocessing on each image in a first video, a second video and a third video respectively to obtain a plurality of first images to be matched, a plurality of second images to be matched and a plurality of third images to be matched, and matching each first image to be matched, each second image to be matched and each third image to be matched with a template image respectively to obtain a plurality of first matching points, a plurality of second matching points and a plurality of third matching points.
In a preferred embodiment, image pre-processing includes edge detection, morphological erosion and dilation.
In summary, the embodiment of the present invention has the following advantages:
the camera is arranged on a pipe column of a steering wheel, a calibration plate is arranged above a driving position, the camera can completely shoot the calibration plate and a preset matching area when the steering wheel moves to a first position, a second position, a third position and a fourth position, a reference image collected by the camera is obtained when the steering wheel moves to the first position, an image of the matching area is intercepted from the reference image and is used as a template image, a first video collected by the camera is obtained in the process that the steering wheel moves from the first position to the second position, a second video collected by the camera is obtained in the process that the steering wheel moves from the third position to the fourth position, a third video collected by the camera is obtained in the process that the steering wheel moves in an S shape in the first position, the second position, the third position and the fourth position, and each image of the first video, the second video and the third video is matched with the template image respectively, and obtaining a plurality of first matching points, a plurality of second matching points and a plurality of third matching points, calibrating a rotation matrix of the camera according to all the first matching points and all the second matching points, and calibrating translation vectors of the camera according to all the first matching points, all the second matching points and all the third matching points to realize the external reference calibration of the vehicle inside the vehicle. According to the embodiment of the invention, the template image is obtained in advance, the image acquired after the position of the camera is changed is matched with the template image, and the rotation matrix and the translation vector of the camera, namely the camera external parameter, are automatically estimated according to the matching result, so that the camera external parameter can be automatically estimated after the position of the camera is changed, and the camera external parameter calibration efficiency is improved.
While the foregoing is directed to the preferred embodiment of the present invention, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention.
It will be understood by those skilled in the art that all or part of the processes of the above embodiments may be implemented by hardware related to instructions of a computer program, and the computer program may be stored in a computer readable storage medium, and when executed, may include the processes of the above embodiments. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
Claims (10)
1. A method for calibrating the external parameters of an in-vehicle camera, characterized by comprising the following steps:
installing a camera on a pipe column of a steering wheel, installing a calibration plate above a driving position, and enabling the camera to completely shoot the calibration plate and a preset matching area when the steering wheel moves to a first position, a second position, a third position and a fourth position;
when the steering wheel moves to the first position, acquiring a reference image acquired by the camera, and intercepting an image of the matching area from the reference image as a template image;
acquiring a first video acquired by the camera during the movement of the steering wheel from the first position to the second position, acquiring a second video acquired by the camera during the movement of the steering wheel from the third position to the fourth position, and acquiring a third video acquired by the camera during the S-shaped movement of the steering wheel within the first position, the second position, the third position and the fourth position;
and respectively matching each image in the first video, the second video and the third video with the template image to obtain a plurality of first matching points, a plurality of second matching points and a plurality of third matching points, calibrating a rotation matrix of the camera according to all the first matching points and all the second matching points, and calibrating a translation vector of the camera according to all the first matching points, all the second matching points and all the third matching points.
2. The in-vehicle external reference calibration method according to claim 1, wherein the matching is performed on each of the first video, the second video, and the third video with the template image to obtain a plurality of first matching points, a plurality of second matching points, and a plurality of third matching points, calibrating a rotation matrix of the camera according to all the first matching points and all the second matching points, and calibrating a translation vector of the camera according to all the first matching points, all the second matching points, and all the third matching points, specifically:
respectively determining the optimal matching proportion of the template image and each image in the first video, the second video and the third video, and respectively matching each image in the first video, the second video and the third video with the template image which is scaled by the corresponding optimal matching proportion to obtain a plurality of first matching points, a plurality of second matching points and a plurality of third matching points;
calibrating the rotation matrix of the camera according to all the first matching points, all the second matching points and the corresponding optimal matching proportion, and calibrating the translation vector of the camera according to all the first matching points, all the second matching points, all the third matching points and the corresponding optimal matching proportion.
3. The in-vehicle external reference calibration method according to claim 1, wherein the image of the matching area is cut from the reference image as a template image, specifically:
and carrying out image preprocessing on the reference image to obtain an intermediate image, and intercepting the image of the matching area from the intermediate image as the template image.
4. The in-vehicle external reference calibration method according to claim 1, wherein the step of matching each image of the first video, the second video, and the third video with the template image to obtain a plurality of first matching points, a plurality of second matching points, and a plurality of third matching points comprises:
respectively carrying out image preprocessing on each image in the first video, the second video and the third video to obtain a plurality of first images to be matched, a plurality of second images to be matched and a plurality of third images to be matched, and respectively matching each first image to be matched, each second image to be matched and each third image to be matched with the template image to obtain a plurality of first matching points, a plurality of second matching points and a plurality of third matching points.
5. The in-vehicle external reference calibration method according to claim 3 or 4, wherein the image preprocessing comprises edge detection, morphological erosion and dilation.
6. A device for calibrating the external parameters of an in-vehicle camera, characterized by comprising:
the matching area confirming module is used for installing a camera on a pipe column of a steering wheel, installing a calibration plate above a driving position and enabling the camera to completely shoot the calibration plate and a preset matching area when the steering wheel moves to a first position, a second position, a third position and a fourth position;
the template image acquisition module is used for acquiring a reference image acquired by the camera when the steering wheel moves to the first position, and intercepting an image of the matching area from the reference image as a template image;
a multi-directional video acquisition module, configured to acquire a first video captured by the camera during movement of the steering wheel from the first position to the second position, acquire a second video captured by the camera during movement of the steering wheel from the third position to the fourth position, and acquire a third video captured by the camera during S-shaped movement of the steering wheel within the first position, the second position, the third position, and the fourth position;
and the camera external parameter calibration module is used for respectively matching each image in the first video, the second video and the third video with the template image to obtain a plurality of first matching points, a plurality of second matching points and a plurality of third matching points, calibrating a rotation matrix of the camera according to all the first matching points and all the second matching points, and calibrating a translation vector of the camera according to all the first matching points, all the second matching points and all the third matching points.
7. The in-vehicle external reference calibration device according to claim 6, wherein the matching is performed on each of the first video, the second video, and the third video with the template image to obtain a plurality of first matching points, a plurality of second matching points, and a plurality of third matching points, the calibration of the rotation matrix of the camera is performed according to all the first matching points and all the second matching points, and the calibration of the translation vector of the camera is performed according to all the first matching points, all the second matching points, and all the third matching points, specifically:
respectively determining the optimal matching proportion of the template image and each image in the first video, the second video and the third video, and respectively matching each image in the first video, the second video and the third video with the template image which is scaled by the corresponding optimal matching proportion to obtain a plurality of first matching points, a plurality of second matching points and a plurality of third matching points;
calibrating the rotation matrix of the camera according to all the first matching points, all the second matching points and the corresponding optimal matching proportion, and calibrating the translation vector of the camera according to all the first matching points, all the second matching points, all the third matching points and the corresponding optimal matching proportion.
8. The in-vehicle external reference calibration device according to claim 6, wherein the image of the matching area is cut from the reference image as a template image, specifically:
and carrying out image preprocessing on the reference image to obtain an intermediate image, and intercepting the image of the matching area from the intermediate image as the template image.
9. The in-vehicle external reference calibration device according to claim 6, wherein the matching is performed on each of the first video, the second video, and the third video with the template image to obtain a plurality of first matching points, a plurality of second matching points, and a plurality of third matching points, specifically:
respectively carrying out image preprocessing on each image in the first video, the second video and the third video to obtain a plurality of first images to be matched, a plurality of second images to be matched and a plurality of third images to be matched, and respectively matching each first image to be matched, each second image to be matched and each third image to be matched with the template image to obtain a plurality of first matching points, a plurality of second matching points and a plurality of third matching points.
10. The in-vehicle external reference calibration device according to claim 8 or 9, wherein the image preprocessing includes edge detection, morphological erosion and dilation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210029319.1A CN114049404B (en) | 2022-01-12 | 2022-01-12 | Method and device for calibrating external parameters of an in-vehicle camera
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210029319.1A CN114049404B (en) | 2022-01-12 | 2022-01-12 | Method and device for calibrating external parameters of an in-vehicle camera
Publications (2)
Publication Number | Publication Date |
---|---|
CN114049404A CN114049404A (en) | 2022-02-15 |
CN114049404B true CN114049404B (en) | 2022-04-05 |
Family
ID=80196345
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210029319.1A Active CN114049404B (en) | 2022-01-12 | 2022-01-12 | Method and device for calibrating internal phase and external phase of vehicle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114049404B (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018177004A (en) * | 2017-04-13 | 2018-11-15 | クラリオン株式会社 | Calibration device for on-vehicle camera |
JP2019050622A (en) * | 2018-11-28 | 2019-03-28 | 株式会社東芝 | Image processing apparatus, image processing method, image processing program, and image processing system |
CN109690623A (en) * | 2016-06-29 | 2019-04-26 | 醒眸行有限公司 | The system and method for the posture of camera in scene for identification |
CN112183512A (en) * | 2020-12-02 | 2021-01-05 | 深圳佑驾创新科技有限公司 | Camera calibration method, device, vehicle-mounted terminal and storage medium |
CN112330756A (en) * | 2021-01-04 | 2021-02-05 | 中智行科技有限公司 | Camera calibration method and device, intelligent vehicle and storage medium |
WO2021026705A1 (en) * | 2019-08-09 | 2021-02-18 | 华为技术有限公司 | Matching relationship determination method, re-projection error calculation method and related apparatus |
CN112489113A (en) * | 2020-11-25 | 2021-03-12 | 深圳地平线机器人科技有限公司 | Camera external parameter calibration method and device and camera external parameter calibration system |
CN112509054A (en) * | 2020-07-20 | 2021-03-16 | 北京智行者科技有限公司 | Dynamic calibration method for external parameters of camera |
CN112862899A (en) * | 2021-02-07 | 2021-05-28 | 黑芝麻智能科技(重庆)有限公司 | External parameter calibration method, device and system for image acquisition equipment |
WO2021233309A1 (en) * | 2020-05-21 | 2021-11-25 | 杭州海康威视数字技术股份有限公司 | Extrinsic parameter change detection method and apparatus, electronic device, and detection system |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4751939B2 (en) * | 2009-03-31 | 2011-08-17 | アイシン精機株式会社 | Car camera calibration system |
JP6565769B2 (en) * | 2016-04-03 | 2019-08-28 | 株式会社デンソー | In-vehicle camera mounting angle detection device, mounting angle calibration device, mounting angle detection method, mounting angle calibration method, and computer program |
JP6776202B2 (en) * | 2017-07-28 | 2020-10-28 | クラリオン株式会社 | In-vehicle camera calibration device and method |
JP7314486B2 (en) * | 2018-09-06 | 2023-07-26 | 株式会社アイシン | camera calibration device |
US10991121B2 (en) * | 2018-11-27 | 2021-04-27 | GM Global Technology Operations LLC | Movement tracking of operator-facing cameras |
-
2022
- 2022-01-12 CN CN202210029319.1A patent/CN114049404B/en active Active
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109690623A (en) * | 2016-06-29 | 2019-04-26 | 醒眸行有限公司 | The system and method for the posture of camera in scene for identification |
JP2018177004A (en) * | 2017-04-13 | 2018-11-15 | クラリオン株式会社 | Calibration device for on-vehicle camera |
JP2019050622A (en) * | 2018-11-28 | 2019-03-28 | 株式会社東芝 | Image processing apparatus, image processing method, image processing program, and image processing system |
WO2021026705A1 (en) * | 2019-08-09 | 2021-02-18 | 华为技术有限公司 | Matching relationship determination method, re-projection error calculation method and related apparatus |
WO2021233309A1 (en) * | 2020-05-21 | 2021-11-25 | 杭州海康威视数字技术股份有限公司 | Extrinsic parameter change detection method and apparatus, electronic device, and detection system |
CN112509054A (en) * | 2020-07-20 | 2021-03-16 | 北京智行者科技有限公司 | Dynamic calibration method for external parameters of camera |
CN112489113A (en) * | 2020-11-25 | 2021-03-12 | 深圳地平线机器人科技有限公司 | Camera external parameter calibration method and device and camera external parameter calibration system |
CN112183512A (en) * | 2020-12-02 | 2021-01-05 | 深圳佑驾创新科技有限公司 | Camera calibration method, device, vehicle-mounted terminal and storage medium |
CN112330756A (en) * | 2021-01-04 | 2021-02-05 | 中智行科技有限公司 | Camera calibration method and device, intelligent vehicle and storage medium |
CN112862899A (en) * | 2021-02-07 | 2021-05-28 | 黑芝麻智能科技(重庆)有限公司 | External parameter calibration method, device and system for image acquisition equipment |
Non-Patent Citations (3)
Title |
---|
Automatic Extrinsic Calibration for an Onboard Camera; Jun Tan et al.; 2013 Chinese Automation Congress; 2013-12-31; 340-343 *
Obstacle measurement method for the reversing environment based on binocular stereo vision; Liu Yugang et al.; Journal of Transportation Systems Engineering and Information Technology; 2016-08-15; Vol. 16, No. 04; 79-87 *
Design and implementation of a vehicle four-wheel aligner based on image processing and computer vision; Zhang Juan et al.; Modern Electronics Technique; 2016-10-01; Vol. 39, No. 19; 42-46 *
Also Published As
Publication number | Publication date |
---|---|
CN114049404A (en) | 2022-02-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102750697A (en) | Parameter calibration method and device | |
US8368768B2 (en) | Image processing apparatus, image processing method, and program | |
EP2870585A1 (en) | A method and system for correcting a distorted input image | |
CN106920210B (en) | A kind of flake video real time panoramic bearing calibration based on development of a sphere model | |
JP2009135921A (en) | Image projection apparatus, and image projection method | |
EP3101623B1 (en) | Homography rectification | |
CN110225321B (en) | Training sample data acquisition system and method for trapezoidal correction | |
CN113223066B (en) | Multi-source remote sensing image matching method and device based on characteristic point fine tuning | |
US20190251677A1 (en) | Homography rectification | |
CN113838138B (en) | System calibration method, system, device and medium for optimizing feature extraction | |
CN105335977B (en) | The localization method of camera system and target object | |
CN114049404B (en) | Method and device for calibrating external parameters of an in-vehicle camera | |
CN111815714B (en) | Fisheye camera calibration method and device, terminal equipment and storage medium | |
CN113837949A (en) | Image processing method and device | |
CN111383352A (en) | Automatic color filling and abstracting method for three-order magic cube | |
CN116524041A (en) | Camera calibration method, device, equipment and medium | |
CN110188756B (en) | Product positioning method | |
CN113920196A (en) | Visual positioning method and device and computer equipment | |
Wang et al. | A vision location system design of glue dispensing robot | |
CN109767390A (en) | A kind of digital picture of block parallel disappears image rotation method | |
CN106023127B (en) | A kind of flake video correction method based on multiframe | |
CN114419168B (en) | Calibration method and device for image feature points | |
CN114820289B (en) | Fisheye image correction method based on radial symmetric projection model | |
CN113313646B (en) | Image processing method and device, electronic equipment and computer readable storage medium | |
CN118053160A (en) | Seal image recognition method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right | ||
TR01 | Transfer of patent right |
Effective date of registration: 20221229 Address after: No. 602-165, Complex Building, No. 1099, Qingxi Second Road, Hezhuang Street, Qiantang District, Hangzhou, Zhejiang, 310000 Patentee after: Hangzhou Ruijian Zhixing Technology Co.,Ltd. Address before: 518051 401, building 1, Shenzhen new generation industrial park, No. 136, Zhongkang Road, Meidu community, Meilin street, Futian District, Shenzhen, Guangdong Province Patentee before: SHENZHEN MINIEYE INNOVATION TECHNOLOGY Co.,Ltd. |