CN116051652A - Parameter calibration method, electronic equipment and storage medium - Google Patents

Parameter calibration method, electronic equipment and storage medium Download PDF

Info

Publication number
CN116051652A
Authority
CN
China
Prior art keywords
calibration
image
region
distortion
parameters
Prior art date
Legal status
Pending
Application number
CN202310024917.4A
Other languages
Chinese (zh)
Inventor
周亚振
余士超
陈豪
许成军
Current Assignee
Zhejiang Zero Run Technology Co Ltd
Original Assignee
Zhejiang Zero Run Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Zero Run Technology Co Ltd filed Critical Zhejiang Zero Run Technology Co Ltd
Priority to CN202310024917.4A priority Critical patent/CN116051652A/en
Publication of CN116051652A publication Critical patent/CN116051652A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/10: Segmentation; Edge detection
    • G06T7/11: Region-based segmentation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10004: Still image; Photographic image
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30204: Marker

Abstract

The application discloses a parameter calibration method, an electronic device, and a storage medium. The parameter calibration method includes: obtaining a calibration image captured by a camera photographing a calibration object; determining image information of each region in the calibration image, where the regions are obtained by dividing the calibration image; and determining multiple sets of camera calibration parameters based on the image information of each region in the calibration image, where each set of calibration parameters is obtained from the image information of one region and is used for image processing. With this scheme, different calibration parameters can be obtained for different regions, which improves calibration accuracy, and combining and matching the calibration parameters provides a wider range of applications.

Description

Parameter calibration method, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of intelligent identification, and in particular, to a parameter calibration method, an electronic device, and a storage medium.
Background
In the prior art, the intrinsic and extrinsic parameters of a surround-view camera are mostly calibrated separately and independently. In addition, for a surround-view camera, the structure of the commonly used camera model differs from that of the real physical camera; even if the camera model is calibrated accurately, the center region and the edge region of the camera yield different calibration results, and a single set of intrinsic parameters cannot be adapted to all regions of the camera.
Disclosure of Invention
The present application provides at least a parameter calibration method, an electronic device, and a storage medium, which can provide a wider range of applications by using different sets of camera calibration parameters.
A first aspect of the present application provides a parameter calibration method, including: obtaining a calibration image captured by a camera photographing a calibration object; determining image information of each region in the calibration image, where the regions are obtained by dividing the calibration image; and determining multiple sets of camera calibration parameters based on the image information of each region in the calibration image, where each set of calibration parameters is obtained from the image information of one region and is used for image processing.
The regions include at least one of a middle region, an edge region, and a global region, and each set of calibration parameters includes a camera intrinsic parameter corresponding to one region and a camera extrinsic parameter corresponding to one region.
The calibration image includes a first calibration image, each set of calibration parameters includes a camera intrinsic parameter corresponding to one region, and determining the multiple sets of camera calibration parameters based on the image information of each region in the calibration image includes: performing feature detection on each region in the first calibration image to obtain a plurality of first feature points; and, for each region, obtaining the camera intrinsic parameter corresponding to that region from the first feature points in the region by Zhang Zhengyou's calibration method.
The calibration image includes a second calibration image, each set of calibration parameters includes a camera extrinsic parameter corresponding to one region, and determining the multiple sets of camera calibration parameters based on the image information of each region in the calibration image includes: performing feature detection on each region in the second calibration image to obtain a plurality of second feature points; and, for each region, determining the camera extrinsic parameter corresponding to that region based on the correspondence between each second feature point in the region and its corresponding three-dimensional point.
Before performing feature detection on each region in the second calibration image to obtain the second feature points, the method further includes: performing de-distortion processing on the second calibration image to obtain a corrected image. Performing feature detection on each region in the second calibration image to obtain the second feature points then includes: performing feature detection on each region in the corrected image to obtain the plurality of second feature points.
The regions include a middle region, and performing de-distortion processing on the second calibration image to obtain the corrected image includes: establishing a de-distortion model using the camera intrinsic parameter corresponding to the middle region; de-distorting the distorted points in the second calibration image using the de-distortion model; and replacing the distorted points in the second calibration image with the de-distorted points to obtain the corrected image.
The regions further include an edge region, and performing de-distortion processing on the second calibration image to obtain the corrected image includes: de-distorting the second calibration image using de-distortion parameters to obtain the corrected image, where the de-distortion parameters are obtained by converting the middle-region-de-distorted first calibration image into a bird's-eye view, and the corrected image is the bird's-eye view. Performing feature detection on each region in the corrected image to obtain the second feature points includes: obtaining de-distorted feature points in the corrected image; applying distortion to the de-distorted feature points and projecting them into the second calibration image; and de-distorting the projected feature points in the second calibration image using the camera intrinsic parameter corresponding to the edge region to obtain the plurality of second feature points.
The method further includes: receiving a target image processing instruction; determining the target calibration parameters corresponding to the target image processing instruction based on the association between each set of calibration parameters and each image processing instruction; and performing the image processing corresponding to the target image processing instruction using the target calibration parameters.
A second aspect of the present application provides an electronic device, including a memory and a processor coupled to each other, where the processor is configured to execute program instructions stored in the memory, so as to implement the parameter calibration method in the first aspect.
A third aspect of the present application provides a computer readable storage medium having stored thereon program instructions which, when executed by a processor, implement the parameter calibration method of the first aspect described above.
According to the above scheme, a calibration object is photographed by a camera to obtain a calibration image, the calibration object in each region of the calibration image is identified to determine the image information of each region, and multiple sets of camera calibration parameters are obtained from this image information. Since different calibration parameters can be obtained for different regions, calibration accuracy is improved and a wider range of applications can be provided.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and, together with the description, serve to explain the technical aspects of the application.
FIG. 1 is a flow chart of an embodiment of a parameter calibration method of the present application;
FIG. 2 is a schematic flowchart of intrinsic parameter calibration in an embodiment of the parameter calibration method of the present application;
FIG. 3 is a schematic flowchart of extrinsic parameter calibration in an embodiment of the parameter calibration method of the present application;
FIG. 4 is a flow chart of another embodiment of the parameter calibration method of the present application;
FIG. 5 is a flow chart of another embodiment of the parameter calibration method of the present application;
FIG. 6 is a schematic diagram of a framework of an embodiment of the electronic device of the present application;
FIG. 7 is a schematic diagram of a framework of one embodiment of a computer readable storage medium of the present application.
Detailed Description
The following describes the embodiments of the present application in detail with reference to the drawings.
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, interfaces, techniques, etc., in order to provide a thorough understanding of the present application.
The term "at least one" herein means any one of a plurality or any combination of at least two of a plurality, e.g., including at least one of A, B, C, may mean including any one or more elements selected from the group consisting of A, B and C.
Referring to fig. 1, fig. 1 is a flowchart of a parameter calibration method according to an embodiment of the present application. Specifically, the method may include the following steps:
step S110: and obtaining a calibration image obtained by shooting the calibration object by the camera.
The execution subject of the parameter calibration method provided in this embodiment may be a server or a client, for example, a vehicle surround-view system. The calibration image may be acquired from the client. It should be understood that the camera photographing the calibration object may be a fisheye camera or a pinhole camera, which is not particularly limited here.
The calibration parameters obtained by this parameter calibration method can be applied to a vehicle-mounted surround-view system for identifying the road environment or the surroundings of the vehicle, and can also be applied to the field of target tracking.
In some embodiments, before the camera photographs the calibration object to obtain the calibration image, the calibration object may be placed randomly in the camera's imaging area, or calibration objects may be distributed uniformly over the imaging area. It should be understood that the calibration object may be a black-and-white checkerboard, a two-dimensional code, or the like, and is not particularly limited here.
Step S120: determining image information of each region in the calibration image, wherein each region in the calibration image is obtained by dividing the calibration image into regions.
In some embodiments, the calibration image is an image of black-and-white checkerboard calibration plates, the calibration image is divided into a middle region and an edge region, and the calibration plates in at least one of the middle region, the edge region, and the global region are identified to obtain the image information corresponding to the middle region, the edge region, and the global region of the calibration image. Illustratively, the boundary between the middle region and the edge region is set at 1/6 of the total width and total height of the camera's imaging area.
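The 1/6 boundary described above can be sketched as a simple pixel-region classifier. The exact inset rule (a margin of 1/6 of the total width and height from each image border) is an assumption based on the description, and the function name is illustrative:

```python
def classify_region(x, y, width, height, margin_frac=1 / 6):
    """Classify a pixel as 'middle' or 'edge'.

    Assumes the middle/edge boundary is inset from each image border by
    margin_frac of the total width/height (1/6 per the description above).
    """
    mx, my = width * margin_frac, height * margin_frac
    if mx <= x <= width - mx and my <= y <= height - my:
        return "middle"
    return "edge"
```

The "global" region used elsewhere in the description is simply the whole image, so it needs no classifier of its own.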
In other embodiments, the acquired image information is corner information of the calibration object. Corner detection is performed on the different regions of the calibration image to obtain the corner information corresponding to each region. The corner detection algorithm may be the Harris operator, the SIFT operator, a detection algorithm in OpenCV, or the like, and is not limited here.
Step S130: and determining calibration parameters of a plurality of groups of cameras based on the image information of each region in the calibration image.
Each set of calibration parameters is obtained by acquiring image information of one area, and each set of calibration parameters is used for image processing.
In some embodiments, a set of camera calibration parameters determined from the image information of each region in the calibration image includes a camera intrinsic parameter corresponding to one region and a camera extrinsic parameter corresponding to that region. The intrinsic parameters are internal properties of the camera, such as its focal length, while the extrinsic parameters describe the camera's pose relative to the outside world. In this embodiment, the camera intrinsic parameters may include the camera matrix and the camera distortion coefficients, and the extrinsic parameters may include the camera's rotation matrix and translation vector relative to the outside world.
In other embodiments, the calibration image includes a first calibration image, and each set of calibration parameters includes a camera intrinsic parameter corresponding to one region. The camera intrinsic parameter corresponding to each region is determined from the image information of that region in the first calibration image; the specific steps are S1311 to S1312:
step S1311: and carrying out feature detection on each region in the first calibration image to obtain a plurality of first feature points.
Step S1312: and for each region, acquiring the camera internal parameters corresponding to each region by using each first characteristic point in the region through a Zhang Zhengyou calibration method.
In some embodiments, the first calibration image is divided into an edge region, a middle region, and a global region. First, the corner points in the calibration plates of the middle region of the first calibration image are detected; these corner points are the first feature points. After the corner points of the middle region are obtained, the first intrinsic parameters corresponding to the middle region are computed by Zhang Zhengyou's calibration method. Next, the corner points in the calibration plates of the edge region are detected, and the second intrinsic parameters corresponding to the edge region are computed in the same way. Likewise, the third intrinsic parameters corresponding to the global region of the first calibration image can be obtained.
In some embodiments, the calibration image includes a second calibration image, and each set of calibration parameters includes a camera extrinsic parameter corresponding to one region. The camera extrinsic parameter corresponding to each region is determined from the image information of that region in the second calibration image; the specific steps are S1321 to S1322:
step S1321: and carrying out feature detection on each region in the second calibration image to obtain a plurality of second feature points.
Step S1322: and for each region, determining camera external parameters corresponding to each region based on the corresponding relation between each second characteristic point and the three-dimensional point corresponding to each second characteristic point in the region.
In some embodiments, the second calibration image captured by the camera may be distorted; before detection, the second calibration image is de-distorted to obtain a corrected image. Feature detection is then performed on each region of the corrected image to obtain the second feature points. The feature detection method may be the same as that used when acquiring the camera intrinsic parameters. The camera extrinsic parameter corresponding to each region is then obtained from the correspondence between the second feature points and their three-dimensional points.
In one embodiment, the middle region of the second calibration image is de-distorted to obtain the corrected image. A de-distortion model can be established from the first intrinsic parameters corresponding to the middle region, and the distorted points in the second calibration image are de-distorted with this model; the distorted points are then replaced by their de-distorted counterparts to obtain the corrected image. Corner detection on the corrected image yields the corner points of its middle region, and the camera extrinsic parameters are obtained from the correspondence between all middle-region corner points of the corrected image and their three-dimensional points.
In another embodiment, the edge region of the second calibration image is de-distorted to obtain the corrected image. The second calibration image is de-distorted using de-distortion parameters obtained by converting the middle-region-de-distorted first calibration image into a bird's-eye view; the corrected image is this bird's-eye view. De-distorted feature points in the corrected image are obtained with a feature point detection method. Distortion is applied to the de-distorted feature points, which are then projected into the second calibration image to obtain its feature point data. Finally, the projected feature points in the second calibration image are de-distorted using the camera intrinsic parameters corresponding to the edge region, yielding a plurality of more accurate second feature points.
In some embodiments, the acquired camera intrinsic and extrinsic parameters of the different regions can be freely matched and combined according to different task requirements. Specifically, after a target image processing instruction is received, the target calibration parameters corresponding to it are determined from the association between each set of calibration parameters and each image processing instruction; the image processing corresponding to the instruction is then performed with the target calibration parameters, which include a camera intrinsic parameter and a camera extrinsic parameter.
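The association between image processing instructions and parameter sets can be sketched as a simple lookup table. The task names and the region each task maps to are purely hypothetical here, since the application does not fix a concrete mapping:

```python
# Hypothetical region-to-parameter-set table: each set pairs the intrinsic
# parameters (K, D) with the extrinsic parameters (R, t) of one region.
CALIB_SETS = {
    "middle": {"K": "K_mid", "D": "D_mid", "R": "R_mid", "t": "t_mid"},
    "edge":   {"K": "K_edge", "D": "D_edge", "R": "R_edge", "t": "t_edge"},
    "global": {"K": "K_glob", "D": "D_glob", "R": "R_glob", "t": "t_glob"},
}

# Hypothetical association between image processing instructions and regions.
INSTRUCTION_TO_REGION = {
    "lane_detection": "middle",
    "birds_eye_view": "edge",
    "full_frame_undistort": "global",
}

def select_calibration(instruction):
    """Return the target calibration-parameter set for an instruction."""
    return CALIB_SETS[INSTRUCTION_TO_REGION[instruction]]
```

In a real system the string placeholders would be the calibrated NumPy matrices and vectors; the point is only that each instruction resolves to one region's parameter set.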
In a specific application scenario, the method is applied to a vehicle-mounted surround-view system to calibrate the parameters of the vehicle-mounted surround-view cameras, including their intrinsic and extrinsic parameters. The calibration object used is a black-and-white checkerboard calibration plate.
First, the intrinsic parameters of the surround-view camera are calibrated; please refer to fig. 2. A calibration plate image of the middle region of the surround-view camera, i.e. a first calibration image, is acquired by data acquisition module 1 of the vehicle-mounted surround-view system, and a calibration plate image of the edge region is acquired by data acquisition module 2. Corner points are detected with a calibration plate corner detection algorithm, and the first and second intrinsic parameters of the surround-view camera are obtained by Zhang Zhengyou's calibration method, respectively. Finally, the image from data acquisition module 1 is combined with the image from data acquisition module 2, corner points are detected, and the third intrinsic parameters of the camera are obtained by Zhang Zhengyou's calibration method. In calibrating the intrinsic parameters of the surround-view camera, the corner detection algorithm may be the Harris operator, the SIFT operator, a detection algorithm in OpenCV, or the like.
Before data acquisition 1, the calibration plates need to be placed, manually or under machine control, in the edge region and the middle region of the surround-view camera in turn. In this process, the boundary between the middle region and the edge region can be determined from engineering experience and can be set at approximately 1/6 of the total pixel width and height of the surround-view camera. The calibration plates consist of black-and-white checkerboards whose square size and count can be combined arbitrarily. During data acquisition, the plates may be placed and moved in any pose by hand or machine control, but the chosen positions should together cover most of the surround-view camera's field of view, and the number of edge-region plates must be larger than the number of middle-region plates.
Intrinsic calibration can be controlled with a robotic arm; automatic intrinsic calibration is completed simply by setting up the control program in advance.
Before data acquisition 2, a calibration plate needs to be placed, manually or under machine control, in any pose in the middle region of the surround-view camera. The size of the middle region can be determined from engineering experience and is usually bounded at 1/6 of the total width and height of the surround-view camera from the image border. During data acquisition, the calibration plate does not need to lie strictly entirely within the central region; it suffices that most of the calibration plate image lies in the central region of the surround-view camera.
Second, extrinsic calibration requires a dedicated calibration site. The main equipment comprises a vehicle carrying several surround-view cameras and a number of calibration plates, whose arrangement must satisfy the following conditions:
1) Each surround-view camera is fixed at a certain position on the vehicle and must not move relative to the vehicle during calibration;
2) At least one reference calibration plate is required in the area directly in front of each vehicle-mounted surround-view camera, located in the middle region of its field of view;
3) At least two optimized calibration plates are required in the edge region of each vehicle-mounted surround-view camera, one at each side edge of its field of view; if more are used, they are distributed evenly over the two side edges;
4) All calibration plates consist of checkerboards of black and white rectangles, whose size and count can be combined arbitrarily;
5) The calibration plates must not move during calibration and must be free of occlusion or interference.
No changes are required after the extrinsic calibration site is arranged, and the calibration process is fully automatic without manual intervention. If the intrinsic calibration is also controlled by a robotic arm, fully automatic intrinsic and extrinsic calibration can be completed, with no manual intervention, simply by parking the vehicle at the designated position.
Referring to fig. 3, at the arranged calibration site, for each of the N vehicle-mounted surround-view cameras, a second calibration image is first acquired, and all corner points in the second calibration image are de-distorted using the calibrated first intrinsic parameters. The de-distortion process is as follows:
Let a point on the second calibration image be (x, y) and its de-distorted coordinate be (u, v).
The vehicle-mounted surround-view camera uses the following fisheye distortion model:

r = sqrt(u^2 + v^2), θ = arctan(r),
θ_d = θ (1 + k_1 θ^2 + k_2 θ^4 + k_3 θ^6 + k_4 θ^8),
x = f_x (θ_d / r) u + c_x,
y = f_y (θ_d / r) v + c_y,    (1)

where f_x, f_y, c_x, c_y, k_1, k_2, k_3, k_4 are the calibrated camera intrinsic parameters. Here f_x, f_y, c_x, c_y are the entries of the camera matrix

K = [f_x 0 c_x; 0 f_y c_y; 0 0 1],

k_1, k_2, k_3, k_4 are the camera distortion coefficients, i.e. D = (k_1, k_2, k_3, k_4), and θ_d is the distorted angle. The distortion model also shows that points near the camera edge are distorted more strongly, while points near the camera center are distorted only slightly or not at all. Therefore, more accurate camera matrices K and distortion coefficients can be obtained by calibrating the camera separately in the divided regions.
Defining the above transformation (1) as P(u, v) = (x, y) implements the process of applying distortion to de-distorted coordinates on the image. Inverting this transformation gives the process of de-distorting points on the distorted image, namely P^(-1)(x, y) = (u, v).
In the above de-distortion process, all points can be de-distorted using the first intrinsic parameters K and D, thereby obtaining the corrected image.
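The transformation P and its inverse P^(-1) can be sketched with NumPy as follows. The model matches the equidistant fisheye form of (1); the Newton iteration used to invert θ_d back to θ is one standard choice, not necessarily the one used in this application:

```python
import numpy as np

def distort_point(u, v, K, D):
    """P(u, v) = (x, y): apply fisheye distortion to a de-distorted
    normalized point (u, v), producing a distorted pixel (x, y)."""
    k1, k2, k3, k4 = D
    r = np.hypot(u, v)
    theta = np.arctan(r)
    theta_d = theta * (1 + k1 * theta**2 + k2 * theta**4
                       + k3 * theta**6 + k4 * theta**8)
    s = theta_d / r if r > 1e-12 else 1.0
    return K[0, 0] * s * u + K[0, 2], K[1, 1] * s * v + K[1, 2]

def undistort_point(x, y, K, D):
    """P^-1(x, y) = (u, v): invert the model by recovering theta from
    theta_d with a few Newton iterations."""
    k1, k2, k3, k4 = D
    a = (x - K[0, 2]) / K[0, 0]
    b = (y - K[1, 2]) / K[1, 1]
    theta_d = np.hypot(a, b)            # |(a, b)| equals theta_d
    theta = theta_d                     # initial guess for Newton's method
    for _ in range(10):
        f = theta * (1 + k1 * theta**2 + k2 * theta**4
                     + k3 * theta**6 + k4 * theta**8) - theta_d
        df = (1 + 3 * k1 * theta**2 + 5 * k2 * theta**4
              + 7 * k3 * theta**6 + 9 * k4 * theta**8)
        theta -= f / df
    s = np.tan(theta) / theta_d if theta_d > 1e-12 else 1.0
    return s * a, s * b
```

Round-tripping a point through distort_point and undistort_point should recover the original coordinates, which is a convenient self-check for a calibrated (K, D) pair.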
The corner points of the reference calibration plate B_j corresponding to the middle region of the corrected image are detected (the detection algorithm can be the same corner detection algorithm as in the intrinsic calibration process), giving the points {p_{b_j}^i}. Together with the actually measured 3D points {W_{b_j}^i}, a constraint equation is established:

p_{b_j}^i = K π(R W_{b_j}^i + t),    (2)

where π denotes the perspective division π([X, Y, Z]^T) = [X/Z, Y/Z, 1]^T, W_{b_j} are the actually measured 3D points of the middle-region corner points in the vehicle body coordinate system, K is the camera matrix from the first intrinsic parameters, and R and t are the camera extrinsic parameters to be computed, i.e. the pose of the camera in the vehicle body coordinate system: R is a rotation matrix and t is a translation vector.
Because the corner points in the reference calibration plate correspond one-to-one with 3D points in the vehicle body coordinate system, many constraint equations of the form (2) can be set up. Since the camera extrinsic parameters are unknown and corner detection is noisy, formula (2) holds only up to an error; summing the errors over all points, the following least-squares problem can be constructed:

(R*, t*) = argmin_{R, t} Σ_i || p_{b_j}^i - K π(R W_{b_j}^i + t) ||^2.    (3)

Formula (3) is a least-squares problem and can be solved by any convex optimization method, such as the Gauss-Newton or Levenberg-Marquardt method. For convenience, it can also be solved with the solvePnP() operator in OpenCV, or the loss can be minimized with the open-source Ceres Solver library, yielding the optimal camera extrinsic parameters. The R*, t* computed with the reference calibration plate are the first extrinsic parameters of the surround-view camera.
The corner points of the optimized calibration plate in the edge region of the second calibration image are difficult to detect directly. Their corresponding coordinates in the bird's-eye view can instead be computed from the world coordinates of the corner points in the reference calibration plate and the first external parameters:

x = (X_b + t_x) · pix_cm + I_w / 2
y = (Y_b + t_y) · pix_cm + I_h / 2    (4)

wherein I_w and I_h are the width and height of the camera image, (X_b, Y_b) is the actually measured 3D point of a corner point in the vehicle body coordinate system, (x, y) are the coordinates in the bird's-eye view, t_x and t_y are the x- and y-axis components of the translation vector t in the first external parameters, and pix_cm is the number of pixels per centimeter, which may typically be set to 1.
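The body-frame-to-bird's-eye-view mapping described above can be written as a small helper. This is a hedged sketch: the patent's formula (4) is only available as an image, so the image-center offsets and the pix_cm default below are assumptions:

```python
def body_to_bev(Xb, Yb, tx, ty, Iw, Ih, pix_cm=1.0):
    """Map a measured corner (Xb, Yb) in the vehicle body frame to
    bird's-eye-view pixel coordinates centered on an Iw x Ih image
    (assumed form of the mapping, for illustration only)."""
    x = (Xb + tx) * pix_cm + Iw / 2.0
    y = (Yb + ty) * pix_cm + Ih / 2.0
    return x, y
```

For example, the body-frame origin with zero translation lands at the image center, and pix_cm scales body-frame centimeters to pixels.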
The coordinates of the corner points of the reference calibration plate in the bird's-eye view are obtained with formula (4), and they correspond one-to-one to the de-distorted coordinates of the reference calibration plate in the camera view, so the homography matrix H between the de-distorted camera image and the bird's-eye view can be obtained by solving a linear system of equations:
p_BEV = H · p_W    (5)
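Solving the linear system of formula (5) for H from point correspondences is the standard Direct Linear Transform (DLT). The sketch below uses an assumed ground-truth homography and illustrative point coordinates to show the construction:

```python
import numpy as np

def find_homography(src, dst):
    """DLT: each correspondence contributes two rows of A; the homography
    vector h is the null vector of A (last row of V^T from the SVD)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1.0, 0.0, 0.0, 0.0, u * x, u * y, u])
        A.append([0.0, 0.0, 0.0, -x, -y, -1.0, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]          # normalize so H[2,2] = 1

# hypothetical ground-truth mapping (de-distorted camera coords -> BEV)
H_true = np.array([[1.2, 0.1, 5.0],
                   [0.0, 0.9, -3.0],
                   [0.001, 0.0, 1.0]])
src = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0],
                [100.0, 100.0], [50.0, 60.0]])
hom = np.hstack([src, np.ones((len(src), 1))])
proj = (H_true @ hom.T).T
dst = proj[:, :2] / proj[:, 2:3]         # projected BEV points

H = find_homography(src, dst)
```

In practice OpenCV's findHomography() performs the same computation with outlier rejection; at least four non-degenerate correspondences are required.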
after the homography matrix H is obtained, converting an input image of the optimized calibration plate in the edge area of the second calibration image into a bird's-eye view angle, and detecting the coordinates of the optimized calibration plate in the bird's-eye view by using the same algorithm as that used for detecting the corner of the reference calibration plate. Then, according to the inverse change of the homography matrix H in the formula (5), the angular point coordinates under the aerial view are converted into de-distortion coordinates of the camera, distortion is applied to the angular point coordinates through the formula (2), the angular point coordinates are mapped into an original second calibration image of the camera, and accurate sub-pixel coordinates of the angular point coordinates are calculated through a vector calculation mode. The subpixel coordinate detection algorithm can be obtained by adopting a corersubpix () algorithm in an open source opencv library, and can also be obtained by adopting a harrs corner detection algorithm.
After accurate sub-pixel corner points are obtained, de-distortion processing is performed on them using the second internal parameters calibrated from the edge region, an equation system of the form of formula (2) is constructed from the corresponding 3D points of the optimized calibration plate in the vehicle body coordinate system, and the second external parameters are solved by nonlinear optimization.
Combining the corner points from the first and second external-parameter calibration processes, an equation system of the form of formula (2) is constructed using the 3D points of both the reference calibration plate and the optimized calibration plate in the vehicle body coordinate system, and the third external parameters are obtained after nonlinear optimization.
Referring to fig. 4, after receiving a target image processing instruction, the vehicle-mounted looking-around system may select the corresponding combination of looking-around camera internal and external parameters according to the association relationship between each set of calibration parameters and each image processing instruction, thereby completing the image processing corresponding to the target image processing instruction. For example, the visual ranging task may use the combination of the first internal parameters and the first external parameters, the panoramic looking-around splicing task may use the combination of the second internal parameters and the third external parameters, and the remaining tasks may use the combination of the third internal parameters and the third external parameters; other combinations may be used when the above do not meet a task's precision requirement. The scheme is not limited to three sets each of internal and external parameters; more combinations can be generated for specific use conditions.
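The task-dependent selection of a parameter combination can be sketched as a simple lookup table; the task names and parameter labels below are illustrative placeholders, not identifiers from the patent:

```python
# hypothetical association between image-processing tasks and
# (internal parameters, external parameters) combinations
PARAM_COMBOS = {
    "visual_ranging": ("first_intrinsics", "first_extrinsics"),
    "panoramic_stitching": ("second_intrinsics", "third_extrinsics"),
}
DEFAULT_COMBO = ("third_intrinsics", "third_extrinsics")

def select_params(task):
    """Return the calibration-parameter combination for a task,
    falling back to the default combination for the remaining tasks."""
    return PARAM_COMBOS.get(task, DEFAULT_COMBO)
```

A real system would map each instruction code to loaded calibration data rather than string labels, but the association-table pattern is the same.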
Referring to fig. 5, fig. 5 is a flowchart of another embodiment of the parameter calibration method of the present application. The method comprises the following specific steps:
step S510: and acquiring image information of different calibration areas.
In some embodiments, a camera may be used to capture a calibration object to obtain a calibration image, and the calibration image may be cropped according to each calibration area to obtain image information of each calibration area.
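The cropping of a calibration image into per-region image information can be sketched as follows; the 25% border fraction and the exact middle/edge split are assumptions for illustration, since the patent does not fix the region geometry in this passage:

```python
import numpy as np

def crop_regions(img, frac=0.25):
    """Split a calibration image into a middle-region crop, an edge-region
    boolean mask, and the global image (assumed region geometry)."""
    h, w = img.shape[:2]
    bh, bw = int(h * frac), int(w * frac)
    middle = img[bh:h - bh, bw:w - bw]       # central crop
    edge = np.ones((h, w), dtype=bool)       # everything outside the center
    edge[bh:h - bh, bw:w - bw] = False
    return middle, edge, img

img = np.zeros((480, 640), dtype=np.uint8)
middle, edge, whole = crop_regions(img)
```

Each returned region then feeds its own feature-point extraction and calibration, yielding one parameter set per region.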
Step S520: and extracting the characteristic points of the image information.
In some embodiments, after the image information of the different calibration areas is acquired, the images are screened to obtain effective image information, i.e., images in which most of the area is occupied by the calibration object. After the effective image information is obtained, feature point detection is performed on it and the feature points are extracted.
Step S530: and acquiring camera calibration parameters based on the feature points.
This step is the same as step S1312 and step S1322 described above, and will not be described in detail here.
Step S540: and combining the calibration parameters based on different calibration areas to obtain a plurality of combination schemes.
This step is the same as step S130 described above, and will not be described in detail here.
In this scheme, the internal and external parameters of the camera can be calibrated for different regions, so that the calibrated internal and external parameters have higher accuracy. The different internal and external parameters obtained through calibration can then be freely combined and matched according to the task, covering different application scenarios.
It will be appreciated by those skilled in the art that, in the above methods of the specific embodiments, the written order of the steps does not imply a strict order of execution; the specific execution order of each step should be determined by its function and possible inherent logic.
Referring to fig. 6, fig. 6 is a schematic frame diagram of an embodiment of an electronic device 60 of the present application. The electronic device 60 comprises a memory 61 and a processor 62 coupled to each other, the processor 62 being adapted to execute program instructions stored in the memory 61 to implement the steps of any of the parameter calibration method embodiments described above. In one specific implementation scenario, the electronic device 60 may include, but is not limited to, a microcomputer and a server; the electronic device 60 may also include mobile devices such as a notebook computer and a tablet computer, which is not limited herein.
In particular, the processor 62 is adapted to control itself and the memory 61 to implement the steps of any of the parameter calibration method embodiments described above. The processor 62 may also be referred to as a CPU (Central Processing Unit). The processor 62 may be an integrated circuit chip having signal processing capabilities. The processor 62 may also be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. In addition, the processor 62 may be jointly implemented by integrated circuit chips.
Referring to FIG. 7, FIG. 7 is a schematic diagram illustrating an embodiment of a computer-readable storage medium 70 of the present application. The computer readable storage medium 70 stores program instructions 701 capable of being executed by a processor, the program instructions 701 being used to implement the steps of any of the parameter calibration method embodiments described above.
In some embodiments, functions or modules included in an apparatus provided by the embodiments of the present disclosure may be used to perform a method described in the foregoing method embodiments, and specific implementations thereof may refer to descriptions of the foregoing method embodiments, which are not repeated herein for brevity.
The foregoing description of the various embodiments is intended to highlight the differences between them; for parts that are the same or similar, the embodiments may refer to one another, and these are not repeated herein for brevity.
In the several embodiments provided in the present application, it should be understood that the disclosed methods and apparatus may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical functional division, and there may be additional divisions of actual implementation, e.g., units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical, or other forms.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other various media capable of storing program code.

Claims (10)

1. A parameter calibration method, characterized by comprising the following steps:
obtaining a calibration image obtained by shooting a calibration object by a camera;
determining image information of each region in the calibration image, wherein each region in the calibration image is obtained by dividing the calibration image into regions;
and determining a plurality of groups of calibration parameters of the camera based on the image information of each region in the calibration image, wherein each group of calibration parameters is obtained by acquiring the image information of one region, and each group of calibration parameters is used for image processing.
2. The method of claim 1, wherein each region comprises at least one of a middle region, an edge region, and a global region, and each set of calibration parameters comprises camera internal parameters corresponding to one region and camera external parameters corresponding to one region.
3. The method according to claim 1 or 2, wherein the calibration image comprises a first calibration image, each set of calibration parameters comprises camera internal parameters corresponding to one region, and determining the plurality of sets of calibration parameters of the camera based on the image information of each region in the calibration image comprises:
performing feature detection on each region in the first calibration image to obtain a plurality of first feature points;
and for each region, acquiring the camera internal parameters corresponding to each region by using each first characteristic point in the region through a Zhang Zhengyou calibration method.
4. The method according to claim 1 or 2, wherein the calibration image includes a second calibration image, each set of calibration parameters includes a camera external parameter corresponding to an area, and determining calibration parameters of the plurality of sets of cameras based on image information of each area in the calibration image includes:
performing feature detection on each region in the second calibration image to obtain a plurality of second feature points;
and for each region, determining the camera external parameters corresponding to each region based on the corresponding relation between each second characteristic point and the three-dimensional point corresponding to each second characteristic point in the region.
5. The method of claim 4, further comprising, prior to said feature detecting each of said regions in said second calibration image to obtain a plurality of second feature points:
performing de-distortion treatment on the second calibration image to obtain a corrected image;
performing feature detection on each region in the second calibration image to obtain a plurality of second feature points, including:
and carrying out feature detection on each region in the corrected image to obtain the plurality of second feature points.
6. The method of claim 5, wherein the region comprises an intermediate region, the de-distorting the second calibration image to obtain a corrected image, comprising:
establishing a de-distortion model by using camera internal parameters corresponding to the middle area;
performing de-distortion treatment on distortion points in the second calibration image by using the de-distortion model;
and replacing the distortion points in the second calibration image with the distortion points subjected to the de-distortion treatment to obtain a corrected image.
7. The method of claim 6, wherein the region further comprises an edge region, the de-distorting the second calibration image to obtain a corrected image, comprising:
performing de-distortion processing on the second calibration image by using de-distortion parameters to obtain a corrected image, wherein the de-distortion parameters are obtained by converting the first calibration image de-distorted in the middle area into a bird's-eye view, and the corrected image is the bird's-eye view;
and performing feature detection on each region in the corrected image to obtain the plurality of second feature points, including:
obtaining de-distortion characteristic points in the corrected image;
carrying out distortion treatment on the de-distortion characteristic points and projecting the distortion treatment to the second calibration image;
and performing de-distortion processing on the de-distortion characteristic points subjected to the distortion processing in the second calibration image by using the camera internal parameters corresponding to the edge region to obtain a plurality of second characteristic points.
8. The method according to claim 1, wherein the method further comprises:
receiving a target image processing instruction;
determining target calibration parameters corresponding to the target image processing instructions based on the association relation between each group of calibration parameters and each image processing instruction;
and performing image processing corresponding to the target image processing instruction by using the target calibration parameters.
9. An electronic device comprising a memory and a processor coupled to each other, the processor being configured to execute program instructions stored in the memory to implement the parameter calibration method of any one of claims 1 to 8.
10. A computer readable storage medium having stored thereon program instructions, which when executed by a processor, implement the parameter calibration method of any of claims 1 to 8.
CN202310024917.4A 2023-01-06 2023-01-06 Parameter calibration method, electronic equipment and storage medium Pending CN116051652A (en)

Publications (1)

Publication Number Publication Date
CN116051652A true CN116051652A (en) 2023-05-02


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116952132A (en) * 2023-07-27 2023-10-27 湖南视比特机器人有限公司 Partition calibration method and system for multi-vision measurement



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination