CN110738707B - Distortion correction method and device for camera, equipment and storage medium - Google Patents


Info

Publication number
CN110738707B
CN110738707B (application CN201910983234.5A)
Authority
CN
China
Prior art keywords
distortion
pixel coordinates
orthodontic
camera
pixel
Prior art date
Legal status
Active
Application number
CN201910983234.5A
Other languages
Chinese (zh)
Other versions
CN110738707A (en)
Inventor
郭建亚
李骊
Current Assignee
Beijing HJIMI Technology Co Ltd
Original Assignee
Beijing HJIMI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing HJIMI Technology Co Ltd filed Critical Beijing HJIMI Technology Co Ltd
Priority to CN201910983234.5A
Publication of CN110738707A
Application granted
Publication of CN110738707B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30204 - Marker
    • G06T2207/30208 - Marker matrix

Abstract

The application discloses a distortion correction method and device for a camera, equipment and a storage medium. The method comprises the steps of obtaining a plane array image acquired by a camera and an orthodontic plane array image obtained by correcting distortion of the plane array image, obtaining pixel coordinates of preset feature points in the plane array image as distorted pixel coordinates, and obtaining pixel coordinates of the preset feature points in the orthodontic plane array image as orthodontic pixel coordinates. According to the orthodontic pixel coordinates, the predicted pixel coordinates of the preset feature points are obtained, the distortion parameters of the camera are calculated at least according to the distorted pixel coordinates and the predicted pixel coordinates, and the images acquired by the camera are corrected by using the distortion parameters. As can be seen from the above steps, the calculation process of the distortion parameters only needs to use pixel coordinates, so that external parameters of the camera and other internal parameters except the distortion parameters are not required to be introduced, and therefore, the calculation amount can be significantly reduced.

Description

Distortion correction method and device for camera, equipment and storage medium
Technical Field
The present invention relates to the field of electronic information, and in particular, to a method and apparatus for correcting distortion of a camera, a device, and a storage medium.
Background
Since there is distortion in the lens of the camera, in order to avoid the influence of distortion on the image captured by the camera, distortion correction of the image is generally required.
The existing distortion correction methods are defined in the image coordinate system, so that, in addition to the distortion parameters, intrinsic parameters of the camera such as the focal length and the principal point, as well as extrinsic parameters such as the transformation matrix from the world coordinate system to the image coordinate system, also need to be calibrated. Moreover, these parameters must all take part in the image de-distortion process. Therefore, the conventional distortion correction method of a camera involves a large amount of calculation.
Disclosure of Invention
The application provides a distortion correction method and device, equipment and a storage medium of a camera, and aims to solve the problem that the calculated amount of the distortion correction method of the camera is large.
In order to achieve the above object, the present application provides the following technical solutions:
a distortion correction method of a camera, comprising:
acquiring a plane array image and an orthodontic plane array image, wherein the plane array image is acquired by a camera, and the orthodontic plane array image is obtained by correcting distortion of the plane array image;
acquiring pixel coordinates of preset feature points in the planar array image as distorted pixel coordinates;
acquiring pixel coordinates of the preset feature points in the orthodontic plane array image as orthodontic pixel coordinates;
acquiring predicted pixel coordinates of the preset feature points according to the orthodontic pixel coordinates;
calculating distortion parameters of the camera at least according to the distorted pixel coordinates and the predicted pixel coordinates;
correcting an image acquired by the camera using the distortion parameters.
Optionally, according to the orthodontic pixel coordinate, obtaining a predicted pixel coordinate of the preset feature point includes:
acquiring a predicted line, wherein the predicted line corresponding to an ith line of orthodontic pixels in the orthodontic plane array image is determined by a slope i and a pixel coordinate of a reference point i, the slope i is a slope of a line fitting line of the ith line of orthodontic pixels, the reference point i is an orthodontic pixel point closest to a preset principal point or an orthodontic pixel point closest to a perpendicular line of the line fitting line, and the perpendicular line is a line passing through the principal point and perpendicular to the line fitting line;
acquiring a predicted column straight line, wherein the predicted column straight line corresponding to a j-th column orthodontic pixel in the orthodontic plane array image is determined by a slope j and a pixel coordinate of a reference point j, the slope j is a slope of a column fitting straight line of the j-th column orthodontic pixel, the reference point j is an orthodontic pixel point closest to a preset principal point or an orthodontic pixel point closest to a perpendicular line of the column fitting straight line in the j-th column orthodontic pixel, and the perpendicular line of the column fitting straight line is a straight line passing through the principal point and perpendicular to the column fitting straight line;
and taking the intersection point of the predicted line and the predicted column line as the predicted pixel coordinate.
Optionally, calculating a distortion parameter of the camera at least from the distorted pixel coordinates and the predicted pixel coordinates, including:
substituting the distorted pixel coordinates and the predicted pixel coordinates into a preset distortion model to obtain a first distortion equation set; the distortion model is used for representing the distorted pixel coordinates and the predicted pixel coordinates and generating a transformation relation according to the distortion parameters;
and obtaining the distortion parameters by solving a least square solution of an equation set, wherein the equation set at least comprises the first distortion equation set.
Optionally, the system of equations further includes:
and the second distortion equation set is obtained by substituting the predicted pixel coordinates and the distortion pixel coordinates of the preset virtual feature points into the distortion model.
Optionally, calculating a distortion parameter of the camera at least according to the distorted pixel coordinates and the predicted pixel coordinates, further includes:
after determining the distortion parameters, if the distortion parameters do not converge, performing the following procedures, and iteratively calculating the distortion parameters until the distortion parameters converge: correcting the plane array image by using the distortion parameters obtained in the last iteration process, updating the orthodontic plane array image, and obtaining pixel coordinates of the preset characteristic points in the orthodontic plane array image as orthodontic pixel coordinates; acquiring predicted pixel coordinates of the preset feature points according to the orthodontic pixel coordinates; and calculating the distortion parameters of the camera at least according to the distorted pixel coordinates and the predicted pixel coordinates.
Optionally, the condition for convergence of the distortion parameter includes at least one of:
calculating the iteration times of the distortion parameters to reach a preset value;
the deviation between the value of the distortion parameter obtained in the current iteration process and the value of the distortion parameter obtained in the last iteration process is smaller than a first preset threshold value;
the deviation between the current distorted pixel coordinate and the historical distorted pixel coordinate is smaller than a second preset threshold value, the current distorted pixel coordinate is determined by using the distorted parameter obtained in the current iteration process and a preset real pixel coordinate, and the historical distorted pixel coordinate is determined by using the distorted parameter obtained in the last iteration process and the real pixel coordinate;
and the deviation between the predicted pixel coordinates and the orthodontic pixel coordinates, which are determined by using the distortion parameters obtained in the current iterative process, is smaller than a third preset threshold.
Optionally, the distortion parameters in the distortion model include:
the multi-order radial distortion model order, the radial distortion parameters, and the tangential distortion parameters.
A distortion correction apparatus of a camera, comprising:
the image acquisition unit is used for acquiring a plane array image and an orthodontic plane array image, wherein the plane array image is acquired by a camera, and the orthodontic plane array image is obtained by correcting distortion of the plane array image;
a first coordinate acquiring unit, configured to acquire pixel coordinates of a preset feature point in the planar array image as distorted pixel coordinates;
the second coordinate acquisition unit is used for acquiring pixel coordinates of the preset characteristic points in the orthodontic plane array image and taking the pixel coordinates as orthodontic pixel coordinates;
the third coordinate acquisition unit is used for acquiring the predicted pixel coordinates of the preset characteristic points according to the orthodontic pixel coordinates;
a distortion parameter calculation unit for calculating a distortion parameter of the camera at least according to the distorted pixel coordinates and the predicted pixel coordinates;
and the image correction unit is used for correcting the image acquired by the camera by using the distortion parameters.
A distortion correction apparatus of a camera, comprising: a memory and a processor;
the memory is used for storing programs;
the processor is configured to execute the program to implement the respective steps of the distortion correction method of the camera as described above.
A readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of a distortion correction method for a camera as described above.
According to the technical scheme, a plane array image acquired by a camera and an orthodontic plane array image obtained by correcting distortion of the plane array image are acquired, pixel coordinates of preset feature points in the plane array image are acquired and used as distorted pixel coordinates, and pixel coordinates of preset feature points in the orthodontic plane array image are acquired and used as orthodontic pixel coordinates. According to the orthodontic pixel coordinates, the predicted pixel coordinates of the preset feature points are obtained, the distortion parameters of the camera are calculated at least according to the distorted pixel coordinates and the predicted pixel coordinates, and the images acquired by the camera are corrected by using the distortion parameters. As can be seen from the above steps, the calculation process of the distortion parameters only needs to use pixel coordinates, so that external parameters of the camera and other internal parameters other than the distortion parameters do not need to be introduced, and therefore, the calculation amount can be significantly reduced.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of a distortion correction system of a camera according to an embodiment of the present disclosure;
fig. 2 is a flowchart of a distortion correction method of a camera according to an embodiment of the present disclosure;
FIG. 3 illustrates a planar array view;
fig. 4 is a flowchart of obtaining predicted pixel coordinates of a preset feature point by using a straight line fitting method according to an embodiment of the present application;
FIG. 5 is a flowchart of a method for calculating distortion parameters of a camera according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a distortion correction apparatus of a camera according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a distortion correction apparatus of a camera according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
Fig. 1 shows a distortion correction system of a camera according to an embodiment of the present application, which includes a planar array pattern, a camera and a computing device. The camera shoots the planar array pattern to obtain a planar array image, and shoots other scenes to obtain scene images; the planar array image and the scene images are collectively called original images (i.e., images with distortion).
The computing device internally comprises a distortion correction unit, and the distortion correction unit runs the distortion correction method of the camera.
Fig. 2 is a distortion correction method of a camera according to an embodiment of the present application, including the following steps:
s201: and acquiring a plane array image and an orthodontic plane array image.
The plane array image is an image obtained by collecting the plane array pattern by the camera. Examples of planar array patterns are shown in fig. 3, which may be black and white checkerboard patterns (left in fig. 3) or circular array patterns (right in fig. 3).
The orthodontic plane array image is an image obtained by performing distortion correction on the plane array image. Specifically, the distortion parameter is used to correct the distortion of the planar array image, and in this embodiment, the initial value of the distortion parameter is set to 0, and in this case, the initial orthodontic planar array image is the planar array image.
Of course, the initial value of the distortion parameter may be set to other empirical values, and the distortion correction may be performed on the planar array image by using other empirical values, so as to obtain an initial orthodontic planar array image.
Of course, other ways of obtaining an orthodontic planar array image may be used, and are not limited herein.
S202: and acquiring pixel coordinates of preset feature points in the planar array image as distorted pixel coordinates.
The preset feature points are selected according to the characteristics of the planar array pattern. For example, for a black-and-white checkerboard, with the upper left corner of each cell as a preset feature point, the pixel coordinates of the preset feature point may be detected by using a corner detection method. For a circular array pattern, the center of a circle is used as a preset feature point, and the pixel coordinates of the preset feature point can be detected by adopting a circle detection and circle center calculation method.
The above specific detection algorithms can be seen in the prior art. Optionally, after detecting the pixel coordinates of the preset feature points, a subpixel interpolation method may also be used to obtain more accurate pixel coordinates of the preset feature points, which may be specifically referred to the prior art and will not be described herein.
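For illustration only, the detection step described above can be sketched in Python with OpenCV as follows; the 9×6 pattern size and the Hough parameters are assumed values, and OpenCV's checkerboard detector returns inner corners rather than the upper-left corner of each cell, so this is a sketch under assumptions and not part of the patented method.
import cv2

def detect_checkerboard_points(gray, pattern_size=(9, 6)):
    # Detect the inner corners of a black-and-white checkerboard and refine
    # them to sub-pixel accuracy, giving one (u, v) pixel coordinate per point.
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    if not found:
        return None
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
    corners = cv2.cornerSubPix(gray, corners, (5, 5), (-1, -1), criteria)
    return corners.reshape(-1, 2)

def detect_circle_centers(gray):
    # Detect circle centers of a circular array pattern; the Hough parameters
    # here are placeholders and would need tuning for a real pattern.
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.5, minDist=20,
                               param1=100, param2=30)
    return None if circles is None else circles[0, :, :2]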
Let the planar array image have resolution N×M, where N is the resolution in the vertical direction and M is the resolution in the horizontal direction. It is assumed that the planar array pattern includes n rows and m columns of array elements (e.g., one cell of a checkerboard pattern or one circle of a circular pattern), and that each array element contains a preset feature point (hereinafter simply referred to as a feature point). Let i be the row number of an array element in the array, i=1, 2, …, n; let j be the column number of an array element in the array, j=1, 2, …, m; and let t=i·m+j. The pixel coordinates of the feature point of the t-th array element in the original planar array image are denoted (u_d^t, v_d^t).
For ease of distinction in the later description, the pixels corresponding to the preset feature points in the planar array image are referred to as distorted pixels, and the pixel coordinates obtained in this step are referred to as distorted pixel coordinates.
S203: and acquiring pixel coordinates of preset characteristic points in the orthodontic plane array image, and taking the pixel coordinates as orthodontic pixel coordinates.
Let the orthodontic plane array image also have resolution N×M. The pixel coordinates of the preset feature points in the orthodontic plane array image are obtained using the same detection method as for the planar array image. The pixel coordinates of the feature point of the t-th array element in the orthodontic plane array image are denoted (u_c^t, v_c^t).
For ease of distinction in the later description, the pixels corresponding to the preset feature points in the orthodontic plane array image are referred to as orthodontic pixels, and the pixel coordinates obtained in this step are referred to as orthodontic pixel coordinates.
S204: and obtaining predicted pixel coordinates of the preset characteristic points according to the orthodontic pixel coordinates.
Specifically, a linear fitting mode may be adopted to obtain predicted pixel coordinates of the preset feature points:
and obtaining a predicted line, wherein the predicted line corresponding to the ith line of orthodontic pixels in the orthodontic plane array image is determined by the slope i and the pixel coordinate of a reference point i, the slope i is the slope of a line fitting line of the ith line of orthodontic pixels, the reference point i is the orthodontic pixel point closest to a preset main point in the ith line of orthodontic pixels, or the orthodontic pixel point closest to the vertical line of the line fitting line, and the vertical line of the line fitting line is a line passing through the main point and vertical to the line fitting line.
And obtaining a predicted column straight line, wherein the predicted column straight line corresponding to the j-th column orthodontic pixel in the orthodontic plane array image is determined by a slope j and a pixel coordinate of a reference point j, the slope j is the slope of a column fitting straight line of the j-th column orthodontic pixel, the reference point j is an orthodontic pixel point closest to a preset main point or an orthodontic pixel point closest to a vertical line of the column fitting straight line in the j-th column orthodontic pixel, and the vertical line of the column fitting straight line is a straight line passing through the main point and vertical to the column fitting straight line.
And taking the intersection point of the predicted line and the predicted column line as predicted pixel coordinates.
The pixel coordinates (u_0, v_0) of the principal point can be obtained by other camera intrinsic calibration methods (see the prior art for details), or can be approximated by the pixel coordinates of the image center, (0.5M+0.5, 0.5N+0.5), where N is the vertical resolution of the image and M is the horizontal resolution of the image.
A more detailed implementation of the steps of obtaining predicted pixel coordinates of the preset feature points is shown in the flowchart of fig. 4.
In the following description, the predicted pixel coordinates of the feature point of the t-th array element are expressed as P_t = (u_t, v_t).
S205: and calculating distortion parameters of the camera at least according to the distortion pixel coordinates and the predicted pixel coordinates.
Specifically, the distortion model represents the transformation relation between the distorted pixel coordinates and the predicted pixel coordinates that is generated by the distortion parameters.
Optionally, in this embodiment, the distortion model is as shown in formula (1):
[Formula (1), the distortion model, is given as an image in the original publication and is not reproduced here; it maps the ideal pixel coordinates (u, v) to the distorted pixel coordinates (u_d, v_d).]
In formula (1), q is the radial distortion model order; k_1, k_2, …, k_q are the radial distortion parameters; p_1, p_2 are the tangential distortion parameters; (u_d, v_d) are the distorted pixel coordinates; (u, v) are the ideal pixel coordinates; (u_0, v_0) are the pixel coordinates of the principal point of the image; and R is a pixel coordinate normalization coefficient, which is a constant.
In the present embodiment, the constant R in the distortion model can be calculated using the following formula:
R = 0.5·(M^2 + N^2)^0.5
where M and N are the horizontal and vertical image resolutions, respectively; this formula sets the value of R to half the diagonal length of the image.
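Since formula (1) itself is only available as an image, the following sketch shows a generic q-order radial plus tangential model of the kind the surrounding text describes (normalized coordinates, parameters k_1…k_q, p_1, p_2, principal point (u_0, v_0), normalization constant R). The exact form used in the patent may differ, so this is an assumed illustration, not the patented formula.
def distort(u, v, k, p1, p2, u0, v0, R):
    # Map ideal pixel coordinates (u, v) to distorted pixel coordinates.
    # k = [k_1, ..., k_q] are radial terms, p1/p2 tangential terms; this is a
    # generic Brown-style model assumed here for illustration only.
    x = (u - u0) / R
    y = (v - v0) / R
    r2 = x * x + y * y
    radial = 1.0 + sum(ki * r2 ** (s + 1) for s, ki in enumerate(k))
    xd = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    yd = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return xd * R + u0, yd * R + v0

# Example with M = 640, N = 480: R is half the image diagonal, and the
# principal point is approximated by the image center (0.5M+0.5, 0.5N+0.5).
R = 0.5 * (640.0 ** 2 + 480.0 ** 2) ** 0.5
ud, vd = distort(100.0, 80.0, k=[0.1, -0.05], p1=1e-3, p2=-1e-3,
                 u0=320.5, v0=240.5, R=R)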
Therefore, the distorted pixel coordinates and the predicted pixel coordinates are substituted into the distortion model to obtain an equation set, and the distortion parameters are obtained by solving the least-squares solution of that equation set.
Specific solving processes can be seen in the prior art.
Optionally, in order to prevent over-fitting and obtain an optimal solution, virtual feature points may also be used as an additional calculation basis for the distortion parameters; the specific calculation flow of the distortion parameters is shown in fig. 5.
S206: using the distortion parameters, the image acquired by the camera is corrected.
Let the de-distorted image be I, with resolution N×M, where N is the vertical resolution and M is the horizontal resolution. Let l = i·M + j, and let the pixel coordinates of the pixel in row i and column j be expressed as (u_l, v_l); obviously u_l = j and v_l = i. The image de-distortion method comprises the following steps:
A1. Using the obtained distortion parameters, substitute the pixel coordinates (u_l, v_l) of the pixel in row i and column j into formula (1) as the ideal pixel coordinates, and calculate the corresponding distorted pixel coordinates (u_d^l, v_d^l).
A2. According to the distorted pixel coordinates (u_d^l, v_d^l), fetch the neighborhood pixels of that location from the original (distorted) image.
A3. Perform pixel interpolation on the pixel values of these neighborhood pixels to obtain the new pixel value at the coordinates (u_d^l, v_d^l) in the original image. The pixel interpolation can be implemented with nearest-neighbor interpolation, bilinear interpolation, bicubic interpolation, or other methods.
A4. Use the interpolated pixel value to fill the value of the pixel I(u_l, v_l) in row i and column j of the de-distorted image.
It should be noted that the above distortion correction method is only an example, and other existing methods may be used to correct the image according to the distortion parameters.
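For illustration, steps A1-A4 can be sketched as the following inverse-mapping loop. It assumes the illustrative distort() helper sketched above and uses bilinear interpolation; it is only a sketch of the idea, not the implementation claimed by the patent.
import numpy as np

def undistort_image(src, k, p1, p2, u0, v0, R):
    # Build the de-distorted image I pixel by pixel: for each ideal pixel
    # (u_l, v_l) = (j, i), look up the distorted location in the original
    # (distorted) image and interpolate a value there (steps A1-A4).
    N, M = src.shape[:2]
    dst = np.zeros_like(src)
    for i in range(N):
        for j in range(M):
            ud, vd = distort(float(j), float(i), k, p1, p2, u0, v0, R)  # A1
            x0, y0 = int(np.floor(ud)), int(np.floor(vd))               # A2
            if 0 <= x0 < M - 1 and 0 <= y0 < N - 1:
                a, b = ud - x0, vd - y0                                  # A3: bilinear
                val = ((1 - a) * (1 - b) * src[y0, x0] + a * (1 - b) * src[y0, x0 + 1] +
                       (1 - a) * b * src[y0 + 1, x0] + a * b * src[y0 + 1, x0 + 1])
                dst[i, j] = val                                          # A4
    return dst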
The distortion correction method of the camera shown in fig. 2 has the following effects:
1. The calculation operates on pixel coordinates rather than image coordinates, so the distortion parameters can be calibrated independently during distortion calibration; the dependency on the other intrinsic parameters and on the extrinsic parameters is removed, only the distortion parameters take part in the image de-distortion process, and the amount of calculation is reduced.
2. Because the method is based on pixel coordinates, the distortion parameters can be calibrated with only one planar array image, and multiple planar array images shot from different angles are not needed, so the calculation efficiency is high.
3. Most of the existing camera distortion calibration methods do not consider tangential distortion, so that the tangential distortion of a camera cannot be corrected. The distortion model of the present embodiment includes tangential distortion parameters, so that high-order tangential distortion of the camera can be corrected.
4. The radial distortion model adopted by existing camera distortion calibration methods is of low order; most support only 1st- to 3rd-order radial distortion, can only correct relatively small radial distortion, and are not suitable for distortion correction of fisheye cameras. The order q of the radial distortion in the distortion model of this embodiment can be set, and radial distortion of order 4 and above is supported, so distortion correction of fisheye cameras is possible.
Fig. 4 is a specific flow of obtaining predicted pixel coordinates of a preset feature point by using a line fitting method, which includes the following steps:
Let the feature point set of the i-th row of array elements in the orthodontic plane array image be denoted S_i. The predicted line of the feature points of the i-th row of array elements is obtained according to S401-S404:
S401: Determine the row fitting line of the point set S_i using a line fitting method, and assume that the slope of this line is γ_i.
S402: Find the equation of the line passing through the principal point (u_0, v_0) and perpendicular to the best-fit line: u = -γ_i·v + (u_0 + γ_i·v_0). This line is called the perpendicular of the row fitting line.
S403: Find the point in S_i closest to the perpendicular of the row fitting line, and assume its pixel coordinates are (α_i, β_i).
S404: Find the equation of the line passing through the point (α_i, β_i) with slope γ_i: v = γ_i·u + (β_i - γ_i·α_i). This line is called the predicted line of the feature points of the i-th row of array elements.
When the distortion of the camera is small, in the process of computing the predicted line of the feature points of the i-th row of array elements, S403 may be replaced by finding the point in S_i closest to the principal point (u_0, v_0), so as to reduce the amount of calculation.
Let the feature point set of the j-th column of array elements in the orthodontic plane array image be denoted T_j. The predicted column line of the feature points of the j-th column of array elements is obtained according to S405-S408:
S405: Determine the best-fit line (i.e., the column fitting line) of the point set T_j using a line fitting method, and assume that the slope of this line is λ_j.
S406: Find the equation of the line passing through the principal point (u_0, v_0) and perpendicular to the best-fit line: v = -λ_j·u + (v_0 + λ_j·u_0). This line is called the perpendicular of the column fitting line.
S407: Find the point in T_j closest to the perpendicular of the column fitting line, and assume its pixel coordinates are (ρ_j, σ_j).
S408: Find the equation of the line passing through the point (ρ_j, σ_j) with slope λ_j: u = λ_j·v + (ρ_j - λ_j·σ_j). This line is called the predicted column line of the feature points of the j-th column of array elements.
When the distortion of the camera is small, in the process of computing the predicted column line of the feature points of the j-th column of array elements, S407 may be replaced by finding the point in T_j closest to the principal point (u_0, v_0), so as to reduce the amount of calculation.
S409: the intersection of the predicted line and predicted column lines is used as the predicted pixel coordinates.
Specifically, since the predicted pixel coordinates (u t ,v t ) Is the intersection point of two predicted straight lines, and the coordinates are substituted into two straight line equations to obtain:
v t =γ i u t +(β ii α i )
Figure GDA0003813600880000107
the predicted pixel coordinates (u t ,v t ) Is the above binary once equationSolution of group.
In the flow shown in fig. 4, the predicted pixel coordinates can be obtained using only a linear algorithm, and therefore, the calculation speed is fast and the hardware implementation is easy.
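The row/column fitting and the intersection can be sketched as follows, using the simplified variant in which the anchor point is the one closest to the principal point. The parameterization (rows as v = γ·u + c, columns as u = λ·v + c) follows the reconstruction above and is an assumption where the original formula images are missing, so this is an illustration only.
import numpy as np

def predicted_point(row_pts, col_pts, u0, v0):
    # row_pts, col_pts: arrays of (u, v) orthodontic feature points lying on
    # one row and one column of the array pattern, respectively.
    gamma, _ = np.polyfit(row_pts[:, 0], row_pts[:, 1], 1)   # row fit: v = gamma*u + c
    alpha, beta = min(row_pts, key=lambda p: (p[0] - u0) ** 2 + (p[1] - v0) ** 2)
    lam, _ = np.polyfit(col_pts[:, 1], col_pts[:, 0], 1)     # column fit: u = lam*v + c
    rho, sigma = min(col_pts, key=lambda p: (p[0] - u0) ** 2 + (p[1] - v0) ** 2)
    # Intersection of v = gamma*u + (beta - gamma*alpha) and
    # u = lam*v + (rho - lam*sigma), solved as a 2x2 linear system.
    A = np.array([[-gamma, 1.0], [1.0, -lam]])
    b = np.array([beta - gamma * alpha, rho - lam * sigma])
    u_t, v_t = np.linalg.solve(A, b)
    return u_t, v_t
In practice the row and column fits would be computed once per row and per column, and the intersection evaluated for every (i, j) pair of the array.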
Fig. 5 is a flowchart of calculating distortion parameters of a camera according to at least distorted pixel coordinates and predicted pixel coordinates according to an embodiment of the present application, including the following steps:
s501: and setting the initial distortion parameter to be 0, namely setting the initial orthodontic plane array image as a plane array image.
S502: and obtaining predicted pixel coordinates and distorted pixel coordinates of preset virtual feature points.
To prevent over-fitting, in this embodiment, edge pixel points of the planar array image are selected as virtual feature points; the distorted pixel coordinates of the virtual feature points are obtained from the planar array image, and the predicted pixel coordinates of the virtual feature points are obtained from the predicted pixel coordinates of the orthodontic plane array image.
S503: substituting the distorted pixel coordinates and the predicted pixel coordinates into a preset distortion model to obtain a first distortion equation set.
The distorted pixel coordinates are the pixel coordinates of the preset feature points in the current planar array image, and the predicted pixel coordinates are obtained from the pixel coordinates of the preset feature points in the current orthodontic plane array image.
S504: substituting the predicted pixel coordinates and the distorted pixel coordinates of the virtual feature points into the distortion model to obtain a second distortion equation set.
S505: and obtaining distortion parameters by solving a least square solution of the first distortion equation set and the second distortion equation set.
Specifically, let w = m·n. Based on the predicted pixel coordinates and the distorted pixel coordinates of the feature point of each array element, substituting the predicted pixel coordinates into the distortion model of formula (1) as the ideal pixel coordinates yields the first distortion equation set, equations (2) and (3).
[Equations (2) and (3) are given as images in the original publication and are not reproduced here.]
In these two formulas, x_t = (u_t - u_0)·R^(-1) and y_t = (v_t - v_0)·R^(-1) (the remaining definitions are given as an image in the original publication).
Based on the predicted pixel coordinates and the distorted pixel coordinates of the virtual feature points, substituting the predicted pixel coordinates into the distortion model of formula (1) as the ideal pixel coordinates yields the second distortion equation set, equations (4) and (5).
[Equations (4) and (5) are given as images in the original publication and are not reproduced here.]
In these two formulas, X_t = (U_t - u_0)·R^(-1) and Y_t = (V_t - v_0)·R^(-1) (the remaining definitions are given as an image in the original publication).
The distortion equations (2), (3), (4) and (5) of the w feature points and the c virtual feature points are combined. In order to reduce the contribution of the virtual feature points to the distortion parameters, both sides of equations (4) and (5) are multiplied by a small weight ε_s, where 0 ≤ ε_s ≤ 1 and s = 1, 2, …, c. The following linear equation system, equation (6), can then be constructed:
[Equation (6) is given as an image in the original publication and is not reproduced here.]
Let f = [k_1 … k_q p_1 p_2]^T and write the above linear equation system as A·f = b. The least-squares solution for the distortion parameters is then obtained as:
f = (A^T·A)^(-1)·A^T·b    (7)
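Because equations (2)-(6) are only available as images, the following sketch assembles the linear system under the generic model assumed earlier (two equations per point, virtual-feature equations scaled by ε_s) and solves equation (7) by least squares. The exact row layout is therefore an assumption; only the overall structure is taken from the text.
import numpy as np

def solve_distortion(pred, dist, pred_virt, dist_virt, eps, q, u0, v0, R):
    # pred/dist: predicted and distorted pixel coordinates of the w feature
    # points; pred_virt/dist_virt/eps: the c virtual feature points and their
    # weights. Returns f = [k_1..k_q, p_1, p_2] as a least-squares solution.
    rows, rhs = [], []

    def add(pu, pv, du, dv, w=1.0):
        x, y = (pu - u0) / R, (pv - v0) / R        # normalized ideal coordinates
        xd, yd = (du - u0) / R, (dv - v0) / R      # normalized distorted coordinates
        r2 = x * x + y * y
        rad = [r2 ** (s + 1) for s in range(q)]
        rows.append(w * np.array([x * t for t in rad] + [2 * x * y, r2 + 2 * x * x]))
        rhs.append(w * (xd - x))
        rows.append(w * np.array([y * t for t in rad] + [r2 + 2 * y * y, 2 * x * y]))
        rhs.append(w * (yd - y))

    for (pu, pv), (du, dv) in zip(pred, dist):
        add(pu, pv, du, dv)
    for (pu, pv), (du, dv), w in zip(pred_virt, dist_virt, eps):
        add(pu, pv, du, dv, w)
    A, b = np.vstack(rows), np.array(rhs)
    return np.linalg.lstsq(A, b, rcond=None)[0]    # least-squares solution of A f = b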
s506: and judging whether the distortion parameters are converged, if so, ending the flow, and if not, executing S507.
Convergence indicates that the distortion parameters have approached the optimal solution.
Specifically, the condition for convergence of the distortion parameters includes at least one of:
1. and (5) calculating the iteration times of the distortion parameters to reach a preset value.
2. The deviation between the value of the distortion parameter obtained in the current iteration process and the value of the distortion parameter obtained in the last iteration process is smaller than a first preset threshold value.
3. The deviation of the current distorted pixel coordinates and the historical distorted pixel coordinates is smaller than a second preset threshold value.
The current distorted pixel coordinates are determined by using distortion parameters obtained in the current iteration process and preset real pixel coordinates, and the historical distorted pixel coordinates are determined by using distortion parameters obtained in the last iteration process and the real pixel coordinates.
The real pixel coordinates are pixels with known real coordinates, for example, the pixel coordinates of the 4 vertices of the planar array image: (U_1, V_1) = (1, 1), (U_2, V_2) = (M, 1), (U_3, V_3) = (M, N), (U_4, V_4) = (1, N).
4. And the deviation between the predicted pixel coordinates and the orthodontic pixel coordinates, which are determined by using the distortion parameters obtained in the current iteration process, is smaller than a third preset threshold.
I.e. the deviation of the orthodontic pixel coordinates of the orthodontic plane array image used by the current iteration process and the corresponding predicted pixel coordinates calculated therefrom.
S507: correcting the plane array image by using the distortion parameters obtained by calculation to obtain a new orthodontic plane array image, and replacing the orthodontic plane array image used in the previous iteration process by using the new orthodontic plane array image to update the orthodontic plane array image. Execution returns to S502.
The weights ε_s of the c virtual feature points may differ from one another, and their values may also change with each iteration.
For example, with 4 virtual feature points, the weights of the 4 virtual feature points may be set to a fixed value of 0.003, i.e. ε_s ≡ 0.003, s = 1, 2, …, 4. Alternatively, 2 virtual feature points may be employed, with the initial values of their weights set to ε_1 = 0.004 and ε_2 = 0.005, and the values of ε_1 and ε_2 increased by 0.001 after each iteration of the distortion parameter calculation.
Optionally, in order to ensure that the means for preventing over-fitting works well, in this embodiment, the pixels at the 4 vertices of the distorted planar array image may be used as virtual feature points. In this case, the actual pixel coordinates of these 4 vertex pixels may be used as the predicted pixel coordinates of the virtual feature points, i.e. the predicted pixel coordinates of the virtual feature points are always (U_1, V_1) = (1, 1), (U_2, V_2) = (M, 1), (U_3, V_3) = (M, N), (U_4, V_4) = (1, N).
In each iteration process, after distortion parameters are calculated, the predicted pixel coordinates of the 4 virtual feature points are used as ideal pixel coordinates to be respectively substituted into a formula (1), 4 distorted pixel coordinates are obtained, and the distorted pixel coordinates of the 4 virtual feature points are updated to be used as distorted pixel coordinates of the virtual feature points in the next iteration process.
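Putting the pieces together, the S501-S507 loop can be sketched as below, reusing the distort(), solve_distortion() and undistort_image() helpers sketched earlier. The detect_points and predict_points callables stand for the feature detection and line-fitting prediction steps described above and are placeholders, and the convergence test shown is only the parameter-deviation condition, so this is a structural sketch rather than the claimed procedure.
import numpy as np

def calibrate(plane_img, detect_points, predict_points, q=4, max_iter=20, tol=1e-6):
    # plane_img: the planar array image; detect_points(img) -> (w, 2) feature
    # coordinates; predict_points(pts, u0, v0) -> (w, 2) predicted coordinates.
    N, M = plane_img.shape[:2]
    u0, v0 = 0.5 * M + 0.5, 0.5 * N + 0.5              # approximate principal point
    R = 0.5 * (M ** 2 + N ** 2) ** 0.5                 # half the image diagonal
    k, p1, p2 = [0.0] * q, 0.0, 0.0                    # S501: initial parameters are 0
    dist_pts = detect_points(plane_img)                # distorted pixel coordinates
    corrected = plane_img                              # initial orthodontic image
    pred_virt = np.array([[1, 1], [M, 1], [M, N], [1, N]], float)  # 4 corner virtual points
    f_prev = np.zeros(q + 2)
    for _ in range(max_iter):
        rect_pts = detect_points(corrected)            # orthodontic pixel coordinates
        pred_pts = predict_points(rect_pts, u0, v0)    # predicted pixel coordinates
        dist_virt = np.array([distort(u, v, k, p1, p2, u0, v0, R)
                              for u, v in pred_virt])  # S502: virtual point distorted coords
        f = solve_distortion(pred_pts, dist_pts, pred_virt, dist_virt,
                             eps=[0.003] * 4, q=q, u0=u0, v0=v0, R=R)   # S503-S505
        k, p1, p2 = list(f[:q]), f[q], f[q + 1]
        if np.linalg.norm(f - f_prev) < tol:           # S506: one convergence condition
            return k, p1, p2
        f_prev = f
        corrected = undistort_image(plane_img, k, p1, p2, u0, v0, R)    # S507
    return k, p1, p2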
The flow shown in fig. 5 has the following beneficial effects:
1. In order to avoid serious distortion far from the principal point in the de-distorted image, which is caused by over-fitting during the distortion parameter calculation, this embodiment adopts additional constraint conditions (i.e., the virtual feature points) beyond the preset feature points to prevent the optimized distortion parameter solving process from over-fitting, so that distortion far from the principal point in the de-distorted image is avoided.
2. The orthodontic plane array image is updated after each least-squares solution is obtained, and the distortion parameters are calculated iteratively until convergence, so that optimal distortion parameters are obtained.
Fig. 6 is a distortion correction apparatus of a camera according to an embodiment of the present application, including:
an image obtaining unit 601, configured to obtain a planar array image and an orthodontic planar array image, where the planar array image is collected by a camera, and the orthodontic planar array image is obtained by correcting distortion of the planar array image;
a first coordinate acquiring unit 602, configured to acquire pixel coordinates of a preset feature point in the planar array image as distorted pixel coordinates;
a second coordinate acquiring unit 603, configured to acquire, as orthodontic pixel coordinates, pixel coordinates of the preset feature point in the orthodontic plane array image;
a third coordinate acquiring unit 604, configured to acquire predicted pixel coordinates of the preset feature point according to the orthodontic pixel coordinates;
a distortion parameter calculation unit 605 for calculating a distortion parameter of the camera based at least on the distorted pixel coordinates and the predicted pixel coordinates;
an image correction unit 606 for correcting the image acquired by the camera using the distortion parameters.
Optionally, the third coordinate obtaining unit is configured to obtain, according to the orthodontic pixel coordinate, a predicted pixel coordinate of the preset feature point, including:
the third coordinate acquisition unit is specifically configured to:
acquiring a predicted line, wherein the predicted line corresponding to an ith line of orthodontic pixels in the orthodontic plane array image is determined by a slope i and a pixel coordinate of a reference point i, the slope i is a slope of a line fitting line of the ith line of orthodontic pixels, the reference point i is an orthodontic pixel point closest to a preset principal point or an orthodontic pixel point closest to a perpendicular line of the line fitting line, and the perpendicular line is a line passing through the principal point and perpendicular to the line fitting line;
acquiring a predicted column straight line, wherein the predicted column straight line corresponding to a j-th column orthodontic pixel in the orthodontic plane array image is determined by a slope j and a pixel coordinate of a reference point j, the slope j is a slope of a column fitting straight line of the j-th column orthodontic pixel, the reference point j is an orthodontic pixel point closest to a preset principal point or an orthodontic pixel point closest to a perpendicular line of the column fitting straight line in the j-th column orthodontic pixel, and the perpendicular line of the column fitting straight line is a straight line passing through the principal point and perpendicular to the column fitting straight line;
and taking the intersection point of the predicted line and the predicted column line as the predicted pixel coordinate.
Optionally, the distortion parameter calculating unit is configured to calculate a distortion parameter of the camera at least according to the distorted pixel coordinates and the predicted pixel coordinates, including:
the distortion parameter calculation unit is specifically configured to:
substituting the distorted pixel coordinates and the predicted pixel coordinates into a preset distortion model to obtain a first distortion equation set; the distortion model is used for representing the distorted pixel coordinates and the predicted pixel coordinates and generating a transformation relation according to the distortion parameters;
and obtaining the distortion parameters by solving a least square solution of an equation set, wherein the equation set at least comprises the first distortion equation set.
Optionally, the system of equations further includes:
and the second distortion equation set is obtained by substituting the predicted pixel coordinates and the distortion pixel coordinates of the preset virtual feature points into the distortion model.
Optionally, the distortion parameter calculating unit is configured to calculate a distortion parameter of the camera at least according to the distorted pixel coordinate and the predicted pixel coordinate, and further includes:
the distortion parameter calculation unit is specifically configured to:
after determining the distortion parameters, if the distortion parameters do not converge, performing the following procedures, and iteratively calculating the distortion parameters until the distortion parameters converge: correcting the plane array image by using the distortion parameters obtained in the last iteration process, updating the orthodontic plane array image, and obtaining pixel coordinates of the preset characteristic points in the orthodontic plane array image as orthodontic pixel coordinates; acquiring predicted pixel coordinates of the preset feature points according to the orthodontic pixel coordinates; and calculating the distortion parameters of the camera at least according to the distorted pixel coordinates and the predicted pixel coordinates.
Optionally, the condition for convergence of the distortion parameter includes at least one of:
calculating the iteration times of the distortion parameters to reach a preset value;
the deviation between the value of the distortion parameter obtained in the current iteration process and the value of the distortion parameter obtained in the last iteration process is smaller than a first preset threshold value;
the deviation between the current distorted pixel coordinate and the historical distorted pixel coordinate is smaller than a second preset threshold value, the current distorted pixel coordinate is determined by using the distorted parameter obtained in the current iteration process and a preset real pixel coordinate, and the historical distorted pixel coordinate is determined by using the distorted parameter obtained in the last iteration process and the real pixel coordinate;
and the deviation between the predicted pixel coordinates and the orthodontic pixel coordinates, which are determined by using the distortion parameters obtained in the current iterative process, is smaller than a third preset threshold.
Optionally, the distortion parameters in the distortion model include:
the multi-order radial distortion model order, the radial distortion parameters, and the tangential distortion parameters.
The embodiment of the application also discloses a distortion correction device of the camera, please refer to fig. 7, which shows a schematic structural diagram of the distortion correction device of the camera, and the device may include: at least one processor 701, at least one communication interface 702, at least one memory 703 and at least one communication bus 704;
in the embodiment of the present application, the number of the processor 701, the communication interface 702, the memory 703 and the communication bus 704 is at least one, and the processor 701, the communication interface 702 and the memory 703 complete communication with each other through the communication bus 704;
the processor 701 may be a central processing unit CPU, or a specific integrated circuit ASIC (Application Specific Integrated Circuit), or one or more integrated circuits configured to implement embodiments of the present invention, etc.;
the memory 703 may comprise a high speed RAM memory, and may also include a non-volatile memory (non-volatile memory) or the like, such as at least one disk memory;
wherein the memory stores a program, the processor is operable to invoke the program stored in the memory, the program operable to:
acquiring a plane array image and an orthodontic plane array image, wherein the plane array image is acquired by a camera, and the orthodontic plane array image is obtained by correcting distortion of the plane array image;
acquiring pixel coordinates of preset feature points in the planar array image as distorted pixel coordinates;
acquiring pixel coordinates of the preset feature points in the orthodontic plane array image as orthodontic pixel coordinates;
acquiring predicted pixel coordinates of the preset feature points according to the orthodontic pixel coordinates;
calculating distortion parameters of the camera at least according to the distorted pixel coordinates and the predicted pixel coordinates;
correcting an image acquired by the camera using the distortion parameters.
Alternatively, the refinement function and the extension function of the program may be described with reference to the above.
The embodiment of the application also discloses a readable storage medium, which can store a program suitable for being executed by a processor, and the program is used for:
acquiring a plane array image and an orthodontic plane array image, wherein the plane array image is acquired by a camera, and the orthodontic plane array image is obtained by correcting distortion of the plane array image;
acquiring pixel coordinates of preset feature points in the planar array image as distorted pixel coordinates;
acquiring pixel coordinates of the preset feature points in the orthodontic plane array image as orthodontic pixel coordinates;
acquiring predicted pixel coordinates of the preset feature points according to the orthodontic pixel coordinates;
calculating distortion parameters of the camera at least according to the distorted pixel coordinates and the predicted pixel coordinates;
correcting an image acquired by the camera using the distortion parameters.
Alternatively, the refinement function and the extension function of the program may be described with reference to the above.
The functions described in the methods of the present application, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computing device readable storage medium. Based on such understanding, a portion of the embodiments of the present application that contributes to the prior art or a portion of the technical solution may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computing device (which may be a personal computer, a server, a mobile computing device or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
In this specification, each embodiment is described in a progressive manner, and each embodiment is mainly described in a different point from other embodiments, so that the same or similar parts between the embodiments are referred to each other.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (7)

1. A distortion correction method for a camera, comprising:
acquiring a plane array image and an orthodontic plane array image, wherein the plane array image is acquired by a camera, and the orthodontic plane array image is obtained by correcting distortion of the plane array image;
acquiring pixel coordinates of preset feature points in the planar array image as distorted pixel coordinates;
acquiring pixel coordinates of the preset feature points in the orthodontic plane array image as orthodontic pixel coordinates;
acquiring predicted pixel coordinates of the preset feature points according to the orthodontic pixel coordinates;
substituting the distorted pixel coordinates and the predicted pixel coordinates into a preset distortion model to obtain a first distortion equation set; the distortion model is used for representing the distorted pixel coordinates and the predicted pixel coordinates and generating a transformation relation according to distortion parameters;
obtaining the distortion parameters by solving a least square solution of an equation set, wherein the equation set at least comprises the first distortion equation set and the second distortion equation set; the second distortion equation set is obtained by substituting predicted pixel coordinates and distortion pixel coordinates of preset virtual feature points into the distortion model; the distortion parameters include: the multi-order radial distortion model order, radial distortion parameters and tangential distortion parameters; the multi-order radial distortion model order includes orders of 4 and above;
correcting an image acquired by the camera using the distortion parameters; the process of de-distorting the image acquired by the camera is the same, and only the distortion parameters participate in it.
2. The method according to claim 1, wherein the obtaining the predicted pixel coordinates of the preset feature point according to the orthodontic pixel coordinates includes:
acquiring a predicted line, wherein the predicted line corresponding to an ith line of orthodontic pixels in the orthodontic plane array image is determined by a slope i and a pixel coordinate of a reference point i, the slope i is a slope of a line fitting line of the ith line of orthodontic pixels, the reference point i is an orthodontic pixel point closest to a preset principal point or an orthodontic pixel point closest to a perpendicular line of the line fitting line, and the perpendicular line is a line passing through the principal point and perpendicular to the line fitting line;
acquiring a predicted column straight line, wherein the predicted column straight line corresponding to a j-th column orthodontic pixel in the orthodontic plane array image is determined by a slope j and a pixel coordinate of a reference point j, the slope j is a slope of a column fitting straight line of the j-th column orthodontic pixel, the reference point j is an orthodontic pixel point closest to a preset principal point or an orthodontic pixel point closest to a perpendicular line of the column fitting straight line in the j-th column orthodontic pixel, and the perpendicular line of the column fitting straight line is a straight line passing through the principal point and perpendicular to the column fitting straight line;
and taking the intersection point of the predicted line and the predicted column line as the predicted pixel coordinate.
3. The method of claim 1, wherein calculating distortion parameters of the camera further comprises:
after determining the distortion parameters, if the distortion parameters do not converge, performing the following procedures, and iteratively calculating the distortion parameters until the distortion parameters converge: correcting the plane array image by using the distortion parameters obtained in the last iteration process, updating the orthodontic plane array image, and obtaining pixel coordinates of the preset characteristic points in the orthodontic plane array image as orthodontic pixel coordinates; acquiring predicted pixel coordinates of the preset feature points according to the orthodontic pixel coordinates; and calculating the distortion parameters of the camera at least according to the distorted pixel coordinates and the predicted pixel coordinates.
4. A method according to claim 3, wherein the condition for convergence of the distortion parameters comprises at least one of:
calculating the iteration times of the distortion parameters to reach a preset value;
the deviation between the value of the distortion parameter obtained in the current iteration process and the value of the distortion parameter obtained in the last iteration process is smaller than a first preset threshold value;
the deviation between the current distorted pixel coordinate and the historical distorted pixel coordinate is smaller than a second preset threshold value, the current distorted pixel coordinate is determined by using the distorted parameter obtained in the current iteration process and a preset real pixel coordinate, and the historical distorted pixel coordinate is determined by using the distorted parameter obtained in the last iteration process and the real pixel coordinate;
and the deviation between the predicted pixel coordinates and the orthodontic pixel coordinates, which are determined by using the distortion parameters obtained in the current iterative process, is smaller than a third preset threshold.
5. A distortion correction apparatus for a camera, comprising:
the image acquisition unit is used for acquiring a plane array image and an orthodontic plane array image, wherein the plane array image is acquired by a camera, and the orthodontic plane array image is obtained by correcting distortion of the plane array image;
a first coordinate acquiring unit, configured to acquire pixel coordinates of a preset feature point in the planar array image as distorted pixel coordinates;
the second coordinate acquisition unit is used for acquiring pixel coordinates of the preset characteristic points in the orthodontic plane array image and taking the pixel coordinates as orthodontic pixel coordinates;
the third coordinate acquisition unit is used for acquiring the predicted pixel coordinates of the preset characteristic points according to the orthodontic pixel coordinates;
a distortion parameter calculating unit, configured to calculate a distortion parameter of a camera based on a preset distortion model at least according to the distortion pixel coordinate and the predicted pixel coordinate, where the distortion parameter in the distortion model includes: the multi-order radial distortion model order, radial distortion parameters and tangential distortion parameters; the multi-order radial distortion model order includes orders of 4 and above;
an image correction unit for correcting an image acquired by the camera using the distortion parameter; the image collected by the camera is subjected to de-distortion, and only the distortion parameters participate in the process;
the distortion parameter calculation unit is specifically configured to:
substituting the distorted pixel coordinates and the predicted pixel coordinates into a preset distortion model to obtain a first distortion equation set; the distortion model is used for representing the distorted pixel coordinates and the predicted pixel coordinates and generating a transformation relation according to the distortion parameters;
and obtaining the distortion parameters by solving a least square solution of an equation set, wherein the equation set at least comprises the first distortion equation set and a second distortion equation set, and the second distortion equation set is obtained by substituting predicted pixel coordinates and distortion pixel coordinates of preset virtual feature points into the distortion model.
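As a rough illustration of the least-squares step in claim 5, the Python sketch below assumes a Brown-Conrady style model with four radial terms and two tangential terms, written directly in pixel coordinates about an assumed distortion center (cx, cy); the patent's exact distortion model and the construction of the second equation set from virtual feature points are not reproduced here.

import numpy as np

def solve_distortion_lstsq(predicted_px, distorted_px, center, n_radial=4):
    # Build one pair of linear equations per feature point and solve them in a
    # least-squares sense; returns [k1, ..., k_n_radial, p1, p2].
    predicted_px = np.asarray(predicted_px, dtype=float)
    distorted_px = np.asarray(distorted_px, dtype=float)
    cx, cy = center
    x = predicted_px[:, 0] - cx
    y = predicted_px[:, 1] - cy
    r2 = x * x + y * y

    rows, rhs = [], []
    for i in range(len(x)):
        radial_x = [x[i] * r2[i] ** (j + 1) for j in range(n_radial)]
        radial_y = [y[i] * r2[i] ** (j + 1) for j in range(n_radial)]
        # x-equation: x_d - x = x*(k1*r^2 + k2*r^4 + ...) + 2*p1*x*y + p2*(r^2 + 2*x^2)
        rows.append(radial_x + [2 * x[i] * y[i], r2[i] + 2 * x[i] ** 2])
        rhs.append(distorted_px[i, 0] - cx - x[i])
        # y-equation: y_d - y = y*(k1*r^2 + k2*r^4 + ...) + p1*(r^2 + 2*y^2) + 2*p2*x*y
        rows.append(radial_y + [r2[i] + 2 * y[i] ** 2, 2 * x[i] * y[i]])
        rhs.append(distorted_px[i, 1] - cy - y[i])

    A = np.asarray(rows)
    b = np.asarray(rhs)
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params

Equations contributed by preset virtual feature points (the second distortion equation set of the claim) would simply be appended as extra rows of A and b before the least-squares solve.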
6. Distortion correction equipment for a camera, comprising: a memory and a processor;
the memory is used for storing programs;
the processor is configured to execute the program to implement the respective steps of the distortion correction method of the camera according to any one of claims 1 to 4.
7. A readable storage medium having stored thereon a computer program, which, when executed by a processor, implements the respective steps of the distortion correction method of a camera according to any one of claims 1 to 4.
CN201910983234.5A 2019-10-16 2019-10-16 Distortion correction method and device for camera, equipment and storage medium Active CN110738707B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910983234.5A CN110738707B (en) 2019-10-16 2019-10-16 Distortion correction method and device for camera, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110738707A CN110738707A (en) 2020-01-31
CN110738707B (en) 2023-05-26

Family

ID=69269060

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910983234.5A Active CN110738707B (en) 2019-10-16 2019-10-16 Distortion correction method and device for camera, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110738707B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111387987A (en) * 2020-03-26 2020-07-10 苏州沃柯雷克智能系统有限公司 Height measuring method, device, equipment and storage medium based on image recognition
CN111415314B (en) * 2020-04-14 2023-06-20 北京神工科技有限公司 Resolution correction method and device based on sub-pixel level visual positioning technology
CN113822937B (en) * 2020-06-18 2024-01-26 中移(苏州)软件技术有限公司 Image correction method, device, equipment and storage medium
CN113115017B (en) * 2021-03-05 2022-03-18 上海炬佑智能科技有限公司 3D imaging module parameter inspection method and 3D imaging device
CN113284189A (en) * 2021-05-12 2021-08-20 深圳市格灵精睿视觉有限公司 Distortion parameter calibration method, device, equipment and storage medium
CN112947885B (en) * 2021-05-14 2021-08-06 深圳精智达技术股份有限公司 Method and device for generating curved surface screen flattening image
CN113298699B (en) * 2021-05-27 2023-02-21 上海电机学院 Fisheye image correction method
CN113838138A (en) * 2021-08-06 2021-12-24 杭州灵西机器人智能科技有限公司 System calibration method, system, device and medium for optimizing feature extraction
CN114331814A (en) * 2021-12-24 2022-04-12 合肥视涯技术有限公司 Distorted picture correction method and display equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105701776A (en) * 2016-01-07 2016-06-22 武汉精测电子技术股份有限公司 Lens distortion correcting method and system used for automatic optical detection
CN105957041A (en) * 2016-05-27 2016-09-21 上海航天控制技术研究所 Wide-angle lens infrared image distortion correction method
CN108245788A (en) * 2017-12-27 2018-07-06 苏州雷泰医疗科技有限公司 A kind of binocular range unit and method, the accelerator radiotherapy system including the device
CN108986172A (en) * 2018-07-25 2018-12-11 西北工业大学 A kind of single-view linear camera scaling method towards small depth of field system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant