CN114066763A - Image processing method and related device - Google Patents


Info

Publication number
CN114066763A
CN114066763A
Authority
CN
China
Prior art keywords
matrix
projection
image
determining
reprojection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111390221.0A
Other languages
Chinese (zh)
Inventor
符顺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Lianji Technology Co ltd
Original Assignee
Hangzhou Lianji Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Lianji Technology Co ltd filed Critical Hangzhou Lianji Technology Co ltd
Priority to CN202111390221.0A
Publication of CN114066763A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/80 Geometric correction
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30244 Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The application belongs to the technical field of image processing. Embodiments of the application provide an image processing method and a related device: distortion correction is performed on an original image, an intrinsic parameter matrix M is determined during the distortion correction, a reprojection matrix H2′ is determined from the intrinsic parameter matrix M, and the distortion-corrected image is then reprojected into a target image according to H2′. Images from high-rise building monitoring can thus be preprocessed into reprojected images, alleviating the poor detection performance of existing high-rise monitoring.

Description

Image processing method and related device
Technical Field
The present application relates to image processing technologies, and in particular, to an image processing method and a related apparatus.
Background
At present, with social and economic development, cities contain more and more high-rise buildings, and high-altitude falling-object (parabolic) events have become correspondingly frequent. To monitor such events and help determine liability for related accidents, dedicated monitoring equipment aimed at high-rise buildings has been installed in many places.
Existing high-rise monitoring mostly targets early warning and detection of high-altitude parabolic events. However, because monitoring cameras image with a "near-large, far-small" perspective relationship, if event detection is performed directly on the captured image, objects high up on a tall monitored building occupy too few pixels and are detected poorly.
Therefore, an image processing method is needed to preprocess the images from high-rise monitoring so as to improve the detection performance.
Disclosure of Invention
The embodiments of the application provide an image processing method and a related device that preprocess images from high-rise building monitoring into reprojected images, addressing the currently poor detection performance of high-rise monitoring.
In a first aspect, an embodiment of the present application provides an image processing method, including: acquiring an original image; performing distortion correction on the original image and determining an intrinsic parameter matrix M during the distortion correction; determining a reprojection matrix H2′ according to the intrinsic parameter matrix M; and reprojecting the distortion-corrected image into a target image according to the reprojection matrix H2′.
The image processing method provided by the embodiment of the application performs distortion correction on an original image, determines the reprojection matrix H2′ through the intrinsic parameter matrix M obtained during the distortion correction, and then reprojects the distortion-corrected image into a target image based on H2′. Images from high-rise building monitoring can thus be preprocessed into reprojected images, alleviating the currently poor detection performance of high-rise monitoring.
With reference to the first aspect, in one implementation of the embodiment of the present application, determining the reprojection matrix H2′ according to the intrinsic parameter matrix M includes: determining the reprojection matrix H2′ according to a determined projection intrinsic matrix M2′, a determined projection rotation matrix R′ and the intrinsic parameter matrix M.
With reference to the first aspect, in one implementation of the embodiment of the present application, the projection rotation matrix R′ is determined by: selecting four reference points in the building region of the distortion-corrected image; determining the circumscribed rectangle of the four reference points; determining a homography matrix H according to the four reference points and the four vertices of the circumscribed rectangle; and determining a projection intrinsic matrix M′ and a projection rotation matrix R′ according to the homography matrix H and the intrinsic parameter matrix M.
With reference to the first aspect, in one implementation of the embodiment of the present application, determining a projection intrinsic matrix M′ and a projection rotation matrix R′ according to the homography matrix H and the intrinsic parameter matrix M includes: calculating the product of the homography matrix H and the intrinsic parameter matrix M to obtain a first matrix Q; performing matrix decomposition on the first matrix Q to obtain an upper triangular matrix and an orthonormal matrix; and setting the upper triangular matrix as the projection intrinsic matrix M′ and the orthonormal matrix as the projection rotation matrix R′.
With reference to the first aspect, in one implementation of the embodiment of the present application, the projection rotation matrix R′ may instead be determined by: acquiring a preset set of xyz rotation angles; and determining the projection rotation matrix R′ from the set of xyz rotation angles.
With reference to the first aspect, in one implementation of the embodiment of the present application, the projection intrinsic matrix M2′ is determined by: calculating a new projection intrinsic matrix M2′ according to the projection intrinsic matrix M′ and the homography matrix H.
With reference to the first aspect, in one implementation of the embodiment of the present application, calculating the new projection intrinsic matrix M2′ according to the projection intrinsic matrix M′ and the homography matrix H includes: selecting a rectangle RV in the distortion-corrected image; projecting the rectangle RV through the homography matrix H to obtain a polygon RV′; calculating a first scale factor scale_x and a second scale factor scale_y according to the scaling relation between the rectangle RV and the polygon RV′; and converting the projection intrinsic matrix M′ into the new projection intrinsic matrix M2′ according to scale_x and scale_y.
With reference to the first aspect, in one implementation of the embodiment of the present application, determining the reprojection matrix H2′ according to the projection intrinsic matrix M2′, the projection rotation matrix R′ and the intrinsic parameter matrix M includes: calculating the reprojection matrix H2′ by the reprojection matrix calculation formula H2′ = M2′ · R′ · M⁻¹.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including: an acquisition module, configured to acquire an original image; a processing module, configured to perform distortion correction on the original image, determine an intrinsic parameter matrix M during the distortion correction, and determine a reprojection matrix H2′ according to the intrinsic parameter matrix M; and a reprojection module, configured to reproject the distortion-corrected image into a target image according to the reprojection matrix H2′.
In a third aspect, an embodiment of the present application provides a terminal device, including: a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the method as in the first aspect when executing the computer program.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, implements the method according to the first aspect.
In a fifth aspect, the present application provides a computer program product, which when run on a terminal device, causes the terminal device to execute the method of any one of the above first aspects.
It is understood that the beneficial effects of the second aspect to the fifth aspect can be referred to the related description of the first aspect, and are not described herein again.
Compared with the prior art, the embodiment of the application has the advantages that:
The embodiments of the application provide an image processing method and a related device: distortion correction is performed on an original image, a reprojection matrix H2′ is determined through the intrinsic parameter matrix M obtained during the distortion correction, and the distortion-corrected image is then reprojected into a target image based on H2′. Images from high-rise building monitoring can thus be preprocessed into reprojected images, alleviating the currently poor detection performance of high-rise monitoring.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed for describing the embodiments or the prior art are briefly introduced below. The drawings described below are merely some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a method for image processing according to an embodiment of the present disclosure;
FIG. 2 is a flowchart illustrating detailed steps in a method for image processing according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of an image after distortion correction;
FIG. 4 is a simplified schematic diagram of a distortion corrected image;
FIG. 5 is a schematic diagram of a circumscribed rectangle in an embodiment of the present application;
FIG. 6 is a schematic diagram of a rectangular transformation in an embodiment of the present application;
FIG. 7 is a schematic diagram of a target image according to an embodiment of the present application;
FIG. 8 is a simplified diagram of a target image in accordance with an embodiment of the present application;
fig. 9 is an exemplary diagram of a terminal device provided in an embodiment of the present application;
fig. 10 is a schematic diagram of an apparatus for image processing according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "once", "in response to determining" or "in response to detecting". Similarly, the phrases "if it is determined" or "if [a described condition or event] is detected" may be interpreted contextually as "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The image processing method provided by the embodiment of the application is mainly applied to monitoring equipment for monitoring a high-rise building. Of course, in some possible embodiments, the method may also be applied to processing images taken by a cell phone, or images taken by other devices. The following describes the application of the monitoring device in detail, and other application scenarios may be implemented with reference to the embodiment of the present application, which is not described in detail again.
Fig. 1 is a flowchart of an image processing method according to an embodiment of the present disclosure. The process comprises the following steps:
101. Acquiring an original image;
In the embodiment of the present application, the original image may be an image captured by the monitoring device, and it generally includes a building region: the monitoring device is typically installed at a low position on a tall building and shoots upward toward the top of the building.
102. Performing distortion correction on the original image and determining an intrinsic parameter matrix M during the distortion correction;
In the embodiment of the application, distortion correction of the original image can be implemented with a camera calibration algorithm; illustratively, distortion correction may be performed using Zhang's calibration method. The concrete model is as follows:

α · [u, v, 1]^T = [[f/dx, 0, u0], [0, f/dy, v0], [0, 0, 1]] · [R | t] · [x_w, y_w, z_w, 1]^T

where (x_w, y_w, z_w) are three-dimensional world coordinates, (u, v) are pixel coordinates on the image, and α is a projective scale factor. R is a 3x3 rotation matrix, t is a 3x1 translation vector, f is the focal length of the camera, dx and dy are the physical sizes of a pixel along the x-axis and y-axis respectively, and (u0, v0) are the coordinates of the origin of the image coordinate system. The values of R, t, f/dx, f/dy and (u0, v0) can be obtained through a camera calibration algorithm.
When the origin of the three-dimensional world coordinate system coincides with that of the camera coordinate system, the translation t = 0 and the formula becomes:

α · [u, v, 1]^T = M · R · [x_w, y_w, z_w]^T, with M = [[f/dx, 0, u0], [0, f/dy, v0], [0, 0, 1]]

where M is the intrinsic parameter matrix of the camera, which can be obtained from this formula.
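As a concrete, non-authoritative illustration of this step (not part of the original patent text), M and the distortion coefficients could be estimated with OpenCV's standard calibration API; the checkerboard size and the file paths below are assumptions:

import cv2
import glob
import numpy as np

# Hedged sketch: estimate the intrinsic parameter matrix M and the
# distortion coefficients, then undistort a monitoring frame.
pattern = (9, 6)  # assumed inner-corner count of a calibration checkerboard
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_pts, img_pts = [], []
for path in glob.glob("calib/*.jpg"):  # assumed calibration images
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# M is the 3x3 matrix [[f/dx, 0, u0], [0, f/dy, v0], [0, 0, 1]]
_, M, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts,
                                       gray.shape[::-1], None, None)
undistorted = cv2.undistort(cv2.imread("original.jpg"), M, dist)  # assumed frame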
103. Determining a reprojection matrix H2′ according to the intrinsic parameter matrix M;
In the embodiment of the present application, the reprojection matrix H2′ may be calculated by the reprojection matrix calculation formula:
H2′ = M2′ · R′ · M⁻¹
where M2′ is a determined projection intrinsic matrix, R′ is a determined projection rotation matrix, and M is the intrinsic parameter matrix.
When the monitoring camera is fixed, the projection intrinsic matrix M2′ and the projection rotation matrix R′ can be computed in advance; when subsequent frames are processed, only the intrinsic parameter matrix M needs to be determined, and H2′ then follows from the formula above.
When the monitoring camera is not fixed, or when a fixed monitoring camera performs image processing for the first time, the projection intrinsic matrix M2′ and the projection rotation matrix R′ may be calculated by the steps shown in fig. 2:
1031. Selecting four reference points in the building region;
Fig. 3 is a schematic diagram of an image after distortion correction, and fig. 4 is a simplified schematic diagram of the same. The distortion-corrected image still exhibits the "near-large, far-small" imaging relationship: if event detection is performed directly on it, objects high up on a tall monitored building occupy too few pixels and are detected poorly.
In the embodiment of the present application, four reference points may be selected in the building region of the distortion-corrected image. Generally, four points are chosen that form a rectangle on the tall building in the orthographic projection direction of the camera. To help ensure this, corner points of the building itself that have a vertical relationship may be used, such as the four reference points marked in fig. 3 and fig. 4: P1(u1, v1), P2(u2, v2), P3(u3, v3), P4(u4, v4). The four points only need to be rectangular in the orthographic projection direction; they may be selected manually or computed by other computer-vision algorithms, which is not limited in the embodiment of the present application.
In the embodiment of the present application, the farther apart the four points, the better: a larger spacing both reduces the relative error of manual selection and lowers the risk that the chosen points cannot form a rectangle.
1032. Determining the circumscribed rectangle of the four reference points;
Fig. 5 is a schematic diagram of the circumscribed rectangle in the embodiment of the present application. It can be seen that the circumscribed rectangle of the trapezoid P1P2P3P4 can be P1′P2′P3′P4′. It may be determined manually or computed by other computer-related algorithms. Illustratively, if the four reference points are P1(u1, v1), P2(u2, v2), P3(u3, v3), P4(u4, v4), the upper-left vertex may be P1′ = (min(u1, u2), min(v1, v4)), the lower-left vertex P2′ = (min(u1, u2), max(v2, v3)), the lower-right vertex P3′ = (max(u3, u4), max(v2, v3)), and the upper-right vertex P4′ = (max(u3, u4), min(v1, v4)).
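A minimal sketch of this min/max rule (the point coordinates are illustrative assumptions; for points ordered upper-left, lower-left, lower-right, upper-right it coincides with the axis-aligned bounding box):

import numpy as np

# Hedged sketch: circumscribed rectangle of four reference points
# ordered P1 (UL), P2 (LL), P3 (LR), P4 (UR). Values are assumptions.
P = np.array([[420, 80], [380, 900], [820, 880], [860, 100]], np.float64)
(u1, v1), (u2, v2), (u3, v3), (u4, v4) = P
rect = np.array([[min(u1, u2), min(v1, v4)],   # P1'
                 [min(u1, u2), max(v2, v3)],   # P2'
                 [max(u3, u4), max(v2, v3)],   # P3'
                 [max(u3, u4), min(v1, v4)]])  # P4'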
1033. Determining a homography matrix H according to the four reference points and four vertexes of the circumscribed rectangle;
In the embodiment of the present application, the four reference points may first be set as P1(u1, v1), P2(u2, v2), P3(u3, v3), P4(u4, v4), and the four vertices of the circumscribed rectangle as P1′(u1′, v1′), P2′(u2′, v2′), P3′(u3′, v3′), P4′(u4′, v4′).
The homography matrix H is then determined by solving the first relational expression:

α · [ui′, vi′, 1]^T = H · [ui, vi, 1]^T, i = 1, ..., 4

where [ui′, vi′, 1]^T is the homogeneous vector of a vertex of the circumscribed rectangle, [ui, vi, 1]^T is the homogeneous vector of the corresponding reference point, and α is a normalization coefficient that makes the last component equal to 1.
During solving, the homography matrix H can be expanded as

H = [[h1, h2, h3], [h4, h5, h6], [h7, h8, h9]]

which gives:

ui′ = α(h1·ui + h2·vi + h3);
vi′ = α(h4·ui + h5·vi + h6);
1 = α(h7·ui + h8·vi + h9).

Eliminating α gives:

ui′(h7·ui + h8·vi + h9) = h1·ui + h2·vi + h3;
vi′(h7·ui + h8·vi + h9) = h4·ui + h5·vi + h6;

and rearranging:

h7·ui·ui′ + h8·vi·ui′ + h9·ui′ - h1·ui - h2·vi - h3 = 0;
h7·ui·vi′ + h8·vi·vi′ + h9·vi′ - h4·ui - h5·vi - h6 = 0.

Writing the above equations as a matrix-vector product gives Ai·h = 0, with

Ai = [[-ui, -vi, -1, 0, 0, 0, ui·ui′, vi·ui′, ui′],
      [0, 0, 0, -ui, -vi, -1, ui·vi′, vi·vi′, vi′]];
h = [h1, h2, h3, h4, h5, h6, h7, h8, h9]^T.

From the above, each pair of matching points provides 2 linear equations. h has 9 unknowns; setting h9 = 1, the 8 equations provided by the 4 pairs of matching points solve the remaining 8 unknowns. After solving for h, the homography matrix H is obtained.
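A minimal sketch of this solve under the h9 = 1 convention (the point coordinates are illustrative assumptions; in practice cv2.getPerspectiveTransform on the same four point pairs returns the same H up to scale):

import numpy as np

# Hedged sketch: build the 8x8 linear system above with h9 = 1.
# src = four reference points, dst = circumscribed-rectangle vertices;
# the numeric coordinates are illustrative assumptions.
src = np.array([[420, 80], [380, 900], [820, 880], [860, 100]], np.float64)
dst = np.array([[380, 80], [380, 900], [860, 900], [860, 80]], np.float64)

A, b = [], []
for (u, v), (u_, v_) in zip(src, dst):
    # rows of Ai with the h9 terms moved to the right-hand side
    A.append([-u, -v, -1, 0, 0, 0, u * u_, v * u_]); b.append(-u_)
    A.append([0, 0, 0, -u, -v, -1, u * v_, v * v_]); b.append(-v_)

h = np.linalg.solve(np.array(A), np.array(b))
H = np.append(h, 1.0).reshape(3, 3)  # homography matrix with h9 = 1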
1034. Determining the reprojection matrix H2′ according to the homography matrix H and the intrinsic parameter matrix M;
In the embodiment of the present application, reprojection means projecting the distortion-corrected image again to obtain the target image. To achieve reprojection, the reprojection matrix H2′ may be determined from the homography matrix H and the intrinsic parameter matrix M in the following steps.
First: determining the projection intrinsic matrix M′ and the projection rotation matrix R′ according to the homography matrix H and the intrinsic parameter matrix M;
The reprojection process can be understood as rotating the camera coordinate system again and performing a central projection on top of the original imaging. Expressed in formulas, the original image satisfies

α · [u, v, 1]^T = M · R · [x_w, y_w, z_w]^T

and the reprojected image satisfies

α′ · [u′, v′, 1]^T = M′ · R′ · R · [x_w, y_w, z_w]^T.

Because the homography maps [u, v, 1]^T to [u′, v′, 1]^T up to scale, combining the two formulas gives

H = M′ · R′ · M⁻¹.

Since the camera intrinsic parameter matrix M is available from step 102 and the homography matrix H is obtained in step 1033, we have

Q = H · M = M′ · R′.

In this formula the projection intrinsic matrix M′ is an upper triangular matrix and the projection rotation matrix R′ is an orthonormal matrix, so both can be obtained by performing an "RQ" decomposition: in theory the matrix Q can always be RQ-decomposed, and the resulting upper triangular factor and orthonormal factor can be regarded as an intrinsic matrix and a rotation matrix respectively.
Therefore, from the homography matrix H and the intrinsic parameter matrix M, the product of the two is first calculated to obtain the first matrix Q; matrix decomposition (e.g., RQ decomposition) is then performed on Q to obtain an upper triangular matrix and an orthonormal matrix; finally, the upper triangular matrix is set as the projection intrinsic matrix M′ and the orthonormal matrix as the projection rotation matrix R′.
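A minimal sketch of this decomposition, assuming SciPy is available (scipy.linalg.rq factors a matrix into an upper triangular times an orthonormal factor); the sign fix-up is a common post-processing convention, not something the patent specifies:

import numpy as np
from scipy.linalg import rq

def decompose_projection(H: np.ndarray, M: np.ndarray):
    # Hedged sketch: Q = H M, then RQ decomposition Q = M' R'.
    Q = H @ M
    M_proj, R_proj = rq(Q)
    # Flip signs so M' has a positive diagonal (S @ S = I keeps Q intact),
    # then scale M' so its bottom-right entry is 1 (homographies are
    # defined up to scale).
    S = np.diag(np.sign(np.diag(M_proj)))
    M_proj, R_proj = M_proj @ S, S @ R_proj
    return M_proj / M_proj[2, 2], R_proj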
In some embodiments, instead of selecting four reference points to obtain the projection rotation matrix R′, R′ may be determined by acquiring a preset set of xyz rotation angles and constructing R′ from them. In this case the projection intrinsic matrix M′ may be set equal to the intrinsic parameter matrix M.
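A sketch of building R′ from preset x, y, z angles; the composition order below is one common convention and the angle values are illustrative assumptions:

import numpy as np

def rotation_xyz(ax_deg: float, ay_deg: float, az_deg: float) -> np.ndarray:
    # Hedged sketch: projection rotation matrix R' from x/y/z angles.
    ax, ay, az = np.radians([ax_deg, ay_deg, az_deg])
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(ax), -np.sin(ax)],
                   [0, np.sin(ax), np.cos(ax)]])
    Ry = np.array([[np.cos(ay), 0, np.sin(ay)],
                   [0, 1, 0],
                   [-np.sin(ay), 0, np.cos(ay)]])
    Rz = np.array([[np.cos(az), -np.sin(az), 0],
                   [np.sin(az), np.cos(az), 0],
                   [0, 0, 1]])
    return Rz @ Ry @ Rx  # one common x-y-z composition order

R_proj = rotation_xyz(-20.0, 0.0, 0.0)  # assumed tilt toward the building top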
In some cases, the distortion-corrected image could be projected directly using the projection intrinsic matrix M′; however, the image obtained this way contains large black-border areas and is unsuitable for direct display. The projection intrinsic matrix M′ therefore needs to be adjusted to obtain a new projection intrinsic matrix M2′. The adjustment method is not unique; it may proceed as follows.
Second: calculating the new projection intrinsic matrix M2′ according to the projection intrinsic matrix M′ and the homography matrix H;
First, a rectangle RV is selected in the distortion-corrected original image. Fig. 6 is a schematic diagram of this rectangle transformation in the embodiment of the present application. The four vertices of the rectangle RV are RV1(u1, v1), RV2(u2, v2), RV3(u3, v3), RV4(u4, v4). The selection may be manual or computed by other computer-related algorithms, which is not limited in the embodiment of the present application.
Then, as shown in fig. 6, the rectangle RV is projected through the homography matrix H to obtain a polygon RV′, whose four vertices are RV1′(u1′, v1′), RV2′(u2′, v2′), RV3′(u3′, v3′), RV4′(u4′, v4′).
Then, the first scale factor scale_x and the second scale factor scale_y are calculated from the scaling relation between the rectangle RV and the polygon RV′. Specifically, scale_x may be calculated by the first scale formula and scale_y by the second scale formula:

scale_x = W / W′;
scale_y = scale_x · (f/dx′) / (f/dy′);

where W is the width of the rectangle RV, W′ is the width of the largest rectangle embedded in the polygon RV′, and f/dx′ and f/dy′ are given parameters, namely entries of the projection intrinsic matrix M′ used for M2′ in the next step.
Fig. 6 shows the polygon RV′ and its embedded largest rectangle. Given the vertices RV1′(u1′, v1′), RV2′(u2′, v2′), RV3′(u3′, v3′), RV4′(u4′, v4′), the upper-left vertex of the embedded largest rectangle is (max(u2′, u1′), max(v4′, v1′)) and its lower-right vertex is (min(u4′, u3′), min(v2′, v3′)). The width W′ and height H′ of the embedded largest rectangle are:

W′ = min(u4′, u3′) - max(u2′, u1′);
H′ = min(v2′, v3′) - max(v4′, v1′).
finally, according to the first multiple scalexAnd a second multiple scaleyConverting the projection intrinsic parameter matrix M' into a new projection intrinsic parameter matrix M2′。
The expression of the projection internal parameter matrix M' is set as:
Figure BDA0003364504880000121
wherein, the parameter gamma is introduced by selecting four reference points which can not guarantee strict rectangular relation. It will be appreciated that because the foregoing requirement is that four reference points in the orthographic projection be selected to be rectangular, the included angle of the vertex position 90 cannot be strictly guaranteed due to some human or picture error, and that the point that may be selected is actually 89 degrees. If the selected rectangle is 90 degrees, the parameter γ is 0. The parameter γ is-cot θ/dx and represents the physical dimension of a 1/dx pixel in the x-axis direction, i.e., what the length of a pixel actually is in the x-direction. And theta is an included angle between a transverse coordinate axis and a longitudinal coordinate axis under the pixel coordinate system. Parameters are introduced, the vertex angle of the selected four points is not strictly required to be 90 degrees, an algorithm allows an error selected by naked eyes, and the robustness of the scheme is improved.
Then, parameters can be determined by combining the projection internal parameter matrix M' obtained by RQ decomposition and the expression of the projection internal parameter matrix M
Figure BDA0003364504880000122
γ、u0' and v0Specific values of'.
Then, from the width W′ of the embedded largest rectangle together with the parameters f/dx′ and f/dy′, the first scale factor scale_x and the second scale factor scale_y are obtained through the first and second scale formulas.
A new projection center (u0″, v0″) can then be determined as:

u0″ = [u0′ - max(u2′, u1′)] · scale_x;
v0″ = [v0′ - max(v4′, v1′)] · scale_y;

and the new projection intrinsic matrix M2′ is determined as:

M2′ = [[scale_x · f/dx′, γ, u0″], [0, scale_y · f/dy′, v0″], [0, 0, 1]].

According to the first and second scale formulas, it follows that:

scale_x · f/dx′ = scale_y · f/dy′.

Therefore, in the embodiment of the present application, the reprojection scheme determined from the above projection intrinsic matrix M2′ ensures that the pixel resolution in the width and height directions is consistent after reprojection, so that the picture better matches human visual perception.
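Putting this adjustment together as a hedged sketch (the vertex ordering RV1..RV4 = upper-left, lower-left, lower-right, upper-right and the frame size are assumptions; M_proj is the M′ from the RQ decomposition sketch and H the homography from step 1033):

import cv2
import numpy as np

def new_projection_intrinsics(M_proj: np.ndarray, H: np.ndarray,
                              rv: np.ndarray) -> np.ndarray:
    # Hedged sketch: adjust M' into M2' via scale_x/scale_y and the
    # embedded largest rectangle of the projected polygon RV'.
    rvp = cv2.perspectiveTransform(rv.reshape(-1, 1, 2), H).reshape(4, 2)
    W = rv[3, 0] - rv[0, 0]                                     # width of RV
    Wp = min(rvp[3, 0], rvp[2, 0]) - max(rvp[1, 0], rvp[0, 0])  # W'
    fx, fy = M_proj[0, 0], M_proj[1, 1]                         # f/dx', f/dy'
    scale_x = W / Wp
    scale_y = scale_x * fx / fy              # equalizes x/y pixel resolution
    u0 = (M_proj[0, 2] - max(rvp[1, 0], rvp[0, 0])) * scale_x
    v0 = (M_proj[1, 2] - max(rvp[3, 1], rvp[0, 1])) * scale_y
    return np.array([[scale_x * fx, M_proj[0, 1], u0],
                     [0.0, scale_y * fy, v0],
                     [0.0, 0.0, 1.0]])

# Assumed rectangle RV covering a 1920x1080 undistorted frame
RV = np.array([[0, 0], [0, 1080], [1920, 1080], [1920, 0]], np.float64)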
Third: determining the reprojection matrix H2′ according to the projection intrinsic matrix M2′, the projection rotation matrix R′ and the intrinsic parameter matrix M.
In the embodiment of the present application, the reprojection matrix H2′ may be calculated by the reprojection matrix calculation formula:
H2′ = M2′ · R′ · M⁻¹.
104. Reprojecting the distortion-corrected image into a target image according to the reprojection matrix H2′.
First, the coordinate correspondence between the distortion-corrected image and the target image is determined from the reprojection matrix H2′:

α″ · [u″, v″, 1]^T = H2′ · [u, v, 1]^T

where [u″, v″, 1]^T is the coordinate vector of a point in the target image, [u, v, 1]^T is the coordinate vector of the corresponding point in the distortion-corrected image, and α″ is a normalization coefficient.
When an integer coordinate position (u″, v″) corresponds to a (u, v) with a fractional part, an interpolation algorithm similar to bilinear interpolation may be used, or the coordinates may simply be rounded.
Then, the distortion-corrected image is reprojected into the target image according to this coordinate correspondence.
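As a hedged sketch, this whole step matches what cv2.warpPerspective does: it applies the per-pixel homography mapping with bilinear interpolation. The output size is an assumption:

import cv2
import numpy as np

def reproject(undistorted, M2, R_proj, M, size=(1920, 1080)):
    # Hedged sketch: warp the distortion-corrected image with
    # H2' = M2' R' M^-1; bilinear interpolation handles fractional (u, v).
    H2 = M2 @ R_proj @ np.linalg.inv(M)
    return cv2.warpPerspective(undistorted, H2, size, flags=cv2.INTER_LINEAR)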
Fig. 7 is a schematic diagram of a target image according to an embodiment of the present application, and fig. 8 is a simplified diagram of the same. It can be seen that the points P1, P2, P3, P4 in the original image are finally transformed into P1′, P2′, P3′, P4′.
105. Performing parabolic detection according to the target image.
In the high-rise falling-object application scenario, parabolic detection can be performed on the target image. Specifically, moving objects in the picture are first obtained by foreground detection; the foreground detection algorithm may be, for example, a codebook algorithm. Whether a parabolic event has occurred is then judged from whether the trajectory of a moving object in the picture conforms to a parabolic law, for example a gradual fall from top to bottom whose track essentially follows a parabola.
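A minimal end-of-pipeline sketch, substituting OpenCV's MOG2 background subtractor for the codebook algorithm named above; the video path, blob-size threshold and the crude monotonic-fall test are all illustrative assumptions:

import cv2

# Hedged sketch: foreground detection on reprojected frames plus a
# crude falling-object check. MOG2 stands in for codebook here.
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)
track = []  # (frame index, blob center x, blob center y)

cap = cv2.VideoCapture("reprojected.mp4")  # assumed reprojected video
idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        c = max(contours, key=cv2.contourArea)
        if cv2.contourArea(c) > 25:  # assumed minimum blob area
            x, y, w, h = cv2.boundingRect(c)
            track.append((idx, x + w / 2, y + h / 2))
    idx += 1

# Crude parabolic-law test: the blob should fall (y grows) over time
falling = len(track) > 5 and track[-1][2] - track[0][2] > 50
print("possible falling-object event" if falling else "no event detected")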
Fig. 9 is an exemplary diagram of a terminal device according to an embodiment of the present application. The terminal device 9 includes: a processor 901 and a memory 902, the processor 901 and the memory 902 communicating, the processor 901 further being communicatively connected to a plurality of lamps via a communication module, the memory 902 having stored thereon a computer program 903. The processor 901, when executing the computer program 903, implements the methods of the various embodiments described above in relation to fig. 1 or fig. 2.
Fig. 10 is a schematic diagram of an apparatus for image processing according to an embodiment of the present application. The image processing apparatus 10 includes:
an obtaining module 1001, configured to execute or implement step 101 in each embodiment corresponding to fig. 1;
a processing module 1002, configured to execute or implement step 102 and step 103 in each embodiment corresponding to fig. 1, or execute or implement a flow of each embodiment corresponding to fig. 2;
a re-projection module 1003, configured to perform or implement step 104 in the above-described embodiments corresponding to fig. 1.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
An embodiment of the present application further provides a network device, where the network device includes: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, the processor implementing the steps of any of the various method embodiments described above when executing the computer program.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above-mentioned method embodiments.
The embodiments of the present application provide a computer program product, which when running on a mobile terminal, enables the mobile terminal to implement the steps in the above method embodiments when executed.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, and so on. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to a photographing apparatus/terminal apparatus, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, or a software distribution medium, for example a USB flash disk, a removable hard disk, a magnetic disk or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, computer-readable media may not be electrical carrier signals or telecommunications signals.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above-described apparatus/network device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (11)

1. An image processing method, comprising:
acquiring an original image;
performing distortion correction on the original image and determining an intrinsic parameter matrix M during the distortion correction;
determining a reprojection matrix H2′ according to the intrinsic parameter matrix M;
reprojecting the distortion-corrected image into a target image according to the reprojection matrix H2′.
2. The method of claim 1, wherein the determining a reprojection matrix H2′ according to the intrinsic parameter matrix M comprises:
determining the reprojection matrix H2′ according to a determined projection intrinsic matrix M2′, a determined projection rotation matrix R′ and the intrinsic parameter matrix M.
3. The method of claim 2, wherein the projection rotation matrix R' is determined by:
selecting four reference points of a building area in the image subjected to distortion correction;
determining a circumscribed rectangle of the four reference points;
determining a homography matrix H according to the four reference points and four vertexes of the circumscribed rectangle;
and determining a projection intrinsic matrix M′ and a projection rotation matrix R′ according to the homography matrix H and the intrinsic parameter matrix M.
4. The method of claim 3, wherein determining a projection intrinsic matrix M′ and a projection rotation matrix R′ according to the homography matrix H and the intrinsic parameter matrix M comprises:
calculating the product of the homography matrix H and the intrinsic parameter matrix M to obtain a first matrix Q;
performing matrix decomposition on the first matrix Q to obtain an upper triangular matrix and an orthonormal matrix;
setting the upper triangular matrix as the projection intrinsic matrix M′ and the orthonormal matrix as the projection rotation matrix R′.
5. The method of claim 2, wherein the projection rotation matrix R' can be determined by:
acquiring a preset group of xyz rotation angles;
determining the projection rotation matrix R′ according to the set of xyz rotation angles.
6. The method of claim 3, wherein the projection intrinsic matrix M2′ is determined by:
calculating a new projection intrinsic matrix M2′ according to the projection intrinsic matrix M′ and the homography matrix H.
7. The method of claim 6, wherein calculating the new projection intrinsic matrix M2′ according to the projection intrinsic matrix M′ and the homography matrix H comprises:
selecting a rectangle RV in the distortion-corrected image;
projecting the rectangle RV through the homography matrix H to obtain a polygon RV′;
calculating a first scale factor scale_x and a second scale factor scale_y according to the scaling relation between the rectangle RV and the polygon RV′;
converting the projection intrinsic matrix M′ into the new projection intrinsic matrix M2′ according to the first scale factor scale_x and the second scale factor scale_y.
8. The method of claim 2, wherein the determining the reprojection matrix H2′ according to the projection intrinsic matrix M2′, the projection rotation matrix R′ and the intrinsic parameter matrix M comprises:
calculating the reprojection matrix H2′ by a reprojection matrix calculation formula;
the reprojection matrix calculation formula being: H2′ = M2′ · R′ · M⁻¹.
9. An apparatus for image processing, comprising:
the acquisition module is used for acquiring an original image;
a processing module, configured to perform distortion correction on the original image, determine an intrinsic parameter matrix M during the distortion correction, and determine a reprojection matrix H2′ according to the intrinsic parameter matrix M;
a reprojection module, configured to reproject the distortion-corrected image into a target image according to the reprojection matrix H2′.
10. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 8 when executing the computer program.
11. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 8.
CN202111390221.0A 2021-11-19 2021-11-19 Image processing method and related device Pending CN114066763A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111390221.0A CN114066763A (en) 2021-11-19 2021-11-19 Image processing method and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111390221.0A CN114066763A (en) 2021-11-19 2021-11-19 Image processing method and related device

Publications (1)

Publication Number Publication Date
CN114066763A true CN114066763A (en) 2022-02-18

Family

ID=80279070

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111390221.0A Pending CN114066763A (en) 2021-11-19 2021-11-19 Image processing method and related device

Country Status (1)

Country Link
CN (1) CN114066763A (en)

Similar Documents

Publication Publication Date Title
CN110009561B (en) Method and system for mapping surveillance video target to three-dimensional geographic scene model
WO2022078156A1 (en) Method and system for parking space management
KR100653200B1 (en) Method and apparatus for providing panoramic view with geometry correction
US11330172B2 (en) Panoramic image generating method and apparatus
CN105046657B (en) A kind of image stretch distortion self-adapting correction method
JPWO2018235163A1 (en) Calibration apparatus, calibration chart, chart pattern generation apparatus, and calibration method
CN112686877B (en) Binocular camera-based three-dimensional house damage model construction and measurement method and system
US20050265619A1 (en) Image providing method and device
EP3523777A1 (en) System and method for rectifying a wide-angle image
CN111652937A (en) Vehicle-mounted camera calibration method and device
JP6345345B2 (en) Image processing apparatus, image processing method, and image processing program
CN114004890B (en) Attitude determination method and apparatus, electronic device, and storage medium
JP4751084B2 (en) Mapping function generation method and apparatus, and composite video generation method and apparatus
US11544839B2 (en) System, apparatus and method for facilitating inspection of a target object
CN113132708B (en) Method and apparatus for acquiring three-dimensional scene image using fisheye camera, device and medium
CN108734666B (en) Fisheye image correction method and device
CN108596981B (en) Aerial view angle re-projection method and device of image and portable terminal
US20190230264A1 (en) Image capturing apparatus and image capturing method
CN114066763A (en) Image processing method and related device
CN116051652A (en) Parameter calibration method, electronic equipment and storage medium
CN115753019A (en) Pose adjusting method, device and equipment of acquisition equipment and readable storage medium
CN114742726A (en) Blind area detection method and device, electronic equipment and storage medium
CN112116524A (en) Method and device for correcting street view image facade texture
KR102076635B1 (en) Apparatus and method for generating panorama image for scattered fixed cameras
Zhang et al. Fisheye image correction based on straight-line detection and preservation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination