CN112066950B - Multi-optical-axis parallel mapping camera single-center projection conversion method - Google Patents

Multi-optical-axis parallel mapping camera single-center projection conversion method Download PDF

Info

Publication number
CN112066950B
CN112066950B CN202010724024.7A CN202010724024A CN112066950B CN 112066950 B CN112066950 B CN 112066950B CN 202010724024 A CN202010724024 A CN 202010724024A CN 112066950 B CN112066950 B CN 112066950B
Authority
CN
China
Prior art keywords
camera unit
camera
coordinate system
ith
focal plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010724024.7A
Other languages
Chinese (zh)
Other versions
CN112066950A (en
Inventor
张翠
刘秀
王斌
钟灿
汪洲
袁胜帮
宋立国
李永昆
曹桂丽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Space Research Mechanical and Electricity
Original Assignee
Beijing Institute of Space Research Mechanical and Electricity
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Space Research Mechanical and Electricity filed Critical Beijing Institute of Space Research Mechanical and Electricity
Priority to CN202010724024.7A priority Critical patent/CN112066950B/en
Publication of CN112066950A publication Critical patent/CN112066950A/en
Application granted granted Critical
Publication of CN112066950B publication Critical patent/CN112066950B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures

Abstract

The invention discloses a single-center projection conversion method for a multi-optical-axis parallel surveying and mapping camera, which provides a conversion model from a multi-center projection image to a single-center projection image and a conversion error calculation formula, solves the problem that the conversion model of the multi-optical-axis parallel surveying and mapping camera cannot be accurately described by the conventional single-center projection conversion method, and realizes the core problem of a large-breadth aerial surveying camera by splicing optical-axis parallel multi-area arrays.

Description

Multi-optical-axis parallel mapping camera single-center projection conversion method
Technical Field
The invention belongs to the technical field of aerial photogrammetry, and particularly relates to a single-center projection conversion method for a surveying and mapping camera with multiple parallel optical axes.
Background
The surveying and mapping method is widely applied to the fields of natural disaster emergency monitoring, land resource investigation, ecological environment monitoring, urban fine management and the like.
The large-format surveying camera is the future development direction of area array surveying equipment, large-format imaging is difficult to realize due to the limitation of the pixel scale of a single CCD or CMOS device, and the common solution is to adopt a multi-area array splicing technology, and the methods generally comprise inner view field splicing, outer view field splicing, inner and outer view field mixed splicing and the like. Generally, the inner view field is spliced by adopting a single lens, and single-center photogrammetry is adopted; the outer view field splicing generally adopts inclined photography, a main optical axis of each lens has a certain angle with the vertical direction, and an approximate single-center photogrammetric spliced image is obtained through data processing; the inner and outer fields of view are mixed and spliced to form a common lens in a linear arrangement, the height approximate to that of a single-center photogrammetry is realized through delayed simultaneous exposure, and the data processing method of the splicing method is relatively mature.
Under the limitation of the device scale and the limitation of a surveying and mapping equipment mounting platform, the splicing method has difficulty in further increasing the imaging breadth. The splicing of multiple lenses and multiple detectors with parallel main optical axes can realize the splicing integration of more imaging devices and realize the imaging with larger breadth. However, in the application of the multi-lens multi-detector splicing camera with parallel main optical axes in the surveying and mapping field, the single-center projection conversion process and the virtual single-center image generation method are very important to the imaging precision and the application of the camera.
Aiming at multi-lens multi-detector splicing surveying and mapping equipment with parallel main optical axes, the existing single-center projection conversion method cannot accurately describe a conversion model of a surveying and mapping camera with parallel multi-optical axes.
Disclosure of Invention
The technical problem solved by the invention is as follows: the method overcomes the defects of the prior art, provides a single-center projection conversion method of the surveying and mapping camera with multiple parallel optical axes, solves the problem that the conventional single-center projection conversion method cannot accurately describe a conversion model of the surveying and mapping camera with multiple parallel optical axes, and realizes the core problem of a large-breadth aerial surveying camera through splicing of optical axis parallel multi-area arrays.
The purpose of the invention is realized by the following technical scheme: a method of single-center projection conversion for a multi-axis parallel mapping camera, the method comprising the steps of:
the method comprises the following steps: the method comprises the steps that S camera units are arranged in a preset camera, the parameters of lenses in each camera unit are the same, and the main optical axes of the lenses are parallel; the camera has M multiplied by N focal plane components participating in splicing, the number of pixels of each focal plane component which are effectively spliced is M multiplied by N, and the size of a virtual focal plane at the focal plane of the lens of each camera unit is Mm multiplied by Nn; wherein M is the number of rows participating in splicing the focal plane assembly array, N is the number of columns participating in splicing the focal plane assembly array, M is the number of rows of the number of effective splicing pixels of each focal plane assembly, and N is the number of columns of the number of effective splicing pixels of each focal plane assembly;
establishing a focal plane coordinate system o of the lens of the ith camera unit by taking the virtual focal plane pixel center of the lens of the ith camera unit as a coordinate origin, taking the virtual focal plane pixel row direction as a u axis and taking the virtual focal plane pixel column direction as a v axis i -u i v i Wherein i =1,2, … S;
using the shooting station of the ith camera unit as the coordinate origin S i Establishing an image space coordinate system S of the ith camera unit according to a right-hand coordinate system by taking the pixel row direction of the parallel virtual focal plane as an x axis and the pixel column direction of the parallel virtual focal plane as a y axis i -x i y i z i Wherein i =1,2, … S;
step two: establishing a ground auxiliary coordinate system O-XYZ; wherein the camera station S of the 1 st camera unit 1 The vertical projection point on the ground is O point, and x of the image space coordinate system of the 1 st camera unit 1 The axis parallel to the axis is the X-axis of the ground auxiliary coordinate system and the y-axis of the image space coordinate system of the 1 st camera unit 1 The axis parallel to the axis is the y axis of the ground auxiliary coordinate system;
step three: obtaining a rotation matrix of each camera unit except the 1 st camera unit relative to the 1 st camera unit according to a line element of each camera unit except the 1 st camera unit relative to the 1 st camera unit and an angle element of each camera unit except the 1 st camera unit relative to the 1 st camera unit;
step four: with the image space coordinate system S of the 1 st camera unit 1 -x 1 y 1 z 1 And (3) as a virtual image space coordinate system, carrying out projection conversion on coordinate points of all camera units except the 1 st camera unit to the virtual image space coordinate system according to a mapping camera single-center projection conversion formula with multiple parallel optical axes to obtain a large-breadth virtual image with equivalent single-center projection.
In the above single-center projection conversion method for multi-optical-axis parallel mapping camera, in step three, a rotation matrix of the ith camera unit relative to the 1 st camera unit is obtained according to the line element of the ith camera unit relative to the 1 st camera unit and the angle element of the ith camera unit relative to the 1 st camera unit.
In the above single-center projection conversion method for a multi-axis parallel mapping camera, the rotation matrix of the ith camera unit relative to the 1 st camera unit is obtained by the following formula:
Figure GDA0003773696590000031
wherein, a i1 、a i2 、a i3 、b i1 、b i2 、b i3 、c i1 、c i2 And c i3 Are all coefficients in the rotation matrix of the i-th camera element relative to the 1 st camera element, Δ X i Is the abscissa, deltaY, of the camera station of the ith camera unit relative to the 1 st camera unit in the face auxiliary coordinate system O-XYZ i For the camera station of the ith camera unit, in the ground-assisted coordinate system O-XYZ, relative to the ordinate, Δ Z, of the 1 st camera unit i Is the vertical coordinate, phi, of the camera station of the ith camera unit relative to the 1 st camera unit in the ground auxiliary coordinate system O-XYZ i As image space coordinate system S of the ith camera unit i -x i y i z i Image space coordinate system S relative to the 1 st camera unit 1 -x 1 y 1 z 1 X axial rotation angle of, omega i For the image space coordinate system S of the ith camera unit i -x i y i z i Image space coordinate system S relative to the 1 st camera unit 1 -x 1 y 1 z 1 Y-axis rotation angle of (k) i As image space coordinate system S of the ith camera unit i -x i y i z i Image space coordinate system S relative to the 1 st camera unit 1 -x 1 y 1 z 1 Z-axis rotation angle.
In the foregoing method for converting single-center projection of a multi-optical-axis parallel surveying camera, in step four, the conversion formula of single-center projection of the multi-optical-axis parallel surveying camera is:
Figure GDA0003773696590000032
wherein x is 1 A virtual ideal imaging point a of the ground point A in the 1 st camera unit in the ground auxiliary coordinate system O-XYZ 1 Abscissa of (a), y 1 A virtual ideal imaging point a of the ground point A in the 1 st camera unit in the ground auxiliary coordinate system O-XYZ 1 H is the camera site S of the 1 st camera unit 1 Vertical coordinates in a ground auxiliary coordinate system O-XYZ; f. of 1 Is the principal distance of the 1 st camera unit, a i1 、a i2 、a i3 、b i1 、b i2 And b i3 Are all coefficients in the rotation matrix of the i-th camera element relative to the 1 st camera element, f i Is the principal distance, x, of the ith camera unit i0 Is the principal point abscissa, y, of the ith camera unit i0 Is the principal point ordinate of the i-th camera element.
The single-center projection conversion method of the mapping camera with the parallel multiple optical axes further comprises the following steps: and calculating the theoretical conversion error of the large-format virtual image of the equivalent single-center projection.
In the above single-center projection conversion method for a surveying and mapping camera with parallel multiple optical axes, the conversion error is as follows:
Figure GDA0003773696590000041
where Δ x is the conversion error of the abscissa, Δ y is the conversion error of the ordinate, H 0 Is the actual elevation of the ground point A, f 1 Is the principal distance of the 1 st camera unit, and H is the camera site S of the 1 st camera unit 1 Vertical coordinate, Δ X, in the ground-assisted coordinate system O-XYZ i Is the abscissa, deltaY, of the camera station of the ith camera unit in the ground auxiliary coordinate system O-XYZ relative to the 1 st camera unit i The camera station for the ith camera element is the ordinate of the 1 st camera element in the ground-assisted coordinate system O-XYZ.
Compared with the prior art, the invention has the following beneficial effects:
the invention provides a single-center projection conversion method of a surveying camera with multiple parallel optical axes, provides a conversion model from a multi-center projection image to a single-center projection image from a principle level, provides a conversion error calculation formula, solves the problem that the conversion model of the surveying camera with multiple parallel optical axes cannot be accurately described by the conventional single-center projection conversion method, and realizes the core problem of a large-breadth aerial surveying camera by splicing optical axis parallel multi-area arrays. And the approximate single-center projection image can be output according to the invention, the image output interface is the same as that of the existing single-center aerial survey camera, and the seamless butt joint with the existing aerial survey system processing software is realized.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a schematic diagram of an image space coordinate system provided by an embodiment of the present invention;
FIG. 2 is a schematic diagram of a ground-assisted coordinate system provided by an embodiment of the invention;
FIG. 3 is a schematic diagram of a multi-center to single-center projection transformation model provided by an embodiment of the invention;
fig. 4 is a schematic diagram of an imaging system provided by an embodiment of the invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art. It should be noted that the embodiments and features of the embodiments of the present invention may be combined with each other without conflict. The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
The embodiment provides a single-center projection conversion method of a mapping camera with multiple parallel optical axes, which comprises the following steps:
the method comprises the following steps: the method comprises the steps that S camera units are arranged in a preset camera, the parameters of lenses in each camera unit are the same, and the main optical axes of the lenses are parallel; the camera has M multiplied by N focal plane components participating in splicing, the number of pixels of each focal plane component which are effectively spliced is M multiplied by N, and the size of a virtual focal plane at the focal plane of the lens of each camera unit is Mm multiplied by Nn; wherein M is the number of rows participating in splicing the focal plane assembly array, N is the number of columns participating in splicing the focal plane assembly array, M is the number of rows of the number of effective splicing pixels of each focal plane assembly, and N is the number of columns of the number of effective splicing pixels of each focal plane assembly;
establishing a focal plane coordinate system o of the lens of the ith camera unit by taking the virtual focal plane pixel center of the lens of the ith camera unit as a coordinate origin, taking the virtual focal plane pixel row direction as a u axis and taking the virtual focal plane pixel column direction as a v axis i -u i v i Wherein i =1,2, … S;
using the shooting station of the ith camera unit as the coordinate origin S i Establishing an image space coordinate system S of the ith camera unit according to a right-hand coordinate system by taking the pixel row direction of the parallel virtual focal plane as an x axis and the pixel row direction of the parallel virtual focal plane as a y axis i -x i y i z i Wherein i =1,2, … S;
step two: establishing a ground auxiliary coordinate system O-XYZ; wherein the camera station S of the 1 st camera unit 1 The vertical projection point on the ground is O point, and x of the image space coordinate system of the 1 st camera unit 1 The axis parallel to the axis is the X-axis of the ground auxiliary coordinate system and the y-axis of the image space coordinate system of the 1 st camera unit 1 The axis parallel to the axis is the y axis of the ground auxiliary coordinate system; as shown in fig. 2.
Step three: obtaining a rotation matrix of each camera unit except the 1 st camera unit relative to the 1 st camera unit according to a line element of each camera unit except the 1 st camera unit relative to the 1 st camera unit and an angle element of each camera unit except the 1 st camera unit relative to the 1 st camera unit;
step four: using the image space coordinate system S of the lens of the 1 st camera unit 1 -x 1 y 1 z 1 And (3) as a virtual image space coordinate system, carrying out projection conversion on coordinate points of all camera units except the 1 st camera unit to the virtual image space coordinate system according to a mapping camera single-center projection conversion formula with multiple parallel optical axes to obtain a large-breadth virtual image with equivalent single-center projection.
Specifically, in the first step, each focal plane pixel is renamed. The size of the virtual focal plane at the lens focal plane in each camera unit is Mm N, as in FIG. 1, the focal plane at row A and column B, where point P (i, j) is renamed to P ((A-1) m + i, (B-1) n + j).
Establishing a focal plane coordinate system o by taking the center of a virtual focal plane pixel as a coordinate origin, the row direction of the virtual focal plane pixel as a u axis and the row direction of the virtual focal plane pixel as a v axis i -u i v i Where i =1,2, … S.
Using camera station as coordinate origin S i Establishing an image space coordinate system S according to a right-hand coordinate system, wherein the x axis is parallel to the pixel row direction of the virtual focal plane and the y axis is parallel to the pixel row direction of the virtual focal plane i -x i y i z i Where i =1,2, … S.
Second, the pickup station of the 1 st camera unit is S 1 O point is S 1 And (3) projecting in a ground auxiliary coordinate system, establishing a virtual ground auxiliary coordinate system by taking O as an origin, wherein a X, Y axis is parallel to x and y axes of an image space coordinate system, and establishing a ground auxiliary coordinate system projection O-XYZ according to a right-hand coordinate system.
And thirdly, the internal orientation element comprises a principal point, a principal distance and a distortion parameter. Measuring the interior orientation element in each camera unit, the principal point being (x) i0 ,y i0 ) Main pitch f i And a distortion parameter, wherein i =1,2, … S.
Fourthly, measuring relative exterior orientation elements in each camera unit, wherein the relative exterior orientation elements in each camera unit comprise relative exteriorAn azimuth element and an outer azimuth element. The line element of each camera unit with respect to the 1 st camera unit is (Δ X) i ,ΔY i ,ΔZ i ) Angle element phi of each camera unit with respect to the 1 st camera unit i 、ω i 、κ i The corresponding rotation matrix is R i Defined as:
Figure GDA0003773696590000071
wherein i =2,3, … S. a is i1 、a i2 、a i3 、b i1 、b i2 、b i3 、c i1 、c i2 And c i3 Are all coefficients in the rotation matrix of the i-th camera element relative to the 1 st camera element, Δ X i Is the abscissa, deltaY, of the camera station of the ith camera unit in the ground auxiliary coordinate system O-XYZ relative to the 1 st camera unit i For the camera station of the ith camera unit, in the ground-assisted coordinate system O-XYZ, relative to the ordinate, Δ Z, of the 1 st camera unit i Is the vertical coordinate, phi, of the camera station of the ith camera unit relative to the 1 st camera unit in the ground auxiliary coordinate system O-XYZ i As image space coordinate system S of the ith camera unit i -x i y i z i Image space coordinate system S relative to the 1 st camera unit 1 -x 1 y 1 z 1 X axial rotation angle, ω i For the image space coordinate system S of the ith camera unit i -x i y i z i Image space coordinate system S relative to the 1 st camera unit 1 -x 1 y 1 z 1 Y-axis rotation angle of (k) i As image space coordinate system S of the ith camera unit i -x i y i z i Image space coordinate system S relative to the 1 st camera unit 1 -x 1 y 1 z 1 Z-axis rotation angle.
The fifth step is to use the image space coordinate system S of the 1 st camera unit 1 -x 1 y 1 z 1 Mapping camera units according to multiple parallel optical axes for a virtual image space coordinate systemThe central projection conversion formula performs projection conversion on coordinate points of all the camera units except the 1 st camera unit to a virtual image space coordinate system to obtain a large-format virtual image projected by an equivalent single center. The derivation process of the conversion formula (6) of the single-center projection of the mapping camera with multiple parallel optical axes is as follows:
as shown in FIG. 3, assume the 1 st camera unit' S photosite S in the ground-aided coordinate system O-XYZ 1 The coordinate is S 1 (0, H) wherein H is S 1 Vertical coordinates in a ground auxiliary coordinate system O-XYZ;
the line element from the fourth step ith camera unit to the 1 st camera unit is (Δ X) i ,ΔY i ,ΔZ i ) Easy to obtain the shooting point S of the ith camera unit in the ground auxiliary coordinate system O-XYZ i The coordinate is S i (ΔX i ,ΔY i ,H+ΔZ i );
Ground point A (X, Y, 0) in the ground auxiliary coordinate system O-XYZ, which is the imaging point a in the i-th camera unit i (x i ,y i ,-f i ) Virtual imaging point a in the 1 st camera unit 1 (x 1 ,y 1 ,-f 1 )。
Shooting station S of ith camera unit in ground auxiliary coordinate system O-XYZ i And in the ground auxiliary coordinate system O-XYZ, the ground point A (X, Y, 0) is imaged in the ith camera unit in a space coordinate system S i -x i y i z i Middle real imaging point a i And a ground point A in the ground auxiliary coordinate system O-XYZ, wherein the three points are collinear and are represented by a collinear equation:
Figure GDA0003773696590000081
Figure GDA0003773696590000082
x i is a i Abscissa, y, of the imaged point i Is a i Ordinate, x, of the imaging point i0 Is the principal point abscissa, y, of the ith camera unit i0 Is the principal point ordinate, f, of the i-th camera unit i Is the principal distance of the ith camera element,
Figure GDA0003773696590000083
is a rotation matrix in the formula (1),
Figure GDA0003773696590000084
is a transition matrix.
It is possible to obtain a solution of,
Figure GDA0003773696590000085
shooting station S of the 1 st camera unit in the ground auxiliary coordinate system O-XYZ 1 And an ideal imaging point a of the ground point A (X, Y, 0) in the 1 st camera unit in the ground auxiliary coordinate system O-XYZ 1 The ground point A in the ground auxiliary coordinate system O-XYZ is collinear, and the collinear equation is as follows:
Figure GDA0003773696590000086
x 1 is a 1 Abscissa, y, of the imaged point 1 Is a 1 Ordinate of imaging point, f 1 Is the principal distance of the camera 1.
The formula (4) with the person formula (5) has
Figure GDA0003773696590000091
And the formula (6) is a single-center projection conversion formula of the mapping camera with multiple parallel optical axes.
And sixthly, correcting the ground hypothesis as a plane by a formula (6) in order to realize the splicing construction of the virtual image approximate to the single center. The fluctuation of the actual ground surface can cause the theoretical error of splicing in the conversion, as shown in FIG. 3, the actual elevation of the point A is assumed to be H 0 Relative H and H-H in actual engineering implementation 0 Δ Z i Very small, let H + Δ Z i ≈H,H-H 0 +ΔZ i ≈H-H 0 Can obtain the productError is as follows
Figure GDA0003773696590000092
In the embodiment of the single-center projection conversion method of the multi-optical-axis parallel mapping camera, optical axes of 4 lenses corresponding to an imaging system are parallel, each lens corresponds to one group of focal planes, namely 2 x 2 focal plane components participate in splicing, the effective splicing pixel number of each focal plane component is 5120 x 5120, and the total effective splicing pixel number after splicing is 2 x 5120 x 2 x 5120. The method comprises the following specific steps:
first, the pixels at each focal plane are renamed. The size of the virtual focal plane at each lens focal plane is 2 × 5120 × 2 × 5120, as in FIG. 4, the focal plane at row 2 and column 2, with point P (i, j) renamed to P ((2-1) × 5120+ i, (2-1) × 5120+ j).
Establishing a focal plane coordinate system o by taking the center of a focal plane pixel as a coordinate origin and the pixel arrangement directions as u and v axes respectively i -u i v i Where i =1,2, ….
Using camera station as coordinate origin S i The pixel arrangement directions are respectively x-axis and y-axis, and an image space coordinate system S is established according to a right-hand coordinate system i -x i y i z i Where i =1,2, ….
In the second step, the O point is S 1 And (3) projecting in a ground auxiliary coordinate system, establishing a virtual ground auxiliary coordinate system by taking O as an origin, wherein a X, Y axis is parallel to an x axis and a y axis of an image space coordinate system, and establishing a ground auxiliary coordinate system projection O-XYZ according to a right-hand coordinate system.
Thirdly, measuring the internal orientation elements of each camera, wherein the principal point is (x) i0 ,y i0 ) Main distance f i And a distortion parameter, wherein i =1,2, ….
And fourthly, measuring the relative external orientation elements of each camera, wherein the relative external orientation elements of each camera comprise relative external orientation angle elements and external orientation line elements. The line element of each camera unit with respect to the 1 st camera unit is (Δ X) i ,ΔY i ,ΔZ i ) Angle element phi of each camera unit with respect to the 1 st camera unit i 、ω i 、κ i To is aligned withThe corresponding rotation matrix is R i Defined as:
Figure GDA0003773696590000101
wherein i =2,3, ….
Fifthly, the image space coordinate system of the 1 st camera unit is a virtual image space coordinate system, and other camera units derive the formula converted from the 1 st camera unit. Assume that the pickup station S is, as in FIG. 3 1 (0, H), from the second step, S i (ΔX i ,ΔY i ,H+ΔZ i ) A ground point A (X, Y, 0) which is an imaging point a in the i-th camera unit i (x i ,y i ,-f i ) Virtual imaging point a in the 1 st camera unit 1 (x 1 ,y 1 ,-f 1 )。
S i A, A are collinear, and the collinearity equation is as follows:
Figure GDA0003773696590000102
Figure GDA0003773696590000103
wherein
Figure GDA0003773696590000104
Is a transition matrix.
It is possible to obtain,
Figure GDA0003773696590000111
S 1 、a 1 a collinearity, from collinearity equation:
Figure GDA0003773696590000112
obtaining the single-center projection conversion formula of the surveying and mapping camera with parallel multiple optical axes by using the two formulas
Figure GDA0003773696590000113
The single-center projection conversion formula of the surveying and mapping camera with the parallel multiple optical axes is a single-center projection conversion formula of the surveying and mapping camera with the parallel multiple optical axes. Single-center projection conversion and final projection transformation of multi-optical-axis parallel mapping camera to virtual image plane S 1 -x 1 y 1 And generating a large-format virtual image of the equivalent single-center projection through splicing conversion in the standard.
And sixthly, in order to realize splicing and construct a virtual image approximate to a single center, a mapping camera single-center projection conversion formula with multiple parallel optical axes corrects the ground by assuming the ground as a plane. The actual ground undulation will cause errors in the stitching in the transition, as shown in FIG. 3, assuming that the actual elevation at point A is H 0 Relative H and H-H in actual engineering implementation 0 Δ Z i Very small, order H + Δ Z i ≈H,H-H 0 +ΔZ i ≈H-H 0 The available error is
Figure GDA0003773696590000114
The main optical axis parallel mapping system is a novel mapping imaging system, the embodiment provides a single-center projection conversion method for a mapping camera with multiple parallel optical axes, a conversion model from multiple projection images to a single-center projection image is given from a principle level, a conversion error calculation formula is given, the problem that the conversion model of the mapping camera with multiple parallel optical axes cannot be accurately described by the existing single-center projection conversion method is solved, and the core problem of a large-breadth aerial mapping camera is realized by splicing multiple parallel optical axes. The invention can output approximate single-center projection images, has the same interface with the output images of the existing single-center aerial survey camera, and realizes seamless connection with the processing software of the existing aerial survey system.
Although the present invention has been described with reference to the preferred embodiments, it is not intended to limit the present invention, and those skilled in the art can make variations and modifications of the present invention without departing from the spirit and scope of the present invention by using the methods and technical contents disclosed above.

Claims (1)

1. A method for converting single-center projection of a multi-optical-axis parallel mapping camera, the method comprising the steps of:
the method comprises the following steps: the method comprises the steps that S camera units are arranged in a preset camera, the parameters of lenses in each camera unit are the same, and the main optical axes of the lenses are parallel; the camera has M multiplied by N focal plane components for splicing, the effective splicing pixel number of each focal plane component is M multiplied by N, and the size of a virtual focal plane at the lens focal plane of each camera unit is Mm multiplied by Nn; wherein M is the number of rows participating in splicing the focal plane assembly array, N is the number of columns participating in splicing the focal plane assembly array, M is the number of rows of the number of effective splicing pixels of each focal plane assembly, and N is the number of columns of the number of effective splicing pixels of each focal plane assembly;
establishing a focal plane coordinate system o of the lens of the ith camera unit by taking the virtual focal plane pixel center of the lens of the ith camera unit as a coordinate origin, taking the virtual focal plane pixel row direction as a u axis and taking the virtual focal plane pixel column direction as a v axis i -u i v i Wherein i =1,2, … S;
taking the camera station of the ith camera unit as a coordinate origin S i Establishing an image space coordinate system S of the ith camera unit according to a right-hand coordinate system by taking the pixel row direction of the parallel virtual focal plane as an x axis and the pixel column direction of the parallel virtual focal plane as a y axis i -x i y i z i Wherein i =1,2, … S;
step two: establishing a ground auxiliary coordinate system O-XYZ; wherein the camera station S of the 1 st camera unit 1 The vertical projection point on the ground is O point, and x of the image space coordinate system of the 1 st camera unit 1 The axis parallel to the axis is the X axis of the ground auxiliary coordinate systemY in the image space coordinate system of the 1 st camera unit 1 The axis parallel to the axis is the y axis of the ground auxiliary coordinate system;
step three: obtaining a rotation matrix of each camera unit except the 1 st camera unit relative to the 1 st camera unit according to a line element of each camera unit except the 1 st camera unit relative to the 1 st camera unit and an angle element of each camera unit except the 1 st camera unit relative to the 1 st camera unit;
step four: with the image space coordinate system S of the 1 st camera unit 1 -x 1 y 1 z 1 For a virtual image space coordinate system, carrying out projection conversion on coordinate points of all camera units except the 1 st camera unit to the virtual image space coordinate system according to a mapping camera single-center projection conversion formula with multiple parallel optical axes to obtain a large-breadth virtual image with equivalent single-center projection;
in step three, a rotation matrix of the ith camera unit relative to the 1 st camera unit is obtained according to the line element of the ith camera unit relative to the 1 st camera unit and the angle element of the ith camera unit relative to the 1 st camera unit;
the rotation matrix of the ith camera element relative to the 1 st camera element is obtained by the following formula:
Figure FDA0003773696580000021
wherein, a i1 、a i2 、a i3 、b i1 、b i2 、b i3 、c i1 、c i2 And c i3 Are all coefficients in the rotation matrix of the ith camera element relative to the 1 st camera element i As image space coordinate system S of the ith camera unit i -x i y i z i Relative to the image space coordinate system S of the 1 st camera unit 1 -x 1 y 1 z 1 X axial rotation angle of, omega i For the image space coordinate system S of the ith camera unit i -x i y i z i Image space coordinate system S relative to the 1 st camera unit 1 -x 1 y 1 z 1 Y-axis rotation angle of (k) i For the image space coordinate system S of the ith camera unit i -x i y i z i Image space coordinate system S relative to the 1 st camera unit 1 -x 1 y 1 z 1 Z-axis rotation angle of (c);
in step four, the conversion formula of the single-center projection of the mapping camera with parallel multiple optical axes is as follows:
Figure FDA0003773696580000022
wherein x is 1 A virtual ideal imaging point a of the 1 st camera unit for the ground point A in the ground auxiliary coordinate system O-XYZ 1 Abscissa of (a), y 1 A virtual ideal imaging point a of the 1 st camera unit for the ground point A in the ground auxiliary coordinate system O-XYZ 1 H is the camera site S of the 1 st camera unit 1 Vertical coordinates in a ground auxiliary coordinate system O-XYZ; f. of 1 Is the principal distance of the 1 st camera unit, a i1 、a i2 、a i3 、b i1 、b i2 And b i3 Are all coefficients in the rotation matrix of the i-th camera element relative to the 1 st camera element, f i Is the principal distance, x, of the ith camera unit i0 Is the principal point abscissa, y, of the ith camera unit i0 Is the principal point ordinate of the ith camera unit;
calculating a theoretical conversion error existing in a large-format virtual image of the equivalent single-center projection;
the theoretical conversion error is:
Figure FDA0003773696580000031
where Δ x is the conversion error of the abscissa, Δ y is the conversion error of the ordinate, H 0 Is the actual elevation of the ground point A, f 1 Is the principal distance of the 1 st camera unit, and H is the camera site S of the 1 st camera unit 1 On the groundVertical coordinate, Δ X, in an auxiliary coordinate system O-XYZ i Is the abscissa, deltaY, of the camera station of the ith camera unit in the ground auxiliary coordinate system O-XYZ relative to the 1 st camera unit i The camera station for the ith camera unit is the ordinate of the 1 st camera unit in the ground-assisted coordinate system O-XYZ.
CN202010724024.7A 2020-07-24 2020-07-24 Multi-optical-axis parallel mapping camera single-center projection conversion method Active CN112066950B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010724024.7A CN112066950B (en) 2020-07-24 2020-07-24 Multi-optical-axis parallel mapping camera single-center projection conversion method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010724024.7A CN112066950B (en) 2020-07-24 2020-07-24 Multi-optical-axis parallel mapping camera single-center projection conversion method

Publications (2)

Publication Number Publication Date
CN112066950A CN112066950A (en) 2020-12-11
CN112066950B true CN112066950B (en) 2022-10-14

Family

ID=73656772

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010724024.7A Active CN112066950B (en) 2020-07-24 2020-07-24 Multi-optical-axis parallel mapping camera single-center projection conversion method

Country Status (1)

Country Link
CN (1) CN112066950B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000136905A (en) * 1998-11-02 2000-05-16 Nissan Motor Co Ltd Apparatus for measuring position of end of object, and judgment apparatus for passage of moving body
CN102042825A (en) * 2010-11-09 2011-05-04 青岛市光电工程技术研究院 Three-dimensional imaging measurement system combining planar array imaging with laser scanning
CN102313536A (en) * 2011-07-21 2012-01-11 清华大学 Method for barrier perception based on airborne binocular vision
CN102538763A (en) * 2012-02-14 2012-07-04 清华大学 Method for measuring three-dimensional terrain in river model test
CN108827245A (en) * 2018-05-17 2018-11-16 北京林业大学 A kind of twin-lens portable forestry investigation apparatus
CN110836662A (en) * 2019-11-04 2020-02-25 南京理工大学 Slope displacement monitoring method based on relative orientation and absolute orientation algorithm
CN111339826A (en) * 2020-05-06 2020-06-26 山西大学 Landslide unmanned aerial vehicle linear sensor network frame detection system

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS59182688A (en) * 1983-03-31 1984-10-17 Toshiba Corp Stereoscopic processor
JP5311365B2 (en) * 2007-10-23 2013-10-09 独立行政法人産業技術総合研究所 Stereo camera calibration method and calibration system
CN101441076B (en) * 2008-12-29 2010-06-02 东软集团股份有限公司 Method and device for detecting barrier
CN102062599B (en) * 2010-11-23 2012-09-26 中国科学院遥感应用研究所 Spliced imaging system based on axis-shifting principle
JP5787695B2 (en) * 2011-09-28 2015-09-30 株式会社トプコン Image acquisition device
JP6151902B2 (en) * 2012-09-20 2017-06-21 株式会社トプコン Camera for photo measurement and aerial photographic equipment
CN103115613B (en) * 2013-02-04 2015-04-08 安徽大学 Three-dimensional space positioning method
CN103245322B (en) * 2013-04-10 2015-11-11 南京航空航天大学 A kind of distance-finding method based on binocular stereo vision and system
CN103323028B (en) * 2013-06-14 2015-10-21 武汉大学 One locates conforming satellite multispectral image method for registering based on object space
CN105627926B (en) * 2016-01-22 2017-02-08 尹兴 Four-camera group planar array feature point three-dimensional measurement system and measurement method
CN108627142B (en) * 2018-05-02 2020-07-17 成都纵横自动化技术股份有限公司 Target positioning method combining offline elevation and airborne photoelectric pod
CN108955642B (en) * 2018-05-07 2020-09-01 江苏师范大学 Large-breadth equivalent center projection image seamless splicing method
CN109668525B (en) * 2019-01-30 2020-08-07 哈尔滨超精密装备工程技术中心有限公司 High-precision three-dimensional angle measuring method and device based on reflection grating
CN109931906B (en) * 2019-03-28 2021-02-23 华雁智科(杭州)信息技术有限公司 Camera ranging method and device and electronic equipment
CN112082571B (en) * 2020-07-24 2022-09-23 北京空间机电研究所 Large-breadth mapping camera system and calibration method

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000136905A (en) * 1998-11-02 2000-05-16 Nissan Motor Co Ltd Apparatus for measuring position of end of object, and judgment apparatus for passage of moving body
CN102042825A (en) * 2010-11-09 2011-05-04 青岛市光电工程技术研究院 Three-dimensional imaging measurement system combining planar array imaging with laser scanning
CN102313536A (en) * 2011-07-21 2012-01-11 清华大学 Method for barrier perception based on airborne binocular vision
CN102538763A (en) * 2012-02-14 2012-07-04 清华大学 Method for measuring three-dimensional terrain in river model test
CN108827245A (en) * 2018-05-17 2018-11-16 北京林业大学 A kind of twin-lens portable forestry investigation apparatus
CN110836662A (en) * 2019-11-04 2020-02-25 南京理工大学 Slope displacement monitoring method based on relative orientation and absolute orientation algorithm
CN111339826A (en) * 2020-05-06 2020-06-26 山西大学 Landslide unmanned aerial vehicle linear sensor network frame detection system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
一种多视点平行阵列摄像机标定方法;赵琨等;《信息技术》;20161025(第10期);178-183 *

Also Published As

Publication number Publication date
CN112066950A (en) 2020-12-11

Similar Documents

Publication Publication Date Title
US8717361B2 (en) Method for generating orthophoto image
CN107492069B (en) Image fusion method based on multi-lens sensor
CN110211054B (en) Method for manufacturing distortion-free image of satellite-borne push-broom optical sensor
US20040066454A1 (en) Device and method of measuring data for calibration, program for measuring data for calibration, program recording medium readable with computer, and image data processing device
CN109903227A (en) Full-view image joining method based on camera geometry site
CN103869595B (en) A kind of method that off-axis three anti-camera focal plane is debug
CN101577002A (en) Calibration method of fish-eye lens imaging system applied to target detection
CN103697864B (en) A kind of narrow visual field double camera image splicing method based on large virtual camera
CN110505468B (en) Test calibration and deviation correction method for augmented reality display equipment
US20210364288A1 (en) Optical measurement and calibration method for pose based on three linear array charge coupled devices (ccd) assisted by two area array ccds
CN101882306A (en) High-precision joining method of uneven surface object picture
CN108830811A (en) A kind of aviation image real-time correction method that flight parameter is combined with camera internal reference
CN101424530B (en) Method for generating approximate kernel line of satellite stereo image pairs based on projection reference surface
CN111986267B (en) Coordinate system calibration method of multi-camera vision system
CN102098442B (en) Method and system for calibrating non-overlap ratio of optical axis and visual axis of zoom camera
CN108269234A (en) A kind of lens of panoramic camera Attitude estimation method and panorama camera
CN106289317A (en) The unit calibration method of a kind of single-lens digital aviation measuring camera and device
CN110223233B (en) Unmanned aerial vehicle aerial photography image building method based on image splicing
CN108955642B (en) Large-breadth equivalent center projection image seamless splicing method
JP4631048B2 (en) Imaging apparatus and imaging system parameter calibration method
CN112066950B (en) Multi-optical-axis parallel mapping camera single-center projection conversion method
JP2005244861A (en) Imaging apparatus and imaging system parameter correction method
CN102519484B (en) Multi-disc overall adjustment calibration method of rotary photogrammetry system
CN111583117A (en) Rapid panoramic stitching method and device suitable for space complex environment
CN107728316A (en) With the Equivalent analysis method of off-axis three reflecting optical systems imaging law

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant