CN104406607A - Multi-visual field composite optical sensor calibration device and method

Info

Publication number
CN104406607A
Application number
CN201410676243.7A
Authority
CN (China)
Legal status
Granted (Active)
Other languages
Chinese (zh)
Other versions
CN104406607B (en)
Inventors
江洁, 李宁, 闫劲云
Assignee
Beihang University
Priority and filing date
2014-11-21
Publication date
2015-03-11

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00: Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass

Abstract

The invention provides a calibration device and method for a multi-field-of-view composite optical sensor. The calibration device comprises a three-axis high-precision turntable, a single-star simulator, a marble platform for supporting the single-star simulator, a data-processing computer and the multi-field-of-view composite optical sensor. According to the invention, models of the starlight incidence vector, the starlight reflection vector and the perspective-projection imaging of the multi-field-of-view composite optical sensor are established, and the single sensor and the composite optical sensor are calibrated separately by a two-step method. With the device and method, the optical-system parameters and mounting parameters of the multi-field-of-view composite optical sensor can be measured.

Description

Calibration device and method for a multi-field-of-view composite optical sensor
Technical field
The invention provides a calibration device and method for a multi-field-of-view composite optical sensor, and belongs to the field of aerospace measurement technology.
Background technology
The multi-field-of-view composite optical sensor is a novel aerospace measurement instrument. It uses several virtual fields of view to image near-Earth objects (such as the Earth and the Moon) and stars simultaneously, providing attitude and position information for a spacecraft. A conventional autonomous celestial navigation system determines the spatial position and attitude of the spacecraft by combining several optical sensors (such as a star sensor, a sun sensor, an infrared horizon sensor and an ultraviolet moon sensor). The multi-field-of-view composite optical sensor functionally combines these sensors, which are all based on the same optical principle, so that one sensor serves several purposes. A specific implementation of this sensor is described in Chinese patent ZL201310631365.X. Its working principle is as follows: the optical system of the multi-field-of-view composite optical sensor consists of a four-faced prism and a lens; according to the law of optical reflection, the CMOS imaging plane is divided into several non-overlapping imaging regions, so that several virtual observation fields of view are formed which image stars and near-Earth objects respectively; an image pre-processing program then extracts information such as the star-spot centroids and the Earth edge; and the data-processing system uses this information to carry out star-map identification and the navigation solution, computing the navigation information of the spacecraft. The principle is shown schematically in Fig. 1.
Before an optical sensor is put into use, its internal and external parameters must be measured; this is referred to as calibration of the optical sensor. The multi-field-of-view composite optical sensor is structurally similar to a star sensor with four virtual fields of view, so its calibration can, in principle, follow star-sensor calibration methods. Traditional star-sensor calibration usually relies on a two-axis turntable and a single-star simulator; it exploits the fact that the simulated starlight is parallel light and models all of the star-sensor parameters in a unified way. The detailed procedure is described in Chinese patent ZL2005101125537, 'A method for calibrating the internal and external parameters of a star sensor'.
This approach is now widely used for the calibration and accuracy testing of conventional star sensors, but it is no longer applicable to the novel multi-field-of-view composite optical sensor, mainly for the following reasons:
(1) The four virtual optical axes of the sensor point in four different directions and the light-entrance directions are nearly perpendicular to the main optical axis. The sensor cannot be mounted under two-axis turntable conditions, and a two-axis turntable cannot move as required by the calibration so that the star spots cover the whole field of view along straight lines.
(2) The optical system of the multi-field-of-view composite optical sensor contains an optical prism, a lens and an imaging device, so its parameters are more complex. Because the prism parameters are coupled with the sensor mounting parameters and the lens parameters in a similar way, unified modelling with the traditional calibration method either fails to converge or yields extremely low fitting accuracy.
Owing to these limitations of the device, the modelling and the computation method, the traditional two-axis-turntable star-sensor calibration method is not suitable for calibrating the novel multi-field-of-view composite optical sensor.
Summary of the invention
The technical problem addressed by the invention is as follows: in view of the difficulties in calibrating a multi-field-of-view composite optical sensor, a calibration device and method for such a sensor are provided in which the calibration is divided into two parts, single-sensor calibration and overall calibration of the composite optical sensor, each part is modelled and computed separately, and a nonlinear fitting iteration is used to obtain the best estimates of the parameters.
The technical solution of the invention is a calibration device for a multi-field-of-view composite optical sensor, comprising a three-axis high-precision turntable, a marble platform for supporting a single-star simulator, a single-star simulator for simulating the incident starlight, the multi-field-of-view composite optical sensor and a data-processing computer. The single-star simulator is mounted horizontally on the marble platform and the multi-field-of-view composite optical sensor is mounted on the three-axis high-precision turntable; the optical axis of the composite optical sensor is parallel to the inner axis of the three-axis high-precision turntable, and the starlight emitted by the single-star simulator is perpendicular to this axis.
A calibration method for a multi-field-of-view composite optical sensor is carried out in two steps, namely single-sensor calibration and overall calibration of the composite sensor, where the single sensor refers to the multi-field-of-view composite optical sensor with the optical prism removed. After the first-step single-sensor calibration, the mounting parameters and internal parameters of the single sensor are obtained; these results are substituted into the second-step overall calibration of the composite sensor and take part in the iterative computation, and the prism parameters are obtained by nonlinear least-squares fitting. The implementation steps are as follows:
(1) Calibration of the single sensor
The single sensor is calibrated with the three-axis turntable. During the first-step single-sensor calibration the optical prism is not installed on the multi-field-of-view optical sensor (this configuration is referred to as the single sensor). The inner axis of the three-axis turntable is adjusted so that the optical axis of the single sensor is parallel to the starlight incidence direction of the single-star simulator. The mounting error matrix, focal length, optical principal point and lens distortion coefficients of the single sensor can then be obtained with the traditional calibration method; the detailed procedure is given in 'A method for calibrating the internal and external parameters of a star sensor' (patent No. ZL2005101125537).
(2) Overall calibration of the multi-field-of-view composite optical sensor
On the basis of step (1), the mounting position of the single sensor is kept unchanged and the optical four-faced prism is installed correctly. The three-axis turntable is adjusted so that the optical axis of the composite optical sensor is perpendicular to the starlight incidence direction of the single-star simulator, and the calibration then proceeds as follows;
(2.1) Establishing the measurement coordinate frames
A sensor coordinate frame and a turntable zero-position coordinate frame are established; the transformation between them is determined by the angles through which the three-axis turntable has rotated;
the sensor coordinate frame M is defined as follows: the origin O_M is the optical centre of the lens of the multi-field-of-view composite optical sensor, X_M is parallel to the row direction of the image sensor, Y_M is parallel to the column direction of the image sensor, and Z_M lies along the optical axis of the lens and is determined by the right-hand rule; the frame is denoted O_M-X_M Y_M Z_M;
the turntable zero-position coordinate frame N is defined as follows: it is the frame determined by the turntable rotation axes in the zero position, with its origin at the centre of rotation of the turntable; X_R is the rotation axis of the inner frame of the three-axis turntable, Y_R is the rotation axis of the middle frame, and Z_R is the rotation axis of the outer frame; the frame is denoted O_R-X_R Y_R Z_R;
(2.2) Modelling the prism incidence vector
The starlight vector that is emitted by the single-star simulator and is incident on a surface of the optical four-faced prism is defined as the prism incidence vector $V_1$. The main factors affecting the prism incidence vector are the initial pointing deviation of the starlight vector $V_0$, the sensor mounting deviation $R_w$ and the turntable transformation matrix $R_r$; they are related by
$V_1 = R_r\,R_w\,V_0$
Writing $V_1 = \begin{bmatrix} v_{11} & v_{12} & v_{13} \end{bmatrix}^T$, the components evaluate to
$v_{11} = z_0\cos\alpha\cos\beta\cos\omega_2 + z_1\sin\alpha\cos\beta\cos\omega_2 + z_2\sin\beta\cos\omega_2 + z_3\cos\alpha\cos\beta\cos\omega_1\sin\omega_2 + z_4\sin\alpha\cos\beta\cos\omega_1\sin\omega_2 + z_5\sin\beta\cos\omega_1\sin\omega_2 + z_6\cos\alpha\cos\beta\sin\omega_1\sin\omega_2 + z_7\sin\alpha\cos\beta\sin\omega_1\sin\omega_2 + z_8\sin\beta\sin\omega_1\sin\omega_2$
$v_{12} = -z_0\cos\alpha\cos\beta\sin\omega_2 - z_1\sin\alpha\cos\beta\sin\omega_2 - z_2\sin\beta\sin\omega_2 + z_3\cos\alpha\cos\beta\cos\omega_1\cos\omega_2 + z_4\sin\alpha\cos\beta\cos\omega_1\cos\omega_2 + z_5\sin\beta\cos\omega_1\cos\omega_2 + z_6\cos\alpha\cos\beta\sin\omega_1\cos\omega_2 + z_7\sin\alpha\cos\beta\sin\omega_1\cos\omega_2 + z_8\sin\beta\sin\omega_1\cos\omega_2$
$v_{13} = -z_3\cos\alpha\cos\beta\sin\omega_1 - z_4\sin\alpha\cos\beta\sin\omega_1 - z_5\sin\beta\sin\omega_1 + z_6\cos\alpha\cos\beta\cos\omega_1 + z_7\sin\alpha\cos\beta\cos\omega_1 + z_8\sin\beta\cos\omega_1$
where α and β are the quasi right ascension and quasi declination of the starlight vector in frame N; $z_0$ ... $z_8$ are the elements of the matrix $R_w$ from upper left to lower right, whose values are obtained from the calibration of step (1); $\omega_1$ is the rotation angle of the turntable about the inner axis from the zero position; and $\omega_2$ is the rotation angle of the turntable about the outer axis from the zero position.
(2.3) Modelling the prism reflection vector
The vector that enters the optical lens after being reflected from a surface of the four-faced prism is defined as the prism reflection vector $V_r$. The main factors affecting the prism reflection vector are the normal vector $N_1$ of the corresponding reflecting face (here taken to be the face of field of view 1) and the prism incidence vector $V_1$.
According to the law of plane (specular) reflection, they are related by
$V_r = V_1 - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\,N_1$
Writing $V_r = \begin{bmatrix} r_1 & r_2 & r_3 \end{bmatrix}^T$ and $N_1 = \begin{bmatrix} \cos\varphi\cos\gamma & \cos\varphi\sin\gamma & \sin\varphi \end{bmatrix}^T$, where γ and φ are the quasi right ascension and quasi declination of the face normal in frame N, the components evaluate to
$r_1 = v_{11} - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\cos\varphi\cos\gamma$
$r_2 = v_{12} - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\cos\varphi\sin\gamma$
$r_3 = v_{13} - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\sin\varphi$
with $V_1 \cdot N_1 = v_{11}\cos\varphi\cos\gamma + v_{12}\cos\varphi\sin\gamma + v_{13}\sin\varphi$; substituting the expressions for $v_{11}$, $v_{12}$ and $v_{13}$ from (2.2) gives the fully expanded expressions for $r_1$, $r_2$ and $r_3$ in terms of α, β, γ, φ, $\omega_1$, $\omega_2$ and $z_0$ ... $z_8$.
(2.4) Modelling the perspective-projection imaging
The imaging of the reflected light through the lens onto the image-sensor plane can be regarded as a perspective projection; the image-point position is
$x = -\dfrac{f}{Dx}\,\dfrac{v_{11}}{v_{13}} + X_0 + \bar{x}\,(q_1 r^2 + q_2 r^4) + \left[\,p_1(r^2 + 2\bar{x}^2) + 2 p_2 \bar{x}\bar{y}\,\right]$
$y = -\dfrac{f}{Dy}\,\dfrac{v_{12}}{v_{13}} + Y_0 + \bar{y}\,(q_1 r^2 + q_2 r^4) + \left[\,p_2(r^2 + 2\bar{y}^2) + 2 p_1 \bar{x}\bar{y}\,\right]$
$\bar{x} = x' - X_0,\qquad \bar{y} = y' - Y_0,\qquad r^2 = \bar{x}^2 + \bar{y}^2$
where f is the lens focal length in mm; $(X_0, Y_0)$ is the optical principal point in pixels; $p_1$, $p_2$, $q_1$, $q_2$ are the lens distortion coefficients, all of which are obtained from the calibration of step (1); and (Dx, Dy) is the pixel size in mm.
(2.5) Data acquisition and data processing
The above models establish the correspondence between the turntable angular position, the parameters of the composite sensor and the star-spot image coordinates. By collecting the star-spot image coordinates at different turntable positions, the sensor parameters can be fitted with the nonlinear least-squares method.
(2.5.1)
The outer and inner axes of the turntable are rotated so that the star spots cover the whole field of view along straight lines. At each position n frames of data are acquired, with n between 100 and 1000, and the corresponding turntable angular position is recorded at the same time. The star-spot coordinates at each position are averaged:
$\bar{x} = \dfrac{1}{n}\sum_{i=1}^{n} x_i,\qquad \bar{y} = \dfrac{1}{n}\sum_{i=1}^{n} y_i$
(2.5.2)
Using the turntable angles recorded in (2.5.1), the prism incidence vector corresponding to each turntable position is computed from the incidence-vector model of (2.2), i.e. the turntable angles $(\theta_1, \theta_2, \theta_3)$ give the prism incidence vector $V_1 = \begin{bmatrix} v_{11} & v_{12} & v_{13} \end{bmatrix}^T$.
Using the star-spot image coordinates obtained in (2.5.1), the prism reflection vector corresponding to each star-spot coordinate is computed from the perspective-projection imaging model of (2.4), i.e. the star-spot position $(\bar{x}, \bar{y})$ gives the measured prism reflection vector $\bar{V}_r = \begin{bmatrix} \bar{r}_1 & \bar{r}_2 & \bar{r}_3 \end{bmatrix}^T$.
(2.5.3)
The second step of the two-step calibration of the composite optical sensor mainly involves four parameters, collected in the vector $\vec{x} = (\alpha, \beta, \gamma, \varphi)^T$. From the prism reflection model of (2.3) above,
$V_r = V_1 - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\,N_1 = F(\vec{x})$
that is,
$r_1 = v_{11} - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\,n_{11} = F_{r_1}(\vec{x})$
$r_2 = v_{12} - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\,n_{12} = F_{r_2}(\vec{x})$
$r_3 = v_{13} - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\,n_{13} = F_{r_3}(\vec{x})$
where $N_1 = \begin{bmatrix} n_{11} & n_{12} & n_{13} \end{bmatrix}^T$.
Let the reflection vector estimated from the model be $\hat{V}_r = \begin{bmatrix} \hat{r}_1 & \hat{r}_2 & \hat{r}_3 \end{bmatrix}^T$. Since $F(\vec{x})$ is a nonlinear function, a nonlinear least-squares iteration can be used to estimate the parameter vector $\vec{x}$. If $\Delta\vec{x}$ is the estimation deviation of the parameter vector, then
$\Delta r_1 = \bar{r}_1 - \hat{r}_1 = A\,\Delta\vec{x}$
$\Delta r_2 = \bar{r}_2 - \hat{r}_2 = B\,\Delta\vec{x}$
$\Delta r_3 = \bar{r}_3 - \hat{r}_3 = C\,\Delta\vec{x}$
$A = \left( \dfrac{\partial F_{r_1}}{\partial\alpha}, \dfrac{\partial F_{r_1}}{\partial\beta}, \dfrac{\partial F_{r_1}}{\partial\gamma}, \dfrac{\partial F_{r_1}}{\partial\varphi} \right)$
$B = \left( \dfrac{\partial F_{r_2}}{\partial\alpha}, \dfrac{\partial F_{r_2}}{\partial\beta}, \dfrac{\partial F_{r_2}}{\partial\gamma}, \dfrac{\partial F_{r_2}}{\partial\varphi} \right)$
$C = \left( \dfrac{\partial F_{r_3}}{\partial\alpha}, \dfrac{\partial F_{r_3}}{\partial\beta}, \dfrac{\partial F_{r_3}}{\partial\gamma}, \dfrac{\partial F_{r_3}}{\partial\varphi} \right)$
Here A, B and C are sensitivity matrices. Let m be the number of star spots taking part in the computation, and let
$P = \begin{bmatrix} \Delta r_{11} & \cdots & \Delta r_{1m} \\ \Delta r_{21} & \cdots & \Delta r_{2m} \\ \Delta r_{31} & \cdots & \Delta r_{3m} \end{bmatrix},\qquad M = \begin{bmatrix} A_1 & \cdots & A_m \\ B_1 & \cdots & B_m \\ C_1 & \cdots & C_m \end{bmatrix}$
where P is formed from the deviations of the three components and M is the overall sensitivity matrix assembled from the sensitivity matrices A, B and C. The iterative equation is
$\Delta\vec{x}^{(k+1)} = \Delta\vec{x}^{(k)} - \left( M_k^{T} M_k \right)^{-1} M_k^{T} P^{(k)}$
where k is the iteration index. The stationary value of $\vec{x}$ after the iteration has converged is the parameter calibration result of the second step. Combining the results of the first and second steps gives the calibration values of all parameters and completes the calibration of the composite optical sensor.
Because its measurement principle and structure differ from those of an ordinary star sensor, the multi-field-of-view composite optical sensor cannot be calibrated with traditional star-sensor calibration methods and devices. The invention adopts a two-step calibration method based on a three-axis turntable, establishes the measurement model of the multi-field-of-view composite optical sensor, and fits the sensor parameters by nonlinear least squares. Compared with traditional calibration methods, the invention has the following advantages.
(1) A new calibration device makes the calibration process simpler and more reliable. The novel multi-field-of-view composite optical sensor has four virtual fields of view pointing in different directions, with the virtual optical axes at an angle to the main optical axis. With a traditional two-axis turntable, mounting the sensor for calibration is complicated and the acquired image points are nonlinearly distributed, which greatly reduces the calibration accuracy. The invention uses a high-precision three-axis turntable, a single-star simulator and a test computer as the calibration device, which removes the complicated mounting and alignment steps and allows linearly distributed test star spots to be obtained through the turntable motion, so the calibration method is simpler and more reliable.
(2) A measurement model of the multi-field-of-view composite optical sensor is established and a two-step calibration method is used, which reduces the coupling between parameters and improves the calibration accuracy. The optical system of the novel sensor contains a lens, an optical four-faced prism and an imaging chip, so its parameters are relatively complex. Because the four-faced prism parameters, the sensor mounting parameters and the lens parameters have similar characteristics, coupling occurs when they are estimated together with the traditional method. The invention calibrates the single sensor and the complete composite optical sensor separately and separates out the fitting of the four-faced prism parameters, which avoids introducing errors between different parameters and improves the calibration accuracy.
Brief description of the drawings
Fig. 1 is a schematic diagram of the measurement principle of the multi-field-of-view composite optical sensor;
Fig. 2 shows the calibration device for the multi-field-of-view composite optical sensor of the invention;
Fig. 3 is a schematic diagram of the single-sensor calibration of the invention;
Fig. 4 shows the measurement coordinate frames of the multi-field-of-view composite optical sensor of the invention;
Fig. 5 illustrates the prism reflection vector modelling of the invention;
Fig. 6 is a schematic diagram of the mirror-reflection relation of the invention;
Fig. 7 shows the star-spot acquisition pattern used in the invention;
Fig. 8 is a schematic diagram of the solution process.
Detailed description of the embodiments
The principle of the multi-field-of-view composite optical sensor is shown schematically in Fig. 1; it differs from a traditional star sensor. The optical system of the multi-field-of-view composite optical sensor consists of a four-faced prism 7 and a lens 8. According to the law of optical reflection, the CMOS imaging plane is divided into several non-overlapping imaging regions, thereby creating several virtual observation fields of view that image stars and near-Earth objects respectively. Autonomous celestial navigation can then be realized by the subsequent algorithms.
The two-step calibration device for a multi-field-of-view composite optical sensor of the invention consists of a three-axis high-precision turntable 1, a single-star simulator 2, a marble platform 3 for supporting the single-star simulator, a data-processing computer 4 and the multi-field-of-view composite optical sensor 5. The single-star simulator 2 is mounted horizontally on the marble platform 3 and the multi-field-of-view composite optical sensor 5 is mounted on the inner frame of the three-axis turntable 1; the detailed mounting arrangement is shown in Fig. 2. The optical axis of the multi-field-of-view composite optical sensor 5 is parallel to the inner axis of the three-axis high-precision turntable 1, and the starlight emitted by the single-star simulator 2 is perpendicular to this axis.
The calibration of the multi-field-of-view composite optical sensor according to the invention is carried out in two steps, namely single-sensor calibration and overall calibration of the composite sensor. After the first-step calibration, the mounting parameters and internal parameters of the single sensor are obtained; these results are substituted into the second step to take part in the iterative computation. This markedly weakens the coupling between parameters, so that an accurate and reliable result is finally obtained.
1. Calibration of the single sensor
The multi-field-of-view composite optical sensor uses an optical prism to create several virtual fields of view, so that one sensor serves several purposes. The part of the sensor without the optical prism can be regarded as a single sensor, and its parameters can be calibrated with the traditional star-sensor calibration method. The first step of the two-step method calibrates this single-sensor part with the three-axis turntable.
During the first-step single-sensor calibration the optical prism is not installed on the multi-field-of-view optical sensor; this configuration is referred to as the single sensor 6, and its mounting is shown in Fig. 3. Using the calibration procedure described in patent ZL2005101125537, 'A method for calibrating the internal and external parameters of a star sensor', the mounting error matrix, focal length, optical principal point, lens distortion coefficients and related parameters of the single sensor can be obtained. The main parameters obtained from the first-step calibration and their meanings are listed in the table below:
Table 1. First-step calibration parameters and their meanings

Parameter            Meaning
f                    Lens focal length
(X_0, Y_0)           Optical principal point
p_1, p_2, q_1, q_2   Lens distortion coefficients
R_w                  Sensor mounting deviation matrix
2. Overall calibration of the multi-field-of-view composite optical sensor
The second step of the two-step method is the overall calibration of the multi-field-of-view composite optical sensor. On the basis of the first step, the mounting position of the sensor is kept unchanged and the prism is assembled. The three-axis turntable is adjusted to the configuration shown in Fig. 2, so that the optical axis of the composite optical sensor is perpendicular to the starlight incidence direction of the single-star simulator, and the calibration proceeds as follows:
2.1 Establishing the measurement coordinate frames
The two calibration steps use the same coordinate-frame definitions, mainly the sensor coordinate frame and the turntable zero-position coordinate frame; the relation between the first-step and second-step frames is determined by the angles through which the turntable has rotated.
The sensor coordinate frame M is defined as follows: the origin O_M is the optical centre of the lens of the multi-field-of-view composite optical sensor, X_M is parallel to the row direction of the image sensor, Y_M is parallel to the column direction of the image sensor, and Z_M lies along the optical axis of the lens and is determined by the right-hand rule, see Fig. 4. The frame is denoted O_M-X_M Y_M Z_M.
The turntable zero-position coordinate frame N is defined as follows: it is the frame determined by the turntable rotation axes in the zero position, with its origin at the centre of rotation of the turntable; X_R is the rotation axis of the inner frame of the three-axis turntable, Y_R is the rotation axis of the middle frame, and Z_R is the rotation axis of the outer frame. The frame is denoted O_R-X_R Y_R Z_R, as shown in Fig. 4.
2.2 Modelling the prism incidence vector
The starlight vector that is emitted by the single-star simulator and is incident on a surface of the optical four-faced prism is defined as the prism incidence vector $V_1$. The main factors affecting the prism incidence vector are the initial pointing deviation of the starlight vector $V_0$, the sensor mounting deviation $R_w$ and the turntable transformation matrix $R_r$.
The starlight initial pointing deviation $V_0$ is the direction vector, in the turntable zero-position frame N, of the starlight emitted by the star simulator; it is a normalized 3 × 1 vector:
$V_0 = \begin{bmatrix} \cos\alpha\cos\beta \\ \sin\alpha\cos\beta \\ \sin\beta \end{bmatrix}$
The turntable transformation matrix $R_r$ is the rotation matrix corresponding to a rotation of the turntable from the zero position by $\theta_1$ about the $X_R$ axis followed by $\theta_2$ about the $Z_R$ axis; it is expressed as
$R_r = \begin{bmatrix} \cos\theta_2 & 0 & -\sin\theta_2 \\ 0 & 1 & 0 \\ \sin\theta_2 & 0 & \cos\theta_2 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\theta_1 & \sin\theta_1 \\ 0 & -\sin\theta_1 & \cos\theta_1 \end{bmatrix}$
According to the above model, the direction vector $V_1$ of the starlight in the sensor frame M is
$V_1 = R_r\,R_w\,V_0$
Writing $V_1 = \begin{bmatrix} v_{11} & v_{12} & v_{13} \end{bmatrix}^T$, the components evaluate to
$v_{11} = z_0\cos\alpha\cos\beta\cos\omega_2 + z_1\sin\alpha\cos\beta\cos\omega_2 + z_2\sin\beta\cos\omega_2 + z_3\cos\alpha\cos\beta\cos\omega_1\sin\omega_2 + z_4\sin\alpha\cos\beta\cos\omega_1\sin\omega_2 + z_5\sin\beta\cos\omega_1\sin\omega_2 + z_6\cos\alpha\cos\beta\sin\omega_1\sin\omega_2 + z_7\sin\alpha\cos\beta\sin\omega_1\sin\omega_2 + z_8\sin\beta\sin\omega_1\sin\omega_2$
$v_{12} = -z_0\cos\alpha\cos\beta\sin\omega_2 - z_1\sin\alpha\cos\beta\sin\omega_2 - z_2\sin\beta\sin\omega_2 + z_3\cos\alpha\cos\beta\cos\omega_1\cos\omega_2 + z_4\sin\alpha\cos\beta\cos\omega_1\cos\omega_2 + z_5\sin\beta\cos\omega_1\cos\omega_2 + z_6\cos\alpha\cos\beta\sin\omega_1\cos\omega_2 + z_7\sin\alpha\cos\beta\sin\omega_1\cos\omega_2 + z_8\sin\beta\sin\omega_1\cos\omega_2$
$v_{13} = -z_3\cos\alpha\cos\beta\sin\omega_1 - z_4\sin\alpha\cos\beta\sin\omega_1 - z_5\sin\beta\sin\omega_1 + z_6\cos\alpha\cos\beta\cos\omega_1 + z_7\sin\alpha\cos\beta\cos\omega_1 + z_8\sin\beta\cos\omega_1$
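The incidence-vector model above can be evaluated numerically in a few lines. The following is a minimal sketch (illustrative only, not the patented implementation) that composes the three factors $R_r$, $R_w$ and $V_0$ with numpy; the mounting deviation matrix Rw is assumed to come from the first-step calibration, and the function and variable names are chosen here for illustration.

    import numpy as np

    def incidence_vector(alpha, beta, theta1, theta2, Rw):
        """Starlight direction V1 in the sensor frame for one turntable position."""
        # Initial starlight direction V0 in the turntable zero-position frame N
        V0 = np.array([np.cos(alpha) * np.cos(beta),
                       np.sin(alpha) * np.cos(beta),
                       np.sin(beta)])
        # Rotation by theta1 about the inner axis ...
        R_inner = np.array([[1.0, 0.0, 0.0],
                            [0.0, np.cos(theta1), np.sin(theta1)],
                            [0.0, -np.sin(theta1), np.cos(theta1)]])
        # ... followed by the second turntable rotation by theta2, as in R_r above
        R_second = np.array([[np.cos(theta2), 0.0, -np.sin(theta2)],
                             [0.0, 1.0, 0.0],
                             [np.sin(theta2), 0.0, np.cos(theta2)]])
        Rr = R_second @ R_inner
        return Rr @ Rw @ V0

    # Example: nominal mounting (Rw = identity), simulator pointing alpha = -90 deg, beta = 0
    V1 = incidence_vector(np.radians(-90.0), 0.0, np.radians(3.0), np.radians(2.0), np.eye(3))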
2.3 Modelling the prism reflection vector
The vector that enters the optical lens after being reflected from a surface of the optical four-faced prism is defined as the prism reflection vector $V_r$. The main factors affecting the prism reflection vector are the normal vector $N_1$ of the corresponding reflecting face (here taken to be the face of field of view 1) and the prism incidence vector $V_1$.
The optical four-faced prism has four reflecting faces, which the invention calibrates separately; the description here uses the reflecting face corresponding to field of view 1, and the other faces are calibrated in the same way. As shown in Fig. 5, when the prism face corresponding to field of view 1 is calibrated, the normal vector of this reflecting plane in the global coordinate frame is assumed to be
$N_1 = \begin{bmatrix} \cos\varphi\cos\gamma \\ \cos\varphi\sin\gamma \\ \sin\varphi \end{bmatrix}$
where γ and φ are the quasi right ascension and quasi declination of the normal vector in the global coordinate frame.
The incidence vector strikes a surface of the four-faced prism and is reflected to form the prism reflection vector $V_r$. As shown in Fig. 6, according to the mirror-reflection relation, the prism incidence vector $V_1$, the prism reflection vector $V_r$ and a scalar multiple of the normal vector $N_1$ of the field-of-view-1 plane form an isosceles triangle ABC:
$\lambda \vec{N}_1 = \vec{V}_r - \vec{V}_1$
In triangle ABC, by the law of cosines,
$BC^2 = \lambda^2 = AB^2 + AC^2 - 2\,AB \cdot AC\,\cos\theta_1 = |\vec{V}_1|^2 + |\vec{V}_r|^2 - 2\,|\vec{V}_1|\,|\vec{V}_r|\cos\theta_1$
From the angular relation
$\theta_1 = \pi - 2\theta_2$
and, for the angle $\theta_3$ between the vectors $N_1$ and $V_1$,
$\theta_3 = \arccos\dfrac{\vec{V}_1 \cdot \vec{N}_1}{|\vec{V}_1|\,|\vec{N}_1|}$
with
$\theta_2 = \pi - \theta_3$
Combining the above equations gives
$V_r = V_1 - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\,N_1$
Writing $V_r = \begin{bmatrix} r_1 & r_2 & r_3 \end{bmatrix}^T$ and $N_1 = \begin{bmatrix} n_{11} & n_{12} & n_{13} \end{bmatrix}^T$, the components are
$r_1 = v_{11} - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\,n_{11}$
$r_2 = v_{12} - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\,n_{12}$
$r_3 = v_{13} - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\,n_{13}$
With $n_{11} = \cos\varphi\cos\gamma$, $n_{12} = \cos\varphi\sin\gamma$, $n_{13} = \sin\varphi$ and $V_1 \cdot N_1 = v_{11}\cos\varphi\cos\gamma + v_{12}\cos\varphi\sin\gamma + v_{13}\sin\varphi$, substituting the expressions for $v_{11}$, $v_{12}$ and $v_{13}$ from Section 2.2 yields the fully expanded expressions for $r_1$, $r_2$ and $r_3$ in terms of α, β, γ, φ, $\omega_1$, $\omega_2$ and $z_0$ ... $z_8$.
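The reflection relation above is a plain mirror-reflection law, so a minimal numerical sketch is straightforward; the helper names below are chosen for illustration only, and the example values are the simulation settings quoted later in the text.

    import numpy as np

    def face_normal(gamma, phi):
        """Normal vector N1 of a reflecting face, parameterised by (gamma, phi) as above."""
        return np.array([np.cos(phi) * np.cos(gamma),
                         np.cos(phi) * np.sin(gamma),
                         np.sin(phi)])

    def reflection_vector(V1, N1):
        """Prism reflection vector Vr = V1 - 2 (V1 . N1) / (|V1| |N1|) N1."""
        scale = 2.0 * np.dot(V1, N1) / (np.linalg.norm(V1) * np.linalg.norm(N1))
        # For unit V1 and N1 this reduces to the ordinary specular-reflection formula.
        return V1 - scale * N1

    # Example with gamma = -45 deg, phi = -135 deg (see the simulation section)
    N1 = face_normal(np.radians(-45.0), np.radians(-135.0))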
2.4 Modelling the perspective-projection imaging
As indicated in Fig. 4, the imaging of the reflected light through the lens onto the image-sensor plane can be divided into a linear process and a nonlinear process: the linear process is the perspective-projection imaging and the nonlinear process is the lens distortion. The main factors affecting the projected point position are the optical principal point, the lens focal length f and the lens distortion coefficients; these parameters are obtained in the first-step single-sensor calibration, and their meanings are as follows:
A. $(X_0, Y_0)$ is the optical principal point, the intersection of the optical axis of the sensor lens assembly with the imaging plane of the image device;
B. f is the lens focal length in mm, i.e. the effective focal length of the optical system;
C. $(p_1, p_2, q_1, q_2)$ are the lens distortion coefficients, used to describe the distortion model of the optical system, where $q_1$ and $q_2$ are the first- and second-order radial distortion coefficients and $p_1$ and $p_2$ are the first- and second-order decentering distortion coefficients;
D. (Dx, Dy) is the pixel size in mm.
$x' = -\dfrac{f}{Dx}\,\dfrac{v_{11}}{v_{13}} + X_0,\qquad y' = -\dfrac{f}{Dy}\,\dfrac{v_{12}}{v_{13}} + Y_0$
$x = x' + \delta x,\qquad y = y' + \delta y$
The second-order distortion model adopted is
$\delta x = \bar{x}\,(q_1 r^2 + q_2 r^4) + \left[\,p_1(r^2 + 2\bar{x}^2) + 2 p_2 \bar{x}\bar{y}\,\right]$
$\delta y = \bar{y}\,(q_1 r^2 + q_2 r^4) + \left[\,p_2(r^2 + 2\bar{y}^2) + 2 p_1 \bar{x}\bar{y}\,\right]$
where
$\bar{x} = x' - X_0,\qquad \bar{y} = y' - Y_0,\qquad r^2 = \bar{x}^2 + \bar{y}^2$
According to this model, the measured reflection vector $\bar{V}_r = \begin{bmatrix} \bar{r}_1 & \bar{r}_2 & \bar{r}_3 \end{bmatrix}^T$ can be computed from the star-spot image coordinates $(\bar{x}, \bar{y})$.
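As a minimal sketch of the forward direction of this model (an illustration under the stated assumptions, with f, X0, Y0, Dx, Dy and the distortion coefficients taken from the first-step calibration; the function name is hypothetical), the pinhole projection and second-order distortion can be written as follows.

    import numpy as np

    def project_to_pixels(V, f, X0, Y0, Dx, Dy, p1, p2, q1, q2):
        """Distorted pixel coordinates (x, y) for a direction V entering the lens."""
        # Ideal pinhole projection, in pixels
        xp = -f * V[0] / V[2] / Dx + X0
        yp = -f * V[1] / V[2] / Dy + Y0
        # Second-order radial and decentering distortion about the principal point
        xb, yb = xp - X0, yp - Y0
        r2 = xb ** 2 + yb ** 2
        dx = xb * (q1 * r2 + q2 * r2 ** 2) + p1 * (r2 + 2 * xb ** 2) + 2 * p2 * xb * yb
        dy = yb * (q1 * r2 + q2 * r2 ** 2) + p2 * (r2 + 2 * yb ** 2) + 2 * p1 * xb * yb
        return xp + dx, yp + dy

Inverting this mapping (pixel coordinates back to a direction) is what step 2.5.2 uses to recover the measured reflection vectors from the star-spot centroids.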
2.5 Data acquisition and data processing
The modelling above establishes the correspondence between the turntable angular position $(\theta_1, \theta_2, \theta_3)$ and the star-spot image coordinates (x, y). By collecting many groups of turntable angles and image coordinates, the prism parameters of the sensor can be fitted with the nonlinear least-squares method.
2.5.1
When collecting data, the pattern shown in Fig. 7 is followed: the outer axis of the three-axis turntable is rotated from -6° to +6° and the inner axis from 0° to 12°, with one acquisition position per degree, and n frames of data are acquired at each position, where n may be 20 to 200. The turntable position is recorded at the same time, and the turntable is rotated position by position so that the imaging points cover the field of view; the mean value is then taken as the star-spot position for the current position:
$\bar{x} = \dfrac{1}{n}\sum_{i=1}^{n} x_i,\qquad \bar{y} = \dfrac{1}{n}\sum_{i=1}^{n} y_i$
where $x_i$, $y_i$ are the image coordinates of the star spot in each acquisition and $(\bar{x}, \bar{y})$ is the mean after n acquisitions.
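A small sketch of this acquisition step, assuming the regular grid described above (outer axis -6° to +6°, inner axis 0° to 12°, one position per degree) and per-frame centroids already extracted; names are illustrative.

    import numpy as np

    # Acquisition grid assumed from the description above (angles in degrees)
    collection_positions = [(outer, inner)
                            for outer in range(-6, 7)
                            for inner in range(0, 13)]

    def mean_centroid(frames):
        """Average n per-frame star-spot centroids; frames has shape (n, 2) = (x_i, y_i)."""
        frames = np.asarray(frames, dtype=float)
        return frames.mean(axis=0)   # (x_bar, y_bar)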
2.5.2
After many groups of turntable and star-spot data have been collected, the prism incidence vectors and reflection vectors are computed; the overall computation flow is shown in Fig. 8.
Using the turntable angles recorded in (2.5.1), the prism incidence vector corresponding to each turntable position is computed from the incidence-vector model of (2.2), i.e. the turntable angles $(\theta_1, \theta_2, \theta_3)$ give the prism incidence vector $V_1 = \begin{bmatrix} v_{11} & v_{12} & v_{13} \end{bmatrix}^T$.
Using the star-spot image coordinates obtained in (2.5.1), the prism reflection vector corresponding to each star-spot coordinate is computed from the perspective-projection imaging model of (2.4), i.e. the star-spot position $(\bar{x}, \bar{y})$ gives the measured prism reflection vector $\bar{V}_r = \begin{bmatrix} \bar{r}_1 & \bar{r}_2 & \bar{r}_3 \end{bmatrix}^T$.
2.5.3
The second step of the two-step calibration of the composite optical sensor mainly involves four parameters, collected in the vector $\vec{x} = (\alpha, \beta, \gamma, \varphi)^T$. From the prism reflection model of Section 2.3 above,
$V_r = V_1 - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\,N_1 = F(\vec{x})$
that is,
$r_1 = v_{11} - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\,n_{11} = F_{r_1}(\vec{x})$
$r_2 = v_{12} - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\,n_{12} = F_{r_2}(\vec{x})$
$r_3 = v_{13} - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\,n_{13} = F_{r_3}(\vec{x})$
where $N_1 = \begin{bmatrix} n_{11} & n_{12} & n_{13} \end{bmatrix}^T$.
Let the reflection vector estimated from the model be $\hat{V}_r = \begin{bmatrix} \hat{r}_1 & \hat{r}_2 & \hat{r}_3 \end{bmatrix}^T$. Since $F(\vec{x})$ is a nonlinear function, a nonlinear least-squares iteration can be used to estimate the parameter vector $\vec{x}$. If $\Delta\vec{x}$ is the estimation deviation of the parameter vector, then
$\Delta r_1 = \bar{r}_1 - \hat{r}_1 = A\,\Delta\vec{x}$
$\Delta r_2 = \bar{r}_2 - \hat{r}_2 = B\,\Delta\vec{x}$
$\Delta r_3 = \bar{r}_3 - \hat{r}_3 = C\,\Delta\vec{x}$
$A = \left( \dfrac{\partial F_{r_1}}{\partial\alpha}, \dfrac{\partial F_{r_1}}{\partial\beta}, \dfrac{\partial F_{r_1}}{\partial\gamma}, \dfrac{\partial F_{r_1}}{\partial\varphi} \right)$
$B = \left( \dfrac{\partial F_{r_2}}{\partial\alpha}, \dfrac{\partial F_{r_2}}{\partial\beta}, \dfrac{\partial F_{r_2}}{\partial\gamma}, \dfrac{\partial F_{r_2}}{\partial\varphi} \right)$
$C = \left( \dfrac{\partial F_{r_3}}{\partial\alpha}, \dfrac{\partial F_{r_3}}{\partial\beta}, \dfrac{\partial F_{r_3}}{\partial\gamma}, \dfrac{\partial F_{r_3}}{\partial\varphi} \right)$
Here A, B and C are sensitivity matrices. Let m be the number of star spots taking part in the computation, and let
$P = \begin{bmatrix} \Delta r_{11} & \cdots & \Delta r_{1m} \\ \Delta r_{21} & \cdots & \Delta r_{2m} \\ \Delta r_{31} & \cdots & \Delta r_{3m} \end{bmatrix},\qquad M = \begin{bmatrix} A_1 & \cdots & A_m \\ B_1 & \cdots & B_m \\ C_1 & \cdots & C_m \end{bmatrix}$
where P is formed from the deviations of the three components and M is the overall sensitivity matrix assembled from the sensitivity matrices A, B and C. The iterative equation is
$\Delta\vec{x}^{(k+1)} = \Delta\vec{x}^{(k)} - \left( M_k^{T} M_k \right)^{-1} M_k^{T} P^{(k)}$
where k is the iteration index. The stationary value of $\vec{x}$ after the iteration has converged is the parameter calibration result of the second step. Combining the results of the first and second steps gives the calibration values of all parameters and completes the calibration of the composite optical sensor.
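The iteration above is a Gauss-Newton least-squares update. The following minimal sketch (illustrative only; the patent derives the sensitivity matrices A, B, C analytically, whereas here the Jacobian is formed by numerical differentiation, and the update is folded into a direct parameter step) shows the structure of the solver. F is assumed to stack the model reflection-vector components for all m star spots into one vector, and r_meas holds the measured components in the same order.

    import numpy as np

    def gauss_newton(F, x0, r_meas, iters=20, eps=1e-6, tol=1e-10):
        """Estimate x = (alpha, beta, gamma, phi) by nonlinear least squares."""
        x = np.asarray(x0, dtype=float)
        for _ in range(iters):
            P = r_meas - F(x)                      # residual vector, length 3m
            # Overall sensitivity matrix M (3m x 4), here by central differences
            M = np.empty((P.size, x.size))
            for j in range(x.size):
                d = np.zeros_like(x)
                d[j] = eps
                M[:, j] = (F(x + d) - F(x - d)) / (2.0 * eps)
            step = np.linalg.solve(M.T @ M, M.T @ P)   # (M^T M)^{-1} M^T P
            x = x + step
            if np.linalg.norm(step) < tol:
                break
        return x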
Simulation and analysis of results
The basic parameters of the multi-field-of-view composite optical sensor used in the invention are as follows:
Field of view:
Pixel array: 1024 × 1024
Pixel size: 0.0055 mm × 0.0055 mm
Focal length: 16 mm
First, the traditional calibration method is used to carry out the first-step calibration of the single-sensor parameters, with 100 star spots and 100 frames of data acquired at each star-spot position. The parameter results obtained from the first-step calibration are shown in Table 1.
Table 1. First-step calibration parameter results
Among these parameters, ψ_1 and ψ_2 are the initial pointing deviations of the starlight vector in the first-step calibration, namely the alignment errors of the sensor about the X_R, Y_R and Z_R axes of the turntable zero-position coordinate frame, and p_1, p_2, q_1, q_2 are the lens distortion coefficients.
The above data are then used to simulate the second-step calibration process. Substituting the parameter values above into the prism incidence-vector model and the perspective-projection model gives all the quantities needed for the second step. In the second-step simulation, 169 star spots are simulated, and the direction of the starlight emitted by the single-star simulator, expressed in the turntable zero-position coordinate frame N, is set to
α = -90°
β = 0°
The normal-vector direction of the first reflecting face of the prism is set to
γ = -45°
φ = -135°
The final simulation results are shown in Table 2 below.
Table 2. Second-step calibration parameter results
When no noise is added, the iterative process of the parameters is as shown in Table 3 below:
Table 3. Solution process in the noise-free case
Iterations α β γ φ
1 -90.4829282170376 7.65589048154405 -45.8232671708237 -135.038766847946
2 -89.9868738731705 3.79852865739226 -48.0968931961025 -134.987051625700
3 -89.9998128035864 1.81910651865815 -49.0890586095482 -134.999806050946
4 -89.9999984302528 0.807849932575455 -49.5954606570209 -134.999998344193
5 -89.9999999933001 0.326847344511042 -49.8363278357344 -134.999999992827
6 -89.9999999999871 0.126899373289438 -49.9364537551969 -134.999999999986
7 -90.0000000000000 0.0488737763154895 -49.9755259026622 -135.000000000000
8 -90 0.0188033614024913 -49.9905840002326 -135
9 -90 0.00723375273135167 -49.9963776144894 -135
10 -90 0.00278292405861988 -49.9986064184489 -135
11 -90 0.00107064605555464 -49.9994638615386 -135
12 -90 0.000411901718484022 -49.9997937354233 -135
13 -90 0.000158468356804821 -49.9999206451268 -135
14 -90 6.09666033977422e-05 -49.9999694702641 -135
15 -90 2.34553348794335e-05 -49.9999882544682 -135
16 -90 9.02383927195226e-06 -49.9999954812075 -135
17 -90 3.47169044910738e-06 -49.9999982615106 -135
As the table shows, in the noise-free case an accurate result is obtained after 17 iterations, essentially identical to the values set in the simulation.
In the actual calibration process the introduction of noise must be considered. It arises mainly in two ways: image noise caused by imaging and star-spot centroid extraction, and noise caused by the turntable accuracy. In the simulation, 169 points are generated and zero-mean white Gaussian noise with a standard deviation of 1 arc second is added to the reflection vectors to simulate the actual noise. The computation process is shown in Table 4 below:
Table 4. Solution process with noise added
Iterations α β γ φ
1 -90.1630594012960 1.99580258626777 -48.9308417881941 -135.027433380954
2 -89.9990527093928 0.912700067667111 -49.5428548350697 -134.998730256157
3 -89.9998208190054 0.379594845002274 -49.8099000062779 -134.999703028728
4 -89.9998252772507 0.152384726648931 -49.9236777381909 -134.999707996842
5 -89.9998251598955 0.0620151329065337 -49.9689312712840 -134.999707902778
6 -89.9998251105641 0.0266031973187102 -49.9866641949034 -134.999707862658
7 -89.9998250913154 0.0127550328605804 -49.9935988222272 -134.999707847051
8 -89.9998250837905 0.00734036644275169 -49.9963102791897 -134.999707840954
9 -89.9998250808480 0.00522312443447393 -49.9973705127551 -134.999707838570
10 -89.9998250796974 0.00439521357833227 -49.9977850987614 -134.999707837638
11 -89.9998250792474 0.00407146843371694 -49.9979472179164 -134.999707837273
12 -89.9998250790715 0.00394487076850668 -49.9980106131732 -134.999707837131
13 -89.9998250790027 0.00389536575076853 -49.9980354033881 -134.999707837075
The values essentially stabilize after 13 iterations. The simulation results show that the method still converges well in the presence of noise, and the deviation between the least-squares estimate of the mirror normal vector and its true value is very small.
The accuracy criterion used in the simulation is as follows: the estimated parameter values are substituted back into the model; each reflection vector corresponds to a set of turntable angles, and comparing the turntable angles computed from the simulation with the actual turntable angles gives the RMS angular error of the calibration. By this criterion the calibration accuracy of the method is 2.36", which is comparable with the level of the Gaussian noise introduced. The calibration accuracy of the invention therefore meets the requirements, and the method is suitable for calibrating the multi-field-of-view composite optical sensor.
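The accuracy criterion itself reduces to an RMS of angular differences. A small sketch, assuming the recovered and commanded turntable angles are available as flat arrays in degrees (function name hypothetical):

    import numpy as np

    def rms_error_arcsec(angles_est_deg, angles_true_deg):
        """RMS difference between two sets of turntable angles, in arc seconds."""
        err = np.asarray(angles_est_deg, dtype=float) - np.asarray(angles_true_deg, dtype=float)
        return float(np.sqrt(np.mean(err ** 2)) * 3600.0)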
The above embodiments are provided only to illustrate the purpose of the invention and are not intended to limit its scope, which is defined by the following claims. All equivalent substitutions and modifications made without departing from the spirit and principles of the invention shall fall within the scope of the invention.

Claims (2)

1. A calibration device for a multi-field-of-view composite optical sensor, characterized by comprising: a three-axis high-precision turntable, a single-star simulator, a marble platform for supporting the single-star simulator, a data-processing computer and the multi-field-of-view composite optical sensor; wherein the single-star simulator is mounted horizontally on the marble platform, the multi-field-of-view composite optical sensor is mounted on the three-axis high-precision turntable, the main optical axis of the composite optical sensor is parallel to the inner axis of the three-axis high-precision turntable, and the starlight emitted by the single-star simulator is perpendicular to this axis.
2. A calibration method for a multi-field-of-view composite optical sensor, characterized in that: the calibration of the multi-field-of-view composite optical sensor is carried out in two steps, namely single-sensor calibration and overall calibration of the composite sensor, the single sensor referring to the multi-field-of-view composite optical sensor with the optical prism removed; the mounting parameters and internal parameters of the single sensor are obtained from the first-step single-sensor calibration, the results are substituted into the second-step overall calibration of the composite sensor to take part in the iterative computation, and the prism parameters are obtained by nonlinear least-squares fitting; the specific implementation steps are as follows:
(1) Calibration of the single sensor
The single sensor is calibrated with the three-axis turntable. During the first-step calibration the optical prism is not installed on the multi-field-of-view optical sensor; this configuration is referred to as the single sensor. The inner axis of the three-axis turntable is adjusted so that the optical axis of the single sensor is parallel to the starlight incidence direction of the single-star simulator, and the star-sensor calibration method is used to obtain the mounting error matrix, focal length, optical principal point, lens distortion coefficients and related parameters of the single sensor;
(2) Overall calibration of the multi-field-of-view composite optical sensor
On the basis of step (1), the mounting position of the single sensor is kept unchanged and the optical four-faced prism is installed correctly; the three-axis turntable is adjusted so that the optical axis of the composite optical sensor is perpendicular to the starlight incidence direction of the single-star simulator, and the calibration proceeds as follows;
(2.1) Establishing the measurement coordinate frames
A sensor coordinate frame and a turntable zero-position coordinate frame are established, the transformation between them being determined by the angles through which the three-axis turntable has rotated;
the sensor coordinate frame M is defined as follows: the origin O_M is the optical centre of the lens of the multi-field-of-view composite optical sensor, X_M is parallel to the row direction of the image sensor, Y_M is parallel to the column direction of the image sensor, and Z_M lies along the optical axis of the lens and is determined by the right-hand rule; the frame is denoted O_M-X_M Y_M Z_M;
the turntable zero-position coordinate frame N is defined as follows: it is the frame determined by the turntable rotation axes in the zero position, with its origin at the centre of rotation of the turntable; X_R is the rotation axis of the inner frame of the three-axis turntable, Y_R is the rotation axis of the middle frame, and Z_R is the rotation axis of the outer frame; the frame is denoted O_R-X_R Y_R Z_R;
(2.2) Modelling the prism incidence vector
The starlight vector that is emitted by the single-star simulator and is incident on a surface of the optical four-faced prism is defined as the prism incidence vector $V_1$; the factors affecting the prism incidence vector comprise the initial pointing deviation of the starlight vector $V_0$, the sensor mounting deviation $R_w$ and the turntable transformation matrix $R_r$, which are related by
$V_1 = R_r\,R_w\,V_0$
Writing $V_1 = \begin{bmatrix} v_{11} & v_{12} & v_{13} \end{bmatrix}^T$, the components evaluate to
$v_{11} = z_0\cos\alpha\cos\beta\cos\omega_2 + z_1\sin\alpha\cos\beta\cos\omega_2 + z_2\sin\beta\cos\omega_2 + z_3\cos\alpha\cos\beta\cos\omega_1\sin\omega_2 + z_4\sin\alpha\cos\beta\cos\omega_1\sin\omega_2 + z_5\sin\beta\cos\omega_1\sin\omega_2 + z_6\cos\alpha\cos\beta\sin\omega_1\sin\omega_2 + z_7\sin\alpha\cos\beta\sin\omega_1\sin\omega_2 + z_8\sin\beta\sin\omega_1\sin\omega_2$
$v_{12} = -z_0\cos\alpha\cos\beta\sin\omega_2 - z_1\sin\alpha\cos\beta\sin\omega_2 - z_2\sin\beta\sin\omega_2 + z_3\cos\alpha\cos\beta\cos\omega_1\cos\omega_2 + z_4\sin\alpha\cos\beta\cos\omega_1\cos\omega_2 + z_5\sin\beta\cos\omega_1\cos\omega_2 + z_6\cos\alpha\cos\beta\sin\omega_1\cos\omega_2 + z_7\sin\alpha\cos\beta\sin\omega_1\cos\omega_2 + z_8\sin\beta\sin\omega_1\cos\omega_2$
$v_{13} = -z_3\cos\alpha\cos\beta\sin\omega_1 - z_4\sin\alpha\cos\beta\sin\omega_1 - z_5\sin\beta\sin\omega_1 + z_6\cos\alpha\cos\beta\cos\omega_1 + z_7\sin\alpha\cos\beta\cos\omega_1 + z_8\sin\beta\cos\omega_1$
where α and β are the quasi right ascension and quasi declination of the starlight vector in frame N; $z_0$ ... $z_8$ are the elements of the matrix $R_w$ from upper left to lower right, whose values are obtained from the calibration of step (1); $\omega_1$ is the rotation angle of the turntable about the inner axis from the zero position; and $\omega_2$ is the rotation angle of the turntable about the outer axis from the zero position;
(2.3) Modelling the prism reflection vector
The vector that enters the optical lens after being reflected from a surface of the four-faced prism is defined as the prism reflection vector $V_r$; the factors affecting the prism reflection vector comprise the normal vector $N_1$ of the corresponding reflecting face, taken to be that of field of view 1, and the prism incidence vector $V_1$,
and according to the law of plane reflection they are related by
$V_r = V_1 - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\,N_1$
Writing $V_r = \begin{bmatrix} r_1 & r_2 & r_3 \end{bmatrix}^T$ and $N_1 = \begin{bmatrix} \cos\varphi\cos\gamma & \cos\varphi\sin\gamma & \sin\varphi \end{bmatrix}^T$, the components evaluate to
$r_1 = v_{11} - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\cos\varphi\cos\gamma$
$r_2 = v_{12} - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\cos\varphi\sin\gamma$
$r_3 = v_{13} - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\sin\varphi$
with $V_1 \cdot N_1 = v_{11}\cos\varphi\cos\gamma + v_{12}\cos\varphi\sin\gamma + v_{13}\sin\varphi$; substituting the expressions for $v_{11}$, $v_{12}$ and $v_{13}$ from (2.2) gives the fully expanded expressions for $r_1$, $r_2$ and $r_3$ in terms of α, β, γ, φ, $\omega_1$, $\omega_2$ and $z_0$ ... $z_8$;
(2.4) Modelling the perspective-projection imaging
The imaging of the reflected light through the lens onto the image-sensor plane is regarded as a perspective projection; the image-point position is
$x = -\dfrac{f}{Dx}\,\dfrac{v_{11}}{v_{13}} + X_0 + \bar{x}\,(q_1 r^2 + q_2 r^4) + \left[\,p_1(r^2 + 2\bar{x}^2) + 2 p_2 \bar{x}\bar{y}\,\right]$
$y = -\dfrac{f}{Dy}\,\dfrac{v_{12}}{v_{13}} + Y_0 + \bar{y}\,(q_1 r^2 + q_2 r^4) + \left[\,p_2(r^2 + 2\bar{y}^2) + 2 p_1 \bar{x}\bar{y}\,\right]$
$\bar{x} = x' - X_0,\qquad \bar{y} = y' - Y_0,\qquad r^2 = \bar{x}^2 + \bar{y}^2$
where f is the lens focal length, $(X_0, Y_0)$ is the optical principal point in pixels, (Dx, Dy) is the pixel size, and $p_1$, $p_2$, $q_1$, $q_2$ are the lens distortion coefficients;
(2.5) Data acquisition and data processing
The above models establish the correspondence between the turntable angular position, the parameters of the composite sensor and the star-spot image coordinates; by collecting the star-spot image coordinates at different turntable positions, the sensor parameters can be fitted with the nonlinear least-squares method;
(2.5.1)
The outer and inner axes of the turntable are rotated so that the star spots cover the whole field of view along straight lines; at each position n frames of data are acquired, with n between 100 and 1000, and the corresponding turntable angular position is recorded at the same time; the star-spot coordinates are averaged according to
$\bar{x} = \dfrac{1}{n}\sum_{i=1}^{n} x_i,\qquad \bar{y} = \dfrac{1}{n}\sum_{i=1}^{n} y_i$
(2.5.2)
Using the turntable angles recorded in (2.5.1), the prism incidence vector corresponding to each turntable position is computed from the incidence-vector model of (2.2), i.e. the turntable angles $(\theta_1, \theta_2, \theta_3)$ give the prism incidence vector $V_1 = \begin{bmatrix} v_{11} & v_{12} & v_{13} \end{bmatrix}^T$;
using the star-spot image coordinates obtained in (2.5.1), the prism reflection vector corresponding to each star-spot coordinate is computed from the perspective-projection imaging model of (2.4), i.e. the star-spot position $(\bar{x}, \bar{y})$ gives the measured prism reflection vector $\bar{V}_r = \begin{bmatrix} \bar{r}_1 & \bar{r}_2 & \bar{r}_3 \end{bmatrix}^T$;
(2.5.3)
The second step of the two-step calibration of the composite optical sensor mainly involves four parameters, collected in the vector $\vec{x} = (\alpha, \beta, \gamma, \varphi)^T$; from the prism reflection model of (2.4) above,
$V_r = V_1 - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\,N_1 = F(\vec{x})$
that is,
$r_1 = v_{11} - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\,n_{11} = F_{r_1}(\vec{x})$
$r_2 = v_{12} - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\,n_{12} = F_{r_2}(\vec{x})$
$r_3 = v_{13} - 2\,\dfrac{V_1 \cdot N_1}{|V_1|\,|N_1|}\,n_{13} = F_{r_3}(\vec{x})$
where $N_1 = \begin{bmatrix} n_{11} & n_{12} & n_{13} \end{bmatrix}^T$;
let the reflection vector estimated from the model be $\hat{V}_r = \begin{bmatrix} \hat{r}_1 & \hat{r}_2 & \hat{r}_3 \end{bmatrix}^T$; since $F(\vec{x})$ is a nonlinear function, a nonlinear least-squares iteration is adopted to estimate the parameter vector $\vec{x}$; if $\Delta\vec{x}$ is the estimation deviation of the parameter vector, then
$\Delta r_1 = \bar{r}_1 - \hat{r}_1 = A\,\Delta\vec{x}$
$\Delta r_2 = \bar{r}_2 - \hat{r}_2 = B\,\Delta\vec{x}$
$\Delta r_3 = \bar{r}_3 - \hat{r}_3 = C\,\Delta\vec{x}$
$A = \left( \dfrac{\partial F_{r_1}}{\partial\alpha}, \dfrac{\partial F_{r_1}}{\partial\beta}, \dfrac{\partial F_{r_1}}{\partial\gamma}, \dfrac{\partial F_{r_1}}{\partial\varphi} \right)$
$B = \left( \dfrac{\partial F_{r_2}}{\partial\alpha}, \dfrac{\partial F_{r_2}}{\partial\beta}, \dfrac{\partial F_{r_2}}{\partial\gamma}, \dfrac{\partial F_{r_2}}{\partial\varphi} \right)$
$C = \left( \dfrac{\partial F_{r_3}}{\partial\alpha}, \dfrac{\partial F_{r_3}}{\partial\beta}, \dfrac{\partial F_{r_3}}{\partial\gamma}, \dfrac{\partial F_{r_3}}{\partial\varphi} \right)$
here A, B and C are sensitivity matrices; let m be the number of star spots taking part in the computation, and let
$P = \begin{bmatrix} \Delta r_{11} & \cdots & \Delta r_{1m} \\ \Delta r_{21} & \cdots & \Delta r_{2m} \\ \Delta r_{31} & \cdots & \Delta r_{3m} \end{bmatrix},\qquad M = \begin{bmatrix} A_1 & \cdots & A_m \\ B_1 & \cdots & B_m \\ C_1 & \cdots & C_m \end{bmatrix}$
where P is formed from the deviations of the three components and M is the overall sensitivity matrix assembled from the sensitivity matrices A, B and C; the iterative equation is
$\Delta\vec{x}^{(k+1)} = \Delta\vec{x}^{(k)} - \left( M_k^{T} M_k \right)^{-1} M_k^{T} P^{(k)}$
where k is the iteration index; the stationary value of $\vec{x}$ after the iteration has converged is the parameter calibration result of the second step; combining the results of the first and second steps gives the calibration values of all parameters and completes the calibration of the composite optical sensor.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410676243.7A CN104406607B (en) 2014-11-21 2014-11-21 The caliberating device of a kind of many visual fields complex optics sensor and method

Publications (2)

Publication Number Publication Date
CN104406607A 2015-03-11
CN104406607B CN104406607B (en) 2016-04-27

Family

ID=52644254

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101013065A (en) * 2006-03-21 2007-08-08 北京航空航天大学 Pixel frequency based star sensor high accuracy calibration method
CN101046386A * 2007-03-16 2007-10-03 北京航空航天大学 Converting method and device for measuring datum of sun sensor
CN101082497A * 2007-07-13 2007-12-05 北京航空航天大学 Celestial body sensor measurement reference transformation method and apparatus

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9940524B2 (en) 2015-04-17 2018-04-10 General Electric Company Identifying and tracking vehicles in motion
US10872241B2 (en) 2015-04-17 2020-12-22 Ubicquia Iq Llc Determining overlap of a parking space by a vehicle
US10043307B2 (en) 2015-04-17 2018-08-07 General Electric Company Monitoring parking rule violations
US11328515B2 (en) 2015-04-17 2022-05-10 Ubicquia Iq Llc Determining overlap of a parking space by a vehicle
US10380430B2 (en) 2015-04-17 2019-08-13 Current Lighting Solutions, Llc User interfaces for parking zone creation
CN105956233A (en) * 2016-04-21 2016-09-21 清华大学 Sun-synchronous orbital satellite single view field star sensor installation direction design method
CN105956233B * 2016-04-21 2019-03-05 清华大学 Installation pointing design method for a single-field-of-view star sensor on a Sun-synchronous orbit satellite
CN106767901B (en) * 2016-11-25 2019-12-31 上海航天控制技术研究所 Star sensor rapid calibration method
CN106767901A * 2016-11-25 2017-05-31 上海航天控制技术研究所 Star sensor rapid calibration method
CN106767906A * 2016-11-30 2017-05-31 上海航天控制技术研究所 Method for calibrating principal point and focal length of digital sun sensor
CN106767906B (en) * 2016-11-30 2020-05-19 上海航天控制技术研究所 Method for calibrating principal point and focal length of digital sun sensor
CN109141468A * 2017-06-15 2019-01-04 北京航天计量测试技术研究所 Calibration device for the reference attitude angle of a spaceborne surveying and mapping system in a thermal vacuum environment
CN107588785A (en) * 2017-09-12 2018-01-16 中国人民解放军国防科技大学 Star sensor internal and external parameter simplified calibration method considering image point error
CN107588785B (en) * 2017-09-12 2019-11-05 中国人民解放军国防科技大学 Star sensor internal and external parameter simplified calibration method considering image point error
CN107966164A * 2017-11-28 2018-04-27 北京仿真中心 Celestial dome curtain starlight calibration method based on five-axis turntable
CN107966164B (en) * 2017-11-28 2020-06-02 北京仿真中心 Celestial dome curtain starlight calibration method based on five-axis turntable
CN108020244B (en) * 2018-02-05 2024-01-02 北京国电高科科技有限公司 Calibration device and method for star sensor reference cube mirror installation error
CN108020244A * 2018-02-05 2018-05-11 北京国电高科科技有限公司 Calibration device and method for star sensor reference cube mirror installation error
CN108645428A * 2018-05-10 2018-10-12 天津大学 Integral calibration method for a six-degree-of-freedom laser target
CN109459058A * 2018-11-16 2019-03-12 北京航天计量测试技术研究所 Calibration system and method for multi-field-of-view star sensors based on a three-axis turntable
CN109682395A * 2018-12-13 2019-04-26 上海航天控制技术研究所 Star sensor dynamic noise equivalent angle evaluation method and system
CN109579874A * 2018-12-14 2019-04-05 天津津航技术物理研究所 On-site north-calibration method for a photoelectric platform
CN110207723A * 2019-06-16 2019-09-06 西安应用光学研究所 Control accuracy testing method for an optronic tracker compound-axis control system
CN111070210A (en) * 2020-01-02 2020-04-28 中车青岛四方机车车辆股份有限公司 Workpiece positioning and calibrating method
CN114088060A (en) * 2020-08-24 2022-02-25 中国科学院长春光学精密机械与物理研究所 Satellite-ground camera imaging system for optical remote sensing satellite pointing measurement
CN113607188A (en) * 2021-08-02 2021-11-05 北京航空航天大学 Calibration system and method of multi-view-field star sensor based on theodolite cross-hair imaging
CN113847933A (en) * 2021-11-29 2021-12-28 华中光电技术研究所(中国船舶重工集团公司第七一七研究所) Astronomical navigation system shafting parameter calibration method

Also Published As

Publication number Publication date
CN104406607B (en) 2016-04-27

Similar Documents

Publication Publication Date Title
CN104406607B (en) Multi-visual field composite optical sensor calibration device and method
CN103323026B (en) Attitude reference deviation estimation and correction method for star sensor and payload
CN104154928B (en) Installation error calibrating method applicable to built-in star sensor of inertial platform
CN105910624B (en) Calibration method for the installation error of an inertial measurement unit optical aiming prism
CN100504301C (en) Celestial body sensor measurement reference transformation method and apparatus
CN104462776B (en) Absolute radiometric calibration method for a low-orbit Earth observation satellite using the Moon
CN105371844B (en) Initialization method for an inertial navigation system based on inertial/astronomical mutual aid
CN105068065B (en) In-orbit calibration method and system for a spaceborne laser altimeter
CN104729537B (en) In-orbit real-time compensation method for star sensor low-frequency error
CN103852085B (en) Field calibration method for a fiber-optic strapdown inertial navigation system based on least-squares fitting
CN104344836B (en) Attitude-observation-based system-level calibration method for fiber-optic gyroscopes in a redundant inertial navigation system
WO2013004033A1 (en) Precision measurement method and system for star sensor
CN105160125B (en) Simulation analysis method for star sensor quaternions
CN104567819B (en) Full-field drift angle determination and compensation method for a spaceborne camera
CN102426025B (en) Simulation analysis method for drift correction angle during remote sensing satellite attitude maneuver
CN103557841A (en) Method for improving photogrammetric precision of multi-camera resultant image
CN106052718A (en) Verifying method and apparatus based on POS equipment and digital aerial survey camera
CN107607127B (en) External field-based star sensor internal parameter calibration and precision rapid verification system
CN103217159A (en) Modeling and moving-base initial alignment method for an SINS/GPS/polarized-light integrated navigation system
CN102706363B (en) Precision measuring method of high-precision star sensor
CN109612438B (en) Method for determining initial orbit of space target under constraint of virtual coplanar condition
CN103310487B (en) Method for generating a universal imaging geometric model based on a time variable
CN102538820B (en) Calibration method of aerial remote sensing integrated system
CN104655153A (en) Method for calibrating elements of interior orientation of mapping camera based on matrix orthogonality
Li et al. High-accuracy self-calibration for smart, optical orbiting payloads integrated with attitude and position determination

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant