CN105547286A - Composite three-view-field star sensor star map simulation method

Info

Publication number
CN105547286A
Authority
CN
China
Prior art keywords
star
observation
optical surface
magnitude
visual field
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610015795.2A
Other languages
Chinese (zh)
Other versions
CN105547286B (en)
Inventor
吴峰
朱锡芳
相入喜
许清泉
李辉
邹全
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Zhixing Future Automobile Research Institute Co., Ltd
Original Assignee
Changzhou Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changzhou Institute of Technology
Priority to CN201610015795.2A priority Critical patent/CN105547286B/en
Publication of CN105547286A publication Critical patent/CN105547286A/en
Application granted granted Critical
Publication of CN105547286B publication Critical patent/CN105547286B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/02 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by astronomical means
    • G01C21/025 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by astronomical means with the use of startrackers

Abstract

The invention discloses a composite three-view-field star sensor star map simulation method comprising the following steps. Step 1: observation stars meeting the requirements are selected according to the limiting magnitude of the star sensor, and the star number, magnitude, right ascension and declination of each observation star are extracted and stored in an observation star database. Step 2: the observation stars falling in the three optical-system fields of view of the composite three-view-field star sensor are determined, and their incidence angles in the X and Y directions of the subsystem coordinate systems to which they belong are calculated. Step 3: magnitude and field-position data of false stars are generated at random according to the total number of false stars. Step 4: the imaging of the observation stars and false stars within each field of view by the corresponding optical system is simulated through ray tracing, and image-plane information is calculated. Step 5: the brightness of each pixel of the digital star map is calculated, and the digital star map is output. The method is simple and convenient to operate, is low in cost, does not depend on machined hardware, and can provide rich simulated star map data for other research on the composite three-view-field star sensor.

Description

A composite three-view-field star sensor star map simulation method
Technical field
The invention belongs to the technical field of celestial navigation and relates to a technique that uses a computer to simulate the imaging process of a composite three-view-field star sensor in order to obtain simulated star maps.
Background art
Effective attitude control is a necessary guarantee for the smooth flight of spacecraft such as satellites, and attitude measurement is the prerequisite of attitude control. A star sensor takes fixed stars as its targets: it photographs and identifies them and thereby measures attitude information, and it is currently one of the most accurate attitude measurement instruments. At present single-field-of-view star sensors are mainly used. This kind of star sensor usually has a large field angle, is easily affected by stray light such as sunlight, and has low reliability. Moreover, its attitude measurement accuracy for the roll angle is lower than that for the pitch and yaw angles. A spacecraft therefore often needs to carry two or more such star sensors to obtain sufficient reliability and measurement accuracy.
A multi-field-of-view star sensor combines several optical systems into one unit according to a given spatial distribution. Each optical system images the starry sky in its own field of view onto its own image sensor; the output star maps are combined for star map identification, and the attitude is then calculated and output. Compared with a single-field-of-view star sensor, each optical system of a multi-field-of-view star sensor has a small field of view, so higher attitude measurement accuracy is more easily obtained.
The existing star map simulation method for a composite three-view-field star sensor is to let three single-field-of-view star sensors photograph star simulators and output star maps. Because the optical systems of the three fields of view of the composite three-view-field star sensor satisfy a fixed spatial geometric relationship and their imaging must stay synchronized, this simulation method requires the simulated sky output by each star simulator to be consistent with the sky observed by the corresponding field of view of the star sensor and to change synchronously. The system structure of this simulation method is complicated and its operation is cumbersome.
"A star map identification method for a multi-field-of-view star sensor" (publication number CN103363987A) is in fact a star map identification method for two fields of view: its technical scheme transforms the star image coordinates of the stars in the other field of view into the image space coordinates of the first field of view and then applies the star map identification method of a double-field-of-view star sensor. That method focuses on the real-time performance of star map identification, places high demands on alignment, and cannot perform star image simulation.
Summary of the invention
In view of the problems in the prior art, the object of the invention is to provide a composite three-view-field star sensor star map simulation method that supplies abundant star map data for studying techniques of the composite three-view-field star sensor such as star spot location, guide star selection and star map identification, and that provides technical support for checking the imaging quality of the optical lenses during the design stage of the composite three-view-field star sensor optical systems.
The composite three-view-field star sensor star map simulation method proposed by the present invention comprises the following steps:
(1) First, according to the limiting magnitude of the star sensor, single stars whose magnitude is not greater than the limiting magnitude, double stars whose equivalent magnitude is not greater than the limiting magnitude, and variable stars whose magnitude at maximum brightness is not greater than the limiting magnitude are chosen from the original star catalogue; their star number, magnitude, right ascension and declination are extracted, and an observation star database is established.
(2) Then the observation stars in the three optical-system fields of view of the composite three-view-field star sensor are determined, and their incidence angles in the X and Y directions of the subsystem coordinate systems to which they belong are calculated.
(3) Next, according to the total number of false stars, random magnitude and field-position data of the false stars are generated.
(4) By ray tracing, the imaging of the observation stars and false stars in each field of view by the corresponding optical system is simulated, and image-plane information is calculated. The optical surfaces of the optical system may be spherical or aspheric. Let any point on the j-th optical surface have coordinates (x_j, y_j, z_j), where j runs from 1 to η and η is the total number of optical surfaces; the coordinates satisfy

z_j = f_j(x_j, y_j) = \frac{r_j^{2}}{R_j + \sqrt{R_j^{2} - (1 + K_j) r_j^{2}}} + \sum_{i=1}^{N_j} A_{j,i} r_j^{i}

where R_j is the radius of curvature at the vertex of the j-th optical surface, K_j is the conic coefficient, A_{j,i} are the aspheric coefficients, N_j is the highest order of the aspheric coefficients, and r_j = \sqrt{x_j^{2} + y_j^{2}} is the distance from the point to the optical axis. This formula determines the shape of the optical surface and is called the face type function of the optical surface. When K_j = 0 and all A_{j,i} are 0, the optical surface is a sphere.
The detailed process of this step is as follows:
Step 1: according to the incidence angle and incidence position of the incident ray, calculate the direction cosine vector (l_1, m_1, p_1) of the ray incident on the 1st optical surface. From the position of the ray on the entrance pupil, the distance from the entrance pupil to the first optical surface, and the face type function of the first optical surface, obtain the position (x_1, y_1, z_1) of the incidence point of the ray on the first optical surface. In this step j equals 1.
Step 2: calculate the direction cosine vector of the ray leaving the j-th optical surface, that is, the direction cosine vector (l_{j+1}, m_{j+1}, p_{j+1}) of the ray when it travels to the (j+1)-th optical surface.
Step 3: if j < η, use an approximation algorithm to compute the incidence point (x_{j+1}, y_{j+1}, z_{j+1}) of the ray on the (j+1)-th optical surface, then increase j by 1 and repeat Step 2. Otherwise perform Step 4.
Step 4: compute the position (x_image, y_image, z_image) at which the ray reaches the image plane.
(5) Calculate the brightness of each pixel of the digital star map and output the digital star map.
Compared with the prior art, the present invention has the following features:
(1) When the relative positions of the three optical systems of the composite three-view-field star sensor change, only the parameters need to be modified and the star field imaging simulation can still be carried out; the method is simple and convenient to operate.
(2) The present invention requires neither a star simulator nor manufactured star sensor optics: star image simulation is realized by a computer program, there is no requirement on hardware, and the cost is low.
(3) The output star map data can reflect the imaging quality of the optical systems, so the present invention provides support for predicting and avoiding design mistakes in the composite three-view-field star sensor optical systems and serves as a reference for improving their design quality.
Brief description of the drawings
Fig. 1 is the star image simulation flow chart of the embodiment of the present invention.
Fig. 2 is a schematic diagram of the imaging of a fixed star in the 1st subsystem coordinate system.
Fig. 3 shows the relationship between the body coordinate system and the subsystem coordinate systems.
Fig. 4 is a schematic diagram of a ray propagating between two adjacent optical surfaces of the optical system.
Fig. 5 is a schematic diagram of the optical system of the embodiment.
Fig. 6 is a schematic diagram of the simulated star map output by the 1st subsystem in the embodiment.
Fig. 7 is a schematic diagram of the simulated star map output by the 2nd subsystem in the embodiment.
Fig. 8 is a schematic diagram of the simulated star map output by the 3rd subsystem in the embodiment.
Fig. 9 is a schematic diagram of the star chart output by the Skychart software corresponding to the 1st subsystem in the embodiment.
Fig. 10 is a schematic diagram of the star chart output by the Skychart software corresponding to the 2nd subsystem in the embodiment.
Fig. 11 is a schematic diagram of the star chart output by the Skychart software corresponding to the 3rd subsystem in the embodiment.
Detailed description of the embodiments
The invention is further described below with reference to the drawings and the embodiment.
The principle of the present embodiment is as follows. To carry out star field imaging simulation for the composite three-view-field star sensor, an inertial coordinate system and a body coordinate system, denoted O_i-X_iY_iZ_i and O_b-X_bY_bZ_b respectively, need to be established. A coordinate system attached to each optical system of the composite three-view-field star sensor, referred to as a subsystem coordinate system and denoted O_k-X_kY_kZ_k, is also required, where k takes the value 1, 2 or 3 and corresponds to each optical system.
First, the subsystem coordinate systems are established. Each optical system is treated as an ideal imaging system; H_k and H_k' are its object-space and image-space principal points, and f is the focal length of the optical system. The origin O_k of the subsystem coordinate system is taken at the image-space principal point H_k' of the optical system. The three axes X_k, Y_k, Z_k form a right-handed coordinate system, where the X_k and Y_k axes lie in the image-space principal plane and are parallel to the rows and columns of the detector focal plane respectively. The Z_k axis lies along the optical axis with its positive direction pointing toward the object plane, as shown in Fig. 2.
If the ideal image of an observation star S is S', and the coordinates of S' in the first subsystem coordinate system are (x_s, y_s, -f), then the direction cosine vector of the observation star S in this subsystem coordinate system is

V_s^k = \begin{bmatrix} V_{sk1} \\ V_{sk2} \\ V_{sk3} \end{bmatrix} = \frac{-1}{\sqrt{x_s^{2} + y_s^{2} + f^{2}}} \begin{bmatrix} x_s \\ y_s \\ -f \end{bmatrix}    (1)
Its field angles in the X_k and Y_k directions are therefore

XFLD = -\tan^{-1}\left(\frac{V_{sk1}}{V_{sk3}}\right), \quad YFLD = -\tan^{-1}\left(\frac{V_{sk2}}{V_{sk3}}\right)    (2)
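As an illustration of formulas (1) and (2) (a minimal sketch, not part of the patent text; the function name and the NumPy usage are the editor's assumptions), the direction cosine vector and field angles of an observation star can be computed from its ideal image point as follows:

import numpy as np

def star_direction_and_field_angles(x_s, y_s, f):
    # Formula (1): direction cosines of the observation star whose ideal
    # image point in the subsystem frame is (x_s, y_s, -f).
    norm = np.sqrt(x_s**2 + y_s**2 + f**2)
    v = -np.array([x_s, y_s, -f]) / norm          # (V_sk1, V_sk2, V_sk3)
    # Formula (2): field angles about the X_k and Y_k axes (radians).
    xfld = -np.arctan(v[0] / v[2])
    yfld = -np.arctan(v[1] / v[2])
    return v, xfld, yfld

For example, a star imaged at (x_s, y_s) = (1 mm, 0) with f = 43.89 mm corresponds to a field angle of about 1.3° in the X_k direction.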
Then the body coordinate system is established. For ease of manufacture and assembly of the star sensor, the three optical axes are arranged symmetrically with equal angles between one another. For this purpose, the present embodiment takes the coordinate system O_b-X_bY_bZ_b shown in Fig. 3 as the body coordinate system. The origin O_b is taken at the intersection of the three axes Z_1, Z_2, Z_3, and the Z_b axis makes the same angle with each of Z_1, Z_2 and Z_3; the angle between the plane X_1O_1Z_1 and the plane X_bO_bZ_b is τ. In the body coordinate system O_b-X_bY_bZ_b the optical axes of the subsystem coordinate systems are separated by 120° in azimuth and, as can be seen from Fig. 3, their elevation angle is 90° minus the angle between the Z_b axis and each optical axis; this tilt angle (or the elevation angle) together with τ determines the relative positions of the three optical systems.
If the orientation (α_c, δ_c) of the body coordinate system axis Z_b in the inertial coordinate system is known, and the structural parameters (the tilt angle between Z_b and each optical axis, and τ) are given, the inertial coordinate system O-X_iY_iZ_i can be made to coincide with the body coordinate system O_b-X_bY_bZ_b by three rotations. The inertial coordinate system is first rotated about the Z_i axis from +X_i toward +Y_i by α_c, giving the X'Y'Z' coordinate system; the new coordinate system is then rotated about the Y' axis from +Z' toward +X' by 90° − δ_c, giving the X''Y''Z'' coordinate system; finally this coordinate system is rotated about the Z'' axis by θ, and the resulting coordinate system coincides with the body coordinate system. Here θ is determined by the actual directions of the body coordinate system axes X_b and Y_b. Similarly, the body coordinate system O_b-X_bY_bZ_b can be made to coincide with the subsystem coordinate system O_k-X_kY_kZ_k by two rotations: the body coordinate system is first rotated about the Z_b axis from +X_b toward +Y_b by τ + (k−1)·120°, giving the X_k'Y_k'Z_k' coordinate system, and the new coordinate system is then rotated about the Y_k' axis from +Z_k' toward +X_k' by the structural tilt angle; the resulting coordinate system coincides with the subsystem coordinate system O_k-X_kY_kZ_k.
From the above relations, if the coordinates of a star S in the inertial coordinate system are (α, δ), its direction cosine vector in the subsystem coordinate system O_k-X_kY_kZ_k is given by formula (3).
Conversely, if the direction cosine vector of a vector in the subsystem coordinate system O_k-X_kY_kZ_k is {V_k1, V_k2, V_k3}, its expression in the inertial coordinate system is given by formula (4).
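Formulas (3) to (5) appear only as images in this text. The sketch below (an editor's illustration in Python/NumPy; the rotation-direction conventions and the symbol sigma used for the tilt angle between Z_b and each optical axis are assumptions to be checked against Fig. 3) implements the two rotation chains described above as successive passive rotations:

import numpy as np

def rotz(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def roty(a):
    # Active rotation about Y that carries +Z toward +X by angle a.
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def radec_to_vec(alpha, delta):
    # Unit vector of right ascension alpha, declination delta (radians).
    return np.array([np.cos(delta) * np.cos(alpha),
                     np.cos(delta) * np.sin(alpha),
                     np.sin(delta)])

def inertial_to_subsystem(v_i, alpha_c, delta_c, theta, tau, sigma, k):
    # A frame rotation by angle a maps vector components with the transpose
    # of the active rotation matrix (passive rotation).
    # Inertial -> body: alpha_c about Z_i, then 90 deg - delta_c about the
    # new Y axis, then theta about the new Z axis.
    r_ib = rotz(theta).T @ roty(np.pi / 2 - delta_c).T @ rotz(alpha_c).T
    # Body -> subsystem k: tau + (k - 1)*120 deg about Z_b, then the tilt
    # sigma between Z_b and the optical axis about the new Y axis.
    r_bk = roty(sigma).T @ rotz(tau + (k - 1) * 2.0 * np.pi / 3.0).T
    return r_bk @ r_ib @ v_i

The inverse transformation (formula (4)) is obtained with the transpose of the same matrix product, and converting the resulting inertial unit vector back to right ascension and declination gives the (α_zk, δ_zk) of formula (5).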
As shown in Fig. 1, the star image simulation process of the composite three-view-field star sensor is as follows:
1. Establish the observation star database. According to the limiting magnitude of the star sensor, the original star catalogue is processed: single stars whose magnitude is not greater than the limiting magnitude, double stars whose equivalent magnitude is not greater than the limiting magnitude, and variable stars whose magnitude at maximum brightness is not greater than the limiting magnitude are chosen, and their star number, magnitude, right ascension and declination are extracted and stored as the observation star database. In this database the observation star records are arranged in order of ascending declination.
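A minimal sketch of this selection (the catalogue field names and the helper for the equivalent magnitude of a double star are the editor's assumptions, not the patent's data format):

import numpy as np

def equivalent_magnitude(m1, m2):
    # Equivalent magnitude of an unresolved double star: the fluxes add,
    # and magnitude = -2.5 * log10(flux).
    return -2.5 * np.log10(10.0 ** (-0.4 * m1) + 10.0 ** (-0.4 * m2))

def build_observation_database(stars, limiting_mag):
    # stars: list of dicts with keys 'id', 'mag', 'ra_deg', 'dec_deg',
    # where 'mag' already holds the equivalent magnitude for double stars
    # and the magnitude at maximum brightness for variable stars.
    selected = [s for s in stars if s['mag'] <= limiting_mag]
    # The records are stored in order of ascending declination.
    return sorted(selected, key=lambda s: s['dec_deg'])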
2. With the pointing (α_c, δ_c) of the body coordinate system Z_b axis known, and the angles characterizing the three-field-of-view structure (the tilt angle and τ) given, determine the observation stars in the three optical-system fields of view of the composite three-view-field star sensor and calculate their incidence angles XFLD, YFLD in the X and Y directions of the respective optical system field of view.
First, the direction vector of the Z_k axis of the subsystem coordinate system O_k-X_kY_kZ_k in its own coordinate system is {V_k1, V_k2, V_k3} = {0, 0, 1}, so its pointing {V_1, V_2, V_3} in the inertial coordinate system is calculated from formula (4); the corresponding right ascension and declination (α_zk, δ_zk) are given by formula (5).
Only when the coordinates (α, δ) of an observation star satisfy

|δ - δ_zk| ≤ w_m    (6)

can the star possibly appear in the field of view of the k-th optical system, where w_m denotes the field angle corresponding to the diagonal of the image-plane detector of this optical system.
Then, according to formula (3), the positions of the candidate observation stars are transformed from the inertial coordinate system into the subsystem coordinate system.
Finally, according to formula (2), the incidence angles of the observation stars in the subsystem coordinate system O_k-X_kY_kZ_k are calculated. If the maximum field angles of the optical system in the X_k and Y_k directions are w_A and w_B, only fixed stars satisfying

|XFLD| ≤ w_A/2, |YFLD| ≤ w_B/2    (7)

can be observed by the k-th optical system; this is used to decide whether they appear in its field of view. Following this method, the observation stars in each optical system field of view are collected and their incidence angles XFLD, YFLD are obtained at the same time.
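A sketch of this field-of-view test (the function and argument names are the editor's; r_ik stands for the inertial-to-subsystem rotation of formula (3), however it is computed):

import numpy as np

def stars_in_field(catalogue, r_ik, dec_zk, w_m, w_a, w_b):
    # catalogue: iterable of (star_id, mag, ra, dec) with angles in radians.
    # dec_zk: declination of the Z_k axis; w_m: field angle of the detector
    # diagonal; w_a, w_b: maximum field angles in the X_k and Y_k directions.
    hits = []
    for star_id, mag, ra, dec in catalogue:
        if abs(dec - dec_zk) > w_m:        # coarse pre-filter, formula (6)
            continue
        v = r_ik @ np.array([np.cos(dec) * np.cos(ra),
                             np.cos(dec) * np.sin(ra),
                             np.sin(dec)])
        xfld = -np.arctan(v[0] / v[2])     # formula (2)
        yfld = -np.arctan(v[1] / v[2])
        if abs(xfld) <= w_a / 2 and abs(yfld) <= w_b / 2:   # formula (7)
            hits.append((star_id, mag, xfld, yfld))
    return hits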
3. According to the total number of false stars, random data such as the magnitude and position of each false star are generated. When a false star appears in the field of view of the k-th optical system, its field angles in the X_k and Y_k directions are determined from two random numbers r and χ, each taken from the interval [0, 1].
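Since the expression for the false-star field angles is given only as an image in this text, the sketch below simply assumes that r and χ place each false star uniformly inside the field of view (this uniform mapping, like the function name, is the editor's assumption):

import numpy as np

def generate_false_stars(n_false, mag_min, mag_max, w_a, w_b, rng=None):
    # Random magnitude and field-angle data for n_false false stars.
    rng = np.random.default_rng() if rng is None else rng
    mags = rng.uniform(mag_min, mag_max, n_false)
    r, chi = rng.random(n_false), rng.random(n_false)
    xfld = (r - 0.5) * w_a      # assumed uniform over [-w_a/2, +w_a/2]
    yfld = (chi - 0.5) * w_b    # assumed uniform over [-w_b/2, +w_b/2]
    return list(zip(mags, xfld, yfld))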
4. By ray tracing, simulate the imaging of the observation stars and false stars in each field of view by the corresponding optical system, and calculate the image-plane information.
(1) Select the rays used for tracing. Each ray represents a portion of energy, and the rays should be uniformly distributed. For the k-th optical system, the entrance pupil of the optical system is divided by a square grid, and the rays emitted by each observation star in the k-th field of view that pass through the centre of the entrance pupil and through the grid points inside the entrance pupil are chosen for ray tracing and imaging simulation.
Taking into account the brightness of the observation star and the spectral response characteristic of the detector, ray weights W_m and W_w are set, where W_m is proportional to the brightness and W_w is proportional to the detector spectral response. Assuming that each observation star emits a total of n_ray rays filling the entrance pupil, the initial energy of each ray is W_m W_w / n_ray.
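A sketch of the ray selection and weighting (the grid construction, the reference magnitude used to turn brightness into W_m, and the function names are the editor's assumptions):

import numpy as np

def pupil_grid_points(pupil_radius, n_per_axis):
    # Square grid of sample points inside the entrance pupil, plus its centre;
    # each point is the starting position of one traced ray.
    u = np.linspace(-pupil_radius, pupil_radius, n_per_axis)
    xx, yy = np.meshgrid(u, u)
    inside = xx ** 2 + yy ** 2 <= pupil_radius ** 2
    pts = np.column_stack([xx[inside], yy[inside]])
    if not np.any((pts == 0).all(axis=1)):
        pts = np.vstack([[0.0, 0.0], pts])   # make sure the centre ray is present
    return pts

def initial_ray_energy(mag, mag_ref, spectral_response, n_ray):
    # W_m proportional to star brightness (flux relative to a reference
    # magnitude, an assumed proportionality), W_w proportional to the detector
    # spectral response; each of the n_ray rays starts with W_m * W_w / n_ray.
    w_m = 10.0 ** (-0.4 * (mag - mag_ref))
    w_w = spectral_response
    return w_m * w_w / n_ray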
(2) Perform ray tracing surface by surface and compute the position and energy with which each ray reaches the image plane. In the present embodiment a coordinate system is established for each optical surface. For the j-th optical surface, the origin O_j of the coordinate system O_j-X_jY_jZ_j is located at the vertex of the surface and the Z_j axis points along the optical axis toward the image plane. As shown in Fig. 4, the distance from the vertex of the j-th optical surface to the vertex of the (j+1)-th optical surface is d_j. Let any point on the j-th optical surface have coordinates (x_j, y_j, z_j), where j runs from 1 to η and η is the total number of optical surfaces; the coordinates satisfy

z_j = f_j(x_j, y_j) = \frac{r_j^{2}}{R_j + \sqrt{R_j^{2} - (1 + K_j) r_j^{2}}} + \sum_{i=1}^{N_j} A_{j,i} r_j^{i}    (7)

where R_j is the radius of curvature at the vertex of the j-th optical surface, K_j is the conic coefficient, A_{j,i} are the aspheric coefficients, N_j is the highest order of the aspheric coefficients, and r_j = \sqrt{x_j^{2} + y_j^{2}} is the distance from the point to the optical axis. When K_j = 0 and all A_{j,i} are 0, the optical surface is a sphere. Along the direction of ray propagation, the refractive indices before and after this optical surface are n_j and n_{j+1} respectively, and the axial distance from this surface to the (j+1)-th optical surface is d_j.
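A direct transcription of the face type function (7) (a sketch; the argument layout, with the aspheric coefficients passed as a dict, is the editor's choice):

import numpy as np

def surface_sag(x, y, R, K, A):
    # Sag z of the surface at (x, y) inside its clear aperture.
    # R: vertex radius of curvature, K: conic coefficient,
    # A: dict {i: A_i} of aspheric coefficients (empty for a sphere).
    r2 = x ** 2 + y ** 2
    z = r2 / (R + np.sqrt(R ** 2 - (1.0 + K) * r2))
    r = np.sqrt(r2)
    for i, a_i in A.items():
        z += a_i * r ** i
    return z

For example, the 1st surface of the embodiment below would be surface_sag(x, y, R=19.44, K=-0.41, A={8: 3.12e-12}).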
Step 1: according to the incidence angle and incidence position of the incident ray, calculate the direction cosine vector (l_1, m_1, p_1) of the ray incident on the 1st optical surface. Then, from the position of the ray on the entrance pupil, the distance from the entrance pupil to the first optical surface, and the face type function of the first optical surface, obtain the position (x_1, y_1, z_1) of the incidence point of the ray on the first optical surface. In this step j equals 1.
For a ray with incidence angles XFLD, YFLD, its direction cosine vector when it reaches the 1st optical surface is

l_1 = \frac{\tan(XFLD)}{\sqrt{\tan^{2}(XFLD) + \tan^{2}(YFLD) + 1}}, \quad m_1 = \frac{\tan(YFLD)}{\sqrt{\tan^{2}(XFLD) + \tan^{2}(YFLD) + 1}}, \quad p_1 = \frac{1}{\sqrt{\tan^{2}(XFLD) + \tan^{2}(YFLD) + 1}}    (8)

Taking the coordinate system of the 1st optical surface as reference, suppose the coordinates of the ray on the entrance pupil plane are (x_ρ, y_ρ, d_ρ), where d_ρ is the distance from the entrance pupil to the 1st optical surface; then

\frac{x_1 - x_\rho}{l_1} = \frac{y_1 - y_\rho}{m_1} = \frac{z_1 - d_\rho}{p_1}    (9)

Solving formulas (7) and (9) simultaneously gives the incidence point coordinates (x_1, y_1, z_1).
Step 2: calculate the direction cosine vector of the ray leaving the j-th optical surface, that is, the direction cosine vector (l_{j+1}, m_{j+1}, p_{j+1}) of the ray when it travels to the (j+1)-th optical surface. If its incidence point on the j-th optical surface is (x_j, y_j, z_j), the normal direction (ζ_j, ξ_j, γ_j) of the optical surface at this point is determined from the face type function of the surface. The exit direction (l_{j+1}, m_{j+1}, p_{j+1}) of the ray is then calculated; its Y and Z components are

m_{j+1} = \frac{n_j}{n_{j+1}} m_j + \xi_j \left[ \frac{n_j}{n_{j+1}} \cos\mu - \sqrt{1 - \left(\frac{n_j}{n_{j+1}}\right)^{2} \sin^{2}\mu} \right]

p_{j+1} = \frac{n_j}{n_{j+1}} p_j + \gamma_j \left[ \frac{n_j}{n_{j+1}} \cos\mu - \sqrt{1 - \left(\frac{n_j}{n_{j+1}}\right)^{2} \sin^{2}\mu} \right]

and the component l_{j+1} has the same form, with m_j and ξ_j replaced by l_j and ζ_j. Here μ is the angle between the incident direction and the normal.
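A sketch of this refraction step (the surface normal is passed in already computed; its orientation toward the incoming side, matching the sign of the bracket above, and the handling of total internal reflection are the editor's assumptions):

import numpy as np

def refract(d_in, normal, n_j, n_j1):
    # d_in: unit direction cosines (l_j, m_j, p_j) of the incident ray.
    # normal: unit normal (zeta_j, xi_j, gamma_j), assumed to point toward
    # the incoming side so that cos(mu) below is positive.
    d_in = np.asarray(d_in, dtype=float)
    normal = np.asarray(normal, dtype=float)
    ratio = n_j / n_j1
    cos_mu = -float(np.dot(d_in, normal))     # mu: angle between ray and normal
    sin2_mu = 1.0 - cos_mu ** 2
    under = 1.0 - ratio ** 2 * sin2_mu
    if under < 0.0:
        return None                           # total internal reflection: drop the ray
    # Same pattern as the component formulas above, applied to all three components.
    return ratio * d_in + normal * (ratio * cos_mu - np.sqrt(under))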
Step 3: if j < η, use the approximation algorithm to compute the incidence point (x_{j+1}, y_{j+1}, z_{j+1}) of the ray on the (j+1)-th optical surface, then increase j by 1 and repeat Step 2. Otherwise perform Step 4.
In the approximation algorithm the initial coordinate values (x_{j+1,0}, y_{j+1,0}, z_{j+1,0}) are

x_{j+1,0} = x_j, \quad y_{j+1,0} = y_j, \quad z_{j+1,0} = z_j - d_j    (12)

New coordinate values are then computed with formula (13) until the successive values z_{j+1,t+1} and z_{j+1,t} differ by a negligible amount; the final x_{j+1,t+1}, y_{j+1,t+1}, z_{j+1,t+1} are the incidence point (x_{j+1}, y_{j+1}, z_{j+1}) of the ray on the (j+1)-th optical surface:

x_{j+1,t+1} = x_{j+1,t} + l_{j+1} \frac{f_{j+1}(x_{j+1,t}, y_{j+1,t}) - z_{j+1,t}}{p_{j+1}}, \quad y_{j+1,t+1} = y_{j+1,t} + m_{j+1} \frac{f_{j+1}(x_{j+1,t}, y_{j+1,t}) - z_{j+1,t}}{p_{j+1}}, \quad z_{j+1,t+1} = f_{j+1}(x_{j+1,t}, y_{j+1,t})    (13)
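A sketch of the approximation algorithm of formulas (12) and (13) (the convergence tolerance and the iteration cap are the editor's assumptions):

import numpy as np

def next_surface_point(p_exit, d_out, sag_next, d_j, tol=1e-10, max_iter=50):
    # p_exit: exit point (x_j, y_j, z_j) on surface j, in surface-j coordinates.
    # d_out: direction cosines (l_{j+1}, m_{j+1}, p_{j+1}).
    # sag_next: face type function f_{j+1}(x, y) of surface j+1.
    # d_j: vertex-to-vertex distance from surface j to surface j+1.
    x, y, z = p_exit[0], p_exit[1], p_exit[2] - d_j    # initial values, formula (12)
    l, m, p = d_out
    for _ in range(max_iter):
        f = sag_next(x, y)
        step = (f - z) / p                             # formula (13)
        x, y, z_new = x + l * step, y + m * step, f
        if abs(z_new - z) < tol:                       # successive z values agree
            return np.array([x, y, z_new])
        z = z_new
    return np.array([x, y, z])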
Step 4: compute the position (x_image, y_image, z_image) at which the ray reaches the image plane. Let the exit point of the ray on the last optical surface be (x_η, y_η, z_η), its direction cosine vector (l_η, m_η, p_η), and the distance from this surface to the image plane d_η. Then

x_{image} = x_\eta + l_\eta \frac{d_\eta - z_\eta}{p_\eta}, \quad y_{image} = y_\eta + m_\eta \frac{d_\eta - z_\eta}{p_\eta}, \quad z_{image} = 0    (14)
From the intersection points of the ray with the successive optical surfaces, the propagation distance of the ray in each medium can be obtained; using the optical energy attenuation coefficient of that medium, the energy with which the ray reaches the image plane can then be found. If the energy of a ray is attenuated by a factor σ by the time it reaches the image plane, its energy at the image plane is σ·W_m W_w / n_ray. For every observation star in the field of view of the k-th optical system, ray tracing is performed over the spectral range to obtain its image through the optical system.
5. Calculate the brightness of each pixel of the digital star map and output the digital star map. The CCD or APS detector of the star sensor receives the star images, and the detector takes the pixel as its elementary unit. From the position (x_image, y_image) and energy of each ray arriving at the image plane, the pixel it falls on is determined, and the energies of all rays arriving at the same pixel are accumulated to give the light intensity of that pixel in the star map. The brightest pixel is assigned the grey value 255 and the brightness of the remaining pixels is scaled proportionally to obtain their grey values. The position and grey value of every pixel are saved in a digital image format, and the digital star map can then be output.
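A sketch of the pixel accumulation and grey-level scaling (the axis-centred pixel indexing and the function name are the editor's assumptions):

import numpy as np

def render_star_map(hits, n_pixels, pixel_size):
    # hits: iterable of (x_image, y_image, energy), with image-plane coordinates
    # measured from the optical axis in the same length unit as pixel_size.
    img = np.zeros((n_pixels, n_pixels))
    for x, y, e in hits:
        col = int(np.floor(x / pixel_size + n_pixels / 2))
        row = int(np.floor(y / pixel_size + n_pixels / 2))
        if 0 <= row < n_pixels and 0 <= col < n_pixels:
            img[row, col] += e          # accumulate rays landing on the same pixel
    if img.max() > 0:
        img = np.round(img / img.max() * 255.0)   # brightest pixel -> grey value 255
    return img.astype(np.uint8)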
Embodiment:
The field angle of each optical system of the three-view-field star sensor is taken as 10° × 10°, the aperture is 27.3 mm, and the focal length is 43.89 mm. The optical system is shown in Fig. 5 and comprises 5 lenses; the optical surface parameters are listed in Table 1, in which the 1st and 7th optical surfaces are aspheric. The aspheric coefficients of the 1st optical surface are K_1 = -0.41 and A_{1,8} = 3.12 × 10^-12, with the rest equal to 0. The aspheric coefficients of the 7th optical surface are K_7 = -0.61, A_{7,4} = 4.85 × 10^-5, A_{7,6} = 3.80 × 10^-7 and A_{7,8} = 1.26 × 10^-9, with the rest equal to 0. The limiting magnitude is taken as 5.2, and the detector has 1024 × 1024 pixels.
Surface No.   Radius     Thickness   Material
1             19.44      6.00        Silica
2             219.08     9.00
3             16.90      6.00        K9
4             -40.89     1.00        ZF1
5             9.24       9.00
6             18.87      5.59        K9
7             22.07      3.96
8             -92.60     6.00        ZF1
9             -16.40     7.32
10            Infinity   10.94
Table 1 Optical system parameters (unit: mm)
The observation star database is established. With the right ascension and declination of the body coordinate system Z_b axis chosen as (36°, 30°), the angle between the Z_1 axis and the Z_b axis taken as 45°, and the angle between the plane X_1O_1Z_1 and the plane X_bO_bZ_b set to τ = 0, the Z_k axes of the subsystem coordinate systems point, in the inertial coordinate system, to (341.43°, 41.28°), (36°, -15°) and (90.57°, 41.28°) respectively. The star maps obtained by imaging the observation stars through the optical system of each subsystem are shown in Figs. 6, 7 and 8; the three fields of view observe 5, 5 and 6 stars respectively. Figs. 9, 10 and 11 give the star charts output by the Skychart software for the three corresponding fields of view; comparing them with Figs. 6, 7 and 8 shows that the star maps output by the star map simulation method of the present embodiment are consistent with those of the Skychart software.
Table 2 gives the star image positions for the imaging of the observation stars, together with the star image positions obtained by processing Figs. 6, 7 and 8 with a star spot location algorithm; for brevity these are called the ideal positions and the measured positions. The data in the table show that the position error of the simulated star images is less than 0.4 pixel, confirming that the composite three-view-field star sensor star map simulation method is correct.
Table 2 Simulated star map star image data
The present invention can provide abundant simulated star map data for studying techniques of the composite three-view-field star sensor such as star spot location, guide star selection and star map identification. The preferred embodiment is disclosed above, but it is not intended to limit the present invention; any person skilled in the art may make various changes and modifications without departing from the spirit and scope of the invention, and the protection scope of the present invention is therefore defined by the claims of the present application.

Claims (5)

1. A composite three-view-field star sensor star map simulation method, characterized by comprising the following steps:
Step 1, establishing an observation star database: according to the limiting magnitude of the star sensor, choosing from the original star catalogue single stars whose magnitude is not greater than the limiting magnitude, double stars whose equivalent magnitude is not greater than the limiting magnitude, and variable stars whose magnitude at maximum brightness is not greater than the limiting magnitude, extracting the star number, magnitude, right ascension and declination of said observation stars, and storing them as said observation star database;
Step 2, determining said observation stars in the three optical-system fields of view of the composite three-view-field star sensor, and calculating the incidence angles of said observation stars in the X and Y directions of the subsystem coordinate systems to which they belong;
Step 3, randomly generating magnitude and field-position data of false stars according to the total number of said false stars;
Step 4, simulating, by ray tracing, the imaging of said observation stars and said false stars in each field of view by said optical systems, and calculating image-plane information;
Step 5, calculating the brightness of each pixel of the digital star map and outputting the digital star map.
2. The composite three-view-field star sensor star map simulation method according to claim 1, characterized in that in said observation star database the observation star records are arranged in order of ascending declination.
3. The composite three-view-field star sensor star map simulation method according to claim 1, characterized in that the optical axes of the three systems are arranged symmetrically with equal angles between one another, the subsystem coordinate systems being established first and the body coordinate system next.
4. The composite three-view-field star sensor star map simulation method according to claim 3, characterized in that, with the pointing (α_c, δ_c) of the body coordinate system Z_b axis and the angles of the composite three-view-field system structure known, step 2 further comprises:
Step 21, the direction vector of the Z_k axis of the subsystem coordinate system O_k-X_kY_kZ_k in its own coordinate system is {V_k1, V_k2, V_k3} = {0, 0, 1}; according to formula (1), calculating its pointing {V_1, V_2, V_3} in the inertial coordinate system, the corresponding right ascension and declination (α_zk, δ_zk) being given by formula (2); only when the coordinates (α, δ) of an observation star satisfy

|δ - δ_zk| ≤ w_m    (3)

can said observation star appear in the field of view of the k-th said optical system, where w_m denotes the field angle corresponding to the diagonal of the image-plane detector of said optical system;
Step 22, according to formula (4), transforming the positions of said observation stars from the inertial coordinate system into the subsystem coordinate system;
Step 23, according to

XFLD = -\tan^{-1}\left(\frac{V_{sk1}}{V_{sk3}}\right), \quad YFLD = -\tan^{-1}\left(\frac{V_{sk2}}{V_{sk3}}\right)    (5)

calculating the incidence angles of the observation stars in the subsystem coordinate system O_k-X_kY_kZ_k; if the maximum field angles of the optical system in the X_k and Y_k directions are w_A and w_B, only fixed stars satisfying

|XFLD| ≤ w_A/2, |YFLD| ≤ w_B/2    (6)

can be observed by the k-th optical system.
5. The composite three-view-field star sensor star map simulation method according to claim 1, characterized in that the optical surfaces of said optical systems are spherical or aspheric; letting any point on the j-th optical surface have coordinates (x_j, y_j, z_j), where j runs from 1 to η and η is the total number of optical surfaces, the coordinates satisfy

z_j = f_j(x_j, y_j) = \frac{r_j^{2}}{R_j + \sqrt{R_j^{2} - (1 + K_j) r_j^{2}}} + \sum_{i=1}^{N_j} A_{j,i} r_j^{i}    (7)

where R_j is the radius of curvature at the vertex of the j-th optical surface, K_j is the conic coefficient, A_{j,i} are the aspheric coefficients, N_j is the highest order of the aspheric coefficients, and r_j = \sqrt{x_j^{2} + y_j^{2}} is the distance from the point to the optical axis; when K_j = 0 and all A_{j,i} are 0, said optical surface is a sphere;
and in that step 4 further comprises:
Step 41, according to the incidence angle and incidence position of the incident ray, calculating the direction cosine vector (l_1, m_1, p_1) of the ray incident on the 1st optical surface; from the position of the ray on the entrance pupil, the distance from the entrance pupil to the first optical surface, and the face type function of the first optical surface, obtaining the position (x_1, y_1, z_1) of the incidence point of the ray on the first optical surface, with j = 1 at this point;
Step 42, calculating the direction cosine vector of the ray leaving the j-th optical surface, that is, the direction cosine vector (l_{j+1}, m_{j+1}, p_{j+1}) of the ray when it travels to the (j+1)-th optical surface;
Step 43, if j < η, using an approximation algorithm to compute the incidence point (x_{j+1}, y_{j+1}, z_{j+1}) of the ray on the (j+1)-th optical surface, then increasing j by 1 and repeating step 42; if j ≥ η, performing step 44;
Step 44, computing the position (x_image, y_image, z_image) at which the ray reaches the image plane.
CN201610015795.2A 2016-01-11 2016-01-11 Composite three-view-field star sensor star map simulation method Active CN105547286B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610015795.2A CN105547286B (en) 2016-01-11 2016-01-11 Composite three-view-field star sensor star map simulation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610015795.2A CN105547286B (en) 2016-01-11 2016-01-11 Composite three-view-field star sensor star map simulation method

Publications (2)

Publication Number Publication Date
CN105547286A true CN105547286A (en) 2016-05-04
CN105547286B CN105547286B (en) 2018-04-10

Family

ID=55826641

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610015795.2A Active CN105547286B (en) 2016-01-11 2016-01-11 Composite three-view-field star sensor star map simulation method

Country Status (1)

Country Link
CN (1) CN105547286B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102853851A (en) * 2012-09-17 2013-01-02 常州工学院 Imaging system and imaging method for stellar field of computer simulated star sensors
CN103344256A (en) * 2013-06-19 2013-10-09 哈尔滨工业大学 Laboratory testing method for multi-field-of-view star sensor
CN104061929A (en) * 2014-07-08 2014-09-24 上海新跃仪表厂 Common-light-path and multi-view-field star sensor and star attitude measurement method thereof

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
FENG WU et al., "Design and application of star map simulation system for star sensors", 2013 International Conference on Optical Instruments and Technology: Optical Sensors and Applications *
王昊京 et al., "Fast star map identification method with three fields of view based on coarsely measured position and orientation", Chinese Optics *
黄战华 et al., "Research on a general ray tracing algorithm in media with variable refractive index", Acta Optica Sinica *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107101637A (en) * 2017-05-27 2017-08-29 电子科技大学天府协同创新中心 Digital star chart emulation mode and device
CN107883947A (en) * 2017-12-28 2018-04-06 常州工学院 Star sensor method for recognising star map based on convolutional neural networks
CN107883947B (en) * 2017-12-28 2020-12-22 常州工学院 Star sensor star map identification method based on convolutional neural network
CN110926501A (en) * 2019-11-08 2020-03-27 中国科学院长春光学精密机械与物理研究所 Automatic calibration method and system for optical measurement equipment and terminal equipment
CN112697136A (en) * 2020-11-26 2021-04-23 北京机电工程研究所 Rapid minimized area star map simulation method
CN112697136B (en) * 2020-11-26 2023-12-05 北京机电工程研究所 Quick minimum area star map simulation method

Also Published As

Publication number Publication date
CN105547286B (en) 2018-04-10

Similar Documents

Publication Publication Date Title
Zhang Star identification
CN103913148B (en) Space flight TDI CCD camera full link numerical value emulation method
CN102607526B (en) Target posture measuring method based on binocular vision under double mediums
CN103426149B (en) The correction processing method of wide-angle image distortion
CN104406607B (en) The caliberating device of a kind of many visual fields complex optics sensor and method
CN106595645A (en) Method for making guide star database based on output accuracy of star sensors
CN102175241B (en) Autonomous astronomical navigation method of Mars probe in cruise section
CN103727937B (en) Star sensor based naval ship attitude determination method
CN102927982B (en) Double-spectrum autonomous navigation sensor and design method of double-spectrum autonomous navigation sensor
CN106595702B (en) A kind of multisensor spatial registration method based on astronomy calibration
CN105547286A (en) Composite three-view-field star sensor star map simulation method
CN105528500A (en) Imaging simulation method and system for decimeter-scale satellite-borne TDI CCD stereoscopic mapping camera
CN105913435B (en) A kind of multiscale morphology image matching method and system suitable for big region
CN103017762A (en) Fast acquisition positioning method for space target of ground-based photoelectric telescope
CN103591966A (en) Star simulator test platform and test calibration method
CN102706363B (en) Precision measuring method of high-precision star sensor
CN111537003A (en) Starlight atmospheric refraction measurement correction method based on refraction surface collineation
CN107655485A (en) A kind of cruise section independent navigation position deviation modification method
CN106679676A (en) Single-viewing-field multifunctional optical sensor and realization method
CN107451957B (en) Imaging simulation method and device for satellite-borne TDI CMOS camera
CN102853851B (en) The imaging system of computer simulation star sensor stellar field and formation method
CN105023281B (en) Asterism based on point spread function wavefront modification is as centroid computing method
CN102288201A (en) Precision measurement method for star sensor
CN105444778A (en) Star sensor in-orbit attitude determination error obtaining method based on imaging geometric inversion
CN102288200A (en) Accuracy measurement system for star sensor

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20201117

Address after: No. 12 Mozhou Road, Moling Street, Jiangning District, Nanjing City, Jiangsu Province, 211111

Patentee after: Jiangsu Zhixing Future Automobile Research Institute Co., Ltd

Address before: No. 1 Wushan Road, Xinbei District, Changzhou, Jiangsu, 213022

Patentee before: CHANGZHOU INSTITUTE OF TECHNOLOGY