CN101676687B - Ultra-high accuracy star sensor - Google Patents

Ultra-high accuracy star sensor

Info

Publication number
CN101676687B
Authority
CN
China
Prior art keywords
star
unit
pixel
pixels
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2009101718795A
Other languages
Chinese (zh)
Other versions
CN101676687A (en)
Inventor
张广军
江洁
樊巧云
魏新国
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Application filed by Beihang University filed Critical Beihang University
Priority to CN2009101718795A priority Critical patent/CN101676687B/en
Publication of CN101676687A publication Critical patent/CN101676687A/en
Application granted granted Critical
Publication of CN101676687B publication Critical patent/CN101676687B/en

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses an ultra-high-precision star sensor, which comprises an optical imaging system, an image sensor, an image sensor driving unit, a two-path centroid imaging unit, a star tracking unit, a star map recognition unit, an attitude calculation unit and a navigation star database. The star sensor is characterized in that a two-path centroid tracking imaging technique is introduced into a star sensor whose image sensor has a 2048*2048 pixel planar array: two paths of pixel data are read in and processed at the same time, and during star tracking, feedback-free non-window matching tracking based on position information is adopted. The precision and the data update rate of the star sensor are thereby greatly improved.

Description

Ultra-high precision star sensor
Technical Field
The invention relates to star sensor technology, and in particular to a small ultra-high-precision star sensor with a high data update rate.
Background
A star sensor is a high-precision, high-reliability attitude measurement component widely used on current spacecraft. It works in a real-time dynamic measurement mode, its imaging device is an area-array image sensor, and sensors with 1024 × 1024 pixels are in widespread use.
As the pixel count increases, for example if a large area-array image sensor with 2048 × 2048 pixels is adopted, the attitude accuracy improves nonlinearly while the data volume increases linearly. For a star sensor that currently uses a frame imaging scheme and a windowed tracking mode, this limitation causes the data update rate to drop linearly, seriously degrading the sensor's real-time dynamic measurement performance and becoming the bottleneck for providing attitude information in real time.
Disclosure of Invention
In view of the above, the main objective of the present invention is to provide an ultra-high-precision star sensor that offers both higher precision and a high data update rate.
In order to achieve the purpose, the technical scheme of the invention is realized as follows:
the invention provides an ultra-high precision star sensor, which comprises an optical imaging system, an image sensor, an image sensor driving unit, a double-path centroid imaging unit, a star tracking unit, a star map identification unit, an attitude calculation unit and a navigation star library; wherein,
an optical imaging system for imaging a starry sky image on an image sensor;
the image sensor is used for converting the optical signal into an electric signal under the driving of the image sensor driving unit and transmitting the electric signal to the double-path centroid imaging unit;
an image sensor driving unit for driving the image sensor;
the double-path centroid imaging unit is used for carrying out double-path pixel data processing on two paths of pixels which are read in simultaneously, and outputting centroid coordinates of the light spot image to the star tracking unit and the star map identification unit after the complete image is processed;
the star tracking unit is used for tracking the star in the current view field according to the star information identified at the previous moment and acquiring the star information;
the star map identification unit is used for identifying star maps from the whole celestial sphere;
the attitude calculation unit is used for solving the accurate attitude of the star sensor according to the information of the stars recognized over the whole celestial sphere or the information of all the tracked stars, and for outputting the calculated attitude of the star sensor;
and the navigation star library is used for storing a navigation star list.
In the above scheme, the image sensor is a large-area-array image sensor with 2048 × 2048 pixels.
In this scheme, the image sensor driving unit and the double-path centroid imaging unit are integrated on one FPGA; the star tracking unit, the star map recognition unit and the attitude calculation unit are integrated on one RISC processor.
In the above scheme, the two-way centroid imaging unit further includes: the device comprises a gray value reading module, a gray value comparison module, a two-way pixel data processing module, a background pixel processing module, a first judgment module, a storage module, a second judgment module and a light spot image centroid calculation module; wherein,
the gray value reading module is used for simultaneously reading the gray values of the two paths of pixels and sending the read gray values into the gray value comparison module;
the gray value comparison module is used for comparing the gray values of the two paths of pixels sent by the gray value reading module with a preset threshold respectively and finishing the processing of the two paths of pixels according to the comparison result;
the two-way pixel data processing module is used for completing two-way pixel marking, two-way data equivalent merging and two-way data accumulation; it then enters the second judgment module when the gray values of both pixels are greater than the preset threshold, sends the processed data to the storage module when only the gray value of the left pixel is greater than the preset threshold, and sends the processed data to the first judgment module when only the gray value of the right pixel is greater than the preset threshold;
further, the two-way pixel data processing module comprises a marking unit, a merging unit and an accumulation unit. The marking unit completes the two-way pixel marking: it marks the left pixel and the right pixel according to the comparison of their gray values with a preset threshold, and pixels with the same comparison result receive equivalent marks. The merging unit completes the two-way data equivalent merging, merging equivalent data according to the comparison of the left/right pixel gray values with the preset threshold. The accumulation unit completes the two-way data accumulation: according to the comparison of the left and right pixel gray values with the preset threshold, it accumulates the gray values of the left and right pixels and accumulates the products of those gray values with the coordinate values.
The background pixel processing module is used for marking the current two paths of pixels as background pixels when the gray values of the left and right pixels are smaller than a preset threshold value, and assigning the marking values to corresponding parameters;
the first judgment module is used for judging whether the pixel to the left of the left pixel in the two paths of pixels has a mark value;
the storage module is used for accumulating the value of the accumulation unit into the data storage corresponding to the equivalent mark value and clearing the accumulation unit;
the second judgment module is used for judging whether the whole image is processed;
and the light spot image centroid calculation module is used for calculating and outputting the coordinate value of the light spot image centroid after the complete image is processed.
The accumulation unit comprises a first accumulator and a second accumulator, wherein the first accumulator is used for accumulating the products of the gray values of the left pixel and the right pixel and the coordinate values, and the second accumulator is used for accumulating the gray values of the left pixel and the right pixel.
In the above scheme, the star tracking unit implements non-feedback, non-window matching tracking based on the position information.
The ultra-high precision star sensor provided by the invention adopts a large-area-array image sensor, such as one with 2048 × 2048 pixels, so the angular resolution can be improved and the star sensor achieves higher precision.
To work with the large-area-array image sensor, the invention also introduces a double-path centroid imaging technique that reads two paths of pixel data simultaneously and processes them simultaneously, improving the parallel data processing capability and the data processing speed. By combining the double-path centroid imaging technique with the large-area-array image sensor, the data processing speed of the 2048 × 2048 pixel star sensor can be doubled, making the large-area-array image sensor practical in a star sensor and achieving both high precision and a high data update rate.
The invention adopts feedback-free non-window matching tracking during star tracking, which can track all the stars in the field of view and further improves the accuracy of attitude calculation. The tracking speed is high: in tracking mode the data update rate reaches 15 Hz.
During star map recognition the invention realizes triangle matching by means of angular-distance matching; by storing star pairs in intervals and using state marks for triangle recognition, the whole-celestial-sphere recognition time can be kept to 0.5 s.
The invention adopts a uniformly divided, non-overlapping navigation star list, so retrieving a navigation star no longer requires traversing the whole list; the average search range is reduced to 1/54 of what it was before, greatly improving the search speed.
Drawings
FIG. 1 is a schematic diagram of the structure and operation of the star sensor of the present invention;
fig. 2 is a schematic diagram of the structure of a dual-path centroid imaging device used in the present invention.
Detailed Description
The basic idea of the invention is as follows: for a star sensor using a 2048 × 2048 pixel large-area-array image sensor, a double-path centroid tracking imaging technique is introduced so that two paths of pixel data are read and processed simultaneously; during star tracking, feedback-free non-window matching tracking based on position information is adopted. Together these greatly improve the precision and the data update rate of the star sensor.
As shown in fig. 1, the star sensor proposed by the present invention includes an optical imaging system 10, an image sensor 11, an image sensor driving unit 12, a two-way centroid imaging unit 13, a star tracking unit 14, a star map recognition unit 15, an attitude calculation unit 16, and a navigation star library (Guide Star Catalogue) 17.
Here, the image sensor is a large-area-array image sensor with 2048 × 2048 pixels, and the star tracking unit adopts a feedback-free, non-window tracking mode. In practical applications, the image sensor driving unit 12 and the two-way centroid imaging unit 13 can be integrated in an FPGA signal processing unit, while the star tracking unit 14, the star map recognition unit 15 and the attitude calculation unit 16 are integrated in a RISC signal processing unit; of course, an FPGA, a RISC processor, or a Digital Signal Processor (DSP) can be used for either, and alternatively all units except the optical imaging system 10, the image sensor 11 and the navigation star library 17 may be integrated on a single FPGA, RISC processor, or DSP. The following description takes as an example the case in which the image sensor driving unit 12 and the two-way centroid imaging unit 13 are integrated in the FPGA signal processing unit, and the star tracking unit 14, the star map recognition unit 15 and the attitude calculation unit 16 are integrated in the RISC signal processing unit.
The optical imaging system 10 is composed of a light shield and a high-precision lens, and is used for imaging a starry sky image on the image sensor 11.
The image sensor 11 is used for converting the optical signal into an electric signal under the driving of the image sensor driving unit 12 and transmitting the electric signal to the two-way centroid imaging unit 13. One usable image sensor is the Cypress Lupa4000, which has 2048 × 2048 pixels and a frame rate of 15 frames/s.
The image sensor driving unit 12, implemented on the FPGA, drives the 2048 × 2048 pixel image sensor 11 according to the required driving timing so that it outputs two image signals simultaneously, line by line; the gray value of each pixel is 10 bits, and the clock frequency reaches 33 MHz. That is, the image sensor 11 outputs two paths of pixel data at a time to the two-path centroid imaging unit 13.
The double-path centroid imaging unit 13 is used for performing double-path pixel data processing on the two paths of pixels read in simultaneously, and for outputting the centroid coordinates of the light spot images to the star tracking unit 14 and the star map identification unit 15 after the complete image has been processed. Specifically, while the light spot image is being processed, the gray values of the current two paths of pixels are read simultaneously each time and compared simultaneously with a preset threshold; when at least one of the two pixels exceeds the threshold, the two-path data marking, two-path data merging and two-path data accumulation processes are executed, until the whole light spot image has been processed and the resulting light spot centroid coordinates are output. The principle and process by which the two-way centroid imaging unit 13 realizes two-way centroid imaging are disclosed in detail in the applicant's patent application No. 200810222489.1, filed on September 17, 2008, and are not repeated here.
The double-path centroid imaging unit exploits the real-time parallel computation of the FPGA to track and image the star centroids: based on four-connected-domain segmentation and a first-moment centroid algorithm, it directly outputs all star centroid data in one frame of image. Compared with outputting the whole frame of raw data, this reduces the image data volume by a factor of about ten thousand.
In a specific implementation, assuming the imaging area of a star point covers m rows and n columns, the centroid coordinates of the star point can be obtained from the following formulas:
x_0 = \frac{\sum_{x=1}^{n}\sum_{y=1}^{m} F(x,y)\,x}{\sum_{x=1}^{n}\sum_{y=1}^{m} F(x,y)}, \qquad y_0 = \frac{\sum_{x=1}^{n}\sum_{y=1}^{m} F(x,y)\,y}{\sum_{x=1}^{n}\sum_{y=1}^{m} F(x,y)}
where x0 and y0 are the obtained star point centroid coordinates; x and y are the pixel coordinates; and F(x, y) is the gray value of the pixel in column x and row y.
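The first-moment centroid formulas above can be sketched directly in code. The following is a minimal illustrative sketch (the function name and data layout are assumptions, not from the patent): F is a 2-D array of gray values with m rows and n columns, and coordinates follow the patent's convention that x indexes columns and y indexes rows, starting at 1.

```python
def spot_centroid(F):
    """First-moment (gray-weighted) centroid of an m-row, n-column spot image F.

    x indexes columns (1..n), y indexes rows (1..m), as in the patent's
    formulas. Illustrative sketch only.
    """
    m = len(F)        # number of rows
    n = len(F[0])     # number of columns
    sum_fx = sum_fy = sum_f = 0
    for y in range(1, m + 1):
        for x in range(1, n + 1):
            f = F[y - 1][x - 1]
            sum_fx += f * x   # numerator of x0
            sum_fy += f * y   # numerator of y0
            sum_f += f        # common denominator
    return sum_fx / sum_f, sum_fy / sum_f
```

In the FPGA implementation these three sums are exactly what the two accumulators and the per-mark storage spaces described below maintain incrementally, so the centroid can be emitted as soon as a spot's last pixel has been read.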
Further, the two-way centroid imaging unit 13 reads the gray values of the two ways of pixels of the output image each time through the image sensor driving unit 12, and performs labeling, equivalent merging and accumulation processing on the data of the two ways of pixels according to the principle of four-connected domain segmentation and the comparison result of the gray values of the left pixel and the right pixel with a preset threshold value, so as to improve the data parallel processing capability and the data processing speed.
Here, the specific composition structure of the two-way centroid imaging unit 13 is shown in fig. 2, and includes: the device comprises a gray value reading module 21, a gray value comparison module 22, a two-way pixel data processing module 23, a background pixel processing module 24, a first judgment module 25, a storage module 26, a second judgment module 27 and a spot image centroid calculation module 28.
The gray value reading module 21 is configured to read gray values of two paths of pixels simultaneously, and send the read gray values to the gray value comparing module 22; here, the two paths of pixels are referred to as a left pixel and a right pixel, respectively.
The gray value comparison module 22 is configured to compare the gray values of the left and right pixels sent by the gray value reading module 21 with the preset threshold, and to complete the processing of the two paths of pixels according to the comparison result. Specifically, if the gray value of either the left or the right pixel is greater than the preset threshold, the two-way pixel data processing module 23 is entered to complete two-way pixel marking, two-way data equivalent merging and two-way data accumulation; if the gray values of the left and right pixels are both smaller than the preset threshold, the background pixel processing module 24 is entered, the current two paths of pixels are marked, and the mark values are assigned to the corresponding parameters.
The two-way pixel data processing module 23 is configured to complete two-way pixel marking, two-way data equivalent merging and two-way data accumulation; it enters the second judgment module 27 when the gray values of both pixels are greater than the preset threshold, sends the processed data to the storage module 26 when only the gray value of the left pixel is greater than the preset threshold, and sends the processed data to the first judgment module 25 when only the gray value of the right pixel is greater than the preset threshold.
the two-way pixel data processing module 23 further includes a marking unit 231, a merging unit 232, and an accumulating unit 233, where the marking unit 231 is configured to complete two-way pixel marking, mark the left pixel and the right pixel according to a comparison result between the gray-level value of the left/right pixel and a preset threshold, and mark pixels with the same comparison result as equivalent marks, that is, marks with the same identification; the merging unit 232 is configured to complete two-way data equivalent merging, and complete merging of equivalent data according to a comparison result between the left/right pixel gray values and a preset threshold; the accumulation unit 233 is configured to complete two-way data accumulation, complete accumulation of the gray values of the left and right pixels according to the comparison result between the gray values of the left and right pixels and the preset threshold, and complete accumulation of the product of the gray values of the left and right pixels and the coordinate values.
And the background pixel processing module 24 is configured to mark the current two paths of pixels as background pixels when the left and right pixel gray values are smaller than a preset threshold, and assign the mark values to corresponding parameters.
The first judgment module 25 is configured to judge whether the pixel to the left of the left pixel in the two paths of pixels has a mark value; when it has a mark value, the storage module 26 is entered, and when it has no mark value, the second judgment module 27 is entered.
A storage module 26, configured to accumulate the value of the accumulation unit into the data storage corresponding to the equivalent flag value, and clear the accumulation unit;
Here, the accumulation unit includes a first accumulator for accumulating the products of the left and right pixel gray values with the coordinate values, and a second accumulator for accumulating the gray values of the left and right pixels.
The second judging module 27 is configured to judge whether the whole image is processed, and enter the spot image centroid calculating module 28 when the whole image is processed, and enter the gray value reading module 21 when the whole image is not processed, so as to read the next two paths of pixels.
And the light spot image centroid calculating module 28 is used for calculating and outputting the coordinate value of the light spot image centroid after the complete image is processed.
Specifically, the two-way pixel marking comprises the following steps:
Steps a1 to b1: judge whether the gray value of the left pixel is smaller than the threshold, the gray value of the right pixel is larger than the threshold, and the pixel above the right pixel is marked as a background pixel; if so, mark the left pixel as a background pixel, give the right pixel a new mark value, and execute step q1; otherwise execute step c1.
Here, background pixels may be marked with zero and non-background pixels with non-zero values. The new mark value may be kept in a dedicated register that supplies the pixel with a new mark value; it may be updated in different ways as long as the value supplied is never repeated. For example, after each new mark value is used, it is incremented by 1 and stored again for the next marking.
Steps c1 to d1: judge whether the gray value of the left pixel is smaller than the threshold, the gray value of the right pixel is larger than the threshold, and the mark of the pixel above the right pixel is not zero; if so, mark the left pixel as zero, mark the right pixel with the mark value of the pixel above it, and execute step q1; otherwise execute step e1.
Steps e1 to f1: judge whether the gray values of the left and right pixels are both larger than the threshold and the mark of the pixel to the left of the left pixel is not zero; if so, mark the left and right pixels with that same mark value and execute step q1; otherwise execute step g1.
Here, the pixel to the left of the right pixel is simply the left pixel itself, so the pixel pair has only one pixel to its left, namely the pixel to the left of the left pixel.
Steps g1 to h1: judge whether the gray values of the left and right pixels are both larger than the threshold, the mark of the pixel to the left of the left pixel is zero, and the mark of the pixel above the left pixel is not zero; if so, mark the left and right pixels with the mark value of the pixel above the left pixel and execute step q1; otherwise execute step i1.
Steps i1 to j1: judge whether the gray values of the left and right pixels are both larger than the threshold, the mark of the pixel to the left of the left pixel is zero, and the mark of the pixel above the left pixel is also zero; if so, mark the left and right pixels with the same new mark value and execute step q1; otherwise execute step k1.
Steps k1 to l1: judge whether the gray value of the left pixel is larger than the threshold, the gray value of the right pixel is smaller than the threshold, and the mark of the pixel to the left of the left pixel is not zero; if so, mark the left pixel with that mark value, mark the right pixel as zero, and execute step q1; otherwise execute step m1.
Steps m1 to n1: judge whether the gray value of the left pixel is larger than the threshold, the gray value of the right pixel is smaller than the threshold, the mark of the pixel to the left of the left pixel is zero, and the mark of the pixel above the left pixel is not zero; if so, mark the left pixel with the mark value of the pixel above it, mark the right pixel as zero, and execute step q1; otherwise execute step o1.
Steps o1 to p1: judge whether the gray value of the left pixel is larger than the threshold, the gray value of the right pixel is smaller than the threshold, the mark of the pixel to the left of the left pixel is zero, and the mark of the pixel above the left pixel is zero; if so, mark the left pixel with a new mark value, mark the right pixel as zero, and execute step q1; otherwise execute step q1 directly.
Step q1: assign the mark values of the current two pixels to their positions in the upper mark parameter group, and assign the mark value of the right pixel to the left mark parameter.
Here, the upper mark parameter group may be stored in a buffer and the left mark parameter in a register. The left mark parameter is a single mark value, set to zero at initialization. The upper mark parameter group stores a group of mark values, one per pixel of a row, and may be implemented as an array: if a row has 10 pixels, the upper mark parameter group consists of 10 marks, each corresponding to one pixel in the row, all initialized to zero. Accordingly, during assignment, the mark value of the current pixel is assigned to the mark parameter corresponding to the current pixel's position in the group; for example, if a row has 10 pixels and the current pixel is the 5th in the row, its mark value is assigned to the 5th mark parameter in the upper mark parameter group. Likewise, during judgment, the mark value of the pixel above the current pixel is found by looking up the mark parameter corresponding to the current pixel's serial number in the upper mark parameter group.
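The marking steps above can be condensed into a short software sketch. This is an illustrative reconstruction, not the patent's FPGA logic: function and variable names are assumptions, and the nine cases are collapsed by noting that a foreground pixel takes, in order of priority, the mark to its left, the mark above it, or a fresh mark.

```python
def mark_pixel_pair(gl, gr, threshold, left_mark, up_marks, pos, next_mark):
    """Apply the patent's marking steps a1-q1 to one pair of pixels.

    gl, gr    -- gray values of the left and right pixel of the pair
    left_mark -- mark of the pixel to the left of the pair (0 = background)
    up_marks  -- marks of the previous row (the 'upper mark parameter group')
    pos       -- column index of the left pixel within the row
    next_mark -- next unused (new) mark value
    Returns (mark_left, mark_right, next_mark). Illustrative sketch only.
    """
    up_l, up_r = up_marks[pos], up_marks[pos + 1]
    if gl < threshold and gr >= threshold:           # steps a1-d1
        ml = 0
        mr = up_r if up_r != 0 else next_mark        # inherit from above or new
        if up_r == 0:
            next_mark += 1
    elif gl >= threshold and gr >= threshold:        # steps e1-j1
        ml = mr = left_mark or up_l or next_mark     # left, above, or new mark
        if left_mark == 0 and up_l == 0:
            next_mark += 1
    elif gl >= threshold and gr < threshold:         # steps k1-p1
        ml = left_mark or up_l or next_mark
        if left_mark == 0 and up_l == 0:
            next_mark += 1
        mr = 0
    else:                                            # both background pixels
        ml = mr = 0
    # step q1: update the upper mark parameter group for this pair's columns
    up_marks[pos], up_marks[pos + 1] = ml, mr
    return ml, mr, next_mark
```

After each call, the caller carries the returned right-pixel mark forward as `left_mark` for the next pair in the row, which is the second half of step q1.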
The two-way data equivalent merging specifically comprises the following steps:
Steps a2 to b2: judge whether the gray values of the left and right pixels are both larger than the threshold and the pixel above the left pixel has a mark that is not zero and not equal to the mark of the left pixel; if so, read the data in the storage space corresponding to the upper pixel's mark, accumulate it into the storage space corresponding to the left pixel's mark, and empty the storage space corresponding to the upper pixel's mark; otherwise execute step c2.
Steps c2 to d2: when the gray values of the left and right pixels are both larger than the threshold but the pixel above the left pixel has no mark, or has a mark equal to that of the left pixel, no merging is needed and no processing is performed.
The two-way data accumulation specifically comprises the following steps:
step a 3-b 3: judging whether the gray value of the left pixel is smaller than a threshold value and the gray value of the right pixel is larger than the threshold value, if so, assigning the product of the gray value of the right pixel and the coordinate value to a first accumulator, and assigning the gray value of the right pixel to a second accumulator; otherwise, performing step c 3;
step c 3-d 3: judging whether the gray value of the left pixel is larger than a threshold value and the gray value of the right pixel is smaller than the threshold value, if so, accumulating the product of the gray value and the coordinate value of the left pixel and the value of the first accumulator, and then updating the value of the first accumulator by using an accumulated value; accumulating the gray value of the left pixel and the value of the second accumulator, and then updating the value of the second accumulator by using the accumulated value; otherwise, go to step e 3;
step e 3-f 3: judging whether the gray values of the left pixel and the right pixel are both larger than a threshold value, if so, accumulating the product of the gray value and the coordinate value of the left pixel and the right pixel with the value of a first accumulator, and then updating the value of the first accumulator by using an accumulated value; accumulating the gray values of the left and right pixels and the value of a second accumulator, and then updating the value of the second accumulator by using the accumulated value; otherwise, no processing is carried out;
here, the first accumulator is used for accumulating the products of the gray values and the coordinate values of the left and right pixels; the second accumulator is used for accumulating the gray values of the left and right pixels.
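Steps a3 to f3 above amount to a three-way branch updating the two accumulators for each pixel pair. A minimal sketch follows (names are illustrative; the patent tracks products with "the coordinate values" generally, while this sketch shows a single coordinate per pixel for brevity — in practice one such accumulator pair is kept per coordinate axis):

```python
def accumulate_pair(gl, gr, x_l, x_r, threshold, acc1, acc2):
    """Steps a3-f3: update the two accumulators for one pixel pair.

    acc1 accumulates gray-value-times-coordinate products (the centroid
    numerator); acc2 accumulates gray values (the denominator).
    x_l, x_r are the coordinate values of the left and right pixels.
    Illustrative sketch only.
    """
    if gl < threshold and gr >= threshold:       # steps a3-b3: assign
        acc1 = gr * x_r                          # accumulators were just cleared
        acc2 = gr
    elif gl >= threshold and gr < threshold:     # steps c3-d3: accumulate left
        acc1 += gl * x_l
        acc2 += gl
    elif gl >= threshold and gr >= threshold:    # steps e3-f3: accumulate both
        acc1 += gl * x_l + gr * x_r
        acc2 += gl + gr
    return acc1, acc2                            # otherwise unchanged
```

Dividing the final first-accumulator value by the second yields the gray-weighted centroid coordinate, exactly the first-moment formula given earlier.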
And the star tracking unit 14 is used for tracking the star in the current view field according to the star information which is identified at the previous moment and acquiring the star information. Generally, after the star sensor obtains the identification result of the all-celestial star map, the working state of the star sensor is switched into a tracking mode, and the tracking mode is the main working mode of the star sensor.
To track all stars while avoiding the impact of raw image data on tracking speed and data update rate, the invention builds on double-path centroid imaging, which outputs centroid data directly instead of raw image data, and adopts the fast star tracking method for star sensors given in Chinese patent No. ZL200510084010.9, realizing feedback-free, non-window matching tracking based on position information. The tracking process is as follows: according to the position information of a star at the current moment, search for the tracked star whose position at the previous moment matches it; if one and only one star matches, the matching identification succeeds, and the information of the current star (right ascension, declination, magnitude, star label, and so on) is taken to be that of the matched star at the previous moment. This tracking method can track all stars in the field of view, and because the transmitted data volume is small, the tracking process is fast, guaranteeing both high attitude calculation accuracy and a high data update rate for the star sensor.
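The "one and only one" matching rule described above can be illustrated as position matching within a tolerance. This is an assumed sketch, not the patented method's exact procedure; the match radius and all names are illustrative:

```python
def match_tracked_stars(prev_stars, curr_centroids, radius):
    """Match current-frame centroids to previous-frame stars by position.

    prev_stars     -- list of (x, y, info) from the previous moment, where
                      info holds catalog data (RA, Dec, magnitude, label)
    curr_centroids -- list of (x, y) centroids from the current frame
    radius         -- match tolerance in pixels (an assumed parameter)
    A centroid is identified only if exactly one previous star lies within
    the radius; its catalog info is then carried over. Sketch only.
    """
    matched = []
    for cx, cy in curr_centroids:
        hits = [s for s in prev_stars
                if (s[0] - cx) ** 2 + (s[1] - cy) ** 2 <= radius ** 2]
        if len(hits) == 1:            # "one and only one" star matches
            matched.append((cx, cy, hits[0][2]))
    return matched
```

Because only centroid coordinates cross the FPGA/RISC boundary, this matching runs over at most a few dozen points per frame rather than megapixels, which is what makes the windowless full-field tracking fast.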
The star map recognition unit 15 is configured to recognize star maps from the whole celestial sphere and send the recognized star maps to the attitude calculation unit 16.
This unit adopts the improved triangle star map identification algorithm disclosed in Chinese patent No. ZL200410102585.4 and matches triangles by angular-distance matching, which avoids the large storage capacity otherwise required for storing triangles. The algorithm does not rely on accurate brightness information and is therefore highly feasible. Moreover, by storing star pairs and performing triangle identification with interval state marks, the speed of the algorithm is greatly improved: the all-sky identification time is only 0.5 s, whereas typical all-sky identification times exceed one second. This ensures a high data update rate of the star sensor during all-sky identification.
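The angular-distance test underlying the triangle matching can be sketched as follows. This is a minimal illustration of the general principle, not the patented algorithm itself: the star-pair storage and interval state marks it relies on are omitted, and the function names and tolerance are assumptions.

```python
import math

# Two observed stars form a candidate pair when the angle between their
# unit direction vectors matches a catalogued star-pair angle within a
# tolerance; three pairwise-matched angles identify a triangle.

def angular_distance(v1, v2):
    """Angle in radians between two unit direction vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    return math.acos(max(-1.0, min(1.0, dot)))  # clamp for safety

def pair_matches(obs_angle, catalog_angle, tol=1e-4):
    """True when an observed angle matches a catalogued one."""
    return abs(obs_angle - catalog_angle) <= tol
```

Matching on angles rather than on stored triangles means only the much smaller star-pair (angle) table needs to be kept, which is the storage saving the patent points to.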
The attitude calculation unit 16 uses the quaternion method widely used in star sensors to solve the accurate attitude of the star sensor from the information of all stars tracked by the star tracking unit 14 or of the stars identified over the whole celestial sphere by the star map recognition unit 15, and outputs the calculated attitude. How to calculate the attitude of the star sensor belongs to the prior art and is not detailed here.
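One standard way to realize such a quaternion solution is Davenport's q-method, sketched below as an assumption; the patent does not specify which quaternion-based estimator is used. The optimal attitude quaternion is the eigenvector of the largest eigenvalue of the 4x4 Davenport matrix built from matched star direction vectors.

```python
import numpy as np

def q_method(body_vecs, ref_vecs, weights=None):
    """Return the optimal attitude quaternion (x, y, z, w) mapping the
    reference (catalog) frame onto the sensor (body) frame.
    body_vecs: observed unit vectors in the sensor frame;
    ref_vecs:  matched catalog unit vectors in the inertial frame."""
    b = np.asarray(body_vecs, float)
    r = np.asarray(ref_vecs, float)
    w = np.ones(len(b)) if weights is None else np.asarray(weights, float)
    # Attitude profile matrix B and Davenport's K matrix
    B = sum(wi * np.outer(bi, ri) for wi, bi, ri in zip(w, b, r))
    S = B + B.T
    sigma = np.trace(B)
    z = np.array([B[1, 2] - B[2, 1],
                  B[2, 0] - B[0, 2],
                  B[0, 1] - B[1, 0]])
    K = np.zeros((4, 4))
    K[:3, :3] = S - sigma * np.eye(3)
    K[:3, 3] = z
    K[3, :3] = z
    K[3, 3] = sigma
    # Optimal quaternion = eigenvector of the largest eigenvalue of K
    vals, vecs = np.linalg.eigh(K)
    q = vecs[:, np.argmax(vals)]
    return q / np.linalg.norm(q)
```

When the sensor frame already coincides with the reference frame, the method returns the identity quaternion (up to sign), which is a convenient sanity check.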
The navigation star library 17 stores a navigation star catalog partitioned uniformly and without overlap. Specifically, the invention adopts the scheme disclosed in Chinese patent No. ZL200510002220.9, which subdivides the sky area in a rectangular coordinate system: the celestial sphere is uniformly divided into six regions by its inscribed cube; the lines connecting the center of the celestial sphere with the four vertices of each face of the cube form a cone, and the intersections of these cones with the sphere divide it into six blocks. Each of the six blocks is further divided into N × N small blocks, so that the whole celestial sphere is divided into 6 × N × N sub-blocks.
After the celestial sphere is divided in this way, the navigation star catalog is scanned so that each navigation star is classified into its corresponding sub-block, and a partition table is established. Thus, given the direction vector of the boresight or its right ascension and declination coordinates, the corresponding sub-block and its adjacent sub-blocks can be found quickly on the sky area.
To allow the navigation stars near a given navigation star to be retrieved quickly from its serial number, the serial number of the sub-block to which each navigation star belongs can be stored in the navigation star catalog. With the navigation star catalog and partition table constructed in this way, the catalog can be searched quickly within a neighborhood of an initial attitude (boresight direction) or of a navigation star number. This search method no longer needs to traverse the whole navigation star catalog; the average search range is only 9/486 = 1/54 of the catalog (a sub-block and its eight neighbors out of the 486 sub-blocks obtained with N = 9), so the search speed is greatly improved.
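The mapping from a boresight direction vector to a sub-block index can be sketched as below. The face-numbering convention and the choice N = 9 are assumptions for illustration (the patent only fixes the inscribed-cube scheme, not a particular indexing).

```python
# Illustrative sub-block lookup for the inscribed-cube sky partition:
# project the unit vector onto its dominant cube face, then locate it
# in that face's N x N grid.

N = 9  # assumed grid size per face; gives 6 * 9 * 9 = 486 sub-blocks

def subblock_index(v):
    """Map a direction vector to a sub-block index in [0, 6*N*N)."""
    x, y, z = v
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:             # +/- X face dominates
        face, u, t = (0 if x > 0 else 1), y / ax, z / ax
    elif ay >= az:                        # +/- Y face dominates
        face, u, t = (2 if y > 0 else 3), x / ay, z / ay
    else:                                 # +/- Z face dominates
        face, u, t = (4 if z > 0 else 5), x / az, y / az
    # u, t lie in [-1, 1]; quantize to grid cells, clamping the edge
    i = min(int((u + 1) / 2 * N), N - 1)
    j = min(int((t + 1) / 2 * N), N - 1)
    return face * N * N + j * N + i

print(subblock_index((1.0, 0.0, 0.0)))  # center of the +X face -> 40
```

A neighborhood search then examines this sub-block and its eight grid neighbors, giving the 9-of-486 average search range noted above.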
Based on the structure given in Fig. 1, the optical imaging system 10 images the starry sky onto the image sensor 11. Driven by the image sensor driving unit 12 in the FPGA signal processing unit, the image sensor 11 converts the optical signal into an electric signal and transmits it to the double-path centroid imaging unit 13. The double-path centroid imaging unit 13 extracts the position information of the stars in the observed field of view and outputs it to the subsequent RISC signal processing unit. The star map recognition unit 15 and the star tracking unit 14 in the RISC signal processing unit complete star map identification and star tracking according to the navigation star catalog stored in the navigation star library 17; that is, they find the match of each observed star in the navigation star library 17, realizing star pattern recognition and feedback-free, non-window tracking. Finally, the attitude calculation unit 16 calculates the three-axis attitude of the star sensor from the direction vector information of the matched star pairs and outputs the attitude information.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention.

Claims (5)

1. An ultra-high precision star sensor, characterized by comprising an optical imaging system, an image sensor, an image sensor driving unit, a double-path centroid imaging unit, a star tracking unit, a star map identification unit, an attitude calculation unit and a navigation star library; wherein,
an optical imaging system for imaging a starry sky image on an image sensor;
the image sensor is used for converting the optical signal into an electric signal under the driving of the image sensor driving unit and transmitting the electric signal to the double-path centroid imaging unit;
an image sensor driving unit for driving the image sensor;
the double-path centroid imaging unit is used for carrying out double-path pixel data processing on two paths of pixels which are read in simultaneously, and outputting centroid coordinates of the light spot image to the star tracking unit and the star map identification unit after the complete image is processed;
the star tracking unit is used for tracking the star in the current view field according to the star information identified at the previous moment and acquiring the star information;
the star map identification unit is used for identifying star maps from the whole celestial sphere;
the attitude calculation unit is used for solving the accurate attitude of the star sensor according to the information of the stars recognized over the whole celestial sphere or the information of all tracked stars, and outputting the calculated attitude of the star sensor;
the navigation star library is used for storing a navigation star list;
the image sensor is a large-area-array image sensor with 2048 x 2048 pixels.
2. The star sensor according to claim 1, wherein the image sensor driving unit and the dual path centroid imaging unit are integrated on one FPGA; the star tracking unit, the star map recognition unit and the attitude calculation unit are integrated on a RISC.
3. The star sensor of claim 1, wherein the dual path centroid imaging unit further comprises: the device comprises a gray value reading module, a gray value comparison module, a two-way pixel data processing module, a background pixel processing module, a first judgment module, a storage module, a second judgment module and a light spot image centroid calculation module; wherein,
the gray value reading module is used for simultaneously reading the gray values of the two paths of pixels and sending the read gray values into the gray value comparison module;
the gray value comparison module is used for comparing the gray values of the two paths of pixels sent by the gray value reading module with a preset threshold respectively and finishing the processing of the two paths of pixels according to the comparison result;
the two-way pixel data processing module is used for finishing two-way pixel marking, two-way data equivalent merging and two-way data accumulation, then, sending the processed data to the storage module when the gray values of the two-way pixels are larger than a preset threshold value, and sending the processed data to the first judging module when the gray value of the left pixel in the two-way pixels is larger than the preset threshold value;
further, the two-way pixel data processing module comprises: the device comprises a marking unit, a merging unit and an accumulation unit; wherein,
the marking unit is used for completing double-path pixel marking, marking the left pixel and the right pixel according to the comparison result of the gray value of the left pixel and the gray value of the right pixel with a preset threshold value, and marking the pixels with the same comparison result as equivalent marks;
the merging unit is used for completing two-path data equivalent merging and completing merging of equivalent data according to the comparison result of the gray value of the left/right pixel and a preset threshold;
the accumulation unit is used for completing double-path data accumulation, completing accumulation of the gray values of the left and right pixels according to the comparison result of the gray values of the left and right pixels and a preset threshold value, and completing accumulation of the products of the gray values of the left and right pixels and the coordinate values;
the background pixel processing module is used for marking the current two paths of pixels as background pixels when the gray values of the left and right pixels are smaller than a preset threshold value, and assigning the marking values to corresponding parameters;
the first judgment module is used for judging whether a left pixel of a left pixel in the two paths of pixels has a mark value or not;
the storage module is used for accumulating the value of the accumulation unit into the data storage corresponding to the equivalent mark value and clearing the accumulation unit;
the second judgment module is used for judging whether the whole image is processed;
and the light spot image centroid calculation module is used for calculating and outputting the coordinate value of the light spot image centroid after the complete image is processed.
4. The star sensor of claim 3, wherein the accumulation unit comprises a first accumulator for accumulating the product of the gray scale value of the left and right pixels and the coordinate value, and a second accumulator for accumulating the gray scale value of the left and right pixels.
5. The star sensor of claim 3, wherein the star tracking unit implements feedback-free, non-window matching tracking based on position information.
CN2009101718795A 2008-09-17 2009-09-16 Ultra-high accuracy star sensor Expired - Fee Related CN101676687B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009101718795A CN101676687B (en) 2008-09-17 2009-09-16 Ultra-high accuracy star sensor

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN200810222490.4 2008-09-17
CNA2008102224904A CN101363733A (en) 2008-09-17 2008-09-17 Ultra-high accuracy star sensor
CN2009101718795A CN101676687B (en) 2008-09-17 2009-09-16 Ultra-high accuracy star sensor

Publications (2)

Publication Number Publication Date
CN101676687A CN101676687A (en) 2010-03-24
CN101676687B true CN101676687B (en) 2011-02-02

Family

ID=40390233

Family Applications (2)

Application Number Title Priority Date Filing Date
CNA2008102224904A Pending CN101363733A (en) 2008-09-17 2008-09-17 Ultra-high accuracy star sensor
CN2009101718795A Expired - Fee Related CN101676687B (en) 2008-09-17 2009-09-16 Ultra-high accuracy star sensor

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CNA2008102224904A Pending CN101363733A (en) 2008-09-17 2008-09-17 Ultra-high accuracy star sensor

Country Status (1)

Country Link
CN (2) CN101363733A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102410844A (en) * 2011-08-12 2012-04-11 北京航空航天大学 Method and device for correcting non-uniformity of image of high-dynamic star sensor

Families Citing this family (16)

Publication number Priority date Publication date Assignee Title
CN102175259B (en) * 2010-12-31 2012-11-14 北京控制工程研究所 Autonomous navigation simulation test system based on earth-sun-moon integrated sensor
CN102155945B (en) * 2011-03-08 2012-12-05 哈尔滨工业大学 Method for improving dynamic performance of CCD star sensor
CN102252664B (en) * 2011-04-18 2013-01-23 北京航空航天大学 Fixed star gyroscope and implementation method thereof
CN102252678B (en) * 2011-04-18 2013-01-23 北京航空航天大学 High dynamic and high update rate star sensor and implementation method thereof
CN102759348B (en) * 2012-07-18 2014-04-16 宁波舜宇电子有限公司 System for automatically identifying coordinates of shooting sites by using star-field digital photography
CN102927973B (en) * 2012-10-24 2015-07-08 北京控制工程研究所 Quick edge locating method of sub pixel image of target celestial body for deep space exploration autonomous navigation
CN103323027B (en) * 2013-05-30 2015-07-08 北京控制工程研究所 Star point reconstruction-based star sensor dynamic-compensation method
CN103438905B (en) * 2013-08-30 2016-01-20 中国人民解放军第二炮兵工程大学 A kind of star sensor star catalogue complete evaluation method
CN103968845B (en) * 2014-04-15 2016-08-31 北京控制工程研究所 A kind of DSP Yu FPGA parallel multi-mode star image processing method for star sensor
RU2585179C1 (en) * 2014-11-14 2016-05-27 Общество с ограниченной ответственностью "Азмерит", ООО "Азмерит" Method of improving accuracy of determining celestial orientation and prolonged maintenance of high accuracy of determining orientation and apparatus therefor
CN105243075B (en) * 2015-08-07 2018-08-31 北京控制工程研究所 A kind of star sensor whole day ball greatly organizes the improvement searching method of identification
CN107507123B (en) * 2017-06-20 2018-10-02 上海航天控制技术研究所 The quick wave door image processing system of star sensor and method
CN107843254B (en) * 2017-10-29 2020-08-14 上海航天控制技术研究所 Data processing unit of space star sensor
CN108362292A (en) * 2018-02-13 2018-08-03 上海航天控制技术研究所 A kind of Mars navigation sensor mounting arrangement optimization method based on genetic algorithm
CN111504329B (en) * 2020-06-12 2022-07-29 上海航天控制技术研究所 High-speed hardware platform of navigation sensor based on FPGA and DSP
CN112200855B (en) * 2020-09-29 2022-11-22 中国科学院长春光学精密机械与物理研究所 Star point centroid extraction method of multi-channel image of star sensor and star sensor

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1796938A (en) * 2004-12-28 2006-07-05 北京航空航天大学 Method for recognising star map based on triangle character
CN1808524A (en) * 2005-01-18 2006-07-26 北京航空航天大学 Method of dividing navigational star table
CN101363718A (en) * 2008-09-17 2009-02-11 北京航空航天大学 Two-way mass center tracking imaging method and device


Non-Patent Citations (5)

Title
JP 2004-340784 A, 2004.12.02
Li Jie. Research on Key Technologies of the APS Star Sensor. China Doctoral Dissertations Full-text Database, 2006, pp. 4-12.
Hao Xuetao, Jiang Jie, Zhang Guangjun. MOS star sensor image driving and real-time star point positioning algorithm. Journal of Beijing University of Aeronautics and Astronautics, 2005, Vol. 31, No. 4, pp. 381-384.


Also Published As

Publication number Publication date
CN101676687A (en) 2010-03-24
CN101363733A (en) 2009-02-11

Similar Documents

Publication Publication Date Title
CN101676687B (en) Ultra-high accuracy star sensor
CN100580365C (en) Two-way mass center tracking imaging method and device
CN111340797A (en) Laser radar and binocular camera data fusion detection method and system
CN115147723B (en) Inland ship identification and ranging method, inland ship identification and ranging system, medium, equipment and terminal
CN107491071B (en) Intelligent multi-robot cooperative mapping system and method thereof
CN103075998B (en) A kind of monocular extraterrestrial target range finding angle-measuring method
CN102496015A (en) High-precision method for quickly positioning centers of two-dimensional Gaussian distribution spot images
CN111830953A (en) Vehicle self-positioning method, device and system
CN102155945B (en) Method for improving dynamic performance of CCD star sensor
CN103837085B (en) The displacement of targets device for measuring vector quantity demarcated based on laser tracker pointwise and method
CN108534782A (en) A kind of instant localization method of terrestrial reference map vehicle based on binocular vision system
CN103017654A (en) Multi-path centroid positioning method and device for light spot image
CN102607526A (en) Target posture measuring method based on binocular vision under double mediums
CN103674021A (en) Integrated navigation system and method based on SINS (Strapdown Inertial Navigation System) and star sensor
CN113377888A (en) Training target detection model and method for detecting target
CN109407115B (en) Laser radar-based pavement extraction system and extraction method thereof
CN101701822A (en) Star tracking method of star sensor based on correlation of optical joint and transformation
CN114034288A (en) Seabed microtopography laser line scanning three-dimensional detection method and system
CN107991665A (en) It is a kind of based on fixed-focus camera to target three-dimensional coordinate method for continuous measuring
CN100357703C (en) Fast tracting method of star sensor
CN115128628A (en) Road grid map construction method based on laser SLAM and monocular vision
CN113218577A (en) Outfield measurement method for star point centroid position precision of star sensor
CN104964684B (en) A kind of high dynamically lower fast tracting method
CN103630299A (en) Positioning method and device for real time centroid of large-pixel light spot image
CN103791901B (en) A kind of star sensor data processes system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20110202

Termination date: 20210916

CF01 Termination of patent right due to non-payment of annual fee