CN107883947B - Star sensor star map identification method based on convolutional neural network - Google Patents


Info

Publication number
CN107883947B
CN107883947B (application CN201711458120.6A)
Authority
CN
China
Prior art keywords
star
stars
constellation
neural network
map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711458120.6A
Other languages
Chinese (zh)
Other versions
CN107883947A (en)
Inventor
吴峰
朱锡芳
徐也
相入喜
于秋阳
缪志康
吴涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changzhou Institute of Technology
Original Assignee
Changzhou Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changzhou Institute of Technology filed Critical Changzhou Institute of Technology
Priority to CN201711458120.6A priority Critical patent/CN107883947B/en
Publication of CN107883947A publication Critical patent/CN107883947A/en
Application granted granted Critical
Publication of CN107883947B publication Critical patent/CN107883947B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/02 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by astronomical means
    • G01C21/025 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by astronomical means with the use of star trackers
    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 — Instruments for performing navigational calculations

Abstract

The invention discloses a star sensor star map identification method based on a convolutional neural network, which comprises the following steps: performing magnitude filtering on the original star catalog to establish a navigation star library, clustering all celestial navigation stars into constellations and numbering the constellations; the sample library consists of simulated star maps and, for each map, the number of the constellation containing the most stars. A sparse matrix replaces each original star map, and the star maps of the sample library are input into a convolutional neural network for training. A captured star map is processed by star image extraction, converted into a sparse matrix, and input into the convolutional neural network, which performs coarse-attitude identification to obtain the approximate boresight direction; the fixed stars in the field of view are then identified with a local-sky star map identification algorithm. Because the trained convolutional neural network identifies the coarse-attitude all-sky star map directly, the navigation star library need not be searched for this step, and the subsequent local-sky identification searches only a small part of the database. The convolutional neural network can autonomously extract features from the raw image; applied to star map identification, it offers strong resistance to noise and pseudo-stars.

Description

Star sensor star map identification method based on convolutional neural network
Technical Field
The invention belongs to the technical field of astronomical navigation, and relates to a star map identification method for a star sensor.
Background
The star map recognition algorithm is one of the core technologies of star sensors. Over recent decades, a great deal of research on all-sky autonomous star map recognition for spacecraft has been carried out in China and abroad, and many algorithms have been proposed, chiefly: angular distance algorithms, triangle algorithms, grid methods, binary tree methods, algorithms based on the pyramid model, and others.
To overcome the redundant triangle matching caused by the low feature dimension of the triangle identification algorithm, Lexinluo, Yangxihuanhua and colleagues at the college of electro-optical engineering of Changchun university established an angular distance feature library organized by declination band, selected optimized triangle constraint conditions, and performed star map identification with an improved triangle method. Their simulation experiments report an identification success rate of 99.57% and an average single-identification time of about 4.28 ms. At scherchia chensinensis university, an improved star map identification algorithm based on singular value decomposition was proposed to address the possibly low all-sky coverage of singular-value-decomposition identification when the boresight moves discontinuously; compared with the traditional triangle algorithm in a laboratory environment in terms of storage, average running time, and recognition rate, the improved algorithm proved superior. Nevertheless, the noise and pseudo-star resistance of these algorithms still needs improvement.
A star map identification algorithm based on genetic algorithms was proposed by Plum Macros in 2000. Its main idea is as follows: first, initialize a population from the angular distance vector obtained for each observed star in the sensor's field of view and select the control parameters; then search the navigation star database according to the angular distance information carried by each individual and compute its fitness value; finally, judge whether the match succeeds. The algorithm has good robustness and real-time performance, and the required navigation star database is small. However, an optimization must be run for every identification, and its accuracy and speed depend on the optimization parameters.
Researchers at the National University of Defense Technology designed two star map recognition algorithms: one based on primary-star matching and one based on triangle feature vectors and eigenvalues. The primary-star matching algorithm improves reliability and per-identification accuracy by adding an angular matching mode. The algorithm based on triangle feature vectors and eigenvalues uses them as identification features, improves interference resistance through multiple comparisons, and raises the identification success rate for a single star map.
Disclosure of Invention
The purpose of the invention is as follows: to address the shortcomings of the prior art, star map simulation is combined with a convolutional neural network to provide a star sensor star map identification method based on a convolutional neural network that saves navigation star database search time, improves the identification success rate, and strengthens algorithm robustness.
The technical scheme of the invention is as follows:
the star sensor star map identification method based on the convolutional neural network comprises the following steps:
step 1: establishing a sample library; performing magnitude filtering on the original star catalog to establish a navigation star library, clustering all celestial navigation stars into constellations with a constellation clustering method and numbering the constellations; the sample library consists of simulated star maps and, for each map, the number of the constellation containing the most stars;
step 2: establishing and training a convolutional neural network; the input is a star map and the output is the number of the constellation with the most stars in that map; a sparse matrix replaces each original star map, and the star maps of the sample library are input into the convolutional neural network for training;
step 3: carrying out star map identification; the captured star map is processed by star image extraction, converted into a sparse matrix, and input into the convolutional neural network for coarse-attitude identification to obtain the approximate boresight direction; fixed stars in the field of view are then identified with a local-sky star map identification algorithm.
Further, establishing the sample library: first, according to the limiting magnitude of the star sensor, perform magnitude filtering on the original star catalog, deleting double stars, variable stars, and stars fainter than the limiting magnitude; then cluster all celestial navigation stars into constellations with the constellation clustering method and number the constellations; finally, traverse the whole celestial sphere, generate a simulated star map for each optical axis direction and attitude, and record the number of the constellation with the most stars in the field of view; the sample library consists of the star maps and the corresponding constellation numbers.
Further, establishing the convolutional neural network: the network comprises 5 convolutional layers, 5 pooling layers and 2 fully connected layers, with ReLU (Rectified Linear Unit) as the activation function; the input is a star map represented by a triplet sparse matrix, and the output is the number of the constellation with the most stars in the current map; the coordinates of the star images in the body coordinate system, obtained by star image extraction, are converted into a 1024×1024 triplet sparse matrix in which each star point occupies 1 pixel, replacing the original star map.
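As a rough sanity check of the architecture just described, the sketch below traces the spatial size of a 1024×1024 input through the 5 convolution + pooling stages. The kernel sizes follow the text (7×7 for the second convolutional layer, 5×5 elsewhere); 'valid' convolution with stride 1 and non-overlapping 2×2 pooling are assumptions, since the patent does not state padding or stride.

```python
def conv_out(size, kernel):
    return size - kernel + 1          # 'valid' convolution, stride 1 (assumption)

def pool_out(size, window=2):
    return size // window             # non-overlapping 2x2 pooling (assumption)

kernels = [5, 7, 5, 5, 5]             # kernel side per convolutional layer
size = 1024
for k in kernels:
    size = pool_out(conv_out(size, k))
print(size)                           # side length entering the fully connected layers
```

Under these assumptions the feature maps shrink 1024 → 510 → 252 → 124 → 60 → 28, i.e. a 28×28 map reaches the fully connected layers.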
Further, training the convolutional neural network: randomly order the star maps in the sample library; one by one, compute the position coordinates of the star points in the body coordinate system with a star image extraction algorithm, convert them into a 1024×1024 triplet sparse matrix, input it into the convolutional neural network, and train the network with the number of the constellation containing the most stars as the output; the trained network is used for coarse-attitude star map identification.
Further, carrying out star map identification: first, compute the position coordinates of the star points of a captured star map in the body coordinate system with the star image extraction algorithm, convert them into a 1024×1024 triplet sparse matrix, and input it into the convolutional neural network, which outputs the number of the constellation with the most stars; obtain the inertial coordinates of one star of that constellation from the navigation star library and compute the approximate direction of the current field of view; then identify the fixed stars in the star map with a local-sky star map identification method.
Further, the constellation clustering method specifically comprises the following steps:
1) setting parameters: take the clustering angle θ as one eighth of the field angle and the threshold t as one thousandth of θ; define a variable cluster and initialize it to 0; let there be M navigation stars in total, define an array cnum of M elements and initialize it to 0; each element of cnum corresponds to one navigation star and records the number of the constellation it belongs to, a value of 0 meaning the star has not yet been clustered;
2) start with the optical axis pointing at right ascension α = 0° and declination δ = −90°;
3) for the current optical axis direction (αi, δi), count the fixed stars in the field of view, of which there are N; define an array flag of N elements, all initialized to 0; copy the constellation numbers of the stars in the current field of view from cnum into flag;
4) count the number of elements of flag whose value is 0, denoted Nvis; if Nvis = 0, all stars in the current field of view have been clustered into constellations, so execute step 9); if Nvis > 0, i.e. some stars remain unprocessed, select any one of them, S0, as the starting position; its coordinates in the inertial coordinate system are (α0, δ0) and its direction cosine vector V0 is

V0 = (cos δ0 cos α0, cos δ0 sin α0, sin δ0)ᵀ    (1)

5) count the stars whose angular distance to S0 is smaller than θ; suppose there are k of them, S0 included, with right ascensions and declinations (αj, δj), j ∈ [1, k]; their direction cosine vectors Vj are computed as in equation (1), and their angular distances to S0 as in equation (2):

θj = arccos(V0 · Vj)    (2)
6) if the constellation numbers of the k stars are all 0, increase cluster by 1, assign it to the corresponding elements of flag, and execute step 7); otherwise, take the minimum of the non-zero constellation numbers among the k stars and call it fm; if any star already assigned a constellation number in the current field of view shares a number with one of the k stars, reassign that number to fm, and number the k stars fm as well, so that all connected neighbouring stars are clustered into the same constellation with a unified number;
7) compute the direction cosine vector of the centre position S of the k stars, denoted Vnew, according to equation (3):

Vnew = (Σⱼ₌₁ᵏ Vj) / ‖Σⱼ₌₁ᵏ Vj‖    (3)

where ‖·‖ denotes the Euclidean norm;
8) if the angular distance between the starting position S0 and the new centre position S is smaller than t, the constellation clustering of this group in the current field of view is finished, so execute step 4); otherwise assign Vnew to V0, move S0 to S, and return to step 5);
9) assign the element values of flag to the corresponding elements of cnum, point the optical axis in the next direction, and return to step 3) until the traversal is finished.
Further, let the focal length, row-direction field angle and column-direction field angle of the star map be f, wa and wb respectively; if the coordinates of a star image in the body coordinate system are (x, y, −f), it is converted into the sparse matrix with row and column numbers

row = round( N (x + f·tan(wa/2)) / (2 f·tan(wa/2)) ),  col = round( N (y + f·tan(wb/2)) / (2 f·tan(wb/2)) ),  N = 1024,

where the round function rounds to the nearest integer.
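A minimal sketch of this body-frame-to-sparse-matrix conversion, assuming a pinhole projection in which ±f·tan(w/2) spans the full matrix (the patent's exact formula appears as an image that is not reproduced in this text):

```python
import math

def star_to_rowcol(x, y, f, wa, wb, n=1024):
    """Map a star image at (x, y, -f) in the body frame to row/column indices
    of an n x n sparse matrix. Assumption: the focal-plane half-extent
    f*tan(w/2) corresponds to the matrix edge."""
    half_row = f * math.tan(wa / 2.0)   # focal-plane half-extent, row direction
    half_col = f * math.tan(wb / 2.0)   # focal-plane half-extent, column direction
    row = round((x + half_row) / (2.0 * half_row) * n)
    col = round((y + half_col) / (2.0 * half_col) * n)
    return row, col

# A star on the boresight lands at the matrix centre.
print(star_to_rowcol(0.0, 0.0, 43.56, math.radians(20), math.radians(20)))
```

With the embodiment's values (f = 43.56 mm, 20°×20° field, n = 1024), a boresight star maps to (512, 512).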
Compared with the prior art, the invention has the following characteristics:
(1) The navigation star database search time is short. The trained convolutional neural network performs coarse-attitude all-sky star map recognition without searching the navigation star library, and the subsequent local-sky star map identification searches only a small part of the database.
(2) Strong robustness. The convolutional neural network autonomously extracts features from the raw image; applied to star map identification, it offers strong resistance to noise and pseudo-stars.
Drawings
FIG. 1 is a flow chart of a star map identification method of the present invention.
Fig. 2 is a constellation clustering flow chart.
FIG. 3 is the navigation star distribution of the embodiment when the optical axis points at coordinates (120°, 20°) on the celestial sphere.
FIG. 4 is the navigation star distribution of the embodiment when the optical axis points at coordinates (120°, 20°) on the celestial sphere and the field of view is rotated 30° about the optical axis.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings.
The invention discloses a star sensor star map identification method based on a convolutional neural network, which is shown in figure 1 and comprises the following steps:
(1) Establish the sample library. First, according to the limiting magnitude of the star sensor, perform magnitude filtering on the original star catalog and delete double stars, variable stars, and stars fainter than the limiting magnitude. Then cluster all celestial navigation stars into constellations with the constellation clustering method shown in fig. 2 and number the constellations. Finally, traverse the whole celestial sphere, generate a simulated star map for each optical axis direction and attitude, and record the number of the constellation with the most stars in the field of view. The sample library consists of the star maps and the corresponding constellation numbers.
(2) Establish the convolutional neural network. The network comprises 5 convolutional layers, 5 pooling layers and 2 fully connected layers, with ReLU (Rectified Linear Unit) as the activation function. The input is a star map represented by a triplet sparse matrix, and the output is the number of the constellation with the most stars in the current map. Because the star map captured by the star sensor has a high resolution, inputting it directly into the convolutional neural network is computationally inefficient. Therefore the coordinates of the star images in the body coordinate system, obtained by star image extraction, are converted into a 1024×1024 triplet sparse matrix in which each star point occupies 1 pixel, replacing the original star map.
(3) Train the convolutional neural network. Randomly order the star maps in the sample library; one by one, compute the position coordinates of the star points in the body coordinate system with the star image extraction algorithm, convert them into a 1024×1024 triplet sparse matrix, input it into the convolutional neural network, and use the number of the constellation containing the most stars as the output. The trained network is used for coarse-attitude star map identification.
(4) Carry out star map identification. First, compute the position coordinates of the star points of a captured star map in the body coordinate system with the star image extraction algorithm, convert them into a 1024×1024 triplet sparse matrix, and input it into the convolutional neural network, which outputs the number of the constellation with the most stars. Obtain the inertial coordinates of one star of that constellation from the navigation star library and compute the approximate direction of the current field of view. Then identify the fixed stars in the star map with a local-sky star map identification method.
The star sensor star map identification method based on the convolutional neural network is implemented as follows.
In the first step, a sample library is established. (1) According to the limiting magnitude of the star sensor, perform magnitude filtering on the original star catalog, delete double stars, variable stars, and stars fainter than the limiting magnitude, and take the remaining stars as navigation stars. (2) Cluster all celestial navigation stars into constellations with the constellation clustering method and number the constellations. (3) Traverse the whole celestial sphere, select 10 attitudes for each optical axis direction, generate the simulated star maps, and record the number of the constellation with the most stars in the field of view. The sample library consists of the star maps and the corresponding constellation numbers. The constellation clustering method proceeds as follows.
(1) Set the parameters. Take the clustering angle θ as one eighth of the field angle and the threshold t as one thousandth of θ. Define a variable cluster and initialize it to 0. Let there be M navigation stars, define an array cnum of M elements and initialize it to 0. Each element of cnum corresponds to one navigation star and records the number of the constellation it belongs to; a value of 0 means the star has not yet been clustered.
(2) Start with the optical axis pointing at right ascension α = 0° and declination δ = −90°.
(3) For the current optical axis direction (αi, δi), count the N fixed stars in the field of view. Define an array flag of N elements, all initialized to 0. Copy the constellation numbers of the stars in the current field of view from cnum into flag.
(4) Count the number of elements of flag whose value is 0, denoted Nvis. If Nvis = 0, all stars in the current field of view have been clustered into constellations; execute step (9). If Nvis > 0, i.e. some stars remain unprocessed, select any one of them, S0, as the starting position; its coordinates in the inertial coordinate system are (α0, δ0) and its direction cosine vector V0 is

V0 = (cos δ0 cos α0, cos δ0 sin α0, sin δ0)ᵀ    (1)

(5) Count the stars whose angular distance to S0 is smaller than θ; suppose there are k of them, S0 included, with right ascensions and declinations (αj, δj), j ∈ [1, k]. Their direction cosine vectors Vj are computed as in equation (1), and their angular distances to S0 as in equation (2):

θj = arccos(V0 · Vj)    (2)
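The direction cosine vector of equation (1) and the angular distance of equation (2) can be computed as follows, using the standard celestial-to-Cartesian conversion:

```python
import numpy as np

def direction_cosine(alpha, delta):
    # Unit vector of a star at right ascension alpha and declination delta
    # (radians) -- the standard form of equation (1).
    return np.array([np.cos(delta) * np.cos(alpha),
                     np.cos(delta) * np.sin(alpha),
                     np.sin(delta)])

def angular_distance(v0, v1):
    # Angle between two direction cosine vectors -- equation (2); the clip
    # guards against rounding drifting slightly outside [-1, 1].
    return np.arccos(np.clip(np.dot(v0, v1), -1.0, 1.0))

v_a = direction_cosine(0.0, 0.0)
v_b = direction_cosine(np.pi / 2, 0.0)
print(angular_distance(v_a, v_b))   # ~pi/2 for two stars 90 degrees apart
```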
(6) If the constellation numbers of the k stars are all 0, increase cluster by 1, assign it to the corresponding elements of flag, and execute step (7). Otherwise, take the minimum of the non-zero constellation numbers among the k stars and call it fm. If any star already assigned a constellation number in the current field of view shares a number with one of the k stars, reassign that number to fm, and number the k stars fm as well, so that all connected neighbouring stars are clustered into the same constellation with a unified number.
(7) Compute the direction cosine vector of the centre position S of the k stars, denoted Vnew, according to equation (3):

Vnew = (Σⱼ₌₁ᵏ Vj) / ‖Σⱼ₌₁ᵏ Vj‖    (3)

where ‖·‖ denotes the Euclidean norm.
(8) If the angular distance between the starting position S0 and the new centre position S is smaller than t, the constellation clustering of this group in the current field of view is finished; execute step (4). Otherwise assign Vnew to V0, move S0 to S, and return to step (5).
(9) Assign the element values of flag to the corresponding elements of cnum, point the optical axis in the next direction, and return to step (3) until the traversal is finished.
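The clustering loop above can be sketched in simplified form. The variant below replaces the iterative centre-shifting of steps (4)–(8) with a transitive flood fill over neighbours closer than θ; this is an illustrative assumption, not the patent's exact procedure, but on well-separated groups it yields the same "connected neighbours share a constellation number" outcome:

```python
import numpy as np

def cluster_constellations(alphas, deltas, theta):
    """Assign a constellation number to each star (right ascensions alphas,
    declinations deltas, radians): stars reachable through chains of
    neighbours closer than theta share a number. Simplified sketch of the
    patent's clustering idea."""
    v = np.column_stack([np.cos(deltas) * np.cos(alphas),
                         np.cos(deltas) * np.sin(alphas),
                         np.sin(deltas)])
    n = len(alphas)
    cnum = np.zeros(n, dtype=int)        # 0 = not yet clustered
    cluster = 0
    for i in range(n):
        if cnum[i]:
            continue
        cluster += 1
        stack = [i]
        while stack:                      # flood fill over neighbours < theta
            j = stack.pop()
            if cnum[j]:
                continue
            cnum[j] = cluster
            ang = np.arccos(np.clip(v @ v[j], -1.0, 1.0))
            stack.extend(np.flatnonzero((ang < theta) & (cnum == 0)))
    return cnum
```

For example, four stars on the equator at right ascensions 0, 0.01, 3.0 and 3.01 rad with θ = 0.1 rad fall into two constellations, numbered 1, 1, 2, 2.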
In the second step, the convolutional neural network comprises 5 convolutional layers, 5 pooling layers and 2 fully connected layers. The numbers of feature maps of the convolutional layers are 6, 12, 18, 12 and 6 respectively; the second convolutional layer uses 7×7 convolution kernels and the rest use 5×5 kernels; the pooling layers use 2×2 sliding windows; and ReLU (Rectified Linear Unit) is the activation function. The input is a star map represented by a triplet sparse matrix, and the output is the number of the constellation with the most stars in the current map. Because the star map captured by the star sensor has a high resolution, inputting it directly into the convolutional neural network is computationally inefficient. Therefore the coordinates of the star images in the body coordinate system, obtained by star image extraction, are converted into a 1024×1024 triplet sparse matrix in which each star point occupies 1 pixel, replacing the original star map. Let the focal length, row-direction field angle and column-direction field angle of the captured star map be f, wa and wb respectively; if the coordinates of a star image in the body coordinate system are (x, y, −f), it is converted into the sparse matrix with row and column numbers

row = round( N (x + f·tan(wa/2)) / (2 f·tan(wa/2)) ),  col = round( N (y + f·tan(wb/2)) / (2 f·tan(wb/2)) ),  N = 1024,

where the round function rounds to the nearest integer.
In the third step, the convolutional neural network is trained. Randomly order the star maps in the sample library; one by one, compute the position coordinates of the star points in the body coordinate system with the star image extraction algorithm, convert them into a 1024×1024 triplet sparse matrix, input it into the convolutional neural network, and use the number of the constellation containing the most stars as the output. The trained network is used for coarse-attitude star map identification.
In the fourth step, star map identification is carried out. First, compute the position coordinates of the star points of a captured star map in the body coordinate system with the star image extraction algorithm, convert them into a 1024×1024 triplet sparse matrix, and input it into the convolutional neural network, which outputs the number of the constellation with the most stars. Obtain the inertial coordinates of one star of that constellation from the navigation star library and compute the approximate direction of the current field of view. Then identify the fixed stars in the star map with a local-sky star map identification method.
The invention is further described with reference to the following figures and examples.
Example one
The SAO catalog is selected as the original star catalog, with a limiting magnitude of 5.2, a 20°×20° field of view, a star sensor detector resolution of 1024×1024 pixels, and a focal length of 43.56 mm. Magnitude filtering of the original catalog yields 1607 stars, all of which are selected as navigation stars. When the optical axis points at coordinates (120°, 20°) on the celestial sphere, the distribution of the navigation stars within the field of view is as shown in fig. 3; there are 16 stars in total, detailed in table 1.
After the constellation clustering traversal of the whole celestial sphere, the constellation numbers are as shown in table 1. The constellation with the most stars in the field of view consists of the 7 stars numbered 9, 10, 12, 13, 14, 15 and 16; its number is 410.
When the optical axis direction is unchanged and the field of view is rotated 30° about the optical axis, the resulting star map is as shown in fig. 4, with 15 stars in total; their positions within the field of view are given in table 2. Some of the stars of fig. 3 have left the field of view, and the stars numbered 17 and 18 are new. The constellation with the most stars now consists of the 5 stars numbered 9, 10, 12, 14 and 16; its number is still 410.
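Rotating the field of view about the optical axis corresponds to a plane rotation of the focal-plane (x, y) coordinates, which can be sketched as follows (an illustrative helper; the embodiment only states that the field is rotated 30°):

```python
import math

def rotate_about_boresight(x, y, angle_deg):
    # Rotate a focal-plane star point (x, y) about the optical axis
    # by angle_deg, counter-clockwise.
    a = math.radians(angle_deg)
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a))

print(rotate_about_boresight(1.0, 0.0, 90.0))  # a point on +x moves onto +y
```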
TABLE 1 Star data in the field of view
After star image extraction, the coordinate values in the body coordinate system are converted into a triplet sparse matrix; the row and column numbers corresponding to the X and Y directions are listed in tables 1 and 2. The triplet sparse matrix representing the star map and the number of the constellation with the most stars are the input and output of the convolutional neural network. The network parameters are set as follows: 5 convolutional layers, 5 pooling layers and 2 fully connected layers; the numbers of feature maps of the convolutional layers are 6, 12, 18, 12 and 6; the second convolutional layer uses 7×7 convolution kernels and the rest use 5×5 kernels; the pooling layers use 2×2 sliding windows; ReLU (Rectified Linear Unit) is the activation function. All convolution kernel elements and weights are initialized to random numbers. The trained convolutional neural network is used for coarse-attitude star map identification and outputs the number of the constellation with the most stars.
TABLE 2 Star data after a 30° rotation of the field of view
When the star map of fig. 3 is input, the output is 410. Since the constellation numbered 410 contains the star numbered 12, whose right ascension and declination in the inertial system are 2.11 and 0.49 respectively, the approximate boresight direction is known. A known local-sky identification method then needs to perform star map identification only in the small sky region around that right ascension and declination to identify the fixed stars in the field of view; the large all-sky star library need not be searched.
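The local-sky identification step is only cited, not specified, in this text. As a toy illustration of the kind of angular-distance matching such methods typically use, assuming a reference star whose catalog identity is already known:

```python
import numpy as np

def match_by_angular_distance(obs_vectors, cat_vectors, tol):
    """Toy sketch: match each observed star (unit vectors, index 0 being a
    known reference) to the catalogue star whose angular distance to the
    reference agrees within tol. Not the patent's algorithm -- an
    illustrative stand-in for local-sky identification."""
    ref_obs, ref_cat = obs_vectors[0], cat_vectors[0]
    matches = {}
    for i, vo in enumerate(obs_vectors[1:], start=1):
        d_obs = np.arccos(np.clip(vo @ ref_obs, -1.0, 1.0))
        for j, vc in enumerate(cat_vectors[1:], start=1):
            d_cat = np.arccos(np.clip(vc @ ref_cat, -1.0, 1.0))
            if abs(d_obs - d_cat) < tol:
                matches[i] = j
                break
    return matches

obs = [np.array([1.0, 0.0, 0.0]),
       np.array([np.cos(0.1), np.sin(0.1), 0.0]),
       np.array([np.cos(0.3), np.sin(0.3), 0.0])]
print(match_by_angular_distance(obs, obs, 0.01))  # identical fields match 1:1
```

Real local-sky algorithms add verification (e.g. matching several pairwise distances per star), but the small search region found by the network is what keeps this step cheap.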
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the present invention. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (6)

1. A star sensor star map identification method based on a convolutional neural network comprises the following steps:
step 1: establishing a sample library; performing magnitude filtering on the original star catalog to establish a navigation star library, clustering all celestial navigation stars into constellations with a constellation clustering method and numbering the constellations; the sample library consists of simulated star maps and, for each map, the number of the constellation containing the most stars;
Step 2: establishing and training a convolutional neural network; the input is a star map and the output is the number of the constellation with the most stars in the star map; a sparse matrix is used in place of the original star map, and the star maps of the sample library are input into the convolutional neural network for training;
Step 3: carrying out star map identification; extracting the shot star image, converting it into a sparse matrix, and inputting it into the convolutional neural network for coarse-attitude star map identification to obtain the approximate pointing; then identifying the fixed stars in the field of view with a local sky region star map identification algorithm;
the constellation clustering method specifically comprises the following steps:
1) setting parameters; take the clustering angle θ as one eighth of the field angle and the threshold t as one thousandth of θ; define a variable cluster initialized to 0; the navigation stars number M in total; define an array cnum of M elements, all initialized to 0; the elements of cnum correspond to the navigation stars one to one and record the constellation number each star belongs to, an element value of 0 meaning the corresponding star has not yet been clustered;
2) the optical axis initially points to the origin, right ascension α = 0° and declination δ = −90°;
3) for the current optical axis pointing (αi, δi), count the N fixed stars in the field of view and define an array flag of N elements, all initialized to 0; extract the constellation numbers of the stars of the current field of view from the array cnum and store them in the array flag;
4) count the number of elements whose value is 0 in the array flag, denoted Nvis; if Nvis = 0, all stars in the current field of view have been clustered into their constellations, so execute step 9); if Nvis > 0, i.e. some stars are unprocessed, select the position S0 of the first such star; let its coordinates in the inertial coordinate system be (α0, δ0); its direction cosine vector V0 is given by equation (1);
5) count all stars whose angular distance to S0 is smaller than θ; suppose there are k of them, including the first star at S0; let their right ascensions and declinations be (αj, δj), where j ∈ [1, k]; their direction cosine vectors Vj can be calculated with reference to equation (1), and their angular distances to S0 according to equation (2);
θ=cos-1(V0·Vj) (2)
6) if the constellation numbers of the k stars are all 0, increase cluster by 1, assign it to the corresponding elements in the array flag, and execute step 7); otherwise, take the minimum of the nonzero constellation numbers of the k stars and denote it fm; any stars in the current field of view already assigned one of the k stars' constellation numbers are renumbered fm, and the k stars are also numbered fm, so that all connected neighboring stars are clustered into the same constellation with a unified constellation number;
7) according to equation (3), calculate the direction cosine vector of the center position S of the k stars, denoted Vnew,
Wherein
8) if the angular distance between the first star position S0 and the new center position S is less than t, the constellation clustering at the current field-of-view position is finished; execute step 4); otherwise, assign Vnew to V0, move S0 to the center position S, and return to step 5);
9) assign the values of the elements of the array flag to the corresponding elements in the array cnum, point the optical axis to the next direction, and return to step 3) until the traversal is finished.
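Steps 4)–8) amount to an iterative mean-shift-style grouping: a cluster center repeatedly moves to the normalized mean direction of the stars within θ of it until it stabilizes within t. A much-simplified Python sketch under stated assumptions (stars given directly as unit direction-cosine vectors, no field-of-view traversal, and no merging of previously numbered constellations as in step 6) is:

```python
import math

def direction_cosine(alpha, delta):
    """Unit direction vector for right ascension alpha, declination delta
    (radians); the standard conversion assumed for equation (1), which is
    not reproduced in the text."""
    return (math.cos(delta) * math.cos(alpha),
            math.cos(delta) * math.sin(alpha),
            math.sin(delta))

def ang(u, v):
    """Equation (2): angular distance theta = arccos(u . v) for unit vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return math.acos(max(-1.0, min(1.0, dot)))  # clamp against rounding error

def cluster_stars(vectors, theta, t):
    """Greedy angular clustering: every star within theta of a converged
    cluster center shares one constellation number (simplified sketch)."""
    cnum = [0] * len(vectors)    # 0 = not yet clustered, as in the array cnum
    cluster = 0
    for i, seed in enumerate(vectors):
        if cnum[i]:
            continue             # analogous to step 4): skip clustered stars
        cluster += 1
        center = seed
        while True:
            members = [j for j, v in enumerate(vectors) if ang(center, v) < theta]
            # step 7): new center = normalized mean direction of the members
            s = [sum(vectors[j][k] for j in members) for k in range(3)]
            norm = math.sqrt(sum(c * c for c in s))
            new_center = tuple(c / norm for c in s)
            if ang(center, new_center) < t:   # step 8): center has converged
                break
            center = new_center
        for j in members:
            cnum[j] = cluster
    return cnum
```

For example, two tight groups of stars near (α, δ) = (0, 0) and (0, 90°) come out with constellation numbers 1 and 2 respectively.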
2. The method for recognizing the star sensor star map based on the convolutional neural network as claimed in claim 1, wherein: the establishment of the sample library: firstly, according to the limiting magnitude of the star sensor, star filtering is performed on the original star catalogue, deleting double stars, variable stars, and stars fainter than the limiting magnitude; then, the all-sky navigation stars are clustered into different constellations by the constellation clustering method and the constellations are numbered; finally, the whole celestial sphere is traversed, a simulated star map is generated for each optical axis pointing and attitude, and the number of the constellation with the most stars in the field of view is recorded; the sample library consists of the star maps and the numbers of the corresponding constellations with the most stars.
3. The method for recognizing the star sensor star map based on the convolutional neural network as claimed in claim 1, wherein: establishing the convolutional neural network: the network comprises 5 convolutional layers, 5 pooling layers and 2 fully connected layers, with ReLU as the activation function; the input is a star map represented by a triplet sparse matrix, and the output is the number of the constellation with the most stars in the current star map; the coordinate positions of the star images in the body coordinate system, obtained by star image extraction and calculation, are converted into a triplet sparse matrix with 1024 rows and 1024 columns, each star point occupying 1 pixel, to replace the original star image.
4. The method for recognizing the star sensor star map based on the convolutional neural network as claimed in claim 1, wherein: training the convolutional neural network: the star maps in the sample library are randomly ordered; one by one, the position coordinates of the star points in the body coordinate system are calculated by a star image extraction algorithm and converted into a triplet sparse matrix with 1024 rows and 1024 columns, which is input into the convolutional neural network with the number of the corresponding constellation with the most stars as the output for network training; the trained neural network is used for coarse-attitude star map identification.
5. The method for recognizing the star sensor star map based on the convolutional neural network as claimed in claim 1, wherein: the star map identification: firstly, the position coordinates of the star points of the shot star image in the body coordinate system are calculated by a star image extraction algorithm and converted into a triplet sparse matrix with 1024 rows and 1024 columns, which is input into the convolutional neural network to output the number of the constellation with the most stars; the inertial-coordinate-system coordinates of one star in that constellation are obtained from the navigation star library, and the approximate pointing of the current field of view is calculated; then, the fixed stars in the star map are identified by the local sky region star map identification method.
6. The method for recognizing the star sensor star map based on the convolutional neural network as claimed in claim 3, wherein: the focal length, the row-direction field angle and the column-direction field angle of the shot star map are f, wa and wb, respectively; if the coordinates of a star image in the body coordinate system are (x, y, -f), it is converted into the sparse matrix, and its row and column numbers are respectively
where the round function rounds to the nearest integer.
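The formula of claim 6 is not reproduced in the text. A plausible reconstruction, treating the body-frame coordinates as a pinhole projection and mapping the tangent of the half field angle onto half of the 1024-pixel matrix (the function name, centering convention, and scaling are all assumptions), is:

```python
import math

def star_to_row_col(x, y, f, wa, wb, size=1024):
    """Map body-frame star coordinates (x, y, -f) to row/column indices of a
    size x size sparse matrix. Hypothetical reconstruction of claim 6's
    formula: the edge of the field, x = f*tan(wa/2), lands on the edge of
    the matrix. wa and wb are the row- and column-direction field angles."""
    half = size // 2
    col = round(half + half * x / (f * math.tan(wa / 2)))
    row = round(half + half * y / (f * math.tan(wb / 2)))
    return row, col
```

Under this assumed mapping, a star on the optical axis (x = y = 0) falls at the matrix center (512, 512), and a star at the edge of the row-direction field reaches column 1024.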
CN201711458120.6A 2017-12-28 2017-12-28 Star sensor star map identification method based on convolutional neural network Active CN107883947B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711458120.6A CN107883947B (en) 2017-12-28 2017-12-28 Star sensor star map identification method based on convolutional neural network


Publications (2)

Publication Number Publication Date
CN107883947A CN107883947A (en) 2018-04-06
CN107883947B true CN107883947B (en) 2020-12-22

Family

ID=61770440

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711458120.6A Active CN107883947B (en) 2017-12-28 2017-12-28 Star sensor star map identification method based on convolutional neural network

Country Status (1)

Country Link
CN (1) CN107883947B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109813303B (en) * 2019-03-08 2020-10-09 北京航空航天大学 Star map identification method independent of calibration parameters based on angular pattern cluster voting
CN111156988B (en) * 2019-12-31 2021-03-16 中国科学院紫金山天文台 Space debris astronomical positioning and photometry method based on automatic pointing error determination

Citations (5)

Publication number Priority date Publication date Assignee Title
GB112012A (en) * 1916-12-12 1918-12-19 Henri Louis Victor Desire Giot Improvements in Stars' Transit Calculators and Indicators for Mariners.
CN102840860A (en) * 2012-08-30 2012-12-26 北京航空航天大学 Hybrid particle swarm algorithm-based star graph recognition method
CN103363987A (en) * 2013-06-26 2013-10-23 哈尔滨工业大学 Star map identification method of multi-view-field star sensor
CN105547286A (en) * 2016-01-11 2016-05-04 常州工学院 Composite three-view-field star sensor star map simulation method
CN106971189A (en) * 2017-03-31 2017-07-21 西北工业大学 A kind of noisy method for recognising star map of low resolution


Non-Patent Citations (1)

Title
"Star Map Identification for Star Sensors Using Neural Network Technology"; Li Chunyan; China Excellent Doctoral and Master's Dissertations Full-text Database (Master), Engineering Science & Technology II; 2004-09-15 (No. 3); pp. 16-45 *

Also Published As

Publication number Publication date
CN107883947A (en) 2018-04-06

Similar Documents

Publication Publication Date Title
US10157479B2 (en) Synthesizing training data for broad area geospatial object detection
Chen et al. Target classification using the deep convolutional networks for SAR images
CN104090972B (en) The image characteristics extraction retrieved for D Urban model and method for measuring similarity
CN107883947B (en) Star sensor star map identification method based on convolutional neural network
CN104154929B (en) Optimal selection method of navigational stars of star map simulator based on star density
CN111062310A (en) Few-sample unmanned aerial vehicle image identification method based on virtual sample generation
CN109948453B (en) Multi-person attitude estimation method based on convolutional neural network
CN109492580B (en) Multi-size aerial image positioning method based on neighborhood significance reference of full convolution network
CN109344878B (en) Eagle brain-like feature integration small target recognition method based on ResNet
CN108805280A (en) A kind of method and apparatus of image retrieval
Xu et al. RPNet: A Representation Learning-Based Star Identification Algorithm
CN108647573A (en) A kind of military target recognition methods based on deep learning
Li et al. A novel deep feature fusion network for remote sensing scene classification
CN110334777A (en) A kind of unsupervised attribute selection method of weighting multi-angle of view
CN109460774A (en) A kind of birds recognition methods based on improved convolutional neural networks
Koprinkova-Hristova et al. Clustering of spectral images using Echo state networks
Yoon Autonomous star identification using pattern code
WO2015040450A1 (en) Multi-purpose image processing core
CN112580546A (en) Cross-view image matching method for unmanned aerial vehicle image and satellite image
CN112150359A (en) Unmanned aerial vehicle image fast splicing method based on machine learning and feature point identification
CN107832335B (en) Image retrieval method based on context depth semantic information
CN110175615B (en) Model training method, domain-adaptive visual position identification method and device
CN108320310B (en) Image sequence-based space target three-dimensional attitude estimation method
Zhai et al. MF-SarNet: Effective CNN with data augmentation for SAR automatic target recognition
CN110555461A (en) scene classification method and system based on multi-structure convolutional neural network feature fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant