CN111121756A - High-dynamic star sensor image processing control method - Google Patents

High-dynamic star sensor image processing control method

Info

Publication number
CN111121756A
CN111121756A (application CN201910640849.8A)
Authority
CN
China
Prior art keywords
star
image
angular velocity
pixel
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910640849.8A
Other languages
Chinese (zh)
Other versions
CN111121756B (en)
Inventor
余路伟
毛晓楠
高原
练达
杨元钊
李新鹏
金荷
左乐
高文杰
张磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Aerospace Control Technology Institute
Original Assignee
Shanghai Aerospace Control Technology Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Aerospace Control Technology Institute filed Critical Shanghai Aerospace Control Technology Institute
Priority to CN201910640849.8A priority Critical patent/CN111121756B/en
Publication of CN111121756A publication Critical patent/CN111121756A/en
Application granted granted Critical
Publication of CN111121756B publication Critical patent/CN111121756B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/02 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by astronomical means
    • G06T5/70
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/136 - Segmentation; Edge detection involving thresholding

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Astronomy & Astrophysics (AREA)
  • Automation & Control Theory (AREA)
  • Studio Devices (AREA)

Abstract

A high-dynamic star sensor image processing control method includes the steps that the star sensor autonomously predicts the angular velocity from star point feature quantity information, calculates the optimal exposure time from the angular velocity, estimates the trailing length of the star points from the angular velocity and the optimal exposure time, adaptively adjusts the size of the tracking window corresponding to each star point, tracks the star points through the adaptive window, and preprocesses the image in the window with a background estimation method to obtain the star point feature quantity information. The invention realizes autonomous adjustment, effectively suppresses star point trailing, greatly improves the star point extraction rate, maximizes the star point imaging signal-to-noise ratio, guarantees the data validity of the star sensor, and finally improves the dynamic performance of the star sensor.

Description

High-dynamic star sensor image processing control method
Technical Field
The invention relates to a high-dynamic star sensor image processing control method.
Background
Dynamic performance is a major bottleneck in the development of star sensors. The relative motion between the stars and the detector blurs and trails the star point images, so that ordinary image processing methods cannot extract the star points effectively, which ultimately degrades the data validity of the star sensor.
Improving the dynamic performance of a star sensor in fact depends not only on the image preprocessing algorithm but also on star tracking prediction, imaging control, windowed image acquisition, star point extraction and other aspects, which together form a high-dynamic star sensor image processing system. Star tracking means that, after completing all-sky identification, the star sensor has prior information and can predict the next frame from the previous frame to estimate its own angular velocity and thereby predict the star point positions. Imaging control includes driving the detector to read out the image of a specified region and controlling the exposure time of each frame. From the star point positions predicted by star tracking, a small window image of the designated region can be acquired, reducing the amount of data. Star point extraction is the core of the image processing: the preprocessing method determines the star point extraction rate and accuracy, and is the basis for the subsequent computation and processing of the star sensor.
Disclosure of Invention
The invention provides a high-dynamic star sensor image processing control method which realizes autonomous adjustment, effectively suppresses star point trailing, greatly improves the star point extraction rate, maximizes the star point imaging signal-to-noise ratio, guarantees the data validity of the star sensor and finally improves the dynamic performance of the star sensor.
To achieve the above object, the present invention provides a high-dynamic star sensor image processing control method comprising the following steps:
step S1, estimating the angular velocity;
step S2, calculating the optimal exposure time according to the angular velocity;
step S3, estimating the trailing length of the star point according to the angular velocity and the optimal exposure time in the star tracking mode, and adaptively adjusting the size of the tracking window corresponding to the star point;
step S4, tracking the star points through a self-adaptive window in a star tracking mode, and preprocessing the image in the window by adopting a background estimation method to obtain star point characteristic quantity information for angular velocity estimation;
step S1 to step S4 form closed-loop control.
In step S1, in the star tracking mode, the angular velocity estimation method comprises: calculating the angular velocity according to the attitude information or calculating the angular velocity according to the star vector difference.
In step S1, in the all-sky identification mode, the angular velocity of the star sensor is estimated from the image spots of the trailing star points:
(Equation (1): formula image not reproduced.)
where ω_x and ω_y are the estimated angular velocities about the X and Y axes of the star sensor respectively, θ is the detector pixel angular resolution, l_x and l_y are the trailing lengths of the trailing star in the x and y directions of the imaging plane respectively, and α is the direction angle.
In step S2, the optimal exposure time t of the star sensor is:
(Equation (2): formula image not reproduced; t equals t_max for angular velocities at or below ω_min, equals t_min at or above ω_max, and decreases with increasing angular velocity in between.)
where t_max and t_min are respectively the upper and lower limits of the exposure time, θ is the detector pixel angular resolution, ω is the angular velocity, ω_min is the minimum angular velocity and ω_max is the maximum angular velocity.
In step S3, when the star sensor has an angular velocity component about the X/Y axis, the star point trailing length L is given by:
L = ω·t/θ (3)
where ω is the angular velocity component about the corresponding axis, t is the optimal exposure time, and θ = FOV/n_pixel is called the pixel angular resolution, FOV being the field of view and n_pixel the number of pixels in the X/Y axis direction;
the tracking window size L_win is then:
L_win = l_star + L + l_0 (4)
where l_star is the static star spot size and l_0 is a window margin;
the window dimensions along the X and Y axes both follow the above formula.
In the step S4, the preprocessing includes: background estimation, filtering processing and threshold segmentation;
obtaining a background value in the background estimation:
I_b(x, y) = Σ_{i=x-k}^{x+k} Σ_{j=y-k}^{y+k} W_ij·I(i, j) (5)
where I_b(x, y) is the background value of a pixel obtained by background estimation; I(i, j) is the gray value of the pixel with coordinates (i, j) in the neighborhood of point (x, y) in the original image I; W_ij is the weight of the corresponding point; k is the reference neighborhood radius;
the filtering process obtains a residual image:
It=I-Ib(6)
wherein, ItIs a residual image, and each pixel background value I obtained by background estimationb(x, y) constituting an image background Ib(ii) a Forming an original image I by the gray value I (I, j) of a pixel point with coordinates (I, j) in the (x, y) point neighborhood;
the threshold segmentation is used for judging the threshold of the residual image, when the pixel value is higher than the threshold, the pixel value in the residual image is differed from the threshold to obtain the gray value of the effective pixel, and the pixel value is lower than the threshold and is judged to be the ineffective pixel; the set of all valid pixels in the residual image constitutes the star point feature quantity.
Compared with the prior art, the method adopted by the invention has the following advantages and beneficial effects:
1. Improved dynamic performance of the star sensor. Star point trailing is effectively suppressed, the star point extraction rate is greatly improved, the data validity of the star sensor is guaranteed, and the attitude measurement noise of the star sensor can be reduced to 11.44' under a 3°/s dynamic condition.
2. Autonomous adjustment. Closed-loop control is formed, the exposure time is controlled according to the angular velocity information measured autonomously inside the product, and autonomous adjustment is realized without external intervention.
3. Improved image signal-to-noise ratio. The optimal exposure control strategy maximizes the star point imaging signal-to-noise ratio, shortens the trailing length as far as possible while ensuring that the star spots are exposed long enough, and finally improves the dynamic performance of the star sensor.
Drawings
FIG. 1 is a schematic diagram of an image processing control method of a high dynamic star sensor provided by the invention.
FIG. 2 is a schematic diagram of angular velocity estimation from a trailing star spot.
Fig. 3 is a schematic diagram of an optimal exposure time.
FIG. 4 shows the variation trend of single-star positioning accuracy for a magnitude-5 Mv star under a 3°/s dynamic condition.
Detailed Description
The preferred embodiment of the present invention will be described in detail below with reference to fig. 1 to 4.
As shown in fig. 1, the present invention provides a method for processing and controlling an image of a high dynamic star sensor, comprising the following steps:
and step S1, angular velocity estimation.
The control of the exposure time is based on the angular velocity.
The angular velocity comes mainly from two channels: autonomous prediction by the star sensor and subsystem-assisted angular velocity information.
The star sensor's autonomous prediction can compute the angular velocity from attitude information or directly from star point image information; the former is used mainly in the star tracking mode, while the latter applies to the all-sky identification mode.
The priority of the three resulting sources is, in order: subsystem-assisted angular velocity information, autonomous prediction by the star sensor in the star tracking mode, and star spot estimation in the all-sky identification mode.
Before entering the star tracking mode, the angular velocity information can be provided by the all-sky identification mode or by the subsystem; after entering the star tracking mode, it is predicted from attitude information. The subsystem-assisted and all-sky identification modes only provide the conditions for entering the star tracking mode.
While star tracking is maintained, the angular velocity prediction is highly reliable. There are generally two approaches in the star tracking mode: 1. compute the angular velocity from the attitude information, recursively propagate the attitude to the next frame, and predict the star point positions; 2. difference successive star vectors. The angular velocity obtained from attitude information is accurate, but requires attitude determination to have been completed; the star vector difference is available before attitude determination, but with relatively lower accuracy.
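As an illustration of the star-vector-difference route, the following Python sketch estimates a rough rotation rate from one star's unit vectors observed in two consecutive frames. It is not the patent's implementation: the small-angle axis-angle formula, the function name angular_velocity_from_star_vectors and the sample numbers are assumptions introduced here.

import numpy as np

def angular_velocity_from_star_vectors(v_prev, v_curr, dt):
    """Estimate the rotation rate that carries v_prev onto v_curr.

    Hypothetical small-angle sketch: the rotation axis is along
    v_prev x v_curr and the angle is arcsin(|v_prev x v_curr|); dividing by
    the frame interval dt gives a rate vector. For a fixed star this matches
    the sensor body rate in magnitude.
    """
    v_prev = np.asarray(v_prev, dtype=float)
    v_curr = np.asarray(v_curr, dtype=float)
    v_prev = v_prev / np.linalg.norm(v_prev)
    v_curr = v_curr / np.linalg.norm(v_curr)
    cross = np.cross(v_prev, v_curr)
    sin_theta = np.linalg.norm(cross)
    if sin_theta < 1e-12:                       # vectors (anti)parallel: no usable motion
        return np.zeros(3)
    theta = np.arcsin(np.clip(sin_theta, -1.0, 1.0))
    return cross / sin_theta * theta / dt       # rad/s

# Example: a star vector rotating by 0.05 deg between frames 0.1 s apart
v1 = np.array([0.0, 0.0, 1.0])
v2 = np.array([np.sin(np.radians(0.05)), 0.0, np.cos(np.radians(0.05))])
print(np.degrees(angular_velocity_from_star_vectors(v1, v2, 0.1)))  # ~[0, 0.5, 0] deg/s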
In the all-sky identification mode, if no auxiliary information is available, the angular velocity of the star sensor can only be roughly estimated from the image spots of the trailing star points. The hardware image processing can estimate the pitch-axis and yaw-axis components at lower angular velocities (<1.5°/s). The spot length provides rough information on the rotation rate, and the motion direction of the spot centroid between two frames gives the rotation direction, as shown in FIG. 2.
(Equation (1): formula image not reproduced.)
where ω_x and ω_y are the estimated angular velocities about the X and Y axes of the star sensor respectively, θ is the detector pixel angular resolution, l_x and l_y are the trailing lengths of the trailing star in the x and y directions of the imaging plane respectively, and α is the direction angle.
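A minimal sketch of this rough estimate follows, assuming (the quoted variable list does not state it explicitly) that the streak length in pixels times the pixel angular resolution, divided by the exposure time, approximates the angular rate component; the numbers in the example are hypothetical.

import math

def estimate_rate_from_streak(l_x, l_y, pixel_res_deg, t_exp):
    """Rough-rate sketch for the all-sky identification mode (assumed relation)."""
    omega_x = pixel_res_deg * l_x / t_exp          # deg/s, image x direction
    omega_y = pixel_res_deg * l_y / t_exp          # deg/s, image y direction
    alpha = math.degrees(math.atan2(l_y, l_x))     # streak direction angle
    return omega_x, omega_y, alpha

# Example: a 6 x 3 pixel streak, 0.01 deg/pixel, 100 ms exposure
print(estimate_rate_from_streak(6.0, 3.0, 0.01, 0.1))   # (0.6, 0.3, ~26.6 deg)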
Step S2, controlling the optimal exposure time according to the angular velocity.
According to the optimal exposure control criterion, combined with the predicted angular velocity information, the star trailing is controlled to about 5 pixels, so that the target pixels are sufficiently exposed, the signal-to-noise ratio is maximized, and the accuracy is maximized while star point extraction is still ensured.
The expression for the optimal exposure time t of the star sensor is:
(Equation (2): formula image not reproduced; t equals t_max for angular velocities at or below ω_min, equals t_min at or above ω_max, and decreases with increasing angular velocity in between.)
Within the dynamic range [ω_min, ω_max], t is the minimum exposure time at which the imaging signal-to-noise ratio of M_t-magnitude stars reaches its maximum, where M_t is the magnitude chosen to ensure that more than 10 stars of that magnitude are imaged within the field of view; ω_min is the angular velocity at which, at the maximum exposure time t_max, the centroid of an M_t-magnitude star is displaced by one spot width (l_s); t_max is the minimum exposure time for stably detecting M_t-magnitude stars in the static state; ω_max is the maximum angular velocity at which the centroid of an M_t-magnitude star is displaced by l_s and detection can still be kept stable.
As shown in FIG. 3, the bold black curve is the optimal exposure time curve, a piecewise function; t_max and t_min are the upper and lower exposure time limits respectively. When the angular velocity is lower than ω_min, the exposure time is fixed at t_max: the star point centroid displacement does not exceed l_s, the trailing has little influence on the star spot shape, and the exposure time need not be adjusted. When the angular velocity is higher than ω_max, stable extraction of M_t-magnitude stars can no longer be guaranteed; further shortening the exposure time would weaken the energy accumulation of the brighter stars and make even those harder to extract, so the lower exposure time limit t_min is imposed. When the angular velocity lies in [ω_min, ω_max], the star point centroid shifts by l_s, the signal-to-noise ratio is maximized, the higher signal-to-noise ratio makes star points easier to extract, and the shorter trailing yields higher single-star positioning accuracy.
As shown in FIG. 4, a dim star (magnitude 5 Mv) achieves the shortest trailing under the premise of maximum signal-to-noise ratio, and its single-star positioning accuracy is then highest. Although a bright star does not reach its maximum signal-to-noise ratio, its signal-to-noise ratio is still far better than that of the dim star, and further lengthening the exposure time would only add unnecessary trailing to the dim star and reduce its single-star positioning accuracy. To balance the two effects, the exposure time corresponding to a 5-pixel trailing is taken as the optimal exposure time for the entire star map.
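The piecewise behaviour described above can be sketched as follows; the middle branch, which holds the centroid drift at a fixed pixel budget (taken as 5 pixels per the description), is an assumed form, and the numeric limits in the example are hypothetical.

def optimal_exposure_time(omega_deg_s, pixel_res_deg,
                          t_min, t_max, smear_budget_px=5.0):
    """Sketch of a piecewise exposure-time law with the shape described above.

    Assumption: between omega_min and omega_max the exposure is chosen so the
    centroid drifts by a fixed pixel budget; outside that range it saturates
    at t_max or t_min. The exact published expression may differ.
    """
    if omega_deg_s <= 0.0:
        return t_max
    t = smear_budget_px * pixel_res_deg / omega_deg_s   # hold the smear constant
    return min(max(t, t_min), t_max)                    # clamp to [t_min, t_max]

# Example: 0.01 deg/pixel detector, exposure limited to [5 ms, 100 ms]
for w in (0.1, 1.0, 3.0, 20.0):
    print(w, "deg/s ->", round(optimal_exposure_time(w, 0.01, 0.005, 0.1), 4), "s")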
Step S3, estimating the trailing length of the star point according to the angular velocity and the optimal exposure time, and adaptively adjusting the size of the tracking window corresponding to the star point.
The star point window is acquired centered on the star point coordinates, and the blur length of each star spot is estimated from the instrumental magnitude of that star point and the dynamic conditions, providing adaptive adjustment capability.
By controlling the exposure time, the star trailing is limited to 2-4 pixels or kept as short as possible. In the star tracking mode, window image processing therefore faces the problem of adapting the window size to the star point trailing. The centroid coordinate of the star point in the next frame is predicted in the star tracking mode, with the exposure midpoint as the time reference; this means that, theoretically, the predicted star point coordinate lies at the center of the trailing star streak. When the star sensor has an angular velocity component about the X/Y axis direction, the star point trailing length L is given by:
L = ω·t/θ (3)
where ω is the angular velocity component about the corresponding axis, t is the optimal exposure time, and θ = FOV/n_pixel is called the pixel angular resolution, FOV being the field of view and n_pixel the number of pixels in the X/Y axis direction.
The tracking window size L_win is then:
L_win = l_star + L + l_0 (4)
where l_star is the static star spot size, which depends on the instrumental magnitude of the star point and the defocus of the optical system, and l_0 is a window margin set according to algorithm requirements. The window dimensions along the X and Y axes both follow the above formula and can each be adaptively adjusted.
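A small sketch of the per-axis window sizing follows, assuming the trailing length L is the angular motion during the exposure divided by the pixel angular resolution; the default static spot size and margin values are hypothetical.

def tracking_window_size(omega_axis_deg_s, t_exp, pixel_res_deg,
                         l_star_px=5.0, margin_px=2.0):
    """Per-axis window sizing following L_win = l_star + L + l_0 (assumed L)."""
    smear_px = abs(omega_axis_deg_s) * t_exp / pixel_res_deg   # L in pixels
    return l_star_px + smear_px + margin_px                    # L_win in pixels

# Example: 3 deg/s about one axis, 16.7 ms exposure, 0.01 deg/pixel
print(tracking_window_size(3.0, 0.0167, 0.01))   # ~12 pixels on that axis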
Step S4, tracking the star points through the adaptive window and preprocessing the image in the window with a background estimation method to obtain the star point feature quantity information (the star point feature quantity is generally information such as the weighted gray sum and the gray sum, used mainly for the subsequent centroid calculation).
The preprocessing comprises: background estimation, filtering and threshold segmentation.
the star point is tracked by a self-adaptive window, and the image is processed in the window by adopting a background estimation method, which is a star tracking mode of the star sensor for a long time.
Background estimation: the background gray value at any point can be represented by a linear or nonlinear combination of the surrounding pixels, as follows:
I_b(x, y) = Σ_{i=x-k}^{x+k} Σ_{j=y-k}^{y+k} W_ij·I(i, j) (5)
where I_b(x, y) is the background value of a pixel obtained by background estimation; I(i, j) is the gray value of the pixel with coordinates (i, j) in the neighborhood of point (x, y) in the original image I; W_ij is the weight of the corresponding point; k is the reference neighborhood radius.
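A minimal sketch of such a neighbourhood background estimate, using uniform weights W_ij (the text leaves the weights general, so this choice is an assumption):

import numpy as np

def estimate_background(image, k=2):
    """Neighbourhood background estimate in the spirit of equation (5).

    Uses uniform weights W_ij = 1 / (2k+1)^2 over the (2k+1) x (2k+1)
    neighbourhood as a stand-in for the unspecified weights.
    """
    img = image.astype(np.float64)
    padded = np.pad(img, k, mode="edge")                 # replicate borders
    background = np.zeros_like(img)
    h, w = img.shape
    for dy in range(-k, k + 1):                          # accumulate shifted copies
        for dx in range(-k, k + 1):
            background += padded[k + dy : k + dy + h, k + dx : k + dx + w]
    return background / (2 * k + 1) ** 2

# Example: flat background of 100 counts with one bright star pixel
img = np.full((16, 16), 100.0)
img[8, 8] += 500.0
print(round(estimate_background(img)[8, 8], 1))          # ~120, close to the local background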
The background-estimation filtering method first uses the above formula to obtain the image background I_b, and then subtracts the background image from the original image to obtain the residual image I_t; ideally, only the target and a small amount of high-frequency noise remain in the residual image:
I_t = I - I_b (6)
where the pixel background values I_b(x, y) obtained by background estimation form the background image I_b, and the gray values I(i, j) of the pixels with coordinates (i, j) in the neighborhood of point (x, y) form the original image I.
The residual image reflects the original image contrast free of stray-light interference, so an optimal segmentation threshold is easily obtained by thresholding it.
For threshold segmentation, a fixed value is selected as the threshold: when a pixel value is higher than the threshold, the threshold is subtracted from it to obtain the gray value of a valid pixel; pixels below the threshold are judged invalid. The set of valid pixels constitutes the star point feature quantity.
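The residual computation, threshold segmentation and a weighted-gray-sum centroid can be sketched as follows; the threshold value and the dictionary layout of the returned feature are illustrative choices rather than the patent's data structures.

import numpy as np

def extract_star_feature(image, background, threshold):
    """Residual, fixed-threshold segmentation, and a centroid-style feature.

    Follows I_t = I - I_b and the rule described above: pixels above the
    threshold keep (value - threshold) as their effective gray level, the
    rest are discarded; the weighted gray sums give one possible feature.
    """
    residual = image.astype(np.float64) - background     # I_t = I - I_b
    effective = np.where(residual > threshold, residual - threshold, 0.0)
    gray_sum = effective.sum()
    if gray_sum == 0.0:
        return None                                      # no valid pixels
    ys, xs = np.indices(effective.shape)
    centroid_x = (xs * effective).sum() / gray_sum       # weighted gray sums
    centroid_y = (ys * effective).sum() / gray_sum
    return {"gray_sum": gray_sum, "centroid": (centroid_x, centroid_y)}

# Example with a flat assumed background of 100 counts
img = np.full((16, 16), 100.0)
img[8, 8] += 500.0
bg = np.full((16, 16), 100.0)
print(extract_star_feature(img, bg, threshold=20.0))     # centroid near (8, 8)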
The four technologies form a closed-loop control system: angular velocity estimation, exposure time control and tracking window sizing are highly correlated and together provide the preconditions for the background-estimation image processing, so that the image processing can more easily extract complete star spots and avoid breaking up trailing spots, while the star point feature quantities extracted by the background-estimation image processing can in turn be used for angular velocity estimation. The predicted angular velocity is used as constraint information to guide the optimal control of the detector exposure time. Applying the optimal exposure control strategy, the exposure time is adjusted autonomously, star trailing is effectively suppressed, and the signal-to-noise ratio is maximized. Combining the predicted angular velocity information and the optimized exposure time, the window size for collecting each star point is adaptively adjusted so that the star spot is contained within a small window. Within the acquisition window, the background-estimation image processing algorithm filters the image and extracts the star point feature quantity information.
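For orientation only, the following skeleton shows how the four stages could feed one another in a loop. It reuses the helper functions from the sketches above, and read_window, the frame period, the fixed threshold and the in-loop rate update from centroid motion are all hypothetical stand-ins rather than the patent's actual control flow.

def control_loop(read_window, pixel_res_deg, t_min, t_max,
                 frame_period=0.1, n_frames=100):
    omega = 0.0                                           # deg/s, initial guess
    feature_prev = None
    for _ in range(n_frames):
        t_exp = optimal_exposure_time(omega, pixel_res_deg, t_min, t_max)   # step S2
        win = tracking_window_size(omega, t_exp, pixel_res_deg)             # step S3
        image = read_window(win, t_exp)                                     # window readout
        bg = estimate_background(image)                                     # step S4
        feature = extract_star_feature(image, bg, threshold=20.0)
        if feature is not None and feature_prev is not None:                # step S1: rate from
            dx = feature["centroid"][0] - feature_prev["centroid"][0]       # inter-frame centroid
            dy = feature["centroid"][1] - feature_prev["centroid"][1]       # motion
            omega = pixel_res_deg * (dx * dx + dy * dy) ** 0.5 / frame_period
        feature_prev = feature
    return omega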
The angular velocity estimation technique adopted by the invention can be applied in both the all-sky identification mode and the star tracking mode and is used to guide the adjustment of the exposure time.
The exposure time control technique adopted by the invention uses the optimal exposure time control law, which allows star point imaging to reach the maximum signal-to-noise ratio at different angular velocities and thus facilitates star point extraction.
The adaptive tracking window technique adopted by the invention estimates the star spot size from the angular velocity and the obtained optimal exposure time, adaptively adjusts the window size corresponding to each star point, and ensures that the star spot lies entirely within the acquisition window under dynamic conditions.
The background-estimation image processing technique adopted by the invention processes the image within the small window, obtains the background value by filtering, segments the image accordingly, and extracts the star point feature quantity.
Compared with the prior art, the method adopted by the invention has the following advantages and beneficial effects:
1. Improved dynamic performance of the star sensor. Star point trailing is effectively suppressed, the star point extraction rate is greatly improved, the data validity of the star sensor is guaranteed, and the attitude measurement noise of the star sensor can be reduced to 11.44' under a 3°/s dynamic condition.
2. Autonomous adjustment. Closed-loop control is formed, the exposure time is controlled according to the angular velocity information measured autonomously inside the product, and autonomous adjustment is realized without external intervention.
3. Improved image signal-to-noise ratio. The optimal exposure control strategy maximizes the star point imaging signal-to-noise ratio, shortens the trailing length as far as possible while ensuring that the star spots are exposed long enough, and finally improves the dynamic performance of the star sensor.
While the present invention has been described in detail with reference to the preferred embodiments thereof, it should be understood that the above description should not be taken as limiting the invention. Various modifications and alterations to this invention will become apparent to those skilled in the art upon reading the foregoing description. Accordingly, the scope of the invention should be determined from the following claims.

Claims (6)

1. A high-dynamic star sensor image processing control method is characterized by comprising the following steps:
step S1, estimating the angular velocity according to the star point characteristic quantity information;
step S2, calculating the optimal exposure time according to the angular velocity;
step S3, estimating the trailing length of the star point according to the angular velocity and the optimal exposure time in the star tracking mode, and adaptively adjusting the size of the tracking window corresponding to the star point;
step S4, tracking the star points through a self-adaptive window in a star tracking mode, and preprocessing the image in the window by adopting a background estimation method to obtain star point characteristic quantity information for angular velocity estimation;
step S1 to step S4 form closed-loop control.
2. The image processing control method of a high dynamic star sensor as claimed in claim 1, wherein in step S1, in the star tracking mode, the angular velocity estimation method comprises: calculating the angular velocity according to the attitude information or calculating the angular velocity according to the star vector difference.
3. The image processing control method of a high dynamic star sensor as claimed in claim 1, wherein in step S1, in the all-sky identification mode, the angular velocity of the star sensor is estimated from the image spots of the trailing star points:
(Equation (1): formula image not reproduced.)
where ω_x and ω_y are the estimated angular velocities about the X and Y axes of the star sensor respectively, θ is the detector pixel angular resolution, l_x and l_y are the trailing lengths of the trailing star in the x and y directions of the imaging plane respectively, and α is the direction angle.
4. The image processing control method for the high-dynamic star sensor as claimed in claim 2 or 3, wherein in step S2, the optimal exposure time t of the star sensor is:
(Equation (2): formula image not reproduced; t equals t_max for angular velocities at or below ω_min, equals t_min at or above ω_max, and decreases with increasing angular velocity in between.)
where t_max and t_min are respectively the upper and lower limits of the exposure time, θ is the detector pixel angular resolution, ω is the angular velocity, ω_min is the minimum angular velocity and ω_max is the maximum angular velocity.
5. The high-dynamic star sensor image processing control method according to claim 4, wherein in step S3, when the star sensor has an angular velocity component about the X/Y axis direction, the star point trailing length L is given by:
L = ω·t/θ (3)
where ω is the angular velocity component about the corresponding axis, t is the optimal exposure time, and θ = FOV/n_pixel is called the pixel angular resolution, FOV being the field of view and n_pixel the number of pixels in the X/Y axis direction;
the tracking window size L_win is then:
L_win = l_star + L + l_0 (4)
where l_star is the static star spot size and l_0 is a window margin;
the window dimensions along the X and Y axes both follow the above formula.
6. The method for processing and controlling the image of the high-dynamic star sensor as claimed in claim 5, wherein in step S4, the preprocessing comprises: background estimation, filtering and threshold segmentation;
the background estimation obtains the background value:
I_b(x, y) = Σ_{i=x-k}^{x+k} Σ_{j=y-k}^{y+k} W_ij·I(i, j) (5)
where I_b(x, y) is the background value of a pixel obtained by background estimation; I(i, j) is the gray value of the pixel with coordinates (i, j) in the neighborhood of point (x, y) in the original image I; W_ij is the weight of the corresponding point; k is the reference neighborhood radius;
the filtering process obtains a residual image:
I_t = I - I_b (6)
where I_t is the residual image, the pixel background values I_b(x, y) obtained by background estimation form the background image I_b, and the gray values I(i, j) of the pixels form the original image I;
the threshold segmentation applies a threshold to the residual image: when a pixel value is higher than the threshold, the threshold is subtracted from it to obtain the gray value of a valid pixel, and pixels below the threshold are judged invalid; the set of all valid pixels in the residual image constitutes the star point feature quantity.
CN201910640849.8A 2019-07-16 2019-07-16 High-dynamic star sensor image processing control method Active CN111121756B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910640849.8A CN111121756B (en) 2019-07-16 2019-07-16 High-dynamic star sensor image processing control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910640849.8A CN111121756B (en) 2019-07-16 2019-07-16 High-dynamic star sensor image processing control method

Publications (2)

Publication Number Publication Date
CN111121756A true CN111121756A (en) 2020-05-08
CN111121756B CN111121756B (en) 2021-12-07

Family

ID=70495178

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910640849.8A Active CN111121756B (en) 2019-07-16 2019-07-16 High-dynamic star sensor image processing control method

Country Status (1)

Country Link
CN (1) CN111121756B (en)

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0351713A (en) * 1989-07-19 1991-03-06 Nec Corp Fixed star sensor
CN102116626A (en) * 2009-12-31 2011-07-06 北京控制工程研究所 Prediction and correction method of node of star point track image
CN101852616A (en) * 2010-04-30 2010-10-06 北京航空航天大学 Method and device for realizing extraction of star target under high dynamic condition
CN103148851A (en) * 2013-02-18 2013-06-12 清华大学 Method for determining attitude of star sensor based on roller shutter exposure imaging
CN103487058A (en) * 2013-09-06 2014-01-01 北京控制工程研究所 Method for improving dynamic performance of active pixel sensor (APS) star sensor
CN104567864A (en) * 2014-12-29 2015-04-29 北京控制工程研究所 Dynamic exposure time adjusting method for APS (active pixel sensor) star sensor
US10180327B1 (en) * 2015-06-15 2019-01-15 The Charles Stark Draper Laboratory, Inc. Methods and apparatus for navigational aiding using celestial object tracking
CN106382928A (en) * 2016-08-26 2017-02-08 北京控制工程研究所 Roller shutter door exposure star sensor-based dynamic compensation method
CN107588786A (en) * 2017-09-22 2018-01-16 上海航天控制技术研究所 A kind of multipurpose fixed star simulator driving method for star sensor emulation testing
CN109141403A (en) * 2018-08-01 2019-01-04 上海航天控制技术研究所 A kind of image processing system and its method of the access of star sensor wicket
CN108871317A (en) * 2018-09-10 2018-11-23 上海航天控制技术研究所 A kind of Rotating Platform for High Precision Star Sensor information processing system
CN108827281A (en) * 2018-09-14 2018-11-16 上海航天控制技术研究所 A kind of combined imaging driving method of star sensor
CN109682369A (en) * 2018-12-13 2019-04-26 上海航天控制技术研究所 Rotating Platform for High Precision Star Sensor data fusion method based on asynchronous exposure

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
XINGGUO WEI et al.: "Exposure Time Optimization for Highly Dynamic Star Trackers", 《SENSORS》 *
余路伟 等 (YU Luwei et al.): "采用最大背景估计的星敏感器图像处理方法" [Star sensor image processing method using maximum background estimation], 《激光与红外》 [Laser & Infrared] *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112212890A (en) * 2020-09-21 2021-01-12 中国科学院长春光学精密机械与物理研究所 Image motion compensation method of high-dynamic star sensor
CN112212890B (en) * 2020-09-21 2022-08-19 中国科学院长春光学精密机械与物理研究所 Image motion compensation method of high-dynamic star sensor
CN112419180A (en) * 2020-11-19 2021-02-26 北京航空航天大学 High-dynamic star point extraction method for unknown direction
CN112419180B (en) * 2020-11-19 2022-07-05 北京航空航天大学 High-dynamic star point extraction method for unknown direction

Also Published As

Publication number Publication date
CN111121756B (en) 2021-12-07

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant