CN109344785A - A high-precision planet-center localization method for autonomous deep-space optical navigation - Google Patents

A high-precision planet-center localization method for autonomous deep-space optical navigation

Info

Publication number
CN109344785A
CN109344785A CN201811186730.XA CN201811186730A
Authority
CN
China
Prior art keywords
edge
planet
pixel
gradient direction
precision
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811186730.XA
Other languages
Chinese (zh)
Other versions
CN109344785B (en)
Inventor
江洁 (Jiang Jie)
张勇 (Zhang Yong)
张广军 (Zhang Guangjun)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN201811186730.XA priority Critical patent/CN109344785B/en
Publication of CN109344785A publication Critical patent/CN109344785A/en
Application granted granted Critical
Publication of CN109344785B publication Critical patent/CN109344785B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/13 Satellite images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/30 Noise filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/34 Smoothing or thinning of the pattern; Morphological operations; Skeletonisation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a high-precision planet-center localization method for autonomous deep-space optical navigation, comprising: first, preprocessing the image to extract the planet's arc edge and obtain initial values of the planet center and radius; second, filtering the edge region of interest (EROI) with a non-local filtering method to obtain an improved gray-level distribution in the edge region; third, locating the planet's arc edge to sub-pixel accuracy with a partial-area-effect method based on the gradient direction; finally, fitting the arc edge by least squares to obtain high-precision planet-center coordinates. Applied to regular planets, the invention obtains high-precision sub-pixel edges and center coordinates, and it is robust to planet images with noise, complex backgrounds, and rich texture.

Description

A high-precision planet-center localization method for autonomous deep-space optical navigation
Technical field
The present invention relates to planet-center localization technology for autonomous deep-space optical navigation, and in particular to a high-precision planet-center localization method for autonomous deep-space optical navigation. It involves pixel-level extraction of the planet's arc edge, filtering of the planet's edge region of interest, and sub-pixel extraction of arc-edge coordinates.
Background technique
Compared with Earth-orbiting spacecraft, deep-space probes fly farther and operate longer. Traditional ground-based navigation methods face significant limitations in real-time performance, observable arc segments, and other respects, and are increasingly unable to meet the needs of exploration missions; autonomous navigation is an effective way to solve these problems. Autonomous navigation based on optical imaging measurement is one of the development trends of space science and technology.
Optical navigation is currently a widely applied and relatively mature autonomous navigation approach. In autonomous navigation based on optical imaging measurement, the spacecraft acquires optical images of its surroundings with optical sensors and processes them to determine its position and attitude. The ultimate goal of the image processing in optical navigation is to extract usable navigation observables from the raw images captured by the spacecraft, generally including line-of-sight (LOS) vectors, apparent diameter and centroid, the horizon, and angles to reference stars. As large numbers of deep-space exploration missions are carried out, the required navigation accuracy keeps rising. Navigation accuracy relies heavily on high-precision image processing, in which high-precision extraction of the celestial body's limb edge is the basis of measurement. Most work on autonomous optical navigation concerns navigation measurement modeling and navigation filters; few researchers have addressed high-precision extraction of celestial-body information.
The optical measurement we focus on is high-precision extraction of the center of a regular planet. To obtain the planet center accurately, high-precision extraction of the planet's edge is key. Common edge detection operators (Roberts, Sobel, Prewitt, Laplacian, Canny, Kirsch, Nevatia, etc.) are accurate only to the pixel level. To locate edge positions more precisely, researchers at home and abroad have studied sub-pixel edge detection extensively; these methods fall into three broad classes: moment methods, fitting methods, and interpolation methods.
Moment methods are computationally expensive, and once a blurred edge model is considered, the added model parameters make analytic solutions very difficult to determine. Fitting methods solve slowly because of their model complexity. Interpolation methods are computationally simple but sensitive to noise. For autonomous navigation of deep-space probes, these three classes of traditional methods have limitations in processing speed, accuracy, and noise immunity.
As deep-space exploration missions advance, existing techniques cannot meet the high-precision planet-information extraction required by future missions, which limits the implementation of deep-space exploration missions to some extent.
Summary of the invention
In view of this, the main purpose of the present invention is to provide a high-precision planet-center localization method that mainly addresses the following problems:
(1) planet images have rich texture and background and are affected by star points and stray light; preprocessing must obtain the planet's arc-edge coordinates and initial values of the planet center and radius;
(2) the planet imaging environment is complex and noise-prone, and existing filtering methods cannot exploit the planet's geometric and image characteristics for effective filtering;
(3) existing methods detect planet image edges with insufficient precision, so the extracted navigation information is not accurate enough.
To achieve the above objectives, the technical solution adopted by the present invention is a high-precision planet-center localization method for autonomous deep-space optical navigation, with the following steps:
a. preprocess the image to obtain the planet's arc-edge coordinates and initial values of the planet center and radius;
b. extract the planet's edge region of interest (EROI);
c. because of the blurring effect in planet imaging, filter the EROI with a non-local filtering method according to the gray-level distribution of the blurred edge;
d. extract sub-pixel edges from the planet's arc edge using the partial-area-effect method along the gradient direction;
e. fit the sub-pixel edge coordinates by least squares to obtain high-precision planet-center coordinates.
In step a, the planet is preprocessed by combining morphological opening with minimum circle covering, as follows:
(a1) first apply a morphological opening to the image; opening is erosion followed by dilation, and here it removes the many star points and stray light in the background region around the moon;
(a2) extract the planet's edge and surface texture with the Sobel operator;
(a3) obtain initial values of the planet center and radius by minimum circle covering, and extract the pixel-level coordinates of the moon's arc edge; minimum circle covering finds the smallest circle enclosing all points in linear time and effectively rejects the planet's surface texture.
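As an illustration of the opening in (a1), a grey-scale 3×3 opening (erosion followed by dilation) can be sketched in plain NumPy; the kernel size and function names are illustrative, not taken from the patent:

```python
import numpy as np

def _window_stack(img):
    # all nine 3x3-shifted views of the edge-padded image
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    return np.stack([p[i:i + h, j:j + w]
                     for i in range(3) for j in range(3)])

def erode3(img):
    return _window_stack(img).min(axis=0)   # 3x3 sliding minimum

def dilate3(img):
    return _window_stack(img).max(axis=0)   # 3x3 sliding maximum

def opening3(img):
    """Grey-scale opening: removes bright features (star points,
    stray light) smaller than the 3x3 structuring element while
    leaving larger bright regions such as the planet disc intact."""
    return dilate3(erode3(img))
```

An isolated bright pixel is erased by the erosion and never comes back, while a bright region at least as large as the structuring element survives the round trip.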
In step b, the arc-edge region of interest (EROI) is extracted as follows:
(b1) with the initial center and radius from step a, reject abnormal arc-edge coordinates whose distance to the center differs from the radius by more than 2 pixels;
(b2) select the pixel coordinates whose distance to the center differs from the radius by at most (D_blur/2 + 2) pixels (D_blur is the blur width) as the edge region of interest (EROI).
In step c, the EROI is filtered non-locally as follows:
(c1) for any two pixels i and j in the EROI, compute their gray difference g(i, j) (the gray-difference factor), the difference d(i, j) of their distances to the circle center along the gradient direction (the gradient-direction distance factor), and the angle θ(i, j) between their gradient directions (the gradient-direction difference factor), θ ∈ [0°, 180°);
(c2) compute the correlation weight of any two pixels i and j in the EROI;
(c3) from the correlation weights between pixel i and the other pixels in the EROI, compute the filtered gray value of pixel i, and extend this to compute the filtered gray value of every pixel.
In step d, sub-pixel extraction is performed on the planet's arc edge using the partial-area-effect method along the gradient direction, as follows:
(d1) model the local edge region with a conic curve passing through the position of pixel (m, n) and dividing the image into two parts of different gray levels; the gray levels of target and background are A and B respectively;
(d2) compute the gradient direction of each edge pixel (the direction from the initial planet center to the edge pixel), and compute the size of the template used by the partial-area effect; the template size is determined by D_blur;
(d3) choose a t×3 template along the gradient direction, where t is the template height and 3 denotes 3 columns of pixels;
(d4) compute the template's gray values along the gradient direction according to the imaging principle; when the gradient direction is neither perpendicular nor parallel to the image coordinate system, the pixel gray values of the gradient-direction template must be constructed;
(d5) establish a new coordinate system whose y′ axis is the gradient direction and whose x′ axis is perpendicular to the y′ axis;
(d6) in the new coordinate system, construct the partial-area-effect equations from the gray distribution and the cumulative areas on both sides of the edge within the gradient-direction template, and solve for the sub-pixel edge coordinates;
(d7) after the sub-pixel edge coordinates are computed in the new coordinate system, transform them back to the xOy coordinate system, yielding a set of sub-pixel edge coordinates in the xOy system.
In step e, the center of the circle is fitted by least squares to obtain high-precision planet-center coordinates, as follows:
(e1) write the sum of squared distances from the data points to the circle;
(e2) set the partial derivatives of this expression with respect to each variable to zero, yielding a system of linear equations;
(e3) solve the linear system to obtain the fitted circle's center, radius, and related quantities.
Compared with the prior art, the advantages and beneficial effects of the present invention are:
(1) the invention solves the problem of extracting arc edges from planet images with rich texture and complex background in autonomous deep-space optical navigation;
(2) the invention solves the filtering problem for planet edge regions affected by noise, star points, and stray light; after filtering, the gray distribution of the edge region of interest better matches the edge energy distribution model;
(3) the invention makes full use of the planet's geometric characteristics, extracting sub-pixel arc-edge coordinates with the gradient-direction partial-area-effect method; compared with traditional methods, edge detection precision is greatly improved. The planet's sub-pixel edge is then fitted by least squares to obtain the planet-center coordinates; compared with existing methods, planet-center positioning accuracy can be improved by an order of magnitude.
Detailed description of the invention
Fig. 1 is a schematic of planet imaging geometry, where Fig. 1(a) is the planet-sun-spacecraft geometric plane, Fig. 1(b) is the edge contour of the planet image, and Fig. 1(c) is the imaging geometry at different times, taking the moon as an example;
Fig. 2 is the flow chart of the planet-center localization method;
Fig. 3 illustrates preprocessing the planet with morphological opening combined with minimum circle covering, where Fig. 3(a) is the original moon image, Fig. 3(b) is the image after morphological opening, Fig. 3(c) shows the planet edge extracted by the Sobel operator, and Fig. 3(d) shows the initial radius and center found by minimum circle covering;
Fig. 4 shows the edge models, where Fig. 4(a) is the ideal step edge model and Fig. 4(b) the blurred edge model;
Fig. 5 shows edge region-of-interest extraction, where Fig. 5(a) shows the edge region of interest and Fig. 5(b) an enlarged part of Fig. 5(a);
Fig. 6 illustrates the gradient-direction partial-area effect, where Fig. 6(a) is a schematic of the gradient-direction template and Fig. 6(b) the pixel gray levels within the template;
Fig. 7 shows planet images simulated with the Celestia software, where Fig. 7(a) is the moon, Fig. 7(b) Venus, Fig. 7(c) Mercury, Fig. 7(d) Mars, and Fig. 7(e) Jupiter;
Fig. 8 shows arc and center localization on real moon images using our method, where Fig. 8(a) is a moon image taken by the Apollo 11 spacecraft on its return journey, and Fig. 8(b) is a lunar photograph taken by International Space Station crew.
Specific embodiments
The invention is described in further detail below with reference to the accompanying drawings and specific embodiments.
1. Method description
Step 1: preprocess the planet by combining morphological opening with minimum circle covering, obtaining the planet's arc-edge coordinates and initial values of the planet center and radius.
A moon image with rich surface texture is selected, as shown in Fig. 3(a). A morphological opening is first applied to the image: opening is erosion followed by dilation, and here it removes the many star points and stray light in the moon's background region. Opening cannot completely remove surface texture, as shown in Fig. 3(b). The Sobel operator is then used to extract the edges of image (b), giving Fig. 3(c). To remove surface-texture edges effectively, we use minimum circle covering (minimum spanning circle), which finds the smallest circle enclosing all points in linear time, as shown in Fig. 3(d). Minimum circle covering yields initial estimates of the moon's center and radius; the solid arc in Fig. 3(d) is the moon's arc edge.
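The minimum circle covering algorithm itself is not spelled out in the patent; one standard algorithm with the stated expected-linear-time behaviour is Welzl's, sketched below for small point sets (the recursion depth grows with the number of points, so a production version would be iterative):

```python
import random

def _circle_from(points):
    # exact circle through 0, 1, 2 (diametral) or 3 boundary points
    if not points:
        return (0.0, 0.0), 0.0
    if len(points) == 1:
        return points[0], 0.0
    if len(points) == 2:
        (x1, y1), (x2, y2) = points
        r = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 / 2.0
        return ((x1 + x2) / 2.0, (y1 + y2) / 2.0), r
    (ax, ay), (bx, by), (cx, cy) = points  # circumcircle of a triangle
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax * ax + ay * ay) * (by - cy) + (bx * bx + by * by) * (cy - ay)
          + (cx * cx + cy * cy) * (ay - by)) / d
    uy = ((ax * ax + ay * ay) * (cx - bx) + (bx * bx + by * by) * (ax - cx)
          + (cx * cx + cy * cy) * (bx - ax)) / d
    return (ux, uy), ((ux - ax) ** 2 + (uy - ay) ** 2) ** 0.5

def _welzl(points, boundary=()):
    # recursive Welzl: boundary holds points known to lie on the circle
    if not points or len(boundary) == 3:
        return _circle_from(list(boundary))
    p, rest = points[0], points[1:]
    (cx, cy), r = _welzl(rest, boundary)
    if (p[0] - cx) ** 2 + (p[1] - cy) ** 2 <= r * r + 1e-9:
        return (cx, cy), r
    return _welzl(rest, boundary + (p,))

def min_enclosing_circle(points):
    pts = list(points)
    random.shuffle(pts)          # shuffling gives expected linear time
    return _welzl(pts)
```

Applied to the Sobel edge points, the returned center and radius serve as the initial estimates used in the following steps.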
Step 2: extract the planet's edge region of interest (EROI).
The edge models are shown in Fig. 4. Fig. 4(a) shows that, in the absence of blur, the ideal edge gray distribution is a step function (in the one-dimensional case):

F(x) = h for x < X, and F(x) = h + k for x ≥ X (1)

where F(x) is the edge-pixel gray level, x is the coordinate along the edge gradient direction, h and k are the gray background and gray contrast respectively, and X is the edge location. In real imaging, the ideal step-shaped edge gray distribution becomes the blurred edge model shown in Fig. 4(b), where X_1 and X_2 mark the boundaries between the blurred region and the two sides. Ideally, X_2 − X = X − X_1. The blur width D_blur is defined as D_blur = X_2 − X_1.
Let the radius and center of the circle containing the arc be R and O respectively. As shown in Fig. 5, Fig. 5(a) shows the edge region containing the arc edge and our edge region of interest (EROI), denoted Ω_EROI. Fig. 5(b) details a local part of Fig. 5(a) and specifies the range of Ω_EROI:

R − Δr ≤ d(Ω_EROI, O) ≤ R + Δr (2)
Δr = D_blur/2 + 2 (pixels) (3)

where d(Ω_EROI, O) is the distance from a point of Ω_EROI to the center O. Δr can be chosen according to the blurred edge width in the real image, generally 3-5 pixels. Ω_EROI is the partial annulus along the arc edge, centered at O, with inner radius R − Δr and outer radius R + Δr. Restricting edge detection to the small region Ω_EROI prevents false edge detections and saves computation time.
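The selection rule of equations (2) and (3) keeps only edge pixels inside an annulus around the initial circle; a minimal sketch (array layout and names are illustrative):

```python
import numpy as np

def extract_eroi(edge_pts, center, radius, d_blur):
    """Keep edge pixels whose radial distance from the initial centre
    deviates from the initial radius by at most delta_r = d_blur/2 + 2
    pixels, per equations (2)-(3)."""
    pts = np.asarray(edge_pts, dtype=float)
    dist = np.hypot(pts[:, 0] - center[0], pts[:, 1] - center[1])
    delta_r = d_blur / 2.0 + 2.0
    mask = np.abs(dist - radius) <= delta_r
    return pts[mask]
```

For example, with center O = (0, 0), R = 10 and D_blur = 2, points at radial distances 10 and 12 are kept while a point at distance 14 is rejected.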
Step 3: filter the EROI with a non-local filtering method according to the gray distribution of the blurred edge, as follows.
In real planet imaging, the environment is complex: the edge region of interest is affected by noise, star points, and stray light, which degrades sub-pixel accuracy. To obtain more accurate sub-pixel edges, special filtering is applied to the image to make the method more robust.
Step 3.1: for any two pixels i and j in the EROI, compute their gray difference g(i, j) (the gray-difference factor), the difference d(i, j) of their distances to the circle center along the gradient direction (the gradient-direction distance factor), and the angle θ(i, j) between their gradient directions (the gradient-direction difference factor), θ ∈ [0°, 180°).
Step 3.2: compute the correlation weight of pixels i and j, where σ_1, σ_2 and σ_3 are adjustment parameters and N(i) is a normalization coefficient.
Step 3.3: compute the filtered gray value of pixel i, where I(j) is the pre-filtering gray value of point j in Ω_EROI and Ω_EROI denotes the pixel set of the EROI. In the same way, obtain the filtered gray value of every pixel in the EROI.
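Since the weight formula itself is not reproduced in this text, the sketch below assumes a product of Gaussian kernels over the three named factors (gray difference, radial-distance difference, and gradient-direction angle, the gradient direction being radial for a circular limb); the kernel shape and the σ values are assumptions:

```python
import numpy as np

def nonlocal_filter_eroi(coords, gray, center, sigmas=(10.0, 1.0, 10.0)):
    """Non-local filtering of EROI pixels. coords: (x, y) pixel
    positions, gray: their gray values, center: initial circle
    centre O. The Gaussian product weight is an assumed form."""
    coords = np.asarray(coords, dtype=float)
    gray = np.asarray(gray, dtype=float)
    dx = coords[:, 0] - center[0]
    dy = coords[:, 1] - center[1]
    dist = np.hypot(dx, dy)              # radial distance of each pixel
    ang = np.arctan2(dy, dx)             # gradient (radial) direction

    g = np.abs(gray[:, None] - gray[None, :])    # gray-difference factor
    d = np.abs(dist[:, None] - dist[None, :])    # radial-distance factor
    th = np.abs(ang[:, None] - ang[None, :])     # direction-angle factor
    th = np.minimum(th, 2 * np.pi - th)

    s1, s2, s3 = sigmas
    w = (np.exp(-(g / s1) ** 2) * np.exp(-(d / s2) ** 2)
         * np.exp(-(th / s3) ** 2))
    w /= w.sum(axis=1, keepdims=True)    # N(i): per-pixel normalisation
    return w @ gray                       # weighted average of I(j)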
Step 4: extract sub-pixel edges from the planet's arc edge using the partial-area effect along the gradient direction, as follows (for the traditional partial-area-effect method, see Agustín Trujillo-Pino et al., "Accurate subpixel edge location based on partial area effect").
Considering the influence of imaging blur on the vision system, the blurred edge model is used when solving for the sub-pixel edge of the planet's arc. As shown in Fig. 6(a), the width between L_1 and L_1′ is the blur width, and the region between L_2 and L_2′ is the EROI, denoted Ω_EROI. L_0 is the boundary curve, represented in the local edge-region model by the conic y = c_1x² + c_2x + c_3, which passes through the position of pixel (m, n) and divides the image into two parts of different gray levels. The gray levels of target and background are A and B respectively, and the image is denoted G.
Take the edge pixel at coordinates (m, n) as an example; its gradient direction g_mn is the direction from the planet center to pixel (m, n). Set the horizontal direction as the x axis and the vertical direction as the y axis. β is the counterclockwise rotation angle from the y axis to g_mn; we take the direction of g_mn as the y′ axis and establish the x′y′ coordinate system shown in the figure, with (m, n) as its origin. Along g_mn, a rectangular template of size t×3 is chosen, with t = ceil(D_blur) + 1, where ceil rounds its argument up to the nearest integer.
Step 4.1: define an inclined unit-area region i′ with gray level I_i′ and side length h. As shown in Fig. 6(b), take the case of four nearest-neighbor pixels as an example (the number of nearest neighbors varies with the tilt angle), with pixel gray values I_1, I_2, I_3, I_4. According to the ideal pixel imaging principle, the gray value of i′ is

I_i′ = Σ_j η_j I_j

where η_j is the area fraction of i′ within the surrounding unit pixel j and I_j is the neighboring pixel value. As shown in Fig. 6(b), (x_1, y_1), (x_2, y_2), ..., (x_n, y_n) are the polygon vertex coordinates sorted counterclockwise.
Step 4.2: the blurred-edge gray distribution model is obtained by convolving a smoothing mask with the step-edge model image. A common smoothing mask satisfies m_00 > m_01 > m_11 and m_00 + 4m_01 + 4m_11 = 1. The smoothed image is the convolution of the mask with F, where F is the ideal step-edge image.
Step 4.3: in the x′y′ coordinate system, the curve through the tilted template is set as the conic y = c_1x² + c_2x + c_3. The coefficients of the quadratic are expressed in terms of S_L, S_M and S_R, the cumulative sums of the pixel gray values in the first, second and third columns of the adaptive template, where t is the template length and A and B are the gray values on the two sides of the edge.
Step 4.4: the gray values on the two sides of the edge can be determined from G′, the gray values of the adaptive template. The sub-pixel coordinates are (0, c_3).
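A simplified version of the column-sum construction above can be sketched as follows. It assumes the t×3 template is already aligned with the gradient direction and uses the continuous column-sum model S = t(A+B)/2 + y(A−B), with gray A below the edge and B above and rows centered on the middle pixel; the patent's exact equations are not reproduced in this text:

```python
import numpy as np

def edge_offset_column(S, t, A, B):
    # invert the continuous column-sum model S = t*(A+B)/2 + y*(A-B)
    return (S - t * (A + B) / 2.0) / (A - B)

def partial_area_fit(window, A, B):
    """window: t x 3 gray array whose rows run along the gradient
    direction; returns (c1, c2, c3) of y = c1*x^2 + c2*x + c3 fitted
    through the edge heights of the columns at x = -1, 0, +1."""
    t = window.shape[0]
    yL, yM, yR = (edge_offset_column(window[:, j].sum(), t, A, B)
                  for j in range(3))
    c1 = (yL + yR) / 2.0 - yM   # parabola through the three heights
    c2 = (yR - yL) / 2.0
    c3 = yM                     # sub-pixel edge point is (0, c3)
    return c1, c2, c3
```

On a synthetic straight horizontal edge the fit returns c1 = c2 = 0 and c3 equal to the true edge offset within the middle column.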
Step 4.5: generalizing to arbitrary orientations, with β the angle between the two coordinate systems as shown in Fig. 6(a), the sub-pixel coordinates are transformed back into the xOy coordinate system.
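The conversion back into the image frame is a rotation by β plus a translation to the template's origin pixel (m, n); the sign convention below, with the x′y′ frame obtained by rotating xOy counterclockwise by β, is an assumption, as the patent's formula is not reproduced in this text:

```python
import math

def subpixel_to_image(m, n, c3, beta):
    # rotate the fitted edge point (0, c3) from the x'y' frame back
    # into xOy and translate to the template origin pixel (m, n)
    x = m - c3 * math.sin(beta)
    y = n + c3 * math.cos(beta)
    return x, y
```

With β = 0 the x′y′ frame coincides with xOy and the edge point is simply (m, n + c_3).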
Step 5: least square fitting being carried out to sub-pixel edge coordinate, finds out high-precision planetocentric coordinates, is realized Steps are as follows.
Step 51: the quadratic sum expression formula of data point to distance of round;
Step 52: making the partial derivative zero of each variable of expression formula in step 51, list system of linear equations;
Step 53: solution system of linear equations acquires the information such as the center of circle radius of fitting circle.
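The least-squares circle fit above corresponds to the algebraic (Kåsa) formulation, in which substituting c = r² − a² − b² makes the normal equations linear in (a, b, c); a minimal sketch:

```python
import numpy as np

def fit_circle_lsq(points):
    """Algebraic least-squares circle fit: from (x-a)^2 + (y-b)^2 = r^2
    rewrite x^2 + y^2 = 2ax + 2by + c with c = r^2 - a^2 - b^2, then
    solve the resulting linear system in (a, b, c)."""
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    b = x ** 2 + y ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx ** 2 + cy ** 2)
    return (cx, cy), r
```

Given noise-free points sampled from a circle, the fit recovers the center and radius exactly up to floating-point error; on the sub-pixel edge set it returns the planet-center estimate.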
2. Embodiment results
Through the processing of planet images, the present invention provides a high-precision planet-center localization method. For regular planets, it obtains high-precision sub-pixel edges and center coordinates, and it is robust to planet images with noise, complex backgrounds, and rich texture.
To verify the correctness and effectiveness of the invention, a series of simulation experiments were carried out, as follows:
Planet images were simulated with the open-source software Celestia. Celestia can simulate the surface texture of celestial bodies and outer-space features such as atmospheres and cloud layers, with high similarity to real celestial images. Figs. 7(a)-(e) show images of the moon, Venus, Mercury, Mars and Jupiter respectively. For each planet, 20 images of different phases and distances were simulated, processed and analyzed; the camera-to-planet distance is known, and the apparent diameter varies with the planet's state. Taking moon-center localization accuracy as an example, our method reaches a precision of 0.0972 pixel; compared with the traditional pixel-level edge extraction method, the precision is improved by 96.38%, and compared with current methods that extract sub-pixel edges from planet images and then fit the center, the precision is improved by 68.21%.
Fig. 8 shows the analysis of real moon photographs using the present invention. Fig. 8(a) was taken by the American Apollo 11 spacecraft on its return journey; when the photo was taken, the spacecraft was more than 10,000 nautical miles away. Fig. 8(b) is a moon image taken by International Space Station crew at 00:00 Greenwich time on November 12, 2013. In the figures, the circle represents the fitted planet contour, the solid arc represents the bright arc portion of the planet, and × marks the moon-center coordinates fitted by our method. Image credit: US National Aeronautics and Space Administration.

Claims (5)

1. A high-precision planet-center localization method for autonomous deep-space optical navigation, characterized by the following steps:
a. preprocessing the image to obtain the planet's arc-edge coordinates and initial values of the planet center and radius;
b. extracting the planet's edge region of interest (EROI);
c. applying non-local filtering to the planet's EROI;
d. extracting sub-pixel edges from the planet's arc edge using the partial-area-effect method along the gradient direction;
e. solving for high-precision planet-center coordinates.
2. The high-precision planet-center localization method for autonomous deep-space optical navigation according to claim 1, characterized in that in step b the arc-edge region of interest (EROI) is extracted as follows:
(b1) with the initial center and radius of the planet obtained in step a, abnormal arc-edge coordinates whose distance to the center differs from the radius by more than 2 pixels are rejected;
(b2) the pixel coordinates whose distance to the center differs from the radius by at most (D_blur/2 + 2) pixels are selected as the edge region of interest (EROI), D_blur being the width of the blurred region along the gradient direction.
3. The high-precision planet-center localization method for autonomous deep-space optical navigation according to claim 1, characterized in that in step c the EROI region is filtered non-locally as follows:
(c1) for any two pixels i and j in the EROI, their gray difference g(i, j), the gray-difference factor, is computed; the difference d(i, j) of their distances to the circle center along the gradient direction is the gradient-direction distance factor; the angle θ(i, j) between their gradient directions is the gradient-direction difference factor, θ ∈ [0°, 180°);
(c2) the correlation weight of any two pixels i and j in the EROI is computed;
(c3) from the correlation weights between pixel i and the other pixels in the EROI, the filtered gray value of pixel i is computed, and this is extended to compute the filtered gray value of every pixel.
4. The high-precision planet-center localization method for autonomous deep-space optical navigation according to claim 1, characterized in that in step d sub-pixel extraction is performed on the planet's arc edge using the partial-area-effect method along the gradient direction, as follows:
(d1) the local edge region is modeled by a conic curve passing through the position of pixel (m, n) and dividing the image into two parts of different gray levels, the gray levels of target and background being A and B respectively;
(d2) the gradient direction of each edge pixel is computed, i.e. the direction from the initial planet center to the edge pixel, and the size of the template used by the partial-area effect is computed, the template size being determined by D_blur;
(d3) a t×3 template is chosen along the gradient direction, t being the template height and 3 denoting 3 columns of pixels;
(d4) the template's gray values along the gradient direction are computed according to the imaging principle; when the gradient direction is neither perpendicular nor parallel to the image coordinate system, the pixel gray values of the gradient-direction template must be constructed;
(d5) a new coordinate system is established whose y′ axis is the gradient direction and whose x′ axis is perpendicular to the y′ axis;
(d6) in the new coordinate system, the partial-area-effect equations are constructed within the gradient-direction template from the gray distribution and the cumulative areas on both sides of the edge, and the sub-pixel edge coordinates are solved;
(d7) after the sub-pixel edge coordinates are computed in the new coordinate system, they are transformed back to the xOy coordinate system, yielding a set of sub-pixel edge coordinates in the xOy system.
5. The high-precision planet-center localization method for autonomous deep-space optical navigation according to claim 1, characterized in that in step e the planet center is fitted by least squares to obtain high-precision planet-center coordinates.
CN201811186730.XA 2018-10-12 2018-10-12 High-precision planet center positioning method in deep space autonomous optical navigation Active CN109344785B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811186730.XA CN109344785B (en) 2018-10-12 2018-10-12 High-precision planet center positioning method in deep space autonomous optical navigation

Publications (2)

Publication Number Publication Date
CN109344785A true CN109344785A (en) 2019-02-15
CN109344785B CN109344785B (en) 2021-10-01

Family

ID=65309667

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811186730.XA Active CN109344785B (en) 2018-10-12 2018-10-12 High-precision planet center positioning method in deep space autonomous optical navigation

Country Status (1)

Country Link
CN (1) CN109344785B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101334263B (en) * 2008-07-22 2010-09-15 东南大学 Circular target circular center positioning method
CN102927973A (en) * 2012-10-24 2013-02-13 北京控制工程研究所 Quick edge locating method of sub pixel image of target celestial body for deep space exploration autonomous navigation
CN107945229A * 2017-10-24 2018-04-20 国家卫星气象中心 Star centroid extraction method for the area-array instrument of a geostationary-orbit Earth observation satellite
CN108305288A * 2017-10-24 2018-07-20 国家卫星气象中心 Star centroid extraction method for the line-array instrument of a geostationary-orbit Earth observation satellite

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
SILIANG DU ET AL.: "A High-accuracy Extraction Algorithm of Planet Centroid Image in Deep-space Autonomous Optical Navigation", The Journal of Navigation *
LUO, Faming: "A Comparative Study of Edge Detection Algorithms for Saturn Measurement", China Master's Theses Full-text Database *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110634146A (en) * 2019-08-30 2019-12-31 广东奥普特科技股份有限公司 Circle center sub-pixel precision positioning method
CN110634146B (en) * 2019-08-30 2022-06-17 广东奥普特科技股份有限公司 Circle center sub-pixel precision positioning method
CN111383239A (en) * 2020-02-24 2020-07-07 上海航天控制技术研究所 Mars image false edge elimination and contour accurate fitting method based on iterative search
CN111383239B (en) * 2020-02-24 2022-06-03 上海航天控制技术研究所 Mars image false edge elimination and contour accurate fitting method based on iterative search

Also Published As

Publication number Publication date
CN109344785B (en) 2021-10-01

Similar Documents

Publication Publication Date Title
CN106708066B Vision/inertial-navigation-based autonomous landing method for unmanned aerial vehicles
EP3132231B1 (en) A method and system for estimating information related to a vehicle pitch and/or roll angle
CN107590777A Star sensor star-point image enhancement method
CN106529538A (en) Method and device for positioning aircraft
CN107504966B (en) Method for extracting navigation star points in daytime cloud environment
CN106339006A (en) Object tracking method of aircraft and apparatus thereof
CN109344878B (en) Eagle brain-like feature integration small target recognition method based on ResNet
WO2012126500A1 (en) 3d streets
CN108917753B (en) Aircraft position determination method based on motion recovery structure
CN111626176A (en) Ground object target detection method and system of remote sensing image
CN110930508A (en) Two-dimensional photoelectric video and three-dimensional scene fusion method
CN114004977B (en) Method and system for positioning aerial data target based on deep learning
CN113686314B (en) Monocular water surface target segmentation and monocular distance measurement method for shipborne camera
Zhang et al. High-accuracy location algorithm of planetary centers for spacecraft autonomous optical navigation
CN109612438A Initial orbit determination method for a space target under a virtual coplanarity constraint
CN109344785A High-precision planet center positioning method in deep space autonomous optical navigation
CN116051766A Extraterrestrial planet surface environment reconstruction method based on neural radiance fields
CN113029132A (en) Spacecraft navigation method combining ground image and astrolabe measurement
CN106250898A (en) A kind of image local area feature extracting method based on scale prediction
CN109064510B (en) Total station and star point centroid extraction method of star image thereof
Koizumi et al. Development of attitude sensor using deep learning
Kikuya et al. Attitude determination algorithm using Earth sensor images and image recognition
CN113436313A (en) Three-dimensional reconstruction error active correction method based on unmanned aerial vehicle
KR101767197B1 (en) Simulator for the Verification of Celestial Navigation Algorithm of a Satellite and the Celestial Navigation Algorithm Verification Method using the Simulator
CN116452757A (en) Human body surface reconstruction method and system under complex scene

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant