CN104103070A - Landing point selecting method based on optical images - Google Patents

Landing point selecting method based on optical images

Info

Publication number
CN104103070A
Authority
CN
China
Prior art keywords
landing point
optical imagery
landing
grid
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410225479.9A
Other languages
Chinese (zh)
Other versions
CN104103070B (en)
Inventor
王立
张洪华
黄翔宇
杨春河
梁潇
余成武
周建涛
吴奋陟
赵宇
李春艳
郑璇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Control Engineering
Original Assignee
Beijing Institute of Control Engineering
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Control Engineering filed Critical Beijing Institute of Control Engineering
Priority to CN201410225479.9A priority Critical patent/CN104103070B/en
Publication of CN104103070A publication Critical patent/CN104103070A/en
Application granted granted Critical
Publication of CN104103070B publication Critical patent/CN104103070B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Abstract

The invention relates to a landing point selection method based on optical images. First, an optical image of the landing region is acquired with an optical imaging sensor and divided into grids, and the center of each grid is taken as a candidate landing point. Next, all pixels are classified by K-means clustering, and all pixels in the brightest and darkest classes are treated as obstacle points. Then, for each grid, the obstacle points in a search region are counted and, together with the grayscale mean square deviation of that region, used to compute a safety value for each candidate landing point. The safety values are sorted in ascending order, the candidate landing points corresponding to the first J values are selected as initial landing points, and their safety values are normalized. The fuel consumption values corresponding to the J candidate landing points are then calculated and normalized. Finally, the safety values and fuel consumption values are combined to determine the final landing point. The method has a small computational load and high accuracy, and is suitable for real-time landing computation on board a lander.

Description

Landing point selection method based on optical images
Technical field
The present invention relates to a method for selecting a spacecraft landing point during landing on an extraterrestrial celestial body.
Background art
When landing on an extraterrestrial celestial body, for example on the lunar surface, the terrain is unknown and the surface is covered with numerous impact craters and rocks, so determining a safe landing region in order to avoid obstacles is critically important.
At present, the sensors used for safety obstacle avoidance include optical imaging sensors and laser three-dimensional imaging sensors; the former identify obstacles from two-dimensional grayscale images, while the latter identify obstacles from elevation data (i.e., DEM data). Because optical obstacle-avoidance sensors are light, small, and technologically mature, landers both at home and abroad have used optical obstacle-avoidance sensors as their obstacle-avoidance means.
Conventional methods use techniques such as position search, shadow computation, and rock recognition to screen for safe points. For example, the paper "Rock detection during probe landing based on passive images" (Opto-Electronic Engineering, Issue 1, 2009) used threshold segmentation, shadow detection, and rock edge detection; the paper "Rock detection and avoidance for planetary soft landing based on a CCD landing camera" (Acta Aeronautica et Astronautica Sinica, Issue 6, 2008) used threshold segmentation, rock detection, area assessment, and spiral search. The shortcomings of these methods are, first, that the computation is complex and it is difficult to meet real-time requirements; second, that only grayscale magnitude and a one-sided shadow threshold are considered, ignoring the danger posed by bright areas and areas of strong brightness variation; and finally, that only safety is considered in the result, whereas in an actual landing there are generally multiple landing points with identical or nearly identical safety, so that the landing point is difficult to determine precisely.
Summary of the invention
The technical problem solved by the present invention is: to overcome the deficiencies of the prior art and propose a landing point selection method with high reliability and good real-time performance, suitable for rapid selection of a spacecraft landing point during landing on an extraterrestrial celestial body.
The technical solution of the present invention is: a landing point selection method based on optical images, comprising the following steps:
(1) acquiring an optical image of the landing region with the optical imaging sensor carried on the lander, the resolution of the optical image being A*B;
(2) dividing the optical image into S*T identical grids and taking the center of each grid as a candidate landing point, where S is the number of grids of the optical image in the row direction and T is the number of grids in the column direction;
(3) classifying all pixels of the optical image into P classes by K-means clustering, then taking all pixels in the darkest class as shadow points and all pixels in the brightest class as highlight points, and treating shadow points and highlight points together as obstacle points;
(4) for each grid, searching for and counting obstacle points in a calculation region centered on the candidate landing point with a radius of at least the length or width of one grid, thereby obtaining the obstacle point count Hazard_num(i) of each of the S*T calculation regions, i=1,2,3......,S*T;
(5) computing the grayscale mean square deviation St_dev(i) of each calculation region;
(6) from the results of step (4) and step (5), calculating the safety value Safe_Q(i) corresponding to each candidate landing point,
Safe_Q(i)=Hazard_num(i)×St_dev(i)
(7) sorting the S*T safety values obtained in step (6) in ascending order, selecting the candidate landing points corresponding to the first J values of Safe_Q(i) as initial landing points, and normalizing the safety values of the J initial landing points to obtain J normalized safety values Safe1_Q(j), j=1,2,3......,J;
(8) for each of the J initial landing points, calculating the fuel consumption required for the lander to descend from the position at which the optical image was acquired to a final landing at that initial landing point, thereby obtaining J fuel consumption values d_Vel(j), and normalizing the J fuel consumption values to obtain J normalized fuel consumption values d1_Vel(j);
(9) from the results of step (7) and step (8), calculating the comprehensive evaluation value Land_Q(j) corresponding to each of the J initial landing points,
Land_Q(j)=k1·Safe1_Q(j)+k2·d1_Vel(j), k1+k2=1;
(10) among the J comprehensive evaluation values calculated in step (9), taking the candidate landing point corresponding to the smallest value as the final landing point.
In step (1), the minimum resolution A*B of the optical image is 500*500. In step (2), the minimum number of grids S*T is 8*8. In step (3), the minimum value of P is 3. In step (7), the minimum value of J is 3.
Compared with the prior art, the advantages of the present invention are:
(1) The method of the present invention is based on optical images and first divides the image into grids. The grid division establishes the set of candidate landing points in advance, overcoming the large computational load of conventional methods, significantly reducing computation and increasing computing speed, so that the high real-time requirements of engineering applications can be met.
(2) The present invention uses a two-sided threshold and a two-stage safety screening. The two-sided threshold covers not only the shadow threshold but also highlight regions (which are mostly rocks and crater rims); the two-stage safety screening covers not only grayscale obstacle regions but also uses the grayscale mean square deviation to screen regions of strong brightness variation, significantly improving the reliability of landing point selection.
(3) The method of the present invention uses a comprehensive safe-point decision model that combines safety with fuel consumption, taking both selection criteria into account. On the one hand this improves the reliability of landing point selection; on the other hand it strengthens the engineering practicality of the method.
Brief description of the drawings
Fig. 1 is a flow chart of the method of the present invention.
Embodiment
As shown in Fig. 1, the method of the present invention mainly comprises the following steps: (1) acquiring an optical image of the landing region with an optical imaging sensor; (2) dividing the optical image into grids, with each grid center serving as a candidate landing point; (3) performing K-means classification and taking shadow and highlight pixels as obstacle points; (4) for each grid center, counting the obstacle points in its region and computing the region's mean square deviation; (5) obtaining the safety value from the obstacle count and the mean square deviation; (6) sorting by safety value, outputting the J safest points, and normalizing their safety values; (7) calculating the velocity increment or fuel consumption for the J points; (8) calculating a comprehensive evaluation value from the consumption values and the safety values; (9) selecting the position corresponding to the smallest comprehensive evaluation value as the final safe point for output.
Each step is described in detail below.
(1) The landing region is imaged with an optical imaging sensor to obtain an optical image F(x,y) of the landing region. Assume the array is 1024×1024, so x, y = [1, 2, ..., 1024].
(2) The optical image is divided into grids, for example 16×16 (or 32×32, but at least 8×8) grids. Each grid center point [m, n] is then a candidate landing point, with the corresponding coordinates:
m=[1,64,128…960,1024], n=[1,64,128…960,1024].
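By way of illustration, a minimal Python sketch of the grid division is given below. The function name, the use of numpy, and the convention of placing each candidate point at the geometric center of its cell are illustrative assumptions and not taken from the patent; the exact pixel coordinates of the centers depend on the indexing convention chosen.

```python
import numpy as np

def grid_centers(image_shape, s, t):
    """Return candidate landing points as the (row, col) centers of an s x t grid."""
    rows, cols = image_shape
    cell_h, cell_w = rows // s, cols // t
    # Place each candidate at the middle of its cell; with a 1024x1024 image and a
    # 16x16 grid the candidate points are spaced 64 pixels apart.
    m = np.arange(s) * cell_h + cell_h // 2
    n = np.arange(t) * cell_w + cell_w // 2
    return [(int(i), int(j)) for i in m for j in n]

centers = grid_centers((1024, 1024), 16, 16)   # 256 candidate landing points
```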
(3) Target segmentation is performed on the principle that terrain relief produces variations in image brightness: K-means classification is carried out on the gray-level histogram of the image F(x,y), with the number of classes P chosen from [3, 4, 5]. The K-means classification algorithm can be found in standard textbooks. Assume here that class 1 is shadow and class P is highlight; the shadow and highlight pixels are taken as obstacle points.
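A minimal sketch of this classification step is shown below, assuming a uint8 grayscale image and using scikit-learn's KMeans on the gray-level histogram; the function name and parameter defaults are illustrative assumptions, not part of the patent.

```python
import numpy as np
from sklearn.cluster import KMeans

def obstacle_mask(image, p=3):
    """Cluster the gray-level histogram into p classes and mark the darkest and
    brightest classes (shadows and highlights) as obstacle points.
    Assumes `image` is a uint8 grayscale array."""
    hist = np.bincount(image.ravel(), minlength=256).astype(np.float64)
    levels = np.arange(256, dtype=np.float64).reshape(-1, 1)
    km = KMeans(n_clusters=p, n_init=10, random_state=0)
    labels = km.fit_predict(levels, sample_weight=hist)   # class of each gray level
    class_means = km.cluster_centers_.ravel()
    dark, bright = int(np.argmin(class_means)), int(np.argmax(class_means))
    level_is_obstacle = np.isin(labels, [dark, bright])
    return level_is_obstacle[image]   # boolean mask, True at obstacle pixels
```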
(4) For each candidate landing point grid center [m, n], obstacle points are counted in a region around it. The region size may be 3×3 or 5×5 grid cells (the region is generally measured in grid units, and its radius covers at least the length or width of one grid). Each time an obstacle point is encountered in the region, the counter Hazard_num is incremented:
Hazard_num = Hazard_num + 1
The mean square deviation is computed over the whole region. First the region mean is computed; assuming the region is 48×48 pixels:
μ(m,n) = (1/(48×48)) · Σ_{i=-23}^{24} Σ_{j=-23}^{24} I(m+i, n+j)
Then the mean square deviation at the grid center is computed:
St_dev = σ(m,n) = sqrt( (1/(48×48)) · Σ_{i=-23}^{24} Σ_{j=-23}^{24} [I(m+i, n+j) − μ(m,n)]² ).
(5) The safety value corresponding to this grid center is obtained:
Safe_Q(m,n) = Hazard_num × St_dev.
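Steps (4) and (5) can be illustrated with the following sketch, which counts obstacle points in the 48×48 region and takes St_dev as the standard deviation of the region's gray levels. The names, the window parameterization, and the absence of border handling are illustrative assumptions.

```python
import numpy as np

def safety_value(image, obstacle, center, half=24):
    """Safety value Safe_Q = Hazard_num x St_dev for one candidate landing point;
    smaller is safer. Assumes the 48x48 window (offsets -23..+24) lies fully
    inside the image; border grids would need clamping."""
    m, n = center
    win = (slice(m - half + 1, m + half + 1), slice(n - half + 1, n + half + 1))
    region = image[win].astype(np.float64)
    hazard_num = int(obstacle[win].sum())   # Hazard_num: obstacle points in the region
    st_dev = float(region.std())            # St_dev: grayscale mean square deviation
    return hazard_num * st_dev
```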
(6) The J safest positions are output and the safety values are normalized.
The values Safe_Q corresponding to all grid centers are sorted, and the J grid centers with the smallest values are output together with their safety values. The maximum Max_Safe_Q of the J safety values is then found, and the values are normalized:
Safe1_Q(m,n) = Safe_Q(m,n) / Max_Safe_Q
The value of J generally depends on the available computational resources; it is generally not less than 3, and typically 3 to 10.
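A minimal sketch of this sorting and normalization step follows; the function name and the default J=5 are illustrative choices.

```python
import numpy as np

def select_initial_points(centers, safety_values, j=5):
    """Sort candidates by safety value (ascending), keep the J safest,
    and normalize the kept values by their maximum (Safe1_Q)."""
    order = np.argsort(safety_values)[:j]
    kept_centers = [centers[k] for k in order]
    kept_q = np.asarray(safety_values, dtype=np.float64)[order]
    safe1_q = kept_q / max(kept_q.max(), 1e-12)   # guard against an all-zero case
    return kept_centers, safe1_q
```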
(7) The corresponding velocity increment is calculated for each of the J grid centers: taking each grid center as the preselected landing point, the required fuel consumption d_Vel_n is calculated and then normalized to d1_Vel_n, using the same normalization procedure as for the safety values.
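The patent does not spell out the guidance law behind d_Vel. As a placeholder only, the sketch below uses the horizontal offset of each candidate from an assumed sub-lander (nadir) point as a crude stand-in for the required velocity increment, followed by the same normalization as for the safety values; every name and the proxy itself are assumptions, not the patented computation.

```python
import numpy as np

def fuel_proxy(kept_centers, nadir, scale=1.0):
    """Placeholder d_Vel: pixel offset from the assumed sub-lander point, standing
    in for the velocity increment a real guidance computation would provide."""
    d_vel = np.array([scale * np.hypot(m - nadir[0], n - nadir[1])
                      for (m, n) in kept_centers])
    return d_vel / max(d_vel.max(), 1e-12)    # d1_Vel, normalized like Safe1_Q
```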
(8) The comprehensive evaluation value of each preselected point is calculated:
Land_Q_n = k1·Safe1_Q_n + k2·d1_Vel_n (k1 + k2 = 1)
For example, k1=0.7 and k2=0.3; k1=0.6 and k2=0.4; or k1=0.5 and k2=0.5.
(9) The position corresponding to the smallest comprehensive evaluation value is selected as the final safe point for output: the values Land_Q_n are sorted, and the grid center position corresponding to the minimum value gives the coordinates of the final safe point.
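A minimal sketch of steps (8) and (9), with illustrative names and the example weights k1=0.7, k2=0.3:

```python
import numpy as np

def final_landing_point(kept_centers, safe1_q, d1_vel, k1=0.7, k2=0.3):
    """Comprehensive evaluation Land_Q = k1*Safe1_Q + k2*d1_Vel (k1 + k2 = 1);
    the candidate with the smallest Land_Q is returned as the final landing point."""
    land_q = k1 * np.asarray(safe1_q) + k2 * np.asarray(d1_vel)
    best = int(np.argmin(land_q))
    return kept_centers[best], land_q
```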
Content not described in detail in this specification belongs to technology well known to those skilled in the art.

Claims (5)

1. A landing point selection method based on optical images, characterized by comprising the following steps:
(1) acquiring an optical image of the landing region with the optical imaging sensor carried on the lander, the resolution of the optical image being A*B;
(2) dividing the optical image into S*T identical grids and taking the center of each grid as a candidate landing point, where S is the number of grids of the optical image in the row direction and T is the number of grids in the column direction;
(3) classifying all pixels of the optical image into P classes by K-means clustering, then taking all pixels in the darkest class as shadow points and all pixels in the brightest class as highlight points, and treating shadow points and highlight points together as obstacle points;
(4) for each grid, searching for and counting obstacle points in a calculation region centered on the candidate landing point with a radius of at least the length or width of one grid, thereby obtaining the obstacle point count Hazard_num(i) of each of the S*T calculation regions, i=1,2,3......,S*T;
(5) computing the grayscale mean square deviation St_dev(i) of each calculation region;
(6) from the results of step (4) and step (5), calculating the safety value Safe_Q(i) corresponding to each candidate landing point,
Safe_Q(i)=Hazard_num(i)×St_dev(i);
(7) sorting the S*T safety values obtained in step (6) in ascending order, selecting the candidate landing points corresponding to the first J values of Safe_Q(i) as initial landing points, and normalizing the safety values of the J initial landing points to obtain J normalized safety values Safe1_Q(j), j=1,2,3......,J;
(8) for each of the J initial landing points, calculating the fuel consumption required for the lander to descend from the position at which the optical image was acquired to a final landing at that initial landing point, thereby obtaining J fuel consumption values d_Vel(j), and normalizing the J fuel consumption values to obtain J normalized fuel consumption values d1_Vel(j);
(9) from the results of step (7) and step (8), calculating the comprehensive evaluation value Land_Q(j) corresponding to each of the J initial landing points,
Land_Q(j)=k1·Safe1_Q(j)+k2·d1_Vel(j), k1+k2=1;
(10) among the J comprehensive evaluation values calculated in step (9), taking the candidate landing point corresponding to the smallest value as the final landing point.
2. The landing point selection method based on optical images according to claim 1, characterized in that: in step (1), the minimum resolution A*B of the optical image is 500*500.
3. The landing point selection method based on optical images according to claim 1, characterized in that: in step (2), the minimum number of grids S*T is 8*8.
4. The landing point selection method based on optical images according to claim 1, characterized in that: in step (3), the minimum value of P is 3.
5. The landing point selection method based on optical images according to claim 1, characterized in that: in step (7), the minimum value of J is 3.
CN201410225479.9A 2014-05-26 2014-05-26 Landing point selecting method based on optical images Active CN104103070B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410225479.9A CN104103070B (en) 2014-05-26 2014-05-26 Landing point selecting method based on optical images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410225479.9A CN104103070B (en) 2014-05-26 2014-05-26 Landing point selecting method based on optical images

Publications (2)

Publication Number Publication Date
CN104103070A true CN104103070A (en) 2014-10-15
CN104103070B CN104103070B (en) 2015-07-08

Family

ID=51671190

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410225479.9A Active CN104103070B (en) 2014-05-26 2014-05-26 Landing point selecting method based on optical images

Country Status (1)

Country Link
CN (1) CN104103070B (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104361576A (en) * 2014-10-20 2015-02-18 南京理工大学 Elevation value-based 3D barrier method and device for remote sensor automatic safety area screening
CN105203114A (en) * 2015-10-20 2015-12-30 北京理工大学 Planet safe landing point online selecting method
CN105333873A (en) * 2015-10-20 2016-02-17 北京理工大学 Planet safe landing guidance method employing landing point on-line selection
CN105589997A (en) * 2015-12-23 2016-05-18 重庆科技学院 Method and system for searching safety zone of elevation map based on Monte Carlo algorithm
CN106228131A (en) * 2016-07-20 2016-12-14 哈尔滨工业大学 Planetary landing device self adaptation disorder detection method
CN107144278A (en) * 2017-04-24 2017-09-08 北京理工大学 A kind of lander vision navigation method based on multi-source feature
CN109460057A (en) * 2018-11-16 2019-03-12 航宇救生装备有限公司 A kind of gridding parafoil towards multiple target is gone home method
CN109598243A (en) * 2018-12-06 2019-04-09 山东大学 A kind of moonscape safe landing area's selection method and system
CN106228131B (en) * 2016-07-20 2019-07-16 哈尔滨工业大学 The adaptive disorder detection method of planetary landing device
CN116628251A (en) * 2023-06-19 2023-08-22 北京控制工程研究所 Method, device, equipment and medium for searching moon surface safety area

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8615337B1 (en) * 2008-09-25 2013-12-24 Rockwell Collins, Inc. System supporting flight operations under instrument meteorological conditions using precision course guidance
CN103499971A (en) * 2013-09-30 2014-01-08 北京控制工程研究所 Sequential control method for landing obstacle avoidance of lunar probe
CN103662091A (en) * 2013-12-13 2014-03-26 北京控制工程研究所 High-precision safe landing guiding method based on relative navigation

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8615337B1 (en) * 2008-09-25 2013-12-24 Rockwell Collins, Inc. System supporting flight operations under instrument meteorological conditions using precision course guidance
CN103499971A (en) * 2013-09-30 2014-01-08 北京控制工程研究所 Sequential control method for landing obstacle avoidance of lunar probe
CN103662091A (en) * 2013-12-13 2014-03-26 北京控制工程研究所 High-precision safe landing guiding method based on relative navigation

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
丁萌 et al.: "Rock detection during probe landing based on passive images", Opto-Electronic Engineering *
黄翔宇 et al.: "Research and experiments on an autonomous navigation method for the target approach phase based on image measurement data", Science in China *

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104361576A (en) * 2014-10-20 2015-02-18 南京理工大学 Elevation value-based 3D barrier method and device for remote sensor automatic safety area screening
CN104361576B (en) * 2014-10-20 2018-01-05 南京理工大学 The 3D barrier-avoiding methods and device of remote sensor automatic screening safety zone based on height value
CN105203114B (en) * 2015-10-20 2018-01-16 北京理工大学 A kind of online choosing method of planet safe landing point
CN105203114A (en) * 2015-10-20 2015-12-30 北京理工大学 Planet safe landing point online selecting method
CN105333873A (en) * 2015-10-20 2016-02-17 北京理工大学 Planet safe landing guidance method employing landing point on-line selection
CN105333873B (en) * 2015-10-20 2018-01-16 北京理工大学 The planet safe landing method of guidance that a kind of landing point is chosen online
CN105589997A (en) * 2015-12-23 2016-05-18 重庆科技学院 Method and system for searching safety zone of elevation map based on Monte Carlo algorithm
CN105589997B (en) * 2015-12-23 2018-08-14 重庆科技学院 Elevation map safety zone searching method based on Monte Carlo EGS4 method and system
CN106228131A (en) * 2016-07-20 2016-12-14 哈尔滨工业大学 Planetary landing device self adaptation disorder detection method
CN106228131B (en) * 2016-07-20 2019-07-16 哈尔滨工业大学 The adaptive disorder detection method of planetary landing device
CN107144278A (en) * 2017-04-24 2017-09-08 北京理工大学 A kind of lander vision navigation method based on multi-source feature
CN107144278B (en) * 2017-04-24 2020-02-14 北京理工大学 Lander visual navigation method based on multi-source characteristics
CN109460057A (en) * 2018-11-16 2019-03-12 航宇救生装备有限公司 A kind of gridding parafoil towards multiple target is gone home method
CN109460057B (en) * 2018-11-16 2021-10-15 航宇救生装备有限公司 Multi-target-oriented gridding parafoil homing method
CN109598243A (en) * 2018-12-06 2019-04-09 山东大学 A kind of moonscape safe landing area's selection method and system
CN116628251A (en) * 2023-06-19 2023-08-22 北京控制工程研究所 Method, device, equipment and medium for searching moon surface safety area
CN116628251B (en) * 2023-06-19 2023-11-03 北京控制工程研究所 Method, device, equipment and medium for searching moon surface safety area

Also Published As

Publication number Publication date
CN104103070B (en) 2015-07-08

Similar Documents

Publication Publication Date Title
CN104103070B (en) Landing point selecting method based on optical images
Chadwick et al. Distant vehicle detection using radar and vision
CN104536009B (en) Above ground structure identification that a kind of laser infrared is compound and air navigation aid
CN113567984A (en) Method and system for detecting artificial small target in SAR image
CN103473786B (en) Gray level image segmentation method based on multi-objective fuzzy clustering
CN103279759A (en) Vehicle front trafficability analyzing method based on convolution nerve network
CN109598243B (en) Moon surface safe landing area selection method and system
CN103034863A (en) Remote-sensing image road acquisition method combined with kernel Fisher and multi-scale extraction
CN105260737A (en) Automatic laser scanning data physical plane extraction method with multi-scale characteristics fused
CN103366602A (en) Method of determining parking lot occupancy from digital camera images
CN101126812A (en) High resolution ratio remote-sensing image division and classification and variety detection integration method
CN108734219B (en) End-to-end collision pit detection and identification method based on full convolution neural network structure
CN103985127B (en) The detection method of small target of a kind of intensive star background and device
CN103218787A (en) Multi-source heterogeneous remote-sensing image control point automatic collecting method
CN105139375B (en) Combining global DEM and stereoscopic vision a kind of satellite image cloud detection method of optic
CN104361351A (en) Synthetic aperture radar (SAR) image classification method on basis of range statistics similarity
CN105096300A (en) Object detecting method and device
CN109708648A (en) A kind of classification discrimination method of spatial movement point target
CN109977968A (en) A kind of SAR change detecting method of deep learning classification and predicting
Zheng et al. A review of remote sensing image object detection algorithms based on deep learning
CN104008403B (en) A kind of SVM(Vector machine)The multi-targets recognition decision method of pattern
CN101950019A (en) Method for identifying multi-level targets by secondary radar based on attributive data
Kukkonen et al. Fusion of crown and trunk detections from airborne UAS based laser scanning for small area forest inventories
Long et al. Object detection research of SAR image using improved faster region-based convolutional neural network
CN107463944A (en) A kind of road information extracting method using multidate High Resolution SAR Images

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant