CN109612333B - Visual auxiliary guide system for vertical recovery of reusable rocket - Google Patents


Info

Publication number
CN109612333B
CN109612333B (application CN201811326834.6A)
Authority
CN
China
Prior art keywords
landing
cooperative
beacon
landing platform
platform
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811326834.6A
Other languages
Chinese (zh)
Other versions
CN109612333A (en)
Inventor
高仕博
肖利平
唐波
李少军
魏小丹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Academy of Launch Vehicle Technology CALT
Beijing Aerospace Automatic Control Research Institute
Original Assignee
China Academy of Launch Vehicle Technology CALT
Beijing Aerospace Automatic Control Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Academy of Launch Vehicle Technology CALT and Beijing Aerospace Automatic Control Research Institute
Priority to CN201811326834.6A
Publication of CN109612333A
Application granted
Publication of CN109612333B
Legal status: Active


Classifications

    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41: WEAPONS
    • F41G: WEAPON SIGHTS; AIMING
    • F41G 3/00: Aiming or laying means
    • F41G 3/22: Aiming or laying means for vehicle-borne armament, e.g. on aircraft
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; navigational instruments not provided for in groups G01C 1/00 to G01C 19/00
    • G01C 21/20: Instruments for performing navigational calculations


Abstract

A visual aided-guidance system for vertical recovery of a reusable rocket comprises landing navigation cameras mounted on the rocket body, landing platforms arranged on the ground or on the sea surface, and a module that estimates in real time the position and attitude deviation of the rocket body relative to the main landing platform. The cameras mounted on the rocket body comprise 4 oblique down-looking landing navigation cameras and 2 nadir-looking landing navigation cameras; the platforms arranged on the ground or sea surface comprise 1 main landing platform and 4 auxiliary cooperative-beacon platforms. While the rocket body is between 10 km and 100 m from the main landing platform, the onboard cameras image the main landing platform and the auxiliary cooperative-beacon platforms, and the estimation module computes the position and attitude deviation of the rocket body relative to the main landing platform in real time from the imaging information; this deviation can be used for precise control of the rocket body's vertical landing.

Description

Visual auxiliary guide system for vertical recovery of reusable rocket
Technical Field
The invention belongs to the technical field of visual navigation, and in particular relates to accurate estimation of the position and attitude of a reusable launch vehicle stage during vertical landing and return.
Background
Low-cost, highly reliable reusable rockets are a development direction of aerospace technology. To achieve safe, stable, and accurate vertical recovery, the relative position and attitude deviation of the rocket body with respect to the designated landing area must be accurately estimated during descent; guidance and control commands are issued according to this deviation to correct the rocket body's motion state, precisely controlling its attitude, position, and velocity, and guiding it accurately onto the landing platform. Accurately estimating the position and attitude deviation between the rocket body and the designated landing area is therefore a key technology for vertical recovery of reusable rockets.
At present, most research institutions address navigation during reusable-rocket landing with an integrated navigation system centered on inertial navigation and combined with various auxiliary devices; common equipment includes DGPS/INS integrated navigation systems, air data systems, and radar altimeters. During the landing phase the flight environment changes drastically, and problems such as vehicle vibration, aero-optical effects, and loss of satellite signal lock mean that different navigation devices operate over different time periods and introduce errors into the navigation accuracy. For example, GPS positioning error comprises systematic errors such as ephemeris error, satellite clock error, receiver clock error, antenna phase-center error, and relativistic effects, as well as incidental errors such as multipath effects, ionospheric refraction, tropospheric refraction, tides, and load tides; because these errors objectively exist, the estimate of relative position is inevitably affected. The navigation system must be extremely reliable to meet the vertical-landing requirements of a reusable rocket, which poses a serious design challenge. To achieve high-reliability, high-precision navigation, a new technical approach is needed beyond DGPS/INS integrated navigation, air data systems, radar altimeters, and similar equipment, one that additionally provides a degree of navigation redundancy when a sensor fails, so that the pose of the rocket body relative to the designated landing platform can still be accurately estimated.
Disclosure of Invention
The technical problem solved by the invention is as follows: to find a new navigation approach, namely a visual aided-guidance system for vertical recovery of a reusable rocket, that accurately estimates the relative position and attitude deviation of the rocket body with respect to the designated landing platform during landing and provides highly reliable, high-precision navigation information to the control and guidance system of the vertically returning rocket.
The technical solution of the invention is as follows: a visual aided-guidance system for vertical recovery of a reusable rocket comprises landing navigation cameras mounted on the rocket body, landing platforms arranged on the ground or on the sea surface, and a module for real-time estimation of the position and attitude deviation of the rocket body relative to the main landing platform. The cameras mounted on the rocket body comprise 4 oblique down-looking landing navigation cameras and 2 nadir-looking landing navigation cameras; the platforms arranged on the ground or sea surface comprise 1 main landing platform and 4 auxiliary cooperative-beacon platforms. While the rocket body is between 10 km and 100 m from the main landing platform, the onboard cameras image the main landing platform and the auxiliary cooperative-beacon platforms, and the estimation module computes the position and attitude deviation of the rocket body relative to the main landing platform in real time from the imaging information; this deviation can be used for precise control of the rocket body's vertical landing.
Preferably, the 4 oblique down-looking landing navigation cameras are installed on the middle-lower part of the rocket body, evenly spaced at 90° intervals around it, and the nadir-looking landing navigation cameras are mounted 0.2 m to 2 m lower than the oblique down-looking cameras; the nadir-looking cameras are evenly spaced at a 180° interval around the rocket body. The mounting angle between the central axis of a nadir-looking camera and the rocket body is 0°, while the mounting angle α between the central axis of an oblique down-looking camera and the rocket body is half the field angle γ of that camera:

α = γ/2

(the source gives the full relation only as a formula image), where K is the radius of the rocket body and H is the height of the oblique down-looking navigation camera above the ground;
the relative mounting angle between the oblique down-looking and nadir-looking landing navigation cameras is not constrained.
Preferably, the mounting angle α is in the range of 20° to 30°.
Preferably, the 4 oblique down-looking and 2 nadir-looking landing navigation cameras are all identical near-infrared imaging sensors with an imaging wavelength of 0.9 μm to 1.7 μm, a field angle γ in the range of 40° to 60°, an imaging frame rate greater than 120 frames/second, and an imaging resolution of n × n, where n is determined from the field angle γ and the spatial angular resolution β of the navigation camera (the source gives the formula for n only as an image).
Preferably, the 1 main landing platform and 4 auxiliary cooperative-beacon platforms on the ground or sea surface are arranged as follows: the 4 auxiliary cooperative-beacon platforms are placed evenly and equidistantly on a circle centered on the main landing platform.
Preferably, the distance L between each auxiliary cooperative-beacon platform and the main landing platform is made as large as possible subject to the constraint

L < H_min · tan(γ)

where H_min is the minimum height above ground at which all navigation cameras can still provide auxiliary guidance information, and γ is the field angle of the oblique down-looking and nadir-looking landing navigation cameras.
Preferably, the distance L ranges from 50 m to 80 m.
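As a worked check of the constraint on L, the sketch below evaluates the bound using the embodiment values given later in the description (H_min = 100 m, γ = 40°); the function name is illustrative, not from the patent:

```python
import math

def max_beacon_distance(h_min_m: float, gamma_deg: float) -> float:
    """Upper bound on the main-platform-to-auxiliary-beacon distance L,
    from the constraint L < H_min * tan(gamma)."""
    return h_min_m * math.tan(math.radians(gamma_deg))

# Embodiment values: H_min = 100 m, field angle gamma = 40 degrees.
bound = max_beacon_distance(100.0, 40.0)
print(round(bound, 1))      # ~83.9 m, matching the figure quoted in the description
print(50.0 < 80.0 < bound)  # True: the preferred 50-80 m range respects the bound
```

This confirms why the preferred range tops out at 80 m: it sits just under the 83.9 m visibility limit at the minimum guidance height.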
Preferably, the cooperative beacons on the main landing platform and the auxiliary cooperative-beacon platforms are all made of the same material, either a material sensitive in the near-infrared band or near-infrared light strips; the non-beacon areas of the main landing platform and the auxiliary cooperative-beacon platforms are made of metal insensitive to the near-infrared band.
Preferably, the main landing platform carries an "H + ring" cooperative beacon, in which the "H" beacon has length a; its width is given by a fixed formula in a (shown in the source only as an image; the worked dimensions later in the description are consistent with a width of about (√2/2)·a). The outer diameter of the large circle of the ring beacon is larger than 100 m, the outer diameter of the small circle ranges from 30 m to 50 m, and the ring width ranges from 10 m to 15 m. The value of a is computed as follows: first, the ground distance sr represented by one pixel (unit: m/pixel) is obtained as

sr = S / n

where S is the length of the ground instantaneous field of view captured by the oblique down-looking or nadir-looking navigation camera, and h is the height of any one navigation camera on the rocket body (the worked example later in the description is consistent with S = 2·h·tan(γ/2)); then a is obtained by multiplying sr by the number of pixels spanned by the beacon in the image.
Preferably, a is greater than 50 m, and the width of each line stroke forming the "H" ranges from 10 m to 20 m.
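The pixel-to-ground scaling above can be sketched numerically. The sketch assumes S = 2·h·tan(γ/2), which reproduces the 2.84 m/pixel figure in the embodiment (h = 10 km, γ = 40°, n = 2560); the function name is illustrative:

```python
import math

def ground_sample_distance(h_m: float, gamma_deg: float, n_pixels: int) -> float:
    """Ground distance represented by one pixel (m/pixel): sr = S / n,
    where S = 2 * h * tan(gamma / 2) is the ground span of the field of view."""
    span = 2.0 * h_m * math.tan(math.radians(gamma_deg / 2.0))
    return span / n_pixels

# Embodiment values: h = 10 km, gamma = 40 degrees, n = 2560 pixels.
sr = ground_sample_distance(10_000.0, 40.0, 2560)
print(round(sr, 2))       # ~2.84 m/pixel, as in the description

# A beacon should span at least ~10 pixels to be detectable, giving the
# minimum beacon width quoted in the description.
print(round(sr * 10, 1))  # ~28.4 m
```

Sizing the beacon length a by multiplying sr by the desired pixel span, as the text prescribes, then follows directly from this function.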
Preferably, each auxiliary cooperative-beacon platform carries an "H"-type cooperative beacon of length b; its width is given by a fixed formula in b (shown in the source only as an image; the worked dimensions later in the description are consistent with a width of about (√2/2)·b). The value of b is greater than 40 m, and the width of each line stroke forming the "H" ranges from 10 m to 15 m.
Preferably, the "H"-type cooperative beacons on the 4 auxiliary cooperative-beacon platforms are oriented so that the middle cross-stroke "-" of each "H" passes through the center point of the main landing platform.
Preferably, the module for real-time estimation of the position and attitude deviation of the rocket body relative to the main landing platform comprises a visual aided-guidance strategy module, an image processing module, and a relative-pose solving module based on image information.
The image processing module processes the images captured by the landing navigation cameras in real time, detects whether they contain cooperative beacons, identifies the type of landing platform on which each detected beacon sits, and extracts the beacon feature points; the detection result (whether beacons are present and their platform types) is sent to the visual aided-guidance strategy module, and the extracted beacon feature points are sent to the relative-pose solving module.
According to the received image processing result, the visual aided-guidance strategy module directs the relative-pose solving module to apply the corresponding pose-solving method; the relative-pose solving module then determines the position parameters, pitch angle, yaw angle, and roll angle of the rocket body relative to the main landing platform.
Preferably, the control strategy of the visual aided-guidance strategy module is as follows:
if a captured image contains cooperative beacons, determine whether any of them belongs to the main landing platform; if so, direct the relative-pose solving module to solve the relative pose using only the feature-point information of the main-platform beacon;
if no main-platform beacon is present and only beacons on auxiliary landing platforms are visible, direct the relative-pose solving module to solve the relative pose using only the feature-point information of the beacon on any one auxiliary platform;
if no cooperative beacon appears in the captured images, direct the relative-pose solving module to solve the relative pose without using cooperative-beacon image information.
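The three-branch strategy above can be sketched as a small selection function. The dict layout and function name are hypothetical stand-ins; the real module consumes detections from all six cameras:

```python
from typing import List, Optional

def select_beacon_source(detections: List[dict]) -> Optional[dict]:
    """Pick which cooperative beacon's feature points feed the pose solver.

    Each detection is a dict like {"platform": "main" | "auxiliary",
    "features": [...]}. Returns None when no cooperative beacon is seen,
    in which case the solver runs without beacon image information.
    """
    main = [d for d in detections if d["platform"] == "main"]
    if main:
        return main[0]          # a main-platform beacon always takes priority
    auxiliary = [d for d in detections if d["platform"] == "auxiliary"]
    if auxiliary:
        return auxiliary[0]     # otherwise any one auxiliary beacon suffices
    return None                 # no beacon: fall back to the other sensors

print(select_beacon_source([{"platform": "auxiliary", "features": []},
                            {"platform": "main", "features": []}])["platform"])  # main
print(select_beacon_source([]))  # None
```

The priority order mirrors the text: main platform first, then any auxiliary platform, then no beacon information at all.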
Preferably, the image processing module detects the "H"-type cooperative beacon image with a SIFT matching algorithm and uses Hough circle detection to judge whether a ring is present in the beacon region, thereby determining the type of landing platform on which the beacon sits; Harris corner feature points are then extracted from the detected "H"-type beacon image.
Preferably, when the relative-pose solving module solves the relative pose from cooperative-beacon information, it computes a translation matrix and a rotation matrix with the Tsai algorithm from the 12 feature points of the "H"-type beacon acquired in real time and the coordinates of those 12 feature points in the world coordinate system, thereby obtaining the position parameters, pitch angle, yaw angle, and roll angle of the rocket body relative to the main landing platform.
Compared with the prior art, the invention has the following beneficial effects:
(1) it provides a new navigation approach for vertical recovery of reusable rockets, adds a degree of navigation redundancy, and meets the high-precision, high-reliability navigation requirements of a vertically returning rocket;
(2) with 4 oblique down-looking and 2 nadir-looking landing navigation cameras, the system observes a wide area, which favors capture of the cooperative beacons and still yields useful visual information when some of the cameras fail;
(3) the arrangement of 1 main landing platform and 4 auxiliary cooperative-beacon platforms, distributed over a large area, lets the landing navigation cameras capture as many cooperative beacons as possible, effectively overcoming occlusion and interference and supplying usable beacon information throughout the landing.
Drawings
FIG. 1 is a schematic view of the visual aided-guidance system for vertical recovery of a reusable rocket according to the invention;
FIG. 2 is a schematic view of the landing navigation cameras mounted on the rocket body according to the invention;
FIG. 3 is a schematic view of the detection range of an oblique down-looking navigation camera: (a) in three dimensions, (b) in plan view;
FIG. 4 is a schematic cross-sectional view of the landing navigation cameras mounted on the rocket body according to the invention;
FIG. 5 is a schematic view of the landing platforms of the invention deployed on the ground or on the sea surface;
FIG. 6 is a schematic view of the main landing platform of the invention;
FIG. 7 is a schematic view of an auxiliary cooperative-beacon platform according to the invention;
FIG. 8 is a flow chart of the method for computing the pose of the rocket body relative to the main landing platform in real time according to the invention;
FIG. 9 shows the 12 Harris corner feature points extracted from the "H"-type cooperative beacon in the invention: (a) on the main landing platform, (b) on an auxiliary landing platform.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features mentioned in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
A visual aided-guidance system for vertical recovery of a reusable rocket comprises landing navigation cameras mounted on the rocket body, landing platforms arranged on the ground or on the sea surface, and a module for real-time estimation of the position and attitude deviation of the rocket body relative to the main landing platform. The cameras mounted on the rocket body comprise 4 oblique down-looking landing navigation cameras and 2 nadir-looking landing navigation cameras; the platforms arranged on the ground or sea surface comprise 1 main landing platform and 4 auxiliary cooperative-beacon platforms. While the rocket body is between 10 km and 100 m from the main landing platform, the onboard cameras image the main landing platform and the auxiliary cooperative-beacon platforms, and the estimation module computes the position and attitude deviation of the rocket body relative to the main landing platform in real time from the imaging information; this deviation can be used for precise control of the rocket body's vertical landing.
Fig. 2 shows the positions of the landing navigation cameras on the rocket body: 4 oblique down-looking and 2 nadir-looking landing navigation cameras are mounted on it. The oblique down-looking cameras sit on the middle-lower part of the rocket body, with the mounting angle α between each camera's central axis and the rocket body equal to half that camera's field angle γ. As shown in Fig. 3, within the height range of 10 km to 100 m an oblique down-looking camera can, in theory, observe a complete cooperative beacon on the main landing platform or on an auxiliary cooperative-beacon platform.
as can be seen from the geometrical relationship in the figure, the monocular oblique downward-looking navigation camera has the detection radius of
Figure GDA0002924812980000074
To ensure that the vision imaging device can fully cover the 360-degree view field of the landing area, the maximum projection distance D of the arrow body on the ground is satisfied by calculating the mutual coverage R/2 according to the detection radius of the monocular oblique downward-looking navigation camera
Figure GDA0002924812980000071
Assuming that the radius K of the arrow body is 2m, considering the more extreme case, the height H from the ground of the oblique downward-looking navigation camera is 100m,
Figure GDA0002924812980000072
alpha may be chosen to be 20 degrees.
The nadir-looking landing navigation cameras are mounted on the middle-lower part of the rocket body, 0.2 m to 2 m below the oblique down-looking cameras. Fig. 4 shows the camera installation on a cross-section of the rocket body: the angle between the central axis of a nadir-looking camera and the rocket body is 0°; the 4 oblique down-looking cameras are evenly spaced at 90° intervals and the 2 nadir-looking cameras at a 180° interval, while the relative mounting angle between the oblique down-looking and nadir-looking cameras is not constrained.
All 4 oblique down-looking and 2 nadir-looking landing navigation cameras are identical near-infrared imaging sensors with an imaging wavelength of 0.9 μm to 1.7 μm, a field angle γ = 2α = 40°, and an imaging frame rate greater than 120 frames/second. Assuming the spatial angular resolution β of the navigation camera is better than 0.5 mrad, the source computes (via a formula shown only as an image) an imaging resolution of 2560 × 2560.
As shown in Fig. 5, the 4 auxiliary cooperative-beacon platforms are placed evenly and equidistantly on a circle centered on the main landing platform. The cooperative beacons on the main and auxiliary platforms are all made of the same material, either a near-infrared-sensitive material or near-infrared light strips, while the non-beacon areas of the platforms are made of metal insensitive to the near-infrared band. The distance L between an auxiliary cooperative-beacon platform and the main landing platform is made as large as possible: taking the minimum height H_min at which all navigation cameras can still provide auxiliary guidance information as 100 m, L must satisfy L < H_min · tan(γ) = 83.9 m, so L is set to 80 m.
As shown in Fig. 6, an "H + ring" cooperative beacon is arranged on the main landing platform; the cooperative target must be large enough for reliable detection. With α = 20° and n = 2560, at a navigation camera height h = 10 km the ground distance represented by each pixel in the image is

sr = 2 · h · tan(γ/2) / n = 2 × 10000 × tan(20°) / 2560 ≈ 2.84 m/pixel

(the source shows this computation as a formula image; the reconstruction above reproduces its 2.84 m/pixel result). For the "H"-type beacon image to be detected easily, its width should span no fewer than 10 pixels, so the beacon width should exceed 2.84 m/pixel × 10 pixels = 28.4 m, and its length should exceed a proportionally larger value (given in the source as a formula image; about 40 m is consistent with the dimensions chosen below). The length a of the "H"-type beacon is set to 70 m, its width to 50 m, and the width of each line stroke forming the "H" to 20 m; the outer diameter of the large circle of the ring beacon is 100 m, the outer diameter of the small circle 50 m, and the ring width 10 m. As shown in Fig. 7, the "H"-type beacon on each auxiliary cooperative-beacon platform has a length of 42 m, a width of 30 m, and a line-stroke width of 10 m.
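The chosen beacon dimensions can be checked against the pixel budget at the 10 km acquisition height. The sketch assumes the same ground-sampling model as the 2.84 m/pixel computation above (S = 2·h·tan(γ/2), γ = 40°, n = 2560); the function name is illustrative:

```python
import math

def pixels_spanned(feature_m: float, h_m: float, gamma_deg: float = 40.0,
                   n_pixels: int = 2560) -> float:
    """Approximate number of image pixels spanned by a ground feature of the
    given size, seen by a camera at height h with field angle gamma."""
    sr = 2.0 * h_m * math.tan(math.radians(gamma_deg / 2.0)) / n_pixels
    return feature_m / sr

# At 10 km, the 50 m beacon width spans about 18 pixels, comfortably above
# the ~10-pixel detectability floor (28.4 m spans about 10 pixels).
print(round(pixels_spanned(50.0, 10_000.0)))   # ~18
print(round(pixels_spanned(28.4, 10_000.0)))   # ~10
print(round(pixels_spanned(50.0, 100.0)))      # ~1758 pixels near touchdown
```

This makes the design margin explicit: the beacon is detectable at acquisition range and vastly oversampled by the final approach.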
The module for real-time estimation of the position and attitude deviation of the rocket body relative to the main landing platform comprises a visual aided-guidance strategy module, an image processing module, and a relative-pose solving module based on image information.
a visual assistance guidance strategy module as shown in fig. 8; in the arrow body landing process, 4 oblique downward-looking navigation cameras and 2 forward downward-looking navigation cameras image the ground or the sea surface, and whether a cooperation beacon exists in 6 obtained images is judged through an image processing module; if the cooperative beacon exists, judging whether the cooperative beacon which belongs to the main landing platform exists in the existing cooperative beacons, and if the cooperative beacon which belongs to the main landing platform exists, outputting the position and posture deviation information of the arrow body relative to the main landing platform by only adopting the imaging information of the cooperative beacon on the main landing platform and combining the image processing module and the image information relative pose resolving module; if no cooperative beacon belonging to the main landing platform exists and only the cooperative beacon belonging to the auxiliary landing platform exists, outputting the position and attitude deviation information of the arrow body relative to the main landing platform by only adopting the imaging information of the cooperative beacon on any auxiliary landing platform and combining the image processing module and the image information relative pose resolving module; if no cooperative beacon exists in the acquired 6 images, the image information of the cooperative beacon is not adopted in the resolving process of the position and attitude deviation of the arrow body relative to the main landing platform.
The image processing module performs online detection of cooperative beacons, determination of the type of landing platform on which each beacon sits, and point-feature extraction from the beacon images. A standard "H" image is pre-stored in the onboard image processing computer; a SIFT matching algorithm matches features between this stored image and the live images from the navigation cameras, detecting the "H"-type beacon online. Hough circle detection then judges whether a ring is present in the beacon region, which determines the platform type. If a main-platform beacon is present, Harris corner feature points are extracted only from that detected "H"-type beacon image, as shown in Fig. 9(a); if no main-platform beacon is present and only auxiliary-platform beacons are, Harris corner feature points are extracted from the "H"-type beacon image on any one auxiliary platform, as shown in Fig. 9(b).
For navigation camera intrinsic parameter matrix
Figure GDA0002924812980000091
Wherein
Figure GDA0002924812980000092
f is the effective focal length, du is the horizontal pixel spacing, and dv is the vertical pixelSpacing; u. of0And v0Obtaining a horizontal offset and a vertical offset of the projection of the optical center on an image plane, wherein gamma is a non-vertical factor, and a parameter matrix in the navigation camera is obtained through manual calibration; according to 12 Harris corner characteristic points [ u ] of real-time acquired 'H' -shaped cooperative beaconi,vi]T(i ═ 1, 2.., 12), and the coordinates [ x ] of 12 Harris corner feature points in a world coordinate system1,y1]Calculating the position and attitude angle deviation of the arrow body relative to the cooperative beacon on the landing platform through a Tsai algorithm, and calculating
Figure GDA0002924812980000093
And arranging coordinates of 12 Harris corner point feature points under a world coordinate system into gamma ═ u'1,v′1,1,...,u′12,v′12,1]TAnd
Figure GDA0002924812980000094
calculating Ψ ═ (a)T A)-1ATγ; the height-wise positional deviation of the arrow body with respect to the cooperative beacon on the landing platform is
Figure GDA0002924812980000095
Horizontal lateral position deviation X ═ Z Ψ (4); horizontal forward position deviation Y ═ Z Ψ (8); the arrow body has a pitch angle deviation phi-arcsin (psi (7)) and a yaw angle deviation relative to a cooperative beacon on a landing platform
Figure GDA0002924812980000101
and the roll angle deviation is θ = arctan(Ψ(4)/Ψ(1)).
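The linear solve above can be sketched numerically. The patent's exact stacking of the matrix A and its Ψ indexing are given in formula images not reproduced in this text, so the sketch below adopts its own 9-element row-major convention for a planar beacon (world z = 0) with a synthetic pure-translation pose; it illustrates the Ψ = (A^T A)^(-1) A^T γ least-squares step and the recovery of Z, X and Y, not the patent's precise index convention:

```python
import numpy as np

# 12 beacon feature points in the world frame (meters); a grid stands in for "H" corners.
pts = np.array([[x, y] for x in (-20.0, -10.0, 10.0, 20.0) for y in (-15.0, 0.0, 15.0)])

# Ground-truth pose used to synthesize normalized image coordinates:
# camera axes aligned with the world (R = I), lateral/forward offset (X, Y), height Z.
X_true, Y_true, Z_true = 2.0, -1.0, 30.0
u = (pts[:, 0] + X_true) / Z_true      # u' = (x + X) / Z
v = (pts[:, 1] + Y_true) / Z_true      # v' = (y + Y) / Z

# gamma = [u'_1, v'_1, 1, ..., u'_12, v'_12, 1]^T and the stacked coefficient matrix A,
# so that A @ psi = gamma with psi the row-major entries of (1/Z)[r1 r2 t].
gamma = np.concatenate([[ui, vi, 1.0] for ui, vi in zip(u, v)])
rows = []
for x, y in pts:
    rows += [[x, y, 1, 0, 0, 0, 0, 0, 0],
             [0, 0, 0, x, y, 1, 0, 0, 0],
             [0, 0, 0, 0, 0, 0, x, y, 1]]
A = np.array(rows, dtype=float)

# Least-squares solve psi = (A^T A)^{-1} A^T gamma, then undo the 1/Z scale:
# the first rotation column (psi[0], psi[3], psi[6]) must have norm 1/Z.
psi = np.linalg.solve(A.T @ A, A.T @ gamma)
Z = 1.0 / np.linalg.norm([psi[0], psi[3], psi[6]])
X = Z * psi[2]                          # horizontal lateral deviation
Y = Z * psi[5]                          # horizontal forward deviation
```

With exact synthetic measurements the solve recovers Z = 30 m, X = 2 m, Y = -1 m; with noisy Harris corners the same normal-equations form gives the least-squares pose.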
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.
Those skilled in the art will appreciate that the invention may be practiced without these specific details.

Claims (14)

1. A visual auxiliary guide system for vertical recovery of a reusable rocket is characterized in that: the system comprises a landing navigation camera arranged on an arrow body, a landing platform arranged on the ground or on the sea surface, and a real-time estimation module for the position and attitude deviation of the arrow body relative to a main landing platform; the landing navigation cameras installed on the arrow body comprise 4 oblique downward-looking landing navigation cameras and 2 forward-looking landing navigation cameras; the landing platform arranged on the ground or the sea surface comprises 1 main landing platform and 4 auxiliary cooperative beacon platforms; when the distance between the arrow body and the main landing platform is 10 km-100 m, a landing navigation camera arranged on the arrow body images the main landing platform and the auxiliary cooperative beacon platform on the ground or the sea, a real-time estimation module for the position and attitude deviation of the arrow body relative to the main landing platform estimates the position and attitude deviation of the arrow body relative to the main landing platform in real time according to imaging information, and the deviation can be used for controlling the accuracy of vertical landing of the arrow body;
the 4 oblique downward-looking landing navigation cameras are installed at the middle-lower part of the arrow body, uniformly spaced around the arrow body at 90-degree included angles, and the installation position of the forward-looking landing navigation cameras is 0.2-2 m lower than that of the oblique downward-looking landing navigation cameras; the 2 forward-looking landing navigation cameras are uniformly installed on the arrow body at a 180-degree included angle; the installation included angle between the central axis of a forward-looking landing navigation camera and the arrow body is 0 degrees, the installation included angle between the central axis of an oblique downward-looking landing navigation camera and the arrow body is α, and α is half of the field angle γ of the oblique downward-looking landing navigation camera:
Figure FDA0002924812970000011
wherein: k is the radius of the arrow body, and H is the ground clearance of the oblique downward-looking navigation camera;
the installation included angle between the oblique downward-looking landing navigation cameras and the forward-looking landing navigation cameras is not limited;
the real-time estimation module for the position and attitude deviation of the arrow body relative to the main landing platform comprises a vision auxiliary guidance strategy module, an image processing module and an image information relative pose resolving module;
the image processing module processes images shot by the landing navigation camera in real time, detects whether the images contain the cooperation beacons or not, detects the types of landing platforms where the cooperation beacons are located, and extracts feature points of the cooperation beacons; whether the cooperative beacons are included or not and the type processing result of the landing platform where the cooperative beacons are located are sent to the visual auxiliary guidance strategy module; sending the extracted cooperative beacon feature points to an image information relative pose resolving module;
and the vision auxiliary guiding strategy module is used for controlling the image information relative pose resolving module to adopt a corresponding pose resolving method according to the received image processing result, and determining the position parameters, the pitch angle, the yaw angle and the roll angle of the arrow body relative to the main landing platform by the image information relative pose resolving module.
2. A visual aid guidance system for vertical recovery of a reusable rocket according to claim 1 wherein: the range of the included angle alpha is 20-30 degrees.
3. A visual aid guidance system for vertical recovery of a reusable rocket according to claim 1 wherein: the 4 oblique downward-looking landing navigation cameras and the 2 forward-looking landing navigation cameras are all near-infrared imaging sensors and are identical to one another; the imaging wavelength is 0.9-1.7 micrometers, the field angle is γ with a value range of 40-60 degrees, the imaging frame rate is greater than 120 frames/second, and the imaging resolution is n×n, with n calculated as follows:
n = γ/β
wherein: β is the spatial angular resolution of the navigation camera.
4. A visual aid guidance system for vertical recovery of a reusable rocket according to claim 1 wherein: the arrangement mode of 1 main landing platform and 4 auxiliary cooperative beacon platforms arranged on the ground or the sea surface is as follows: and 4 auxiliary cooperative beacon platforms are uniformly and equidistantly arranged in a circular area around the main landing platform by taking the main landing platform as the center.
5. A visual aid guidance system for vertical recovery of a reusable rocket according to claim 4 wherein: the distance L between the auxiliary cooperative beacon platform and the main landing platform is as large as possible on the premise of meeting the following constraint conditions:
L < Hmin·tan(γ)
wherein: hminThe minimum ground clearance for all navigation cameras to be able to provide auxiliary guidance information, and gamma is the angle of view of the oblique downward-looking landing navigation camera and the downward-looking landing navigation camera.
6. A visual aid guidance system for vertical recovery of a reusable rocket according to claim 5 wherein: the distance L ranges from 50m to 80 m.
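As a numeric sanity check of the spacing constraint in claims 5 and 6, the snippet below evaluates L < Hmin·tan(γ); the altitude and field-angle values are illustrative assumptions, not values from the patent:

```python
import math

def max_spacing(h_min_m, fov_deg):
    """Largest auxiliary-platform spacing L (m) allowed by L < Hmin * tan(gamma)."""
    return h_min_m * math.tan(math.radians(fov_deg))

# Illustrative check (values assumed): with gamma = 40 degrees, an 80 m spacing
# needs Hmin above roughly 95 m; at Hmin = 100 m the bound is about 84 m.
bound = max_spacing(100.0, 40.0)
```

At the low end of the claimed field-angle range the 50-80 m spacing of claim 6 is therefore consistent with minimum guidance altitudes on the order of 60-100 m.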
7. A visual aid guidance system for vertical recovery of a reusable rocket according to claim 1 or 4 wherein: the cooperative beacons on the main landing platform and the auxiliary cooperative beacon platforms are all made of near-infrared-band-sensitive materials or light strips, and the materials are consistent; the regions outside the cooperative beacons on the main landing platform and the auxiliary cooperative beacon platforms are made of metal insensitive to the near-infrared band.
8. A visual aid guidance system for vertical recovery of a reusable rocket according to claim 1 or 4 wherein: the main landing platform is provided with an H + ring-shaped cooperative beacon, wherein the H-shaped cooperative beacon has a length of a and a width of
Figure FDA0002924812970000031
the outer diameter of the large circle of the circular-ring cooperative beacon is greater than 100 m, the outer diameter of the small circle ranges from 30 m to 50 m, and the width of the circular ring is 10-15 m; a is calculated as follows: first, the ground distance sr represented by each pixel in the image is calculated, in units of m/pixel:
Figure FDA0002924812970000032
then sr is multiplied by the number of pixels spanned by the beacon in the image to obtain the value of a;
wherein: s is the length of the ground instantaneous field of view captured by the oblique downward-looking/forward-looking navigation cameras, and h is the height of any one navigation camera on the arrow body.
9. A visual aid guidance system for vertical recovery of a reusable rocket according to claim 8 wherein: the value of a is larger than 50m, and the value range of each line width forming H is 10-20 m.
10. A visual aid guidance system for vertical recovery of a reusable rocket according to claim 1 or 4 wherein: the auxiliary cooperative beacon platform is provided with an H-shaped cooperative beacon, wherein the length of the H-shaped cooperative beacon is b, and the width of the H-shaped cooperative beacon is b
Figure FDA0002924812970000033
The value of b is more than 40m, and the value range of each line width forming H is 10-15 m.
11. A visual aid guidance system for vertical recovery of a reusable rocket according to claim 10 wherein: the orientation requirement for the "H"-type cooperative beacons arranged on the 4 auxiliary cooperative beacon platforms is that the middle horizontal stroke "-" of the "H" passes through the center point of the main landing platform.
12. A visual aid guidance system for vertical recovery of a reusable rocket according to claim 1 wherein: the specific control strategy of the visual auxiliary guidance strategy module is as follows:
if the photographed image has the cooperative beacon, judging whether the cooperative beacon belonging to the main landing platform exists in the existing cooperative beacons, and if the cooperative beacon belonging to the main landing platform exists, controlling the image information relative pose resolving module to only adopt the characteristic point information of the cooperative beacon on the main landing platform to resolve the relative pose;
if no cooperative beacon belonging to the main landing platform exists and only the cooperative beacon belonging to the auxiliary landing platform exists, the image information relative pose resolving module is controlled to only adopt the characteristic point information of the cooperative beacon on any auxiliary landing platform to resolve the relative pose;
and if no cooperative beacon exists in the acquired image, controlling the image information relative pose resolving module to resolve the relative pose without adopting the image information of the cooperative beacon.
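The three-branch strategy of this claim can be sketched as a small selection function; the detection-record format (dicts with a "platform" field) is an illustrative assumption, not something specified by the patent:

```python
def select_beacon(detections):
    """Pick the cooperative beacon whose feature points feed the pose solver.

    detections: list of dicts like {"platform": "main" or "auxiliary", "points": [...]}
    (an illustrative record format).
    """
    main = [d for d in detections if d["platform"] == "main"]
    if main:
        return main[0]        # main-platform beacon present: use it exclusively
    aux = [d for d in detections if d["platform"] == "auxiliary"]
    if aux:
        return aux[0]         # otherwise any single auxiliary-platform beacon
    return None               # no cooperative beacon: solve without beacon imagery

# Example: only auxiliary beacons detected, so the first auxiliary one is chosen.
chosen = select_beacon([{"platform": "auxiliary", "points": []},
                        {"platform": "auxiliary", "points": []}])
```

Returning None corresponds to the third branch, where the relative pose module falls back to solving without cooperative-beacon image information.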
13. A visual aid guidance system for vertical recovery of a reusable rocket according to claim 11 wherein: the image processing module adopts an SIFT matching algorithm to detect an H-shaped cooperation beacon image, and adopts a Hough circle detection method to judge whether a circular ring exists in a cooperation beacon area so as to judge the type of a landing platform where the cooperation beacon is positioned; and extracting Harris corner characteristic points from the detected H-shaped cooperative beacon image.
14. A visual aid guidance system for vertical recovery of a reusable rocket according to claim 11 wherein: when the image information relative pose resolving module resolves the relative pose by adopting the information of the cooperative beacons, a translation matrix and a rotation matrix are calculated through a Tsai algorithm according to the real-time acquired 12 characteristic points of the H-shaped cooperative beacons and the coordinates of the 12 characteristic points of the H-shaped cooperative beacons in a world coordinate system, and further position parameters, a pitch angle, a yaw angle and a roll angle of the arrow body relative to the main landing platform are obtained.
CN201811326834.6A 2018-11-08 2018-11-08 Visual auxiliary guide system for vertical recovery of reusable rocket Active CN109612333B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811326834.6A CN109612333B (en) 2018-11-08 2018-11-08 Visual auxiliary guide system for vertical recovery of reusable rocket


Publications (2)

Publication Number Publication Date
CN109612333A CN109612333A (en) 2019-04-12
CN109612333B true CN109612333B (en) 2021-07-09

Family

ID=66003427

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811326834.6A Active CN109612333B (en) 2018-11-08 2018-11-08 Visual auxiliary guide system for vertical recovery of reusable rocket

Country Status (1)

Country Link
CN (1) CN109612333B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110955974B (en) * 2019-11-29 2022-11-11 清华大学 Rocket recovery simulation platform and implementation method
CN112629322B (en) * 2020-12-14 2023-09-12 湖北航天飞行器研究所 Semi-physical simulation power-on device and method based on overload activation start-control missile
CN115265290B (en) * 2022-07-07 2023-06-16 大连船舶重工集团有限公司 Marine recovery rocket sealing device
CN116123940A (en) * 2022-11-04 2023-05-16 北京理工大学 Rocket ground autonomous recovery system under weak ground-air communication environment

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6364026B1 (en) * 1998-04-01 2002-04-02 Irving Doshay Robotic fire protection system
CN103424126A (en) * 2013-08-12 2013-12-04 西安电子科技大学 System and method for verifying visual autonomous landing simulation of unmanned aerial vehicle
CN104656664A (en) * 2015-02-16 2015-05-27 南京航空航天大学 Vehicle-mounted multi-rotor unmanned helicopter landing guidance control system and guidance control method
CN105718895A (en) * 2016-01-22 2016-06-29 张健敏 Unmanned aerial vehicle based on visual characteristics
CN106326892A (en) * 2016-08-01 2017-01-11 西南科技大学 Visual landing pose estimation method of rotary wing type unmanned aerial vehicle
CN106708066A (en) * 2015-12-20 2017-05-24 中国电子科技集团公司第二十研究所 Autonomous landing method of unmanned aerial vehicle based on vision/inertial navigation
CN107539501A (en) * 2017-09-03 2018-01-05 佛山红辉科技有限公司 A kind of band camera enhancement rocket recovery device




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant