CN116503768A - Aerial docking method and device for flight equipment - Google Patents

Aerial docking method and device for flight equipment

Info

Publication number
CN116503768A
Authority
CN
China
Prior art keywords
docking
equipment
aerial
flying
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310777034.0A
Other languages
Chinese (zh)
Other versions
CN116503768B (en)
Inventor
张警吁
郑亚骅
孙向红
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Psychology of CAS
Original Assignee
Institute of Psychology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Psychology of CAS filed Critical Institute of Psychology of CAS
Priority to CN202310777034.0A (granted as CN116503768B)
Publication of CN116503768A
Application granted
Publication of CN116503768B
Active legal status (current)
Anticipated expiration of legal status


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D39/00 Refuelling during flight
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/46 Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462 Salient features, e.g. scale invariant feature transforms [SIFT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/17 Terrestrial scenes taken from planes or by drones
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention provides an aerial docking method and device for flight equipment, relating to the technical field of data processing. The method comprises: obtaining first information, where the first information comprises the operation parameters of all flight equipment to be docked, and processing the first information to obtain an aerial docking scheme for the flight equipment; performing aerial docking of the flight equipment based on the aerial docking scheme to obtain docking image information; sending the docking image information to an image processing module to obtain key frame image information of the aerial docking; performing feature recognition on the key frame image information, and performing error analysis and parameter adjustment based on the recognized features to obtain an adjusted aerial docking scheme; and performing aerial docking of the flight equipment based on the adjusted aerial docking scheme until the flight equipment completes the aerial docking. The invention reduces the investment of manpower and material resources and improves aerial docking efficiency.

Description

Aerial docking method and device for flight equipment
Technical Field
The invention relates to the technical field of data processing, and in particular to an aerial docking method and device for flight equipment.
Background
Aerial docking technology for flight equipment is a key enabler of global air operations: fighters, bombers, reconnaissance aircraft and transport aircraft can be refueled through aerial docking to extend their range. If flight equipment must land on the ground to refuel, its combat capability is greatly reduced. Because the flight equipment has blind spots in its field of view, manually controlling the docking is very difficult and requires a highly experienced pilot. A control method and device that can automatically determine the refueling equipment and automatically adjust the docking are therefore urgently needed, so as to reduce the investment of manpower and material resources and improve aerial docking efficiency.
Disclosure of Invention
The invention aims to provide an aerial docking method and device for flying equipment, so as to solve the problems. In order to achieve the above purpose, the technical scheme adopted by the invention is as follows:
in a first aspect, the present application provides an aerial docking method for a flying device, including:
acquiring first information, wherein the first information comprises operation parameters of all flight equipment to be docked, and the operation parameters of the flight equipment comprise flight speed parameters, flight height parameters and position parameters of the flight equipment;
sending the first information to a docking planning model of the flying equipment for processing, and obtaining an aerial docking scheme of the flying equipment;
performing aerial docking of the flight equipment based on the aerial docking scheme of the flight equipment, and acquiring aerial docking image information of the flight equipment;
transmitting the aerial docking image information of the flying equipment to an image processing module to obtain key frame image information of aerial docking of the flying equipment;
performing feature recognition on the key frame image information of the aerial docking of the flight equipment, and performing error analysis and parameter adjustment based on the recognized features to obtain an adjusted aerial docking scheme;
and carrying out air docking of the flying equipment based on the adjusted air docking scheme until the flying equipment completes the air docking.
In a second aspect, the present application further provides an aerial docking device for a flying apparatus, including:
the system comprises an acquisition unit, a control unit and a control unit, wherein the acquisition unit is used for acquiring first information, the first information comprises operation parameters of all flight equipment to be docked, and the operation parameters of the flight equipment comprise flight speed parameters, flight height parameters and position parameters of the flight equipment;
the first processing unit is used for sending the first information to a docking planning model of the flying equipment for processing to obtain an aerial docking scheme of the flying equipment;
the second processing unit is used for carrying out aerial docking of the flight equipment based on the aerial docking scheme of the flight equipment, and acquiring aerial docking image information of the flight equipment;
the third processing unit is used for sending the aerial docking image information of the flying equipment to the image processing module to obtain the aerial docking key frame image information of the flying equipment;
the fourth processing unit is used for carrying out feature recognition on the key frame image information of the aerial docking of the flight equipment, carrying out error analysis and parameter adjustment based on the recognized features, and obtaining an adjusted aerial docking scheme;
and the fifth processing unit is used for carrying out the aerial docking of the flying equipment based on the adjusted aerial docking scheme until the flying equipment completes the aerial docking.
The beneficial effects of the invention are as follows:
according to the invention, the flight route of the flight equipment is planned, the most suitable flight equipment for docking is rapidly determined, so that the docking of the two flight equipment can be efficiently and rapidly carried out, the docking image of the equipment in the air is used for judging whether the docking can be completed or not through feature recognition, linear regression analysis is carried out on the basis of the errors compared by the features, the success rate of each error is determined, then parameter adjustment is carried out on the basis of the success rate corresponding to each error, the docking success rate is improved while the errors are reduced, and the docking efficiency is ensured.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the embodiments of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic flow chart of an aerial docking method of a flying device according to an embodiment of the present invention;
FIG. 2 is a schematic structural view of an aerial docking device for a flying apparatus according to an embodiment of the present invention;
FIG. 3 is a schematic view of a flying device and docking device according to an embodiment of the present invention;
FIG. 4 is a schematic view of the shape of a cone of light emitted by the light source emitting apparatus according to an embodiment of the present invention;
fig. 5 is a schematic view of the reflection pattern on the reflective film of the cone sleeve according to an embodiment of the present invention.
The marks in the figure: 1. a docking device; 2. flying equipment to be docked; 3. a light source emitting device; 4. a taper sleeve; 5. a cone sleeve reflective film; 6. an oil delivery pipe; 7. a first arrow; 8. a first center point; 9. a second arrow; 10. a second center point; 701. an acquisition unit; 702. a first processing unit; 703. a second processing unit; 704. a third processing unit; 705. a fourth processing unit; 706. a fifth processing unit; 7021. a first processing subunit; 7022. a second processing subunit; 7023. a third processing subunit; 7024. a first analysis subunit; 7025. a fourth processing subunit; 70241. a fifth processing subunit; 70242. a first optimization subunit; 70243. a second optimization subunit; 70244. a third optimization subunit; 7041. a sixth processing subunit; 7042. a seventh processing subunit; 7043. an eighth processing subunit; 7051. a first identification subunit; 7052. a second identification subunit; 7053. a third recognition subunit; 7054. a fourth recognition subunit; 7055. a ninth processing subunit; 7056. a second analysis subunit; 7057. tenth processing subunit.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. The components of the embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the invention, as presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures. Meanwhile, in the description of the present invention, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance.
Example 1:
the embodiment provides an aerial docking method of flight equipment.
Referring to fig. 1, the method is shown to include steps S1, S2, S3, S4, S5 and S6.
Step S1, acquiring first information, wherein the first information comprises operation parameters of all flight equipment to be docked, and the operation parameters of the flight equipment comprise flight speed parameters, flight height parameters and position parameters of the flight equipment;
It can be understood that this step determines the position, flight speed and flight altitude of each flying device by acquiring the parameter information of all flying devices that need refueling, and rapidly determines which device can refuel the flying device, so as to plan the refueling path, improve refueling efficiency and reduce refueling time.
S2, sending the first information to a docking planning model of the flying equipment for processing, and obtaining an aerial docking scheme of the flying equipment;
It can be understood that this step performs data conversion on the parameter information of the flying device so that the most suitable refueling device and the most suitable aerial refueling path can be rapidly determined, thereby ensuring refueling efficiency. In this step, step S2 comprises step S21, step S22, step S23, step S24 and step S25.
S21, performing data conversion processing on flight speed parameters, flight height parameters and position parameters of the flight equipment to obtain a path planning diagram, wherein the path planning diagram comprises nodes and edges, the nodes comprise position information of the docking equipment, and the edges are path information from the docking equipment to the flight equipment;
It will be appreciated that in this step the nodes of the path planning diagram carry attribute information such as the position, altitude and speed of the flying device, and the edges carry attribute information such as the start point, the end point, the path between the two points and the time required by the flying device. The generated path planning diagram intuitively displays the relationship between the flying device and the docking devices, the docking paths and other information, which facilitates subsequent path planning and optimization.
S22, performing adjacency matrix conversion on the path planning diagram to obtain an association relation matrix of each docking device and the first information;
It will be understood that the adjacency matrix in this step is a way of representing a graph, in which the association relationships between nodes are described with a matrix. The path planning diagram (a directed graph) is converted into an adjacency matrix to obtain the association relation matrix between the flying equipment and all the docking equipment, where the elements of the matrix represent the weights of the edges between adjacent nodes, i.e. the strength of the association. These associations and weights represent the suitability between the flying device and each docking device and prepare for the subsequent determination of the docking device and docking route, thereby improving docking efficiency.
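By way of illustration only, the construction of such a path planning diagram and its conversion into an association relation (adjacency) matrix can be sketched in Python as follows; the node attributes, the edge-weight rule and all numerical values are assumptions made for this sketch and are not prescribed by the embodiment.

import numpy as np

# Nodes: index 0 is the flying device to be docked; 1..n are candidate docking devices.
# Each node carries an assumed position (km), altitude (m) and speed (km/h).
nodes = {
    0: {"pos": (0.0, 0.0),    "alt": 9000.0, "speed": 750.0},  # flying device to be docked
    1: {"pos": (40.0, 10.0),  "alt": 9200.0, "speed": 720.0},  # candidate docking device A
    2: {"pos": (15.0, -25.0), "alt": 8800.0, "speed": 730.0},  # candidate docking device B
}

def edge_weight(a, b):
    # Assumed association strength: larger when the horizontal distance and altitude gap are small.
    dx = a["pos"][0] - b["pos"][0]
    dy = a["pos"][1] - b["pos"][1]
    dist = (dx * dx + dy * dy) ** 0.5 + abs(a["alt"] - b["alt"]) / 100.0
    return 1.0 / (1.0 + dist)

n = len(nodes)
adjacency = np.zeros((n, n))  # association relation matrix of the path planning diagram
for i in range(n):
    for j in range(n):
        if i != j:
            adjacency[i, j] = edge_weight(nodes[i], nodes[j])

print(adjacency.round(3))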
S23, performing matrix decomposition processing on the association relation matrix to obtain potential feature vectors of each docking device and the first information;
It will be appreciated that the matrix decomposition in this step is a linear-algebra operation that factors a matrix into several sub-matrices whose product reconstructs the original matrix, yielding feature vectors and feature values. By performing matrix decomposition on the association relation matrix, potential (latent) feature vectors between the flying device and all the docking devices are obtained, i.e. feature representations between each flying device and all the docking devices that reflect their similarity and degree of connection. Obtaining these potential feature vectors through matrix decomposition effectively describes the similarity and degree of connection between the flying device and all the docking devices, provides basic data support for the subsequent optimal docking path, improves docking efficiency and helps ensure the docking success rate.
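By way of illustration only, the matrix decomposition can be sketched with a truncated singular value decomposition; the use of SVD rather than another factorization, and the latent dimension k, are assumptions made for this sketch.

import numpy as np

# Association relation matrix from the previous sketch (values are assumptions).
adjacency = np.array([
    [0.00, 0.42, 0.35],
    [0.42, 0.00, 0.18],
    [0.35, 0.18, 0.00],
])

k = 2                                 # latent dimension, chosen arbitrarily here
U, s, Vt = np.linalg.svd(adjacency)
features = U[:, :k] * np.sqrt(s[:k])  # one potential feature vector per node

def cosine(a, b):
    # Cosine similarity used here as the "degree of connection" between two nodes.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

for j in range(1, features.shape[0]):
    print(f"docking device {j}: similarity to the flying device = {cosine(features[0], features[j]):.3f}")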
Step S24, the potential feature vectors of each piece of docking equipment and the first information are sent to a trained neural network model to analyze an optimal docking path, and the optimal docking path is obtained;
it can be understood that in this step, the trained neural network selects an optimal docking path from paths corresponding to different docking devices, so as to ensure docking efficiency and docking success rate, where step S24 includes step S241, step S242, step S243, and step S244.
S241, classifying preset historical potential feature vectors and historical optimal docking paths to obtain a training set and a verification set;
step S242, inputting the training set into a BP neural network model for prediction, and processing all the predicted docking paths as particle swarm parameters of a particle swarm optimization algorithm, wherein the fitness value of each particle is calculated through the particle swarm optimization algorithm, and the individual optimal position and the global optimal position of the particle are obtained according to the fitness of the particle in the particle swarm;
step S243, dynamically tracking the individual optimal position and the global optimal position of the particles to continuously update the speed and the position of all the particles until the particle swarm optimization algorithm reaches the maximum iteration times, and obtaining an optimized BP neural network model;
step S244, the verification set is sent to the optimized BP neural network model to obtain a prediction result, whether the prediction result is consistent with the verification set or not is judged, and if so, the optimized BP neural network model is used as a trained neural network model.
It can be understood that in this step all the docking paths are predicted by the BP neural network model, and the docking speed and docking position of each docking path are determined. The particle swarm optimization algorithm then selects the docking path with the fastest docking speed and the closest docking position, ensuring that the selected path can complete the docking task efficiently.
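By way of illustration only, the particle swarm update of steps S242 and S243 can be sketched as follows; the fitness function (combining a docking-time term and a closing-distance term), the inertia weight and the acceleration coefficients are assumptions, and the coupling with the BP neural network is reduced here to a plain scoring function.

import numpy as np

rng = np.random.default_rng(0)

def fitness(x):
    # Assumed fitness: lower is better (docking-time term plus closing-distance term).
    return 0.6 * x[0] ** 2 + 0.4 * (x[1] - 1.0) ** 2

dim, n_particles, max_iter = 2, 20, 100
w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients (assumed)

pos = rng.uniform(-5.0, 5.0, (n_particles, dim))  # candidate docking-path parameters
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([fitness(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(max_iter):
    r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)  # velocity update
    pos = pos + vel                                                    # position update
    vals = np.array([fitness(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best docking-path parameters:", gbest.round(3), "fitness:", round(fitness(gbest), 4))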
And S25, integrating the optimal docking path with a preset docking step of the flying equipment to obtain an aerial docking scheme of the flying equipment.
It can be appreciated that this step obtains all docking schemes by combining the optimal docking path with the preset docking step of the flying equipment. The system then performs matrix transformation processing on all the docking schemes to obtain a set of docking simulation scenes, where each scene comprises a docking step and an optimal docking path set, and the aerial docking scheme of the flight equipment is obtained based on simulation with the simulation equipment.
S3, carrying out aerial docking of the flight equipment based on the aerial docking scheme of the flight equipment, and acquiring aerial docking image information of the flight equipment;
It can be understood that this step performs aerial docking of the flying device according to the aerial docking scheme of the flying device, obtains docking video information during the docking process, and monitors the docking process in real time, judging whether docking is completed and whether a docking hazard occurs. Docking completion can be judged by setting docking features at the docking interface: if the docking features match completely, docking is judged to be completed; if the docking features show errors, docking is judged not yet completed, thereby ensuring that the docking is properly completed.
It will be appreciated that a first acquisition procedure for the docking image information of the flying device in this step may be as follows. First, after the docking device 1 and the flying device to be docked 2 arrive at the designated positions, as shown in fig. 3, the docking device 1 emits a conical light beam through the light source emitting device 3 installed on the oil delivery pipe 6, and it is judged whether the beam falls on the cone sleeve reflective film 5; the reflective film 5 serves as the carrier that presents the indication information carried by the conical beam, whose pattern is shown in fig. 4. Second, the docking device 1 collects the pattern formed by the conical beam on the cone sleeve reflective film 5 and sends the reflected pattern to the image processing module for processing. If electronic equipment cannot be used on the docking device 1, the pilot can instead observe the arrow direction on the cone sleeve reflective film 5 and adjust the flight position of the flying device based on that direction: if the first arrow 7 points to the right, the flying device is adjusted to the right until the first center point 8 of the conical beam is aligned with the center of the reflective film, and the adjustment is completed.
It will be appreciated that a second acquisition procedure for the docking image information of the flying device in this step may be as follows. First, after the docking device 1 and the flying device to be docked 2 arrive at the designated positions, as shown in fig. 3, the light source emitting device 3 installed on the oil delivery pipe 6 of the docking device 1 emits a beam of light onto the cone sleeve reflective film 5. The light source emitting device 3 contains two beam emitters: one emits a conical beam carrying indication information, and the other emits a single beam without indication information; this procedure uses the single beam on its own to illuminate the cone sleeve reflective film 5. The cone sleeve reflective film 5 carries a reflective pattern with highlighted, reflective indication information: when no specific light source illuminates it, the indication information is difficult to perceive, and when the light source illuminates it, the highlighted indication information is clearly presented. The reflected pattern is shown in fig. 5 and is sent to the image processing module for processing. If electronic equipment cannot be used on the docking device 1, the pilot can instead observe the arrow direction on the cone sleeve reflective film 5 and adjust the flight position of the flying device based on that direction: if the second arrow 9 points to the upper right, the flying device is adjusted to fly up and to the right until the second center point 10 is aligned with the center point of the reflective film, and the adjustment is completed.
Which of the two acquisition procedures is used depends on whether the reflective film installed on the actual flying device carries an arrow indication: if a reflective film without an arrow indication is installed, the first acquisition mode is adopted; if a reflective film with an arrow indication is installed, the second acquisition mode is adopted.
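By way of illustration only, the alignment adjustment described above (moving the flying device until the center point of the beam pattern coincides with the center of the reflective film) can be sketched as follows; the pixel-to-metre scale and the sign convention are assumptions made for this sketch.

def correction(pattern_center, film_center, metres_per_pixel=0.01):
    # Offset between the detected pattern center and the reflective-film center,
    # converted to an assumed lateral/vertical correction in metres.
    dx = film_center[0] - pattern_center[0]  # positive: adjust to the right
    dy = film_center[1] - pattern_center[1]  # positive: adjust upward
    return dx * metres_per_pixel, dy * metres_per_pixel

right, up = correction(pattern_center=(310, 242), film_center=(320, 240))
print(f"adjust {right:+.2f} m laterally and {up:+.2f} m vertically")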
S4, sending the aerial docking image information of the flying equipment to an image processing module to obtain key frame image information of aerial docking of the flying equipment;
It can be understood that this step obtains the key frame image information used for controlling the docking, where the key frames are the images, taken as the docking approaches completion, from which it can be judged whether the docking has succeeded. In this step, step S4 includes step S41, step S42, and step S43.
Step S41, performing image recognition processing on the aerial docking image information of the flying equipment, wherein the aerial docking image information of all the flying equipment is respectively compared with a preset image of the docking equipment of the flying equipment, to obtain at least one piece of image information containing the docking equipment of the flying equipment;
step S42, extracting video contents in a preset time period before and after image information of the docking equipment of the flight equipment to obtain at least one video segment containing the docking equipment;
and step S43, marking each video segment, sorting the marked video segments according to acquisition time, locating the video segment in the last sorting position, and taking all images in that last video segment as key frame images.
It can be understood that the image including the docking device is acquired through image recognition, and then video clips before and after the docking device image are acquired to judge whether docking is completed or not, wherein the last docking image is selected, so that the calculated amount of image recognition is reduced, and the recognition efficiency is improved.
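By way of illustration only, the key-frame selection logic of steps S41 to S43 can be sketched as follows; the similarity measure (mean absolute difference against a preset docking-device template), its threshold and the segment window are assumptions made for this sketch.

import numpy as np

def matches_template(frame, template, threshold=12.0):
    # Crude similarity test on equally sized grayscale frames (assumed measure).
    return float(np.mean(np.abs(frame.astype(float) - template.astype(float)))) < threshold

def last_segment_keyframes(frames, template, window=5):
    hit_idx = [i for i, f in enumerate(frames) if matches_template(f, template)]
    if not hit_idx:
        return []
    segments, current = [], [hit_idx[0]]
    for i in hit_idx[1:]:
        if i - current[-1] <= window:  # group consecutive hits into one segment
            current.append(i)
        else:
            segments.append(current)
            current = [i]
    segments.append(current)
    last = segments[-1]                # segment in the last sorting position
    lo, hi = max(0, last[0] - window), min(len(frames), last[-1] + window + 1)
    return frames[lo:hi]               # all images in the last segment are key frames

# Synthetic demonstration: 30 random frames, the docking device "appears" near the end.
rng = np.random.default_rng(1)
template = rng.integers(0, 255, (64, 64), dtype=np.uint8)
frames = [rng.integers(0, 255, (64, 64), dtype=np.uint8) for _ in range(30)]
frames[26], frames[27] = template.copy(), template.copy()
print(len(last_segment_keyframes(frames, template)), "key frames selected")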
S5, carrying out feature recognition on the key frame image information of the aerial docking of the flight equipment, and carrying out error analysis and parameter adjustment based on the recognized features to obtain an adjusted aerial docking scheme;
It can be understood that this step identifies the device features in the key frame images in order to determine, based on those features, whether the docking is completed and whether the docking scheme needs to be adjusted. In this step, step S5 includes step S51, step S52, step S53 and step S54.
Step S51, carrying out gray level transformation on all the key frame image information to obtain gray level images corresponding to the key frame images of each frame;
step S52, connecting the pixel points with the same gray value in the gray image, wherein the connecting line is subjected to interpolation processing by adopting a linear interpolation method to obtain the outline image of the key frame image;
step S53, a two-dimensional space rectangular coordinate system is established by taking a central point of the key frame image information as a coordinate origin, and contour images of the key frame images are sent to the two-dimensional space rectangular coordinate system to be calculated to obtain coordinate values corresponding to each contour image;
and S54, clustering all R, G, B components of the key frame image information corresponding to the contour image respectively, averaging all obtained clustering clusters to obtain an average value of the central points of all clustering clusters, and taking the average value as the color characteristic of the key frame image information corresponding to the contour image.
It can be understood that this step performs feature identification on the key frame images to identify the contour feature, the color feature and the position feature, and then judges whether the docking operation is completed by checking whether these three features of the docking device are in the preset state, achieving rapid identification. If the docking operation is not completed, the parameters are adjusted based on the features, ensuring that the docking device can dock stably.
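By way of illustration only, the feature extraction of steps S51 to S54 can be sketched as follows; the grayscale weights, the chosen iso-gray level, the assumed RGB channel order and the number of clusters are all assumptions made for this sketch (scikit-learn is used for the clustering).

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
image = rng.integers(0, 255, (48, 48, 3), dtype=np.uint8)  # stand-in key frame (RGB assumed)

# S51: grayscale transformation (BT.601 weights, an assumption).
gray = (0.299 * image[..., 0] + 0.587 * image[..., 1] + 0.114 * image[..., 2]).astype(np.uint8)

# S52 (simplified): pixels sharing one gray value approximate an iso-gray contour.
level = int(np.median(gray))
contour_mask = gray == level

# S53: contour coordinates in a rectangular coordinate system centred on the image.
ys, xs = np.nonzero(contour_mask)
cy, cx = (gray.shape[0] - 1) / 2.0, (gray.shape[1] - 1) / 2.0
contour_coords = np.stack([xs - cx, cy - ys], axis=1)  # (x to the right, y upward)

# S54: cluster the R, G and B components of the contour pixels separately,
# then average all cluster centres to obtain the colour feature.
pixels = image[contour_mask].astype(float)
if len(pixels) >= 3:
    color_feature = []
    for c in range(3):
        centers = KMeans(n_clusters=3, n_init=10, random_state=0).fit(pixels[:, c:c + 1]).cluster_centers_
        color_feature.append(float(centers.mean()))
    print("contour points:", len(contour_coords), "colour feature:", np.round(color_feature, 1))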
It is understood that in this step, step S5 further includes step S55, step S56, and step S57.
Step S55, comparing the contour image of the identified key frame image, the coordinate value corresponding to each contour image and the color characteristic corresponding to each contour image with the characteristic parameters of the preset docking equipment respectively to obtain the error information of the identified characteristic and the characteristic parameters of the preset docking equipment;
step S56, performing linear regression analysis processing on the error information and the preset historical docking information of the flight equipment to obtain a docking success rate relation corresponding to each error information;
and step S57, carrying out constraint optimization processing by utilizing a Lagrangian multiplier method according to the corresponding butt joint success rate relation of each error information, and obtaining an adjusted air butt joint scheme.
It can be understood that the step is to compare the contour image of the identified key frame image, the coordinate value corresponding to each contour image and the color feature corresponding to each contour image with the preset characteristic parameters of the docking device respectively, so as to determine the error therein, and then perform linear regression analysis based on the error and the preset historical docking information of the flying device, wherein the linear regression is a common statistical method and can be used for establishing a linear relation model between the independent variable and the dependent variable.
This step also comprises constructing constraint conditions from the linear relation between the error information and the docking success rate corresponding to each error, and obtaining an adjusted aerial docking scheme by solving a Lagrangian function, so as to achieve successful docking. The calculation process is as follows:
First, x is defined as the error information and the success rate corresponding to the error information is defined as g(x). According to the linear relation obtained in step S56, g(x) can be expressed as a linear combination, namely:
g(x) = a1·x1 + a2·x2 + ... + an·xn,
where a1, a2, ..., an are the coefficients corresponding to each item of error information and x1, x2, ..., xn are the error data. This yields a constrained optimization problem (minimising the docking error subject to g(x) being no less than a preset threshold U), which is converted into an unconstrained optimization problem by adding the Lagrangian term λ·(U - g(x)) to the error objective, where λ is the Lagrange multiplier, U is the preset threshold, and g(x) is the success rate corresponding to the error information. This gives the objective function of the optimization problem; solving it with a numerical optimization method yields the minimum error (expressed as a percentage) with which the docking can still be completed, and parameters whose errors are larger than this minimum are adjusted. For example, if a feature position lies to the right, the docking structure of the flying device is moved and adjusted to the left and downward, with the adjustment value equal to the error value, so that the docking success rate is improved while the error is reduced.
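By way of illustration only, the constraint optimization of step S57 can be sketched with a sequential quadratic programming solver, which enforces the success-rate constraint through a Lagrangian-type formulation; the regression coefficients, the bounds and the threshold U are assumptions carried over from the regression sketch above.

import numpy as np
from scipy.optimize import minimize

a = np.array([-0.45, -0.05])       # assumed regression coefficients of g(x)
b = 1.02                           # assumed intercept of g(x)
U = 0.90                           # preset success-rate threshold
x_current = np.array([0.5, 3.0])   # errors measured from the key-frame features (assumed)

def g(x):
    # Fitted docking success-rate relation from step S56.
    return float(a @ x + b)

def adjustment(x):
    # Magnitude of the parameter adjustment needed to move from the current errors to x.
    return float(np.sum(np.abs(x_current - x)))

res = minimize(
    adjustment,
    x0=x_current,
    method="SLSQP",
    bounds=[(0.0, x_current[0]), (0.0, x_current[1])],
    constraints=[{"type": "ineq", "fun": lambda x: g(x) - U}],  # require g(x) >= U
)
print("target errors:", res.x.round(3), "required adjustment:", (x_current - res.x).round(3))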
And S6, carrying out air docking on the flying equipment based on the adjusted air docking scheme until the flying equipment completes the air docking.
Example 2:
as shown in fig. 2, the present embodiment provides an aerial docking device for a flying apparatus, which includes an acquisition unit 701, a first processing unit 702, a second processing unit 703, a third processing unit 704, a fourth processing unit 705, and a fifth processing unit 706.
An obtaining unit 701, configured to obtain first information, where the first information includes operation parameters of all flight devices that need to be docked, and the operation parameters of the flight devices include a flight speed parameter, a flight altitude parameter, and a position parameter of the flight devices;
the first processing unit 702 is configured to send the first information to a docking planning model of the flying device for processing, so as to obtain an aerial docking scheme of the flying device;
the first processing unit 702 includes a first processing subunit 7021, a second processing subunit 7022, a third processing subunit 7023, a first analysis subunit 7024, and a fourth processing subunit 7025.
A first processing subunit 7021, configured to perform data conversion processing on the flight speed parameter, the flight altitude parameter, and the position parameter of the flight device to obtain a path planning chart, where the path planning chart includes nodes and edges, the nodes include position information of the docking device, and the edges are path information from the docking device to the flight device;
a second processing subunit 7022, configured to perform adjacency matrix conversion on the path planning chart, so as to obtain an association relationship matrix of each docking device and the first information;
a third processing subunit 7023, configured to perform matrix decomposition processing on the association relation matrix to obtain potential feature vectors of each docking device and the first information;
a first analysis subunit 7024, configured to send the potential feature vectors of each docking device and the first information to the trained neural network model to perform an analysis on an optimal docking path, so as to obtain an optimal docking path;
the first analysis subunit 7024 includes a fifth processing subunit 70241, a first optimization subunit 70242, a second optimization subunit 70243, and a third optimization subunit 70244.
A fifth processing subunit 70241, configured to classify a preset historical potential feature vector and a historical optimal docking path to obtain a training set and a verification set;
a first optimization subunit 70242, configured to input the training set to a BP neural network model for predicting a docking path, and process all docking paths obtained by prediction as particle swarm parameters of a particle swarm optimization algorithm, where an fitness value of each particle is calculated by the particle swarm optimization algorithm, and an individual optimal position and a global optimal position of the particle are obtained according to the fitness of the particle in the particle swarm;
the second optimizing subunit 70243 is configured to dynamically track the individual optimal position and the global optimal position of the particles to continuously update the speeds and positions of all the particles until the particle swarm optimization algorithm reaches the maximum iteration number, thereby obtaining an optimized BP neural network model;
and the third optimizing subunit 70244 is configured to send the verification set to the optimized BP neural network model to obtain a prediction result, determine whether the prediction result is consistent with the verification set, and if so, use the optimized BP neural network model as a trained neural network model.
The fourth processing subunit 7025 is configured to integrate the optimal docking path with a preset docking step of the aerial device, so as to obtain an aerial docking scheme of the aerial device.
A second processing unit 703, configured to perform aerial docking of the flight device based on the aerial docking scheme of the flight device, and acquire aerial docking image information of the flight device;
a third processing unit 704, configured to send the aerial docking image information of the flying device to an image processing module, to obtain key frame image information of aerial docking of the flying device;
wherein the third processing unit 704 includes a sixth processing subunit 7041, a seventh processing subunit 7042, and an eighth processing subunit 7043.
A sixth processing subunit 7041, configured to perform image recognition processing on the aerial docking image information of the flying device, where the aerial docking image information of all the flying devices is respectively compared with an image of a docking device of a preset flying device, so as to obtain image information of at least one docking device including the flying device;
a seventh processing subunit 7042, configured to extract video content in a preset time period before and after the image information of the docking device of the flight device, to obtain at least one video segment including the docking device;
the eighth processing subunit 7043 is configured to mark each video segment, sort the marked video according to the collected time, traverse the video segment with the sorting position at the last video segment, and use all the images with the sorting position at the last video segment as key frame images.
A fourth processing unit 705, configured to identify features of key frame image information of the aerial docking of the flight device, and perform error analysis and parameter adjustment based on the identified features, to obtain an adjusted aerial docking scheme;
the fourth processing unit 705 includes a first recognition subunit 7051, a second recognition subunit 7052, a third recognition subunit 7053, and a fourth recognition subunit 7054.
A first recognition subunit 7051, configured to perform gray-level transformation on all the key frame image information to obtain a gray-level image corresponding to each frame of the key frame image;
a second recognition subunit 7052, configured to perform connection on pixel points with the same gray value in the gray image, where a linear interpolation method is used to perform interpolation processing on the connection line, so as to obtain a contour image of the key frame image;
the third recognition subunit 7053 is configured to establish a two-dimensional space rectangular coordinate system with the center point of the key frame image information as a coordinate origin, and send the contour image of the key frame image to the two-dimensional space rectangular coordinate system to calculate to obtain a coordinate value corresponding to each contour image;
and a fourth recognition subunit 7054, configured to perform clustering processing on three components R, G, B of all pixel points of the key frame image information corresponding to the contour image, perform average calculation on center points of all obtained clusters, obtain an average value of center points of all clusters, and use the average value as a color feature of the key frame image information corresponding to the contour image.
Wherein the fourth processing unit 705 further comprises a ninth processing subunit 7055, a second analysis subunit 7056, and a tenth processing subunit 7057.
A ninth processing subunit 7055, configured to compare the contour image of the identified key frame image, the coordinate value corresponding to each contour image, and the color feature corresponding to each contour image with a preset feature parameter of the docking device, to obtain error information of the identified feature and the preset feature parameter of the docking device;
the second analysis subunit 7056 is configured to perform linear regression analysis processing on the error information and the preset historical docking information of the flight device, so as to obtain a docking success rate relationship corresponding to each error information;
and a tenth processing subunit 7057, configured to perform constraint optimization processing by using a lagrangian multiplier method according to the docking success rate relationship corresponding to each error information, so as to obtain an adjusted air docking scheme.
And a fifth processing unit 706, configured to perform aerial docking of the flying device based on the adjusted aerial docking scheme until the aerial docking of the flying device is completed.
It should be noted that, regarding the apparatus in the above embodiments, the specific manner in which the respective modules perform the operations has been described in detail in the embodiments regarding the method, and will not be described in detail herein.
The above description is only of the preferred embodiments of the present invention and is not intended to limit the present invention, but various modifications and variations can be made to the present invention by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
The foregoing is merely illustrative of the present invention, and the present invention is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present invention. Therefore, the protection scope of the invention is subject to the protection scope of the claims.

Claims (10)

1. An aerial docking method for flying equipment, comprising:
acquiring first information, wherein the first information comprises operation parameters of all flight equipment to be docked, and the operation parameters of the flight equipment comprise flight speed parameters, flight height parameters and position parameters of the flight equipment;
sending the first information to a docking planning model of the flying equipment for processing, and obtaining an aerial docking scheme of the flying equipment;
performing aerial docking of the flight equipment based on the aerial docking scheme of the flight equipment, and acquiring aerial docking image information of the flight equipment;
transmitting the aerial docking image information of the flying equipment to an image processing module to obtain key frame image information of aerial docking of the flying equipment;
performing feature recognition on the key frame image information of the aerial docking of the flight equipment, and performing error analysis and parameter adjustment based on the recognized features to obtain an adjusted aerial docking scheme;
and carrying out air docking of the flying equipment based on the adjusted air docking scheme until the flying equipment completes the air docking.
2. The aerial docking method of claim 1, wherein the sending the first information to a docking planning model of the aerial device for processing, to obtain an aerial docking scheme of the aerial device, comprises:
performing data conversion processing on the flight speed parameter, the flight height parameter and the position parameter of the flight equipment to obtain a path planning diagram, wherein the path planning diagram comprises nodes and edges, the nodes comprise position information of the docking equipment, and the edges are path information from the docking equipment to the flight equipment;
performing adjacency matrix conversion on the path planning diagram to obtain an association relation matrix of each docking device and the first information;
performing matrix decomposition processing on the incidence relation matrix to obtain potential feature vectors of each piece of docking equipment and the first information;
transmitting the potential feature vectors of each docking device and the first information to a trained neural network model to analyze an optimal docking path, so as to obtain an optimal docking path;
and integrating according to the optimal docking path and the preset docking step of the aerial equipment to obtain an aerial docking scheme of the flying equipment.
3. The flying device air docking method of claim 2, wherein the trained neural network model construction method comprises:
classifying preset historical potential feature vectors and historical optimal docking paths to obtain a training set and a verification set;
inputting the training set into a BP neural network model for predicting the docking paths, and treating all the docking paths obtained by prediction as particle swarm parameters of a particle swarm optimization algorithm, wherein the fitness value of each particle is calculated by the particle swarm optimization algorithm, and the individual optimal position and the global optimal position of the particle are obtained according to the fitness of the particle in the particle swarm;
dynamically tracking the individual optimal position and the global optimal position of the particles to continuously update the speeds and positions of all the particles until the particle swarm optimization algorithm reaches the maximum iteration times, and obtaining an optimized BP neural network model;
and sending the verification set to the optimized BP neural network model to obtain a prediction result, judging whether the prediction result is consistent with the verification set, and taking the optimized BP neural network model as a trained neural network model if the prediction result is consistent with the verification set.
4. The aerial docking method of claim 1, wherein sending the aerial docking image information of the aerial device to an image processing module, to obtain key frame image information of the aerial docking of the aerial device, comprises:
performing image recognition processing on the aerial docking image information of the flying equipment, wherein the aerial docking image information of all the flying equipment is respectively compared with the image of the docking equipment of the preset flying equipment to obtain at least one image information of the docking equipment containing the flying equipment;
extracting video contents in a preset time period before and after image information of docking equipment of the flying equipment to obtain at least one video segment containing the docking equipment;
marking each video segment, sorting marked videos according to the collected time, traversing the video segments at the last sorting position, and taking all images at the last sorting position as key frame images.
5. The aerial docking method of claim 1, wherein the feature recognition of the aerial-docked keyframe image information of the aerial device comprises:
carrying out gray level transformation on all the key frame image information to obtain gray level images corresponding to each frame of key frame image;
connecting lines of pixel points with the same gray value in the gray image, wherein the connecting lines are subjected to interpolation processing by adopting a linear interpolation method to obtain a contour image of the key frame image;
establishing a two-dimensional space rectangular coordinate system by taking a central point of the key frame image information as a coordinate origin, and sending the outline image of the key frame image to the two-dimensional space rectangular coordinate system to calculate and obtain a coordinate value corresponding to each outline image;
and clustering R, G, B components of all pixel points of the key frame image information corresponding to the contour image respectively, averaging all obtained clustering clusters to obtain an average value of all the clustering cluster center points, and taking the average value as the color characteristic of the key frame image information corresponding to the contour image.
6. An aerial docking device for a flying apparatus, comprising:
the system comprises an acquisition unit, a control unit and a control unit, wherein the acquisition unit is used for acquiring first information, the first information comprises operation parameters of all flight equipment to be docked, and the operation parameters of the flight equipment comprise flight speed parameters, flight height parameters and position parameters of the flight equipment;
the first processing unit is used for sending the first information to a docking planning model of the flying equipment for processing to obtain an aerial docking scheme of the flying equipment;
the second processing unit is used for carrying out aerial docking of the flight equipment based on the aerial docking scheme of the flight equipment, and acquiring aerial docking image information of the flight equipment;
the third processing unit is used for sending the aerial docking image information of the flying equipment to the image processing module to obtain the aerial docking key frame image information of the flying equipment;
the fourth processing unit is used for carrying out feature recognition on the key frame image information of the aerial docking of the flight equipment, carrying out error analysis and parameter adjustment based on the recognized features, and obtaining an adjusted aerial docking scheme;
and the fifth processing unit is used for carrying out the aerial docking of the flying equipment based on the adjusted aerial docking scheme until the flying equipment completes the aerial docking.
7. The flying device air docking apparatus of claim 6, wherein the first processing unit comprises:
the first processing subunit is used for carrying out data conversion processing on the flight speed parameter, the flight height parameter and the position parameter of the flight equipment to obtain a path planning diagram, wherein the path planning diagram comprises nodes and edges, the nodes comprise position information of the docking equipment, and the edges are path information from the docking equipment to the flight equipment;
the second processing subunit is used for performing adjacency matrix conversion on the path planning diagram to obtain an association relation matrix of each docking device and the first information;
the third processing subunit is used for performing matrix decomposition processing on the association relation matrix to obtain potential feature vectors of each docking device and the first information;
the first analysis subunit is used for sending the potential feature vectors of each piece of docking equipment and the first information to the trained neural network model to analyze an optimal docking path so as to obtain the optimal docking path;
and the fourth processing subunit is used for integrating the optimal docking path and the preset docking step of the aerial equipment to obtain an aerial docking scheme of the aerial equipment.
8. The flying device air docking apparatus of claim 7, wherein the first analysis subunit comprises:
the fifth processing subunit is used for classifying the preset historical potential feature vectors and the historical optimal docking paths to obtain a training set and a verification set;
the first optimizing subunit is used for inputting the training set into a BP neural network model for predicting the docking paths, and treating all the docking paths obtained by prediction as particle swarm parameters of a particle swarm optimization algorithm, wherein the fitness value of each particle is calculated through the particle swarm optimization algorithm, and the individual optimal position and the global optimal position of the particle are obtained according to the fitness of the particle in the particle swarm;
the second optimizing subunit is used for dynamically tracking the individual optimal position and the global optimal position of the particles to continuously update the speeds and positions of all the particles until the particle swarm optimization algorithm reaches the maximum iteration times, so as to obtain an optimized BP neural network model;
and the third optimization subunit is used for sending the verification set to the optimized BP neural network model to obtain a prediction result, judging whether the prediction result is consistent with the verification set, and taking the optimized BP neural network model as a trained neural network model if the prediction result is consistent with the verification set.
9. The flying device air docking apparatus of claim 6, wherein the third processing unit comprises:
a sixth processing subunit, configured to perform image recognition processing on the aerial docking image information of the flying device, where the aerial docking image information of all the flying devices is respectively compared with an image of a docking device of a preset flying device, so as to obtain image information of at least one docking device including the flying device;
a seventh processing subunit, configured to extract video content in a preset time period before and after image information of a docking device of the flight device, to obtain at least one video segment including the docking device;
and the eighth processing subunit is used for marking each video segment, sorting marked videos according to the collected time, traversing the sorting position in the last video segment, and taking all images in the last video segment as key frame images.
10. The flying device air docking apparatus of claim 6, wherein the fourth processing unit comprises:
the first identification subunit is used for carrying out gray level transformation on all the key frame image information to obtain a gray level image corresponding to each frame of key frame image;
the second recognition subunit is used for connecting the pixel points with the same gray value in the gray image, wherein the connecting line is subjected to interpolation processing by adopting a linear interpolation method, so that a contour image of the key frame image is obtained;
the third identification subunit is used for establishing a two-dimensional space rectangular coordinate system by taking the central point of the key frame image information as a coordinate origin, and sending the outline image of the key frame image to the two-dimensional space rectangular coordinate system to calculate and obtain a coordinate value corresponding to each outline image;
and the fourth recognition subunit is used for respectively carrying out clustering processing on R, G, B components of all pixel points of the key frame image information corresponding to the contour image, carrying out average value calculation on the central points of all obtained clustering clusters to obtain an average value of the central points of all the clustering clusters, and taking the average value as the color characteristic of the key frame image information corresponding to the contour image.
CN202310777034.0A 2023-06-29 2023-06-29 Aerial docking method and device for flight equipment Active CN116503768B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310777034.0A CN116503768B (en) 2023-06-29 2023-06-29 Aerial docking method and device for flight equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310777034.0A CN116503768B (en) 2023-06-29 2023-06-29 Aerial docking method and device for flight equipment

Publications (2)

Publication Number Publication Date
CN116503768A (en) 2023-07-28
CN116503768B CN116503768B (en) 2023-11-07

Family

ID=87330582

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310777034.0A Active CN116503768B (en) 2023-06-29 2023-06-29 Aerial docking method and device for flight equipment

Country Status (1)

Country Link
CN (1) CN116503768B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103822635A (en) * 2014-03-05 2014-05-28 北京航空航天大学 Visual information based real-time calculation method of spatial position of flying unmanned aircraft
WO2016015547A1 (en) * 2014-08-01 2016-02-04 深圳中集天达空港设备有限公司 Machine vision-based method and system for aircraft docking guidance and aircraft type identification
CN111025246A (en) * 2019-11-28 2020-04-17 北京遥测技术研究所 Simulation system and method for composite scene imaging of sea surface and ship by using stationary orbit SAR
CN115660477A (en) * 2022-10-25 2023-01-31 中国农业科学院草原研究所 Mutton quality evaluation method and system based on multiple evaluation indexes
CN115686043A (en) * 2022-10-28 2023-02-03 南京航空航天大学 Fixed-wing aircraft and air docking method of rotor aircraft
CN116310898A (en) * 2023-02-28 2023-06-23 武汉理工大学 Forest fire spread prediction method and system based on neural network and Huygens principle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YU Dahai: "Research on trajectory control of the automatic aerial refueling docking process for high-altitude UAVs" (高空无人机自动加油对接过程轨迹控制研究), Computer Measurement & Control, no. 02, pages 460-466 *

Also Published As

Publication number Publication date
CN116503768B (en) 2023-11-07


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant