CN116700324A - Unmanned aerial vehicle formation information transmission method, unmanned aerial vehicle formation information transmission system and electronic equipment - Google Patents


Info

Publication number
CN116700324A
CN116700324A (application CN202310536236.6A)
Authority
CN
China
Prior art keywords: information, unmanned aerial vehicle, plane, vehicle formation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310536236.6A
Other languages
Chinese (zh)
Inventor
刘泽峰
龚晶
熊嵩
李慧盈
刘君泽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aerospace Era Feipeng Co ltd
Original Assignee
Aerospace Era Feipeng Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aerospace Era Feipeng Co ltd filed Critical Aerospace Era Feipeng Co ltd
Priority to CN202310536236.6A priority Critical patent/CN116700324A/en
Publication of CN116700324A publication Critical patent/CN116700324A/en
Pending legal-status Critical Current

Classifications

    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 - Reducing energy consumption in communication networks
    • Y02D30/70 - Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Optical Communication System (AREA)

Abstract

The application provides an unmanned aerial vehicle formation information transmission method, system and electronic equipment. The method comprises: presetting fuselage identifiers for the lead aircraft and the wingmen; presetting a signal lamp, an image acquisition device and an information processing device for each aircraft; the lead aircraft transmitting decision information through its signal lamp and/or fuselage identifier; and each wingman acquiring fuselage identifiers through its image acquisition device and deriving relative position information through its information processing device. The application can still complete cooperative unmanned aerial vehicle formation flight in real time and with high efficiency under signal blockage or communication interference, and can carry out formation task decisions, relative position adjustment and formation task execution independently of ground control.

Description

Unmanned aerial vehicle formation information transmission method, unmanned aerial vehicle formation information transmission system and electronic equipment
[ Technical Field ]
The application relates to the technical field of unmanned aerial vehicles, in particular to an unmanned aerial vehicle formation information transmission method, an unmanned aerial vehicle formation information transmission system and electronic equipment.
[ Background Art ]
Unmanned aerial vehicles are an important class of aircraft, and imaging the ground through an onboard optical imaging sensor is one of their most important functions. Depending on the design, unmanned aerial vehicles commonly operate at altitudes of 10 to 20,000 meters. Compared with other platforms, unmanned aerial vehicle imaging offers high resolution and can image the same target or scene many times, producing multi-view, multi-position imagery that plays an important role in both the military and civil fields. Applications based on unmanned aerial vehicle platforms, such as topographic mapping, urban building imaging and environmental remote sensing, all extract information directly from optical imaging. In the civil field, thanks to the fast task execution and high resolution of unmanned aerial vehicle remote sensing, the technology has developed rapidly in areas such as digital cities and environmental monitoring and plays an irreplaceable role. For example, unmanned aerial vehicle low-altitude remote sensing plays an important role in post-earthquake disaster monitoring and in command and disaster-relief decision making.
Compared with single-aircraft flight, cooperative formation flight of unmanned aerial vehicles has incomparable advantages. In unmanned aerial vehicle image analysis, direct two-dimensional analysis mainly identifies targets such as roads and vehicles in the images, or analyzes the positions of targets relative to certain specified references; the emphasis remains on the extraction and identification of ground features. Acquiring three-dimensional information of a target from unmanned aerial vehicle imagery is a qualitative leap relative to two-dimensional analysis and is therefore an important direction for unmanned aerial vehicle image research and application. Through three-dimensional analysis of image sequences, an unmanned aerial vehicle platform can obtain accurate information such as target position, morphology and three-dimensional structure, which is of great significance for modern warfare and remote sensing mapping. For visual coordination within an unmanned aerial vehicle formation, the precondition of three-dimensional reconstruction is that the image information acquired by the different aircraft is transmitted correctly.
Information interaction is a very important technology in cooperative formation flight: it is used to transmit control commands to the unmanned aerial vehicles and to receive the feedback information they send back. At present, most unmanned aerial vehicles interact through Wi-Fi, Bluetooth, radio waves and the like, with an interaction range generally limited to about 1 km. Besides the limited range, the signals of these interaction modes are easily blocked by obstacles and have poor anti-interference capability.
Existing formation communication relies mainly on data links, including distributed communication between aircraft such as point-to-point data links or wireless broadcast serial ports, and centralized communication between the aircraft and the ground. The most direct problems of the prior art are that it is easily interfered with, formation information can no longer be exchanged once interference occurs, anti-interference data links are too expensive and have a high technical threshold, and the transmission means is single and easily restricted. The technical difficulty in developing new unmanned aerial vehicle communication is that the traditional communication mode must be abandoned, while a new communication mode may have limited transmission efficiency and requires the support of intelligent algorithms such as pattern recognition.
Accordingly, there is a need to develop an unmanned aerial vehicle formation information transmission method, system and electronic device to address the deficiencies of the prior art and to solve or mitigate one or more of the problems described above.
[ Summary of the Application ]
In view of the above, the application provides an unmanned aerial vehicle formation information transmission method, system and electronic equipment, which can still complete cooperative unmanned aerial vehicle formation flight in real time and with high efficiency under signal blockage or communication interference, and can carry out formation task decisions, relative position adjustment and formation task execution independently of ground control.
In one aspect, the present application provides an unmanned aerial vehicle formation information transmission method, which comprises:
presetting fuselage identifiers for the lead aircraft and the wingmen;
presetting a signal lamp, an image acquisition device and an information processing device for each aircraft;
the lead aircraft transmitting decision information through its signal lamp and/or fuselage identifier;
each wingman acquiring fuselage identifiers through its image acquisition device and deriving relative position information through its information processing device.
With reference to the above aspect and any possible implementation thereof, an implementation is further provided in which the decision information is transmitted by signal lamp transmission and/or pattern recognition transmission;
the signal lamp transmission is specifically: a wingman acquires the characteristic information of the signal lamps of the lead aircraft and/or of other wingmen through its image acquisition device, and then obtains the corresponding decision information from that characteristic information;
the pattern recognition transmission is specifically: a wingman acquires behavior information of the lead aircraft through its image acquisition device and obtains the corresponding decision information from that behavior information.
With reference to the above aspect and any possible implementation thereof, an implementation is further provided in which the behavior information comprises periodic and/or regular movement in the vertical direction and/or the horizontal direction.
With reference to the above aspect and any possible implementation thereof, an implementation is further provided in which presetting the fuselage identifiers comprises:
arranging, on the fuselages of the lead aircraft and the wingmen, image features covering not less than 1/3 of the fuselage area;
matching the type and the number of each unmanned aerial vehicle uniquely with the image features arranged on it;
storing the result of the unique matching in the information processing device.
With reference to the above aspect and any possible implementation thereof, an implementation is further provided in which the image features comprise color features and shape features.
With reference to the above aspect and any possible implementation thereof, an implementation is further provided in which presetting the signal lamp, the image acquisition device and the information processing device comprises:
arranging a signal lamp on the unmanned aerial vehicle;
presetting, in the information processing device, a one-to-one correspondence between decision information and the characteristic information of the signal lamp;
connecting the image acquisition device and the signal lamp to the information processing device;
the characteristic information of the signal lamp being the color of the signal lamp.
With reference to the above aspect and any possible implementation thereof, an implementation is further provided in which the image acquisition device is a depth camera.
With reference to the above aspect and any possible implementation thereof, an implementation is further provided in which the relative position information transmission comprises:
a wingman acquiring the projection positions of the lead aircraft and/or of other wingmen through the depth camera;
obtaining the relative position information from the projection positions;
the relative position information comprising a relative azimuth angle and a relative elevation angle.
In another aspect, the present application provides an unmanned aerial vehicle formation information transmission system, which comprises:
a fuselage identifier setting module, used for presetting fuselage identifiers for the lead aircraft and the wingmen;
an onboard hardware setting module, used for presetting a signal lamp, an image acquisition device and an information processing device for the lead aircraft and the wingmen;
a decision information transmission module, used for the lead aircraft to transmit decision information through its signal lamp and/or fuselage identifier;
and a relative position information transmission module, used for a wingman to acquire fuselage identifiers through its image acquisition device and derive relative position information through its information processing device.
In another aspect, the present application provides an electronic device, comprising: at least one processor; and a memory communicatively coupled to the at least one processor, wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform any one of the above unmanned aerial vehicle formation information transmission methods.
Compared with the prior art, the application can obtain the following technical effects:
the application can enrich communication means while realizing unmanned aerial vehicle formation distributed communication, and bypasses the possible interference problem of traditional communication, thereby effectively improving the reliability of formation internal communication and ensuring the normal cooperative execution of formation tasks.
Of course, it is not necessary for any of the products embodying the application to achieve all of the technical effects described above at the same time.
[ Description of the Drawings ]
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of an unmanned aerial vehicle formation information transfer method according to an embodiment of the present application;
fig. 2 is a schematic diagram of an electronic device according to an embodiment of the present application.
[ Detailed Description of the Application ]
For a better understanding of the technical solution of the present application, the following detailed description of the embodiments of the present application refers to the accompanying drawings.
It should be understood that the described embodiments are merely some, but not all, embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The terminology used in the embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
As shown in fig. 1, the present application provides an unmanned aerial vehicle formation information transmission method, which comprises:
presetting fuselage identifiers for the lead aircraft and the wingmen;
presetting a signal lamp, an image acquisition device and an information processing device for each aircraft;
the lead aircraft transmitting decision information through its signal lamp and/or fuselage identifier;
each wingman acquiring fuselage identifiers through its image acquisition device and deriving relative position information through its information processing device.
The decision information is transmitted by signal lamp transmission and/or pattern recognition transmission.
The signal lamp transmission is specifically: a wingman acquires the characteristic information of the signal lamps of the lead aircraft and/or of other wingmen through its image acquisition device, and then obtains the corresponding decision information from that characteristic information.
The pattern recognition transmission is specifically: a wingman acquires behavior information of the lead aircraft through its image acquisition device and obtains the corresponding decision information from that behavior information. The behavior information comprises periodic and/or regular movement in the vertical direction and/or the horizontal direction.
Presetting the fuselage identifiers comprises:
arranging, on the fuselages of the lead aircraft and the wingmen, image features covering not less than 1/3 of the fuselage area;
matching the type and the number of each unmanned aerial vehicle uniquely with the image features arranged on it;
storing the result of the unique matching in the information processing device.
The image features comprise color features and shape features; an illustrative matching table is sketched below.
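As an illustration only, the unique matching between fuselage image features and aircraft identity could be stored and queried in the information processing device roughly as in the following Python sketch. The feature encoding (one color name plus one shape name per marker), the palette of colors and shapes, and every table entry are hypothetical assumptions, not values taken from the disclosure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class FuselageFeature:
    """One fuselage marker, described by its color feature and shape feature."""
    color: str   # e.g. "red", "blue" (hypothetical palette)
    shape: str   # e.g. "triangle", "circle" (hypothetical shape set)

@dataclass
class AircraftIdentity:
    uav_type: str   # e.g. "lead" or "wingman"
    number: int

# Hypothetical pre-stored matching table: each feature pair is unique to one aircraft.
FEATURE_TABLE = {
    FuselageFeature("red", "triangle"): AircraftIdentity("lead", 0),
    FuselageFeature("blue", "circle"):  AircraftIdentity("wingman", 1),
    FuselageFeature("blue", "square"):  AircraftIdentity("wingman", 2),
}

def identify_aircraft(color: str, shape: str) -> Optional[AircraftIdentity]:
    """Return the aircraft identity matched to a detected marker, or None if unknown."""
    return FEATURE_TABLE.get(FuselageFeature(color, shape))

if __name__ == "__main__":
    # Features assumed to have been extracted upstream from a wingman's camera image.
    print(identify_aircraft("red", "triangle"))  # AircraftIdentity(uav_type='lead', number=0)
```

The same table would be loaded onto every aircraft before take-off, so that any wingman resolves a detected marker to the same type and number.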
Presetting the signal lamp, the image acquisition device and the information processing device comprises:
arranging a signal lamp on the unmanned aerial vehicle;
presetting, in the information processing device, a one-to-one correspondence between decision information and the characteristic information of the signal lamp;
connecting the image acquisition device and the signal lamp to the information processing device.
The characteristic information of the signal lamp is the color of the signal lamp; an illustrative color-to-decision mapping is sketched below.
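A minimal sketch, assuming a preset color-to-decision table and a crude mean-channel color classifier, is given below. The color names, the command strings and the classification rule are illustrative assumptions; a fielded system would calibrate thresholds (for example in HSV space) for the actual lamps and camera.

```python
import numpy as np

# Hypothetical one-to-one correspondence between lamp color and decision information.
COLOR_TO_DECISION = {
    "red":   "hold current formation",
    "green": "close in on the lead aircraft",
    "blue":  "begin target search",
}

def dominant_lamp_color(roi_rgb: np.ndarray) -> str:
    """Classify the lamp color in an RGB region of interest by its mean channel values.

    roi_rgb is an H x W x 3 uint8 array cropped around the detected signal lamp.
    The mean-channel rule is only for illustration; a fielded system would use
    calibrated HSV thresholds or a learned classifier.
    """
    means = roi_rgb.reshape(-1, 3).mean(axis=0)
    return ("red", "green", "blue")[int(np.argmax(means))]

def decode_decision(roi_rgb: np.ndarray) -> str:
    """Map the detected lamp color to its preset decision information."""
    return COLOR_TO_DECISION[dominant_lamp_color(roi_rgb)]

if __name__ == "__main__":
    fake_green_lamp = np.zeros((8, 8, 3), dtype=np.uint8)
    fake_green_lamp[..., 1] = 255            # synthetic all-green patch
    print(decode_decision(fake_green_lamp))  # -> "close in on the lead aircraft"
```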
The image acquisition device is a depth camera. The relative position information transmission comprises:
a wingman acquiring the projection positions of the lead aircraft and/or of other wingmen through the depth camera;
obtaining the relative position information from the projection positions;
the relative position information comprising a relative azimuth angle and a relative elevation angle. An illustrative computation is sketched below.
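The disclosure does not specify how the projection position is converted into angles, so the following sketch assumes a standard pinhole-camera model; the intrinsic parameters (fx, fy, cx, cy), the axis conventions and the example values are assumptions for illustration only.

```python
import math

def relative_direction(u: float, v: float, depth_m: float,
                       fx: float, fy: float, cx: float, cy: float):
    """Convert a pixel projection (u, v) and its depth reading into relative angles and range.

    Assumes a pinhole camera whose optical axis points forward, with u growing to
    the right and v growing downward (the usual image convention).
    Returns (azimuth_deg, elevation_deg, distance_m) of the observed aircraft
    relative to the observing wingman's camera.
    """
    x = (u - cx) / fx                                             # normalized offset to the right
    y = (v - cy) / fy                                             # normalized offset downward
    azimuth = math.degrees(math.atan2(x, 1.0))                    # positive to the right
    elevation = math.degrees(math.atan2(-y, math.hypot(1.0, x)))  # positive upward
    return azimuth, elevation, depth_m

if __name__ == "__main__":
    # Hypothetical intrinsics for a 640 x 480 depth camera.
    az, el, dist = relative_direction(u=400.0, v=200.0, depth_m=12.5,
                                      fx=525.0, fy=525.0, cx=320.0, cy=240.0)
    print(f"azimuth {az:.1f} deg, elevation {el:.1f} deg, range {dist:.1f} m")
```

The range here is simply the depth reading at the projection; together with the azimuth and elevation it gives the wingman what it needs to adjust its vertical and horizontal position against the formation configuration requirements.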
Example 1:
The application is applicable to the following communication conditions: low-speed multi-rotor, compound-wing or fixed-wing unmanned aerial vehicles, operating in the daytime under ordinary visual conditions.
The unmanned aerial vehicle formation mechanism is as follows: a lead-aircraft formation mechanism with a star-shaped control structure is adopted, in which all wingmen obey the task commands of the single lead aircraft, so that in principle the formation can make decisions autonomously in the absence of ground station control.
The unmanned aerial vehicle formation communication standard is as follows:
According to the task characteristics of unmanned aerial vehicle formation, the main elements of formation communication are determined as: the unmanned aerial vehicle type and number; the relative positions between the unmanned aerial vehicles; the formation task execution phase; and the formation task decisions.
The onboard equipment required for unmanned aerial vehicle formation information transmission in the application comprises: a fuselage identifier; a depth camera; a signal lamp; and an image processing computing platform.
The information transmission modes of the application are as follows:
In one particular embodiment, the unmanned aerial vehicle type and number are transmitted through the fuselage identifier. The identifier may occupy a relatively large area, its characteristic parameters include color and shape, and several identifiers may be used to represent the type and the number respectively.
In one particular embodiment, the relative positions between the unmanned aerial vehicles are obtained through the depth camera: the relative azimuth and elevation angles are judged from the projection positions of the other aircraft on the camera and the vertical position is adjusted accordingly, while the relative distance is judged by the depth camera and the computing platform and the horizontal position is adjusted according to the formation configuration requirements.
In one particular embodiment, the formation task execution phase is signaled through the signal lamps. A typical formation task can be divided into approaching the target area (searching for the target), forming up (closing in or dispersing according to a given formation configuration) and operating (carrying out reconnaissance and other tasks against the target); light of a different color is displayed at each phase, and the other aircraft (mainly the lead aircraft) acquire this information through their cameras, as in the phase table sketched below.
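As an illustration only, the task phases and their lamp colors could be represented on the information processing device as follows; the disclosure only requires that each phase be shown with a distinct color, so the particular colors and names below are assumptions.

```python
from enum import Enum
from typing import Optional

class TaskPhase(Enum):
    """Typical formation task phases named in the embodiment."""
    APPROACH_TARGET_AREA = "approach target area"   # searching for the target
    FORM_UP = "form up"                             # close in / disperse to the configuration
    OPERATE = "operate"                             # reconnaissance and other tasks on the target

# Hypothetical phase-to-lamp-color assignment; only the distinctness of the colors matters.
PHASE_COLOR = {
    TaskPhase.APPROACH_TARGET_AREA: "white",
    TaskPhase.FORM_UP: "green",
    TaskPhase.OPERATE: "red",
}
COLOR_PHASE = {color: phase for phase, color in PHASE_COLOR.items()}

def phase_from_lamp_color(detected_color: str) -> Optional[TaskPhase]:
    """Infer which task phase another aircraft is in from its detected lamp color."""
    return COLOR_PHASE.get(detected_color)

if __name__ == "__main__":
    print(phase_from_lamp_color("green"))  # TaskPhase.FORM_UP
```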
Formation task decision:
In one embodiment, when the lead aircraft decides that the formation should shift to a certain task phase or perform a certain action (for example, notifying every aircraft to close in after a target has been found), the decision information can be transmitted through the signal lamp.
In a specific embodiment, when the distance is large or the background illumination is too strong, the decision information can instead be transmitted by pattern recognition: for example, the lead aircraft moves up and down regularly and periodically, and the other aircraft recognize this motion through their cameras and computing platforms, thereby obtaining the decision information and executing the corresponding tasks; an illustrative detector for such periodic motion is sketched below.
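A minimal sketch of such a detector, assuming the observing aircraft already tracks the vertical image coordinate of the lead aircraft's centroid at a known frame rate, is given below. The frame rate, frequency band and amplitude threshold are assumptions chosen for the example, not values prescribed by the disclosure.

```python
import numpy as np

def detect_periodic_vertical_motion(v_track: np.ndarray, fps: float,
                                    band_hz=(0.2, 2.0), min_amplitude_px=5.0) -> bool:
    """Return True if the tracked vertical pixel position oscillates periodically.

    v_track: 1-D array of the lead aircraft's vertical image coordinate over time.
    fps:     camera frame rate used to sample v_track.
    A spectral peak inside band_hz with a sufficient equivalent amplitude is taken
    as evidence of the deliberate up-and-down signalling motion.
    """
    signal = v_track - v_track.mean()                  # remove the constant offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    in_band = (freqs >= band_hz[0]) & (freqs <= band_hz[1])
    if not in_band.any():
        return False
    peak = spectrum[in_band].max()
    amplitude_px = 2.0 * peak / len(signal)            # approximate oscillation amplitude
    return amplitude_px >= min_amplitude_px

if __name__ == "__main__":
    fps = 20.0
    t = np.arange(0, 6, 1.0 / fps)
    # Synthetic track: a 1 Hz oscillation of +/- 12 px plus noise (assumed for the demo).
    track = 240 + 12 * np.sin(2 * np.pi * 1.0 * t) + np.random.normal(0, 1, t.size)
    print(detect_periodic_vertical_motion(track, fps))  # expected: True
```

Once the oscillation is confirmed, the observing aircraft maps it to the corresponding preset decision information and executes the task, just as it would for a lamp-color command.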
Corresponding to the above embodiments, an embodiment of the present application also discloses an unmanned aerial vehicle formation information transmission system, which comprises:
a fuselage identifier setting module, used for presetting fuselage identifiers for the lead aircraft and the wingmen;
an onboard hardware setting module, used for presetting a signal lamp, an image acquisition device and an information processing device for the lead aircraft and the wingmen;
a decision information transmission module, used for the lead aircraft to transmit decision information through its signal lamp and/or fuselage identifier;
and a relative position information transmission module, used for a wingman to acquire fuselage identifiers through its image acquisition device and derive relative position information through its information processing device.
For parts of this embodiment that are not described in detail, reference is made to the content of the above method embodiment, which is not repeated here.
Referring to fig. 2, an embodiment of the present application also provides an electronic device 60, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor, wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the unmanned aerial vehicle formation information transmission method in the foregoing method embodiments.
An embodiment of the application also provides a computer program product, which comprises a computer program stored on a non-transitory computer readable storage medium; the computer program comprises program instructions which, when executed by a computer, cause the computer to execute the unmanned aerial vehicle formation information transmission method in the foregoing method embodiments.
Referring now to fig. 2, a schematic diagram of an electronic device 60 suitable for use in implementing embodiments of the present application is shown. The electronic device in the embodiment of the present application may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a car-mounted terminal (e.g., car navigation terminal), etc., and a stationary terminal such as a digital TV, a desktop computer, etc. The electronic device shown in fig. 2 is only an example and should not be construed as limiting the functionality and scope of use of the embodiments of the application.
As shown in fig. 2, the electronic device 60 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 601, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage means 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the electronic device 60 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other through a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
In general, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touchpad, keyboard, mouse, image sensor, microphone, accelerometer, gyroscope, etc.; an output device 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 608 including, for example, magnetic tape, hard disk, etc.; and a communication device 609. The communication means 609 may allow the electronic device 60 to communicate with other devices wirelessly or by wire to exchange data. While an electronic device 60 having various means is shown, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present application, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via communication means 609, or from storage means 608, or from ROM 602. The above-described functions defined in the method of the embodiment of the present application are performed when the computer program is executed by the processing means 601.
The computer readable medium of the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquiring at least two internet protocol addresses; sending a node evaluation request comprising the at least two internet protocol addresses to node evaluation equipment, wherein the node evaluation equipment selects an internet protocol address from the at least two internet protocol addresses and returns the internet protocol address; receiving an Internet protocol address returned by the node evaluation equipment; wherein the acquired internet protocol address indicates an edge node in the content distribution network.
Alternatively, the computer-readable medium carries one or more programs that, when executed by the electronic device, cause the electronic device to: receiving a node evaluation request comprising at least two internet protocol addresses; selecting an internet protocol address from the at least two internet protocol addresses; returning the selected internet protocol address; wherein the received internet protocol address indicates an edge node in the content distribution network.
Computer program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk or C++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented in software or in hardware. The name of the unit does not in any way constitute a limitation of the unit itself, for example the first acquisition unit may also be described as "unit acquiring at least two internet protocol addresses".
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof.
The unmanned aerial vehicle formation information transmission method, the unmanned aerial vehicle formation information transmission system and the electronic equipment provided by the embodiments of the application have been described in detail above. The above description of the embodiments is only intended to aid understanding of the method of the present application and its core ideas; meanwhile, since those skilled in the art may vary the specific embodiments and the scope of application in accordance with the ideas of the present application, this description should not be construed as limiting the present application.
Certain terms are used throughout the description and claims to refer to particular components. Those of skill in the art will appreciate that a hardware manufacturer may refer to the same component by different names. The description and claims do not distinguish components by differences in name, but by differences in function. As referred to throughout the specification and claims, the terms "comprising" and "including" are open-ended and should be interpreted as "including but not limited to". "Substantially" means that, within an acceptable error range, a person skilled in the art is able to solve the technical problem and substantially achieve the technical effect. The description hereinafter sets forth preferred embodiments for practicing the application, but is given for the purpose of illustrating the general principles of the application and is not intended to limit its scope. The scope of the application is defined by the appended claims.
It should also be noted that the terms "comprises", "comprising" or any other variation thereof are intended to cover a non-exclusive inclusion, such that a product or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such product or system. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a product or system comprising that element.
It should be understood that the term "and/or" as used herein merely describes an association between associated objects and means that three relationships are possible; for example, A and/or B may represent: A exists alone, A and B exist together, or B exists alone. In addition, the character "/" herein generally indicates that the associated objects before and after it are in an "or" relationship.
While the foregoing description illustrates and describes preferred embodiments of the present application, it is to be understood that the application is not limited to the forms disclosed herein; these embodiments should not be regarded as excluding other embodiments, and the application may be used in various other combinations, modifications and environments and may be changed within the scope of the inventive concept described herein, in light of the above teachings or the skill or knowledge of the relevant art. Modifications and variations made by those skilled in the art that do not depart from the spirit and scope of the application are intended to fall within the scope of the appended claims.

Claims (10)

1. An unmanned aerial vehicle formation information transmission method for a communication interference environment, the method comprising:
presetting fuselage identifiers for the lead aircraft and the wingmen;
presetting a signal lamp, an image acquisition device and an information processing device for each aircraft;
the lead aircraft transmitting decision information through its signal lamp and/or fuselage identifier;
each wingman acquiring fuselage identifiers through its image acquisition device and deriving relative position information through its information processing device.
2. The unmanned aerial vehicle formation information transmission method of claim 1, wherein the decision information is transmitted by signal lamp transmission and/or pattern recognition transmission;
the signal lamp transmission is specifically: a wingman acquires the characteristic information of the signal lamps of the lead aircraft and/or of other wingmen through its image acquisition device, and then obtains the corresponding decision information from that characteristic information;
the pattern recognition transmission is specifically: a wingman acquires behavior information of the lead aircraft through its image acquisition device and obtains the corresponding decision information from that behavior information.
3. The unmanned aerial vehicle formation information transmission method of claim 2, wherein the behavior information comprises periodic and/or regular movement in the vertical direction and/or the horizontal direction.
4. The unmanned aerial vehicle formation information transmission method of claim 1, wherein presetting the fuselage identifiers comprises:
arranging, on the fuselages of the lead aircraft and the wingmen, image features covering not less than 1/3 of the fuselage area;
matching the type and the number of each unmanned aerial vehicle uniquely with the image features arranged on it;
storing the result of the unique matching in the information processing device.
5. The unmanned aerial vehicle formation information transmission method of claim 4, wherein the image features comprise color features and shape features.
6. The unmanned aerial vehicle formation information transmission method of claim 2, wherein presetting the signal lamp, the image acquisition device and the information processing device comprises:
arranging a signal lamp on the unmanned aerial vehicle;
presetting, in the information processing device, a one-to-one correspondence between decision information and the characteristic information of the signal lamp;
connecting the image acquisition device and the signal lamp to the information processing device;
wherein the characteristic information of the signal lamp is the color of the signal lamp.
7. The unmanned aerial vehicle formation information transmission method of claim 6, wherein the image acquisition device is a depth camera.
8. The unmanned aerial vehicle formation information transmission method of claim 7, wherein the relative position information transmission comprises:
a wingman acquiring the projection positions of the lead aircraft and/or of other wingmen through the depth camera;
obtaining the relative position information from the projection positions;
wherein the relative position information comprises a relative azimuth angle and a relative elevation angle.
9. An unmanned aerial vehicle formation information transmission system, characterized in that the system comprises:
a fuselage identifier setting module, used for presetting fuselage identifiers for the lead aircraft and the wingmen;
an onboard hardware setting module, used for presetting a signal lamp, an image acquisition device and an information processing device for the lead aircraft and the wingmen;
a decision information transmission module, used for the lead aircraft to transmit decision information through its signal lamp and/or fuselage identifier;
and a relative position information transmission module, used for a wingman to acquire fuselage identifiers through its image acquisition device and derive relative position information through its information processing device.
10. An electronic device, comprising: at least one processor; and a memory communicatively coupled to the at least one processor, wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the unmanned aerial vehicle formation information transmission method of any one of claims 1-8.
CN202310536236.6A 2023-05-12 2023-05-12 Unmanned aerial vehicle formation information transmission method, unmanned aerial vehicle formation information transmission system and electronic equipment Pending CN116700324A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310536236.6A CN116700324A (en) 2023-05-12 2023-05-12 Unmanned aerial vehicle formation information transmission method, unmanned aerial vehicle formation information transmission system and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310536236.6A CN116700324A (en) 2023-05-12 2023-05-12 Unmanned aerial vehicle formation information transmission method, unmanned aerial vehicle formation information transmission system and electronic equipment

Publications (1)

Publication Number Publication Date
CN116700324A true CN116700324A (en) 2023-09-05

Family

ID=87842397

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310536236.6A Pending CN116700324A (en) 2023-05-12 2023-05-12 Unmanned aerial vehicle formation information transmission method, unmanned aerial vehicle formation information transmission system and electronic equipment

Country Status (1)

Country Link
CN (1) CN116700324A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination