CN115826622B - Night co-location method for unmanned aerial vehicle group - Google Patents


Info

Publication number
CN115826622B
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
camera
color
Prior art date: 2023-02-13
Legal status
Active
Application number
CN202310106488.5A
Other languages
Chinese (zh)
Other versions
CN115826622A (en)
Inventor
王震
张涛
于登秀
高超
李学龙
Current Assignee
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date: 2023-02-13
Filing date: 2023-02-13
Publication date: 2023-04-28
Application filed by Northwestern Polytechnical University
Priority to CN202310106488.5A
Publication of CN115826622A
Application granted
Publication of CN115826622B
Priority to US18/403,503

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/40 Control within particular dimensions
    • G05D1/46 Control of position or course in three dimensions
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00 Equipment not otherwise provided for
    • B64D47/02 Arrangements or adaptations of signal or lighting devices
    • B64D47/06 Arrangements or adaptations of signal or lighting devices for indicating aircraft presence
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20 Control system inputs
    • G05D1/24 Arrangements for determining position or orientation
    • G05D1/247 Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons
    • G05D1/249 Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons from positioning sensors located off-board the vehicle, e.g. from cameras
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/60 Intended control result
    • G05D1/69 Coordinated control of the position or course of two or more vehicles
    • G05D1/695 Coordinated control of the position or course of two or more vehicles for maintaining a fixed relative position of the vehicles, e.g. for convoy travelling or formation flight
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74 Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0004 Transmission of traffic-related information to or from an aircraft
    • G08G5/0008 Transmission of traffic-related information to or from an aircraft with other aircraft
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0017 Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
    • G08G5/0021 Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information located in the aircraft
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047 Navigation or guidance aids for a single aircraft
    • G08G5/0052 Navigation or guidance aids for a single aircraft for cruising
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047 Navigation or guidance aids for a single aircraft
    • G08G5/0069 Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0073 Surveillance aids
    • G08G5/0078 Surveillance aids for monitoring traffic from the aircraft
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D2203/00 Aircraft or airfield lights using LEDs
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 UAVs specially adapted for particular uses or applications
    • B64U2101/30 UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00 UAVs characterised by their flight controls
    • B64U2201/10 UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
    • B64U2201/102 UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS] adapted for flying in formations
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2109/00 Types of controlled vehicles
    • G05D2109/20 Aircraft, e.g. drones
    • G05D2109/25 Rotorcrafts
    • G05D2109/254 Flying platforms, e.g. multicopters
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2111/00 Details of signals used for control of position, course, altitude or attitude of land, water, air or space vehicles
    • G05D2111/10 Optical signals
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00 Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40 Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention belongs to the technical field of aircraft navigation and positioning and discloses a nighttime cooperative positioning method for an unmanned aerial vehicle group.

Description

Night co-location method for unmanned aerial vehicle group
Technical Field
The invention belongs to the technical field of aircraft navigation and positioning, and relates to a nighttime cooperative positioning method for unmanned aerial vehicle groups.
Background
With the development of science and technology, unmanned aerial vehicle (UAV) clusters have broad application prospects in both the military and civil fields. Owing to their strong operational capability, high system survival rate and low attack cost, UAV clusters oriented to low-altitude security under future security systems are of great significance to China's industrial production, social economy, and scientific research and education, as well as to protection, production safety, rescue, national defense, social stability and economic development. Acquiring a high-precision, high-reliability relative space-time relationship among the UAVs in a cluster is crucial to the flight safety of the cluster and to the execution of its tasks. The need for fast, economical and high-quality UAV cluster co-location technology is therefore growing.
At present, researchers at home and abroad have achieved abundant results in the field of autonomous relative positioning of UAV clusters, proposing a series of methods such as laser-pulse ranging, UWB ranging, visual ranging, ultrasonic ranging and radio ranging, which are widely applied in various fields. Laser-pulse ranging is extremely costly; UWB ranging has poor stability and can interfere with other wireless communication; ultrasonic ranging has a low acquisition rate and a narrow application range; radio ranging is easily jammed and has poor reliability. Compared with these methods, visual positioning offers low cost, passive sensing and low detectability, making it one of the important research directions for the future. However, existing visual ranging mainly relies on binocular cameras, which carry a heavy computational load and cannot meet nighttime requirements.
Meanwhile, the ultimate goal of UAV cluster applications is to meet all-weather, full-scene requirements, for which the main challenge comes from complex environments. Many research results already address complex geographic and meteorological environments, but cooperative positioning and sensing by UAVs at night has received little study, even though night operation is one of the important application scenarios of UAV clusters, particularly in the military field. A method for vision-based cooperative positioning within a UAV cluster at night is therefore needed to ensure normal operation of the cluster in night environments.
Disclosure of Invention
Aiming at the defects in the prior art, the invention provides a nighttime cooperative positioning method for unmanned aerial vehicle groups.
In order to achieve the above purpose, the invention adopts the following technical scheme:
A night co-location method for an unmanned aerial vehicle group, wherein the group comprises 5 unmanned aerial vehicles, each equipped with a two-dimensional turntable camera and an LED lamp, comprising the following steps:
step 1: pre-take-off arrangement of the unmanned cluster formation.
Arrange unmanned aerial vehicles 1, 2, 3 and 4 of the group in sequence in a rectangular formation on the take-off field, and place unmanned aerial vehicle 0 at the center of the rectangle, i.e. at the intersection of its diagonals. Ensure that the initial actual distance between adjacent vehicles is larger than the sum of their safety radii, and that the LED lamps of unmanned aerial vehicles 1 and 2, located in front of unmanned aerial vehicle 0, do not appear within the sight range of unmanned aerial vehicle 0's camera;
step 2: power on the unmanned cluster formation;
step 3: set the light color of each unmanned aerial vehicle's LED lamp;
the LED lamps of the two unmanned aerial vehicles on one diagonal of the rectangle are set to color I, those of the two vehicles on the other diagonal to color II, and the LED lamp of unmanned aerial vehicle 0 to color III, where colors I, II and III are all different;
step 4: automatic reference construction before take-off of the unmanned aerial vehicles, comprising positioning reference construction and time reference construction;
step 5: take-off of unmanned aerial vehicle 0 is controlled through a control instruction from an external system. Meanwhile, if during flight the LED light of unmanned aerial vehicle 1 or unmanned aerial vehicle 2 enters the view-angle range of unmanned aerial vehicle 0's camera, i.e. LED light pixels of color I or color II appear in the imaging plane of unmanned aerial vehicle 0's camera, unmanned aerial vehicle 0 sends an anti-collision warning instruction to the vehicle concerned through the communication topology;
step 6: as unmanned aerial vehicle 0 takes off, the real-time pixel coordinates of its color-III LED lamp in the camera imaging planes of unmanned aerial vehicles 1 to 4 change. The attitude controllers of unmanned aerial vehicles 1 to 4 compute the deviation of these coordinates from the pixel coordinate values (x10, y10) to (x40, y40) recorded and stored on the ground, and closed-loop control finally makes unmanned aerial vehicles 1 to 4 follow the take-off maneuver of unmanned aerial vehicle 0 (a minimal sketch of this loop follows below).
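For illustration only, not as part of the patented disclosure, the following minimal Python sketch shows how the step-6 pixel deviation could drive a proportional closed-loop follow command; the function name, gain and example coordinates are assumptions:

    # Hypothetical sketch of the step-6 closed loop: a drone steers so that
    # drone 0's colour-III LED returns to its stored reference pixel position.
    def follow_command(ref_px, cur_px, kp=0.002):
        """Proportional velocity set-point from the image-plane deviation."""
        ex = ref_px[0] - cur_px[0]   # horizontal pixel error
        ey = ref_px[1] - cur_px[1]   # vertical pixel error
        return kp * ex, kp * ey      # lateral / vertical velocity commands

    # Example: drone 1 stored (x10, y10) = (320, 240) before take-off; drone 0
    # lifts off and its LED is now seen at (320, 180), so drone 1 climbs too.
    vx, vz = follow_command((320, 240), (320, 180))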
Further, the specific process of constructing the positioning reference comprises the following steps:
step 4.1.1: set the angle α0 between the two-dimensional turntable camera axis of unmanned aerial vehicle 0 and that vehicle's heading to zero;
step 4.1.2: unmanned aerial vehicles 1 to 4 each automatically rotate their two-dimensional turntable cameras clockwise to search for the color-III LED light of unmanned aerial vehicle 0, until that light sits at the horizontal center of the camera imaging plane;
step 4.1.3: record and store the angle values α0 to α4 between the two-dimensional turntable camera axis of each of unmanned aerial vehicles 0 to 4 and the respective vehicle's heading at this moment;
step 4.1.4: start the closed-loop angle-holding control program of the two-dimensional turntable cameras, so that the angle values α0 to α4 remain consistent with the values recorded and stored before take-off throughout the subsequent flight;
step 4.1.5: record and store the pixel coordinate values (x10, y10) to (x40, y40) of unmanned aerial vehicle 0's color-III LED light spot on the camera imaging planes of unmanned aerial vehicles 1 to 4 at this moment; meanwhile, record and store the pixel coordinates of the LED lights of unmanned aerial vehicles 1 and 2 in each other's camera imaging planes, namely (x21, y21) for vehicle 1's light in vehicle 2's camera and (x12, y12) for vehicle 2's light in vehicle 1's camera, and likewise (x43, y43) and (x34, y34) for unmanned aerial vehicles 3 and 4;
step 4.1.6: obtain, by means of a directed communication topology, the pixel coordinates (x21, y21) and (x41, y41) of unmanned aerial vehicle 1 in the imaging planes of unmanned aerial vehicles 2 and 4, thereby obtaining the co-location reference information of unmanned aerial vehicle 1 in the cluster, recorded as {(x10, y10), (x21, y21), (x41, y41)}; similarly, unmanned aerial vehicles 2 to 4 obtain their co-location reference information in the cluster, {(x20, y20), (x12, y12), (x32, y32)}, {(x30, y30), (x23, y23), (x43, y43)} and {(x40, y40), (x14, y14), (x34, y34)}, respectively. The co-location reference information then serves as the position control instruction for unmanned aerial vehicles 1 to 4 (a minimal sketch of this assembly follows).
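A minimal sketch of how this reference set could be assembled in software is given below; the dictionary layout and values are illustrative assumptions, with own_obs[i][j] denoting the pixel coordinate of drone j's LED in drone i's camera:

    # Hypothetical assembly of the step-4.1.6 co-location reference information.
    own_obs = {
        1: {0: (320, 240)},                    # drone 1 sees drone 0's LED
        2: {0: (315, 238), 1: (100, 250)},     # drone 2 sees drones 0 and 1
        4: {0: (330, 242), 1: (540, 248)},     # drone 4 sees drones 0 and 1
    }

    def coloc_reference(i, neighbours):
        """Reference of drone i: its own view of drone 0, plus the pixel
        coordinates of drone i's LED reported over the directed links."""
        ref = [own_obs[i][0]]              # (x_i0, y_i0)
        for j in neighbours:
            ref.append(own_obs[j][i])      # (x_ji, y_ji), received from drone j
        return ref

    # Drone 1 exchanges with drones 2 and 4 under the Fig. 2 topology:
    print(coloc_reference(1, [2, 4]))      # [(x10,y10), (x21,y21), (x41,y41)]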
Further, the time reference construction comprises: performing communication clock synchronization among the unmanned aerial vehicles (one common way to do this is sketched below).
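The patent does not fix a synchronization algorithm; an NTP-style two-way message exchange is one plausible choice. A minimal sketch under that assumption:

    # Hypothetical two-way clock-offset estimate between two drones.
    def clock_offset(t1, t2, t3, t4):
        """t1: request sent (local clock); t2: request received (remote);
        t3: reply sent (remote); t4: reply received (local clock).
        Returns the estimated remote-minus-local clock offset."""
        return ((t2 - t1) + (t3 - t4)) / 2.0

    # Example timestamps in seconds (remote clock ~10 ms ahead, 5 ms RTT):
    offset = clock_offset(t1=0.0000, t2=0.0125, t3=0.0130, t4=0.0055)
    print(round(offset, 4))   # 0.01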
Further, during flight, a position change of any of unmanned aerial vehicles 1 to 4 causes a deviation between the real-time monitored LED light pixel coordinate values of each unmanned aerial vehicle and its co-location reference; formation keeping of the unmanned aerial vehicle group is then realized through the attitude closed-loop controller.
Further, the directed communication topology in step 4.1.6 is specifically: unmanned aerial vehicle 1 has a bidirectional communication relationship with unmanned aerial vehicles 2 and 4; unmanned aerial vehicle 3 has a bidirectional communication relationship with unmanned aerial vehicles 2 and 4; there is no communication relationship between unmanned aerial vehicles 1 and 3, nor between unmanned aerial vehicles 2 and 4; and unmanned aerial vehicle 0 has a one-way communication relationship with unmanned aerial vehicles 1 and 2, with unmanned aerial vehicle 0 as the information sender. This topology can be written compactly as the adjacency structure sketched below.
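The following encoding of the topology is an illustrative assumption, not part of the patent text:

    # Directed communication topology (arrow = sender -> receiver).
    links = {
        0: {1, 2},   # drone 0 only sends, to drones 1 and 2
        1: {2, 4},   # bidirectional pairs appear in both directions
        2: {1, 3},
        3: {2, 4},
        4: {1, 3},
    }

    def can_send(i, j):
        return j in links.get(i, set())

    assert can_send(0, 1) and not can_send(1, 0)       # one-way from drone 0
    assert can_send(1, 2) and can_send(2, 1)           # two-way between 1 and 2
    assert not can_send(1, 3) and not can_send(2, 4)   # no diagonal links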
Preferably, the camera is a monocular camera.
Further preferably, the angle of view of the monocular camera is 90°.
Preferably, the light color of the LED lamp can be set through driver software.
The safety radius is twice the radius of the circumcircle of the maximum outline of the unmanned aerial vehicle body (a worked spacing check is sketched below).
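As a worked example of the spacing rule of step 1, the positions and airframe dimensions below are assumed purely for illustration:

    import math

    def safety_radius(circumcircle_radius_m):
        # Safety radius = 2 x the circumcircle radius of the airframe outline.
        return 2.0 * circumcircle_radius_m

    def spacing_ok(p_a, p_b, r_a, r_b):
        """True if the initial actual distance between two drones exceeds
        the sum of their safety radii."""
        return math.dist(p_a, p_b) > safety_radius(r_a) + safety_radius(r_b)

    # Two drones with 0.4 m circumcircle radii placed 2.0 m apart:
    print(spacing_ok((0.0, 0.0), (2.0, 0.0), 0.4, 0.4))   # True: 2.0 > 1.6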
Compared with the prior art, the invention has the following beneficial effects:
according to the invention, the unmanned aerial vehicle LED lamp and the two-dimensional turntable camera are used for realizing the cooperative visual positioning of the unmanned aerial vehicle group at night, no additional equipment is required, no GPS, laser radar, ultrasonic radar and the like are required to be relied on, no external signal source is required to be relied on, and the unmanned aerial vehicle group is prevented from being interfered by the outside.
Drawings
Other features, objects and advantages of the present invention will become more apparent upon reading the detailed description of non-limiting embodiments, given with reference to the accompanying drawings, in which:
fig. 1 is a technical scheme diagram of an embodiment of the present invention.
Fig. 2 is a communication topology map according to an embodiment of the present invention.
Fig. 3 is a flowchart of the operation of an embodiment of the present invention.
Description of the embodiments
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the present invention, but are not intended to limit the invention in any way. It should be noted that variations and modifications could be made by those skilled in the art without departing from the inventive concept. These are all within the scope of the present invention.
The following is a further detailed description of the embodiments:
This embodiment provides a nighttime collaborative positioning method for an unmanned aerial vehicle group which, as shown in Fig. 1, mainly comprises 5 unmanned aerial vehicles. Each unmanned aerial vehicle carries a two-dimensional turntable monocular camera and an LED lamp; the visual angle of the monocular camera is 90 degrees, and the light color of the LED lamp can be set through driver software, so that the lamp serves as the marker for nighttime visual positioning.
The specific implementation flow chart is shown in fig. 3, and the steps are as follows:
step 1: unmanned cluster formation pre-takeoff arrangement:
Arrange the unmanned aerial vehicles on the take-off site according to the rectangular geometric formation shown in Fig. 1, ensuring that the initial actual distances among the vehicles, γ10, γ20, γ30, γ40, γ12, γ13, γ24 and γ34, are each larger than the sum of the corresponding safety radii, so that collision risks are avoided; meanwhile, ensure that the LED lamps of unmanned aerial vehicles 1 and 2 do not appear within the camera sight range of unmanned aerial vehicle 0. Here γ10 refers to the initial actual distance between unmanned aerial vehicle 1 and unmanned aerial vehicle 0, and so on.
Step 2: the unmanned cluster formation is powered on.
Step 3: set the corresponding light color of each unmanned aerial vehicle according to the scheme shown in Fig. 1.
The two unmanned aerial vehicles on one diagonal of the rectangle are given LED lamps of color I, the two vehicles on the other diagonal LED lamps of color II, and unmanned aerial vehicle 0 an LED lamp of color III, where colors I, II and III are all different. In this embodiment, unmanned aerial vehicles 1 and 4, located on one diagonal of the rectangle, carry yellow LED lamps; unmanned aerial vehicles 2 and 3, located on the other diagonal, carry green LED lamps; and unmanned aerial vehicle 0 carries a red LED lamp.
Note: the color setting of each unmanned aerial vehicle in this embodiment is not limited to the colors shown in Fig. 1; any other color assignment that follows the color distribution rule of Fig. 1 is equally possible.
Step 4: construct the automatic reference before take-off of the unmanned aerial vehicles.
Step 4.1: positioning reference construction:
step 4.1.1: set the angle α0 between the two-dimensional turntable camera axis of unmanned aerial vehicle 0 and that vehicle's heading to zero;
step 4.1.2: unmanned aerial vehicles 1 to 4 each automatically rotate their two-dimensional turntable cameras clockwise to search for the red LED light of unmanned aerial vehicle 0, until the red light sits at the horizontal center of the camera imaging plane (a minimal sketch of this search is given after step 4.1.6);
step 4.1.3: record and store the angle values α0 to α4 between the two-dimensional turntable camera axis of each of unmanned aerial vehicles 0 to 4 and the respective vehicle's heading at this moment, where α0 is the angle between unmanned aerial vehicle 0's camera axis and its heading, α1 that of unmanned aerial vehicle 1, α2 that of unmanned aerial vehicle 2, α3 that of unmanned aerial vehicle 3, and α4 that of unmanned aerial vehicle 4;
step 4.1.4: start the closed-loop angle-holding control program of the two-dimensional turntable cameras, so that the angle values α0 to α4 remain consistent with the values recorded and stored before take-off throughout the subsequent flight;
step 4.1.5: record and store the pixel coordinate values (x10, y10) to (x40, y40) of unmanned aerial vehicle 0's red LED light spot on the camera imaging planes of unmanned aerial vehicles 1 to 4 at this moment, where (x10, y10) is the pixel coordinate of the red LED spot on unmanned aerial vehicle 1's camera imaging plane, (x20, y20) that on unmanned aerial vehicle 2's, (x30, y30) that on unmanned aerial vehicle 3's, and (x40, y40) that on unmanned aerial vehicle 4's; meanwhile, record and store the pixel coordinates of the LED lights of unmanned aerial vehicles 1 and 2 in each other's camera imaging planes at this moment, namely (x21, y21) for vehicle 1's light in vehicle 2's camera and (x12, y12) for vehicle 2's light in vehicle 1's camera, and likewise (x43, y43) and (x34, y34) for unmanned aerial vehicles 3 and 4. It should be noted that, because unmanned aerial vehicle 0 lies on the line of sight between the two vehicles on each diagonal, light shielding makes the pixel coordinates (x14, y14) and (x41, y41) of the LED lights of unmanned aerial vehicles 1 and 4 in each other's camera imaging planes coincide with (x10, y10) and (x40, y40) respectively, and likewise makes (x23, y23) and (x32, y32) of unmanned aerial vehicles 2 and 3 coincide with (x20, y20) and (x30, y30) respectively.
step 4.1.6: obtain, by means of the directed communication topology shown in Fig. 2, the pixel coordinates (x21, y21) and (x41, y41) of unmanned aerial vehicle 1 in the imaging planes of unmanned aerial vehicles 2 and 4, thereby obtaining the co-location reference information of unmanned aerial vehicle 1 in the cluster, recorded as {(x10, y10), (x21, y21), (x41, y41)}; similarly, unmanned aerial vehicles 2 to 4 obtain their co-location reference information in the cluster, {(x20, y20), (x12, y12), (x32, y32)}, {(x30, y30), (x23, y23), (x43, y43)} and {(x40, y40), (x14, y14), (x34, y34)}, respectively. The co-location reference information then serves as the position control instruction for unmanned aerial vehicles 1 to 4. The directed communication topology describes the communication relationship between each unmanned aerial vehicle and the others in the formation; its direction represents the send/receive relationship of communication information among the vehicles. In Fig. 2 the arrows point to the information receivers: unmanned aerial vehicle 1 has bidirectional communication with unmanned aerial vehicles 2 and 4; unmanned aerial vehicle 3 has bidirectional communication with unmanned aerial vehicles 2 and 4; there is no communication link between unmanned aerial vehicles 1 and 3, nor between unmanned aerial vehicles 2 and 4; and unmanned aerial vehicle 0 has a one-way, send-only link to unmanned aerial vehicles 1 and 2.
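As an illustration of the camera search in step 4.1.2, the sketch below detects a red LED blob in HSV space and rotates the turntable until the blob is horizontally centered. The camera and turntable interfaces, the thresholds and the gains are assumptions, not the patented implementation:

    import cv2

    def red_led_centroid(bgr):
        """Centroid (x, y) of red LED pixels in a BGR frame, or None."""
        hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
        # Red wraps around the hue axis, so two bands are combined.
        mask = cv2.inRange(hsv, (0, 120, 120), (10, 255, 255)) | \
               cv2.inRange(hsv, (170, 120, 120), (180, 255, 255))
        m = cv2.moments(mask)
        if m["m00"] == 0:
            return None
        return m["m10"] / m["m00"], m["m01"] / m["m00"]

    def search_and_center(turntable, camera, width=640, tol_px=2.0):
        """Rotate clockwise until drone 0's red LED sits at the horizontal
        center of the imaging plane; return the final turntable angle."""
        while True:
            c = red_led_centroid(camera.grab())
            if c is None:
                turntable.rotate(1.0)            # keep searching clockwise
            elif abs(c[0] - width / 2) <= tol_px:
                return turntable.angle           # store as the angle alpha_i
            else:
                turntable.rotate(0.05 * (c[0] - width / 2))  # proportional centering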
Step 4.2: time reference construction: synchronize the communication clocks among the unmanned aerial vehicles, ensuring the consistency of cooperative positioning of the vehicles in the cluster.
Step 5: vertical take-off and other maneuvers of unmanned aerial vehicle 0 are controlled through control instructions from an external system. Meanwhile, if during flight the LED light of unmanned aerial vehicle 1 or unmanned aerial vehicle 2 enters the view-angle range of unmanned aerial vehicle 0's camera, yellow or green LED light pixels appear in the imaging plane of unmanned aerial vehicle 0's camera, whereupon unmanned aerial vehicle 0 sends an anti-collision warning instruction to unmanned aerial vehicle 1 or 2 through the communication topology, avoiding collision risk between the vehicles (a sketch of this trigger follows below).
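A sketch of the step-5 warning trigger, with assumed HSV bands for the embodiment's colors; all threshold values and the send interface are illustrative:

    import cv2

    # Illustrative HSV bands for the embodiment's colours (assumed values).
    BANDS = {"yellow": ((20, 120, 120), (35, 255, 255)),
             "green":  ((45, 120, 120), (75, 255, 255))}

    def intruding_colours(bgr, min_pixels=5):
        """Colours I/II visible in drone 0's camera frame."""
        hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
        return [name for name, (lo, hi) in BANDS.items()
                if cv2.countNonZero(cv2.inRange(hsv, lo, hi)) >= min_pixels]

    def check_and_warn(frame, send):
        for colour in intruding_colours(frame):
            target = 1 if colour == "yellow" else 2   # drone 1 yellow, drone 2 green
            send(target, "ANTI_COLLISION_WARNING")    # over the 0->1 / 0->2 links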
Step 6: as unmanned aerial vehicle 0 takes off vertically or performs other maneuvers, the real-time pixel coordinates of its red LED lamp in the camera imaging planes of unmanned aerial vehicles 1 to 4 change. The attitude controllers of unmanned aerial vehicles 1 to 4 compute the deviation of these coordinates from the pixel coordinate values (x10, y10) to (x40, y40) recorded and stored on the ground, and closed-loop control finally makes unmanned aerial vehicles 1 to 4 follow the maneuvers of unmanned aerial vehicle 0.
Supplementary note: during flight, a position change of any of unmanned aerial vehicles 1 to 4 causes a deviation between the real-time monitored LED light pixel coordinate values of each unmanned aerial vehicle and its co-location reference; the attitude closed-loop controller then drives this deviation to zero, or to within a given precision range, thereby realizing formation keeping of the unmanned aerial vehicle group (a minimal sketch of this loop follows).
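A minimal sketch of the formation-keeping loop described above; the gains, tolerance and the measurement/command interfaces are assumptions:

    # Hypothetical formation-keeping tick: drive the deviation between the
    # stored co-location reference and the live pixel measurements to zero.
    def formation_error(reference, measured):
        """Aggregate pixel deviation, e.g. for drone 1 the reference is
        [(x10, y10), (x21, y21), (x41, y41)]."""
        ex = sum(r[0] - m[0] for r, m in zip(reference, measured))
        ey = sum(r[1] - m[1] for r, m in zip(reference, measured))
        return ex, ey

    def hold_formation(reference, measure, command, kp=0.001, tol=1.0):
        """One control tick of the attitude closed-loop controller."""
        ex, ey = formation_error(reference, measure())
        if abs(ex) > tol or abs(ey) > tol:
            command(kp * ex, kp * ey)   # steer back toward the reference
        else:
            command(0.0, 0.0)           # deviation within the precision range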
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
The foregoing is merely an embodiment of the present invention. Specific structures and characteristics that are common knowledge in the art are not described here, since a person of ordinary skill in the art knows the prior art as of the application date or the priority date, can apply the conventional experimental means of that date, and is able, in the light of this application, to complete and implement this embodiment; some typical known structures or known methods should therefore not be an obstacle to implementing this application. It should be noted that modifications and improvements made without departing from the structure of the invention, and which do not affect the effect of its implementation or the utility of the patent, also fall within the protection scope of the invention. The protection scope of this application is subject to the content of the claims, and the description of the specific embodiments in the specification may be used to interpret the content of the claims.

Claims (6)

1. A night co-location method for an unmanned aerial vehicle group, characterized in that the unmanned aerial vehicle group comprises 5 unmanned aerial vehicles, each comprising a two-dimensional turntable camera and an LED lamp, and that the night co-location method comprises the following steps:
step 1: pre-take-off arrangement of the unmanned cluster formation.
Arrange unmanned aerial vehicles 1, 2, 3 and 4 of the group in sequence in a rectangular formation on the take-off field, and place unmanned aerial vehicle 0 at the center of the rectangle, i.e. at the intersection of its diagonals. Ensure that the initial actual distance between adjacent vehicles is larger than the sum of their safety radii, and that the LED lamps of unmanned aerial vehicles 1 and 2, located in front of unmanned aerial vehicle 0, do not appear within the sight range of unmanned aerial vehicle 0's camera;
step 2: power on the unmanned cluster formation;
step 3: set the light color of each unmanned aerial vehicle's LED lamp;
the LED lamps of the two unmanned aerial vehicles on one diagonal of the rectangle are set to color I, those of the two vehicles on the other diagonal to color II, and the LED lamp of unmanned aerial vehicle 0 to color III, where colors I, II and III are all different;
step 4: automatic reference construction before take-off of the unmanned aerial vehicles, comprising positioning reference construction and time reference construction;
step 5: take-off of unmanned aerial vehicle 0 is controlled through a control instruction from an external system; meanwhile, if during flight the LED light of unmanned aerial vehicle 1 or unmanned aerial vehicle 2 enters the view-angle range of unmanned aerial vehicle 0's camera, i.e. LED light pixels of color I or color II appear in the imaging plane of unmanned aerial vehicle 0's camera, unmanned aerial vehicle 0 sends an anti-collision warning instruction to the vehicle concerned through a communication topology;
step 6: as unmanned aerial vehicle 0 takes off, the real-time pixel coordinates of its color-III LED lamp in the camera imaging planes of unmanned aerial vehicles 1 to 4 change; the attitude controllers of unmanned aerial vehicles 1 to 4 compute the deviation of these coordinates from the pixel coordinate values (x10, y10) to (x40, y40) recorded and stored on the ground, and closed-loop control finally makes unmanned aerial vehicles 1 to 4 follow the take-off maneuver of unmanned aerial vehicle 0;
the specific construction process of the positioning reference comprises the following steps:
step 4.1.1: set the angle α0 between the two-dimensional turntable camera axis of unmanned aerial vehicle 0 and that vehicle's heading to zero;
step 4.1.2: unmanned aerial vehicles 1 to 4 each automatically rotate their two-dimensional turntable cameras clockwise to search for the color-III LED light of unmanned aerial vehicle 0, until that light sits at the horizontal center of the camera imaging plane;
step 4.1.3: record and store the angle values α0 to α4 between the two-dimensional turntable camera axis of each of unmanned aerial vehicles 0 to 4 and the respective vehicle's heading at this moment;
step 4.1.4: start the closed-loop angle-holding control program of the two-dimensional turntable cameras, so that the angle values α0 to α4 remain consistent with the values recorded and stored before take-off throughout the subsequent flight;
step 4.1.5: record and store the pixel coordinate values (x10, y10) to (x40, y40) of unmanned aerial vehicle 0's color-III LED light spot on the camera imaging planes of unmanned aerial vehicles 1 to 4 at this moment; meanwhile, record and store the pixel coordinates of the LED lights of unmanned aerial vehicles 1 and 2 in each other's camera imaging planes, namely (x21, y21) for vehicle 1's light in vehicle 2's camera and (x12, y12) for vehicle 2's light in vehicle 1's camera, and likewise (x43, y43) and (x34, y34) for unmanned aerial vehicles 3 and 4;
step 4.1.6: obtain, by means of a directed communication topology, the pixel coordinates (x21, y21) and (x41, y41) of unmanned aerial vehicle 1 in the imaging planes of unmanned aerial vehicles 2 and 4, thereby obtaining the co-location reference information of unmanned aerial vehicle 1 in the cluster, recorded as {(x10, y10), (x21, y21), (x41, y41)}; similarly, unmanned aerial vehicles 2 to 4 obtain their co-location reference information in the cluster, {(x20, y20), (x12, y12), (x32, y32)}, {(x30, y30), (x23, y23), (x43, y43)} and {(x40, y40), (x14, y14), (x34, y34)}, respectively; the co-location reference information then serves as the position control instruction for unmanned aerial vehicles 1 to 4;
the time reference construction comprises: performing communication clock synchronization among the unmanned aerial vehicles.
2. The night co-location method for an unmanned aerial vehicle group according to claim 1, wherein during flight a position change of any of unmanned aerial vehicles 1 to 4 causes a deviation between the real-time monitored LED light pixel coordinate values of each unmanned aerial vehicle and its co-location reference, whereby formation keeping of the unmanned aerial vehicle group is realized through the attitude closed-loop controller.
3. The night co-location method for an unmanned aerial vehicle group according to claim 1, wherein the directed communication topology in step 4.1.6 is specifically: unmanned aerial vehicle 1 has a bidirectional communication relationship with unmanned aerial vehicles 2 and 4; unmanned aerial vehicle 3 has a bidirectional communication relationship with unmanned aerial vehicles 2 and 4; there is no communication relationship between unmanned aerial vehicles 1 and 3, nor between unmanned aerial vehicles 2 and 4; and unmanned aerial vehicle 0 has a one-way communication relationship with unmanned aerial vehicles 1 and 2, with unmanned aerial vehicle 0 as the information sender.
4. The night co-location method for an unmanned aerial vehicle group according to claim 1, wherein the camera is a monocular camera.
5. The night co-location method for an unmanned aerial vehicle group according to claim 4, wherein the view angle of the monocular camera is 90°.
6. The night co-location method for an unmanned aerial vehicle group according to claim 1, wherein the light color of the LED lamps can be set through driver software.
CN202310106488.5A 2023-02-13 2023-02-13 Night co-location method for unmanned aerial vehicle group Active CN115826622B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202310106488.5A CN115826622B (en) 2023-02-13 2023-02-13 Night co-location method for unmanned aerial vehicle group
US18/403,503 US12085962B2 (en) 2023-02-13 2024-01-03 Nighttime cooperative positioning method based on unmanned aerial vehicle group

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310106488.5A CN115826622B (en) 2023-02-13 2023-02-13 Night co-location method for unmanned aerial vehicle group

Publications (2)

Publication Number Publication Date
CN115826622A CN115826622A (en) 2023-03-21
CN115826622B (en) 2023-04-28

Family

Family ID: 85521114

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310106488.5A Active CN115826622B (en) 2023-02-13 2023-02-13 Night co-location method for unmanned aerial vehicle group

Country Status (2)

Country Link
US (1) US12085962B2 (en)
CN (1) CN115826622B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116540784B (en) * 2023-06-28 2023-09-19 西北工业大学 Unmanned system air-ground collaborative navigation and obstacle avoidance method based on vision


Family Cites Families (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8862285B2 (en) * 2013-02-15 2014-10-14 Disney Enterprises, Inc. Aerial display system with floating pixels
WO2016143256A1 (en) * 2015-03-12 2016-09-15 パナソニックIpマネジメント株式会社 Flying body
EP3168704B1 (en) * 2015-11-12 2021-02-24 Hexagon Technology Center GmbH 3d surveying of a surface by mobile vehicles
US10902734B2 (en) * 2015-11-17 2021-01-26 SZ DJI Technology Co., Ltd. Systems and methods for managing flight-restriction regions
US10657827B2 (en) * 2015-12-09 2020-05-19 Dronesense Llc Drone flight operations
US9734684B2 (en) * 2016-01-04 2017-08-15 International Business Machines Corporation Perimeter monitoring using autonomous drones
US10086956B2 (en) * 2016-01-27 2018-10-02 Amazon Technologies, Inc. Light adjustment control for cameras of an aerial vehicle
US10073454B2 (en) * 2016-03-17 2018-09-11 Northrop Grumman Systems Corporation Machine vision enabled swarm guidance technology
AT16608U1 (en) * 2016-07-11 2020-02-15 Ars Electronica Linz Gmbh & Co Kg Unmanned aerial vehicle and system for generating an image in airspace
US20180067502A1 (en) * 2016-08-10 2018-03-08 Richard Chi-Hsueh Drone security system
US10114384B2 (en) * 2016-09-13 2018-10-30 Arrowonics Technologies Ltd. Formation flight path coordination of unmanned aerial vehicles
CN115649439A (en) * 2017-02-24 2023-01-31 深圳市大疆创新科技有限公司 Multi-gimbal assembly
US11238281B1 (en) * 2017-02-27 2022-02-01 Amazon Technologies, Inc. Light source detection in field of view
CN108052110A (en) * 2017-09-25 2018-05-18 南京航空航天大学 UAV Formation Flight method and system based on binocular vision
US10386842B2 (en) * 2017-11-13 2019-08-20 Intel IP Corporation Unmanned aerial vehicle light show
US20190176987A1 (en) * 2017-12-13 2019-06-13 James E. Beecham System and method for fire suppression via artificial intelligence
US10674719B2 (en) * 2018-02-12 2020-06-09 International Business Machines Corporation Wild-life surveillance and protection
EP3752881A1 (en) * 2018-02-13 2020-12-23 Ars Electronica Linz GmbH & Co KG System for presenting and identifying markers of a changeable geometric image
CA3096122A1 (en) * 2018-04-09 2019-10-17 PreNav, Inc. Unmanned aerial vehicles with stereoscopic imaging, and associated systems and methods
US11697497B2 (en) * 2018-10-03 2023-07-11 Sarcos Corp. Aerial vehicles having countermeasures deployed from a platform for neutralizing target aerial vehicles
US11192646B2 (en) * 2018-10-03 2021-12-07 Sarcos Corp. Anchored aerial countermeasures for rapid deployment and neutralizing of target aerial vehicles
US20210403159A1 (en) * 2018-10-18 2021-12-30 Telefonaktiebolaget Lm Ericsson (Publ) Formation Flight of Unmanned Aerial Vehicles
IL263189B2 (en) * 2018-11-21 2023-03-01 Elta Systems Ltd Flexible array antenna and methods of operating it
CN110119158B (en) * 2019-05-13 2020-08-18 浙江大学 Multi-machine cooperative formation control system and method for high subsonic speed unmanned aerial vehicle
CN110426029B (en) * 2019-07-31 2022-03-25 南京航空航天大学 Dynamic mutual observation online modeling method for unmanned aerial vehicle swarm cooperative navigation
EP4121955A1 (en) * 2020-03-20 2023-01-25 Sony Group Corporation Unmanned aerial vehicle and method for an unmanned aerial vehicle for generating a temporary flight-plan for a region
EP4097555A2 (en) * 2020-03-31 2022-12-07 Skygrid, LLC Coordinating an aerial search among unmanned aerial vehicles
CN113766416A (en) * 2020-12-02 2021-12-07 北京京东乾石科技有限公司 Unmanned aerial vehicle positioning method and device and storage medium
CN112631329A (en) * 2020-12-18 2021-04-09 北京泊松技术有限公司 Unmanned aerial vehicle formation cooperative control system and method based on optical coding LED navigation lamp
US11861896B1 (en) * 2021-03-31 2024-01-02 Skydio, Inc. Autonomous aerial navigation in low-light and no-light conditions
US20230058405A1 (en) * 2021-08-20 2023-02-23 Sony Group Corporation Unmanned aerial vehicle (uav) swarm control
CN113821052A (en) * 2021-09-22 2021-12-21 一飞智控(天津)科技有限公司 Cluster unmanned aerial vehicle cooperative target positioning method and system and cooperative target positioning terminal
CN115097846B (en) * 2022-07-20 2023-04-25 北京交通大学 Unmanned vehicle and unmanned vehicle cooperative landing method and system
CN115661204B (en) * 2022-11-01 2023-11-10 中国人民解放军军事科学院国防科技创新研究院 Collaborative searching and tracking positioning method for moving target by unmanned aerial vehicle cluster
CN115651204B (en) 2022-12-27 2023-03-28 广州鹿山新材料股份有限公司 N-carbonyl-bisamide-polyolefin compound, and preparation method and application thereof

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107831783A (en) * 2017-11-10 2018-03-23 南昌航空大学 Ground station control system supporting autonomous flight of multiple unmanned aerial vehicles
WO2022247597A1 (en) * 2021-05-25 2022-12-01 北京天华航宇科技有限公司 PAPI flight inspection method and system based on unmanned aerial vehicle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
UAV cluster formation control demonstration and verification system; 朱创创; 梁晓龙; 张佳强; 何吕龙; 刘流; Journal of Beijing University of Aeronautics and Astronautics (北京航空航天大学学报), No. 08 *

Also Published As

Publication number Publication date
US12085962B2 (en) 2024-09-10
US20240272650A1 (en) 2024-08-15
CN115826622A (en) 2023-03-21

Similar Documents

Publication Publication Date Title
CN107144281B (en) Unmanned aerial vehicle indoor positioning system and positioning method based on cooperative targets and monocular vision
CN108762291B (en) Method and system for discovering and tracking remote controller of black flying unmanned aerial vehicle
CA2803332C (en) Personal electronic target vision system, device and method
CN109799842B (en) Multi-unmanned aerial vehicle sequence flight control method
EP3674657A1 (en) Construction and update of elevation maps
US12085962B2 (en) Nighttime cooperative positioning method based on unmanned aerial vehicle group
CN103413463B Data fusion implementation method for ADS-B targets and radar targets
CN107247458A UAV video image target positioning system, positioning method and gimbal control method
Li et al. Design and implementation of UAV intelligent aerial photography system
CN105353772A (en) Visual servo control method for unmanned aerial vehicle maneuvering target locating and tracking
CN106774405B (en) Orchard plant protection drone obstacle avoidance apparatus and method based on three-level avoidance mechanism
CN108008738A (en) Target Tracking System under being cooperateed with based on unmanned plane with unmanned vehicle
CN110239677A Method for an unmanned aerial vehicle to autonomously identify a target and land on a moving unmanned boat
CN113359786A (en) Multi-rotor unmanned aerial vehicle accurate landing guiding method integrating flying heights
CN107783119A Decision fusion method applied in an obstacle avoidance system
CN116540784B (en) Unmanned system air-ground collaborative navigation and obstacle avoidance method based on vision
CN109270949A (en) A kind of UAV Flight Control System
CN114923477A (en) Multi-dimensional space-ground collaborative map building system and method based on vision and laser SLAM technology
CN207274661U (en) Unmanned vehicle context aware systems
CN116558497A (en) Night relative positioning method for unmanned aerial vehicle group
CN110907945A (en) Positioning method considering indoor and outdoor flight of unmanned aerial vehicle
CN214705387U (en) Aircraft take-off and landing identification projection display system
CN115755099B (en) Distributed navigation decoy system and method for anti-interference unmanned aerial vehicle countering
CN116027804B (en) Unmanned plane ground photoelectric measurement and control guiding device and guiding method
CN109799522A Multi-field-of-view multi-target automatic rapid slewing method

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant