CN112949292A - Cluster unmanned aerial vehicle return data processing method and device, equipment and storage medium - Google Patents


Info

Publication number
CN112949292A
Authority
CN
China
Prior art keywords
data
local data
overlapping
fusion
unmanned aerial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110084275.8A
Other languages
Chinese (zh)
Other versions
CN112949292B (en)
Inventor
刘薇 (Liu Wei)
朱俊锋 (Zhu Junfeng)
刘松林 (Liu Songlin)
Current Assignee
Beijing Zhongce Zhihui Technology Co ltd
61540 Troops of PLA
Original Assignee
Beijing Zhongce Zhihui Technology Co ltd
61540 Troops of PLA
Priority date
Filing date
Publication date
Application filed by Beijing Zhongce Zhihui Technology Co ltd, 61540 Troops of PLA filed Critical Beijing Zhongce Zhihui Technology Co ltd
Priority to CN202110084275.8A priority Critical patent/CN112949292B/en
Publication of CN112949292A publication Critical patent/CN112949292A/en
Application granted granted Critical
Publication of CN112949292B publication Critical patent/CN112949292B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00 Handling natural language data
    • G06F 40/20 Natural language analysis
    • G06F 40/237 Lexical tools
    • G06F 40/242 Dictionaries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/25 Fusion techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00 Handling natural language data
    • G06F 40/20 Natural language analysis
    • G06F 40/279 Recognition of textual entities

Abstract

The application relates to a cluster unmanned aerial vehicle return data processing method, which comprises the following steps: reading the single-machine local data currently transmitted back to a ground processor by each unmanned aerial vehicle in the unmanned aerial vehicle cluster, and performing overlap detection on the single-machine local data; when data overlap is detected in the single-machine local data, performing fusion processing on the overlapping single-machine local data to obtain fused data; and returning the fused data to the corresponding ground processors, and updating the overlapping single-machine local data to the fused data. The method avoids redundant analysis and calculation of the overlapping data, thereby improving data processing efficiency, and eliminates interference from redundant data, so that the finally obtained overall regional data are more accurate.

Description

Cluster unmanned aerial vehicle return data processing method and device, equipment and storage medium
Technical Field
The application relates to the technical field of unmanned aerial vehicle data processing, in particular to a cluster unmanned aerial vehicle return data processing method, device, equipment and storage medium.
Background
An intelligent cluster of unmanned aerial vehicles is a group composed of a plurality of unmanned aerial vehicles gathered in a certain space, which develops complex and ordered group behaviors through a self-organizing cooperation mechanism whose main characteristic is stimulus-driven work, and which has the characteristics of distribution, autonomy, and simplicity. Such an intelligent cluster is highly flexible and operates in a decentralized mode.
The cluster of unmanned aerial vehicles is a key means of completing large-area surveying and mapping operations and performing overall regional network adjustment on wide-range data. In the related art, there is considerable research on spatial triangulation processing for a single unmanned aerial vehicle or for small amounts of unmanned aerial vehicle data, but relatively little on fusing the results of the individual regional free networks in an intelligent unmanned aerial vehicle cluster to form a large-region overall free network. When such fusion is performed, an immediate receive-and-process strategy is generally adopted to ensure processing timeliness: while the unmanned aerial vehicle cluster works, each unmanned aerial vehicle continuously transmits data back to its corresponding ground processing unit. The data received by each ground processing unit (namely, the single-machine return data) are read in real time and gradually fused into complete regional data, thereby constructing the overall regional data. However, the single-machine return data that a large number of unmanned aerial vehicles send to the ground station through the data transmission network often contain a large amount of redundancy, which affects the accuracy of the finally fused overall regional data.
Disclosure of Invention
In view of this, the application provides a cluster unmanned aerial vehicle return data processing method, which can effectively improve the accuracy of the overall regional data.
According to an aspect of the present application, a method for processing return data of a cluster unmanned aerial vehicle is provided, including:
reading single machine local data currently transmitted back to a ground processor by each unmanned aerial vehicle in the unmanned aerial vehicle cluster, and performing overlapping detection on each single machine local data;
when the data overlapping exists in the single machine local data, performing fusion processing on the single machine local data with the overlapping to obtain fusion data;
and returning the fusion data to the corresponding ground processor, and updating the single machine local data with the overlapping into the fusion data.
In one possible implementation manner, the performing overlap detection on each of the single-machine local data includes:
extracting corresponding local features from the single machine local data according to the scene environment acquired by the unmanned aerial vehicle cluster;
quantizing the extracted local features by adopting a visual dictionary construction method to obtain representative feature vectors, and defining the representative feature vectors as visual words;
acquiring visual words of fused scene data, and matching the obtained representative feature vectors with the visual words of the fused scene data by adopting a nearest neighbor search algorithm so as to detect whether the single-machine local data has an overlapping region;
wherein the fused scene data are: the fused data obtained by fusing the single-machine local data whose overlapping areas were detected in the previous round of overlap detection, relative to the current round of overlap detection of the single-machine local data.
In one possible implementation manner, when detecting that there is data overlap in the single-machine local data, the method further includes:
judging whether the degree of overlap of the overlapping area reaches a preset degree of overlap;
and when the preset degree of overlap is reached, performing fusion processing on the overlapping single-machine local data.
In a possible implementation manner, the value of the preset degree of overlap is greater than or equal to 15%.
In a possible implementation manner, performing fusion processing on the single-machine local data with the overlapping to obtain fused data includes:
obtaining similarity transformation parameters of the single-machine local data with overlapping by using a relative orientation technology;
and performing data fusion processing on the single machine local data with the overlapping according to the obtained similarity transformation parameters.
In one possible implementation manner, when it is detected that there is an overlap in the single-machine local data, the method further includes: and stopping the ground processor corresponding to the overlapped single machine local data from updating the local data.
According to another aspect of the present application, there is also provided a cluster unmanned aerial vehicle backhaul data processing apparatus, including: the device comprises a data reading module, an overlap detection module, a data fusion module and a data updating module;
the data reading module is configured to read the stand-alone local data currently transmitted back to the ground processor by each unmanned aerial vehicle in the unmanned aerial vehicle cluster;
the overlap detection module is configured to perform overlap detection on each of the single-machine local data;
the data fusion module is configured to perform fusion processing on the single machine local data with the overlapping to obtain fusion data when the overlapping detection module detects that the single machine local data has data overlapping;
and the data updating module is configured to transmit the fusion data back to the corresponding ground processor, and update the single machine local data with the overlapping into the fusion data.
In one possible implementation, the overlap detection module includes a local feature extraction sub-module, a visual word construction sub-module, and a visual word matching sub-module;
the local feature extraction submodule is configured to extract corresponding local features from each single machine local data according to the scene environment acquired by the unmanned aerial vehicle cluster;
the visual word construction submodule is configured to quantize the extracted local features by adopting a visual dictionary construction method to obtain representative feature vectors, and define the representative feature vectors as visual words;
the visual word matching submodule is configured to acquire visual words of fused scene data, and match the acquired representative feature vectors with the visual words of the fused scene data by adopting a nearest neighbor search algorithm so as to detect whether the single-machine local data has an overlapping region;
the fused scene data is overlapping area data detected in the previous overlapping mode when the single-machine local data are detected in the overlapping mode currently.
According to another aspect of the present application, there is also provided a cluster unmanned aerial vehicle backhaul data processing apparatus, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to execute the executable instructions to implement any of the methods described above.
According to an aspect of the application, there is also provided a non-transitory computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the method of any of the preceding.
Overlap detection is performed on the single-machine local data returned by each unmanned aerial vehicle in the unmanned aerial vehicle cluster. When data overlap exists in the single-machine local data, the overlapping single-machine local data are fused to obtain corresponding fused data, and the fused data are then returned to the corresponding ground processors so that the overlapping single-machine local data are updated to the fused data. Therefore, when the data returned by the unmanned aerial vehicles are subsequently fused as a whole, fusion calculation need not be repeated for the overlapping portions. This saves redundant analysis and calculation of the overlapping data, improves data processing efficiency, and avoids interference from redundant data, so that the finally obtained overall regional data are more accurate.
Other features and aspects of the present application will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments, features, and aspects of the application and, together with the description, serve to explain the principles of the application.
Fig. 1 shows a flowchart of a method for processing backhaul data of a cluster drone according to an embodiment of the present application;
fig. 2 shows a flowchart of a method for processing backhaul data of a cluster drone according to another embodiment of the present application;
fig. 3 shows a block diagram of a structure of a cluster drone backhaul data processing device according to an embodiment of the present application;
fig. 4 shows a block diagram of a structure of a cluster drone backhaul data processing device according to an embodiment of the present application.
Detailed Description
Various exemplary embodiments, features and aspects of the present application will be described in detail below with reference to the accompanying drawings. In the drawings, like reference numbers can indicate functionally identical or similar elements. While the various aspects of the embodiments are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used herein to mean "serving as an example, embodiment, or illustration." Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
Furthermore, in the following detailed description, numerous specific details are set forth in order to provide a better understanding of the present application. It will be understood by those skilled in the art that the present application may be practiced without some of these specific details. In some instances, methods, means, elements and circuits that are well known to those skilled in the art have not been described in detail so as not to obscure the present application.
Fig. 1 shows a flowchart of a method for processing backhaul data of a cluster drone according to an embodiment of the present application. As shown in fig. 1, the method includes: and S100, reading the local data of the single unmanned aerial vehicles currently returned to the ground processor in the unmanned aerial vehicle cluster, and performing overlapping detection on the local data of the single unmanned aerial vehicles. Here, it should be noted that the stand-alone local data is data acquired by the unmanned aerial vehicle, and may be image data or video data.
Meanwhile, as can be understood by those skilled in the art, an unmanned aerial vehicle cluster refers to a cluster in which a plurality of individual unmanned aerial vehicles are arranged in a certain array. Each individual unmanned aerial vehicle is provided with a corresponding ground processor, and each continuously transmits the image or video data it acquires to that ground processor. After receiving the data currently transmitted by an individual unmanned aerial vehicle, the ground processor fuses the currently received data with the previously received data. In the method of the embodiment of the application, the read local data of each single machine therefore refers to the data obtained by each ground processor fusing its currently received data with its previously received data.
The scene data collected by each unmanned aerial vehicle stand-alone generally has the condition of partial scene overlapping. Therefore, in the method of the embodiment of the present application, after each drone returns scene data (i.e., single machine local data) collected currently to the corresponding ground processor, the single machine local data returned by each drone single machine is read by the ground processor, and overlap detection is performed on each single machine local data.
When it is detected that there is data overlap in the single-machine local data, the single-machine local data having the data overlap may be subjected to fusion processing in step S200 to obtain fused data. Here, the presence of data overlap in the single-machine local data means that a certain single-machine local data overlaps with at least one of the other single-machine local data.
For example, the cluster includes unmanned aerial vehicle A, unmanned aerial vehicle B, unmanned aerial vehicle C, and unmanned aerial vehicle D. Each is configured with a corresponding ground processor, namely ground processor A, ground processor B, ground processor C, and ground processor D. Correspondingly, the single-machine local data read by the ground processor of each unmanned aerial vehicle are: local data A obtained by ground processor A fusing the data currently transmitted by unmanned aerial vehicle A, local data B obtained by ground processor B fusing the data currently transmitted by unmanned aerial vehicle B, local data C obtained by ground processor C fusing the data currently transmitted by unmanned aerial vehicle C, and local data D obtained by ground processor D fusing the data currently transmitted by unmanned aerial vehicle D.
When overlap detection is performed on these four groups of data and, for example, data overlap is detected in local data A, this indicates that local data A overlaps at least one of local data B, local data C, and local data D. For example, local data A and local data B may have an overlapping area, or local data A may overlap in area with both local data B and local data C.
When the situation that data overlap exists in the single machine local data is detected, the single machine local data with the overlap is subjected to fusion processing through the steps to obtain corresponding fusion data, then the fusion data is transmitted back to the corresponding ground processor through the step S300, and the single machine local data with the overlap is updated into the fusion data.
Therefore, according to the cluster unmanned aerial vehicle return data processing method, the single machine local data returned by each unmanned aerial vehicle in the unmanned aerial vehicle cluster are subjected to overlap detection, when data overlap exists in the single machine local data, corresponding fusion data are obtained by performing fusion processing on the single machine local data with the overlap, then the fusion data are returned to the corresponding ground processor, the single machine local data with the overlap are updated to be the fusion data, and therefore when the data returned by the unmanned aerial vehicles are subjected to overall fusion in the follow-up process, fusion calculation does not need to be performed repeatedly on the data of the overlap part, redundant analysis calculation of the overlap data is effectively saved, and finally data processing efficiency is effectively improved.
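The read, detect, fuse, and update steps (S100 to S300) described above can be sketched as follows. This is a minimal illustration under simplifying assumptions: each ground processor's single-machine local data is modelled as a set of scene-tile ids, and set intersection and union stand in for the photogrammetric overlap detection and fusion; the name `process_returned_data` and the data model are illustrative, not the patent's implementation.

```python
# Hypothetical sketch of steps S100-S300; set operations stand in for the
# real overlap detection and fusion described in the patent.

def process_returned_data(processors):
    """One pass over the ground processors: detect overlap, fuse, write back."""
    for i in range(len(processors)):
        for j in range(i + 1, len(processors)):
            a, b = processors[i]["data"], processors[j]["data"]
            overlap = a & b                      # S100: overlap detection
            if overlap:
                fused = a | b                    # S200: fuse the overlapping data
                processors[i]["data"] = fused    # S300: update both processors'
                processors[j]["data"] = fused    #       local data to the fused data
    return processors

# Two drones whose coverage overlaps on tiles 3 and 4:
procs = process_returned_data([{"data": {1, 2, 3, 4}}, {"data": {3, 4, 5, 6}}])
print(procs[0]["data"] == procs[1]["data"] == {1, 2, 3, 4, 5, 6})  # True
```

Once both processors hold the same fused data, the subsequent whole-region fusion does not re-process the overlapping tiles, which is the redundancy saving the method claims.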
When the overlap detection is performed on the local data of each single machine, the method can be realized in a scene matching mode based on a bag-of-words method.
Namely, firstly, according to scene image data collected by an unmanned aerial vehicle cluster, corresponding local features are extracted from each single machine local data. Here, it should be noted that the scene image data acquired by the drone cluster refers to actual image data acquired by the current drone cluster. Such as: when a certain area in a city is mapped, the scene image data acquired by the unmanned aerial vehicle cluster refers to actual image data of the certain area in the city. Meanwhile, scene image data acquired by the unmanned aerial vehicle cluster can be acquired through conventional shooting technology in the field, and the scene image data is not repeated here. In addition, it should be noted that, the local feature extraction from each single local data can also be implemented by using an image feature extraction algorithm that is conventional in the art, and the present invention is not limited thereto.
Then, a visual dictionary construction method is adopted to quantize each extracted local feature to obtain representative feature vectors, and each representative feature vector is defined as a visual word. Here, it should be noted that the visual dictionary construction method is mainly a method for repetitive scene detection. Meanwhile, in a possible implementation manner, after the extracted local features are quantized, redundancy removal processing can be performed on the quantized local features to obtain representative feature vectors. And after the representative feature vector of each single machine local data is obtained, defining each representative feature vector as a visual word so as to be convenient for matching repeated scene data subsequently.
And after the representative feature vector of each single machine local data is obtained, the visual words of the fused scene data are obtained in the local feature extraction mode. And further, based on the obtained visual words of the fused scene data, matching the obtained representative feature vectors with the visual words of the fused scene data by adopting a nearest neighbor search algorithm so as to detect whether the local data of each single machine has an overlapping area.
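The bag-of-words matching described above can be sketched as follows. This is a deliberately simplified assumption-laden illustration: local features are modelled as 2-D points, the "visual dictionary" as a fixed list of centroids, and a visual word as the index of the nearest centroid; in practice the features would be high-dimensional image descriptors and the dictionary would be learned, e.g. by clustering.

```python
# Hypothetical bag-of-words sketch: quantize features to visual words via
# nearest-neighbour search, then test for shared words between a single
# machine's local data and the fused scene data.
import math

DICTIONARY = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]  # representative feature vectors

def to_visual_word(feature):
    """Quantize one local feature to the index of its nearest dictionary entry."""
    return min(range(len(DICTIONARY)),
               key=lambda k: math.dist(feature, DICTIONARY[k]))

def visual_words(features):
    return {to_visual_word(f) for f in features}

def overlaps(local_features, fused_scene_features):
    """Nearest-neighbour matching of visual words: any shared word is
    treated here as evidence of an overlapping region."""
    return bool(visual_words(local_features) & visual_words(fused_scene_features))

# Features near centroids 0 and 1 vs. features near centroids 1 and 2
# share visual word 1, so an overlap is reported:
print(overlaps([(0.1, 0.2), (9.8, 0.1)], [(10.2, -0.1), (0.3, 9.9)]))  # True
```

A real system would require many matched words (or geometric verification) before declaring an overlap; the single-shared-word rule here is only for brevity.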
Here, it should be explained that the fused scene data are: the fused data obtained by performing data fusion on the single-machine local data whose overlapping areas were detected in the previous round of overlap detection, relative to the current round of overlap detection of the single-machine local data.
That is, in the method of the embodiment of the present application, the process of performing overlap detection on each single-machine local data is iterative: the fused scene data used as reference in the current round of overlap detection are the fused data obtained by fusing the single-machine local data whose overlapping areas were detected in the previous round.
In the first round of overlap detection, first local data (such as local data A, currently fused by ground processor A corresponding to unmanned aerial vehicle A) and second local data (such as local data B, currently fused by ground processor B corresponding to unmanned aerial vehicle B) are selected from the read single-machine local data. The representative feature vectors of the first local data and of the second local data are then extracted in the manner described above. After the representative feature vectors of the first and second local data are defined as visual words, a nearest neighbor search algorithm is used to match the visual words of the first local data against the visual words of the second local data, so as to detect whether an overlapping area exists between the first local data and the second local data.
After it is detected that an overlapping area exists between the first local data and the second local data, the detected overlapping area is recorded, and overlap detection is then performed in sequence on the single-machine local data (namely, local data B, local data C, and local data D) transmitted by the unmanned aerial vehicles other than that of the first local data.
After overlap detection has been performed on each single-machine local data, the overlap detection result is obtained, namely, which of the single-machine local data have overlapping areas. The result is recorded once detection is finished, and data fusion is then performed, based on the recorded result, on the single-machine local data detected as having overlapping areas, yielding the corresponding fused data.
In the next round of overlap detection, the fused data obtained by fusing the previously detected overlap result serve as the comparison reference for the visual word matching of the current round. Meanwhile, it should be noted that, when overlap detection is performed on the read single-machine local data, the data may be combined pairwise so that overlap detection is performed on every pair of single-machine local data; a permutation-and-combination scheme may also be adopted, as long as overlap detection of all the single-machine local data is completed.
In addition, in the above embodiment, the two single local data (i.e., the first local data and the second local data) selected during the overlap detection may be selected at will or sequentially, and the selection manner is not particularly limited. In a possible implementation, the selection may be performed according to an arrangement order of the drones in the drone cluster.
Furthermore, after the overlapping detection of each single machine local data is completed, the fusion of the single machine local data with the overlapping can be performed to achieve the purpose of redundancy removal.
In one possible implementation manner, when detecting that there is data overlap in the single-machine local data, the method further includes: judging whether the degree of overlap of the overlapping area reaches a preset degree of overlap, and performing fusion processing on the overlapping single-machine local data when the preset degree of overlap is reached. By judging the degree of overlap and performing fusion only after it reaches the preset value, loss of data is avoided; data integrity is ensured while redundancy is removed, and the finally fused overall regional data are more accurate.
Here, it should be explained that the degree of overlap refers to the ratio between the overlapping area and the two single-machine local data currently under detection. The ratio may be an area ratio or a pixel ratio. Specifically, the degree of overlap is the minimum of the ratios of the overlapping area to each of the two single-machine local data. For example, when the pixel ratio is used as the measure, the ratio of the pixels of the overlapping area to the pixels of the first local data and the ratio of the pixels of the overlapping area to the pixels of the second local data are calculated separately, and the smaller of the two is taken as the degree of overlap of the overlapping area. This smaller ratio is then compared with the preset degree of overlap to judge whether the preset degree of overlap is reached.
The value of the preset degree of overlap can be set flexibly according to the actual situation. In the method of the embodiment of the present application, the value is preferably greater than or equal to 15%; for example, it may be 15%. Thus, when overlapping data are detected in the single-machine local data and the degree of overlap reaches 15%, data fusion processing is performed on the single-machine local data.
Here, it should be noted that the calculation of the degree of overlap of the overlapping area can be implemented by conventional technical means in the art and is not described further here.
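The overlap-degree test described above can be sketched as follows (an illustrative helper under the pixel-ratio assumption; the names `overlap_degree` and `should_fuse` are not from the patent):

```python
# Sketch of the overlap-degree check: the degree is the *smaller* of the two
# ratios of overlap pixels to each dataset's pixels, and fusion is triggered
# only at or above the preset threshold (here 15%).

PRESET_OVERLAP = 0.15

def overlap_degree(pixels_a, pixels_b, pixels_overlap):
    """Return min(overlap/a, overlap/b), the degree of overlap."""
    return min(pixels_overlap / pixels_a, pixels_overlap / pixels_b)

def should_fuse(pixels_a, pixels_b, pixels_overlap):
    return overlap_degree(pixels_a, pixels_b, pixels_overlap) >= PRESET_OVERLAP

# 200 overlapping pixels against datasets of 1000 and 800 pixels:
# degree = min(200/1000, 200/800) = 0.2 >= 0.15, so fusion proceeds.
print(should_fuse(1000, 800, 200))  # True
print(should_fuse(1000, 800, 100))  # min(0.1, 0.125) = 0.1 < 0.15 -> False
```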
After the overlap detection of each single-machine local data is completed in any of the manners described above, step S200 may be executed to perform fusion processing on the single-machine local data with the overlap to obtain fused data. When the data fusion processing is performed, the following procedure can be performed.
Specifically, first, the similarity transformation parameters {R, t, s} of each of the single-machine local data in which overlap exists are obtained using a relative orientation technique, where R is the rotation, t is the translation, and s is the scale. Then, according to the obtained similarity transformation parameters, the single-machine local data having the overlapping area are fused to obtain the corresponding fused data.
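Applying the similarity transformation {R, t, s} can be sketched in 2-D as follows. This is a hypothetical illustration: the patent's data are photogrammetric, not 2-D point lists, and estimating the parameters by relative orientation is not shown here, only their application; the duplicate-dropping merge is a stand-in for the actual fusion.

```python
# Hypothetical 2-D sketch: each point of local data A is rotated, scaled,
# and translated into the frame of local data B (x' = s*R*x + t), then the
# two point sets are merged with near-duplicate points removed.
import math

def apply_similarity(points, angle_rad, t, s):
    """Apply x' = s * R(angle) * x + t to every 2-D point."""
    c, si = math.cos(angle_rad), math.sin(angle_rad)
    return [(s * (c * x - si * y) + t[0],
             s * (si * x + c * y) + t[1]) for x, y in points]

def fuse(points_a, points_b, angle_rad, t, s):
    """Transform A into B's frame and merge, dropping coincident points."""
    merged = list(points_b)
    for p in apply_similarity(points_a, angle_rad, t, s):
        if all(math.dist(p, q) > 1e-6 for q in merged):
            merged.append(p)
    return merged

# A 90-degree rotation with unit scale and translation (1, 0):
print(apply_similarity([(1.0, 0.0)], math.pi / 2, (1.0, 0.0), 1.0))
# [(1.0, 1.0)] up to floating-point rounding
```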
For example, after overlap detection is performed on the read single-machine local data in any of the manners described above, local data A and local data B are found to overlap, and the degree of overlap reaches 15%. Local data A and local data B, which share the overlapping area, are then fused. During fusion, the similarity transformation parameters of local data A and local data B are first obtained using the relative orientation technique, and local data A is then rotated, translated, and scaled according to these parameters, so that local data A and local data B can be fused and superposed together to form the fused data.
As another example: after overlap detection is performed on the read single-machine local data in any of the manners described above, local data A, local data B, and local data C are found to overlap, and the degree of overlap reaches 15%. Local data A, local data B, and local data C, which share the overlapping area, are then fused. During fusion, the similarity transformation parameters of local data A and local data B are first obtained using the relative orientation technique, and local data A is rotated, translated, and scaled accordingly, so that local data A and local data B can be fused and superposed together. Then, the similarity transformation parameters between this fused result and local data C are obtained using the relative orientation technique, and the fused result (or local data C) is rotated, translated, and scaled accordingly, so that local data C and the fused result of local data A and local data B can be superposed to form the final fused data.
By analogy, when the number of the local data with the overlapped area is detected to be larger than two through overlapping, the two local data are fused firstly, and then other local data are sequentially fused based on the fused data.
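The "fuse two first, then fold in the rest" order described above can be sketched as a reduction over the overlapping datasets (hypothetical helper; the pairwise fusion step is abstracted as a set union purely for illustration):

```python
# Sketch of sequentially fusing more than two overlapping local datasets:
# fuse the first two, then fold each remaining dataset into the running
# fused result, as described in the text above.
from functools import reduce

def fuse_pair(a, b):
    return a | b  # placeholder for the relative-orientation fusion step

def fuse_all(overlapping_datasets):
    """Fuse the first two datasets, then fold each remaining one in."""
    return reduce(fuse_pair, overlapping_datasets)

print(fuse_all([{1, 2}, {2, 3}, {3, 4}]))  # {1, 2, 3, 4}
```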
After the fusion processing of the overlapping single-machine local data is completed in the above manner, step S300 may be executed: the fused data is transmitted back to the corresponding ground processor, and the overlapping single-machine local data is updated to the fused data.
In a possible implementation, referring to fig. 2, the process of returning the fused data to the corresponding ground processor and updating the overlapping single-machine local data to the fused data may further include: step S310, detecting whether fusion processing of all overlapping single-machine local data has been completed. Once all overlapping single-machine local data have been fused, step S400 is executed to integrate the fusion results.
Furthermore, in the method of the embodiment of the present application, while overlap between two local datasets is being processed, the processing thread of the ground processing center corresponding to either dataset is temporarily suspended, because continuing to detect overlapping candidate data during this process easily disturbs the determination of the similarity transformation parameter s. To reduce redundancy, all object points in each overlapping area are fused. Finally, the calculated scale factor is applied to the fusion of all local data of the two centers. After this step is completed, the processing thread is resumed, and a new processing procedure runs on the newly merged data.
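The suspend-and-resume behaviour described here can be sketched with a simple pause flag per ground processing unit; the class name and methods below are illustrative, not taken from the patent.

```python
import threading

class GroundProcessor:
    """Sketch of a ground processing unit whose update thread can be
    paused while a cross-center fusion computes the scale factor s."""

    def __init__(self):
        self._running = threading.Event()
        self._running.set()          # set means the thread may proceed

    def pause(self):
        self._running.clear()        # suspend before fusion starts

    def resume(self):
        self._running.set()          # restart on the newly merged data

    def wait_if_paused(self):
        # Called by the processing thread between work items;
        # blocks while a fusion involving this unit is in progress.
        self._running.wait()
```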
That is to say, in the method of the embodiment of the present application, each drone continuously transmits data back to its corresponding ground processor, and each ground processor fuses the newly received data with its existing data. Overlap detection is performed between different processors; once an overlapping area is found, fusion processing is performed between the processing units having that overlapping area, and the fused result replaces the sub-graph held by each such processing unit, until all sub-graphs have been updated. The method thus realizes redundancy analysis and fusion of the local data returned by multiple single machines, which helps to improve both the speed and the accuracy of the overall data processing.
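The update rule, under which every processing unit holding an overlapping sub-graph receives the fused result, can be sketched as follows; `fuse` is a placeholder for the similarity-transform fusion described earlier, and the data structures are illustrative assumptions.

```python
def update_subgraphs(subgraphs, overlaps, fuse):
    """When two processors' sub-graphs overlap, replace both with the
    fused result so every unit holds the up-to-date merged map.

    subgraphs: dict mapping processor id -> its current sub-graph
    overlaps:  iterable of (i, j) pairs found by overlap detection
    fuse:      callable merging two sub-graphs into one
    """
    for i, j in overlaps:
        merged = fuse(subgraphs[i], subgraphs[j])
        subgraphs[i] = merged
        subgraphs[j] = merged
    return subgraphs
```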
Correspondingly, based on any one of the above methods for processing cluster unmanned aerial vehicle return data, the present application further provides a cluster unmanned aerial vehicle return data processing apparatus. Since the working principle of the apparatus provided by the present application is the same as, or similar to, that of the cluster unmanned aerial vehicle return data processing method described above, repeated details are not described again.
Referring to fig. 3, the clustered drone backhaul data processing apparatus 100 provided by the present application includes a data reading module 110, an overlap detection module 120, a data fusion module 130, and a data updating module 140. The data reading module 110 is configured to read the single-machine local data currently returned to the ground processor by each drone in the drone cluster. The overlap detection module 120 is configured to perform overlap detection on each piece of single-machine local data. The data fusion module 130 is configured to, when the overlap detection module 120 detects that data overlap exists among the single-machine local data, perform fusion processing on the overlapping single-machine local data to obtain fused data. The data updating module 140 is configured to transmit the fused data back to the corresponding ground processor and update the overlapping single-machine local data to the fused data.
In one possible implementation, the overlap detection module 120 includes a local feature extraction sub-module, a visual word construction sub-module, and a visual word matching sub-module (not shown). The local feature extraction sub-module is configured to extract corresponding local features from each piece of single-machine local data according to the scene environment acquired by the drone cluster. The visual word construction sub-module is configured to quantize each extracted local feature by a visual dictionary construction method to obtain representative feature vectors, each of which is defined as a visual word. The visual word matching sub-module is configured to acquire the visual words of the fused scene data and match the obtained representative feature vectors against them by a nearest neighbor search algorithm, so as to detect whether an overlapping area exists among the single-machine local data. It should be noted that the fused scene data refers to the data of overlapping areas detected in previous rounds of overlap detection, relative to the current round of overlap detection on the single-machine local data.
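The visual dictionary pipeline described for the overlap detection module can be sketched as follows. This is a toy illustration: a tiny k-means stands in for the visual dictionary construction, and brute-force nearest neighbour search stands in for the matching step; a real system would use SIFT-like descriptors and an approximate nearest neighbour index.

```python
import numpy as np

def build_vocabulary(features, k, iters=10, seed=0):
    """Tiny k-means 'visual dictionary': each centroid is one visual word."""
    rng = np.random.default_rng(seed)
    words = features[rng.choice(len(features), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(features[:, None] - words[None], axis=2)
        labels = d.argmin(axis=1)
        for c in range(k):
            if (labels == c).any():            # keep empty centroids unchanged
                words[c] = features[labels == c].mean(axis=0)
    return words

def quantize(features, words):
    """Nearest-neighbour search: map each local feature to a word id."""
    d = np.linalg.norm(features[:, None] - words[None], axis=2)
    return d.argmin(axis=1)

def overlap_score(ids_a, ids_b):
    """Fraction of shared visual words (Jaccard) between two local datasets."""
    a, b = set(ids_a.tolist()), set(ids_b.tolist())
    return len(a & b) / max(len(a | b), 1)
```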
Still further, according to another aspect of the present application, there is also provided a cluster drone backhaul data processing device 200. Referring to fig. 4, the cluster drone backhaul data processing device 200 of the embodiment of the present application includes a processor 210 and a memory 220 for storing instructions executable by the processor 210. Wherein the processor 210 is configured to execute the executable instructions to implement any one of the aforementioned methods for processing backhaul data of the cluster drone.
Here, it should be noted that the number of the processors 210 may be one or more. Meanwhile, in the cluster drone backhaul data processing device 200 according to the embodiment of the present application, an input device 230 and an output device 240 may also be included. The processor 210, the memory 220, the input device 230, and the output device 240 may be connected via a bus, or may be connected via other methods, which is not limited in detail herein.
The memory 220, which is a computer-readable storage medium, may be used to store software programs, computer-executable programs, and various modules, such as: the cluster unmanned aerial vehicle returns a program or a module corresponding to the data processing method. The processor 210 executes various functional applications and data processing of the clustered drone backhaul data processing device 200 by running software programs or modules stored in the memory 220.
The input device 230 may be used to receive an input number or signal. Wherein the signal may be a key signal generated in connection with user settings and function control of the device/terminal/server. The output device 240 may include a display device such as a display screen.
According to another aspect of the present application, there is also provided a non-transitory computer-readable storage medium on which computer program instructions are stored, wherein the computer program instructions, when executed by the processor 210, implement any one of the cluster drone backhaul data processing methods described above.
Having described embodiments of the present application, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terms used herein were chosen in order to best explain the principles of the embodiments, the practical application, or technical improvements to the techniques in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (10)

1. A cluster unmanned aerial vehicle return data processing method, characterized by comprising the following steps:
reading single machine local data currently transmitted back to a ground processor by each unmanned aerial vehicle in the unmanned aerial vehicle cluster, and performing overlapping detection on each single machine local data;
when the data overlapping exists in the single machine local data, performing fusion processing on the single machine local data with the overlapping to obtain fusion data;
and returning the fusion data to the corresponding ground processor, and updating the single machine local data with the overlapping into the fusion data.
2. The method of claim 1, wherein performing overlap detection on each of the single local data comprises:
extracting corresponding local features from the single machine local data according to the scene environment acquired by the unmanned aerial vehicle cluster;
quantizing the extracted local features by adopting a visual dictionary construction method to obtain representative feature vectors, and defining the representative feature vectors as visual words;
acquiring visual words of fused scene data, and matching the obtained representative feature vectors with the visual words of the fused scene data by adopting a nearest neighbor search algorithm so as to detect whether the single-machine local data has an overlapping region;
wherein the fused scene data is: fused data obtained by fusing single-machine local data whose overlapping areas were detected in previous rounds of overlap detection, relative to the current round of overlap detection on the single-machine local data.
3. The method of claim 1, wherein upon detecting that there is data overlap in the single-machine local data, further comprising:
judging whether the degree of overlap of the overlapping area reaches a preset overlap degree;
and when the preset overlap degree is reached, performing fusion processing on the single machine local data with the overlapping.
4. The method according to claim 3, wherein the preset overlap degree is greater than or equal to 15%.
5. The method of claim 1, wherein the fusing the single-machine local data with the overlapping to obtain fused data comprises:
obtaining similarity transformation parameters of the single-machine local data with overlapping by using a relative orientation technology;
and performing data fusion processing on the single machine local data with the overlapping according to the obtained similarity transformation parameters.
6. The method according to any of claims 1 to 5, wherein upon detecting that there is an overlap in the single-machine local data, further comprising: and stopping the ground processor corresponding to the overlapped single machine local data from updating the local data.
7. A cluster unmanned aerial vehicle return data processing apparatus, characterized by comprising: a data reading module, an overlap detection module, a data fusion module, and a data updating module;
the data reading module is configured to read the stand-alone local data currently transmitted back to the ground processor by each unmanned aerial vehicle in the unmanned aerial vehicle cluster;
the overlap detection module is configured to perform overlap detection on each of the single-machine local data;
the data fusion module is configured to perform fusion processing on the single machine local data with the overlapping to obtain fusion data when the overlapping detection module detects that the single machine local data has data overlapping;
and the data updating module is configured to transmit the fusion data back to the corresponding ground processor, and update the single machine local data with the overlapping into the fusion data.
8. The apparatus of claim 7, wherein the overlap detection module comprises a local feature extraction sub-module, a visual word construction sub-module, and a visual word matching sub-module;
the local feature extraction submodule is configured to extract corresponding local features from each single machine local data according to the scene environment acquired by the unmanned aerial vehicle cluster;
the visual word construction submodule is configured to quantize the extracted local features by adopting a visual dictionary construction method to obtain representative feature vectors, and define the representative feature vectors as visual words;
the visual word matching submodule is configured to acquire visual words of fused scene data, and match the acquired representative feature vectors with the visual words of the fused scene data by adopting a nearest neighbor search algorithm so as to detect whether the single-machine local data has an overlapping region;
wherein the fused scene data is the data of overlapping areas detected in previous rounds of overlap detection, relative to the current round of overlap detection on the single-machine local data.
9. A cluster unmanned aerial vehicle return data processing device, characterized by comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to carry out the method of any one of claims 1 to 6 when executing the executable instructions.
10. A non-transitory computer readable storage medium having computer program instructions stored thereon, wherein the computer program instructions, when executed by a processor, implement the method of any of claims 1 to 6.
CN202110084275.8A 2021-01-21 2021-01-21 Method, device, equipment and storage medium for processing return data of cluster unmanned aerial vehicle Active CN112949292B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110084275.8A CN112949292B (en) 2021-01-21 2021-01-21 Method, device, equipment and storage medium for processing return data of cluster unmanned aerial vehicle


Publications (2)

Publication Number Publication Date
CN112949292A true CN112949292A (en) 2021-06-11
CN112949292B CN112949292B (en) 2024-04-05

Family

ID=76235810

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110084275.8A Active CN112949292B (en) 2021-01-21 2021-01-21 Method, device, equipment and storage medium for processing return data of cluster unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN112949292B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109029422A (en) * 2018-07-10 2018-12-18 北京木业邦科技有限公司 A kind of method and apparatus of the three-dimensional investigation map of multiple no-manned plane cooperation building
CN109559277A (en) * 2018-11-28 2019-04-02 中国人民解放军国防科技大学 Multi-unmanned aerial vehicle cooperative map construction method oriented to data sharing
CN109615698A (en) * 2018-12-03 2019-04-12 哈尔滨工业大学(深圳) Multiple no-manned plane SLAM map blending algorithm based on the detection of mutual winding
CN109978755A (en) * 2019-03-11 2019-07-05 广州杰赛科技股份有限公司 Panoramic image synthesis method, device, equipment and storage medium
CN110068335A (en) * 2019-04-23 2019-07-30 中国人民解放军国防科技大学 Unmanned aerial vehicle cluster real-time positioning method and system under GPS rejection environment
CN110675506A (en) * 2019-08-21 2020-01-10 佳都新太科技股份有限公司 System, method and equipment for realizing three-dimensional augmented reality of multi-channel video fusion
CN111461464A (en) * 2020-05-06 2020-07-28 农业农村部南京农业机械化研究所 Plant protection unmanned aerial vehicle cluster operation task allocation method and device


Also Published As

Publication number Publication date
CN112949292B (en) 2024-04-05

Similar Documents

Publication Publication Date Title
US20200242013A1 (en) Champion test case generation
EP3625696A1 (en) Systems and methods for searching images
CN109658454B (en) Pose information determination method, related device and storage medium
CN111860300A (en) Key point detection method and device, terminal equipment and storage medium
US20230237666A1 (en) Image data processing method and apparatus
CN111931720B (en) Method, apparatus, computer device and storage medium for tracking image feature points
WO2014203687A1 (en) Image processing method, image processing device, and image processing program
CN108304578B (en) Map data processing method, medium, device and computing equipment
CN114492601A (en) Resource classification model training method and device, electronic equipment and storage medium
US4783831A (en) Method for producing a standard pattern for pattern matching
CN112949292B (en) Method, device, equipment and storage medium for processing return data of cluster unmanned aerial vehicle
CN113759338A (en) Target detection method and device, electronic equipment and storage medium
CN115830342A (en) Method and device for determining detection frame, storage medium and electronic device
CN113723515B (en) Moire pattern recognition method, device, equipment and medium based on image recognition
CN116009581A (en) Unmanned aerial vehicle inspection method for power transmission line, unmanned aerial vehicle control terminal and storage medium
Li et al. HybridPoint: Point Cloud Registration Based on Hybrid Point Sampling and Matching
CN116069340A (en) Automatic driving model deployment method, device, equipment and storage medium
CN115115919A (en) Power grid equipment thermal defect identification method and device
CN114973173A (en) Method and device for classifying driving scene data, electronic equipment and storage medium
Gollub et al. A partitioned approach for efficient graph–based place recognition
CN110705695A (en) Method, device, equipment and storage medium for searching model structure
JP7393809B2 (en) Automatic phase mapping processing method based on omnidirectional image information, its system and computer program
KR102384177B1 (en) Auto topology mapping method based on omni-directional image and system thereof
KR102617222B1 (en) Auto topology mapping method based on omni-directional image and system thereof
KR102489115B1 (en) Method and Apparatus for Deep Machine Learning for Vision Inspection of a Manufactured Product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant