CN116804882A - Intelligent unmanned aerial vehicle control system based on stream data processing and unmanned aerial vehicle thereof - Google Patents
- Publication number: CN116804882A
- Application number: CN202310704509.3A
- Authority: CN (China)
- Legal status: Granted
Landscapes
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The application discloses an intelligent unmanned aerial vehicle control system based on stream data processing and an unmanned aerial vehicle thereof, and relates to the technical field of unmanned aerial vehicles.
Description
Technical Field
The application relates to the technical field of unmanned aerial vehicles, in particular to an intelligent unmanned aerial vehicle control system based on stream data processing and an unmanned aerial vehicle thereof.
Background
Stream data is a sequential, massive, rapid and continuously arriving data sequence, and can generally be regarded as a dynamic data set that extends over time and grows without bound. By collecting image information during flight, processing it as stream data and identifying the obstacles in the images, an automatic obstacle avoidance system enables an unmanned aerial vehicle to avoid obstacles in its flight path in time, greatly reducing the losses caused by misoperation;
however, the obstacle avoidance systems of existing civil unmanned aerial vehicles suffer from low accuracy and strong environmental influence. When a binocular camera is selected to shoot and collect image information, distant objects can be detected and obstacles found earlier, but the image accuracy is low and limited by ambient light. When a laser radar is selected to scan and generate image information, small obstacles can be found with high accuracy, but side lobes form easily, dispersing the laser energy and making distant objects difficult to detect. Using both image acquisition modes simultaneously increases the energy consumption of the unmanned aerial vehicle and degrades its cruising ability;
aiming at these technical defects, an intelligent unmanned aerial vehicle control system based on stream data processing and an unmanned aerial vehicle thereof are provided.
Disclosure of Invention
The application aims to provide an intelligent unmanned aerial vehicle control system based on stream data processing and an unmanned aerial vehicle thereof, solving the problems that the image precision of the obstacle avoidance system is low and that it is strongly affected by the environment.
In order to achieve the above purpose, the present application adopts the following technical scheme: an intelligent unmanned aerial vehicle control system based on stream data processing comprises an information acquisition unit, a data processing unit, an image regulation and control unit and an automatic control unit, wherein the information acquisition unit, the data processing unit, the image regulation and control unit and the automatic control unit are connected through signals;
the information acquisition unit is used for acquiring flight information of the unmanned aerial vehicle and transmitting the flight information to the data processing unit;
the unmanned aerial vehicle flight information comprises image information T and influence factor information Y, wherein the image information T comprises acquired image parameters TJ and transmission image parameters TS;
the data processing unit is used for receiving flight information of the unmanned aerial vehicle, preprocessing the flight information to fuse the information, and then performing depth analysis:
acquiring an influence mode of influence factor information Y on the acquisition process by comparing and analyzing acquired image parameters TJ under different image acquisition modes;
acquiring an influence mode of influence factor information Y in the transmission process by comparing and analyzing acquired image parameters TJ and transmission image parameters TS before and after communication transmission in the same image acquisition mode;
then comprehensively generating a comprehensive influence mode of influence factor information Y on the acquisition and transmission process, and judging the stability of image acquisition and transmission;
the image regulation and control unit periodically selects the image acquisition mode with optimal stability according to the comprehensive influence mode, identifies obstacles in the flight process from the acquired images, and then generates and sends an obstacle avoidance signal to the automatic control unit;
and the automatic control unit receives the obstacle avoidance signal, generates an optimal flight path and automatically controls the unmanned aerial vehicle to fly.
Preferably, the unmanned aerial vehicle flight information acquisition and analysis process is as follows:
s1: the acquired image parameters TJ comprise a frame rate Ac of an acquired image and a pixel Bc of the acquired image;
the information acquisition unit simultaneously generates two groups of acquired images in the flight process of the unmanned aerial vehicle through two image acquisition modes, wherein the two groups of acquired images comprise a first acquired image Yx1 and a second acquired image Yx2, and acquired image parameters TJ are respectively extracted from the two groups of acquired images Yx1 and Yx2;
s2: the transmission image parameter TS includes a frame rate As of the transmission image and a pixel Bs of the transmission image;
after the two groups of collected images are transmitted to the data processing unit through signals, the images received by the data processing unit are called transmission images, the transmission images comprise a first transmission image Yc1 and a second transmission image Yc2, and transmission image parameters TS are respectively extracted from the two groups of transmission images Yc1 and Yc2;
s3: the influence factor information Y comprises a flying speed value V, an illumination intensity value L and an air humidity value H, and generates an influence factor coefficient Fyx according to the influence factor information;
s4: substituting the acquired image parameters TJ, the transmission image parameters TS and the influence factor coefficients Fyx into an influence mode analysis model to acquire the influence mode of the influence factor information Y on the image information acquisition process and the transmission process.
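The records described in steps S1 to S3 can be sketched as simple data structures. This is an illustration only; the field names and sample values are assumptions, not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class ImageParams:
    """Frame rate (Ac/As) and pixel count (Bc/Bs) of an acquired or transmitted image."""
    frame_rate: float
    pixels: int

@dataclass
class InfluenceFactors:
    """Influence factor information Y from step S3."""
    speed_v: float          # flying speed value V
    illumination_l: float   # illumination intensity value L
    humidity_h: float       # air humidity value H

# Sample acquired-image parameters TJ for the two acquisition modes (assumed values)
tj_yx1 = ImageParams(frame_rate=30.0, pixels=2_000_000)  # first acquired image Yx1
tj_yx2 = ImageParams(frame_rate=10.0, pixels=1_000_000)  # second acquired image Yx2
y = InfluenceFactors(speed_v=10.0, illumination_l=500.0, humidity_h=60.0)
```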
Preferably, the process of establishing the influence pattern analysis model is as follows:
b1: generating an acquired image precision coefficient Fj according to the acquired image parameter TJ, and generating a transmission image precision coefficient Fs according to the transmission image parameter TS;
b3: acquiring communication loss Sh according to an acquired image precision coefficient Fj and a transmission image precision coefficient Fs under the same image acquisition mode;
b4: dynamically analyzing acquired image precision coefficients Fj under different image acquisition modes with an influence factor coefficient Fyx respectively to acquire a first influence mode of influence factor information Y on an acquisition process;
b5: the communication loss Sh under different image acquisition modes is respectively and dynamically analyzed with an influence factor coefficient Fyx, and a second influence mode of the influence factor information Y in the transmission process is obtained;
b6: and synthesizing an image precision coefficient Fj and a communication loss degree Sh under the same image acquisition mode, dynamically analyzing the image precision coefficient Fj and the communication loss degree Sh and an influence factor coefficient Fyx to generate a stability influence coefficient Fx, and acquiring a comprehensive influence mode of influence factor information on the image acquisition mode.
Preferably, the specific analysis of the first influence pattern is as follows:
b4-1: generating a first acquired image precision coefficient Fj1 according to the image frame rate Ac1 and the image pixels Bc1 of the first acquired image Yx1;
b4-2: generating a second acquired image precision coefficient Fj2 according to the image frame rate Ac2 and the image pixels Bc2 of the second acquired image Yx2;
b4-3: determining a first influence mode of the influence factor information on the image acquisition process by acquiring a change function F1 of the influence factor coefficient Fyx-first acquired image precision coefficient Fj1 and a change function F2 of the influence factor coefficient Fyx-second acquired image precision coefficient Fj2.
Preferably, the specific analysis of the second influence pattern is as follows:
b5-1: generating a first transmission image precision coefficient Fs1 according to the image frame rate As1 and the image pixels Bs1 of the first transmission image Yc1;
b5-2: generating a second transmission image precision coefficient Fs2 according to the image frame rate As2 and the image pixels Bs2 of the second transmission image Yc2;
b5-3: generating a first communication loss Sh1 according to the first acquired image precision coefficient Fj1 and the first transmission image precision coefficient Fs1;
b5-4: generating a second communication loss Sh2 according to the second acquired image precision coefficient Fj2 and the second transmission image precision coefficient Fs2;
b5-5: determining a second influence mode of the influence factor information on the image transmission process by acquiring a change function S1 of the influence factor coefficient Fyx-first communication loss degree Sh1 and a change function S2 of the influence factor coefficient Fyx-second communication loss degree Sh2.
Preferably, the specific analysis process of the comprehensive influence mode is as follows:
according to the change function F1 of the influence factor coefficient Fyx-first acquisition image precision coefficient Fj1 and the change function S1 of the influence factor coefficient Fyx-first communication loss Sh1, a stability influence coefficient Fx1 of influence factor information on a first image acquisition mode is comprehensively generated;
according to a change function F2 of an influence factor coefficient Fyx-second acquired image precision coefficient Fj2 and a change function S2 of an influence factor coefficient Fyx-second communication loss Sh2, a stability influence coefficient Fx2 of influence factor information on a second image acquisition mode is comprehensively generated;
and comparing the values of the stability influence coefficients Fx1 and Fx2, generating corresponding image acquisition signals, transmitting the corresponding image acquisition signals to an image regulation and control unit, and regulating an image acquisition mode in real time by the image regulation and control unit according to the flying environment state.
The intelligent unmanned aerial vehicle based on stream data processing comprises the technical scheme of the intelligent unmanned aerial vehicle control system based on stream data processing.
In summary, due to the adoption of the technical scheme, the beneficial effects of the application are as follows:
1. according to the intelligent unmanned aerial vehicle control system based on stream data processing, flight information of an unmanned aerial vehicle is collected through an information collecting unit, information preprocessing and depth analysis are carried out through a data processing unit, influence modes of influence factor information on image collecting and image transmission processes are obtained respectively, so that a comprehensive influence mode is generated to judge the stability of an image, then an image collecting mode with optimal stability is selected at regular time through an image regulating and controlling unit, an obstacle avoidance signal is generated and sent to an automatic control unit, and finally the unmanned aerial vehicle is automatically controlled to fly through the automatic control unit;
2. the application solves the defects of low precision and large environmental influence of the obstacle avoidance system on the existing civil unmanned aerial vehicle system, and establishes an influence mode analysis model by analyzing the flight environment, thereby selecting an image acquisition mode with good stability, ensuring the precision of the image, realizing precise obstacle avoidance, being free from environmental restriction and realizing automatic control of unmanned aerial vehicle flight;
3. by establishing a comparative analysis model, the application can directly judge which image acquisition mode gives a good image acquisition effect, which is suitable when the influence factor information is difficult to calculate: when the influence factor information is inaccurate to calculate or its acquisition device is accidentally damaged, the optimal image acquisition mode is selected by directly comparing the acquired images, which is simple and convenient.
Drawings
For a clearer description of the embodiments of the present application or of the technical solutions in the prior art, the drawings needed in the embodiments are briefly described below. It is apparent that the drawings in the following description are only some embodiments described in the present application, and a person of ordinary skill in the art can obtain other drawings from them;
FIG. 1 is a schematic block diagram of the present application;
FIG. 2 is a flow chart of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
Example 1:
As shown in FIGS. 1-2, an intelligent unmanned aerial vehicle control system based on stream data processing comprises an information acquisition unit, a data processing unit, an image regulation and control unit and an automatic control unit, wherein the units are connected through signals;
the working steps are as follows:
s1: the information acquisition unit is used for acquiring flight information of the unmanned aerial vehicle and transmitting the flight information to the data processing unit;
the unmanned aerial vehicle flight information comprises image information T and influence factor information Y, wherein the image information T comprises acquired image parameters TJ and transmission image parameters TS;
the unmanned aerial vehicle flight information acquisition and analysis process is as follows:
s1: the acquired image parameters TJ comprise a frame rate Ac of an acquired image and a pixel Bc of the acquired image;
the information acquisition unit simultaneously generates two groups of acquired images in the flight process of the unmanned aerial vehicle through two image acquisition modes, wherein the two groups of acquired images comprise a first acquired image Yx1 and a second acquired image Yx2, and acquired image parameters TJ are respectively extracted from the two groups of acquired images Yx1 and Yx2;
for example, the information acquisition unit uses two image acquisition modes, binocular camera shooting and phased array laser radar scanning, to simultaneously generate two groups of acquired images during the flight of the unmanned aerial vehicle, the two groups comprising an image Yx1 shot by the binocular camera and an image Yx2 scanned by the phased array laser radar, and acquired image parameters TJ are respectively extracted from the two groups of acquired images Yx1 and Yx2;
the image acquisition mode of phased array laser radar scanning is characterized by high shooting precision but easy formation of side lobes, which disperses the laser energy; by normalizing the images acquired in the two modes, the acquired image parameters TJ are extracted for analysis, so that the image acquisition mode can conveniently be selected according to image precision;
s2: the transmission image parameter TS includes a frame rate As of the transmission image and a pixel Bs of the transmission image;
after the two groups of collected images are transmitted to the data processing unit through signals, the images received by the data processing unit are called transmission images, the transmission images comprise a first transmission image Yc1 and a second transmission image Yc2, and transmission image parameters TS are respectively extracted from the two groups of transmission images Yc1 and Yc2;
for example, the transmission image includes a transmission image Yc1 of a binocular camera and a transmission image Yc2 scanned by a phased array laser radar, and transmission image parameters TS are extracted from the two sets of transmission images Yc1 and Yc2, respectively;
s3: the influence factor information Y comprises flying speed, illumination intensity and air humidity;
for example, the flying speed value V is measured by a velocimeter, the illumination intensity value L of the flying environment is measured by a photoelectric sensor, the air humidity value H of the flying environment is measured by a humidity sensor, and an influence factor coefficient Fyx is generated according to the influence factor information;
the preset influence factor coefficient calculation formula is as follows:
Fyx=λ1*V+λ2*L+λ3*H
wherein, λ1, λ2, λ3 are weight factor coefficients of the flying speed value V, the illumination intensity value L and the air humidity value H respectively, and λ1, λ2, λ3 are all larger than 0;
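The preset formula Fyx = λ1*V + λ2*L + λ3*H can be sketched directly. The default weight values below are illustrative assumptions, since the text only requires λ1, λ2, λ3 to be larger than 0:

```python
def influence_factor_coefficient(v: float, l: float, h: float,
                                 lam1: float = 0.5, lam2: float = 0.3,
                                 lam3: float = 0.2) -> float:
    """Fyx = λ1*V + λ2*L + λ3*H; the weights are assumed sample values."""
    if min(lam1, lam2, lam3) <= 0:
        raise ValueError("all weight factor coefficients must be > 0")
    return lam1 * v + lam2 * l + lam3 * h

# Sample measurement: V = 10 m/s, L = 500 lux, H = 60 %RH (assumed units)
fyx = influence_factor_coefficient(v=10.0, l=500.0, h=60.0)
# 0.5*10 + 0.3*500 + 0.2*60 = 167.0
```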
s4: substituting the acquired image parameters TJ, the transmission image parameters TS and the influence factor coefficients Fyx into an influence mode analysis model to acquire an influence mode of influence factor information Y on an image information acquisition process and a transmission process;
s2: the data processing unit is used for receiving flight information of the unmanned aerial vehicle, preprocessing the flight information to fuse the information, and then performing depth analysis:
acquiring an influence mode of influence factor information Y on the acquisition process by comparing and analyzing acquired image parameters TJ under different image acquisition modes;
acquiring an influence mode of influence factor information Y in the transmission process by comparing and analyzing acquired image parameters TJ and transmission image parameters TS before and after communication transmission in the same image acquisition mode;
then comprehensively generating a comprehensive influence mode of influence factor information Y on the acquisition and transmission process, and judging the stability of image acquisition and transmission;
the process for establishing the influence mode analysis model is as follows:
b1: generating an acquired image precision coefficient Fj according to the acquired image parameter TJ, and generating a transmission image precision coefficient Fs according to the transmission image parameter TS;
the preset image precision coefficient measuring and calculating formula is as follows:
image precision coefficient = μ1 × image frame rate + μ2 × image pixels
wherein, μ1 and μ2 are the weight factor coefficients of the image frame rate and the image pixels respectively, and μ1 and μ2 are both larger than 0;
b3: acquiring communication loss Sh according to an acquired image precision coefficient Fj and a transmission image precision coefficient Fs under the same image acquisition mode;
the preset communication loss measurement formula is as follows:
communication loss Sh = acquired image precision coefficient Fj - transmission image precision coefficient Fs;
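The two preset formulas in B1 and B3 can be sketched together. The weight values μ1, μ2 and the sample frame rates and pixel counts are illustrative assumptions:

```python
def image_precision(frame_rate: float, pixels: float,
                    mu1: float = 1.0, mu2: float = 1e-6) -> float:
    """Precision coefficient = μ1*frame rate + μ2*pixels (μ1, μ2 > 0).
    The assumed μ2 scales raw pixel counts to a comparable magnitude."""
    return mu1 * frame_rate + mu2 * pixels

def communication_loss(fj: float, fs: float) -> float:
    """Sh = acquired image precision Fj - transmitted image precision Fs."""
    return fj - fs

fj1 = image_precision(30.0, 2_000_000)   # acquired image: 30 + 2.0 = 32.0
fs1 = image_precision(28.0, 1_500_000)   # transmitted image: 28 + 1.5 = 29.5
sh1 = communication_loss(fj1, fs1)       # loss introduced by transmission: 2.5
```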
b4: dynamically analyzing acquired image precision coefficients Fj under different image acquisition modes with an influence factor coefficient Fyx respectively to acquire a first influence mode of influence factor information Y on an acquisition process;
the specific analysis process of the first influence mode is as follows:
b4-1: generating a binocular acquired image precision coefficient Fj1 according to the image frame rate Ac1 and the image pixels Bc1 of the acquired image Yx1;
b4-2: generating a phased array laser acquisition image precision coefficient Fj2 according to the image frame rate Ac2 of the acquired image Yx2 and the image pixels Bc2;
b4-3: determining a first influence mode of influence factor information on an image acquisition process by acquiring a change function F1 of an influence factor coefficient Fyx-binocular acquired image precision coefficient Fj1 and a change function F2 of an influence factor coefficient Fyx-phased array laser acquired image precision coefficient Fj2;
preset Fj1 = Fyx/F1, Fj2 = Fyx/F2;
when the unmanned aerial vehicle flies, the acquisition time periods of the two groups of image acquisition modes are the same, the influence factor coefficients Fyx are equal, and the change functions F1 and F2 are different, so that the acquired image precision coefficients are different;
if the change function value F is larger, the influence degree of the influence factor information on the image acquisition process is represented to be deeper, and the acquired image precision coefficient is lower;
b5: the communication loss Sh under different image acquisition modes is respectively and dynamically analyzed with an influence factor coefficient Fyx, and a second influence mode of the influence factor information Y in the transmission process is obtained;
the specific analysis process of the second influence mode is as follows:
b5-1: generating a binocular transmission image precision coefficient Fs1 according to the image frame rate As1 and the image pixels Bs1 of the transmission image Yc1;
b5-2: generating a phased array laser transmission image precision coefficient Fs2 according to the image frame rate As2 and the image pixels Bs2 of the transmission image Yc2;
b5-3: generating communication loss Sh1 of binocular shooting according to the binocular acquired image precision coefficient Fj1 and the binocular transmission image precision coefficient Fs1;
b5-4: generating the communication loss Sh2 of the phased array laser according to the phased array laser acquisition image precision coefficient Fj2 and the phased array laser transmission image precision coefficient Fs2;
b5-5: determining a second influence mode of influence factor information on the image transmission process by acquiring a change function S1 of the influence factor coefficient Fyx-communication loss degree Sh1 of binocular shooting and a change function S2 of the influence factor coefficient Fyx-communication loss degree Sh2 of phased array laser;
preset Sh1 = Fyx × S1, Sh2 = Fyx × S2;
when the unmanned aerial vehicle flies, the communication transmission time periods of the two groups of image acquisition modes are the same, the influence factor coefficients Fyx are equal, and the communication loss degree Sh is different when the change functions S1 and S2 are different;
if the change function value S is larger, the influence degree of the influence factor information on the image transmission process is deeper, and the communication loss degree is higher;
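Since the presets give Fj = Fyx/F and Sh = Fyx × S, the change function values can be recovered from measured quantities as F = Fyx/Fj and S = Sh/Fyx. A minimal sketch, with sample values carried over as assumptions:

```python
def change_function_f(fyx: float, fj: float) -> float:
    """From the preset Fj = Fyx / F, recover F = Fyx / Fj.
    A larger F means deeper influence on acquisition (lower precision)."""
    return fyx / fj

def change_function_s(fyx: float, sh: float) -> float:
    """From the preset Sh = Fyx * S, recover S = Sh / Fyx.
    A larger S means deeper influence on transmission (higher loss)."""
    return sh / fyx

fyx = 167.0                              # assumed influence factor coefficient
f1 = change_function_f(fyx, fj=32.0)     # acquisition change function, one mode
s1 = change_function_s(fyx, sh=2.5)      # transmission change function, same mode
```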
b6: synthesizing an image precision coefficient Fj and a communication loss degree Sh under the same image acquisition mode, dynamically analyzing the image precision coefficient Fj and the communication loss degree Sh and an influence factor coefficient Fyx to generate a stability influence coefficient Fx, and acquiring a comprehensive influence mode of influence factor information on the image acquisition mode;
the specific analysis process of the comprehensive influence mode is as follows:
b6-1: according to the change function F1 of the influence factor coefficient Fyx-binocular acquired image precision coefficient Fj1 and the change function S1 of the influence factor coefficient Fyx-binocular shooting communication loss Sh1, a stability influence coefficient Fx1 of influence factor information on binocular shooting is comprehensively generated;
wherein, α1 and α2 are weight factors of the change functions F1 and S1 respectively, and α1 and α2 are both larger than 0;
b6-2: according to the influence factor coefficient Fyx-a change function F2 of the phased array laser acquisition image precision coefficient Fj2 and the influence factor coefficient Fyx-a change function S2 of the communication loss Sh2 of the phased array laser, comprehensively generating a stability influence coefficient Fx2 of influence factor information on the phased array laser;
wherein, β1 and β2 are weight factors of the change functions F2 and S2 respectively, and β1 and β2 are both larger than 0;
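The text names positive weight factors for the change functions but elides the combining formula; the sketch below assumes a weighted sum Fx = w_f*F + w_s*S, which should be read as an illustration only:

```python
def stability_coefficient(f: float, s: float, w_f: float, w_s: float) -> float:
    """Stability influence coefficient Fx. The weighted-sum form is an
    assumption implied by the weight-factor description, not a disclosed formula."""
    if w_f <= 0 or w_s <= 0:
        raise ValueError("weight factors must be > 0")
    return w_f * f + w_s * s

# Assumed change-function values for the two acquisition modes
fx1 = stability_coefficient(f=5.22, s=0.015, w_f=1.0, w_s=1.0)  # binocular
fx2 = stability_coefficient(f=3.10, s=0.045, w_f=1.0, w_s=1.0)  # phased array
```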
comparing the values of the stability influence coefficients Fx1 and Fx2, generating corresponding image acquisition signals, transmitting the corresponding image acquisition signals to an image regulation and control unit, and regulating an image acquisition mode in real time by the image regulation and control unit according to the flying environment state;
when Fx1 is larger than Fx2, a first image acquisition signal is generated, and the image regulation and control unit starts the binocular camera to shoot images when it receives the first image acquisition signal;
when Fx1 is smaller than Fx2, generating a second image acquisition signal, and starting a phased array laser radar to scan an image when the image regulation and control unit receives the second image acquisition signal;
when Fx1 = Fx2, the original image acquisition mode is maintained without change;
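The three-way signal selection above can be sketched directly; the signal labels are illustrative, not the patent's wording:

```python
def acquisition_signal(fx1: float, fx2: float) -> str:
    """Compare the stability influence coefficients and emit the signal
    the image regulation and control unit acts on."""
    if fx1 > fx2:
        return "first"    # start the binocular camera
    if fx1 < fx2:
        return "second"   # start the phased array laser radar scan
    return "keep"         # Fx1 == Fx2: keep the current acquisition mode
```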
s3: the image regulation and control unit periodically selects the image acquisition mode with optimal stability according to the comprehensive influence mode, identifies obstacles in the flight process from the acquired images, and then generates and sends an obstacle avoidance signal to the automatic control unit;
an existing image recognition technology is selected for identifying the obstacles in the images, and environment images during flight are acquired at fixed times, so that the optimal image acquisition mode is selected according to the flight environment of the unmanned aerial vehicle; the image accuracy is therefore high and not limited by the environment, and the obstacle avoidance effect is good;
s4: the automatic control unit receives the obstacle avoidance signal, generates an optimal flight path and automatically controls the unmanned aerial vehicle to fly;
the intelligent unmanned aerial vehicle based on the stream data processing comprises the embodiment of the intelligent unmanned aerial vehicle control system based on the stream data processing.
In addition, under the condition of good flying environment conditions, an optimal image acquisition mode can be selected according to the condition of flying obstacles;
recording the times of obstacle avoidance signals, measuring and calculating the obstacle avoidance frequency Pl in unit time, and setting a threshold value for the obstacle avoidance frequency Pl: when the obstacle avoidance frequency Pl exceeds a preset threshold, the unmanned aerial vehicle is high in obstacle avoidance frequency and more in flight obstacle, and a binocular vision acquisition mode is selected; when the obstacle avoidance frequency Pl is lower than a preset threshold value, the unmanned aerial vehicle is low in obstacle avoidance frequency and few in flight obstacle, and phased array laser scanning is selected.
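A sketch of this fallback selection by obstacle avoidance frequency; the threshold value is an assumed tunable parameter, since the text only states that a threshold is preset:

```python
def mode_by_obstacle_frequency(avoidance_signals: int, window_s: float,
                               threshold_hz: float) -> str:
    """Pl = obstacle avoidance signals per unit time; pick the acquisition
    mode by comparing Pl against the preset threshold."""
    pl = avoidance_signals / window_s
    if pl > threshold_hz:
        return "binocular"           # frequent obstacles: binocular vision
    return "phased_array_laser"      # sparse obstacles: phased array laser scan
```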
Example 2: an intelligent unmanned aerial vehicle control system based on stream data processing, the main body of which is basically the same as in Example 1, except that the data processing unit establishes a comparative analysis model so as to judge which image acquisition mode gives a good image acquisition effect:
the comparative analysis model was established as follows:
a1: first, the lengths of the image videos are both set to T0, so that the two groups of image videos compared have the same shooting duration;
a2: the image shot by the binocular camera is denoted Yx1 and the image scanned by the phased array laser radar is denoted Yx2; image information is extracted from the two groups of images respectively to obtain the acquired image parameters TJ, which comprise the image frame rate Ac1 and the image pixels Bc1 of the image Yx1 shot by the binocular camera, and the image frame rate Ac2 and the image pixels Bc2 of the image Yx2 scanned by the phased array laser radar;
a3: the image frame rates Ac1 and Ac2 are compared, and the image pixels Bc1 and Bc2 are compared;
a3-1: when Ac1 > Ac2 and Bc1 > Bc2, the Yx1 image has the better effect, and the image acquisition mode of binocular camera shooting is better than that of phased array laser radar scanning;
a3-2: when Ac2 > Ac1 and Bc2 > Bc1, the Yx2 image has the better effect, and the image acquisition mode of phased array laser radar scanning is better than that of binocular camera shooting;
a3-3: if Ac1 = Ac2 and Bc1 = Bc2, the effects of the Yx1 and Yx2 images are the same, and the two image acquisition modes are equivalent;
a3-4: when Ac1 > Ac2 and Bc1 < Bc2, or Ac1 < Ac2 and Bc1 > Bc2, the difference degree Ca of the image frame rates and the difference degree Cb of the image pixels are calculated and compared;
a3-41: if Ca > Cb, the difference in image frame rate is dominant, and the image frame rates Ac1 and Ac2 are compared: when Ac1 > Ac2, the Yx1 image has the better effect; when Ac1 < Ac2, the Yx2 image has the better effect;
a3-42: if Cb > Ca, the difference in image pixels is dominant, and the image pixels Bc1 and Bc2 are compared: when Bc1 > Bc2, the Yx1 image has the better effect; when Bc1 < Bc2, the Yx2 image has the better effect.
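The decision procedure of steps a3-1 to a3-42 can be sketched as follows. Note that the patent does not give formulas for the difference degrees Ca and Cb, so a normalized relative difference is assumed here purely for illustration:

```python
def compare_acquisition(ac1, bc1, ac2, bc2):
    """Comparative analysis model (steps a3-1 to a3-42): decide which image
    acquisition mode produced the better image.
    Returns "Yx1" (binocular camera), "Yx2" (phased-array laser) or "equal"."""
    if ac1 > ac2 and bc1 > bc2:
        return "Yx1"                          # a3-1: Yx1 wins on both metrics
    if ac2 > ac1 and bc2 > bc1:
        return "Yx2"                          # a3-2: Yx2 wins on both metrics
    if ac1 == ac2 and bc1 == bc2:
        return "equal"                        # a3-3: identical effect
    # a3-4: mixed result; compare difference degrees Ca and Cb.
    # The relative-difference definition below is an assumption.
    ca = abs(ac1 - ac2) / max(ac1, ac2)       # frame-rate difference degree Ca
    cb = abs(bc1 - bc2) / max(bc1, bc2)       # pixel difference degree Cb
    if ca > cb:
        return "Yx1" if ac1 > ac2 else "Yx2"  # a3-41: frame rate dominates
    return "Yx1" if bc1 > bc2 else "Yx2"      # a3-42: pixel count dominates

# Example: the laser scan has the higher frame rate but the camera image has
# far more pixels; the pixel difference dominates, so Yx1 is judged better.
winner = compare_acquisition(ac1=24, bc1=8_000_000, ac2=30, bc2=1_000_000)
```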
By establishing the comparative analysis model, the application can directly judge which image acquisition mode has the better image acquisition effect. This is suitable for cases in which the influence factor information is difficult to obtain, for example when the influence factor information is calculated inaccurately or its acquisition device is accidentally damaged; the optimal image acquisition mode is then selected simply and conveniently by directly comparing the acquired images.
The above formulas are all dimensionless formulas operating on numerical values; they are fitted by software simulation from a large amount of collected data so as to reflect the latest real situation, and the preset parameters in the formulas are set by those skilled in the art according to the actual situation.
The foregoing is only a preferred embodiment of the present application, and the scope of protection of the present application is not limited thereto; any equivalent substitution or modification made, within the technical scope disclosed by the present application, by a person skilled in the art according to the technical scheme of the present application and its inventive concept shall be covered by the scope of protection of the present application.
Claims (7)
1. An intelligent unmanned aerial vehicle control system based on stream data processing, which is characterized in that: the system comprises an information acquisition unit, a data processing unit, an image regulation and control unit and an automatic control unit, which are connected with one another by signal connection;
the information acquisition unit is used for acquiring flight information of the unmanned aerial vehicle and transmitting the flight information to the data processing unit;
the unmanned aerial vehicle flight information comprises image information T and influence factor information Y, wherein the image information T comprises acquired image parameters TJ and transmission image parameters TS;
the data processing unit is used for receiving flight information of the unmanned aerial vehicle, preprocessing the flight information to fuse the information, and then performing depth analysis:
acquiring an influence mode of influence factor information Y on the acquisition process by comparing and analyzing acquired image parameters TJ under different image acquisition modes;
acquiring an influence mode of influence factor information Y in the transmission process by comparing and analyzing acquired image parameters TJ and transmission image parameters TS before and after communication transmission in the same image acquisition mode;
then comprehensively generating a comprehensive influence mode of influence factor information Y on the acquisition and transmission process, and judging the stability of image acquisition and transmission;
the image regulation and control unit regularly selects an image acquisition mode with optimal stability through the comprehensive influence mode, identifies obstacles in the flight process according to the acquired images, and then generates and sends an obstacle avoidance signal to the automatic control unit;
and the automatic control unit receives the obstacle avoidance signal, generates an optimal flight path and automatically controls the unmanned aerial vehicle to fly.
2. An intelligent unmanned aerial vehicle control system based on stream data processing as claimed in claim 1, wherein: the unmanned aerial vehicle flight information acquisition and analysis process is as follows:
s1: the acquired image parameters TJ comprise a frame rate Ac of an acquired image and a pixel Bc of the acquired image;
the information acquisition unit simultaneously generates two groups of acquired images in the flight process of the unmanned aerial vehicle through two image acquisition modes, wherein the two groups of acquired images comprise a first acquired image Yx1 and a second acquired image Yx2, and acquired image parameters TJ are respectively extracted from the two groups of acquired images Yx1 and Yx2;
s2: the transmission image parameter TS includes a frame rate As of the transmission image and a pixel Bs of the transmission image;
after the two groups of collected images are transmitted to the data processing unit through signals, the images received by the data processing unit are called transmission images, the transmission images comprise a first transmission image Yc1 and a second transmission image Yc2, and transmission image parameters TS are respectively extracted from the two groups of transmission images Yc1 and Yc2;
s3: the influence factor information Y comprises a flying speed value V, an illumination intensity value L and an air humidity value H, and generates an influence factor coefficient Fyx according to the influence factor information;
s4: substituting the acquired image parameters TJ, the transmission image parameters TS and the influence factor coefficients Fyx into an influence mode analysis model to acquire the influence mode of the influence factor information Y on the image information acquisition process and the transmission process.
3. An intelligent unmanned aerial vehicle control system based on stream data processing as claimed in claim 2, wherein: the process of establishing the influence pattern analysis model is as follows:
b1: generating an acquired image precision coefficient Fj according to the acquired image parameter TJ, and generating a transmission image precision coefficient Fs according to the transmission image parameter TS;
b3: acquiring communication loss Sh according to an acquired image precision coefficient Fj and a transmission image precision coefficient Fs under the same image acquisition mode;
b4: dynamically analyzing acquired image precision coefficients Fj under different image acquisition modes with an influence factor coefficient Fyx respectively to acquire a first influence mode of influence factor information Y on an acquisition process;
b5: the communication loss Sh under different image acquisition modes is respectively and dynamically analyzed with an influence factor coefficient Fyx, and a second influence mode of the influence factor information Y in the transmission process is obtained;
b6: and synthesizing an image precision coefficient Fj and a communication loss degree Sh under the same image acquisition mode, dynamically analyzing the image precision coefficient Fj and the communication loss degree Sh and an influence factor coefficient Fyx to generate a stability influence coefficient Fx, and acquiring a comprehensive influence mode of influence factor information on the image acquisition mode.
4. A stream data processing based intelligent drone control system as claimed in claim 3, wherein: the specific analysis of the first impact pattern is as follows:
b4-1: generating a first acquisition image precision coefficient Fj1 according to the image frame rate Ac1 and the image pixels Bc1 of the first acquisition image Yx1;
b4-2: generating a second acquisition image precision coefficient Fj2 according to the image frame rate Ac2 and the image pixels Bc2 of the second acquisition image Yx2;
b4-3: determining a first influence mode of the influence factor information on the image acquisition process by acquiring a change function F1 of the influence factor coefficient Fyx-first acquired image precision coefficient Fj1 and a change function F2 of the influence factor coefficient Fyx-second acquired image precision coefficient Fj2.
5. An intelligent unmanned aerial vehicle control system based on stream data processing as claimed in claim 4, wherein: the specific analysis of the second influence pattern is as follows:
b5-1: generating a first transmission image precision coefficient Fs1 according to the image frame rate As1 and the image pixels Bs1 of the first transmission image Yc1;
b5-2: generating a second transmission image precision coefficient Fs2 according to the image frame rate As2 and the image pixels Bs2 of the second transmission image Yc2;
b5-3: generating a first communication loss Sh1 according to the first acquisition image precision coefficient Fj1 and the first transmission image precision coefficient Fs1;
b5-4: generating a second communication loss Sh2 according to the second acquisition image precision coefficient Fj2 and the second transmission image precision coefficient Fs2;
b5-5: determining a second influence mode of the influence factor information on the image transmission process by acquiring a change function S1 of the influence factor coefficient Fyx-first communication loss degree Sh1 and a change function S2 of the influence factor coefficient Fyx-second communication loss degree Sh2.
6. An intelligent unmanned aerial vehicle control system based on stream data processing as claimed in claim 5, wherein: the specific analysis process of the comprehensive influence mode is as follows:
according to the change function F1 of the influence factor coefficient Fyx-first acquisition image precision coefficient Fj1 and the change function S1 of the influence factor coefficient Fyx-first communication loss Sh1, a stability influence coefficient Fx1 of influence factor information on a first image acquisition mode is comprehensively generated;
according to a change function F2 of an influence factor coefficient Fyx-second acquired image precision coefficient Fj2 and a change function S2 of an influence factor coefficient Fyx-second communication loss Sh2, a stability influence coefficient Fx2 of influence factor information on a second image acquisition mode is comprehensively generated;
and comparing the values of the stability influence coefficients Fx1 and Fx2, generating corresponding image acquisition signals, transmitting the corresponding image acquisition signals to an image regulation and control unit, and regulating an image acquisition mode in real time by the image regulation and control unit according to the flying environment state.
7. An intelligent unmanned aerial vehicle based on stream data processing, characterized in that: the intelligent unmanned aerial vehicle comprises the intelligent unmanned aerial vehicle control system based on stream data processing according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310704509.3A CN116804882B (en) | 2023-06-14 | 2023-06-14 | Intelligent unmanned aerial vehicle control system based on stream data processing and unmanned aerial vehicle thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116804882A true CN116804882A (en) | 2023-09-26 |
CN116804882B CN116804882B (en) | 2023-12-29 |
Family
ID=88079303
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310704509.3A Active CN116804882B (en) | 2023-06-14 | 2023-06-14 | Intelligent unmanned aerial vehicle control system based on stream data processing and unmanned aerial vehicle thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116804882B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180144644A1 (en) * | 2016-11-23 | 2018-05-24 | Sharper Shape Oy | Method and system for managing flight plan for unmanned aerial vehicle |
CN208873047U (en) * | 2018-11-26 | 2019-05-17 | 国网宁夏电力有限公司银川供电公司 | A kind of inspection device based on multi-rotor unmanned aerial vehicle |
CN111026150A (en) * | 2019-11-25 | 2020-04-17 | 国家电网有限公司 | System and method for pre-warning geological disasters of power transmission line by using unmanned aerial vehicle |
CN112966699A (en) * | 2021-03-24 | 2021-06-15 | 沸蓝建设咨询有限公司 | Target detection system of communication engineering project |
CN113056904A (en) * | 2020-05-28 | 2021-06-29 | 深圳市大疆创新科技有限公司 | Image transmission method, movable platform and computer readable storage medium |
US20210306558A1 (en) * | 2020-03-30 | 2021-09-30 | Beijing Xiaomi Mobile Software Co., Ltd. | Photographing method and device, mobile terminal and storage medium |
CN114511675A (en) * | 2022-02-14 | 2022-05-17 | 山东志诚地理信息技术有限公司 | Unmanned aerial vehicle camera management and control system based on real-scene three-dimensional data manufacturing |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117392561A (en) * | 2023-10-07 | 2024-01-12 | 中国公路工程咨询集团有限公司 | Remote sensing unmanned aerial vehicle image processing method and system for intelligent traffic construction data acquisition |
CN117392561B (en) * | 2023-10-07 | 2024-05-14 | 中国公路工程咨询集团有限公司 | Remote sensing unmanned aerial vehicle image processing method and system for intelligent traffic construction data acquisition |
Also Published As
Publication number | Publication date |
---|---|
CN116804882B (en) | 2023-12-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN116804882B (en) | Intelligent unmanned aerial vehicle control system based on stream data processing and unmanned aerial vehicle thereof | |
CN108255198B (en) | Shooting cradle head control system and control method under unmanned aerial vehicle flight state | |
US9491440B2 (en) | Depth-sensing camera system | |
CN103808723A (en) | Exhaust gas blackness automatic detection device for diesel vehicles | |
CN110244314A (en) | One kind " low slow small " target acquisition identifying system and method | |
CN112327906A (en) | Intelligent automatic inspection system based on unmanned aerial vehicle | |
CN110046584B (en) | Road crack detection device and detection method based on unmanned aerial vehicle inspection | |
US20220103799A1 (en) | Image data processing method and apparatus, image processing chip, and aircraft | |
CN114430462B (en) | Unmanned aerial vehicle autonomous photographing parameter adjusting method, device, equipment and storage medium | |
Huang et al. | An underwater image enhancement method for simultaneous localization and mapping of autonomous underwater vehicle | |
CN110596734B (en) | Multi-mode Q learning-based unmanned aerial vehicle positioning interference source system and method | |
CN104580895A (en) | Airborne imaging system with synchronous camera shooting and photographing capacity | |
CN110702016A (en) | Power transmission line icing measurement system and method | |
CN113222838A (en) | Unmanned aerial vehicle autonomous line patrol method based on visual positioning | |
CN114740482B (en) | Underwater explosion positioning method based on combination of acoustics and vision | |
CN110012280B (en) | TOF module for VSLAM system and VSLAM calculation method | |
CN111380833A (en) | Multi-view vision-based impurity removal system | |
CN111311640A (en) | Unmanned aerial vehicle identification and tracking method based on motion estimation | |
CN114693556B (en) | High-altitude parabolic frame difference method moving object detection and smear removal method | |
CN114092522B (en) | Airport plane take-off and landing intelligent capturing and tracking method | |
CN113093799A (en) | Unmanned aerial vehicle power parameter control system based on artificial intelligence and image processing | |
CN113408396A (en) | Bridge intelligent sensing system based on cloud computing | |
CN113406091A (en) | Unmanned aerial vehicle system for detecting fan blade and control method | |
CN208522852U (en) | Recognition of face crime scene investigation device | |
CN112493228A (en) | Laser bird repelling method and system based on three-dimensional information estimation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||