CN107943068A - UAV vision-based self-perceiving swarm system using dual fisheye cameras and control method thereof - Google Patents
UAV vision-based self-perceiving swarm system using dual fisheye cameras and control method thereof - Download PDF
- Publication number
- CN107943068A CN107943068A CN201711002466.5A CN201711002466A CN107943068A CN 107943068 A CN107943068 A CN 107943068A CN 201711002466 A CN201711002466 A CN 201711002466A CN 107943068 A CN107943068 A CN 107943068A
- Authority
- CN
- China
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
- G05D1/104—Simultaneous control of position or course in three dimensions specially adapted for aircraft involving a plurality of aircrafts, e.g. formation flying
Abstract
The invention discloses a UAV vision-based self-perceiving swarm system using dual fisheye cameras, together with its control method. The system consists of multiple UAVs and a ground station. Each UAV comprises a UAV platform and, mounted on the platform, a dual-fisheye vision module, a UAV communication module, and an onboard computer; the platform, the vision module, and the communication module are all connected to the onboard computer. The ground station comprises a ground station communication module connected to a ground station computer; the UAV communication modules of all UAVs communicate wirelessly with the ground station communication module. By processing fisheye camera images, each UAV in the invention autonomously perceives the positions of the other UAVs by vision and plans its own behaviour, eliminating the inter-UAV communication step in which distributed swarm systems exchange position information. This removes the communication-traffic limit on swarm scale and provides a feasible system architecture for building very large distributed swarm systems.
Description
Technical field
The present invention relates to the field of UAV applications, and in particular to a UAV vision-based self-perceiving swarm system using dual fisheye cameras and its control method.
Background technology
With the rapid development of UAV technology in recent years, the tasks undertaken by UAVs in practical applications have become increasingly complex and diverse, and the performance demanded of UAVs keeps rising. However, because a single UAV has limited mass, the equipment it can carry is limited, which caps its performance; single-UAV technology has thus run into a bottleneck.
Compared with developing a high-performance single UAV, using a large number of simple UAVs to complete a task is a feasible alternative. The advantages of a multi-UAV swarm include: high reliability, since the failure of one UAV can be compensated by the others; broad applicability, since multiple UAVs can complete tasks a single UAV cannot, such as three-dimensional measurement of a target; and high efficiency, since multiple UAVs carrying different equipment can split a complex task into simple subtasks that the swarm completes cooperatively.
Control schemes for UAV swarms fall broadly into two kinds: centralized control and distributed control. Centralized control sets up a control centre, in the air or on the ground, that commands the swarm. It is simple to implement, but as the swarm grows, the data the control centre must process increases sharply. In distributed control, every UAV carries a relatively independent controller and can plan its own task based on the states of the other UAVs. Distributed control is harder to implement and is heavily affected by the communication environment, but its scalability is high, making it a research hotspot for UAV swarms.
Distributed UAV swarm systems still have limitations. On the one hand, to avoid collisions, every UAV must obtain the positions of the other UAVs in the swarm in real time; in existing swarm systems, each UAV broadcasts its own position over a digital link and receives the positions of the others. As the swarm grows, communication pressure rises sharply and communication quality degrades, limiting the achievable swarm scale in practice. On the other hand, outdoor UAV swarms generally rely on GPS positioning; when the GPS signal is jammed or lost, a UAV can no longer guarantee a safe distance from the UAVs around it and risks collision.
Summary of the invention
An object of the invention is to address the shortcomings of existing UAV swarms by proposing a UAV vision-based self-perceiving swarm system using dual fisheye cameras. In this system, the UAVs use a binocular fisheye method, in place of GPS and inter-UAV communication, to obtain the positions of the other UAVs, solving the heavy communication and positioning requirements of traditional UAV swarms.
The object of the invention is achieved through the following technical solution: a UAV vision-based self-perceiving swarm system using dual fisheye cameras, consisting of multiple UAVs and a ground station. Each UAV comprises a UAV platform and, mounted on the platform, a dual-fisheye vision module, a UAV communication module, and an onboard computer; the platform, the vision module, and the communication module are all connected to the onboard computer. The ground station comprises a ground station communication module connected to a ground station computer; the UAV communication modules of all UAVs communicate wirelessly with the ground station communication module.
Further, the dual-fisheye vision module comprises a first fisheye camera and a second fisheye camera, mounted on the two sides of the UAV platform and connected to it; both cameras point in the platform's direction of flight.
The invention also provides a control method for the UAV vision-based self-perceiving swarm system using dual fisheye cameras, comprising the following steps:
(1) The operator enters a control instruction for each UAV on the ground station computer, which sends the instructions to the ground station communication module;
(2) the ground station communication module sends each instruction to the corresponding UAV communication module;
(3) the UAV communication module forwards the instruction to the onboard computer;
(4) the dual-fisheye vision module sends image information to the onboard computer;
(5) the onboard computer receives the control instruction and branches on the manual/automatic mode flag it contains: in manual mode it performs step (6.1), in automatic mode step (6.2);
(6.1) manual mode: the onboard computer converts the received manual control instruction into an onboard control instruction and sends it to the UAV platform;
(6.2) automatic mode: the onboard computer receives the images sent by the dual-fisheye vision module, detects the positions of the other UAVs in the images, computes their positions relative to this UAV, generates an automatic control instruction, converts it into an onboard control instruction, and sends it to the UAV platform;
(7) the UAV platform moves according to the onboard control instruction;
(8) the UAV platform sends UAV status information to the onboard computer;
(9) the onboard computer sends the status information to the UAV communication module;
(10) the UAV communication module sends it to the ground station communication module;
(11) the ground station communication module sends it to the ground station computer;
(12) the ground station computer parses the status information sent by each UAV and displays it on screen.
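The message flow above centres on the onboard computer's mode dispatch in steps (5), (6.1), and (6.2). A minimal sketch of that dispatch follows; all class and function names here are hypothetical illustrations, not from the patent:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ControlInstruction:
    mode: str                          # "manual" or "automatic" (step 5 flag)
    velocity: Optional[Tuple] = None   # set-points in the earth-fixed frame
    position: Optional[Tuple] = None
    heading: Optional[float] = None

def onboard_step(instruction, images, detect_peers, plan, to_platform_cmd):
    """One iteration of the onboard computer's dispatch (steps 5, 6.1, 6.2)."""
    if instruction.mode == "manual":
        # (6.1) forward the operator's set-points to the UAV platform
        return to_platform_cmd(instruction.velocity, instruction.position,
                               instruction.heading)
    # (6.2) perceive the other UAVs in the fisheye images and plan autonomously
    peer_positions = detect_peers(images)   # positions relative to this UAV
    velocity, position, heading = plan(peer_positions)
    return to_platform_cmd(velocity, position, heading)
```

The `detect_peers` and `plan` callables stand in for the vision pipeline of step (6.2) and the behaviour planner, which the patent does not specify in code form.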
Further, the dual-fisheye vision module comprises a first fisheye camera and a second fisheye camera, mounted on the two sides of the UAV platform and connected to it; both cameras point in the platform's direction of flight.
Further, in steps (6.1) and (6.2) the onboard computer sends the onboard control instruction to the UAV platform; the onboard control instruction comprises velocity, position, and heading set-points, with velocity, position, and heading expressed in an earth-fixed absolute coordinate frame.
Further, in step (6.1), the manual control instruction comprises velocity, position, and heading set-points.
Further, in step (6.2), the specific method for computing the positions of the other UAVs relative to this UAV is as follows:
(6.2.1) the first and second fisheye cameras each capture a raw image; in the two fisheye images, the in-image positions and the heights and widths of the other UAVs within the field of view are detected, generating UAV detection information;
(6.2.2) the onboard computer receives the raw images and the UAV detection information, matches the UAV positions and sizes between the two fisheye images, and generates UAV pairing information;
(6.2.3) from the pairing information, this UAV computes the angle between another UAV and its first fisheye camera, and the angle between that UAV and its second fisheye camera; since the distance between the two cameras is known, the other UAV's position can be computed by trigonometry;
(6.2.4) repeating steps (6.2.1)-(6.2.3) yields the position of every UAV.
Further, in step (8), the UAV status information comprises the platform's current velocity, position, and heading, expressed in the earth-fixed absolute coordinate frame.
The beneficial effects of the invention are: the invention gives each UAV the ability to autonomously perceive the positions of the other UAVs by vision, eliminating the inter-UAV communication step in which distributed swarm systems exchange position information. This removes the communication-traffic limit on swarm scale, and at the same time reduces the swarm's dependence on GPS positioning, improving swarm safety.
Brief description of the drawings
Fig. 1 is the system framework of the invention;
Fig. 2 is a schematic of the binocular fisheye vision method of the invention;
Fig. 3 is a front view of a UAV of an embodiment of the invention;
Fig. 4 is a structural diagram of an embodiment of the invention.
Reference numerals: UAV platform 1, dual-fisheye vision module 2, UAV communication module 3, onboard computer 4, ground station communication module 5, ground station computer 6, first fisheye camera 7, second fisheye camera 8, first UAV 9, second UAV 10, third UAV 11, fourth UAV 12.
Detailed description of embodiments
The invention is further described below with reference to the accompanying drawings:
As shown in Fig. 1, Fig. 3, and Fig. 4, a UAV vision-based self-perceiving swarm system using dual fisheye cameras consists of multiple UAVs and a ground station. Each UAV comprises a UAV platform 1 and, mounted on it, a dual-fisheye vision module 2, a UAV communication module 3, and an onboard computer 4; the platform 1, the vision module 2, and the communication module 3 are connected to the onboard computer 4. The ground station comprises a ground station communication module 5 connected to a ground station computer 6; the UAV communication modules 3 of all UAVs communicate wirelessly with the ground station communication module 5.
As shown in Fig. 3 and Fig. 4, the dual-fisheye vision module 2 comprises a first fisheye camera 7 and a second fisheye camera 8, mounted on the two sides of the UAV platform 1 and connected to it; both cameras point in the flight direction of platform 1, and each captures image information and sends it to the onboard computer 4.
In this embodiment, the UAV platform 1 is the DJI M100, but is not limited to this;
the UAV communication module 3 is the MaxStream XBee Pro radio module, but is not limited to this;
the onboard computer 4 is the NVIDIA Jetson TX2 module, but is not limited to this;
the ground station communication module 5 is the MaxStream XBee Pro radio module, but is not limited to this;
the ground station computer 6 is the Dell Precision M7510, but is not limited to this;
the first fisheye camera 7 and the second fisheye camera 8 are SY019HD modules from the Weixin Vision company, but are not limited to this.
The invention also provides a control method for the UAV vision-based self-perceiving swarm system using dual fisheye cameras, comprising the following steps:
(1) The operator enters a control instruction for each UAV by keyboard on the ground station computer 6, which sends the instructions to the ground station communication module 5;
(2) the ground station communication module 5 sends each instruction to the corresponding UAV communication module 3;
(3) the UAV communication module 3 forwards the instruction to the onboard computer 4;
(4) the dual-fisheye vision module 2 sends image information to the onboard computer 4;
(5) the onboard computer 4 receives the control instruction and branches on the manual/automatic mode flag it contains: in manual mode it performs step (6.1), in automatic mode step (6.2);
(6.1) manual mode: the onboard computer 4 converts the received manual control instruction into an onboard control instruction and sends it to the UAV platform 1. The manual control instruction comprises velocity, position, and heading set-points and is entered by keyboard on the ground station computer 6; the onboard control instruction comprises velocity, position, and heading set-points, expressed in the earth-fixed absolute coordinate frame.
(6.2) automatic mode: the onboard computer 4 receives the images sent by the dual-fisheye vision module 2, detects the positions of the other UAVs in the images, computes their positions relative to this UAV, generates an automatic control instruction, converts it into an onboard control instruction, and sends it to the UAV platform 1;
(7) the UAV platform 1 moves according to the onboard control instruction;
(8) the UAV platform 1 sends UAV status information to the onboard computer 4;
(9) the onboard computer 4 sends the status information to the UAV communication module 3;
(10) the UAV communication module 3 sends it to the ground station communication module 5;
(11) the ground station communication module 5 sends it to the ground station computer 6;
(12) the ground station computer 6 parses the status information sent by each UAV and displays it on screen.
In step (6.2), the specific method for computing the positions of the other UAVs relative to this UAV is as follows:
(6.2.1) The first fisheye camera 7 and the second fisheye camera 8 each capture a raw image and send it to the onboard computer 4; the two images are captured at the same instant. In each image, a deep-learning object detector such as SSD or Faster R-CNN detects the x coordinate, y coordinate, height, and width in the image of each other UAV within the field of view; the x coordinate, y coordinate, height, and width of one UAV in an image are recorded as one item of UAV detection information.
(6.2.2) The detection information generated from the image of the first fisheye camera 7 is matched against that generated from the image of the second fisheye camera 8.
Suppose the image of the first fisheye camera 7 yields m items of detection information, the i-th denoted Di, and the image of the second fisheye camera 8 yields n items, the j-th denoted Dj. Computing the dissimilarity between each of the m items from the first camera and each of the n items from the second camera, m*n values in all, gives an m x n dissimilarity matrix C. The dissimilarity Cij between Di and Dj is computed as:
Cij = k1*|hi*wi - hj*wj| + k2*|xi - xj| + k3*|yi - yj|
where k1, k2, and k3 are weighting coefficients; xi, yi, hi, and wi are the x coordinate, y coordinate, height, and width in Di, and xj, yj, hj, and wj are the x coordinate, y coordinate, height, and width in Dj.
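The dissimilarity Cij can be sketched as follows. Detections are (x, y, h, w) tuples; the weights k1, k2, k3 are the patent's coefficients, with the default values here purely illustrative:

```python
def dissimilarity(di, dj, k1=1.0, k2=1.0, k3=1.0):
    """Cij = k1*|hi*wi - hj*wj| + k2*|xi - xj| + k3*|yi - yj|.
    Each detection is an (x, y, h, w) tuple in image coordinates."""
    xi, yi, hi, wi = di
    xj, yj, hj, wj = dj
    return (k1 * abs(hi * wi - hj * wj)   # area difference
            + k2 * abs(xi - xj)           # horizontal offset
            + k3 * abs(yi - yj))          # vertical offset

def cost_matrix(dets_a, dets_b, **k):
    """m x n matrix of dissimilarities between the two cameras' detections."""
    return [[dissimilarity(da, db, **k) for db in dets_b] for da in dets_a]
```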
The UAV pairing information is obtained from the dissimilarity matrix C as follows:
(a) among the unmasked elements of C, find the minimum element Cmin and pair the Di and Dj it corresponds to;
(b) mask all elements of row i and column j of C; if every element of C is masked, stop; otherwise go to (a).
Writing min(m, n) for the smaller of m and n, this procedure yields min(m, n) pairs of detections, recorded as the UAV pairing information.
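The masking procedure in steps (a)-(b) amounts to a greedy minimum-cost pairing; a minimal sketch, assuming C is a non-empty list of equal-length rows:

```python
def greedy_pair(C):
    """Greedy pairing over dissimilarity matrix C, following steps (a)-(b):
    repeatedly take the smallest unmasked entry, record the pair, and mask its
    row and column. Returns min(m, n) (row, col) index pairs."""
    m, n = len(C), len(C[0])
    free_rows, free_cols = set(range(m)), set(range(n))
    pairs = []
    while free_rows and free_cols:
        # (a) smallest element among unmasked rows and columns
        i, j = min(((i, j) for i in free_rows for j in free_cols),
                   key=lambda ij: C[ij[0]][ij[1]])
        pairs.append((i, j))   # Di from camera 1 pairs with Dj from camera 2
        # (b) mask row i and column j
        free_rows.discard(i)
        free_cols.discard(j)
    return pairs
```

Note that this greedy scheme is what the patent describes; a globally optimal assignment would instead use the Hungarian algorithm, which the patent does not require.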
(6.2.3) As shown in Fig. 2, using a pinhole camera model, this UAV computes from the pairing information the angle between another UAV and its first fisheye camera, and the angle between that UAV and its second fisheye camera; since the distance between the first fisheye camera and the second fisheye camera is known, the other UAV's position can be computed by trigonometry.
(6.2.4) Repeating steps (6.2.1)-(6.2.3) yields the position of every UAV.
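The trigonometric step in (6.2.3) can be sketched under a simplifying planar assumption (ours, not the patent's exact formulation): the two cameras sit a known baseline apart, and each supplies a bearing angle to the target measured from the baseline.

```python
import math

def triangulate(baseline, alpha, beta):
    """Return (x, y) of the target, with camera 1 at (0, 0) and camera 2 at
    (baseline, 0); alpha and beta are the bearings (radians) from each camera,
    measured from the baseline toward the target.

    Ray from camera 1: y = x * tan(alpha)
    Ray from camera 2: y = (baseline - x) * tan(beta)
    Intersecting them: x = baseline * tan(beta) / (tan(alpha) + tan(beta))."""
    ta, tb = math.tan(alpha), math.tan(beta)
    x = baseline * tb / (ta + tb)
    y = x * ta
    return x, y
```

In practice, the bearings would come from the fisheye cameras' calibrated projection models, and the patent's actual computation is three-dimensional; this planar version only illustrates the baseline-plus-two-angles principle.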
In the above control method, the UAV status information comprises the current velocity, position, and heading of the UAV platform 1, expressed in the earth-fixed absolute coordinate frame.
The specific embodiment above illustrates the invention; it is merely a preferred embodiment and does not limit the invention. Any modification, equivalent substitution, or improvement made to the invention within its spirit and the scope of the claims falls within the protection scope of the invention.
Claims (8)
- 1. A UAV vision-based self-perceiving swarm system using dual fisheye cameras, characterised in that it consists of multiple UAVs and a ground station; each UAV comprises a UAV platform and, mounted on the platform, a dual-fisheye vision module, a UAV communication module, and an onboard computer, the platform, the vision module, and the communication module being connected to the onboard computer; the ground station comprises a ground station communication module connected to a ground station computer; the UAV communication modules of all UAVs communicate wirelessly with the ground station communication module.
- 2. The UAV vision-based self-perceiving swarm system using dual fisheye cameras of claim 1, characterised in that the dual-fisheye vision module comprises a first fisheye camera and a second fisheye camera, mounted on the two sides of the UAV platform and connected to it, both cameras pointing in the platform's direction of flight.
- 3. A control method for the UAV vision-based self-perceiving swarm system using dual fisheye cameras of claim 1, characterised in that the method comprises the following steps: (1) the operator enters a control instruction for each UAV on the ground station computer, which sends the instructions to the ground station communication module; (2) the ground station communication module sends each instruction to the corresponding UAV communication module; (3) the UAV communication module forwards the instruction to the onboard computer; (4) the dual-fisheye vision module sends image information to the onboard computer; (5) the onboard computer receives the control instruction and branches on the manual/automatic mode flag it contains: in manual mode it performs step (6.1), in automatic mode step (6.2); (6.1) manual mode: the onboard computer converts the received manual control instruction into an onboard control instruction and sends it to the UAV platform; (6.2) automatic mode: the onboard computer receives the images sent by the dual-fisheye vision module, detects the positions of the other UAVs in the images, computes their positions relative to this UAV, generates an automatic control instruction, converts it into an onboard control instruction, and sends it to the UAV platform; (7) the UAV platform moves according to the onboard control instruction; (8) the UAV platform sends UAV status information to the onboard computer; (9) the onboard computer sends the status information to the UAV communication module; (10) the UAV communication module sends it to the ground station communication module; (11) the ground station communication module sends it to the ground station computer; (12) the ground station computer parses the status information sent by each UAV and displays it on screen.
- 4. The control method of claim 3, characterised in that the dual-fisheye vision module comprises a first fisheye camera and a second fisheye camera, mounted on the two sides of the UAV platform and connected to it, both cameras pointing in the platform's direction of flight.
- 5. The control method of claim 3, characterised in that in steps (6.1) and (6.2) the onboard computer sends the onboard control instruction to the UAV platform; the onboard control instruction comprises velocity, position, and heading set-points, expressed in an earth-fixed absolute coordinate frame.
- 6. The control method of claim 3, characterised in that in step (6.1) the manual control instruction comprises velocity, position, and heading set-points.
- 7. The control method of claim 4, characterised in that in step (6.2) the specific method for computing the positions of the other UAVs relative to this UAV is as follows: (6.2.1) the first and second fisheye cameras each capture a raw image; in the two fisheye images, the in-image positions and the heights and widths of the other UAVs within the field of view are detected, generating UAV detection information; (6.2.2) the onboard computer receives the raw images and the detection information, matches the UAV positions and sizes between the two fisheye images, and generates UAV pairing information; (6.2.3) from the pairing information this UAV computes the angle between another UAV and its first fisheye camera, and the angle between that UAV and its second fisheye camera; since the distance between the two cameras is known, the other UAV's position can be computed by trigonometry; (6.2.4) repeating steps (6.2.1)-(6.2.3) yields the position of every UAV.
- 8. The control method of claim 3, characterised in that in step (8) the UAV status information comprises the current velocity, position, and heading of the UAV platform, expressed in an earth-fixed absolute coordinate frame.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711002466.5A CN107943068A (en) | 2017-10-24 | 2017-10-24 | UAV vision-based self-perceiving swarm system using dual fisheye cameras and control method thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711002466.5A CN107943068A (en) | 2017-10-24 | 2017-10-24 | UAV vision-based self-perceiving swarm system using dual fisheye cameras and control method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107943068A true CN107943068A (en) | 2018-04-20 |
Family
ID=61936450
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711002466.5A Pending CN107943068A (en) | 2017-10-24 | 2017-10-24 | UAV vision-based self-perceiving swarm system using dual fisheye cameras and control method thereof
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107943068A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109792951A (en) * | 2019-02-21 | 2019-05-24 | 华南农业大学 | UAV flight-path correction system for hybrid rice pollination and correction method thereof |
CN112539732A (en) * | 2020-12-04 | 2021-03-23 | 杭州电子科技大学 | Unmanned aerial vehicle cluster state and trajectory data acquisition platform |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104748696A (en) * | 2015-04-13 | 2015-07-01 | 西安交通大学 | Measuring method for full field deformation of large-dip-angle wing |
CN105159317A (en) * | 2015-09-14 | 2015-12-16 | 深圳一电科技有限公司 | Unmanned plane and control method |
CN105501457A (en) * | 2015-12-16 | 2016-04-20 | 南京航空航天大学 | Infrared vision based automatic landing guidance method and system applied to fixed-wing UAV (unmanned aerial vehicle) |
CN105793792A (en) * | 2014-12-25 | 2016-07-20 | 深圳市大疆创新科技有限公司 | Flight auxiliary method and system of unmanned aerial vehicle, unmanned aerial vehicle, and mobile terminal |
CN105955067A (en) * | 2016-06-03 | 2016-09-21 | 哈尔滨工业大学 | Multi-satellite intelligent cluster control simulation system based on quadrotor unmanned planes, and simulation method using the same to implement |
CN106249752A (en) * | 2016-08-31 | 2016-12-21 | 中测新图(北京)遥感技术有限责任公司 | A kind of unmanned plane networking flight monitoring and collaborative collision avoidance method and device |
WO2017017675A1 (en) * | 2015-07-28 | 2017-02-02 | Margolin Joshua | Multi-rotor uav flight control method and system |
CN106384382A (en) * | 2016-09-05 | 2017-02-08 | 山东省科学院海洋仪器仪表研究所 | Three-dimensional reconstruction system and method based on binocular stereoscopic vision |
CN106444423A (en) * | 2016-09-30 | 2017-02-22 | 天津大学 | Indoor multi unmanned aerial vehicle formation flight simulation verification platform and achieving method thereof |
CN106444810A (en) * | 2016-10-31 | 2017-02-22 | 浙江大学 | Unmanned plane mechanical arm aerial operation system with help of virtual reality, and control method for unmanned plane mechanical arm aerial operation system |
2017
- 2017-10-24 CN CN201711002466.5A patent/CN107943068A/en active Pending
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105793792A (en) * | 2014-12-25 | 2016-07-20 | 深圳市大疆创新科技有限公司 | Flight auxiliary method and system of unmanned aerial vehicle, unmanned aerial vehicle, and mobile terminal |
CN104748696A (en) * | 2015-04-13 | 2015-07-01 | 西安交通大学 | Measuring method for full field deformation of large-dip-angle wing |
WO2017017675A1 (en) * | 2015-07-28 | 2017-02-02 | Margolin Joshua | Multi-rotor uav flight control method and system |
CN105159317A (en) * | 2015-09-14 | 2015-12-16 | 深圳一电科技有限公司 | Unmanned plane and control method |
CN105501457A (en) * | 2015-12-16 | 2016-04-20 | 南京航空航天大学 | Infrared vision based automatic landing guidance method and system applied to fixed-wing UAV (unmanned aerial vehicle) |
CN105955067A (en) * | 2016-06-03 | 2016-09-21 | 哈尔滨工业大学 | Multi-satellite intelligent cluster control simulation system based on quadrotor unmanned planes, and simulation method using the same to implement |
CN106249752A (en) * | 2016-08-31 | 2016-12-21 | 中测新图(北京)遥感技术有限责任公司 | A kind of unmanned plane networking flight monitoring and collaborative collision avoidance method and device |
CN106384382A (en) * | 2016-09-05 | 2017-02-08 | 山东省科学院海洋仪器仪表研究所 | Three-dimensional reconstruction system and method based on binocular stereoscopic vision |
CN106444423A (en) * | 2016-09-30 | 2017-02-22 | 天津大学 | Indoor multi unmanned aerial vehicle formation flight simulation verification platform and achieving method thereof |
CN106444810A (en) * | 2016-10-31 | 2017-02-22 | 浙江大学 | Unmanned plane mechanical arm aerial operation system with help of virtual reality, and control method for unmanned plane mechanical arm aerial operation system |
Non-Patent Citations (2)
Title |
---|
NICOLA KROMBACH et al.: "Evaluation of Stereo Algorithms for Obstacle Detection with Fisheye Lenses", International Conference on Unmanned Aerial Vehicles in Geomatics *
CAO Yuqi: "Research on a Cooperative Control Strategy for Safe Communication Distance among Multiple UAVs", China Master's Theses Full-text Database, Engineering Science and Technology II *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103049912B (en) | Random trihedron-based radar-camera system external parameter calibration method | |
CN106426186A (en) | Electrified operation robot autonomous operation method based on multi-sensor information fusion | |
CN102339021B (en) | UAV(unmanned aerial vehicle) visual simulation system and simulation method | |
CN113485392A (en) | Virtual reality interaction method based on digital twins | |
CN102176161B (en) | Flight simulation system facing to power line polling | |
CN106454209A (en) | Unmanned aerial vehicle emergency quick action data link system and unmanned aerial vehicle emergency quick action monitoring method based on spatial-temporal information fusion technology | |
CN112115607B (en) | Mobile intelligent body digital twin system based on multidimensional microblog space | |
EP3879443A2 (en) | Method and apparatus for recognizing wearing state of safety belt, electronic device, and storage medium | |
CN104457704A (en) | System and method for positioning ground targets of unmanned planes based on enhanced geographic information | |
CN110047150A (en) | It is a kind of based on augmented reality complex device operation operate in bit emulator system | |
US20200012756A1 (en) | Vision simulation system for simulating operations of a movable platform | |
CN106530293A (en) | Manual assembly visual detection error prevention method and system | |
Krückel et al. | Intuitive visual teleoperation for UGVs using free-look augmented reality displays | |
CN108846891B (en) | Man-machine safety cooperation method based on three-dimensional skeleton detection | |
CN107071297A (en) | A kind of virtual reality system that logical computer room displaying is believed for electric power | |
CN110889873A (en) | Target positioning method and device, electronic equipment and storage medium | |
CN104103081A (en) | Virtual multi-camera target tracking video material generation method | |
CN113566825B (en) | Unmanned aerial vehicle navigation method, system and storage medium based on vision | |
CN104932523A (en) | Positioning method and apparatus for unmanned aerial vehicle | |
CN106780337B (en) | Unmanned aerial vehicle carrier landing visual simulation method based on two-dimensional image | |
CN104680528A (en) | Space positioning method of explosive-handling robot based on binocular stereo vision | |
CN111666876A (en) | Method and device for detecting obstacle, electronic equipment and road side equipment | |
CN107943068A (en) | A kind of unmanned plane vision by Pisces eye is autologous to perceive group system and its control method | |
CN111208842A (en) | Virtual unmanned aerial vehicle and entity unmanned aerial vehicle mixed cluster task control system | |
CN109885091B (en) | Unmanned aerial vehicle autonomous flight control method and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | | Application publication date: 20180420 |