CN106371460A - Target searching method and apparatus - Google Patents

Target searching method and apparatus

Info

Publication number
CN106371460A
CN106371460A (application CN201610806325.8A)
Authority
CN
China
Prior art keywords
target object
unmanned aerial vehicle (UAV)
object
panoramic image
Prior art date
Application number
CN201610806325.8A
Other languages
Chinese (zh)
Inventor
滕龙
孙晓刚
李泽塬
任骥
张弟
Original Assignee
四川天辰智创科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 四川天辰智创科技有限公司
Priority to CN201610806325.8A
Publication of CN106371460A

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/12: Target-seeking control

Abstract

The invention provides a target searching method and apparatus. The method comprises the following steps: an unmanned aerial vehicle (UAV) obtains a panoramic image and sends it to an auxiliary driving device; the auxiliary driving device receives the panoramic image and performs visual scene analysis on it to obtain the scene and the objects in the image; target recognition is performed on the objects in that scene, and when a target object is present, its orientation information is sent to the UAV; the UAV performs distance detection along the orientation of the target object to obtain the distance between the target object and the UAV; and the UAV approaches the target object according to its orientation and that distance. The method solves the problem that, when the target object moves, the UAV cannot obtain its position in real time, making the target hard to lock onto and easy to lose. The method thus allows the target object to be locked onto continuously.

Description

Method and device for finding a target

Technical field

The present invention relates to the field of UAV automation, and in particular to a method and device for finding a target using an unmanned aerial vehicle (UAV).

Background technology

In recent years, UAVs have been a very active research field. Through equipment such as radar, remote control or communication devices, a UAV can be tracked, remotely controlled, or used for digital transmission. A small UAV can also take off and land vertically, and hover, within a narrow space, which makes it an ideal platform for tasks such as reconnaissance, surveillance and tracking. In the civil field, the camera equipment a UAV carries can be used for imaging and high-altitude aerial photography, and a UAV can be dispatched to areas inconvenient for people to reach to complete special tasks. The prospects for UAVs are extremely broad.

In the prior art, however, a UAV's position information relies mainly on GPS correction; it cannot lock onto a target directly and must depend on third-party technology to lock onto the target object before correcting its own position to approach the target. When the target is a moving object in particular, locking onto it is even more difficult, and the locking process is also limited by the strength of the GPS signal.

Content of the invention

In view of this, the purpose of the first preferred embodiment of the present invention is to provide a method for finding a target that can lock onto a target object autonomously. The method is applied to a UAV and an auxiliary driving device that are communicatively connected to each other, and comprises:

the UAV obtains a panoramic image and sends the panoramic image to the auxiliary driving device;

the auxiliary driving device receives the panoramic image and performs visual scene analysis on the panoramic image to obtain the scene and the objects in the panoramic image;

target recognition is performed on the objects in the scene corresponding to the panoramic image, and when a target object is present, the orientation information of the target object is sent to the UAV;

the UAV performs distance detection along the orientation in which the target object lies, obtaining the distance between the target object and the UAV;

the UAV approaches the target object according to the orientation of the target object and the distance between the target object and the UAV.

In the first preferred embodiment of the present invention, the manner of approaching the target object includes:

monitoring the target object within a preset distance range of the target object, and/or perching on the target object.

In the first preferred embodiment of the present invention, in the step in which the UAV obtains the panoramic image:

the UAV obtains the panoramic image through two panoramic optical cameras mounted on the UAV.

In the first preferred embodiment of the present invention, in the step in which the UAV performs distance detection along the orientation in which the target object lies and obtains the distance between the target object and the UAV:

the UAV performs the distance detection through a continuous-wave radar mounted on the UAV.

The second preferred embodiment of the present invention further provides a method for finding a target, applied to an auxiliary driving device communicatively connected to a UAV, the method comprising:

receiving the panoramic image obtained by the UAV;

performing visual scene analysis on the panoramic image to obtain the scene and the objects in the panoramic image;

performing target recognition on the objects in the scene corresponding to the panoramic image and, when a target object is present, sending the orientation information of the target object to the UAV, so that the UAV approaches the target object according to the orientation of the target object and the detected distance between the target object and the UAV.

In the second preferred embodiment of the present invention, the step in which the auxiliary driving device performs visual scene analysis on the panoramic image and obtains the scene and the objects includes:

performing visual scene analysis on the panoramic image using a neural network algorithm based on deep learning, to obtain the scene and the objects in the panoramic image.

In the second preferred embodiment of the present invention, the method further includes:

storing, in advance, the target objects corresponding to different scenes in a database;

in the step of performing target recognition on the objects in the scene corresponding to the panoramic image,

comparing the objects in the panoramic image against the target objects pre-stored in the database.

The third preferred embodiment of the present invention further provides a device for finding a target, applied to an auxiliary driving device communicatively connected to a UAV, the device comprising:

a receiving module, configured to receive the panoramic image obtained by the UAV;

a visual scene analysis module, configured to perform visual scene analysis on the panoramic image and obtain the scene and the objects in the panoramic image;

a target recognition module, configured to perform target recognition on the objects in the scene corresponding to the panoramic image and, when a target object is present, send the orientation information of the target object to the UAV, so that the UAV approaches the target object according to the orientation of the target object and the detected distance between the target object and the UAV.

In the third preferred embodiment of the present invention, the visual scene analysis module performs visual scene analysis on the panoramic image using a neural network algorithm based on deep learning, obtaining the scene and the objects in the panoramic image.

In the third preferred embodiment of the present invention, furthermore:

the target recognition module identifies target objects by comparing the objects in the scene corresponding to the panoramic image against the target objects pre-stored in a database.

Compared with the prior art, the method and device for finding a target provided by the embodiments of the present invention have the following beneficial effects:

Visual scene analysis is performed on the captured panoramic image to obtain the scene and the objects in it, and the target object in that scene is obtained through target recognition. The target object is then locked onto: when the target object moves, the UAV can track it in real time, determine its position, and approach it, avoiding the situation in which a moving target is hard for the UAV to lock onto and is consequently lost.

Brief description

To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings show only certain embodiments of the present invention and should therefore not be regarded as limiting its scope; those of ordinary skill in the art can derive other related drawings from these drawings without creative effort.

Fig. 1 is a schematic diagram of the interaction between the UAV and the auxiliary driving device provided by an embodiment of the present invention.

Fig. 2 is a block diagram of the auxiliary driving device provided by an embodiment of the present invention.

Fig. 3 is a flow chart of the method for finding a target provided by the first embodiment of the present invention.

Fig. 4 is a flow chart of the method for finding a target provided by the second embodiment of the present invention.

Fig. 5 is a functional block diagram of the device for finding a target provided by the third embodiment of the present invention.

Reference numerals: 100 - UAV; 200 - auxiliary driving device; 210 - device for finding a target; 211 - memory; 212 - storage controller; 213 - processor; 214 - peripheral interface; 215 - input/output unit; 217 - display unit; 219 - communication unit; 2101 - receiving module; 2102 - visual scene analysis module; 2103 - target recognition module; 300 - network.

Specific embodiment

To make the purpose, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described clearly and completely below with reference to the accompanying drawings. The described embodiments are obviously only some, not all, of the embodiments of the present invention. The components of the embodiments of the present invention, as generally described and illustrated in the drawings herein, can be arranged and designed in a variety of different configurations.

Therefore, the following detailed description of the embodiments of the present invention provided in the drawings is not intended to limit the scope of the claimed invention, but merely represents selected embodiments of it. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.

It should also be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item has been defined in one drawing, it need not be defined and explained again in subsequent drawings.

In the description of the present invention, it should be noted that, unless otherwise clearly specified and limited, terms such as "mounted" and "connected" are to be understood broadly: a connection may, for example, be fixed, detachable or integral; mechanical or electrical; direct, indirect through an intermediary, or internal between two elements. For those of ordinary skill in the art, the specific meaning of the above terms in the present invention can be understood according to the specific circumstances.

Referring to Fig. 1, Fig. 1 is a schematic diagram of the interaction between the UAV 100 and the auxiliary driving device 200 provided by a preferred embodiment of the present invention. The UAV 100 can communicate with the auxiliary driving device 200 through a network 300, to realize data communication or interaction between the UAV 100 and the auxiliary driving device 200.

The UAV 100 may be, but is not limited to, a multi-rotor UAV, a fixed-wing UAV, an umbrella-wing UAV, and so on. Hardware such as an autopilot and control devices can be mounted on the UAV 100.

The UAV 100 includes a fuselage and wings. Inside the fuselage are a drive device, a wireless communication device, a power supply, and so on. The wireless communication device is used for data communication between the UAV 100 and the auxiliary driving device 200. For example, the flight information of the UAV 100 can be transmitted to the auxiliary driving device 200 through the wireless communication device, and the UAV 100 can receive control commands sent by the auxiliary driving device 200 through the wireless communication device.

A gimbal is provided on the UAV 100 for carrying other equipment, such as an image acquisition device (for example, an optical camera) and a distance measuring device (for example, a radar). By changing the orientation of the gimbal, the field of view for image acquisition and the ranging range can be changed.

Referring to Fig. 2, Fig. 2 is a block diagram of the auxiliary driving device 200. The auxiliary driving device 200 includes a device 210 for finding a target, a memory 211, a storage controller 212, a processor 213, a peripheral interface 214, an input/output unit 215, a display unit 217, and a communication unit 219.

The memory 211, storage controller 212, processor 213, peripheral interface 214, input/output unit 215, display unit 217 and communication unit 219 are electrically connected to each other directly or indirectly, to realize data transmission or interaction. For example, these elements can be electrically connected to each other through one or more communication buses or signal lines. The device 210 for finding a target includes at least one software function module that can be stored in the memory 211 in the form of software or firmware, or solidified in the operating system (OS) of the auxiliary driving device 200. The processor 213 is used to execute the executable modules stored in the memory 211, for example the software function modules and computer programs included in the device 210 for finding a target.

The memory 211 may be, but is not limited to, random access memory (RAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and so on. The memory 211 is used to store programs or data. The communication unit 219 is used to establish the communication connection between the UAV 100 and the auxiliary driving device 200 through the network 300, and to send and receive data through the network 300.

The processor 213 may be an integrated circuit chip with signal processing capability. It may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), etc.; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, capable of implementing or executing the methods, steps and logic diagrams disclosed in the embodiments of the present invention. The general-purpose processor may be a microprocessor or any conventional processor.

The peripheral interface 214 couples the various input/output devices (such as the input/output unit 215 and the display unit 217) to the processor 213 and the memory 211. In some embodiments, the peripheral interface 214, the processor 213 and the storage controller 212 can be implemented in a single chip; in other examples, they can each be implemented by an independent chip.

The input/output unit 215 is used to let the user input control instructions, so that the user can remotely control the UAV 100 through the auxiliary driving device 200. The input/output unit 215 may be, but is not limited to, a mouse, a keyboard, and so on.

The display unit 217 provides an interactive interface (for example, a user operation interface) between the auxiliary driving device 200 and the user, for displaying the flight status information of the UAV 100. In this embodiment, the display unit 217 may be a liquid crystal display or a touch display. A touch display may be a capacitive or resistive touch screen supporting single-point and multi-point touch operation, meaning that the touch display can sense touch operations produced at one or more positions on it and hand the sensed touch operations to the processor 213 for processing and calculation.

The communication unit 219 is used to establish a connection with the wireless communication device of the UAV 100 through the network 300, thereby realizing the communication connection between the UAV 100 and the auxiliary driving device 200. For example, the communication unit 219 can connect to the network 300 using a radio-frequency signal, and then establish a communication connection with the wireless communication device of the UAV 100 through the network 300.

First embodiment

Referring to Fig. 3, Fig. 3 is a flow chart of the method for finding a target provided by the first embodiment of the present invention. The method is applied to a UAV 100 and an auxiliary driving device 200 that are communicatively connected to each other. The specific flow of the method is elaborated below.

Step s111: the UAV 100 obtains a panoramic image and sends the panoramic image to the auxiliary driving device 200.

In this embodiment, two panoramic optical cameras can be provided on the UAV 100 to collect a panoramic image of the UAV's surroundings. Specifically, the two panoramic optical cameras can be mounted facing opposite directions, so as to achieve image acquisition without blind angles. In this embodiment, preferably, one panoramic optical camera is arranged on the propeller side of the UAV 100 to capture the sky, and the other is arranged on the belly of the UAV 100 to capture the ground. When arranging the panoramic optical cameras, care should also be taken that no part of the UAV 100 itself blocks their field of view. The images collected by the two panoramic optical cameras are fused to obtain the panoramic image of the UAV 100, which is then sent to the auxiliary driving device 200 through the wireless communication device.
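The patent does not specify how the two camera frames are fused. As a hedged illustration only, the sketch below joins a sky-facing frame and a ground-facing frame with a linear blend at the seam rows; it assumes both cameras deliver equally sized, already de-warped frames (a real panoramic pipeline would also handle lens distortion and registration), and the function name is mine, not the patent's.

```python
import numpy as np

def fuse_panorama(top_img: np.ndarray, bottom_img: np.ndarray, overlap: int = 8) -> np.ndarray:
    """Fuse the sky-facing and ground-facing camera frames into one panorama.

    Both inputs are H x W x 3 arrays covering complementary hemispheres;
    the seam rows are linearly blended to hide the boundary.
    """
    if top_img.shape != bottom_img.shape:
        raise ValueError("both camera frames must share one resolution")
    h = top_img.shape[0]
    # Linear blend weights for the overlapping seam rows (1 -> 0 from top to bottom).
    alpha = np.linspace(1.0, 0.0, overlap)[:, None, None]
    seam = alpha * top_img[h - overlap:] + (1 - alpha) * bottom_img[:overlap]
    return np.concatenate(
        [top_img[:h - overlap], seam, bottom_img[overlap:]], axis=0
    ).astype(top_img.dtype)

top = np.full((64, 128, 3), 200.0)     # bright "sky" frame (synthetic)
bottom = np.full((64, 128, 3), 60.0)   # darker "ground" frame (synthetic)
pano = fuse_panorama(top, bottom)
```

The blend removes the hard edge where the two hemispheres meet; in practice the overlap width would be chosen from the cameras' actual field-of-view overlap.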

Step s112: the auxiliary driving device 200 receives the panoramic image and performs visual scene analysis on it to obtain the scene and the objects in the panoramic image.

In this embodiment, the auxiliary driving device 200 performs visual scene analysis on the panoramic image using a deep-learning neural network algorithm, to obtain the scene and the objects in the panoramic image.

The deep-learning neural network algorithm is a common recognition algorithm in pattern recognition. Using it, the auxiliary driving device 200 can tell the scene in which the UAV is located (for example, sky, city or desert) and at the same time recognize the objects in that scene (for example, people, animals and utility poles).
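The patent names only "a neural network algorithm based on deep learning" without giving an architecture. Purely to illustrate the kind of forward pass such a scene classifier performs, here is a toy two-layer network with random, untrained weights; the feature vector, layer sizes and scene labels are all assumptions for demonstration and carry no trained meaning.

```python
import numpy as np

SCENES = ["sky", "city", "desert"]  # illustrative labels, not from the patent

def softmax(z: np.ndarray) -> np.ndarray:
    e = np.exp(z - z.max())
    return e / e.sum()

def classify_scene(features, w1, b1, w2, b2) -> str:
    """One forward pass of a toy two-layer network: feature vector -> scene label."""
    hidden = np.maximum(0.0, features @ w1 + b1)   # ReLU hidden layer
    probs = softmax(hidden @ w2 + b2)              # class probabilities
    return SCENES[int(np.argmax(probs))]

rng = np.random.default_rng(0)
w1, b1 = rng.normal(size=(8, 16)), np.zeros(16)    # untrained weights
w2, b2 = rng.normal(size=(16, 3)), np.zeros(3)
label = classify_scene(rng.normal(size=8), w1, b1, w2, b2)
```

A production system would replace this with a trained convolutional network operating on the panoramic image itself; the sketch only shows the layer-by-layer computation the text alludes to.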

Step s113: target recognition is performed on the objects in the panoramic image, and when a target object is present, the orientation information of the target object is sent to the UAV 100.

After the auxiliary driving device 200 has performed the visual scene analysis, it performs target recognition on the objects in the scene.

Specifically, in this embodiment, a database is pre-stored in the auxiliary driving device 200. The database stores different scenes and their corresponding target objects; for example, when the scene is the perimeter of a bank, a masked person or a person holding a weapon near the bank is a target object.

During target recognition, the objects in the panoramic image are compared against the target objects pre-stored in the database. A successful match indicates that a target object is present; otherwise, none is.
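The scene-conditioned database comparison described above can be sketched as a simple lookup. The scene keys and target labels below are illustrative assumptions (only the bank-perimeter example appears in the patent), and the function name is mine.

```python
# Pre-stored database: scene -> set of object labels that count as targets there.
TARGET_DB = {
    "bank_perimeter": {"masked person", "armed person"},  # example from the patent
    "powerline_corridor": {"broken insulator"},           # assumed extra scene
}

def find_targets(scene: str, detected_objects: list) -> list:
    """Return the detected objects that the database marks as targets for this scene."""
    wanted = TARGET_DB.get(scene, set())
    return [obj for obj in detected_objects if obj in wanted]

hits = find_targets("bank_perimeter", ["pedestrian", "masked person", "car"])
```

A non-empty result corresponds to the "comparison successful" branch in the text, after which the target's orientation would be sent to the UAV.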

Step s114: the UAV 100 performs distance detection along the orientation in which the target object lies, obtaining the distance between the target object and the UAV 100.

In this embodiment, the UAV 100 is provided with a gimbal, and a continuous-wave radar is fixed on the gimbal. A continuous-wave radar is a radar that emits electromagnetic waves continuously; it is suited to measuring targets moving at any speed within a suitable distance. In this embodiment, a millimeter-wave continuous-wave radar can be used to detect the distance of the target object.

After receiving the orientation of the target object sent by the auxiliary driving device 200, the UAV 100 adjusts the gimbal so that the continuous-wave radar is aimed at the orientation in which the target object lies. Once the continuous-wave radar is aimed at the target object, it measures the distance to the target object, obtaining the distance between the target object and the UAV 100.
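The patent does not give the radar's ranging equations. For a linear-sweep FMCW (frequency-modulated continuous-wave) radar, range is conventionally recovered from the beat frequency as R = c * f_beat * T_sweep / (2 * B); the sketch below applies that relation, with sweep parameters that are assumed for illustration rather than taken from the patent.

```python
C = 299_792_458.0  # speed of light, m/s

def fmcw_range(beat_hz: float, sweep_bandwidth_hz: float, sweep_time_s: float) -> float:
    """Range from the beat frequency of a linear-sweep FMCW radar.

    Uses the standard relation R = c * f_beat * T_sweep / (2 * B),
    where B is the sweep bandwidth and T_sweep the sweep duration.
    """
    return C * beat_hz * sweep_time_s / (2.0 * sweep_bandwidth_hz)

# Assumed sweep: 250 MHz of bandwidth over a 1 ms sweep.
distance_m = fmcw_range(beat_hz=33_310.0, sweep_bandwidth_hz=250e6, sweep_time_s=1e-3)
```

With these assumed parameters a beat frequency of about 33 kHz corresponds to a target roughly 20 meters away, the same order as the stand-off distance mentioned later in the embodiment.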

Step s115: the UAV 100 approaches the target object according to the orientation of the target object and the distance between the target object and the UAV 100.

The UAV 100 approaches the target object according to the target object's orientation and distance. In this embodiment, the UAV 100 can track the target object from a preset distance (for example, 20 meters) away, or perch directly on the target object. By keeping the distance between the UAV 100 and the target object within the preset range, the UAV 100 can be prevented from losing the target object.
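The approach behaviour (close in on the target, then hold a preset stand-off distance such as 20 meters) can be sketched as a proportional speed command along the target bearing. The controller structure, gain and speed limit below are illustrative assumptions; the patent specifies only the distance-keeping behaviour, not a control law.

```python
def approach_step(distance_m: float, standoff_m: float = 20.0,
                  gain: float = 0.5, v_max: float = 5.0) -> float:
    """Proportional speed command (m/s) along the bearing to the target.

    Positive values close the range; the command is zero once the UAV sits
    at the preset stand-off distance, and is clamped to +/- v_max.
    """
    error = distance_m - standoff_m
    return max(-v_max, min(v_max, gain * error))

v = approach_step(120.0)   # far from the target: closes at the speed limit
```

In a real autopilot this scalar command would feed the velocity loop along the gimbal-measured bearing; a small negative command backs the UAV off if it drifts inside the stand-off range.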

Second embodiment

Referring to Fig. 4, Fig. 4 is a flow chart of the method for finding a target provided by the second embodiment of the present invention. The method is applied to an auxiliary driving device 200 communicatively connected to a UAV 100. The specific flow of the method is elaborated below.

Step s211: the panoramic image obtained by the UAV 100 is received.

Specifically, after the UAV 100 obtains the panoramic image through its panoramic optical cameras, it passes the image data to the auxiliary driving device 200.

Step s212: visual scene analysis is performed on the panoramic image to obtain the scene and the objects in the panoramic image.

In this embodiment, the auxiliary driving device 200 performs visual scene analysis on the panoramic image using a deep-learning neural network algorithm, to obtain the scene and the objects in the panoramic image.

The deep-learning neural network algorithm is a common recognition algorithm in pattern recognition. Using it, the auxiliary driving device 200 can tell the scene in which the UAV is located (for example, sky, city or desert) and at the same time recognize the objects in that scene (for example, people, animals and utility poles).

Step s213: target recognition is performed on the objects in the scene corresponding to the panoramic image, and when a target object is present, the orientation information of the target object is sent to the UAV 100, which approaches the target object according to the orientation of the target object and the detected distance between the target object and the UAV 100.

After the auxiliary driving device 200 has performed the visual scene analysis, it performs target recognition on the objects in the scene.

Specifically, in this embodiment, a database is pre-stored in the auxiliary driving device 200. The database stores different scenes and their corresponding target objects; for example, when the scene is the perimeter of a bank, a masked person or a person holding a weapon near the bank is a target object.

During target recognition, the objects in the panoramic image are compared against the target objects pre-stored in the database. A successful match indicates that a target object is present; otherwise, none is.

After receiving the orientation of the target object sent by the auxiliary driving device 200, the UAV 100 adjusts its gimbal so that the continuous-wave radar is aimed at the orientation in which the target object lies. Once the continuous-wave radar is aimed at the target object, it measures the distance to the target object, obtaining the distance between the target object and the UAV 100.

3rd embodiment

Referring to Fig. 5, Fig. 5 shows a device 210 for finding a target provided by the third embodiment of the present invention. The device 210 is applied to an auxiliary driving device 200 communicatively connected to a UAV 100, and includes: a receiving module 2101, a visual scene analysis module 2102, and a target recognition module 2103.

The receiving module 2101 is configured to receive the panoramic image obtained by the UAV 100.

The receiving module 2101 is used to execute step s211 in the second embodiment.

The visual scene analysis module 2102 is configured to perform visual scene analysis on the panoramic image and obtain the scene and the objects in it. The visual scene analysis module 2102 uses a neural network algorithm based on deep learning to analyse the received panoramic image and obtain the scene and the objects in the panoramic image.

The visual scene analysis module 2102 is used to execute step s212 in the second embodiment.

The target recognition module 2103 is configured to perform target recognition on the objects in the scene corresponding to the panoramic image and, when a target object is present, send the orientation information of the target object to the UAV 100, which approaches the target object according to the orientation of the target object and the detected distance between the target object and the UAV 100.

The target recognition module 2103 is used to execute step s213 in the second embodiment.

In summary, in the method and device for finding a target provided by the embodiments of the present invention, visual scene analysis is performed on the captured panoramic image to obtain the scene and the objects in it, and the target object in that scene is obtained through target recognition. The target object is locked onto: when the target object moves, the UAV can track it in real time, determine its position, and approach it, avoiding the situation in which a moving target is hard for the UAV to lock onto and is consequently lost.

The above are only the preferred embodiments of the present invention and are not intended to limit it; for those skilled in the art, the present invention may have various modifications and variations. Any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.

The above are only specific embodiments of the present invention, but the scope of protection of the present invention is not limited thereto. Any change or replacement that a person familiar with the technical field could readily conceive within the technical scope disclosed by the present invention shall be covered by the scope of protection of the present invention. Therefore, the scope of protection of the present invention shall be defined by the scope of the claims.

Claims (10)

1. A target searching method, applied to an unmanned aerial vehicle and an auxiliary driving device that are communicatively connected to each other, characterized in that the method comprises:
the unmanned aerial vehicle obtaining a panoramic image and sending the panoramic image to the auxiliary driving device;
the auxiliary driving device receiving the panoramic image and performing visual scene analysis on the panoramic image to obtain the scene and objects in the panoramic image;
performing target recognition on the objects in the scene corresponding to the panoramic image and, when a target object is present, sending the orientation information of the target object to the unmanned aerial vehicle;
the unmanned aerial vehicle performing distance detection in the direction of the target object to obtain the distance between the target object and the unmanned aerial vehicle; and
the unmanned aerial vehicle approaching the target object according to the orientation of the target object and the distance between the target object and the unmanned aerial vehicle.
2. The target searching method according to claim 1, characterized in that approaching the target object comprises:
monitoring the target object within a preset distance range of the target object, and/or resting on the target object.
3. The target searching method according to claim 1, characterized in that, in the step of the unmanned aerial vehicle obtaining the panoramic image:
the unmanned aerial vehicle obtains the panoramic image by means of two panoramic optical cameras mounted on the unmanned aerial vehicle.
4. The target searching method according to claim 1, characterized in that, in the step of the unmanned aerial vehicle performing distance detection in the direction of the target object to obtain the distance between the target object and the unmanned aerial vehicle:
the unmanned aerial vehicle performs distance detection by means of a continuous-wave radar mounted on the unmanned aerial vehicle.
5. A target searching method, applied to an auxiliary driving device communicatively connected to an unmanned aerial vehicle, characterized in that the method comprises:
receiving a panoramic image obtained by the unmanned aerial vehicle;
performing visual scene analysis on the panoramic image to obtain the scene and objects in the panoramic image; and
performing target recognition on the objects in the scene corresponding to the panoramic image and, when a target object is present, sending the orientation information of the target object to the unmanned aerial vehicle, so that the unmanned aerial vehicle approaches the target object according to the orientation of the target object and the detected distance between the target object and the unmanned aerial vehicle.
6. The target searching method according to claim 5, characterized in that the step of the auxiliary driving device performing visual scene analysis on the panoramic image to obtain the scene and objects comprises:
performing visual scene analysis on the panoramic image using a neural network algorithm based on deep learning, to obtain the scene and objects in the panoramic image.
7. The target searching method according to claim 5, characterized in that the method further comprises:
storing in advance, in a database, target objects corresponding to different scenes; and
in the step of performing target recognition on the objects in the scene corresponding to the panoramic image,
comparing the objects in the scene corresponding to the panoramic image with the target objects prestored in the database.
8. A target searching apparatus, applied to an auxiliary driving device communicatively connected to an unmanned aerial vehicle, characterized in that the apparatus comprises:
a receiving module, for receiving a panoramic image obtained by the unmanned aerial vehicle;
a visual scene analysis module, for performing visual scene analysis on the panoramic image to obtain the scene and objects in the panoramic image; and
a target recognition module, for performing target recognition on the objects in the scene corresponding to the panoramic image and, when a target object is present, sending the orientation information of the target object to the unmanned aerial vehicle, so that the unmanned aerial vehicle approaches the target object according to the orientation of the target object and the detected distance between the target object and the unmanned aerial vehicle.
9. The target searching apparatus according to claim 8, characterized in that:
the visual scene analysis module performs visual scene analysis on the panoramic image using a neural network algorithm based on deep learning, to obtain the scene and objects in the panoramic image.
10. The target searching apparatus according to claim 8, characterized in that:
the target recognition module realizes target object recognition by comparing the objects in the scene corresponding to the panoramic image with the target objects prestored in a database.
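Claims 7 and 10 describe recognition as a comparison of detected objects against targets prestored per scene in a database. A minimal illustration follows, assuming a plain in-memory mapping; `TARGET_DB`, `recognize_targets`, and the scene/object labels are all invented for this example and are not taken from the disclosure.

```python
# Hypothetical per-scene target database, as in claims 7 and 10: the set of
# target labels expected in each scene is stored in advance.
TARGET_DB = {
    "road":  {"car", "truck"},
    "field": {"person"},
}

def recognize_targets(scene, detected_objects, db=TARGET_DB):
    """Return the detected objects that match the targets prestored for
    this scene; an empty list means no target object is present."""
    expected = db.get(scene, set())
    return [obj for obj in detected_objects if obj in expected]
```

Keying the database by scene narrows each comparison to the targets plausible in that context, which is the point of storing "target objects corresponding to different scenes" rather than one flat target list.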
CN201610806325.8A 2016-09-07 2016-09-07 Target searching method and apparatus CN106371460A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610806325.8A CN106371460A (en) 2016-09-07 2016-09-07 Target searching method and apparatus


Publications (1)

Publication Number Publication Date
CN106371460A true CN106371460A (en) 2017-02-01

Family

ID=57900153

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610806325.8A CN106371460A (en) 2016-09-07 2016-09-07 Target searching method and apparatus

Country Status (1)

Country Link
CN (1) CN106371460A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108919640A (en) * 2018-04-20 2018-11-30 西北工业大学 The implementation method of the adaptive multiple target tracking of unmanned plane

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101109640A (en) * 2006-07-19 2008-01-23 北京航空航天大学 Unmanned aircraft landing navigation system based on vision
US20120078451A1 (en) * 2010-09-28 2012-03-29 Kabushiki Kaisha Topcon Automatic Taking-Off And Landing System
CN102621419A (en) * 2012-03-28 2012-08-01 山东省电力学校 Method for automatically recognizing and monitoring line electrical equipment based on laser and binocular vision image
CN103149939A (en) * 2013-02-26 2013-06-12 北京航空航天大学 Dynamic target tracking and positioning method of unmanned plane based on vision
CN103823470A (en) * 2014-03-03 2014-05-28 青岛宏百川金属精密制品有限公司 Panoramic real-time dynamic monitoring system of unmanned aerial vehicle
US20150032299A1 (en) * 2013-07-24 2015-01-29 Airbus Operations (S.A.S.) Autonomous and automatic landing method and system
CN104890875A (en) * 2015-05-28 2015-09-09 天津大学 Multi-rotor-wing unmanned aerial vehicle for panoramic shooting
CN105512628A (en) * 2015-12-07 2016-04-20 北京航空航天大学 Vehicle environment sensing system and method based on unmanned plane
CN105825713A (en) * 2016-04-08 2016-08-03 重庆大学 Vehicular-mounted unmanned aerial vehicle auxiliary driving system and operation mode
CN205510277U (en) * 2016-02-06 2016-08-24 普宙飞行器科技(深圳)有限公司 Unmanned aerial vehicle panoramic picture transmission equipment


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Editorial and Review Board of the Proceedings of the China UAV Conference: "Wings of the Vanguard: Proceedings of the 3rd China UAV Conference", Aviation Industry Press, 30 June 2010 *
Liu Naiqi et al.: "Living in the Information Age: The Development of Information Technology", 30 September 2007 *


Similar Documents

Publication Publication Date Title
Máthé et al. Vision and control for UAVs: A survey of general methods and of inexpensive platforms for infrastructure inspection
CN105759834B (en) A kind of system and method actively capturing low latitude small-sized unmanned aircraft
CN102866706B (en) Sweeping robot navigated by smart phone and navigation sweeping method of sweeping robot
US10031518B1 (en) Feedback to facilitate control of unmanned aerial vehicles (UAVs)
US9429945B2 (en) Surveying areas using a radar system and an unmanned aerial vehicle
US20170313439A1 (en) Methods and syststems for obstruction detection during autonomous unmanned aerial vehicle landings
CN103822635B (en) The unmanned plane during flying spatial location real-time computing technique of view-based access control model information
US20170023938A1 (en) Systems and methods for target tracking
JP2018512687A (en) Environmental scanning and unmanned aircraft tracking
CN205263655U (en) A system, Unmanned vehicles and ground satellite station for automatic generation panoramic photograph
WO2017004799A1 (en) Camera configuration on movable objects
JP5690539B2 (en) Automatic take-off and landing system
CN104854428B (en) sensor fusion
CN104168455B (en) A kind of space base large scene camera system and method
CN101419055B (en) Space target position and pose measuring device and method based on vision
US10282591B2 (en) Systems and methods for depth map sampling
EP2724204B1 (en) Method for acquiring images from arbitrary perspectives with uavs equipped with fixed imagers
US7734063B2 (en) Multi-agent autonomous system
US20170003690A1 (en) Wide area sensing system, in-flight detection method, and non-transitory computer readable medium storing program of wide area sensing system
US20160309124A1 (en) Control system, a method for controlling an uav, and a uav-kit
US9977434B2 (en) Automatic tracking mode for controlling an unmanned aerial vehicle
KR101747180B1 (en) Auto video surveillance system and method
CN103984357A (en) Unmanned aerial vehicle automatic obstacle avoidance flight system based on panoramic stereo imaging device
US8855846B2 (en) System and method for onboard vision processing
US20130289858A1 (en) Method for controlling and communicating with a swarm of autonomous vehicles using one-touch or one-click gestures from a mobile platform

Legal Events

Date Code Title Description
PB01 Publication
C06 Publication
SE01 Entry into force of request for substantive examination