CN206388067U - Unmanned plane and system of taking photo by plane - Google Patents
- Publication number
- CN206388067U CN201720067031.8U CN201720067031U
- Authority
- CN
- China
- Prior art keywords
- unmanned plane
- cpu
- distance
- objects
- plane
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 230000003287 optical Effects 0.000 claims description 19
- 238000004891 communication Methods 0.000 claims description 8
- 238000010586 diagram Methods 0.000 description 8
- 238000000034 method Methods 0.000 description 4
- 238000006243 chemical reaction Methods 0.000 description 3
- 230000000295 complement Effects 0.000 description 2
- 238000005516 engineering process Methods 0.000 description 2
- WHXSMMKQMYFTQS-UHFFFAOYSA-N lithium Chemical compound [Li] 0.000 description 2
- 229910052744 lithium Inorganic materials 0.000 description 2
- 238000005259 measurement Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 238000006011 modification reaction Methods 0.000 description 2
- 230000004044 response Effects 0.000 description 2
- 239000004065 semiconductor Substances 0.000 description 2
- 230000005540 biological transmission Effects 0.000 description 1
- 238000001514 detection method Methods 0.000 description 1
- 238000006073 displacement reaction Methods 0.000 description 1
- 238000005183 dynamical system Methods 0.000 description 1
- 230000005611 electricity Effects 0.000 description 1
- 239000000284 extract Substances 0.000 description 1
- 238000003384 imaging method Methods 0.000 description 1
- 238000007689 inspection Methods 0.000 description 1
- 239000007788 liquid Substances 0.000 description 1
- 229910044991 metal oxide Inorganic materials 0.000 description 1
- 150000004706 metal oxides Chemical class 0.000 description 1
- 230000005693 optoelectronics Effects 0.000 description 1
- 230000000149 penetrating Effects 0.000 description 1
- 239000000575 pesticide Substances 0.000 description 1
- 229920001690 polydopamine Polymers 0.000 description 1
- 230000011514 reflex Effects 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
- 230000003595 spectral Effects 0.000 description 1
- 238000005507 spraying Methods 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
Abstract
The utility model provides an unmanned plane and an aerial photography system. An information acquisition unit obtains position information of the unmanned plane relative to its surrounding objects, and a central processing unit generates an obstacle avoidance instruction from this position information together with the flight parameters obtained by a flight control unit, thereby enabling the unmanned plane to avoid obstacles.
Description
Technical field
The utility model relates to the technical field of intelligent flight devices, and in particular to an unmanned plane and an aerial photography system.
Background technology
In recent years, with the rapid development of science and technology, a wide variety of unmanned planes have appeared. Through equipment such as radar, remote controllers and communication devices, an unmanned plane can be tracked, remotely controlled and used for digital transmission. Unmanned planes are used in many fields: in the military field, for tasks such as reconnaissance and surveillance; in the civil field, for pesticide spraying, patrol inspection and the like. The application prospects of unmanned planes are broad.
However, in certain special environments (for example, mountain areas and jungles), the complexity of the environment requires the flight state and position of the unmanned plane to be monitored in real time during flight. When an obstacle is found, manual operation is needed to achieve avoidance, which imposes a heavy workload on the operator and demands extremely fast reactions. If avoidance is not performed in time, an unmanned plane crash can easily occur.
Utility model content
In order to overcome the above deficiencies in the prior art, the technical problem to be solved by the utility model is to provide an unmanned plane that can grasp, in real time, its position relative to surrounding objects and achieve obstacle avoidance when obstructing objects appear.
A preferred embodiment of the utility model provides an unmanned plane, including: an unmanned plane body, an information acquisition unit, a central processing unit and a flight control unit.
The information acquisition unit, the central processing unit and the flight control unit are arranged on the unmanned plane body.
The information acquisition unit and the flight control unit are each electrically connected with the central processing unit.
The information acquisition unit is used to obtain position information of the unmanned plane relative to its surrounding objects.
The flight control unit is used to obtain the flight parameters of the unmanned plane, and to receive the obstacle avoidance instruction generated by the central processing unit according to the position information and the flight parameters, thereby realizing obstacle avoidance.
In a preferred embodiment of the utility model, the information acquisition unit includes a depth camera arranged on the unmanned plane body, used to obtain the distance and orientation between the unmanned plane and objects ahead in its flight direction.
In a preferred embodiment of the utility model, the information acquisition unit further includes an optical camera arranged on the unmanned plane body, used to obtain images of objects ahead in the flight direction, so that the central processing unit can recognize the shapes of those objects from the images obtained by the optical camera.
In a preferred embodiment of the utility model, the central processing unit is further used to calibrate the orientation of the unmanned plane relative to objects ahead in the flight direction, according to the recognized object shape and the orientation obtained by the depth camera.
In a preferred embodiment of the utility model, the unmanned plane further includes a first rotating mechanism on which the optical camera is mounted. The first rotating mechanism is electrically connected with the central processing unit, which controls the first rotating mechanism to rotate according to the calibrated orientation so that the optical camera is aimed at the object.
In a preferred embodiment of the utility model, the information acquisition unit further includes distance sensors: at least one first distance sensor for detecting the distance between the unmanned plane and objects ahead in its flight direction, and at least one second distance sensor for detecting the distance between the unmanned plane and objects below it.
In a preferred embodiment of the utility model, the unmanned plane further includes a second rotating mechanism on which the distance sensor is mounted, the second rotating mechanism being electrically connected with the central processing unit. The central processing unit identifies as an obstructing object any object whose distance, obtained by the depth camera, is less than a predetermined distance, and controls the second rotating mechanism to rotate according to the shape and orientation of the obstructing object, so that the distance sensor is aimed at the obstructing object and performs distance calibration.
In a preferred embodiment of the utility model, the distance sensor includes an infrared distance sensor and/or an ultrasonic sensor.
In a preferred embodiment of the utility model, the unmanned plane further includes a power unit that supplies power to the unmanned plane.
A preferred embodiment of the utility model further provides an aerial photography system, including the above unmanned plane and a user terminal. The unmanned plane is communicatively connected with the user terminal, and the user terminal receives and displays the images obtained by the unmanned plane.
Compared with the prior art, the unmanned plane and the aerial photography system provided by the utility model have the following beneficial effects: the unmanned plane obtains its position relative to surrounding objects through the information acquisition unit and receives the flight parameters obtained by the flight control unit; an obstacle avoidance instruction is generated accordingly, and the flight control unit receives and executes that instruction, thereby realizing obstacle avoidance and avoiding crash accidents.
Brief description of the drawings
In order to explain the technical solutions of the embodiments of the utility model more clearly, the accompanying drawings required in the embodiments are briefly described below. It should be understood that the following drawings show only some embodiments of the utility model and should therefore not be regarded as limiting the scope; those of ordinary skill in the art can obtain other related drawings from these drawings without creative work.
Fig. 1 is a block diagram of the aerial photography system provided by an embodiment of the utility model.
Fig. 2 is a first block diagram of the unmanned plane shown in Fig. 1.
Fig. 3 is a second block diagram of the unmanned plane provided by an embodiment of the utility model.
Fig. 4 is a third block diagram of the unmanned plane provided by an embodiment of the utility model.
Reference numerals: 10 - aerial photography system; 100 - unmanned plane; 110 - central processing unit; 120 - information acquisition unit; 121 - depth camera; 122 - optical camera; 123 - distance sensor; 130 - flight control unit; 140 - power unit; 200 - user terminal.
Embodiment
The technical solutions in the embodiments of the utility model are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the utility model. The components of the embodiments of the utility model, as generally described and illustrated in the drawings herein, can be arranged and designed in a variety of configurations. Therefore, the following detailed description of the embodiments provided in the drawings is not intended to limit the claimed scope of the utility model, but merely represents selected embodiments. All other embodiments obtained by those skilled in the art based on the embodiments of the utility model without creative work fall within the protection scope of the utility model.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it need not be further defined and explained in subsequent drawings. In the description of the utility model, the terms "first", "second" and the like are used only to distinguish the description and are not to be understood as indicating or implying relative importance.
In the description of the utility model, it should also be noted that, unless otherwise expressly specified and limited, the terms "arranged" and "connected" are to be understood broadly: for example, a connection may be fixed, detachable or integral; it may be mechanical or electrical; it may be direct, indirect through an intermediary, or internal between two elements. For those of ordinary skill in the art, the specific meanings of the above terms in the utility model can be understood according to the specific situation.
Referring to Fig. 1, Fig. 1 is a block diagram of the aerial photography system 10 provided by a preferred embodiment of the utility model. The aerial photography system 10 includes an unmanned plane 100 and a user terminal 200. The unmanned plane 100 communicates with the user terminal 200 to realize data communication and interaction between them.
The unmanned plane 100 may be, but is not limited to, a multi-rotor unmanned plane, a fixed-wing unmanned plane, an umbrella-wing unmanned plane and the like. Hardware devices such as an autopilot and a control device can be installed on the unmanned plane 100.
The unmanned plane 100 includes an unmanned plane body in which a wireless communication device is provided. The wireless communication device is used for data communication between the unmanned plane 100 and the user terminal 200. For example, images obtained by the unmanned plane 100 can be sent to the user terminal 200 through the wireless communication device, and the unmanned plane 100 can also receive control instructions sent by the user terminal 200 through the wireless communication device.
The user terminal 200 may be, but is not limited to, a smartphone, a personal computer (PC), a tablet computer, a personal digital assistant (PDA), a mobile internet device (MID) and the like. The user terminal 200 receives and displays, through the wireless communication device, the images obtained by the unmanned plane 100.
Referring to Fig. 2, Fig. 2 is a first block diagram of the unmanned plane 100 shown in Fig. 1. The unmanned plane 100 includes a central processing unit 110, an information acquisition unit 120 and a flight control unit 130.
The central processing unit 110, the information acquisition unit 120 and the flight control unit 130 are arranged on the unmanned plane body. The information acquisition unit 120 and the flight control unit 130 are each electrically connected with the central processing unit 110. The information acquisition unit 120 obtains position information of the unmanned plane 100 relative to its surrounding objects; the flight control unit 130 obtains the flight parameters of the unmanned plane 100 and receives the obstacle avoidance instruction generated by the central processing unit 110 according to the position information and the flight parameters, thereby realizing obstacle avoidance.
The flight parameters may include, but are not limited to, heading, flight speed, flight attitude, aircraft orientation and the like. Obstacle avoidance software is pre-stored in the central processing unit 110; the central processing unit 110 generates an obstacle avoidance instruction through this software according to the position information and the flight parameters, reducing the workload of the operator.
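The division of labour described here (position information plus flight parameters in, avoidance instruction out) can be illustrated with a minimal sketch. The safety radius, cone angle and command names below are illustrative assumptions, not the patent's actual avoidance software:

```python
from dataclasses import dataclass

@dataclass
class FlightParams:
    heading_deg: float   # current heading, degrees
    speed_mps: float     # ground speed, m/s

def avoidance_instruction(obstacles, params, safe_distance=50.0):
    """Given obstacle readings (distance, absolute bearing in degrees)
    and the current flight parameters, return an avoidance command or
    None.  Purely illustrative logic."""
    for dist, bearing_deg in obstacles:
        # bearing of the obstacle relative to the current heading, in (-180, 180]
        rel = (bearing_deg - params.heading_deg + 180.0) % 360.0 - 180.0
        # an obstacle inside the safety radius and roughly ahead triggers avoidance
        if dist < safe_distance and abs(rel) < 30.0:
            # turn away from the side the obstacle is on
            return "turn_right" if rel <= 0 else "turn_left"
    return None

# An obstacle 40 m ahead and slightly left of an eastward flight path:
print(avoidance_instruction([(40.0, 85.0)], FlightParams(90.0, 12.0)))  # turn_right
```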
Referring to Fig. 3, Fig. 3 is a second block diagram of the unmanned plane 100 provided by an embodiment of the utility model. The information acquisition unit 120 includes a depth camera 121 arranged on the unmanned plane body, used to obtain the distance and orientation between the unmanned plane 100 and objects ahead in its flight direction. Specifically, the depth camera 121 may be, but is not limited to, arranged on a wing of the unmanned plane 100, on the front of its fuselage, on the rear of its fuselage, and so on.
Depth cameras come in several types, such as TOF (Time of Flight), structured light and laser scanning. In this embodiment, the depth camera 121 is a TOF camera. A TOF camera continuously sends light pulses toward an object, then receives the light returned from the object with a sensor, and obtains the distance to the target by detecting the flight time of the light pulses. From the range information, a TOF camera can also obtain richer positional relationships between objects, that is, it can separate foreground from background. Therefore, through the depth camera 121, the distance and orientation between the unmanned plane 100 and objects ahead in the flight direction can be obtained.
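The time-of-flight principle reduces to a single formula: the pulse travels to the object and back, so the distance is the speed of light multiplied by the round-trip time, divided by two. A sketch with illustrative values:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the target from a TOF light-pulse round-trip time."""
    return C * round_trip_seconds / 2.0

# A pulse that returns after about 334 ns corresponds to roughly 50 m:
print(round(tof_distance(333.6e-9), 1))  # 50.0
```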
Referring again to Fig. 3, the information acquisition unit 120 may further include an optical camera 122. The optical camera 122 is arranged on the unmanned plane body and is used to obtain images of objects ahead in the flight direction of the unmanned plane 100, so that the central processing unit 110 can recognize the shapes of those objects from the images obtained by the optical camera 122.
The working principle of the camera is as follows: the optical image of an object generated through the lens is projected onto the surface of an image sensor and converted into an electrical signal, which becomes a digital image signal after A/D (analog-to-digital) conversion. The digital signal is then processed in a digital signal processing chip and sent to the central processing unit 110. The central processing unit 110 performs image analysis on the received image, extracts its texture information, and thus obtains the precise shape of the object.
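As a toy illustration of recovering an object's extent from a digitized image (the patent does not specify an algorithm, and a real system would use proper computer-vision code), the following extracts the bounding box of the bright pixels in a small binary frame:

```python
def bounding_box(image):
    """Return (row_min, col_min, row_max, col_max) of the nonzero pixels,
    or None if the frame is empty.  `image` is a list of rows of 0/1."""
    coords = [(r, c) for r, row in enumerate(image)
                     for c, v in enumerate(row) if v]
    if not coords:
        return None
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    return (min(rows), min(cols), max(rows), max(cols))

frame = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 1],
    [0, 0, 0, 0],
]
print(bounding_box(frame))  # (1, 1, 2, 3)
```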
At present, mainstream cameras are based mainly on CMOS (Complementary Metal-Oxide-Semiconductor) and CCD (Charge-Coupled Device) sensors. Both use photosensitive elements to perform photoelectric conversion, converting the image into digital data to achieve image capture. Compared with CCD cameras, however, CMOS cameras have considerable advantages in response speed, cost, power consumption, spectral response range and level of integration.
In this embodiment, the central processing unit 110 is further used to calibrate the orientation of the unmanned plane 100 relative to objects ahead in the flight direction, according to the recognized object shape and the orientation obtained by the depth camera 121. With the precise object shapes obtained by the optical camera 122, the unmanned plane 100 can better determine the specific distribution of objects ahead in its flight direction, and thus avoid them.
Referring again to Fig. 3, the unmanned plane 100 further includes a first rotating mechanism for mounting the optical camera 122. The first rotating mechanism is electrically connected with the central processing unit 110, which controls it to rotate according to the calibrated orientation so that the optical camera 122 is aimed at the object.
After the optical camera 122 is aimed at the object, images of objects in a particular orientation ahead in the flight direction can be obtained, and hence the shapes of objects in that orientation. In this embodiment, the object can be an obstructing object or a target object for aerial photography.
In this embodiment, the first rotating mechanism is a gimbal. Two electric motors are provided on the gimbal: one is responsible for rotation in the horizontal direction, and the other for rotation in the vertical direction. The rotation of the two motors rotates the gimbal, and the gimbal drives the optical camera 122 to rotate so as to aim at the object.
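Aiming the camera with the two gimbal motors amounts to computing a pan (horizontal) angle and a tilt (vertical) angle toward the target. A minimal sketch, assuming the target position is given in a body frame with x forward, y right and z down (this frame convention is an assumption, not stated in the patent):

```python
import math

def gimbal_angles(x: float, y: float, z: float):
    """Pan and tilt angles (degrees) that point the camera at a target
    at (x, y, z) in the body frame: x forward, y right, z down."""
    pan = math.degrees(math.atan2(y, x))                    # horizontal motor
    tilt = math.degrees(math.atan2(-z, math.hypot(x, y)))   # vertical motor
    return pan, tilt

# Target 30 m ahead and 30 m to the right, at the same altitude:
pan, tilt = gimbal_angles(30.0, 30.0, 0.0)
print(round(pan, 1), round(tilt, 1))  # 45.0 0.0
```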
Referring again to Fig. 3, the information acquisition unit 120 may further include distance sensors 123: at least one first distance sensor for detecting the distance between the unmanned plane 100 and objects ahead in its flight direction, and at least one second distance sensor for detecting the distance between the unmanned plane 100 and objects below it.
In this embodiment, when there are multiple first distance sensors, they can be arranged evenly on both sides of the depth camera 121, or at other positions on the unmanned plane body. Detecting the distance between the unmanned plane 100 and objects below it through the second distance sensor can help the unmanned plane 100 avoid those objects, and can also assist landing.
The distance sensor 123, also called a displacement sensor, is used to sense the distance between the unmanned plane 100 and an object. According to its working principle, a distance sensor can be an optical distance sensor, an infrared distance sensor, an ultrasonic distance sensor, and the like.
An infrared distance sensor has a pair of infrared emitting and receiving diodes. It emits a beam of infrared light, which is reflected after reaching an object; the sensor receives the reflected signal, and a CCD image processor measures the time difference between emission and reception. After processing by a signal processor, the distance to the object is obtained.
An ultrasonic sensor is a sensor that converts ultrasonic signals into other energy signals (typically electrical signals). Ultrasonic waves have good directionality, can propagate as beams in a fixed direction, and have strong penetrating power in liquids and solids. For these reasons, ultrasonic waves are used for distance measurement.
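Ultrasonic ranging uses the same round-trip principle as the TOF camera described earlier, with the speed of sound in place of the speed of light. A sketch (the 343 m/s figure assumes air at roughly 20 °C):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C (assumed)

def ultrasonic_distance(echo_seconds: float) -> float:
    """Distance from an ultrasonic pulse's round-trip (echo) time."""
    return SPEED_OF_SOUND * echo_seconds / 2.0

# An echo after 29.2 ms puts the ground about 5 m below the aircraft:
print(round(ultrasonic_distance(0.0292), 2))  # 5.01
```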
In this embodiment, the unmanned plane 100 further includes a second rotating mechanism for mounting the distance sensor 123. The second rotating mechanism is electrically connected with the central processing unit 110. The central processing unit 110 identifies as an obstructing object any object whose distance, obtained by the depth camera 121, is less than a predetermined distance, and controls the second rotating mechanism to rotate according to the shape and orientation of the obstructing object, so that the distance sensor 123 is aimed at the obstructing object and performs distance calibration.
In this embodiment, the hardware configuration of the second rotating mechanism is the same as that of the first rotating mechanism and is not described again here.
In this embodiment, the central processing unit 110 sets a predetermined distance (for example, 50 meters). When an obtained distance is less than the predetermined distance, the object at that distance is identified as an obstructing object. If, according to the flight parameters, the unmanned plane 100 would pass through the obstructing object, the central processing unit 110 controls the second rotating mechanism to rotate, so as to obtain the accurate distance from the unmanned plane 100 to the obstructing object and thereby achieve obstacle avoidance.
The central processing unit 110 performs coordinate conversion, according to the flight parameters, on the information obtained by the depth camera 121 and the distance sensors 123, thereby obtaining the position information of objects around the unmanned plane 100. From the flight parameters and the position information, the central processing unit 110 generates an obstacle avoidance instruction through the obstacle avoidance software; the flight control unit 130 receives the instruction and controls the power system of the unmanned plane 100 to perform the avoidance maneuver, thereby realizing obstacle avoidance.
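The coordinate-conversion step can be sketched as rotating each reading (distance plus bearing relative to the aircraft) into north/east coordinates using the heading from the flight parameters, and flagging objects inside the predetermined distance. The 50-metre threshold follows the example above; all function names and the frame convention are illustrative assumptions:

```python
import math

PREDETERMINED_DISTANCE = 50.0  # metres, per the example in the text

def to_world(distance, bearing_deg, heading_deg, pos_north, pos_east):
    """Convert a (distance, relative-bearing) reading into north/east
    coordinates, given the aircraft's heading and position."""
    ang = math.radians(heading_deg + bearing_deg)
    return (pos_north + distance * math.cos(ang),
            pos_east + distance * math.sin(ang))

def classify(readings, heading_deg, pos_north=0.0, pos_east=0.0):
    """Return a list of ((north, east), is_obstructing) per reading."""
    out = []
    for distance, bearing_deg in readings:
        world = to_world(distance, bearing_deg, heading_deg, pos_north, pos_east)
        out.append((world, distance < PREDETERMINED_DISTANCE))
    return out

# Heading due east (90 degrees): a reading 40 m dead ahead lands 40 m east
# of the aircraft and is flagged; one 120 m ahead is not flagged.
result = classify([(40.0, 0.0), (120.0, 0.0)], heading_deg=90.0)
for (n, e), flagged in result:
    print(round(n, 1), round(e, 1), flagged)
```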
Referring to Fig. 4, Fig. 4 is a third block diagram of the unmanned plane 100 provided by the utility model. In this embodiment, the unmanned plane 100 further includes a power unit 140, which supplies power to the unmanned plane 100.
In this embodiment, the power unit 140 can be a lithium battery; when the charge runs out, the lithium battery can be removed from the unmanned plane 100 and charged. The power unit 140 can also be a solar cell; during flight of the unmanned plane 100, the solar cell can continuously supply electric energy, thereby improving the endurance of the unmanned plane 100.
In summary, the utility model provides an unmanned plane and an aerial photography system, in which the information acquisition unit of the unmanned plane obtains position information of the unmanned plane relative to its surrounding objects, and the flight control unit obtains the flight parameters of the unmanned plane and receives the obstacle avoidance instruction generated by the central processing unit according to the position information and the flight parameters, thereby realizing obstacle avoidance.
The above are only preferred embodiments of the utility model and are not intended to limit it; for those skilled in the art, the utility model may have various modifications and variations. Any modification, equivalent substitution, improvement and the like made within the spirit and principles of the utility model shall be included within its protection scope.
Claims (10)
1. An unmanned plane, characterized by comprising: an unmanned plane body, an information acquisition unit, a central processing unit and a flight control unit;
wherein the information acquisition unit, the central processing unit and the flight control unit are arranged on the unmanned plane body;
the information acquisition unit and the flight control unit are each electrically connected with the central processing unit;
the information acquisition unit is used to obtain position information of the unmanned plane relative to its surrounding objects; and
the flight control unit is used to obtain the flight parameters of the unmanned plane and to receive the obstacle avoidance instruction generated by the central processing unit according to the position information and the flight parameters, thereby realizing obstacle avoidance.
2. The unmanned plane according to claim 1, characterized in that the information acquisition unit includes a depth camera arranged on the unmanned plane body, used to obtain the distance and orientation between the unmanned plane and objects ahead in its flight direction.
3. The UAV according to claim 2, characterized in that the information acquisition unit further comprises an optical camera mounted on the UAV body and configured to obtain images of objects ahead in the UAV's direction of flight, so that the CPU can recognize the shape of those objects from the images obtained by the optical camera.
4. The UAV according to claim 3, characterized in that the CPU is further configured to calibrate the bearing of the UAV relative to objects ahead in its direction of flight, as obtained by the depth camera, according to the recognized object shape.
5. The UAV according to claim 4, characterized in that the UAV further comprises a first rotating mechanism on which the optical camera is mounted; the first rotating mechanism is electrically connected to the CPU, and the CPU controls the first rotating mechanism to rotate according to the calibrated bearing so that the optical camera is aimed at the object.
6. The UAV according to claim 5, characterized in that the information acquisition unit further comprises distance sensors, including at least one first distance sensor for detecting the distance between the UAV and objects ahead in its direction of flight, and at least one second distance sensor for detecting the distance between the UAV and objects below it.
7. The UAV according to claim 6, characterized in that the UAV further comprises a second rotating mechanism on which the distance sensor is mounted; the second rotating mechanism is electrically connected to the CPU; the CPU identifies as an obstacle any object whose distance, as obtained by the depth camera, is less than a predetermined distance, and controls the second rotating mechanism to rotate according to the shape and bearing of the obstacle, so that the distance sensor is aimed at the obstacle and performs distance calibration.
8. The UAV according to claim 6, characterized in that the distance sensors comprise: an infrared distance sensor and an ultrasonic sensor.
9. The UAV according to claim 1, characterized in that the UAV further comprises a power supply unit that supplies power to the UAV.
10. An aerial photography system, characterized in that the system comprises the UAV according to any one of claims 1-9 and a user terminal; the UAV is communicatively connected to the user terminal, and the user terminal receives and displays the images acquired by the UAV.
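Claim 7's threshold rule, by which objects closer than a predetermined distance are treated as obstacles, can be illustrated with a short sketch; the function name, the tuple layout, and the 5 m threshold are assumptions for illustration, since the claim leaves the predetermined distance unspecified:

```python
PREDETERMINED_DISTANCE_M = 5.0   # assumed value; the claim does not fix it

def classify_obstacles(depth_readings):
    """depth_readings: list of (object_id, distance_m, bearing_deg) tuples
    from the depth camera. Returns the objects closer than the predetermined
    distance, sorted nearest-first so the second rotating mechanism can aim
    the distance sensor at each obstacle in turn for distance calibration."""
    obstacles = [(oid, d, b) for oid, d, b in depth_readings
                 if d < PREDETERMINED_DISTANCE_M]
    return sorted(obstacles, key=lambda t: t[1])
```

With readings at 8 m, 3 m, and 1.5 m, only the latter two fall under the threshold and are returned nearest-first.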
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201720067031.8U CN206388067U (en) | 2017-01-19 | 2017-01-19 | Unmanned plane and system of taking photo by plane |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201720067031.8U CN206388067U (en) | 2017-01-19 | 2017-01-19 | Unmanned plane and system of taking photo by plane |
Publications (1)
Publication Number | Publication Date |
---|---|
CN206388067U true CN206388067U (en) | 2017-08-08 |
Family
ID=59493694
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201720067031.8U Active CN206388067U (en) | 2017-01-19 | 2017-01-19 | Unmanned plane and system of taking photo by plane |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN206388067U (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020107454A1 (en) * | 2018-11-30 | 2020-06-04 | 深圳市大疆创新科技有限公司 | Method and apparatus for accurately locating obstacle, and computer readable storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6637068B2 (en) | Modular LIDAR system | |
CN107202793A (en) | A kind of detecting system and method for detecting external wall mass defect | |
CN206709853U (en) | Drawing system is synchronously positioned and builds in a kind of multi-rotor unmanned aerial vehicle room | |
CN102891453B (en) | Unmanned aerial vehicle patrolling line corridor method and device based on millimeter-wave radar | |
CN105785393A (en) | Unmanned aerial vehicle real-time imaging and obstacle avoidance system and method based on laser radar | |
CN104808675A (en) | Intelligent terminal-based somatosensory flight operation and control system and terminal equipment | |
KR101650136B1 (en) | The apparatus of smart drone | |
CN108469817B (en) | Unmanned ship obstacle avoidance control system based on FPGA and information fusion | |
CN108037768A (en) | Unmanned plane obstruction-avoiding control system, avoidance obstacle method and unmanned plane | |
CN108427431A (en) | A kind of four-axle aircraft and its method based on laser scanning map structuring system | |
CN104460671A (en) | Cross positioning method and system for radioactive source in three-dimensional space | |
CN108958284A (en) | A kind of unmanned plane obstacle avoidance system and method | |
CN203012513U (en) | Wireless model airplane control system | |
CN206388067U (en) | Unmanned plane and system of taking photo by plane | |
CN108132673A (en) | A kind of four-rotor aircraft control system based on STM32 | |
CN106339691A (en) | Method and device used for marking object | |
CN106371460A (en) | Target searching method and apparatus | |
CN106950976B (en) | Indoor airship three-dimensional positioning device and method based on Kalman and particle filtering | |
CN111999730A (en) | Black-flying unmanned aerial vehicle flyer positioning method and system | |
CN103995264A (en) | Vehicle-mounted mobile laser radar mapping system | |
CN207301375U (en) | Land sky combined detection system | |
CN206710892U (en) | Unmanned plane multiple remote avoidance obstacle device | |
CN206649347U (en) | A kind of application deployment system based on unmanned vehicle | |
WO2021223124A1 (en) | Position information obtaining method and device, and storage medium | |
CN112363176A (en) | Elevator shaft inspection and modeling method and device and inspection modeling system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GR01 | Patent grant | ||